A Cloudflare-Shaped Hole

It's been almost a year since Cloudflare Pages and Workers were blocked for me. I thought I was fine, but it turned out that all the alternatives lacked speed, which I really care about. So today we're going to test many Workers alternatives and hopefully find a good one. Methodology: the time it takes to serve "hi" (dynamically, NOT as a static asset) from the nearest free server to my home network.
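For anyone curious how the timing works, here's a minimal sketch of the measurement, assuming Node 18+ or Deno for the global fetch; the URL and run count are placeholders, not my exact harness.

// A rough sketch of the benchmark loop (assumes top-level await: Deno or Node ESM).
// The URL and run count are placeholders for whatever endpoint is being tested.
const url = "https://example-hi-endpoint.example.com/";
const runs = 20;
const times: number[] = [];

for (let i = 0; i < runs; i++) {
  const start = performance.now();
  const res = await fetch(url);
  await res.text(); // wait for the whole "hi" body, not just the headers
  times.push(performance.now() - start);
}

times.sort((a, b) => a - b);
console.log(`median: ${times[Math.floor(times.length / 2)].toFixed(1)}ms`);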

Why is Cloudflare blocked? Because it became a common place to host "games", so now I have to find more niche places.

Why do I need servers?

Classical cloud (VM/containers)

GCP

It's a bit legacy but I can work with it.

Latency to a running VM is 40-50ms (e2-micro in us-west1).

Latency to code (a Cloud Run Function) is higher, more like 60-80ms. When cold it gets as high as 2.2 seconds, although even that might beat waiting for a stopped VM to boot. Still us-west1.
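For reference, the "code" being timed is just a "hi" responder. A minimal sketch using the Functions Framework for Node; the function name is my own placeholder:

// Minimal HTTP handler for a Cloud Run Function (Node runtime).
// Requires the @google-cloud/functions-framework package.
import * as functions from "@google-cloud/functions-framework";

functions.http("hi", (_req, res) => {
  res.send("hi");
});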

I won't be using Cloud Run, since it requires building the container on-device and pushing it to someone else's container registry (👎), but the demo container (which serves more than "hi") also comes in at 60-80ms.

Azure

Surely the company from Redmond will be friendly and fast. Surely.

Well, I've used Azure before, but it seems I can't anymore. My school Azure account errors with "you might be signing in from a browser, app, or location that is restricted by your admin", and when I tried to verify my personal account (in case they now want you to use personal accounts for student benefits?) the benefits never came and trying again just errored.

I can tell you this much from the services I already have deployed there, though: the latency is also bad. On Functions, a wrapper around a 500ms API call takes 2 seconds cold and 600-1000ms warm. That's 100-500ms of overhead per request.

Amazon

I accidentally racked up a $100 AWS bill when I was 10. Never again.

Containers-based

Fly

Fly is the most lovable cloud company there is, and the latency is great: 40-60ms.
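Fly runs whatever container you hand it, so the "hi" app can be a plain HTTP server. A minimal sketch, assuming the port matches internal_port in fly.toml:

// Minimal Node HTTP server for a Fly machine.
// Fly forwards traffic to whatever port the app listens on (8080 is an assumed internal_port).
import { createServer } from "node:http";

createServer((_req, res) => {
  res.end("hi");
}).listen(8080);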

Code-based

Deno Deploy

Drawbacks: it's in California and serverless. That means ~200ms latency.
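The upside is that the deployed code is about as small as it gets:

// The whole "hi" app on Deno Deploy.
Deno.serve(() => new Response("hi"));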

Val Town

Drawbacks: it's in Ohio and serverless. That means 270-400ms latency.
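The Val Town version is an HTTP val, which, as I understand their model, is just a default-exported fetch-style handler:

// A Val Town HTTP val: export a fetch-style handler as the default export.
export default async function (_req: Request): Promise<Response> {
  return new Response("hi");
}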

Fastly

Fastly has a quirk: if you make requests in quick succession, they take 10-20ms. In JS this happens often, but when it doesn't, a request takes 500ms or so. With WASM, it's a more consistent 60-100ms.
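The JS version of the test is a Fastly Compute handler using the fetch-event model from their js-compute SDK. A minimal sketch:

// Minimal Fastly Compute handler (JavaScript SDK, fetch-event model).
addEventListener("fetch", (event) => {
  event.respondWith(new Response("hi"));
});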

Baselines (currently blocked)

Cloudflare Workers

80-100ms.
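For the baseline, the Worker is the standard module-syntax handler:

// The baseline "hi" Worker (module syntax).
export default {
  fetch(): Response {
    return new Response("hi");
  },
};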

Vercel

Vercel's the fastest... just for static assets? Their home page is 3x faster than Cloudflare's, but their Functions take 160-180ms by default and 150-170ms when the region is manually configured.
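The function itself is the usual Vercel serverless handler; region pinning (the "manually configured" case) is done in vercel.json, as far as I know, and the region id below is only an example:

// api/hi.ts: a Vercel serverless function.
// Region pinning goes in vercel.json, e.g. { "regions": ["sfo1"] } (example region id only).
import type { VercelRequest, VercelResponse } from "@vercel/node";

export default function handler(_req: VercelRequest, res: VercelResponse) {
  res.send("hi");
}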

Conclusion

VMs and containers mean complicated off/on management, containers add complicated abstractions on top, and serverless can mean taking a penalty on every request. But I'm walking away with a plan.

[Bar chart: average latency of each service, grouped by type (VPS, container, serverless, serverless blocked)]

GCP VM (VPS): 45ms average latency
Fly (container): 50ms average latency
GCP CRF (serverless): 70ms average latency
Fastly (serverless): 70ms average latency
Cloudflare Workers (serverless, blocked): 90ms average latency
Vercel (serverless, blocked): 160ms average latency
Deno Deploy (serverless): 200ms average latency
Azure Functions (serverless): 300ms average latency
Val Town (serverless): 335ms average latency

This is a bar chart of the latency of each service I tested. It shows 4 services with better-than-Workers latency, all unblocked, all free, and diverse in function. I'm excited to use them.