Do you ever look at a page and wonder who else is looking at it at the same time, elsewhere in the world?
This mini-app is a POC: a tentative exploration of a non-trivial problem space.
If you've ever seen a shopping app that tells you how many other shoppers are looking at the same product you are, it's about that kind of thing, but with live updates, a geographically distributed data store, a pub-sub architecture that spans multiple datacenters, and ping times (hopefully) measured in tens of milliseconds.
Note: if you're the *only* person looking at this page right now, it'll be a lot less interesting. Ping a friend!
If you click the switch below you can turn the lightbulb on or off. Pretty neat!
The weird part is: *anybody else* who's currently viewing this page from anywhere else in the world can also click the switch...
This mini-app is simultaneously deployed in 6 different regions. Anycast DNS routes incoming requests to the closest datacenter. The closest edge-region your request passed through was Ashburn, Virginia (US), and the closest region the app could be served from was Ashburn, Virginia (US) (the two can be different regions). That last one is where you "are" currently, from the point of view of this app.
Below is a live-updated overview of all the other users currently viewing this page (per app-region):
You've been assigned a computed ID and a random colour. What? You don't like the colour?! Refresh the page to get a new one.
Anyway, here's what you currently look like from this app's point of view:
Feel free to say hi...
I wasn't sure whether to leave this in or not, but here it is.
Tracking user presence is a bit tricky when a user can be viewing the same page actively in multiple tabs at the same time. It's even trickier when your app is deployed in multiple regions. Multiple users, with multiple tabs, in multiple regions. There are some fun problems there...
As part of this experiment I ended up trying to keep a live count of the number of tabs each user has open on this page in their current browser. You'll see what I mean if you open another tab or two...
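The core of that tab-counting idea can be sketched in plain Ruby (a toy model with made-up names, not the app's actual implementation): each tab holds its own WebSocket connection, so you count connections per user and treat the user as "present" while the count is above zero.

```ruby
# Toy per-user tab counter. In the real app the counts would live in the
# distributed cache; here a plain Hash stands in for it.
class TabCounter
  def initialize
    @tabs = Hash.new(0) # user_id => number of open tabs
  end

  # Called when a tab's WebSocket connection is established
  def tab_opened(user_id)
    @tabs[user_id] += 1
  end

  # Called when a tab's WebSocket connection drops
  def tab_closed(user_id)
    @tabs[user_id] -= 1 if @tabs[user_id] > 0
    @tabs.delete(user_id) if @tabs[user_id] == 0
  end

  def tab_count(user_id)
    @tabs[user_id]
  end

  # A user is "present" while at least one tab is still connected
  def present?(user_id)
    @tabs[user_id] > 0
  end
end
```

The fiddly bits in practice are the ones this sketch glosses over: detecting dropped connections reliably, and keeping the counts in sync across regions.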
Kinda creepy, right?
Is that it?
What's the point?
The little examples here might seem quite trivial from the user perspective, but the backend that's powering them is the interesting bit, and the point of the POC. A lot of other more interesting/useful/complex features could be built on top of it very easily.
Take the lightswitch example; if a user in Canada clicks the switch, that event will be propagated out to all app-regions and translated into live DOM updates for users in say France and Australia and India, in close to "real-time".
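That fan-out can be modelled in a few lines of plain Ruby (an illustrative toy, with invented region codes and class names, not the production code): an event published in any one region is delivered to every region, each of which then updates its own connected clients.

```ruby
# Toy in-memory model of 1 -> N event fan-out across regions.
class Region
  attr_reader :name, :events

  def initialize(name)
    @name = name
    @events = []
  end

  # In the real app this would trigger DOM updates for local clients
  def receive(event)
    @events << event
  end
end

class Mesh
  def initialize(regions)
    @regions = regions
  end

  # An event written in any region reaches all regions, including its origin
  def publish(event)
    @regions.each { |region| region.receive(event) }
  end
end

regions = %w[yyz cdg syd maa].map { |code| Region.new(code) }
mesh = Mesh.new(regions)

# A user in Canada (yyz) flips the switch...
mesh.publish({ type: "lightswitch", state: "on", origin: "yyz" })
```

In the real system the "mesh" is the replicated data store doing the delivery, not an in-process loop, which is where the eventual-consistency caveats come in.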
Does it use some kind of polling on the client side?
Nope. That would be really slow and inefficient! :P
What's the stack then?
The underlying web framework is Ruby on Rails
The interactive parts are built with StimulusReflex and its sister library CableReady
The multi-region deployments are done with Fly.io
The data store is a geographically distributed multi-active KeyDB setup (think multi-region Redis on steroids)
How does it work?
TL;DR: WebSockets and pixie dust.
1) Initial interactions from client -> server happen over a WebSocket with StimulusReflex via Rails's ActionCable.
2) From there, event-data can be pushed into the distributed KeyDB cache-store for the current region (with very fast writes).
3) Data written to a cache instance in one region is propagated to the cache instances in all other regions (1->N) with eventual consistency (it's fast though!). Or, to put it another way: automatic cross-region replication in a mesh topology where each node is both "active" and a "replica".
4) There's a custom pub-sub implementation on top of that, which listens for *key-change events* (keyspace notifications) in each KeyDB instance, and acts on them. This might be incoming "event-data" from another region, for example. Cache keys are customised to include the region of origin.
5) That's then hooked up to CableReady (which allows pushing/broadcasting DOM updates/changes to N clients, from the server side) based on those pub-sub events. In this particular case, the "server-side" in question could be another server on another continent.
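The key-naming trick from step 4 is worth spelling out. This is a sketch of the idea only (the actual key format in the app is an assumption here): by encoding the origin region into every cache key, a listener can tell local writes from replicated ones, and can, for example, skip re-broadcasting events its own region produced.

```ruby
# Hypothetical key format: "events:<origin-region>:<event-id>".
# REGION would be set per-deployment; "iad" is just an example value.
REGION = "iad"

def event_key(origin_region, event_id)
  "events:#{origin_region}:#{event_id}"
end

def origin_of(key)
  key.split(":")[1]
end

# A listener can use this to ignore keyspace events for locally written data
def from_another_region?(key)
  origin_of(key) != REGION
end
```

With that in place, each region's listener reacts only to the events it should, and hands them off to CableReady for the DOM broadcasts in step 5.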
That might sound pretty complicated, but TBH the tools and frameworks in the stack make it surprisingly simple!
Source code on Github
Me on The Bird App
Leastbad's Redis Firehose