
Mastodon doesn't fork off a new process or spawn a new thread for every request, but it's darn close to it.

How did the web evolve past this scalability challenge? It didn't necessarily involve buying a faster computer. The developers of the venerable `nginx` web server famously struck first blood by cracking the so-called "c10k" problem (handling ten thousand simultaneous connections in a single server application).

This happened in the early 2000s, and the nginx server in question was consuming only about 2.5MB of RAM during the load test.
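
The trick behind that kind of number is an event-driven loop over non-blocking sockets instead of a thread or process per connection. Below is a rough sketch of that pattern using Linux's epoll interface through Go's `syscall` package; it is Linux-only, the port and buffer sizes are arbitrary, error handling is minimal, and real servers like nginx implement this in C with far more care.

```go
// event_loop.go — a minimal epoll-based echo server sketch (Linux only).
// One goroutine, one epoll instance, many non-blocking sockets: the
// general shape of the event-driven model that cracked c10k.
package main

import "syscall"

func main() {
	// Non-blocking listening socket on an arbitrary example port.
	lfd, _ := syscall.Socket(syscall.AF_INET, syscall.SOCK_STREAM|syscall.SOCK_NONBLOCK, 0)
	syscall.SetsockoptInt(lfd, syscall.SOL_SOCKET, syscall.SO_REUSEADDR, 1)
	syscall.Bind(lfd, &syscall.SockaddrInet4{Port: 9000})
	syscall.Listen(lfd, syscall.SOMAXCONN)

	// One epoll instance watches every socket.
	epfd, _ := syscall.EpollCreate1(0)
	syscall.EpollCtl(epfd, syscall.EPOLL_CTL_ADD, lfd,
		&syscall.EpollEvent{Events: syscall.EPOLLIN, Fd: int32(lfd)})

	events := make([]syscall.EpollEvent, 128)
	buf := make([]byte, 4096)
	for {
		// Block until some socket is readable; no per-connection thread.
		n, err := syscall.EpollWait(epfd, events, -1)
		if err != nil {
			continue // interrupted by a signal, retry
		}
		for i := 0; i < n; i++ {
			fd := int(events[i].Fd)
			if fd == lfd {
				// New connection: register it with epoll and move on.
				cfd, _, err := syscall.Accept4(lfd, syscall.SOCK_NONBLOCK)
				if err == nil {
					syscall.EpollCtl(epfd, syscall.EPOLL_CTL_ADD, cfd,
						&syscall.EpollEvent{Events: syscall.EPOLLIN, Fd: int32(cfd)})
				}
				continue
			}
			// Existing connection: read what is available and echo it back.
			nr, err := syscall.Read(fd, buf)
			if err != nil || nr == 0 {
				syscall.Close(fd) // closing also removes it from epoll
				continue
			}
			syscall.Write(fd, buf[:nr])
		}
	}
}
```

Because everything runs on one thread, the memory cost of an idle connection is essentially just kernel socket buffers plus whatever per-connection state the application keeps, which is how a loaded server can sit at a few megabytes of RAM.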

@forestjohnson Mastodon is written in #ruby, which is fine for speed of development but far from performant. Big companies have already moved to languages that can easily handle many thousands of connections concurrently using a fraction of the system resources a Ruby app would need. Take a look at #rustlang or #golang.
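
As a rough illustration of the reply's point: in Go the same many-connections problem can be handled with a goroutine per connection, because the runtime multiplexes goroutines onto a small pool of OS threads and an epoll/kqueue-based network poller. This is only a sketch (the port and echo behavior are arbitrary), not how Mastodon or any particular company does it.

```go
// echo.go — a goroutine-per-connection echo server sketch.
// Each goroutine costs a few kilobytes of stack, so tens of thousands of
// idle connections fit comfortably in memory; blocking reads are parked
// on the runtime's network poller rather than tying up OS threads.
package main

import (
	"io"
	"log"
	"net"
)

func main() {
	ln, err := net.Listen("tcp", ":9001") // arbitrary example port
	if err != nil {
		log.Fatal(err)
	}
	for {
		conn, err := ln.Accept()
		if err != nil {
			continue
		}
		go func(c net.Conn) {
			defer c.Close()
			// Echo bytes back until the client hangs up.
			io.Copy(c, c)
		}(conn)
	}
}
```

The per-connection cost here is a small goroutine stack rather than an OS thread or process, which is the resource gap the reply is pointing at.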
