
What's going on right now is a good example of why having sponsors doesn't mean you have an infinite budget.

Once the sponsor pulls out - and they eventually will! - it's always the community that's left to pick up the pieces, and so you have an obligation to ensure that the funding burden is minimal.


@joepie91 We've spent the last decade trying to build a central repository of all the software that has ever existed, instead of making sure packages from different sources could coexist nicely, and this is the result.

@raito @joepie91 Not exactly. There are two things that come to mind:

1. There are a lot of packages in Nixpkgs that don't really form the core part of the software distribution. These wouldn't be part of the main repository of other distributions, and they should probably be packaged separately from Nixpkgs, too. Today, this is enabled by flakes and the flake index, but for more than ten years we worked in the opposite direction.

2. For languages with their own mature package management schemes (e.g. Python, Rust, Haskell, Node) we can derive dependencies as needed instead of maintaining them in a central repository. IIRC we do this with Rust and Node already. Do we need a build of every Python package for multiple versions of Python in Nixpkgs? Surely not. Likewise for Haskell packages and compiler versions.
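As a concrete sketch of point 2: Nixpkgs' `rustPlatform` can already resolve a project's dependency set from its upstream `Cargo.lock` at build time, rather than each crate being packaged individually. The package name, repo, and hashes below are hypothetical placeholders; this is just an illustration of the pattern, not a real derivation.

```nix
# Hypothetical sketch: Rust dependencies derived from the upstream
# lockfile instead of being packaged one-by-one in Nixpkgs.
{ rustPlatform, fetchFromGitHub }:

rustPlatform.buildRustPackage {
  pname = "example-tool";        # hypothetical package
  version = "1.0.0";
  src = fetchFromGitHub {
    owner = "example";           # hypothetical repo
    repo = "example-tool";
    rev = "v1.0.0";
    hash = "sha256-AAAA...";     # placeholder hash
  };
  # All crate dependencies are resolved from Cargo.lock at build time,
  # so no per-crate entries need to exist in Nixpkgs.
  cargoLock.lockFile = ./Cargo.lock;
}
```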

I also strongly agree with the suggestion (made elsewhere) to GC staging builds (at least whenever staging is merged).

@ttuegel @joepie91

1. I'm not convinced by this argument; there's enormous value in having those in the "nixpkgs repository" because of all the automation. I think the other distributions don't do it because they cannot realistically afford it (or don't see the value). If we can afford it, we should let "limitless" growth of nixpkgs be a thing.

2. If packaging were a solved problem… I'd agree, but to me, this is the only sane way to consume software.

Agreed on the suggestion.

@ttuegel I mean, I certainly have my opinions about the monolithic nature of nixpkgs, but that is not really the issue at hand here; the problem is more that the binary cache was built using just about the most expensive hosting option available, under the (wrong) assumption that sponsors would pick up the bill forever.

There are many, *many* ways to host the exact same content at a significantly lower cost.

@joepie91 True, nothing I said matters in the short term (1 month), but beyond that, the project needs to change how its costs scale over time. If we cut the hosting costs in half, but costs continue to double every year, then we'll be right back here in 12 months.

@joepie91 I love how it didn't take long for someone to suggest ipfs 🙃

@joepie91 I get this for people power and work stuff, but if it's infra, I'm not so sure anymore. Decentralized storage is a solved problem. Static page hosting is also available basically for free. Apart from maybe email infra (because Gmail isn't a solution, obviously)

@MTRNord How do you figure that decentralized storage is a "solved problem"?

@joepie91 Well, how old is git? Look at Gentoo's package system. It is able to add git-based sources to the package manager rather easily. Even Ubuntu and Fedora allow self-hosting package repos (though imho their workflow is less user-friendly than Gentoo's)

But to be fair nix does have something like this with flakes as well now right?
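Yes — with flakes, a package can be consumed directly from a git repository without going through Nixpkgs at all. A minimal hypothetical sketch (the `example/sometool` repo and its outputs are made up for illustration):

```nix
# Hypothetical flake.nix consuming a package straight from a git repo,
# without it being part of Nixpkgs.
{
  inputs.sometool.url = "github:example/sometool";  # hypothetical repo

  outputs = { self, sometool, ... }: {
    # Re-export the upstream flake's default package for this system.
    packages.x86_64-linux.default =
      sometool.packages.x86_64-linux.default;
  };
}
```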

So that was what I was trying to refer to. Specifically, the package management. Rust has a similar issue with crates.io imho.

@MTRNord This is not at all what the issue is about, though; this is about the *binary cache*. Which is a massive collection of historical binary builds.

@joepie91 tangentially related, but this is why I try to keep a small hosting footprint for my personal projects. Scaling down under financial constraints would be painful.