extracting wikipedia
So every graphical archive tool handles the tar wrong. Doing it manually works, but then I need enough disk space for the intermediate tar sitting inside the 7zip archive: around 200GB before the final extraction even starts.
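Roughly the manual sequence, as a Python sketch. The archive and directory names are placeholders, and it assumes the `7z` and `tar` command-line tools are on PATH; the intermediate tar is what eats the ~200GB.

```python
import subprocess
from pathlib import Path

# Hypothetical paths; adjust to wherever the dump actually lives.
archive = Path("enwiki.7z")
workdir = Path("extract")
workdir.mkdir(exist_ok=True)

# Step 1: pull the inner tar out of the 7zip archive.
# "7z x" extracts with full paths; "-o<dir>" sets the output directory.
subprocess.run(["7z", "x", str(archive), f"-o{workdir}"], check=True)

# Step 2: unpack the intermediate tar (this is the ~200GB hit).
inner_tar = next(workdir.glob("*.tar"))
subprocess.run(["tar", "-xf", str(inner_tar), "-C", str(workdir)], check=True)

# Only once this finishes can the intermediate tar be deleted.
inner_tar.unlink()
```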
extracted wikipedia
English is somewhere around 247GB, but you need roughly twice that in free space to actually extract any of it. And nothing can handle file listings this big.
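For what it's worth, one way around the listing problem is to stream the tar instead of asking a tool to index it. A minimal Python sketch, assuming a hypothetical path to the intermediate tar: mode "r|" reads the archive as a forward-only stream, so entries are visited one at a time and no full listing is ever built in memory.

```python
import tarfile

# Hypothetical path to the intermediate tar pulled out of the 7zip archive.
inner_tar = "extract/enwiki.tar"

count = 0
# "r|" = forward-only streaming read; members are yielded sequentially.
with tarfile.open(inner_tar, mode="r|") as tf:
    for member in tf:
        count += 1
        if count % 100_000 == 0:
            print(f"{count} entries seen, latest: {member.name}")

print(f"total entries: {count}")
```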