extracting wikipedia
file roller is next up to bat, and I am able to list the /en/ directory within root, but I've been stuck loading the file listing for articles/: a severe CPU bottleneck, with 5GB of disk reads queued up in memory waiting to see Dr. CPU. No more I/O on the disk containing wikipedia for the past... 10 minutes? At least it isn't crashing? A slimegirl can hope.
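For what it's worth, streaming the archive instead of listing it sidesteps the whole problem. Here's a minimal sketch in Python, assuming the dump is a (possibly compressed) tar archive; the filename and the path prefix are stand-ins for whatever the actual dump uses:

```python
import tarfile

# Streaming mode ("r|*") hands you one member at a time instead of
# building the full member list in memory, which is exactly the step
# that GUI tools like file roller choke on at this scale.
# "enwiki.tar" and "en/articles/" are hypothetical names.
with tarfile.open("enwiki.tar", mode="r|*") as archive:
    for member in archive:
        # Skip everything outside articles/ without ever
        # materializing a listing of it.
        if member.name.startswith("en/articles/"):
            archive.extract(member, path="extracted")
```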
extracted wikipedia
English is like 247ish GB, but you need about twice that in free space to actually extract any of it at all, since the archive and the extracted tree have to coexist on disk. Nothing I've tried can handle file listings this big.
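If you want to sanity-check the space before kicking off an extraction, something like this works; the archive size is this post's figure and the mount point is a hypothetical one, not anything the dump prescribes:

```python
import shutil

# Rough pre-flight check: the archive and the extracted tree coexist
# on disk, so budget roughly 2x the archive size. 247 GB is the number
# from this post; "/mnt/wikipedia" is a made-up mount point.
ARCHIVE_BYTES = 247 * 1024**3
needed = 2 * ARCHIVE_BYTES
free = shutil.disk_usage("/mnt/wikipedia").free
if free < needed:
    raise SystemExit(
        f"need ~{needed / 1024**3:.0f} GiB free, "
        f"have {free / 1024**3:.0f} GiB"
    )
print("enough room to extract")
```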