
:sayori_like: downloading wikipedia
:sayori_dislike: extracting wikipedia

extracting wikipedia 

Process engrampa dumped core; the coredump file on disk is 1GB.

Engrampa seems to trigger a segfault just giving me a file listing of the contents of the root folder, not even the "articles" directory.

extracting wikipedia 

File Roller is next up to bat, and I am able to list the /en/ directory within root, but I've been stuck loading the file listing for articles/ with a severe CPU bottleneck and 5GB of disk reads queued up in memory waiting to see Dr. CPU. No more I/O on the disk containing wikipedia for the past... 10 minutes? At least it isn't crashing? A slimegirl can hope.

extracting wikipedia 

So every graphical tool handles the tar wrong. Doing it manually works, but as a result I have to have enough disk space to extract the tar from inside the 7zip archive first. Around 200GB before the final extraction, just to get started.
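
Roughly this, for anyone following along (filenames are placeholders, and I'm assuming the 7z wraps a single tar):

    7z x wikipedia_en.7z      # step 1: drops the inner .tar onto disk, ~200GB on its own
    tar -xf wikipedia_en.tar  # step 2: the actual extraction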

extracted wikipedia 

English is like 247ish GB, but you need about twice that to actually extract any of it at all. Nothing can handle file listings this big.
