http://jr-management.com.au/wp-content/uploads/2015/12/
I want to download every picture from this directory, and a few others as well, but DownThemAll is skipping 99% of the images behind the links. How can I quickly save every picture?
1. you should've googled
2. you should've posted this in /sqt/
3. wget -r --no-parent http://jr-management.com.au/wp-content/uploads/2015/12/
Really, OP? It's the first thing on Google. It takes less than 10 seconds to find the command on Stack Overflow, less time than it took you to create this thread and read the reply. I think you're just that stupid, just download everything manually.
>>57343836
I have no idea what wget -r --no-parent means
>>57343956
and therefore stack overflow is useless
>>57344008
Open Terminal.
Open your application launcher, type terminal, then press Enter.
type:
mkdir wget
cd wget
wget -r --no-parent http://jr-management.com.au/wp-content/uploads/2015/12/
then go to your user folder; you'll see a wget folder containing all of it.
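If you only want the images and not the whole page structure, a variant of the same command can flatten the download and filter by extension. The flags are standard GNU Wget options; the extension list here is an assumption about what the directory actually holds:

```shell
mkdir wget && cd wget
# -r          : recurse into linked pages
# --no-parent : never climb above /2015/12/
# -nd         : don't recreate the site's directory tree locally
# -A          : keep only files whose names match these extensions;
#               HTML pages are still fetched (to find links) but deleted afterwards
wget -r --no-parent -nd -A 'jpg,jpeg,png,gif' \
     http://jr-management.com.au/wp-content/uploads/2015/12/
```

With -nd every image lands directly in the wget folder instead of a nested jr-management.com.au/wp-content/... path.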
>>57344008
How is it useless? They usually explain the commands they post. Don't be a hypocrite who rejects Stack Overflow but blindly accepts every command posted on 4chan. The wget solution will work, but be careful what you type.
>>57343804
hows the drop shipping business OP
Do you even curl bro
>>57343804
>mirror of website
HTTrack
Not him, but how does wget know to open each link on the page and save those pages too, instead of just saving the first page and stopping? And how does it know when it's supposed to save a picture rather than a text file or something? Is that all built into wget itself?
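It's built in, but it isn't magic: wget downloads the first page, parses the HTML for href and src attributes, queues every URL it finds, and with -r repeats that on each HTML file it fetches. It doesn't "know" pictures from text; by default it saves everything, and you narrow it down yourself with -A/-R accept/reject lists. You can watch the link discovery happen with --spider, which traverses without saving (same URL as the OP's):

```shell
# --spider    : follow links and check them, but don't save any files
# -r          : recursive retrieval (parse each HTML page for more links)
# --no-parent : stay inside the /2015/12/ directory
# Each "--" line in the output is a URL wget discovered by parsing a page.
wget --spider -r --no-parent \
     http://jr-management.com.au/wp-content/uploads/2015/12/ 2>&1 | grep -- '^--'
```

So the recursion lives in wget itself; the only decision left to you is which file types to keep.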