


Thread replies: 141
Thread images: 15

File: tar-gzip.png (14KB, 128x128px)
Hello /g/
Let's make a thread about compression.

> What program do you use?
> Which algorithm do you use, and for which type of file?
> What is the best algorithm?
> What about encrypted archives?

For example, I like using tar with
 tar Jcvf archive.tar.bz2 ~ 


For encryption I use gpg-zip.
>>
Never heard of gpg-zip
>>
>>62264696
Usage: gpg-zip [--help] [--version] [--encrypt] [--decrypt] [--symmetric]
[--list-archive] [--output FILE] [--gpg GPG] [--gpg-args ARGS]
[--tar TAR] [--tar-args ARGS] filename1 [filename2, ...]
directory1 [directory2, ...]

Encrypt or sign files into an archive.

It's useful for encrypting an archive without having to
1: make the archive
2: encrypt it
as separate steps; it's all done in one command and wastes no disk space on an intermediate file.
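For instance, a minimal sketch matching the usage above (the recipient address is a placeholder; on newer GnuPG releases the equivalent tool is gpgtar):
$ gpg-zip --encrypt --output docs.tar.gpg --gpg-args "-r alice@example.org" ~/documents
$ gpg-zip --list-archive docs.tar.gpg
$ gpg-zip --decrypt docs.tar.gpg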
>>
>>62264684
>compression: .tar.gz
>encryption: my own program which uses a secure hash function in CTR mode and some group theory for mixing.
>>
xz -9
>>
> compression

PNG, Vorbis, h264, etc.

> archiving

Plain tar or zip with store. No point in wasting CPU time.
>>
>>62264798
>encryption: my own program which uses a secure hash function in CTR mode and some group theory for mixing.

Which language did you use?
>>
>>62264943
PHP
>>
>>62264684
>gzip
>scrypt

>>62264929
>PNG, Vorbis, h264, etc.
all lossy

>plain uncompressed archive
I take it you don't pay for bandwidth
>>
>>62264951

Nope. Fortunately. Fuck data caps.
>>
>>62264752
>tar cf - file file file | gpg --armor --encrypt -r Faggot
>>
>>62264958
I don't have a data cap at home either.
Compression does help make my upload speed less awful though.
>>
.zip - best for compatibility generally
.7z - best for decent compression ratio on Windows
.rar - best for (scene) warez (especially on Windows)
tar.gz - best for compatibility on linux
.gz - best for speed on linux
.Z - best for ancient packages on Unix
.xz - best for compression ratio on linux (xz -9)
.bzip2 - totally useless, never use this
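Rough command-line equivalents for a few of these, as a sketch (dir/ is a placeholder):
$ zip -r archive.zip dir/
$ 7z a archive.7z dir/
$ tar -czvf archive.tar.gz dir/
$ XZ_OPT=-9 tar -cJvf archive.tar.xz dir/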
>>
>>62265006
thanks for this =)
>>
>>62265006
Bzip2 would be good if it weren't so slow. The only way to make it useful is to make it multithreaded which apparently changes the output somewhat so that it's not the same as the reference implementation.

I was messing with 7zip's implementation of Bzip2 and found that it compresses images better than LZMA for some reason but other than that LZMA beats it at everything.
>>
zpaq for storing shit, zip/tar/whateverthefuckisavailable for packing multiple files to send them wherever i need
>>
>>62265006
>tar.gz
tar is not compression
>.rar - best for (scene) warez (especially on Windows)
free unrar implementation works well as does non-free
rar is used in scene for its multipart
>>
>>62265059
>The only way to make it useful is to make it multithreaded


tar -I lbzip2 -cvf archive.tar.bz2 dir/
>>
>>62264983
Or you can use --symmetric instead of --encrypt to make use of a passphrase if the recipient does not have access to the public key of user Faggot.
Correct?
>>
>>62265244
The recipient would need access to the private key of user Faggot, but yes, --symmetric makes it use a passphrase instead.
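A minimal sketch of the passphrase route (dir/ and the filenames are placeholders):
$ tar cf - dir/ | gpg --symmetric --cipher-algo AES256 -o dir.tar.gpg
$ gpg -d dir.tar.gpg | tar xf -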
>>
>>62265261
Thanks, and goodnight.
>>
RAR5
>>
>>62265004

Got 100mbit both up and down here. But I do see your point.
>>
There are cases where you do pay for bandwidth, and compression is interesting there.
>>
tar when i want to bundle
tar.gz if i also want to compress
zip if I'm using a GUI
>>
>>62267397
osx =(
>>
>>62264684
7z
LZMA2
>>
>>62264951
>bandwidth
>implying you backup in the clouds®
>>
>>62265006
>no dar
>no zpaq
into the trash it goes
>>
File: 1501031064583.png (509KB, 558x800px)
>>62264798
>encryption: my own program which uses a secure hash function in CTR mode and some group theory for mixing.
Wow, someone fell for my meme, nice.
You still need to use a MAC though.
>>
>>62265006
>.zip - best for compatibility generally
This is .gz

>.rar - best for (scene) warez (especially on Windows)
Bad for anything


>>62265065
The free unrar is total crap.
>>
>>62268673
This^
Top kek
>>
>>62269110
>Wow, someone fell for my meme, nice.
I don't know what you are talking about, I would never take any advice from 4chan. I chose SHA-2 because it can be implemented far more easily than AES, therefore it's less error prone but is still secure enough according to research papers.
>>
>>62269674
>I chose SHA-2 because it can be implemented far more easily than AES
Chacha20 is easier to implement than both, same for BLAKE and Keccak (though more complex than chacha20).
>>
>>62269712
My main goal wasn't to choose the simplest algorithm but one that has gone through thorough cryptanalysis and is still considered strong. Sponge-based algorithms are relatively new and aren't well tested, therefore I will stick with SHA-2 while I can.
>>
>>62269851
>Sponge-based algorithms
The sponge construction by itself is provably secure. Also, BLAKE and Chacha20 do not use the sponge construction.

>but one that has gone through thorough cryptoanalysis and is still considered strong
All of the ones that I mentioned fit this.
>>
Why hasn't there been any advancement in compression technology for the past 20 years?
>>
>>62269890
It has
>>
>>62269890
They was.
>>
>>62264684
Idk, whatever comes by default on Ubuntu based Linux distros.
>>
>>62269890
Because you're a fucking retard. Retard.
>>
>>62264684
Honestly, I use 7zip, because that's the one I'm most familiar with, and it supports encryption.
>>
>>62264951
PNG is not lossy, and using lossless video compression is retarded. You only have a lossless video source if you own a high-end camera or if you're recording your screen to begin with. I guess you are right about lossy audio compression being retarded, but that's still justified if you put your music on a device with limited space, such as a smartphone.
>>
>>62265006
pbzip2 is the fastest out of any of them and works great.
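For example (dir/ is a placeholder; note that parallel bzip2 output can differ slightly from the single-threaded reference implementation, as mentioned above):
$ tar -I pbzip2 -cvf archive.tar.bz2 dir/
$ pbzip2 -d -p4 archive.tar.bz2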
>>
>>62269890
There has been some advancement. There are some new image formats such as webp or FLIF that are better at losslessly compressing images. There has also been advancement in the world of lossy compression with things like h.265 or webp (it has a lossy and a lossless variant). There will never be any huge advancement though, because you are fundamentally fucked in the ass by the pigeonhole principle.
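For the lossless image side, a quick sketch assuming the cwebp and flif command-line encoders are installed (filenames are placeholders):
$ cwebp -lossless input.png -o output.webp
$ flif -e input.png output.flif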
>>
>>62264684
tar -cf - files/ | xz -9 | openssl aes-256-cbc -out files.bin
>>
What does /g/ use to optimize their PNGs? zopfli here.
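For reference, a typical zopflipng run looks something like this (filenames are placeholders; -m just asks for more iterations):
$ zopflipng -m input.png output.png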
>>
>>62272228
>zopfli
Post a PNG you optimized with zopfli. I'll losslessly reduce it even more.
>>
File: 1503806646841.png (373KB, 667x832px)
>>62272246
>>
>>62264684
If you're compressing your files using Jcvf for arguments, your file extension should be .tar.xz.
>>
File: out.png (362KB, 667x832px)
>>62272304
here you go
$ compare -metric MAE 1504646467153.png out.png /dev/null ; echo
0 (0)

I've additionally removed all the pixels with full transparency set, so it's not strictly lossless since you cannot "undelete" pixels hidden beneath 100% transparency. You can view the difference if you
$ convert [input] -alpha off [output]

on both your image and mine. In mine, pixels with 100% transparency had their RGB reset to black.
I could have shrunk it anyway, even without this trick. The toolset is a kde-dev script (optimizegraphics) plus pngwolf-zolfi at the end.
>>
>>62272485
>pngwolf-zolfi
*pngwolf-zopfli
>>
File: le pelican.png (2MB, 1920x1080px)
>>62272485
btw, removing everything beneath alpha 0 is the default mode for FLIF.
Example: this is an (unoptimized) PNG with hidden info...
>>
File: le lonely FLIF'd pelican.png (104KB, 1920x1080px)
>>62272723
...this is the (unoptimized) output of FLIF after having encoded the previous image to FLIF.
It's visually identical, but...
>>
File: le not so lonely pelican.png (1MB, 1920x1080px)
>>62272752
...this is the first image with the alpha channel removed...
>>
File: le darkasmysoul pelican.png (90KB, 1920x1080px)
>>62272770
...and this is the second image (FLIF output) without alpha.
>>
incredible, some minutes have passed and yet not a single "it's not a pelican, it's a seagull" comment
>>
What is the best way to compress a bunch of Linux ISOs?
>>
>>62272723
>>62272752
>>62272770
>>62272785
This shit terrifies me because I stress over all the PNGs I've compressed that might have had something cool in them.
>>
>>62272446
I think it would be j for bz2 but I'm too lazy to man.
>>62272785
Too bad there's such an atrocious number of dependencies for libsdl2-dev which is required to build the lib for anything that supports viewing flifs.
>>
>>62272881
Sorry I meant to build viewflif.
(libsdl2-dev is a depencency)
>>
Does anyone here remember the 8chan board dedicated to archiving music, movies, and almost every other type of media? I came across it months ago, but I don't remember the name. Basically, the people there thought that most of the content currently accessible online would eventually be taken down because of the upcoming enforcement of copyright laws.
The connection between this thread and that board is that they were also discussing the best ways to compress files, depending on their formats.
>>
>>62273161
/eternalarchive/
>>
Thanks, that's the one.
Here's the thread I was talking about :
/eternalarchive/res/263.html
>>
>>62264684
>not using .rar
I bet you didn't even purchase a WinRAR license. Fucking pleb
>>
>>62264684
redpill me on gz vs bz2 vs xz vs lz4 vs lzo
>>
Timely bump for an interesting thread
>>
>>62264684
>For encryption i use gpg-zip
Why not block device encryption?
>>
File: tests.png (7KB, 327x205px)
>>62264684
For text, usually bz2 but sometimes xz. For some reason bzip2 beats xz in my experience, but ONLY on text files.
I use gpg for encryption; it has built-in support for several compression methods as well.
If you're autistic you can use lrzip which makes for smaller files but is not supported by anything other than itself. Also has zpaq support for supreme autism.
LZO is actually really good, it can compress faster than copying a file but has the worst compression ratio. I use it for when I need to move a large uncompressed file across the internet (to myself).
I don't use much other than those. zip for interacting with normalfags. rar is complete trash, never use it.
For images, I use the following:
>mozjpeg for jpeg
>gifsicle for gif
>optipng for png
Pic related is a test I ran on a large mostly text (I think?) tarball.

>>62273617
>gz
Good compatibility for Linux, most web browsers can open html compressed with gzip
>bz2
Strange middle ground, I only found use for it in text files
>xz
Good compression but takes longer than others
>lz4
Never used it, think I have heard of it
>lzo
Really fast and good for when you're time-limited (either by compression time or upload/download time). I think openvpn uses it.
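If you want to compare these on your own data, a rough benchmarking sketch (assumes GNU time and all five tools are installed; the suffixes are just labels, not the conventional extensions):
$ tar -cf corpus.tar ~/testdata
$ for c in gzip bzip2 xz lz4 lzop; do
      /usr/bin/time -f "%C  %es" $c -9 -c corpus.tar > corpus.tar.$c
      ls -lh corpus.tar.$c
  done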
>>
Does /g/ have any ideas on how to compress ebooks? Optimize PDFs and epubs?
>>
>>62273964
>compress ebooks
>Optimize PDFs
Define "optimizing" please.
qpdf is a tool capable of linearizing and it can perform some lossless optimizations.
ghostscript will entirely rebuild the pdf. This isn't a 100% lossy procedure if the PDF contains jpegs. Plus, there are a few twists about image compression in gs.
exiftool alone may remove some metadata, but you'll have to feed the output to qpdf in order to suppress recovery of the metadata you stripped.
For example, it's possible to linearize and slim down Brian Abraham's pdf in the chinkshite general from 14M to 3.1M with a quick gs+qpdf.
embedded data in embedded jpegs may survive if the pdf isn't entirely reprocessed.
MAT is a tool that (if you compile it from source) can _still_ remove cruft and "anonymize" pdfs; there's a chance it misses something in a defined scenario (which led from a pre-emptive removal of the pdf "anonymization" feature in the MAT version available in the debian repo) and the process isn't lossless.
It's always better to start from the original .ps and from the original images, if any.
for other ebooks, the best route is always to convert to ps and then back into your final format. In the .ps you'll do all the necessary cleaning (most of it would be done by the interpreter/converter itself). Or read MAT's paper and follow a similar approach
For endured compatibility and for archival purposes, PDFA is suggested. A free as in freedom validator is veraPDF.
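A minimal pass along those lines, as a sketch (filenames and the /ebook preset are placeholders; the gs step re-encodes images, so it is not lossless):
$ gs -q -sDEVICE=pdfwrite -dPDFSETTINGS=/ebook -o slim.pdf input.pdf
$ exiftool -all= slim.pdf
$ qpdf --linearize slim.pdf final.pdf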
>>
The reason I use 7z for anything that is not tiny is that it compresses a header containing a list of contents, so I don't have to unpack the whole archive just to know what is inside it.
Are there any other tools that can do that? I would also like something that compresses but also adds an option for some kind of redundancy, so if a couple of bits get flipped I can still recover my data.
>>
>>62273791
>mozjpeg
beware that in some older versions of mozjpeg, jpegtran's process wasn't entirely lossless. At least, not in -fastcrush runs. They've fixed it now, but it would be nice to perceptually compare (i.e. fetching graphicsmagick/imagemagick compare's output) mozjpeg's results with the source before overwriting the source with mozjpeg's results.
If you're compressing pngs to jpegs, guetzli beats smallfry (jpeg-recompress, algo allegedly used in jpegmini).
>>
>>62274291
>Is there any other tools that can do that?
Is there any other tool that does not do that? Tar does it, unzip does it, zpaq does it
>I would also like something that compresses but also adds an option to have some kind of redundancy so if a couple of bits get flipped I can still recover my data.
par2cmdline
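A quick par2cmdline sketch (the archive name and the 10% redundancy figure are just examples):
$ par2 create -r10 archive.tar.xz.par2 archive.tar.xz
$ par2 verify archive.tar.xz.par2
$ par2 repair archive.tar.xz.par2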
>>
>>62274242
>This isn't a 100% lossy
*This isn't a 100% lossless
>which led from
*which led to
>>
>>62274330
>Tar does it, unzip does it, zpaq does it
I mean keeping a header containing the contents separately. Fair enough, tar does that, but if you run it through gz for example it becomes useless, as you have to decompress everything in order to get to it.
Zip does it, but I see no advantage over 7z. The 7z format is far more modern and supports many more compression algorithms.
>zpaq, par2cmdline
Never heard of them, will look it up. Thanks anon.
>>
>>62274363
zpaq is also pretty resistant to corruption (i.e. some bit flips won't ruin the entire archive). It's also de-duplicating, incremental and extremely efficient, at the cost of being slow in the compression phase.
One of the problems with compression for archival purposes is that anything that isn't plain zip (or zpaq) will suffer a lot from flipped bits unless you have some parity lying around. rar notoriously can add a "recovery record", but rar isn't free as in freedom and its max compression is lower than 7z or xz + parity.
Note that parity can be added to single archives, but better yet to collections of archives. i.e. you can create backup DVDs/BDs with dvdisaster (adds Reed-Solomon correction codes at the fs level) and parity (at file level). You can create some RAID-alike scenarios with par2cmdline alone. par3 is a proposed upgrade of par2 but it's not ready yet, and par2 is rock solid and ancient enough to be considered well-tested.
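A sketch of the incremental usage, assuming Mahoney's zpaq 7.x command-line tool (paths are placeholders):
$ zpaq add backup.zpaq ~/data -method 5     # slow, best ratio
$ zpaq add backup.zpaq ~/data               # later runs only append what changed
$ zpaq list backup.zpaq
$ zpaq extract backup.zpaq -to restored/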
>>
>>62274461
>plain zip
*plain tar
>>
>>62274363
>>62274461
I read a bit about zpaq and it looks REALLY good. Far more features than I need, but looks much better than 7z in the majority of cases.
Still trying to picture in my mind how I can build a robust backup/archival solution using a combination of snapraid, zpaq, par2cmdline, etc.

I would love to be able to 'merge' various types of media (hd, dvds, cloud, etc) into a single volume and assign the files inside it different levels of "importance" which control how strongly they should be protected against loss. Also be able to manually assign each file a score on how readily accessible it should be, and so on.
Would be real neat to have something like that working, but it would take far too much effort. Probably will settle for something simpler that I can do using existing tools.
>>
>>62274330
This, par2 is good shit, if you're backing up to optical just fill the remaining space with parity data. That way if you scratch the fuck out of the disk or get rot you're still good to go
>>
>>62272196
Bad idea

>no kdf
>cbc
>no mac
>aes
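One way to sidestep most of that without rolling anything yourself is to let gpg handle the symmetric step, since its passphrase mode runs the key through S2K derivation and adds an integrity (MDC) check. A sketch, with files/ as a placeholder:
$ tar -cf - files/ | xz -9 | gpg --symmetric --cipher-algo AES256 -o files.tar.xz.gpg
$ gpg -d files.tar.xz.gpg | xz -d | tar -xf -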
>>
>>62273617
From fast with bad compression to slow with good compression. This ordering is for compression speed only:
lz4
gz
bz2
xz
>>
>>62264684
>witch type of file
>witch
>>
>>62272228
ECT.
It's basically zopfli, but better and way faster. It even works on jpg files (uses mozjpeg's jpegtran I think).
>>
>>62275018
Fuck off fatso
>>
>>62275179
Go drink some bleach, shit stain.
>>
File: _.jpg (150KB, 1200x849px)
>>62274916
I'd rather use dvdisaster to protect the DVD at the filesystem level and then create a separate DVD with all the parity
e.g. 4 DVDs + 1 DVD containing 25% parity of the others - now you can lose any DVD and still recover everything (RAID4-alike)
another option (more costly, space wise) would be to distribute 4 DVDs in 5DVD+partial parity of the whole array (RAID5-alike); you still can lose up to one DVD but you'll need to add additional parity, so rather than parity=25% you'll need sum(x=1)->∞ 100*(25/100)^x = 33% (and given that you won't increase the size of the dvds, you'll end up will less data saved); adding parity for the whole batch on each medium is less convenient
I'd keep dvdisaster in the background because of pic related
BlockHashLoc ( https://github.com/MarcoPon/BlockHashLoc ) may or may not serve a similar purpose

>>62275159
it's a collage of various different tools and it's worse than pngwolf-zopfli iirc.
>>
>>62275465
>so rather than parity=25% you'll need sum(x=1)->∞ 100*(25/100)^x = 33% (and given that you won't increase the size of the dvds, you'll end up will less data saved);
well kek, I went retardo while considering a different scenario (parity spread on each medium without spreading data over the fifth DVD as well); loss of parity won't impair recovery if you're going to lose 1/5 of the actual data rather than 1/4 (and 25% is sum(x=1)->∞ 100*(20/100)^x = 25%)
time to sleep
>>
>>62267397
>Using macOS
>not using keka
You're doing it wrong
>>
don't you thread on me
>>
Just found out yesterday that tar -xvf will overwrite files with the same path/filename without confirming RIP
>>
>>62264684
ilikecompresionsomuchthatidontevenusecomasorspaceslel
>>
rar is objectively best compression algorithm, though linux fags will never use it cause "muh freedom"
>>
>>62278220
You're making me so angry. STOP!
>>
>>62269674
>SHA-2 because it can be implemented
Are you using a library? If no, why not?

(not that it matters, but your decryption program may be prone to side-channel attacks if it's remotely accessible - crypto is hard yo)
>>
>>62264684
friendly reminder that your UID and GID names get stored unless you specify
--numeric-owner

>tfw your tar crafted as pedo:loli gets investigated by a fellow sysadmin

also, just use
-cavf
and forget all the compression-related syntax
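Putting those two together, something like (the path is a placeholder):
$ tar --numeric-owner -cavf backup.tar.xz ~/stuff    # -a picks the compressor from the .tar.xz suffix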
>>
File: 10gb.png (54KB, 651x396px)
>>62264684
Gotta use zpaq. Everything else is for normies.
>>
>>62278220
>objectively best
Horse shit, how can you say this?

It sucks.
>>
>>62279663
If you are using zpaq, you probably want to use lrzip; one of the options it has is to use the zpaq backend with the compression-enhancing prefilter from rzip.
>>
>>62279695
I'll check it out anon.
>>
>>62279695
>compression-enhancing prefilter from rzip
man, you don't really want to enhance zpaq. It already does all the enhancing. lz77 is already in the package.
>>
>>62272827
you don't want to compress something containing compressed data.
>>
>>62279677
It combines a great compression ratio with fast compression and extraction. It also doesn't have any limit on archive size like zip's ridiculous 4 GB.
>>
>>62280095
>>62279663
rar sucks
>>
File: over 9000 hours in pinta.jpg (5KB, 320x240px)
y u show no love for DAR
>https://en.wikipedia.org/wiki/Dar_(disk_archiver)
>http://dar.linux.free.fr/doc/FAQ.html
>http://dar.linux.free.fr/doc/usage_notes.html
- {differential,incremental,decremental} backup
- _any_ inode/file forks/ACL on any OS
- per-file compression (!) with gzip, bzip2, lzo, xz or lzma. An individual can choose not to compress already compressed files based on their filename suffix.
- Optional Blowfish, Twofish, AES, Serpent, Camellia encryption.
- Optional public key encryption and signature (OpenPGP)
- hash file (md5, sha1 or sha512) generated on-fly for each slice
- no limit in max size of the archive
- can detect corruption in any part of the archive, not just in the headers (slice header, archive header, saved file's data, saved file's EA, table of contents)
- integration with par2/par2cmdline
- full autismo logo
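A rough dar invocation along the lines of the docs above (paths are placeholders; exact option spellings vary a bit between dar versions, so check the linked usage notes):
$ dar -c monday_backup -R /home/me -zxz:9 --hash sha512
$ dar -t monday_backup                 # test the archive
$ dar -x monday_backup -R /tmp/restore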
>>
>>62280095
Realistically speaking, how often are you creating 4 GB archives, and for what purpose?
I just can't get a realistic scenario; e.g. a BD rip won't be losslessly compressed in any meaningful way
>>
>>62269131
is gzip (.gz) the same as zip (.zip)?
>>
>>62279663
>decompression takes about the same time as compression (i.e. forever)
I dunno anon.
>>
>>62280745
They use the same compression algorithm. On individual files they would behave exactly the same. Zip is an archive and compression format while Gzip is just a compression format with no archiving at all. You have to use an archiving format like tar with gzip to make it compress multiple files.
>>
>>62280745
No, it's better.
>>
why does the ancient .tar format (fucking TAPE) still exist? what's wrong with packing folders?
>>
>>62282976
Because of how space is allocated on disk, every single file uses at least one block, anywhere from 512B to 64KB, even if the file holds barely any data. With archiving formats you can pack all of these files into one file.
>>62283274
Note: 512B to 64KB is the allocation unit, not a maximum. The filesystem allocates space in blocks of between 512B and 64KB; if a file exceeds 512B it uses two 512B blocks, and the same goes for bigger block sizes. Smaller block sizes are more space-efficient but more intensive to access.
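You can see the slack yourself with stat (a sketch; the exact numbers depend on your filesystem):
$ printf x > tiny      # a 1-byte file
$ stat -c 'apparent size: %s B   allocated: %b blocks of %B B' tiny
A typical ext4 setup reports 8 blocks of 512 B allocated, i.e. one 4 KiB block for that single byte.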
>>
>>62283274
>>62283338
not really the reason for it, cpio was and is around with similar features and does not block everything like tar. cpio is still used in rpm packaging nowadays
tar simply got used more. we even have a standardized pax tool that should handle both tar and cpio. but people still use tar. we even have dar which is superior in every single way, and yet people use tar.
people are lazy. even GNU/Linux and power users. they often throw a tantrum against normies who simply won't understand why $alternative is superior, yet they fall for the same logic. Even worse, they're smug. look at how any pushed upstream changes shakes everyone's titties.
>>62282976
you can pack folders in multiple ways, handling corner cases (soft/hard links, attributes, inodes, resource forks, special devices, allowed chars) in different ways. When you have multiple ways to do something, none of them particularly better, the masses crave to settle for an old standard behaving in a predictable way.
>>
>>62273964
If we are talking scanned ebooks containing mostly text and not many images, you should avoid pdf entirely, and use djvu instead.
>>
>>62283896
also: both cpio and tar are nowadays just an intermediary step before a compressor tool (gzip, bzip2, xz, lzip), making the entire "block size" feature of tar entirely moot.
>>
>>62283989
Unironically, html is vastly more versatile and more useful for serving compressed images in modern formats.
pdf is an archiving standard, html is everywhere, djvu is... a deja vu. it hasn't undergone any revision in the last decade and it's essentially abandonware used by a niche of aficionados. it didn't reach a critical mass adoption when it mattered.
>>
>>62280095
7z is fast too if you turn off solid archive, and rar is dogshit slow too when you leave solid archive on. As for compression speed, they are pretty much the same, in my experience. Modern versions of zip don't have that filesize limit btw.

>>62280313
I put my porn in an encrypted 7z, and I have more than 4 GiB. Obviously it doesn't give great compression ratios, I use it because I'm too lazy to learn to use a different program for encryption.
Many cloud storage sites have an option to download entire folders as zip, as it would be far too tedious to download each file individually, and obviously those zips can get bigger than 4 GiB (and they actually do, as that anon was wrong about the 4 GiB limit of zip).
There are many niche applications too, like compressing source code or large datasets, or archiving a bunch of wavs.
>>
>>62284133
>djvu is abandonware
There are tools for creating and reading djvus on any major platform (even fucking android), and djvu files are 10 times smaller than pdfs on average (2-5 MiB), because it was specifically designed for compressing images with text, while modern image compression algorithms are only ~twice better than jpg, and some of them have arguably worse support. It is also widely used on libgen.
>>
>>62284146
>I put my porn in an encrypted 7z, and I have more than 4 GiB. Obviously it doesn't give great compression ratios, I use it because I'm too lazy to learn to use a different program for encryption.
y u no use veracrypt
or "good enough" shit for your purposes like encfs
>Many cloud storage sites have an option to download entire folders as zip, as it would be far too tedious to download each file individually, and obviously those zips can get bigger than 4 GiB (and they actually do, as that anon was wrong about the 4 GiB limit of zip).
I can't imagine why one would go through all the gimmick to move entire jiggabytes of cuck movies from the cloud® to win10®
but ok, generally speaking, I can imagine such tar-alike uses of zip64 (a .zip doesn't scare people as much as an obscure .tar)
>There are many niche applications too, like compressing source code or large datasets, or archiving a bunch of wavs.
ok for the large data assets, for wavs you want flac, or alac, or .wv, or .ape - but not zip pls
>>62284211
>tools exist
>libgen
do you hear them? yeah, it's the sound of thousands of fucks given, flapping away in the darkness
don't get me wrong, it was superior to pdf in performing a specific operation, but it didn't get traction.
>modern image compression algorithms are only ~twice better than jpg
"No". With the power of html one could use webp and with [trigger warning] javascript one can use bpg or flif. there's nothing really peculiar to .djvu that can't be entirely ported and bettered in html.
>>
>>62284385
>y u no use veracrypt
I actually do use full disk encryption (dm-crypt with LUKS), the reason I have my porn in an encrypted 7z is so that people with access to my computer, such as friends when I bring my laptop somewhere for some reason, can't discover folders upon folders of porn. I know it's not a likely scenario, it's mainly for peace of mind.
>or "good enough" shit
7z is literally that, good enough. Adding files to the archive doesn't take long, neither does decrypting, so I went with that.
>I can't imagine why one would go through all the gimmick to move entire jiggabytes of cuck movies from the cloud® to win10®
Not movies, I have ~7 gigs of music backed up on mega.nz (I'm not retarded, I make full disk backups with rsync to an external HDD, but I have secondary backups of important shit on mega).
>for wavs you want flac
I imagined a scenario where someone wants to keep a lossless archive of their music somewhere, but they don't want to be able to play it, like if they have a laptop with a 128 GB SSD so they store lossy versions on that, but keep lossless copies on an external HDD in case they want to convert it to another lossy format in the future or whatever. In that case, I feel like you could achieve better compression ratio if you used something like 7z or rar or whatever, because lossless music compression algorithms have limited ways of reducing filesize, as they are designed to be played back and to be seekable, but this really is just an assumption, I just did a test and 7z actually underperformed flac.
>>
If I take a 1 GB video and put it into a tar.gz multiple times, can I eventually end up with an easy-to-share 1 MB tar.gz file?
>>
>>62284385
I guess I was wrong about modern image compression, but you've gotta admit that throwing a bunch of images into a gui program like djvu solo and pressing two buttons takes a lot less initial effort compared to making custom htmls for your books (you can of course reuse pretty much the same html for future books though), and you'd have to work even more to have common features of any book reader, like switching continuous mode on or off, having single and dual page layout, and being able to jump to page X.
>>
>>62285340
Yes, I believe in you.
>>
>>62285318
>I feel like you could achieve better compression ratio if you used something like 7z or rar or whatever, because lossless music compression algorithms have limited ways of reducing filesize, as they are designed to be played back and to be seekable, but this really is just an assumption
eh. that assumption is quite incorrect.
>I have my porn in an encrypted 7z is so that people with access to my computer, such as friends when I bring my laptop somewhere for some reason, can't discover folders upon folders of porn
I don't get why someone would invade your privacy that much, or how they would get a chance to do that, with the obvious exception of coppers.
one could set an unmount timeout in fstab/crypttab for a luks volume or, even better, go full pedo: a usb killswitch modified so that if you remove an unrelated usb drive, e.g. if you disconnect a usb mouse, the volume is unmounted immediately. it could even be configured as a udev rule (the volume could be decrypted using a keyfile and mounted in /media/shamefurdispray when the mouse is connected back to a specific port)
for full akbar mode, recompile the kernel without printk
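A very rough sketch of the udev-rule approach mentioned above (the vendor/product IDs, mount point and mapping name are placeholders; a real setup would want something sturdier than a one-liner):
# /etc/udev/rules.d/99-killswitch.rules
ACTION=="remove", SUBSYSTEM=="usb", ENV{ID_VENDOR_ID}=="1234", ENV{ID_MODEL_ID}=="abcd", RUN+="/bin/sh -c 'umount /media/shamefurdispray && cryptsetup close shamefur'"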
>>
What compression compresses a file the absolute smallest it can possibly be compressed?
>>
>>62285691
there are two things wrong with your post: it's pretty clear you didn't make any effort to read the thread, and you didn't describe an archival scenario (hardware constraints, type of file, purpose of the backup... e.g. a deduplicating incremental backup can "compress" yottabytes down to a few KB if there are similar or nearly identical copies saved already)
>>
>>62285803
Won't deduplicating incremental backup compressing yottabytes to a few KB result in file corruption?
>>
>>62285645
>I don't get why someone would invade your privacy that much and how could get a chance to do that
I guess they could randomly find my porn folder if they used my laptop, or just asked me what that folder is if they sat next to me while I browsed through the filesystem, and then I would have to make up some excuse for not clicking on the folder if they push the question, and I'm really bad at bullshitting. With an encrypted 7z I can just say "I dunno lol, must be some shit I downloaded years ago", because my HDD has a bunch of random old crap on it anyway.
It really would be easier to just use an encrypted filesystem image though, because I could mount it and see thumbnails as opposed to just looking at a list of files in the terminal. I guess I'll do it some day, maybe right now.
>>
File: cloppety.gif (1MB, 275x252px)
>>62285820
>Won't deduplicating incremental backup compressing yottabytes to a few KB result in file corruption?
Not with the correct spell
>>
>>62285866
>I guess they could randomly find my porn folder if they used my laptop, or just asked me what that folder is if they sat next to me while I browsed through the filesystem
Do you fap to your porn folder with your friends and relatives sitting next to you?
>>
>>62287361
Cool
>>
>>62269890
they be
>>
>>62284385
Now this is shitposting.
>>
>>62273759
I assume because you need disk space for a block device.
>>
>>62290603
>doesn't encrypt the file
>doesn't have a block device where to save it

>encrypts the file
>has a block device where to save it

makes sense
>>
>>62290745
I see your point, but you have to preallocate space for a block device.

If it's a large archive, it often takes space away from the main partition.
>>
>>62290870
What the fuck are you even on about?
>>
>>62290870
>main partitions and encrypted block device live in parallel universes
>>
>>62290920
>>62290904
it's a convoluted use case where the archive is being transferred to linear media
>>
RIP thread