>use clonezilla
>make image of SSD (180GB used)
>image is only 80GB
how the fuck can this be? can clonezilla compress it by that much?
>>59509974
id also like to know this
bump
>>59509974
>deletes metadata that can be rebuilt
>deletes CIA and NSA ghost bytes
>compresses everything with lossless
yes
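the lossless part is no meme. clonezilla pipes the partition image through a lossless compressor (gzip by default, other options in the settings), and low-entropy regions inside "used" space shrink to almost nothing. quick sketch with python's zlib — illustrative numbers, not clonezilla's actual pipeline:

```python
# Sketch: lossless compression on low-entropy data. zlib here; clonezilla's
# own pipeline differs, but the principle is the same.
import zlib

# 16 MiB of zeros, like zeroed regions inside allocated space
data = b"\x00" * (16 * 1024 * 1024)
packed = zlib.compress(data, 6)

print(len(data), len(packed))  # compressed copy is a tiny fraction of the input
```

real disks aren't all zeros, but OS files, logs, and cluster slack are compressible enough that 180GB used -> 80GB image is plausible.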
>>59509974
Empty space fa.m
>>59510161
space USED mate, USED. there is no empty space in the 180GB
>>59510189
There is now . . .
what drive did you save the image on?
also >>>/g/sqt
>>59510287
one of my other internal HDDs
>>59510298
probably just compressed everything it could. check your settings.
Did you defrag the hard drive?
I know fragmentation can affect how space gets allocated.
As for the missing 100 GB I have no idea
>>59509974
millions of small files take more space than one big file.
clusters and shit
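the cluster math, since anon is right about this one: the filesystem rounds every file up to a whole cluster, so "used" space overstates the actual data, and the padding compresses away in the image. 4 KiB is an assumed cluster size here — the real value depends on your filesystem:

```python
# Sketch: on-disk usage rounds each file up to a whole cluster.
# 4 KiB is an assumed cluster size; check your actual filesystem.
CLUSTER = 4096

def on_disk(size, cluster=CLUSTER):
    # ceiling division to whole clusters, then back to bytes
    return -(-size // cluster) * cluster

# a million 1 KiB files: 1 GiB of data occupies 4 GiB of clusters
actual = 1_000_000 * 1024
used = 1_000_000 * on_disk(1024)
print(actual, used)
```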
>>59509974
do you have a ton of duplicate files?
beyond that, there are tons of inter-file dependencies in system binaries (e.g., Windows keeping dozens of versions of libraries archived for whatever dependency reasons) that wouldn't be caught by file or block-level dedup but could be collectively compressed by a smarter utility.
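if you want to test the duplicate-file theory yourself, a minimal hash-based scan — sha256 of whole files, fine for a rough count, slow on big trees:

```python
# Sketch: group files by content hash to spot exact duplicates.
# Hashes whole files; good enough for an estimate, not a dedup tool.
import hashlib
import os
from collections import defaultdict

def find_dupes(root):
    by_hash = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # unreadable file, skip it
            by_hash[digest].append(path)
    # keep only hashes seen more than once
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

note this only catches byte-identical files — the "dozens of library versions" case needs a compressor with a big window to exploit, since the copies differ slightly.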
>>59511383
>inter-file dependencies
shit, meant redundancies