So, am I missing something?
Do these people really believe they can display 10-bit content with a standard video card on a crappy TFT monitor?
Or are those people all on Quadro/FirePro with an Eizo monitor?
Do they know that the video source (RAW) is 8-bit?
Why the madness of making the releases with this dumb bit depth?
That was never the point of using Hi10P for anime. 10-bit provides better compression because you can encode a smooth gradient and have it be dithered during playback instead of dithering it before encoding.
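If you want to see what that actually buys you, here's a quick numpy sketch (toy numbers, nothing to do with any real encoder):

```python
import numpy as np

# A smooth gradient spanning a narrow luma range, stored at high
# precision (a stand-in for the studio master before quantization).
width = 1920
grad = np.linspace(100.0, 104.0, width)  # only ~4 8-bit steps of range

# Plain 8-bit rounding: the gradient collapses into a handful of flat bands.
banded = np.round(grad).astype(np.uint8)
print("unique 8-bit values:", np.unique(banded).size)  # ~5 visible bands

# Dither (random noise added before rounding): the steps get hidden as
# fine noise, which reads as a smooth gradient to the eye.
rng = np.random.default_rng(0)
dithered = np.round(grad + rng.uniform(-0.5, 0.5, width)).astype(np.uint8)
print("unique dithered values:", np.unique(dithered).size)
```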
>Why the madness of making the releases with this dumb bit depth?
Well, it's not that I frequent this or other anime boards that often, or watch much anime, but the last time this kind of video played on my old computer (3 years ago) it was really choppy, with no noticeable quality difference from what I remember, and today I could only find Hi10p versions of the anime I was looking for.
I see, so it dithers away the banding, like with a Photoshop gradient?
Watch out for Taneli Daiz
Fansubbing's public enemy #1
He is the one mainly responsible for...
MX Media dropping honorifics
Leaking OreImo from ANN
Making ElitistFags quit
Killing fansubbing in 2010, 2011, 2012 and 2013
Hostile takeover of Commie
The development of the H.264 10-bit profile
Attempted assassination of Coalgirls and Sakura!Fish
Successful assassination of ZeroYuki, Grumpy Jii-san, and Ken-sama
Successful assassination of the authors of ‘Kaze no Stigma’, ‘MM!’, and 'Zero no Tsukaima'
Shutting down the NyaaTorrents tracker
Humiliating Mitsuhiro Ichiki
Shutting down lolipower.org
Shutting down megaupload.com
Removing loli and rape from TVTropes
Ruining a Q&A session with Hadena's Head of Public Relations
Introducing Miura to iDOLM@STER, and constantly distracting Yoshihiro Togashi
The development of the future H.265 10-bit profile
Preventing KyoAni from adapting the continuation of FMP
Pushing to ban lolis in Japan
Hindering the development of VP9
Holding subs hostage and forcing people to use Commie
Allowing Naruto threads on /a/
Forcing Subdesu-H to hardsub and watermark hentai
Imposing an anime embargo on Australia
Suggesting Rotoscoping for Aku no Hana
Convincing Anno to drop 2.0 subplots in favor of action to show off 10bit compression
Sabotaging official encoders' efforts in order to make hobbyists look good
Simultaneous kill of Nyaa, MAL, and HorribleSubs
Formation of ISIS
Giving Iran nukes in exchange for total control of anime in Iran
Assassination of Robin Williams, faked as a suicide
Giving rockets to Gaza
Assassination of the Bulldozer of Fallujah
4chan's post length limit
Copyrighting the body of a slut
His ultimate goal is to kill anime.
However, you can stop him. Spread this list!
>So, am I missing something?
Yes. I'll just paste you what I've written on the topic in the past:
Let's talk about the medium we're working with a bit first.
Banding is the most common issue with anime. Smooth color surfaces are plentiful, and consumer products (DVDs/BDs) made by "professionals" have a long history of terrible mastering (and let's not even get into the subject of what QTEC does to video quality). As such, the fansubbing scene has a long history of video processing in an effort to increase perceived quality by fixing the various source issues.
This naturally includes debanding. However, due to the large smooth color surfaces, you pretty much always need to use dithering in order to have truly smooth-looking gradients in 8-bit. And since dithering is essentially noise to the encoder, preserving fine dither and not having the H.264 encoder introduce additional banding at the encoding stage meant that you'd have to throw a lot of extra bitrate at it. And remember that we're talking about digital download end products here, with bitrates usually varying between 1-4 Mbps for TV 720p stuff and 2-12 Mbps for BD 720p/1080p stuff, not encodes for Blu-ray discs where the video bitrate is around 30-40 Mbps.
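And to see why that dither eats bitrate, here's a crude illustration with zlib standing in for a video encoder (a real encoder is far more sophisticated, but the direction of the effect is the same):

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
h, w = 720, 1280
grad = np.tile(np.linspace(100.0, 104.0, w), (h, 1))

banded = np.round(grad).astype(np.uint8)
dithered = np.round(grad + rng.uniform(-0.5, 0.5, grad.shape)).astype(np.uint8)

# Flat bands compress to almost nothing; dither is per-pixel noise,
# which no entropy coder can squeeze much.
print("banded:  ", len(zlib.compress(banded.tobytes())), "bytes")
print("dithered:", len(zlib.compress(dithered.tobytes())), "bytes")
```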
Because of the whole "digital download end products" thing, banding was still the most common issue with anime encodes, and people did a whole bunch of tricks to try to minimize it, like overlaying masked static grain on top of the video (which I used to do, and which, incidentally, I've later seen used in professional BDs as well - though they seem to have forgotten to properly deband it first). These tricks worked to a degree, but usually came at a cost in picture quality (not everyone liked the look of the overlaid static grain, for example). Alternatively, the videos just had banding, and that was it.
Over the years, our video processing tools have grown increasingly sophisticated. Nowadays the most-used debanding solutions all work in 16-bit, and you can do a whole bunch of other filtering in 16-bit too. Which is nice and all, but ultimately you'll have to dither it down to 8-bit and encode it, at which point you run into the issue of gradient preservation once again.
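In toy form, that pipeline looks something like this (a crude box blur, not what f3kdb or any real debander actually does):

```python
import numpy as np

def box_blur_1d(row, radius):
    # Crude running-average blur; real debanders are much smarter
    # (edge masks, limited pixel deltas, etc.), but the idea is the same.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(row, kernel, mode="same")  # ignoring edge effects

banded8 = np.repeat(np.arange(100, 105, dtype=np.uint8), 256)  # 5 flat bands

# 1. Promote to 16-bit working precision (multiplying by 257 maps 255 -> 65535).
work16 = banded8.astype(np.float64) * 257.0

# 2. Filter in high precision: the steps smooth out into a true gradient.
debanded16 = box_blur_1d(work16, radius=64)

# 3. ...but to encode at 8-bit you must come back down, and without
#    dither the rounding would just re-introduce the same bands.
rng = np.random.default_rng(0)
out8 = np.round((debanded16 + rng.uniform(-128.5, 128.5, debanded16.shape)) / 257.0)
out8 = np.clip(out8, 0, 255).astype(np.uint8)
```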
Enter 10-bit encoding: With the extra two bits per channel, encoding smooth gradients suddenly gets a lot easier. You can pass the 16-bit debanded video to the encoder and get nice and smooth gradients at much lower bitrates than what you'd need to have smooth dithered gradients with 8-bit. With the increased precision, truncation errors are also reduced and compression efficiency is increased (despite the extra two bits), so ultimately, if you're encoding at the same bitrate and settings using 8-bit and 10-bit, the latter will give you smoother gradients and more detail, and you don't really need to do any kind of processing tricks to preserve gradients anymore. Which is pretty great!
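For what it's worth, here's roughly what the handoff looks like from a script's point of view. This is a sketch only: the resolution, CRF, and output name are made up, it assumes a 10-bit-capable x264 on PATH (older 10-bit builds were separate binaries without --output-depth), and it assumes little-endian raw input:

```python
import subprocess
import numpy as np

W, H, FRAMES = 1280, 720, 48  # made-up test clip

# Hypothetical invocation: pipe raw 16-bit 4:2:0 frames into x264 and
# let it take the video down to a 10-bit encode at the input stage.
enc = subprocess.Popen(
    ["x264", "--demuxer", "raw",
     "--input-depth", "16", "--input-csp", "i420",
     "--input-res", f"{W}x{H}", "--fps", "24000/1001",
     "--output-depth", "10", "--crf", "16",
     "--output", "test.264", "-"],
    stdin=subprocess.PIPE,
)

# Feed it a smooth 16-bit gradient as a stand-in for a real filter chain.
y = np.tile(np.linspace(16 * 257, 32 * 257, W), (H, 1)).astype(np.uint16)
u = np.full((H // 2, W // 2), 128 * 257, dtype=np.uint16)
v = u
for _ in range(FRAMES):
    enc.stdin.write(y.tobytes() + u.tobytes() + v.tobytes())
enc.stdin.close()
enc.wait()
```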
Now, obviously most people don't have 10-bit screens, so dithering the video down to 8-bit is still required at some point. However, with 10-bit this job moves from the encoder to the end user's player, which is a much nicer scenario, since you no longer need to throw a ton of bitrate at preserving dither baked into the video. The end result is that the video looks like one of those high-bitrate dithered 8-bit encodes on an 8-bit (or lower) screen, but without the whole "ton of bitrate" actually being required.
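The player's side of the deal is basically this (a minimal sketch; real renderers use fancier error diffusion or ordered dithering):

```python
import numpy as np

def to_8bit_for_display(frame10, rng=np.random.default_rng()):
    # frame10: uint16 array holding 10-bit values (0..1023).
    # Map to 8-bit by dividing by 4, adding +-0.5 of noise so the two
    # discarded bits become invisible dither instead of visible banding.
    v = frame10.astype(np.float64) / 4.0 + rng.uniform(-0.5, 0.5, frame10.shape)
    return np.clip(np.round(v), 0, 255).astype(np.uint8)
```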
So the bottom line is that even with 8-bit sources and 8-bit (or lower) consumer displays, 10-bit encoding provides notable benefits, especially for anime. And since anime encoders generally don't give a toss about hardware decoder compatibility (because hardware players are generally terrible with the advanced subtitles that fansubbers have used for a long time), there really was no reason not to switch.
>CR's subs were amazingly good
Even HorribleSubs' were better. Just extract the HS .ass and retime it for the BD raws like everybody else. It takes a minute or two per episode if you know what you're doing, and you can do it while the next one is downloading.
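If you've never done it, the shifting part is trivial to script; here's a minimal sketch that moves every Dialogue timestamp in an extracted .ass by a fixed offset (the filenames and offset are made up, and real BD retiming is often done per chapter rather than with one global shift):

```python
import re

def shift_ass(path_in, path_out, offset_s):
    """Shift all Dialogue timestamps in an ASS subtitle file by offset_s seconds."""
    ts = re.compile(r"(\d+):(\d{2}):(\d{2})\.(\d{2})")  # H:MM:SS.CC

    def shift(match):
        h, m, s, cs = (int(g) for g in match.groups())
        total = max(0.0, h * 3600 + m * 60 + s + cs / 100 + offset_s)
        h, rem = divmod(total, 3600)
        m, s = divmod(rem, 60)
        return "%d:%02d:%05.2f" % (h, m, s)

    with open(path_in, encoding="utf-8-sig") as f_in, \
         open(path_out, "w", encoding="utf-8") as f_out:
        for line in f_in:
            # Naively shifts anything timestamp-shaped on Dialogue lines,
            # which in practice is just the Start and End fields.
            if line.startswith("Dialogue:"):
                line = ts.sub(shift, line)
            f_out.write(line)

# e.g. shift_ass("ep01_HS.ass", "ep01_BD.ass", -0.917)  # hypothetical offset
```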
>>CR's subs were amazingly good
>Even HorribleSubs' were better.
Thanks, now it's clearer.
At first I thought that since the source was already 8-bit, there was no point in increasing the bit depth - like converting an .mp3 to a .wav to gain quality. But considering the re-encoding process, with its inevitable loss of quality, this makes more sense.
>10-bit provides better compression
Which would be useful if the people encoding anime into 10-bit were actually using it. But when the 10-bit subtitled version is nearly identical in size to the 8-bit raw, it's fucking pointless.
I was trying Kawaii Codec Pack, but I can't stand that lame background.
I'll try CCCP; x86 or x64? Any difference, or is x86 the safest bet?
1080p was possible (9600M GT), but every now and then it started to skip frames.
With my new machine, I think I chose the worst anime for configuration testing, Hana to Alice.
>last time this kind of video played a lot choppy on my old computer (3 years ago)
My 2006 HP prebuilt with a E4300 C2D and 945G iGPU could play it just fine 3 years ago
Literally how can you fuck up worse than that?
Thinking about it, it could be that the thermal paste was almost gone; I changed it recently. Reaching the CPU is a problem with laptops, and I didn't want to risk my only computer.
Using a vanilla MPC may be another cause.