Why aren't we using this yet?
I'm not an expert.
From lurking the threads I remember that:
1) apparently it's good for low-end encodes, such as the ones that live-action movies typically receive.
2) The 40% space saving is a myth that only holds at very low bitrates. When you have grain, the file size goes through the roof.
3) Daiz said not to use it yet.
>What kind of toaster are we talking about?
A shitty one from 2007 that's only had its graphics card replaced after the old one died.
You can't run 1080p 10-bit h.264 on your toaster either but it's still used.
>A C2D should still be enough for 1280x720 HEVC
Hell no. Lower end C2Ds can't even do some 1280x720 10-bit h.264 encodes.
>That's what's called a toaster nowadays
I tried some HEVC archives and they run with some lag. You'll be fine.
No one gives a fuck about 1080p until it actually starts airing in that format.
It is a toaster. I can't watch youtube 720p 60 fps videos without it stopping every few seconds.
Which makes absolutely no sense. I can play some old games at 1050p at over 60 fps, but can't watch a youtube video of the same game at 720p 60 fps.
>No one gives a fuck about 1080p until it actually starts airing in that format.
Depends on the source. Usually 720p is a downscale. Not by much, but it still is. Which means that 1080p, even being an upscale, is better.
2) Not any more so than x264. x265 can handle higher bitrates, so qualityfags will make bigger files.
3) When you can encode x265 in hardware at the same rate as x264, we'll see an explosion of x265 adoption. What Daiz says is irrelevant.
is this the daiz circlejerk metashitposting thread?
>It is a toaster. I can't watch youtube 720p 60 fps videos without it stopping every few seconds.
try using mpv instead. Just download mpv and put youtube-dl in the same directory, then do `mpv <video URL>`
and I guarantee you that 1080p60 won't be a problem anymore. I know because I have a single-core 1.7 GHz Atom and videos run fine in mpv, while ANY kind of in-browser youtube stream gets around 10 fps.
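For reference, a minimal mpv.conf sketch for weak hardware (`ytdl-format` and `hwdec` are standard mpv options; the 720p cap is just an assumption for a toaster, raise it if your machine keeps up):

```
# mpv.conf -- sketch for weak hardware; values are illustrative
ytdl-format=bestvideo[height<=?720]+bestaudio/best  # ask youtube-dl for 720p max
hwdec=auto                                          # use hardware decoding if the GPU supports it
```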
The point is that x264 currently looks better than x265 at high bitrates.
While x265 looks better at low bitrates, that kind of shitty quality should be reserved for streamfags; almost all current BD encodes won't benefit from switching to x265 until it can deliver good quality at high bitrates.
Yeah, well, unless you've got a GTX 960 your computer will need quite a lot of power to decode x265, which hurts the quality even more, since on a toaster you won't be able to run high-end upscaling algorithms at the same time anymore.
The GTX 960 is currently the only GPU on the market with HEVC hardware decoding (yep, not even the 980 Ti can do it). The new generation of GPUs launching soon will fully support it, though, with a hardware decoder on the cards.
That's really not my point, I never said it wasn't possible on a toaster. My point is that without a hardware decoder you use up a lot of system resources, and those resources could be better spent on an upscaler like NNEDI, for instance.
Tell me he's joking.
Though, Coding Tree Units are a pretty interesting primitive for animation, since so much of the scene is static. H.265 should have much better compression ratios than H.264 (for standard non-action scenes) if encoders can actually exploit this; in that respect, I would have thought a SoL like Dagashi Kashi would be a better first test case.
Once CR switches to HEVC I bet they're going to use that new Protected Video Path DRM baked in all new hardware. Horriblesubs probably won't be able to rip that one for a while if ever. Just letting you guys know. Oh yeah, right, nobody cares until it goes into effect and fucks you in the ass.
>Protected Video Path DRM
If you're talking about the HDCP implementation in Windows Vista, that hasn't been relevant since... fuck, Windows Vista.
Besides, even if somehow they magically make unbreakable DRM, I'm sure Commie will still be around subbing camrips.
because h265 was apparently designed for nvidia's overpriced 900 meme line
i wouldn't trust nvidia not to push something into their drivers to fuck up h265 decoding as soon as their 1000 line is out, just like they're doing to vidya with gameworks
This one is a lot scarier.
Thankfully, HDCP is cracked, so capturing the output is still a valid technique, but it means that rather than ripping, you have to play the whole episode and record it...
Don't they say that you are allowed one digital copy per purchase of a physical copy? If that's the case, why do they keep trying to stop me from ripping the disc that I fucken paid for?
Should I buy this, /a/?
HEVC 10bit fixed function hardware decoding for lowest CPU usage, 75W so no external PCIe power connector needed
Because they don't have to respect that right.
You are supposed to be allowed to back up your media.
But, cracking DRM is illegal, so if the DRM blocks you from making backups....
If someone took a company to court and demanded to be able to make backups, they might win, but they couldn't just crack the DRM to make the backups.
(Assuming burgerland laws)
Also, you could certainly make a backup of the encrypted, DRMed-up raw files, but good luck playing those.
So ripping those fancy new 4K online-activation blurays to an iso is perfectly legal, but good luck ever playing that iso, even if you burn it to a disc.
So you probably wouldn't win the lawsuit, since technically you can make backups, and they have more lawyers.
And since you'd be going up against the media industry, it'd be a crazy expensive lawsuit too.
The system is about to collapse. Give it a few more decades.
The only real problem is, we won't acquire balance even then. The rich will still be filthy rich, and nothing will really change other than that nothing works anymore unless you are filthy rich.
Is encoding time actually relevant in the context of fansubs? Like do current encodes take multiple hours to make?
I always assumed it was relatively quick. I'll admit I've no personal experience though so could be a dumb assumption.
There is a three-way tradeoff between CPU cycles spent, bitrate, and visual quality.
Put simply: HEVC partially achieves its superior visual quality by letting you usefully burn even more CPU cycles on various new analysis steps.
So if you want good visual results on a small file size you need to crank up the settings to spend more CPU. Hardware encoders are equivalent to the lower end of the tradeoff spectrum where you get higher bitrates for the same quality.
So to get encodes done in a timely manner you need some expensive multicore or even multi-socket CPUs.
If you tried it on an i3 with high quality settings it would take hours.
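To put rough numbers on the bitrate side of that tradeoff (a back-of-the-envelope sketch; the 24-minute episode length and the bitrates are just illustrative assumptions, not measurements):

```python
def size_mib(bitrate_kbps: float, minutes: float) -> float:
    """File size in MiB for a given average video bitrate and duration."""
    return bitrate_kbps * 1000 / 8 * minutes * 60 / 2**20

# a 24-minute episode at a typical-ish fansub bitrate...
print(round(size_mib(3000, 24)))  # 515 (MiB)
# ...and what the oft-claimed ~40% HEVC saving would leave you with
print(round(size_mib(1800, 24)))  # 309 (MiB)
```

The arithmetic is why "same quality, lower bitrate" is the whole selling point: the saving only materializes if the encoder (and the CPU time you feed it) actually delivers equal quality at the lower rate.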