>It's still young
>translation: Because I haven't heard much about it or used it, it's young and bad
Bruh, it's been around since like 2009 and has become one of the best ray-tracers around. It's much more feature-rich than Arnold and even LuxRender. It can utilize all the GPUs and CPUs in your system, and on your networked computers. It even naturally does light dispersion (splitting into colors). And many 3D packages have begun adopting it.
I still don't understand the model behind this product.
Is it going to be free for everyone to use?
Is it going to be a paid-for plugin?
Is it something that companies have to pay for themselves to get it in their packages?
Looks interesting, though. I am looking forward to a GPU-accelerated Arnold-like.
It's already integrated in 3DS Max and Substance Designer, you don't pay extra for it in either of them. It all comes down to the software company, whether they choose to license and integrate it themselves. The plugins Nvidia is providing will likely cost a bit of money, considering that the beta tests have a license expiry date. But that doesn't mean for sure it will cost money, and if it does, it will be decently priced at least.
> It can utilize all the GPUs and CPUs in your system, and on your networked computers.
Which renderer *can't* do that? That'd be pretty shit-tier. No problem doing this with e.g. cycles.
> It even naturally does light dispersion
Many other renderers can do that as well. Cycles is the only one I can think of that can't do that, but e.g. luxrender, vray, octane, mitsuba, ... can.
Not that I think anybody gives a shit, really. If light dispersion is somehow essential to telling your story, you can just map it to the wall in five minutes with a colored light(-map). Not only does that work anywhere, it's also gonna be easier to control and way faster to render.
>Which renderer *can't* do that
No other production renderer currently does distributed GPU rendering across GPUs on a network. They'll do CPUs, yeah, but that's nothing special.
>Many other renderers can do that as well
Only a few do it, basically the ones you listed, and maybe Maxwell. LuxRender's implementation is complicated and doesn't work with all of the other features. Iray's is a single option you turn on and it just works with all your materials/lights.
We're talking about photorealism here breh, any ray-tracer is good enough if you're just trying to tell a story.
Any renderer that can render on GPU and CPU can be used to render distributed across a network on GPUs and CPUs. Worst-case situation is that you have to do a few minutes of automation work to move over the scene data, set up a seed and kick off the jobs. Ain't no thing to set up e.g. cycles to render across a bunch of different machines, using the GPU on some, using the CPU on some, using both on some, whatever you feel like.
It doesn't matter how many samples each device in your network gets done; in the end you can always weight them all into one image, and that image will be as if you rendered on a single machine whose sample count is the sum of all the samples your machines rendered.
Is it going to be as nice as using Tractor? No, but it gets the job done for sure, and it works with any renderer (basically all renderers can be started/controlled by scripts).
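The weighting trick is just a sample-count-weighted average of the per-machine renders. Here's a minimal sketch (the function name and the assumption that every machine reports its sample count are mine, not any renderer's API):

```python
import numpy as np

def merge_renders(images, samples):
    """Merge renders of the same scene from different machines.

    Each machine's image is weighted by how many samples it rendered,
    so the result is equivalent to a single render with
    sum(samples) total samples.
    """
    images = np.asarray(images, dtype=np.float64)   # shape: (machines, H, W[, C])
    weights = np.asarray(samples, dtype=np.float64)
    # weighted sum over the machine axis, normalized by total samples
    return np.tensordot(weights, images, axes=1) / weights.sum()

# e.g. one box did 1 sample/pixel, another did 3 samples/pixel:
# merged = merge_renders([render_a, render_b], [1, 3])
```

As long as each machine uses a different RNG seed, the merged image converges exactly like a single longer render would.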
> Only a few do it, basically the ones you listed, and maybe Maxwell.
> We're talking about photorealism here breh
So, what you're saying is... basically every renderer (sans cycles) there is that has a focus on photorealism can do it? I mean, how many more are there? Indigo can also do it, Arion can do it, mental ray can do it, keyshot can do it, and now that's starting to get pretty obscure...
Yeah, renderman can't do it, but that one doesn't focus on realism.
Pretty good. Scales well across my servers with low-cost "gaming"-grade GPUs.
It still lacks a working proxy system, but it can handle 6K renders pretty well if you know what you are doing.
Lmao, nice try troll, what is that, fucking Nearest Neighbor interpolation?
This was likely only a minute in Iray.
There was no clamping involved there, anon... The fireflies seen in the full-scale render are only 1 pixel big, so when you scale down they get blended with the surrounding colors and simply become more natural-looking noise.
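The firefly-averaging effect is easy to demonstrate with a plain box downscale (just an illustration; whatever filter Iray or the image viewer actually uses is an assumption here, but any averaging filter behaves similarly):

```python
import numpy as np

def box_downscale(img, factor):
    """Downscale a 2D image by averaging non-overlapping factor x factor blocks."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# a single 1-pixel firefly of value 100 in an otherwise uniform 1.0 patch
patch = np.ones((4, 4))
patch[0, 0] = 100.0

small = box_downscale(patch, 4)
# the firefly gets averaged with its 15 neighbors:
# (100 + 15 * 1) / 16 = 7.1875 -- a mild bright speck instead of a blowout
```

That's why fireflies that scream at you at full resolution read as ordinary grain once the render is scaled down.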