Hey guys,
What's the best way to compare CPU and GPU computing power? When I look up FLOPS, the gap is so large that the number doesn't seem very representative.
I'm trying to work out how efficiently computational fluid dynamics can be solved on the GPU instead of the CPU. Right now I'm getting speedups of around 200%, but I have no baseline to compare that to...
Depends on the program buddy
Are you using fluent?
Solidworks?
>>56180431
nah for VFX... using Houdini
The FLOPS numbers are all over the place... (50 PFLOPS vs 2 TFLOPS). Maybe I'm mixing up double and single precision?
CPUs are a few powerful processors.
GPUs are thousands of weak processors.
Arbitrarily parallelizable problems do better on GPUs.
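That tradeoff can be sketched with a toy throughput model. All the numbers below are made up for illustration (they're not real hardware specs), and it assumes a perfectly parallel workload with zero memory-transfer overhead:

```python
import math

# Toy model: few fast cores (CPU) vs many slow cores (GPU).
# Core counts and per-core rates are invented, illustrative numbers.
def run_time(work_units, cores, units_per_core_per_sec):
    # perfectly parallel workload; ignores transfer and scheduling overhead
    return math.ceil(work_units / cores) / units_per_core_per_sec

cpu = run_time(1_000_000, cores=8, units_per_core_per_sec=100.0)
gpu = run_time(1_000_000, cores=4096, units_per_core_per_sec=10.0)

print(cpu)  # 1250.0 seconds
print(gpu)  # 24.5 seconds -- wins despite 10x slower cores
```

Even with each GPU core modeled as 10x slower, sheer core count wins once the problem parallelizes cleanly. The moment the workload has serial sections or heavy host-device transfers, this model stops holding.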
>>56180478
Yes, I know. What I need is a number to compare the computing power of the two. It's for a research paper, so I need an absolute baseline to compare against...
Or is it impossible (or just not representative at all) to compare the two with a single number?
>>56180478
/thread
>>56180544
I really don't think it's possible.
The best you could do is try a bunch of different scenarios and compare them, not just graphics workloads. There are CPU-heavy games, like StarCraft.
>>56180571
see
>>56180544
>>56180451
>>56180630
?
>>56180668
>nah for VFX... using Houdini
>nah for VFX
>VFX
But yeah, assuming that the VFXshit isn't retarded, you just need something that can solve matrices really quickly. While that sort of thing can usually be somewhat accelerated by a GPU, it's typically CPU-bound.
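The "solve matrices really quickly" kernel can be sketched in a few lines. This is naive Gaussian elimination with partial pivoting, illustration only — real fluid solvers use sparse/iterative methods (e.g. conjugate gradient), and this toy version is exactly the kind of code that does NOT map well to a GPU:

```python
# Naive dense linear solve: Ax = b via Gaussian elimination.
# Toy sketch -- O(n^3), dense, sequential; not how production solvers work.
def solve(A, b):
    n = len(A)
    # build an augmented matrix, working on a copy of the inputs
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # partial pivoting: swap in the row with the largest entry in this column
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # eliminate everything below the pivot
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back substitution
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 5, x + 3y = 10  ->  x = 1, y = 3
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```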
>>56180707
Go ahead and tell me that the fluids in pic related don't look awesome.
>>56180786
But I want to find out just how retarded the VFX stuff is... And I can only do that if I know what it would have to deliver to not be retarded...
Something like:
- perfectly GPU-optimized software takes 1h to calculate
- on the CPU it takes 3h
-> so the potential speedup is 300%, but the VFX software only reaches 200%
What I need is a way to get those 1h and 3h numbers
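Getting those numbers is just wall-clock timing of the same workload on each device, then taking the ratio. A minimal sketch — the workload here is a stand-in (a toy elementwise op), since driving an actual GPU needs something like CUDA/OpenCL that isn't shown here:

```python
import time

def speedup(t_baseline, t_candidate):
    """Speedup as a ratio: 3.0 means 3x faster (the '300%' above)."""
    return t_baseline / t_candidate

def timed(fn, *args):
    """Wall-clock time of one call, in seconds."""
    start = time.perf_counter()
    fn(*args)
    return time.perf_counter() - start

def saxpy_cpu(a, x, y):
    # toy stand-in workload: elementwise a*x + y; swap in your real solve here
    return [a * xi + yi for xi, yi in zip(x, y)]

n = 200_000
x = [1.0] * n
y = [2.0] * n
t_cpu = timed(saxpy_cpu, 2.0, x, y)  # the "3h" number, in miniature

# The worked example from the thread: GPU-optimal 1h vs CPU 3h.
print(speedup(3.0, 1.0))   # potential: 3.0
print(speedup(3.0, 1.5))   # what the VFX software reaches: 2.0
```

In practice you'd run the timed section many times and take the minimum or median, pin the CPU run to a known thread count, and include host-to-device transfer time in the GPU measurement, or the comparison flatters the GPU.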