I've seen plenty of videos on how to install the water block on the card, but I still need to install the water tubing and the reservoir. Any suggestions or examples on where the reservoir should go?
Pic related, I stole it from a guts thread on /g/. I want that setup for my gpu.
get a closed loop system?
reservoir location would depend heavily on the case, right? What case are you going to use?
When I see the extents of PC cooling systems today, it seems almost like a cry for help. We are reaching the practical limits of silicon and this is the last, desperate push to squeeze the remaining 1% out of it.
Well, we're not really, because you can just buy a Xeon; they just can't afford to.
It's more like "squeezing the very limits out of a chip that wasn't good enough to be packaged as a Xeon".
remember water cooling is for show only
it has no performance benefit, so it's really wasting money; don't expect it to suddenly increase your performance
a closed loop for the cpu would work, but for the gpu it's really pushing it. it's cheaper to invest in a better gpu
if you've got money to waste and like how it looks, then i say go for it
That depends. I want my shit to be quiet when it's running at 100%, and water cooling is the best way to do that. No more or less performance than air cooled, just trading money for quiet.
the limits of silicon are in the 6-7nm area, and we aren't there yet.
as for what we can do, it really depends; clock-for-clock performance may improve in ways other than just die shrinking.
the moment you stack a cpu, heat becomes a massive fuck-off issue. though if we get enough multithreading solutions, i could see the traditional cpu dying off in favor of a fucking big, but slower overall, cpu stack... hell, even make the cpu a legacy item that's just there for the sake of backwards compatibility, and use a completely different instruction set that is built around multithreading.
dude, a closed loop still sucks dick compared to a well-made custom loop.
you want my tips... don't do water cooling. it's almost never worth it unless you have autism levels of "it needs to be quiet". there is almost no benefit to it on today's cpus or gpus; they run cool enough that a hot oc will throw errors for other reasons before you ever hit a thermal limit.
now in that pic, you have to note that the person is, i believe, a fucking retard. see the way his gpus are set up? those two connections in between them? he fucked that gpu over hard and killed almost all the flow that setup has. you put one in the top, it flows through the system, then it empties out on the opposite side; what he did pushes almost all the coolant out of the gpu area altogether. this person is fucking retarded.
also, note the cpu and ram area. there is NO REASON AT ALL TO COOL RAM. but to add onto stupid, he made one of the tubes too short so it pinches a bit. i honestly can't tell how he has it set up, but you want to push water through the case, not pull it out. i honestly can't see the pump though, so this retard may be pulling it through the case.
reservoir -> pump -> cpu -> rad -> gpu -> rad -> reservoir
but like i said, you are only doing this for the sake of lower operating temps and noise...
for my money i would never touch a water-cooled anything, and would instead get a $90-110 air cooler. it performs damn near the same as water cooling, but is FAR less of a bitch to clean, and FAR less risky for the system.
>if we get enough multithreading solutions, i could see the traditional cpu die off in favor of a fucking big, but slower overall cpu stack... hell even make a cpu a legacy item that's just there for the sake of backwards compatibility and use a completely different instruction set that is based around multi threading.
Every computer already has the item you describe: it's called a "GPU".
That it hasn't supplanted the CPU in the decade it's had to do so is evidence that most problems do not parallelise linearly. It's not like the games industry lacks talent or hasn't been trying.
no, even that would get too hot if you stacked it. when I say slower I mean slow enough that it can be passively cooled while still going... sure, a gpu is many small dumb cpus, but that's all it is, dumb cpus; you could not feed normal computing instructions through it.
It's a bit hard to explain what I mean.
if 1 cpu core is a genius, then a gpu is 1000 fucking retards. what i'm talking about isn't stacking thousands of retards, i'm talking about stacking fuckloads of geniuses and just clocking them so low they can be passively cooled.
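the "clock them low for passive cooling" idea has some rough math behind it: dynamic power scales with capacitance times voltage squared times frequency, and since voltage has to drop as you back the clock off, power falls roughly with the cube of frequency. quick python sketch; the 100 W / 4 GHz numbers are made-up illustrative values, not any real chip's specs, and the f^3 scaling is only a common rough approximation:

```python
# Dynamic power ~ C * V^2 * f. If voltage scales roughly linearly
# with frequency (rough approximation), power scales roughly as f^3.
def dynamic_power(base_power_w, base_ghz, target_ghz):
    scale = target_ghz / base_ghz
    return base_power_w * scale ** 3

# Illustrative only: a hypothetical 100 W core at 4 GHz, clocked to 2 GHz.
print(dynamic_power(100, 4.0, 2.0))  # 12.5 W -- plausibly passive territory
```

halving the clock cuts power to an eighth under this approximation, which is why a big stack of slow cores is thermally thinkable while a stack of fast ones isn't.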
When you do dumb shit like rendering graphics, you have thousands of pixels to calculate in parallel; the job is simple, everyone knows what they have to do, the storage for it can be read in parallel, and everyone is happy.
As soon as you need to do pretty much anything else it's not as easy; most of the time the operations depend on previous values.
Even when the task is parallelisable, there is a limit to how much extra cores actually help. The classic example is matrix multiplication: to calculate a single position you need an entire row and an entire column of values, and eventually it takes longer to part out the data and get everything organised than to actually do the calculation you wanted. After about 8 or 12 cores it starts taking longer because of the overhead.
Massively multi-core architecture isn't new; we had systems with stupid numbers of cores in the old days, but they died out because it's very difficult to keep track of operations and share resources effectively.