Have we hit the maximum power draw from a wall socket for computer power supplies?
American current is 120V at 60Hz and most houses allow for about 15 amps per circuit (correct me if I'm wrong) so this means:
watts = power factor (1) × 15 A × 120 V = 1800 W
Of course the wiring and breakers in a house will never be perfect, and there will usually be other things drawing from the circuit besides the PSU (lights, TV, ceiling fan, etc.). So if you had a 1600 watt PSU in your computer pulling its maximum draw (which is completely possible: a 4P board with four 140 watt 16-core Opterons, 32 sticks of RAM, and four Titan X GPUs gets you to about 1600 watts), you couldn't have anything else on that circuit without tripping the breaker?
As PC parts become more power-efficient (the Titan X draws only 250 watts while being one of the most powerful consumer GPUs on the market, while the GTX 690, the market leader about two years ago, used roughly 50 watts more), does anyone here think we will ever need more than 1600 watts, or is that the peak wattage of consumer PSUs for the foreseeable future?
Before anyone else says anything, I'm just talking about home usage. Of course more powerful PSUs exceeding 2000 W can be purchased, but they are usually for server environments where the available amperage from the wall would be much higher. There have been a few "home use" PSUs at 2000 watts and higher, but I have never found one that isn't either complete shit or dependent on two separate plugs on two separate circuits in your house.
Pic related: an external 2000 watt PSU. (Legitimate) reviews say it is completely terrible, and one reviewer had to plug the two inputs into outlets in separate rooms to avoid tripping a breaker.
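The breaker math above can be sketched quickly. The 1600 W PSU and 250 W side-load figures are just the thread's example numbers, not measurements:

```python
# Sketch of the OP's circuit math: whether a big PSU plus other
# loads trips a typical US 15 A breaker. Values are illustrative.

VOLTS = 120.0        # nominal US outlet voltage
BREAKER_AMPS = 15.0  # typical per-circuit breaker rating

def circuit_capacity_watts(volts=VOLTS, amps=BREAKER_AMPS):
    """Theoretical maximum draw on one circuit."""
    return volts * amps

def trips_breaker(loads_watts, volts=VOLTS, amps=BREAKER_AMPS):
    """True if the combined load exceeds the breaker rating."""
    return sum(loads_watts) / volts > amps

print(circuit_capacity_watts())    # 1800.0 W
# 1600 W PSU at full tilt plus a 250 W TV on the same circuit:
print(trips_breaker([1600, 250]))  # True: 1850 W > 1800 W
print(trips_breaker([1600]))       # False: the PSU alone just fits
```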
I HIGHLY doubt a typical American house is wired for 15 amps.
In Australia a bog standard house connection is 75A @ 240Vrms = 18,000W
If you had less than 2 kW then you wouldn't be able to run things like washing machines, dryers, etc.
if anything we'll see more people putting in dedicated circuits for their PCs, since most people obviously run them longer than 3 hours and you don't want to fry anything
I don't see anyone upping the power on these because they don't want to risk anything
I worry about this sometimes since I have my computer and audio system on the same circuit.
Is it possible to tell the draw of this?
I've only ever blown circuits when running space heaters though.
> "15 amps per circuit"
> "per circuit"
Obviously your entire house is going to be wired for much more than 15 amps, but /each circuit/ is only going to be able to provide a certain amount, unless you don't use circuit breaker panels for the different areas of your house in Australia...
>That is some meme shit, right there.
IEC plugs should be rated at 10 A max, and it needs two separate circuits - presumably isolated (it actually looks like two power supplies jammed together). This is a fucking terrible idea in a domestic environment: plug in one side, something goes wrong, and you're standing there with a live plug in the other hand?
Srsly, shit like this should be banned, just unneeded idiocy. In general though, if you want to run unnecessary shit like 2 kW gaming rigs, you upgrade your electric supply or go three-phase, with proper fucking plugs, etc. - at least then you're not fucking around with domestic-rated shite like IECs.
He means 15 A circuits - your house may be 75 A total (I think most murrican houses are 100 A, dunno) - but that's split into circuits at 15-20 A max, unless special (20-40 A) for the kitchen, heavy-duty appliances, etc. This thing quite cleverly manages to exceed 15 A, thus the two circuits. In general, electric cars will probably mean a lot of people have to upgrade their wiring (Teslas charge at 50 A @ 240 V, for example) - but again, that's an extreme example, and they (Tesla) at least use decent plugs.
This is why American supply is stupid.
Typically your house has split-phase; if you knew what you were doing you could run it across both legs like cookers etc. do. Then you could get >3 kW out of it.
I have seen people piggyback a second PSU onto the primary to run peripherals; you can do some logic with the power-good signal too if you want.
Nothing to stop you putting in a separate breaker for just your PC, it's common for new heavy loads like electric car charging points.
Superior UK here, typical circuit does 32A at 240v
That's quite a nice amp you've got there. If you look at the back of it (one of the pictures on the page you linked shows it), there's a sticker that says "1800 Watts Max", but the amp will only ever draw at most about as much power as the connected speakers are rated for. For example, with three 50 watt speaker towers connected, you won't be using more than about 150 watts at any given point. The 1800 watt rating just means you shouldn't exceed 1800 watts' worth of connected speakers. I don't think you need to worry unless you're running a 1000+ watt computer at 100% load at the same time as you're driving floor-to-ceiling speakers and a couple of huge subwoofers at maximum volume. Overall, consumer speakers don't use much power unless you're running giant subwoofers.
Obviously this post went way over my head.
I thought he was talking about computers requiring more power than was available. If a computer uses more than one wall socket can provide in your current setup, that's not a hard limit or anything; you just need to wire things up differently, like you both said. I assumed that went without saying because it's so trivial there's no point talking about it. I thought we were talking about a hard limit on power draw, not some arbitrary soft limit well below it, never mind
If someone with a high-wattage gaming computer were dedicated enough to change how the house was wired, they could of course install higher-amp circuit breakers, after verifying that the existing wiring and outlets could handle it.
However, in a lot of situations (think apartments or really old houses) that kind of work is either not possible (because you don't own the building) or cost-prohibitive (because the wiring is so old all of it would need replacing). It's not as simple as hard limits vs. soft limits: of course there is enough power coming to everyone's house to run even the most ridiculous gaming computer; the issue is getting all that power to a single power supply without a problem.
Overall, I don't think this will be a huge issue in the future, since I can't imagine even a top-spec gaming computer using more than 1600 watts in a home setting, and parts are getting more efficient every release cycle. There haven't been many consumer CPUs that use more than 140 watts (only a few server chips and AMD's two completely ridiculous 220 watt AM3+ offerings), and the newest top-of-the-line GPU only uses 250 watts, a figure that has been declining for the past few years.
I don't know if it helps, but when designing circuits for domestic use you apply a principle called diversification. Basically, if you add load to a circuit gradually, you can run quite a lot of total load compared to the nominal rating. Switching large loads in and out causes big current spikes that can blow a circuit or cause damage over time, degrading connection points and switches and making them more prone to failure.
For example, a heating element's resistance is low when cold, so switching a 2.7 kW water heater in causes a large initial current spike; as the element heats up, its resistance rises and the current settles down. The initial spike on a 240 V system might exceed 20 amps or so, but once it's running you could switch in further small loads in sequence without overloading the circuit. Big water heaters and the like are usually on separate circuits in a correctly designed installation.
Domestic ring mains for sockets are usually designed with this in mind: the end user is expected to use the property in a normal manner, adding load gradually throughout the property, instead of plugging 3 kW into every single socket on a 32 A rated circuit and switching them all on at once.
Modern motherboards are pretty good at regulating power consumption for energy-saving purposes, so they don't present the full rated load to the circuit the moment they switch on; they ramp up gradually as needed until they reach maximum wattage. That might warm up some wiring somewhere, but it isn't causing arcing, fat sparks, and other shit that throws breakers or risks fires.
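A rough sketch of that switch-on spike, assuming (purely for illustration) that the element's cold resistance is half its hot resistance - real elements vary:

```python
# Illustrative inrush estimate for a resistive water heater. A heating
# element's resistance is lower when cold, so the switch-on current is
# higher than steady state. The 0.5 cold/hot ratio is an assumption
# for illustration, not a datasheet value.

def steady_current(power_w, volts):
    """Steady-state current once the element is hot: I = P / V."""
    return power_w / volts

def inrush_current(power_w, volts, cold_ratio=0.5):
    """Estimated switch-on current with a colder, lower-resistance element."""
    r_hot = volts ** 2 / power_w   # hot resistance from P = V^2 / R
    r_cold = r_hot * cold_ratio    # assumed colder element resistance
    return volts / r_cold

print(steady_current(2700, 240))   # 11.25 A steady state
print(inrush_current(2700, 240))   # 22.5 A at switch-on, matching
                                   # the "might exceed 20 amps" figure
```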
Why is this not a Europe vs. US thread yet?
3680 W per outlet master race.
> Why is this not a Europe vs. US thread yet?
in the US, you can use two plugs and get 240 V between them, just by making sure the outlets are on opposite legs of the split-phase supply. So a $5 cable adapter gets you 3600 W, easy.
More common in my experience is a 20A breaker. NEC dictates sizing such that the planned load is only 80% of the rating, giving you the 16A wall socket rating.
That gives you 1920 W to work with, which on its own should be more than enough for any sane computer.
There's more math involved if you take the periodic nature of voltage and current into account, but I don't feel like working through it right now.
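The derating math works out like this; 0.8 is the NEC continuous-load factor mentioned above:

```python
# NEC-style 80% continuous-load derating: a breaker should only carry
# 80% of its rating for continuous loads (3+ hours), so the usable
# wattage is breaker amps * 0.8 * volts.

def continuous_limit_watts(breaker_amps, volts=120.0, derate=0.8):
    """Usable continuous wattage on a circuit after the 80% derating."""
    return breaker_amps * derate * volts

print(continuous_limit_watts(20))  # 1920.0 W on a 20 A circuit
print(continuous_limit_watts(15))  # 1440.0 W on a 15 A circuit
```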
The glaring issue here is that that's a retarded amount of power for a single home computer to be drawing.
If you can afford the ridiculous $3000+ PC needed to exceed the limits of a typical 120V/15A plug, you can certainly afford to have a dedicated outlet installed for it.
>I can get 6kw from my outlet using 240v 25a setup over 12awg
I have no idea where you live, but if it's the US, 12 AWG is under no circumstances to be used for circuits exceeding 20 A according to the NEC. You'd need 10 AWG for that (which is rated for 30 A).
(The wire ampacity table shows ratings above 20 A for 12 AWG for all insulation types, but that headroom is mostly there so you don't get fucked when derating.)
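A simplified lookup consistent with the figures quoted here; the real NEC table varies with insulation temperature rating and derating conditions:

```python
# Rough copper-wire ampacity lookup matching the breaker-sizing figures
# above (12 AWG -> 20 A max, 10 AWG -> 30 A). Simplified: the actual
# NEC table depends on insulation type and installation conditions.

AMPACITY = {14: 15, 12: 20, 10: 30, 8: 40, 6: 55}  # AWG -> max amps

def min_gauge_for(amps):
    """Smallest adequate wire size (highest AWG number) for a breaker size."""
    for awg in sorted(AMPACITY, reverse=True):  # try thinnest wire first
        if AMPACITY[awg] >= amps:
            return awg
    raise ValueError("load too big for this table")

print(min_gauge_for(20))  # 12
print(min_gauge_for(25))  # 10: 12 AWG tops out at 20 A
```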
We use 2.5 mm² wire in the walls, which is close to 13 AWG.
Our breakers are rated 16 A at 230 V AC.
That's for single-phase plugs; for 3-phase plugs we use 16 or 32 A in a household. See https://en.wikipedia.org/wiki/IEC_60309
For example, flow-type water heaters are connected to 3 × 35 A breakers with 6 mm² wire, as a fixed installation without plugs.
So I can draw nearly 3.7 kW from a single circuit without problems. Some older breakers (in older houses) are rated 10 A, because the wires are only 1.5 mm² (between 15 and 16 AWG).
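Those mm²-to-AWG comparisons can be checked with the standard AWG diameter formula:

```python
import math

# Convert a metric wire cross-section (mm^2) to a fractional AWG value,
# using the standard AWG diameter formula d = 0.127 * 92**((36 - n) / 39) mm,
# solved for n.

def mm2_to_awg(area_mm2):
    """Fractional AWG equivalent of a given cross-sectional area in mm^2."""
    d = math.sqrt(4 * area_mm2 / math.pi)     # conductor diameter in mm
    return 36 - 39 * math.log(d / 0.127, 92)  # invert the AWG formula

print(round(mm2_to_awg(2.5), 1))  # 13.2 -> roughly 13 AWG
print(round(mm2_to_awg(1.5), 1))  # 15.4 -> between 15 and 16 AWG
```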
Britbong here: our mains rings are rated to 32 A, so a PC with two PSUs can use both sides of a domestic double outlet and draw about 6 kW (two flexes at 13 A each), while still leaving another kilowatt or so of headroom for the other appliances in the room.
>both sides of a domestic double outlet
they're actually rated at 13 A total, not 13 A per socket (26 A total). If you're pulling 6 kW from a double socket, you're running at slightly over 200% of the recommended max.
Rule of thumb (Bongoland): don't exceed 13 A (or ~3000 W) from any single source point, be it a single socket, double socket, extension, whatever.
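Checking that rule of thumb against the 6 kW claim, at a nominal 230 V:

```python
# Compare a load against the 13 A total rating of a UK double socket
# faceplate, as described in the rule of thumb above.

UK_VOLTS = 230.0
FACEPLATE_AMPS = 13.0  # rated total for the whole double socket

def percent_of_rating(watts, volts=UK_VOLTS, rated_amps=FACEPLATE_AMPS):
    """Load expressed as a percentage of the faceplate's current rating."""
    return 100.0 * (watts / volts) / rated_amps

print(round(percent_of_rating(6000)))  # 201: slightly over 200% of rating
print(round(percent_of_rating(3000)))  # 100: the ~3 kW rule of thumb
```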
i remember an argument about the ratings of double sockets a few years ago; i think some are rated 13 A per outlet (26 A in total), but there is some ambiguity in the standard or something, so it could be 13 A shared between the two.
i can't remember the outcome, and i'd love someone to prove me wrong, but i would be wary of trying to pull 6 kW through a single faceplate. a ring final could technically support a 32 A commando outlet, but you would have to check the little green (pink now?) book.
but yes uk is superior.