What's the expected performance boost you'd get from attaching an external GPU to a laptop via USB?
WHY would you attach a GPU via USB to a laptop? To see even a moderate boost in gaming performance on a laptop would require a high-end card, and at that point you might as well get the whole package, a.k.a. build your own desktop.
This eliminates all portability since you'll be dragging around 20 extra pounds that can easily be damaged.
>This eliminates all portability since you'll be dragging around 20 extra pounds that can easily be damaged.
The idea is you'd use it as a desktop replacement at home, then unplug the GPU and carry the laptop with you elsewhere.
Also, OP, it's been done; there's a company that sells these devices.
They use the laptop's ExpressCard slot for higher bandwidth.
>What's the expected performance boost you'd get from attaching an external GPU to a laptop via USB?
holy moly, I'm laughing way too hard at this
The lengths people will go to avoid buying a simple desktop
The problem is getting a loop-back to the laptop's internal monitor. I think I've read something about DIY Thunderbolt external Nvidia GPUs that can output to the laptop's internal display, but that eats into bandwidth and hurts performance versus an external display.
The core problem with this is that it doesn't really save you any money. The only laptop parts you're really using are the motherboard and CPU, but those aren't expensive and don't impact game performance much. You still need an expensive GPU and power supply, and now you also need some sort of PCIe interface. Better to just build a cheap gaming desktop.
If you're looking ahead, yeah. But if you already have a laptop, then why not?
Why shell out for a monitor, case, etc., when you can just buy a PSU, a GPU, and some device that ties it all together?
Are you fucking serious?
Look at the OP's picture. The PSU is right there.
>Y-you said t-thunderbolt was the perfect interface
It is. Interface. Not power supply. PCIe is the interface for desktops too. That does not mean it provides any more than a fraction of the power required for a modern GPU.
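To put rough numbers on that (a back-of-envelope sketch in Python; the 75 W figure is the PCIe slot power limit from the spec, the 250 W a ballpark board power for a high-end card, not measurements):

# Why the interface alone can't feed a modern GPU.
pcie_slot_w = 75     # max power a PCIe x16 slot itself supplies, per spec
gpu_tdp_w = 250      # ballpark board power for a high-end card
print(f"slot covers {pcie_slot_w / gpu_tdp_w:.0%} of the GPU's budget")
# -> slot covers 30% of the GPU's budget; the rest comes off PSU cables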
Right now the half-baked solutions are pretty horrific to get working but the concept is nice.
You have your regular laptop that you were already going to use. Work, travel, school. Whatever.
When you take it back home you can hook it up to a dock. Now you can play games with any half-decent mobile CPU, since the vast majority of games are GPU-limited.
It'd be interesting to know what this all costs and how it performs. I highly doubt it's cost/performance competitive with simply buying or building a cheap dedicated gaming desktop, aside from not being able to use the MacBook's display (which is gorgeous, but overkill resolution for games).
Someone would have to price it out, but I suspect even with TB 2.0 it will be really bandwidth-starved. You'll be paying a lot of money to put this together; likely a lot more than a PS4 costs, and the PS4 would get you far better performance. Plus you can't use the GPU under OS X AFAIK, and the vast majority of people this external GPU setup would appeal to use Macs.
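For a sense of how starved, here's a rough sketch using the published link rates (not benchmarks): Thunderbolt 2's 20 Gbit/s aggregate channel against PCIe 3.0 x16's ~15.75 GB/s.

# Thunderbolt 2 vs. a desktop PCIe 3.0 x16 slot, by raw link rate.
tb2_gb_s = 20 / 8          # 20 Gbit/s -> 2.5 GB/s
pcie_x16_gb_s = 15.75      # usable GB/s for PCIe 3.0 x16
print(f"TB2 offers ~{tb2_gb_s / pcie_x16_gb_s:.0%} of a full x16 slot")
# -> ~16%; bandwidth-wise that sits between a x2 and an x4 link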
Some people just want to see their motherboards burn
You can build a PC with similar or better performance for the full price of a PS4. That's already been established several times for Intel, Nvidia, and AMD builds. Consoles have a CPU architecture originally intended for netbooks, which is terrible, and mid-range GPUs. It's not hard.
>bang for buck
PC games are not only cheaper and easily copied, but you can also mod them. Games with mod support have WAY more hours of gameplay than any console game ever will. Also, you can emulate: can you run PS1 and PS2 classics on your PS4? Nope.
Not to mention you can also upgrade the PC itself and have a decent system for years to come.
That's a good point, but you have shit taste, m9.
Citation needed: a PC at the same price getting the same performance at equivalent settings.
Right now the console parts are pretty current, and since there's a lot less abstraction going on console-side, they generally get a lot more performance out of the same parts.
>it will be the same
>even bottlenecking it to x1 PCI-e
>making it travel that far of a difference
>doesn't understand the reason why PCI-e traces on motherboards are as close as they are
I think it's clear the only ignorant one around here is you.
They were obsolete on release and are obsolete now.
That's actually hard to prove, due to how developers tend to handle cross-platform support and the fact that consoles run a completely locked-down OS that doesn't let users run anything other than what's developed for it (games), so benchmarks and monitoring software are out of the question.
just give it a rest, m8
Wouldn't it be better to just get a used PS3 or Xbox 360 for under $150 than do all this? Why even use such a high-end GPU if you're just going to be playing games at 720p?
Though I'd understand it more if he were doing this for fun because he's a modder.
When you're talking about consoles vs PC in terms of how much performance you're getting, you've got to take into account the API used to develop whatever game you're running. For example, DX11 uses a lot of abstractions to allow any hardware combination to run a game. This introduces a lot of overhead that you generally don't see on a console game, provided it's not a port. A console has the benefit of being identical to every other unit. This means that when you design a game, you can take into account the specific hardware in that console and gain quite a lot of performance that would be ignored if you were developing for PC.
Remember that ATI graphics cards were capable of tessellation in 2001, but it was never used because Nvidia cards couldn't do it and developers didn't want to shut themselves out of that market. This is just one example of how developing for a uniform set of hardware can be beneficial.
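A toy illustration of the overhead point (the per-call costs below are made-up numbers purely to show the shape of the effect, not measurements of any real API):

# How per-draw-call CPU overhead limits draw calls per frame at 60 fps.
frame_budget_us = 1_000_000 / 60  # ~16,667 microseconds per frame
thick_api_us = 30   # hypothetical cost/call through a thick PC API
thin_api_us = 5     # hypothetical cost/call with console-level access
for name, cost in (("thick API", thick_api_us), ("thin API", thin_api_us)):
    print(f"{name}: ~{frame_budget_us / cost:,.0f} draw calls per frame")
# Same hardware, several times the draw-call headroom with less abstraction.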
I'm interested in getting an eGPU for my T420. I already use it as a desktop replacement with the mini dock, keyboard, and HD display.
And when I need it as a laptop I just take it out of the dock and BAM, super mobile.
I don't get why y'all hate on that.
You have no idea what you're talking about if you think the speed at which electricity moves through a circuit has a perceptible effect on latency over distances of a few feet. What you're thinking of is bandwidth: PCIe 3.0 x16 manages ~31 GB/s (~15.75 GB/s each direction), while USB 3.0 tops out around ~600 MB/s.
Now kindly fuck off, pleb.
If I meant bandwidth, I would have said bandwidth.
The bandwidth is shit, but an eGPU over USB 3.0 or eSATA is generally still more powerful than an iGPU. Trouble is the latency: no point having nice frames if they're a second or two late.
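To put a floor on that latency (a quick sketch, assuming uncompressed 1080p frames at 4 bytes per pixel and the ~600 MB/s USB 3.0 figure quoted above):

# Time just to move one rendered frame back across the link.
frame_bytes = 1920 * 1080 * 4    # ~8.3 MB per uncompressed 1080p frame
usb3_bytes_s = 600e6             # ~600 MB/s, the figure quoted above
ms = frame_bytes / usb3_bytes_s * 1000
print(f"~{ms:.1f} ms per frame in transfer alone")   # ~13.8 ms
# Most of a 60 fps frame budget (16.7 ms) gone before rendering even starts.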
I bet you could buy a Dell Ultrasharp, which would be a better display in every way, and still be more cost effective than a laptop+external GPU rig.
This kind of thing falls under the category of "tinkering and spending more money than I need to because I'm bored". There's nothing wrong with that, as long as you realize that that's what you're doing.
Will a GTX 780 Ti get bottlenecked hard as an eGPU?
My friend's willing to sell me one used for cheap, but I'm still saving for the rest of my desktop, so I'll have to make do with juicing up a laptop with an eGPU until I can build my rig.
I'm just chiming in here, but won't this setup add a significant amount of latency because the CPU has to do a bunch of image processing it wouldn't otherwise need to? USB TV tuners have significant lag when you watch the same channel next to a TV.
On the other hand, Nvidia figured out a way to do streaming gaming to a tablet with very low latency over wifi, so I guess anything is possible.
That's an input problem, anon: the tuner's image needs to be processed before it can be shown.
Output-wise there's no extra latency, since you just buffer commands to the GPU and it processes everything and sends the image straight to the display buffer.
eGPUs don't exist and will never exist because there is no real market demand for them. To get a DIY eGPU working, you need an ExpressCard slot, and most laptops stopped including them years ago.
>eGPUs don't exist and will never exist because there is no real market demand for them.
(also, don't get too excited, it uses a proprietary Alienware connector)