>12 Cores with 30MB of L3 Cache
>64GB ECC DDR3 1866MHz RAM
>1TB PCIe Flash-Based Storage
>Dual GPUs with 6GB of VRAM Each
Name one reason for not purchasing this Godlike machine.
>doesn't even have ECC VRAM
>mactard "professionals" buy into apple's bullshit anyway
Step aside and make room for a real workstation kiddo.
>Name one reason for not purchasing this Godlike machine.
You literally don't need it and can purchase a decent PC for 1/10th the price.
Why spend that much money just to take part in a pissing contest? Mac is playing the Alienware game, and the only people who buy that shit will use all that hardware just to play Facebook games.
Reminder that you can get a 50th anniversary edition Porsche for a couple thousand more
Tell me something a comp sci student would do with that.
I can write assignments on a cheap ass laptop.
>buy 4ghz "workstation"
>can only run at 2ghz because ultra shit tier apple "engineering"
ECC memory is slower than non-ECC, which for 3D rendering and the like means longer compute times. ECC memory on graphics cards is critical for applications where accuracy is a must, such as scientific calculations.
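For anyone wondering what the extra work ECC actually does, here's a toy sketch of single-error correction (a simplified Hamming(7,4) code; real ECC DRAM uses wider SECDED codes over 64-bit words, but the idea is the same):

```python
# Toy Hamming(7,4): 4 data bits stored with 3 parity bits; any single
# flipped bit can be located and corrected, at the cost of extra bits
# and extra checking work on every access.

def encode(d):                      # d: list of 4 data bits
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def correct(c):                     # c: 7-bit codeword, possibly corrupted
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3 # position of the flipped bit, or 0
    if syndrome:
        c[syndrome - 1] ^= 1        # flip it back
    return [c[2], c[4], c[5], c[6]] # recover the data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                        # flip one bit in "memory"
print(correct(word))                # → [1, 0, 1, 1]
```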
Because even though I love Macs I can't stop thinking that I could build a better setup with half the money. Also MUH GAYMES means installing Windows because that's pretty much the only thing I do with my desktop anyway.
I'd buy it if I was rich, and I mean CEO-of-a-Fortune-500-company rich.
cooling is inadequate for lengthy loads
demands more power under load than the power supply is rated to deliver
sacrifices expandability for "muh aesthetics" and "muh innovation"
uses underclocked FirePro cards with non-ECC memory
Eh, not a very good argument. Workstation graphics cards are increasingly being used for scientific applications, and ECC can easily be disabled for better performance when needed.
Not the guy you're replying to, but I'll chime in anyway.
You're totally right. I could never build a desktop in that form factor, and even if I could it would never be that powerful. No amount of money can change that.
But that's not a sticking point for me. In fact, in the list of criteria that matter to me about a desktop, "size" definitely doesn't rank first, and it's DISTANTLY behind things like price/performance ratio, absolute performance, upgradability, and maybe reliability/maintainability.
I wouldn't be willing to take a significant hit on ANY of those criteria for this size and form factor. I'm sure there are people who are willing (like people who do video editing "in the field" and need essentially a desktop to take with them on shoots), but I'm not even remotely near being in that use case.
The middle-model Mac Mini is perfect. I wouldn't want to waste 3000 euros mostly on looking at pictures of Jessica Biel.
This is a bad topic. Not only are aesthetics highly subjective, but /g/ is notoriously hypocritical about aesthetic critique. Tons of case manufacturers put out cases that are far and away more ridiculous and retarded than a "trash can" (to whatever extent that should even bother you), but nobody criticizes them.
He may be full of shit, but I still wouldn't be surprised if it had massive problems with heat. Both my iMac and my Mac Mini cooked themselves to death when they were actually put under heavy use. Apple puts form ahead of function and stuffs too many components into too tight a space without the thermal engineering to make it work.
Also, never buy the first generation of a new Apple product. It's the number one rule of buying from Apple. They always have problems that get worked out in the second generation.
>retards that spent 30 fucking years to code this pile of shit that crashes if you breathe the wrong way near it
>can't even make their own OS
>steal kernel and duct tape their babby GUI over it
>thinks he's entitled to an opinion
linus is essentially saying that his own software works well with his own kernel's choices and not very well with another kernel's choices. everything else is rhetoric that isn't justified in the quote.
I hope you all realize this before swallowing his load.
More like Apple begged Linus to fix their abysmal abomination of an OS, but he took one look at their code and contemplated suicide at the realization that such retarded programmers existed.
mine isn't as bad, but this is still unacceptable for a "pro" laptop
Darwin isn't too similar to modern FreeBSD either. He's actually basically saying they SHOULD switch to a modern FreeBSD kernel. And he wouldn't even try to convince them to switch to Linux because of the license.
He's pointing out objective flaws in the kernel. Whether they surfaced while he was testing or not is irrelevant; they'll also happen to whoever tries to do something similar.
here, i'll help you out
>But hey, it's pretty on top. If the Apple engineers actually knew what they were doing, they could use a known superior open-source kernel and put their pretty on top of that instead.
see here? the "superior open-source kernel" is linux. he proposes having apple build their system on top of linux. get it?
>He's pointing out objective flaws in the kernel.
Not necessarily so. Kernel development requires a lot of choices about various tradeoffs.
Here's an example:
BSD based kernels usually use highly abstracted reference counting data structures to make things like copy on write extremely efficient and easy to implement. linux does these things in a bastardized way.
git is not going to do much (or any) copy on write stuff, because it is mostly changing and updating pages, not spawning new processes or updating pages that are referenced.
So, if linux is optimized to handle the git case and BSD/Mach/OS X's kernel is optimized to handle a DIFFERENT case, who are you to say one is objectively better?
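For reference, the copy-on-write behaviour being argued about is easy to observe from userspace. A minimal sketch (POSIX-only, since it relies on os.fork):

```python
import os

# After fork() the child initially shares the parent's pages
# copy-on-write; its first write page-faults and gets a private copy.
data = ["parent"]

pid = os.fork()
if pid == 0:
    data[0] = "child"   # this write faults; the kernel copies the page
    os._exit(0)

os.waitpid(pid, 0)
print(data[0])          # → parent (the child's write never reaches us)
```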
When a choice for less performance/security/etc is made, it's meant to have a reason, like FreeBSD being more performance-oriented than OpenBSD but less secure. What was Apple's compensation for the flaws Linus pointed out? What the fuck would you possibly achieve by making a system more prone to page faults?
Face it, some "flaws" are made with a good intention in mind, but that doesn't mean the system in question is perfect and all the bugs and flaws in it are "design choices". That's fucking desperate.
>stating facts is shilling
What is the issue with OS X using XNU? I've used 10.0-10.5.8 and 10.9-10.10 and never really had issues with OS X. Even if the code is shit, it's working well for them.
Although it would be cool to see OS X be based on FreeBSD
>what the fuck would you possibly achieve by making a system more prone to page faults?
here's one example:
section 4.5.1 How copy-on-write really works
"Turning off write permission in the PTEs of writable pages is the first step of a copy-on-write; it ensures that neither process can write to the page without first causing a page fault."
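Translated into a toy model (this only simulates the mechanism the quote describes; real kernels do it in the page-fault handler against actual PTEs, not Python objects):

```python
# Toy copy-on-write: pages shared after a "fork" are reference-counted;
# the first write to a shared page triggers a "fault" that copies it.

class Page:
    def __init__(self, data):
        self.data = bytearray(data)
        self.refcount = 1

def fork_mapping(mapping):
    """Share every page with the child; both mappings point at the same Pages."""
    for page in mapping.values():
        page.refcount += 1
    return dict(mapping)

def write(mapping, addr, offset, value):
    page = mapping[addr]
    if page.refcount > 1:
        # "Page fault": unshare by copying the page, then write privately.
        page.refcount -= 1
        page = Page(page.data)
        mapping[addr] = page
    page.data[offset] = value

parent = {0x1000: Page(b"\x00" * 4096)}
child = fork_mapping(parent)

write(child, 0x1000, 0, 0x41)   # child's write faults and copies the page
print(parent[0x1000].data[0])   # → 0
print(child[0x1000].data[0])    # → 65
```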
Because both OS X and Macs are technology.
I know, I'm just saying that I've used most of the versions and it seems to be a pretty good system. Even if it's odd internally, it still works.
I never said you can go and buy one right now.
>Because both OS X and Macs are technology.
Like it or not, Apple is the company pushing the technology boundaries. They're the ones who came up with the modern smartphone (Android had to go back to the drawing board with their shit when the original iPhone was released), they're the ones who put high-resolution screens in phones, they're the ones who put high-resolution screens into laptops, they're the ones who put a 5K screen into a consumer computer, etc. etc. etc.
If you want to be on the cutting edge of technology, you need to buy Apple. Or you can wait until the technology trickles down to other manufacturers as they slowly start copying what Apple does.
The problem would be
> it's working well for them.
If you have a dedicated team that is well aware of the system's quirks, every OS in the world is top tier. That doesn't mean the system couldn't be better, and it certainly restricts developers to a niche for certain stuff.
If there wasn't anything magical about Apple, they wouldn't be worth $700B and be the most valuable company in the world. Marketing shiny shit can get you far, but not that far. At some point you also need to actually produce the best products in the world, which Apple does.
Yeah, you realise how small the Mac Pro market is, right? And how much work would go into getting one of these cards slotted into these machines?
You'd be lucky to sell over 1000 cards per year. The thermal limit prohibits basing the cards on higher-end, more profitable GPUs, so you'd never turn enough profit to recoup the R&D investment.
how dense are you? this isn't like the old mac pros, where you could make a regular gpu that happened to have firmware that worked in a mac as well. this is a proprietary format that would only work in this specific thing, and then you'd have to actually make sure the card worked in the mac.
never going to happen. the form factor is a fucking joke as usual and they might as well have soldered everything in place