Culture
The Hackintosh is dead
A 15-year cat and mouse game between Apple and hackers comes to an end.
Earlier this week Apple announced that it would spend the next two years transitioning all of its Macs to ARM-based processors. The justifications are clear and totally valid: Intel is sucking wind right now, Apple's A-series processors are basically the best ARM processors in the world, and there's a huge library of iOS software that could be brought to the desktop. From Apple's perspective it all stacks up.
But as developers on Twitter excitedly applied for Developer Transition Kits, or DTKs, people in the web's biggest hackintosh communities assembled to mourn the inevitable end of a 15-year cat and mouse game between Apple and people wanting affordable Macs.
The mood in these communities started out pretty dour, but the conversation quickly turned pragmatic: Apple will, many argued, continue to support its current lineup of Macs for another 5-7 years, even if the company is trying to get to parity in just two. If we look back at Apple's transition from PowerPC processors to Intel, this looks a little dubious: The first Intel Macs arrived in 2006 and by 2009 new versions of macOS were no longer being made for PowerPC models. On the other hand, there are far more Intel Macs in circulation today than there were back then, so Apple might have to move a little bit slower.
So should the hackintosh community despair, even if they get another few years of support from Apple? What happens then? Because once Apple stops making drivers, it's game over.
To approach that question, it's important to understand why the hackintosh community exists at all. For some it's a fun computer project, and every time Apple updates macOS you have an excuse to fiddle with it a little more. For others it's simply a way to get into the Mac software ecosystem at the lowest possible price, but this justification has only weakened over time as sites like eBay and Amazon have institutionalized the market for used Macs. Instead, the cohort that has solidified over time is the power user who needs a Mac Pro that doesn't exist. I recently wrote an article about building a gaming PC for under $1,000, and that's the exact kind of machine that Apple hasn't offered in decades, but is exactly what the hackintosh community provides.
YouTuber Optimum Tech lays out the justification for building a pro hackintosh.
Frustration with this hole in the product line reached a boiling point in 2017 when the "days since the Mac Pro was updated" counter became a running joke on technology blogs. The Mac Pro became so old that off-the-shelf, middle-of-the-road consumer hardware was faster, and pros simply had nowhere to go. Apple eventually relieved the pressure by inviting the press to discuss the state of the Mac and putting out a statement saying that a new Mac Pro was in the works. Then, hilariously, Apple released the new tower Mac Pro starting at $5,999, forever signalling its absolute disinterest in the $1,000 (powerful) desktop segment.
So, to be clear, this market need definitely still exists. People still need to edit high resolution video, edit huge music arrangements, compile source code, and play 3D games. These are the people who would essentially yield the best return on investment given the time and money needed to set up a hackintosh, but I would argue that for many of them, the process is merely a means to an end. The goal was always to maximize utility in software like Final Cut or Logic, and the fun computer project was just the icing on the cake (for sick people like me).
In Apple's eyes, the iMac was always meant to address this segment of users, but then Intel began to plateau. Each new processor update brought a meager 5-10 percent improvement at a time when iPhones began shooting 4K video, with four times the pixels of 1080p. That was just a few years ago, and already we have phones and cameras that shoot 8K video. Workloads are expanding exponentially while Intel's chips improve incrementally.
Scrub to 1:36:23 for the demo of pro-tier apps on ARM.
I'm focusing on video editing here because Apple's A-series chips are already addressing this problem, and the company even mentioned it (briefly) in its keynote segment. One way to improve video editing performance is the hackintosh way, where you add more and more processing power (and RAM) to chew through more pixels. This comes with increased cost and the need for more thermal headroom, which is fine in a PC you build yourself, but runs against Apple's design values. Instead, Apple has beefed up the GPU and added dedicated video encoders and decoders to the A12Z, allowing it to accomplish mostly the same workload at much lower power.
My friend Sean Hollister at The Verge wrote a piece detailing what we can and can't infer about the power of Apple's silicon, and it's true that there are more unknowns than knowns right now. But if we look at this graphic (above), we can see that Apple has already sliced out several specific workloads and designed dedicated cores within the A12Z chip to handle them, like the "High-efficiency audio processor," "Neural Engine," and "High-performance video editing."
Let's assume that the A12Z in the iPad Pro operates at roughly 7 watts, which is what the Qualcomm 8cx runs at. Consumer 13-inch laptops today use 15-watt Intel chips, so I think it would be interesting to see what Apple's silicon could do with twice the power. If we're generous and assume that twice the power yields twice the performance, then suddenly we're talking about a 15-watt chip that could process six streams of 4K video, a workload that my 13-inch Razer Blade Stealth certainly couldn't handle right now.
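To make the back-of-envelope math above explicit, here's a tiny sketch of the linear-scaling assumption. Every number in it is an assumption from the paragraph, not a benchmark: the A12Z is taken to draw roughly 7 watts, the WWDC demo is taken to represent about three simultaneous 4K streams, and performance is (generously) assumed to scale linearly with power budget.

```python
# Back-of-envelope model; all figures are assumptions, not measurements.
A12Z_WATTS = 7        # assumed iPad Pro chip power budget (roughly 8cx-class)
A12Z_4K_STREAMS = 3   # assumed workload the A12Z can handle today

def streams_at(watts, base_watts=A12Z_WATTS, base_streams=A12Z_4K_STREAMS):
    """Naive linear projection: 4K streams handled at a given power budget."""
    return base_streams * watts / base_watts

# A 15-watt laptop-class power budget, under this generous linear model:
print(round(streams_at(15), 1))  # roughly 6 streams of 4K
```

Linear scaling is the generous part: real chips lose efficiency as clocks rise, so this is an upper bound on the thought experiment, not a prediction.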
Coming back to the hackintosh, Apple currently sells a handful of devices in the $1,000 price range: the MacBook Pro and MacBook Air, the iMac, the Mac Mini, and the iPad Pro. On the Mac side, these machines are either way underpowered for pro workloads, as is the case with the MacBooks and the base iMac, or lack a dedicated graphics card, as is the case with the Mac Mini. The iPad Pro, with its dedicated cores and custom GPU, actually can do some video editing and gaming, which we know because of apps like LumaFusion (shown below), but iPadOS just isn't suited to professional work. If those same apps ran in macOS Big Sur, with its real file system and familiar desktop interface, maybe an iPad-class device could produce professional work. Isn't that what the iPad-as-computer people have been saying all along?
What could this look like? Well, according to analyst Ming-Chi Kuo, the first ARM-based Macs we're likely to see are a new 13-inch MacBook Pro and a 24-inch iMac. These probably won't be able to handle the highest tier of pro workloads, but with far more thermal headroom and raw wattage than an iPad, I think it's likely that both will actually be able to handle software that would otherwise bring a similarly priced Intel Mac to its knees.
It's true that, once the transition is complete and Apple drops support for Intel Macs, building a hackintosh tower with affordable off-the-shelf parts probably won't be an option. In fact, a Mac like the modular (but egregiously expensive) Mac Pro may not even be possible on Apple silicon, which is likely why Apple said it "will continue to support and release new versions of macOS for Intel-based Macs for years to come." But I think it's likely that the video editors, music producers, and gamers who have been pushed into hacking together their own Mac Pro towers may find that a much lighter-duty ARM machine could meet their needs, and at a much lower price. Will it be the same? Probably not, but if Apple still doesn't have a ~$1,000 Mac that runs pro software, it'll have no one to blame but itself.