
Apple’s Vision Pro: Why it’s so expensive



A customer tries on the Apple Vision Pro headset during the product launch at an Apple Store in New York City on Feb. 2, 2024.

Angela Weiss | AFP | Getty Images

The Vision Pro, the new virtual reality headset from Apple, can transport you to Hawaii or the surface of the moon.

It displays high-resolution computer graphics a few millimeters from the user’s eyes, all while allowing the user to control a desktop-like interface using their eyes and subtle hand gestures. The Vision Pro provides a preview of what using a computer could be like in five years, early adopters say.

The Vision Pro starts at $3,499. After adding storage and accessories such as straps, the whole package can cost as much as $4,500.

That’s a lot more expensive than competing headsets, such as Meta’s Quest 3, which starts at $499. It’s pricier than Meta’s high-end headset, the Quest Pro, which starts at $999. It’s also more expensive, even after controlling for inflation, than the first iPad ($499) or the first iPhone ($499 with a two-year contract).

The Vision Pro includes lots of pricey state-of-the-art parts. One estimate from research firm Omdia puts the “bill of materials” for the headset at $1,542, and that doesn’t include the costs of research and development, packaging, marketing or Apple’s profit margin.

The most expensive part in the headset is the 1.25-inch Sony Semiconductor display that sits in front of each of the user’s eyes.

It’s a key component that helps the virtual experience feel more realistic than previous consumer headsets. The displays have a lot of pixels and lifelike colors, and are built with state-of-the-art manufacturing techniques.

Apple pays about $228 for the “Micro OLED” displays it uses, according to the Omdia estimate. Each Vision Pro needs two of them, one for each eye. Sony Semiconductor declined CNBC’s request to comment for this story.
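Taken together, the two panels make up a sizable share of the estimated parts cost. As a rough back-of-the-envelope check (assuming the roughly $228 Omdia figure is per panel, which the estimate does not spell out), the displays alone come to about $456, or close to 30% of the $1,542 bill of materials:

```python
# Rough back-of-the-envelope check of the displays' share of the estimated parts cost.
# Assumption: the ~$228 Omdia figure is the cost of a single Micro OLED panel.
PANEL_COST_USD = 228          # estimated cost of one Micro OLED panel (Omdia)
PANELS_PER_HEADSET = 2        # one display per eye
BOM_TOTAL_USD = 1_542         # Omdia's estimated bill of materials for the Vision Pro

display_cost = PANEL_COST_USD * PANELS_PER_HEADSET
print(f"Displays per headset: ${display_cost}")                            # $456
print(f"Share of bill of materials: {display_cost / BOM_TOTAL_USD:.0%}")   # ~30%
```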

The Vision Pro displays are the latest example of Apple embracing a new kind of display technology at a larger scale and earlier than the rest of the electronics industry.

Apple’s use of LCD touchscreens for the first iPhone in 2007, and its later transition to organic LED, or OLED, displays with the iPhone X in 2017, upended existing supply chains and, after Apple shipped millions of units, ultimately drove down the cost of those parts for the entire industry.

Apple has a massive effect on the display industry, said Jacky Qiu, co-founder of OTI Lumionics, which makes materials for manufacturing OLED panels. He said display makers fight for Apple’s business, which can be make-or-break for these companies.

“Apple is now the biggest player in terms of OLEDs, in terms of displays. They are the ones that are basically taking all the high-margin displays, all the stuff that is the high-spec type of stuff that is allowing the panel makers today to become profitable,” Qiu said.

“You look at the display business, you either work for Apple and make the iPhone screens and you’re profitable, or you don’t, and you lose money. It’s as brutal as that,” Qiu said.

Micro OLED

The Vision Pro’s displays are a defining feature. They’re packed with pixels and are sharper than those of any competing headset.

The displays are one of the main features Meta CEO Mark Zuckerberg complimented when comparing the $499 Quest 3 headset to Apple’s device.

“Apple’s screen does have a higher resolution and that’s really nice,” Zuckerberg said in a video posted on his Instagram page, while saying that Quest’s screens are brighter.

“What’s so revolutionary about the OLED displays that are in the Vision Pro, the difference between Micro OLED and the OLED that you find on a television in your living room is that the pixels are actually a lot denser, they’re smaller and they’re more compact,” said Wayne Rickard, CEO of Terecircuits, a company that develops materials and processes for display manufacturing.

An Apple Vision Pro headset is displayed during the product release at an Apple Store in New York City on Feb. 2, 2024.

Angela Weiss | AFP | Getty Images

According to a teardown analysis from repair firm iFixit, each Vision Pro display has a resolution of 3,660 by 3,200 pixels. That’s more pixels per eye than the iPhone 15, which has a screen resolution of 2,556 by 1,179 pixels. Meta’s Quest 3 comes in at a resolution of 2,064 by 2,208 pixels per eye.

The Vision Pro’s screens are much smaller than the iPhone’s, so their pixels are packed far closer together, which makes them more difficult to manufacture. The Vision Pro displays have 3,386 pixels per inch, versus about 460 pixels per inch on the iPhone 15’s display.

In total, Apple says the Vision Pro’s displays have more than 23 million pixels.

They’re some of the densest displays ever built. According to iFixit, 54 Vision Pro pixels can fit in the space of a single iPhone pixel, and Apple’s specifications put the distance from one pixel to the next, a measurement known as “pixel pitch,” at about 7.5 microns.
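Those figures are consistent with one another. A quick sketch of the arithmetic, using only the resolution and density numbers cited above, reproduces the roughly 23 million total pixels, the 54-pixels-per-iPhone-pixel ratio and the 7.5-micron pixel pitch:

```python
# Quick arithmetic check of the pixel figures cited above.
VISION_PRO_RES = (3660, 3200)   # pixels per eye, per iFixit's teardown
VISION_PRO_PPI = 3386           # pixels per inch (Vision Pro)
IPHONE_15_PPI = 460             # approximate pixels per inch (iPhone 15)
MICRONS_PER_INCH = 25_400

# Two displays, one per eye.
total_pixels = 2 * VISION_PRO_RES[0] * VISION_PRO_RES[1]
print(f"Total pixels (both eyes): {total_pixels:,}")   # 23,424,000 -> "more than 23 million"

# Pixel density is an area measure, so the per-pixel ratio is the PPI ratio squared.
pixels_per_iphone_pixel = (VISION_PRO_PPI / IPHONE_15_PPI) ** 2
print(f"Vision Pro pixels per iPhone 15 pixel: {pixels_per_iphone_pixel:.0f}")  # ~54

# Pixel pitch: center-to-center distance between neighboring pixels.
pixel_pitch_microns = MICRONS_PER_INCH / VISION_PRO_PPI
print(f"Pixel pitch: {pixel_pitch_microns:.1f} microns")  # ~7.5
```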

The Apple Vision Pro home screen.

Todd Haselton | CNBC

“With Micro LEDs in particular, it can get down to about below 10 microns. For comparison, a red blood cell might be about 20 microns, so half the size of a red blood cell,” Rickard said.

Apple opted for high-resolution displays so the headset comes closer to simulating reality in passthrough mode, which uses outward-facing cameras to show video of the real world inside the headset. The added resolution also makes text and numbers easier to read in virtual reality, and it removes the “screen door” effect of other headsets, where the gaps between pixels are visible.

VR headsets need pixel-dense displays because the user’s eyes are so close to the screen. TVs have far fewer pixels per inch, but it doesn’t matter because viewers sit several feet away.

The production of this kind of display requires cutting-edge manufacturing. For example, most displays are built on a backplane made out of glass. The Vision Pro displays are so pixel-dense that they use a silicon backplane, much like a semiconductor.

‘An incredible amount of technology packed into the product’

The new Apple Vision Pro headset is displayed during the Apple Worldwide Developers Conference in Cupertino, California, on June 5, 2023.

Justin Sullivan | Getty Images

The second most expensive part in the Vision Pro is its main processor package, which includes Apple’s M2 chip, the same chip used in the MacBook Air, and the R1, a custom chip that handles the device’s video feeds and other sensors.

Bill-of-materials estimates don’t take into account research and development costs, packaging or shipping, nor the capital expenditures that can add up-front costs to big parts orders. But they’re useful for people in the manufacturing world to get a sense of how expensive the parts in any given device are.

Display technologies embraced by Apple typically come down in price after Apple makes them mainstream and as multiple suppliers compete for business.

“South Korean suppliers like Samsung Display and LG Display have shown their interest in this technology. Chinese suppliers like Seeya and BOE are also mass-producing [OLED on silicon] products at a small scale,” said Jay Shao, Omdia analyst for displays, in an email. He expects the costs for Vision Pro-spec screens to come down in the coming years.

Apple declined to comment, but CEO Tim Cook is not a fan of cost estimates and teardowns. “I’ve never seen one that’s even close to accurate,” he said on an earnings call in 2015.

Apple doesn’t typically discuss its suppliers, but in February, Cook was asked about the device’s price tag on an earnings call.

“If you look at it from a price point of view, there’s an incredible amount of technology packed into the product,” Cook said.

He mentioned some of the most expensive parts in the device and emphasized the R&D costs that Apple spent developing it.

“There’s 5,000 patents in the product, and it’s built on many innovations that Apple has spent multiple years on from silicon to displays and significant AI and machine learning. All the hand tracking, the room mapping, all of this stuff is driven by AI, and so we’re incredibly excited about it,” Cook continued.
