Apple Vision Pro needs custom-designed high-speed DRAM

Posted in Apple Vision Pro

To handle its multiple cameras without visible lag, Apple Vision Pro will reportedly use a 1-gigabit DRAM chip, custom-designed to work with the headset's R1 chipset.

Apple's launch of the Vision Pro headset concentrated on use cases for it more than technical details, and little else will be known until the device goes on sale. However, Apple did say that there are 12 cameras, five sensors, six microphones -- and a 12-millisecond lag.

Apple has said that this delay is eight times faster than the blink of an eye. Now according to The Korea Herald, a substantial part of that speed is because the headset will use a specially-designed DRAM chip by memory manufacturer SK Hynix.
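As a quick sanity check on those figures, the comparison implies a blink of roughly 100 milliseconds; that number is an assumption here, since commonly cited blink durations range from about 100 ms to 400 ms.

```python
# Rough sanity check of the "eight times faster than a blink" claim.
# Assumption: a blink lasts about 100 ms; published figures range from
# roughly 100 ms to 400 ms, and ~100 ms is what Apple's ratio implies.
blink_ms = 100
lag_ms = 12  # photon-to-photon latency Apple quoted for Vision Pro

ratio = blink_ms / lag_ms
print(f"{ratio:.1f}x faster than a blink")  # 8.3x faster than a blink
```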

Representatives of SK Hynix told the publication only that "we cannot provide comments related to our clients." However, as well as discussing the speed of the DRAM design, the publication's unnamed sources also say that SK Hynix is to be the sole memory chip supplier for Apple Vision Pro.

Unnamed industry officials told The Korea Herald that the DRAM's design would help Apple's headset double the possible data processing speed of the device.

As well as having 1-gigabit DRAM capacity, the new design reportedly features an eightfold increase in the number of input and output pins. It's claimed that the DRAM will be attached to Apple's R1 chipset to become a single unit.

This appears to be an evolution of the Unified Memory concept that Apple has been using with its M-series processors for some time. It's not presently clear, however, whether the DRAM associated with the R1 is also connected to the M2 chip in the headset, or if that has its own RAM.

Apple's R1 chip is dedicated to processing inputs from the Vision Pro's large range of cameras and other sensors. It then works in conjunction with Apple's M2 chip, which is the headset's main processor.

SK Hynix is one of several memory manufacturing firms, several of whom have previously faced class-action lawsuits over alleged price fixing.


Comments

  • Reply 1 of 7
tht Posts: 5,605 member
    ... that the DRAM's design would help Apple's headset double the possible data processing speed of the device.

    As well as having 1-gigabit DRAM capacity, the new design reportedly features an eightfold increase in the number of input and output pins. It's claimed that the DRAM will be attached to Apple's R1 chipset to become a single unit.
    Interesting.

It's only 128 MByte of DRAM, and it sounds like it will be in a package-on-package configuration, i.e., it sits on top of the R1 chip. Double the data rate of LPDDR5? Around 100 GByte/s per channel? An 8-fold increase in pins sounds like a custom HBM type of design. The RAM amount sounds like it is used exclusively by the R1 chip.

    The M2 could be interesting too. I wonder if that is going to be a PoP package too. Package-on-package where the LPDDR sits on top of the M2 SoC chip like they do for A-series SoCs.
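The unit conversions in the comment above are easy to check: one gigabit works out to 128 MByte, and doubling a stock LPDDR5 configuration lands near the 100 GByte/s figure. The LPDDR5 baseline below (6400 MT/s over a 64-bit bus) is an assumption for illustration, not a confirmed Vision Pro spec.

```python
# Check the gigabit-to-megabyte conversion for the reported DRAM capacity.
capacity_bits = 2**30                  # "1-gigabit" chip, binary prefix
capacity_mbytes = capacity_bits // 8 // 2**20
print(capacity_mbytes)                 # 128 (MByte)

# Assumption: baseline LPDDR5 at 6400 MT/s on a 64-bit bus, then doubled.
lpddr5_gbs = 6400e6 * 64 / 8 / 1e9     # ~51.2 GB/s
print(round(2 * lpddr5_gbs, 1))        # 102.4 -- near "around 100 GByte/s"
```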
  • Reply 2 of 7
danox Posts: 3,240 member
Just another nail in the coffin as to why the price isn’t cheap, nor will there be a price drop anytime soon, if ever: the AVP is a fully-fledged Apple laptop/12.9-inch iPad Pro.
edited July 2023
  • Reply 3 of 7
tht Posts: 5,605 member
    My bet is that Apple will have "Vision Air" for about $1500 after driving component costs down. Probably will remove a couple of the cameras, consolidate parts, change materials to reduce component costs. There will be some features removed, and I think Apple is just waiting on the first year to figure out what features are being used or what are not. R1 functionality could be integrated into an "M3" perhaps.

    The trick is what display tech can they use that is cheaper than microOLED (is "OLED on Silicon" better?) and maintain 23m pixels?

    End of 2025 perhaps.
  • Reply 4 of 7
The cameras and sensors are intriguing. Since the orientation of a headset is tied to a human head, not the most flexible platform for rotation, I wonder if the sensors will be square with software-selectable rectangles (horizontal or vertical of different ratios) or square images. Support for flash synchronization would also enable unique capabilities. Furthermore, I wonder about the zoom ability. It would add to the value equation if Vision Pros could function as binoculars with a heads-up display. If the imaging system is insufficiently rich, then maybe an accessory imaging system could feed into the headset.
  • Reply 5 of 7
tht Posts: 5,605 member
The cameras and sensors are intriguing. Since the orientation of a headset is tied to a human head, not the most flexible platform for rotation, I wonder if the sensors will be square with software-selectable rectangles (horizontal or vertical of different ratios) or square images. Support for flash synchronization would also enable unique capabilities. Furthermore, I wonder about the zoom ability. It would add to the value equation if Vision Pros could function as binoculars with a heads-up display. If the imaging system is insufficiently rich, then maybe an accessory imaging system could feed into the headset.
    Yeah, keep on thinking down this path. There is a bit of fixation by headset fans that "augmented reality" is about putting information in your field of view. That's big, but "augmentation" could mean augmentation of your senses. 10x optical zoom. 10x microscopic vision, IR/UV vision. Synesthesia. Object recognition and tracking. Telepresence for multiple people all over the world, including camera feed sharing (seeing through other people's headsets in near-realtime). They could put smell and chemical detection sensors in it.
  • Reply 6 of 7
With the amount of dedicated technology going into the Apple Vision Pro, imagine the breakthroughs in computing and driving that the Apple Car may have! Even if they ultimately do not release an Apple Car, there must be a lot that they build/create/make that gives them some valuable tech.
  • Reply 7 of 7
    tht said:
The cameras and sensors are intriguing. Since the orientation of a headset is tied to a human head, not the most flexible platform for rotation, I wonder if the sensors will be square with software-selectable rectangles (horizontal or vertical of different ratios) or square images. Support for flash synchronization would also enable unique capabilities. Furthermore, I wonder about the zoom ability. It would add to the value equation if Vision Pros could function as binoculars with a heads-up display. If the imaging system is insufficiently rich, then maybe an accessory imaging system could feed into the headset.
    Yeah, keep on thinking down this path. There is a bit of fixation by headset fans that "augmented reality" is about putting information in your field of view. That's big, but "augmentation" could mean augmentation of your senses. 10x optical zoom. 10x microscopic vision, IR/UV vision. Synesthesia. Object recognition and tracking. Telepresence for multiple people all over the world, including camera feed sharing (seeing through other people's headsets in near-realtime). They could put smell and chemical detection sensors in it.
    One use case I'm working on is this: I'm counting anadromous fish in a river. I use my intelligence to sort salmonids by size and behavior. If I don a Vision Pro headset, then I can record data as I see it using hand gestures. Environmental measurements can be fed into the live database. I can share my view and live applied intelligence over Starlink to the outside world. The enclosed viewing environment is a strength because it augments my mosquito netting; I integrate my netting support and my headset support. Next, I bring a DIDSON feed into the headset, integrating it into my counting [spatial computing] environment.