Apple today showed off its Vision Pro AR/VR headset. The device is powered by the company’s own M2 chip, but to handle the real-time processing from the headset’s wall of sensors, Apple had to develop a brand-new processor too, which it dubs the R1.
The R1 chip takes input from all the sensors embedded in the headset to deliver precise head and hand tracking, real-time 3D mapping, and eye tracking.
The specialized chip was designed specifically for the challenging task of real-time sensor processing, taking input from 12 cameras, five sensors (including a LIDAR sensor!) and six microphones. The company claims it can process the sensor data within 12 milliseconds, eight times faster than the blink of an eye, and says this will dramatically reduce the motion sickness that plagues many other AR/VR systems.
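As a rough sanity check on that claim, here's a minimal sketch that works out the roughly eightfold speedup Apple cites, assuming a typical blink lasts around 100 milliseconds (a figure not stated in the announcement):

```python
# Rough sanity check on the "eight times faster than a blink" claim.
BLINK_MS = 100      # assumed typical blink duration (~100-150 ms); not from the article
R1_LATENCY_MS = 12  # Apple's claimed time to process new sensor data

speedup = BLINK_MS / R1_LATENCY_MS
print(f"~{speedup:.1f}x faster than a blink")  # prints ~8.3x
```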
Apple says that with all this vision and sensor data, the headset doesn’t need controllers; instead, hand gestures and eye tracking control the experience.
With an M2 chip packing a ton of computing power and an R1 chip handling the sensor input, Apple describes the Vision Pro as the most advanced device it has ever made and claims it filed 5,000 patents to make it all happen.
Apple was short on details regarding the R1 processor, but we’ll keep an eye out for more technical specs.