In the iOS 16 update, Apple introduces a new "RoomPlan" API that relies on the LiDAR sensor to scan a space and create a 3D layout of a room.
Augmented and virtual reality hardware was not among Apple's announcements at its recent developer conference, but the developer beta of iOS 16 continues to reveal new features and technologies Apple is delivering in the update.
The new "RoomPlan" API makes it possible to create accurate 3D layouts of rooms by scanning the space with the LiDAR sensor on an iPhone.
Previously, 3D scanning apps used the iPhone or iPad camera to build a 3D model, but the results were not accurate enough.
Apple introduced the LiDAR scanner in the iPad Pro and the iPhone 12 Pro, both launched in 2020. The scanner measures distances between objects with much higher accuracy: it emits laser pulses and times how long each pulse takes to return to the sensor, and from that round-trip time it calculates the exact distance.
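The time-of-flight principle described above reduces to a simple formula: the distance is the speed of light multiplied by the round-trip time, halved. A minimal sketch in Swift (the function name and sample timing value are illustrative, not from Apple's API):

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// A LiDAR sensor times the round trip of a laser pulse;
// the pulse covers the distance twice, so we halve it.
func distance(forRoundTripTime t: Double) -> Double {
    return speedOfLight * t / 2.0
}

// A pulse returning after 20 nanoseconds corresponds to
// an object roughly 3 meters away.
let d = distance(forRoundTripTime: 2.0e-8)
print(d)  // ≈ 3.0 meters
```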
The LiDAR sensor also supports scanning objects in rooms and accurately determining the space each object occupies. The iOS 16 update builds on this capability with a new software interface that lets RoomPlan applications use the iPhone and iPad cameras and LiDAR together.
The software interface helps users create 3D floor plans in just a few seconds, and Apple emphasizes that it can be of great use in real estate and interior design applications thanks to its smooth scanning experience.
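To give a sense of how little code the API requires, here is a minimal sketch of a scanning screen using Apple's published RoomPlan types (`RoomCaptureView`, `RoomCaptureSession`, `CapturedRoom`). It requires iOS 16+ and a LiDAR-equipped device, so it runs only on-device; the class name and the export filename are our own choices:

```swift
import UIKit
import RoomPlan

// A minimal view controller that runs a RoomPlan capture session
// and exports the finished scan as a USDZ model.
class RoomScanViewController: UIViewController, RoomCaptureViewDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.delegate = self
        view.addSubview(captureView)
        // Start scanning with the default configuration.
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Let RoomPlan process the raw scan data into a final model.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData,
                     error: Error?) -> Bool {
        return true
    }

    // Receive the processed CapturedRoom and export it to a file.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("Room.usdz")
        try? processedResult.export(to: url)
    }
}
```

The exported USDZ file can then be viewed in apps such as Quick Look or imported into design tools.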
Note that existing floor-plan and design apps must be updated to adopt the new API before they can take advantage of it.
In early hands-on testing, the new RoomPlan API created a highly accurate 3D floor plan in seconds on an iPhone 13 Pro Max.