ARCore, also known as Google Play Services for AR, is a software development kit developed by Google that enables augmented reality (AR) applications to be built. ARCore has been integrated into a multitude of devices.[2]
ARCore uses a few key technologies to integrate virtual content with the real world as seen through the camera of a smartphone or tablet.[3] Each of these technologies can be utilized by developers to create a high-quality, immersive AR experience.
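The SDK's entry point is a Session object that owns the camera feed and tracking state, and a per-session Config selects which of these technologies are active. As a rough illustration only (not drawn from the cited sources), the following Kotlin sketch creates and configures a session with the com.google.ar.core API; ARCore availability checks, camera permission handling, and activity lifecycle wiring are omitted.

```kotlin
import android.content.Context
import com.google.ar.core.Config
import com.google.ar.core.Session

// Creates and configures an ARCore session. A real application would first
// verify ARCore availability via ArCoreApk and request camera permission;
// those steps are omitted in this sketch.
fun createArSession(context: Context): Session {
    val session = Session(context)
    val config = Config(session).apply {
        // Detect both horizontal and vertical flat surfaces.
        planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
        // Request simple ambient-intensity lighting estimates
        // (used in the lighting estimation example further below).
        lightEstimationMode = Config.LightEstimationMode.AMBIENT_INTENSITY
    }
    session.configure(config)
    return session
}
```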
Motion tracking allows the phone to understand and track its position relative to the world.
A motion tracking process known as simultaneous localization and mapping (SLAM) uses feature points, visually distinct points in the camera image, as reference landmarks from which the phone determines its position and orientation (pose).[4]
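As a minimal sketch of how the tracked pose is exposed to developers (assuming a session that has been resumed and had its camera texture registered), the camera pose for the current frame can be read as follows:

```kotlin
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Reads the device pose estimated by motion tracking for the latest frame.
// Intended to be called once per rendered frame on a running session.
fun logCameraPose(session: Session) {
    val frame = session.update()                  // advance ARCore to the newest camera frame
    val camera = frame.camera
    if (camera.trackingState == TrackingState.TRACKING) {
        val pose = camera.pose                    // device position and orientation in world space
        println("position = (${pose.tx()}, ${pose.ty()}, ${pose.tz()}) metres")
        println("rotation quaternion = (${pose.qx()}, ${pose.qy()}, ${pose.qz()}, ${pose.qw()})")
    }
}
```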
Environmental understanding allows the phone to detect the size and location of flat surfaces, both horizontal and vertical, using clusters of feature points.
A geometric plane can then be calculated from the detected feature points, as in the sketch below.
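The following hedged Kotlin sketch lists the planes ARCore is currently tracking, assuming a configured and running session; it uses the Plane trackable type from com.google.ar.core.

```kotlin
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Enumerates the flat surfaces detected so far and reports their orientation
// and approximate size. Assumes plane finding was enabled in the session
// configuration and the session has been running for a few frames.
fun logDetectedPlanes(session: Session) {
    for (plane in session.getAllTrackables(Plane::class.java)) {
        if (plane.trackingState != TrackingState.TRACKING) continue
        if (plane.subsumedBy != null) continue    // skip planes merged into a larger plane
        val orientation = when (plane.type) {
            Plane.Type.HORIZONTAL_UPWARD_FACING -> "floor/table"
            Plane.Type.HORIZONTAL_DOWNWARD_FACING -> "ceiling"
            Plane.Type.VERTICAL -> "wall"
            else -> "unknown"
        }
        // extentX/extentZ give the plane's dimensions in metres around its centre pose.
        println("$orientation plane, ${plane.extentX} m x ${plane.extentZ} m at ${plane.centerPose}")
    }
}
```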
The Scene Semantics API gathers real-time semantic information about the user's surroundings and uses it to identify objects and features in view.
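A rough Kotlin sketch of using this capability is shown below; the Config.SemanticMode, SemanticLabel, and Frame.getSemanticLabelFraction names reflect recent ARCore SDK releases and should be treated as assumptions to verify against the SDK version in use.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.SemanticLabel
import com.google.ar.core.Session

// Enables Scene Semantics where the device supports it. A fresh Config is
// used here for brevity; a real app would modify its existing configuration.
fun enableSceneSemantics(session: Session) {
    if (session.isSemanticModeSupported(Config.SemanticMode.ENABLED)) {
        val config = Config(session).apply { semanticMode = Config.SemanticMode.ENABLED }
        session.configure(config)
    }
}

// Fraction (0.0 to 1.0) of pixels in the current camera image that the
// semantic model labels as sky.
fun skyFraction(frame: Frame): Float =
    frame.getSemanticLabelFraction(SemanticLabel.SKY)
```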
The Lighting Estimation API allows the phone to estimate the environment's current lighting conditions so that virtual content is displayed consistently with real-world lighting.
Lighting cues such as shadows and highlights are used to display virtual objects more convincingly.[5]
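As a sketch of how a renderer might consume these estimates, the code below (assuming the session is configured with Config.LightEstimationMode.AMBIENT_INTENSITY, as in the earlier setup example) reads the per-frame light estimate that would typically be passed to shaders as uniforms.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate

// Reads ARCore's lighting estimate for the current frame so virtual content
// can be shaded to match the real scene.
fun readLightEstimate(frame: Frame) {
    val estimate: LightEstimate = frame.lightEstimate
    if (estimate.state != LightEstimate.State.VALID) return

    // Average brightness of the camera image, roughly 0.0 (dark) to 1.0 (bright).
    val intensity = estimate.pixelIntensity

    // RGB colour-correction factors plus average pixel intensity in the fourth slot.
    val colorCorrection = FloatArray(4)
    estimate.getColorCorrection(colorCorrection, 0)

    println("intensity=$intensity colorCorrection=${colorCorrection.joinToString()}")
}
```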
Depth understanding uses the phone's camera to create depth maps, which enable the device to determine more accurately how far away the captured surfaces are.[6]
Each depth map records the distance between the camera and the objects or surfaces in view, giving the device a measurable model of the real world.
A depth-from-motion algorithm uses the motion of the user's camera, comparing images captured from different positions, to create a more detailed depth map.[7]
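The following Kotlin sketch, assuming a device that supports Config.DepthMode.AUTOMATIC, enables depth and samples the depth map at the image centre; it follows the 16-bit-millimetre pixel layout used by the Frame.acquireDepthImage16Bits call in recent SDK versions, and the exact names should be checked against the version in use.

```kotlin
import java.nio.ByteOrder
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Enables depth where supported. A fresh Config is used here for brevity.
fun enableDepth(session: Session) {
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        val config = Config(session).apply { depthMode = Config.DepthMode.AUTOMATIC }
        session.configure(config)
    }
}

// Returns the estimated distance, in millimetres, from the camera to whatever
// is at the centre of the current depth image, or null if depth data is not
// yet available for this frame.
fun centreDepthMillimetres(frame: Frame): Int? {
    return try {
        val image = frame.acquireDepthImage16Bits()
        val plane = image.planes[0]
        val x = image.width / 2
        val y = image.height / 2
        val byteIndex = x * plane.pixelStride + y * plane.rowStride
        val buffer = plane.buffer.order(ByteOrder.nativeOrder())
        // Each pixel stores depth as a 16-bit unsigned value in millimetres.
        val depthMm = buffer.getShort(byteIndex).toInt() and 0xFFFF
        image.close()
        depthMm
    } catch (e: NotYetAvailableException) {
        null
    }
}
```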