We’ve come a long way since Pokémon Go introduced Augmented Reality to everyday smartphone users back in 2016, and it was time the tech industry realized that this technology brings something unique to the table. Augmented Reality is not just a gimmick: overlaying digital objects on the physical world lets us interact with it in entirely new ways, and this is only the start. So let’s find out what exactly Augmented Reality is.
Augmented Reality, a.k.a. AR, is an interactive experience of a real-world environment in which computer-generated perceptual information enhances objects in the real world, sometimes across multiple sensory modalities, including the visual, auditory, haptic, and somatosensory. AR can be summed up as a system with three primary characteristics: a blend of real and virtual worlds, real-time interaction, and accurate registration of virtual objects in the real environment. The experience is seamlessly woven into the physical world so that it is perceived as an immersive aspect of the environment. In doing this, AR modifies one’s ongoing perception of a real-world setting, whereas VR substitutes the user’s real-world setting with a simulated one. In that sense, AR is closely related to mixed reality and computer-mediated reality.
Smartphones are the most widely used devices for AR; for the past few years, mobile devices have shipped with AR frameworks built in by their manufacturers or operating-system providers. Mobile AR has the advantage of a widely distributed hardware base of smartphones and tablets, and because AR exists within the world around us, it makes sense for AR programs to be mobile. There are many platforms and frameworks for AR development and integration, but here we are talking about mobile AR, and there are only two significant ecosystems that cover most of the market.
The first steps toward AR using nothing but an Android smartphone began seven years ago with Project Tango, developed by Google. You needed a specific Google-made device to run Tango, which limited the project to only a handful of developers. Building on that work, Google developed ARCore. The major benefit of ARCore is that it runs without any additional hardware, which makes it scalable across the Android ecosystem. ARCore is widely supported on Android as well as iOS devices and comes built in on newer Android phones.
ARKit is Apple’s AR platform for iOS devices. Developers can build interactive apps that engage with the real world around the user by utilizing data collected in real time from the device’s sensors and cameras. Apple has been betting on the potential of AR for four years now: it launched ARKit in 2017 alongside iOS 11, shipped another major update with iOS 12, and keeps the framework updated with each new iOS release.
ARCore or ARKit? Which one is better?
The fundamentals of both of these software development kits are very similar, built on the three primary requirements for useful AR software: environmental understanding, motion tracking, and light estimation. Both ARKit and ARCore implement all three well, although their strategies differ slightly.
- Motion Tracking
AR tech needs a device capable of tracking its own position and orientation within the user’s real-world surroundings. ARKit and ARCore both do this very efficiently with a technique known as visual-inertial odometry (VIO). VIO combines data from a device’s motion sensors and camera to recognize movement across six degrees of freedom, allowing virtual AR objects to remain precisely positioned in the real world, which is essential for a convincing AR experience.
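The fusion idea behind VIO can be sketched with a toy one-dimensional example. This is a deliberately simplified illustration of the principle, not how ARKit or ARCore actually implement it; the function names and numbers here are invented for the demo:

```python
# Toy 1-D visual-inertial fusion using a complementary filter.
# Illustration only: real VIO fuses full 6-DoF poses with far more
# sophisticated estimation than this.

def fuse(gyro_rates, camera_angles, dt=0.01, alpha=0.98):
    """Blend integrated gyro rates (fast but drifting) with camera
    angle observations (slower but drift-free)."""
    angle = 0.0
    history = []
    for rate, cam in zip(gyro_rates, camera_angles):
        predicted = angle + rate * dt                   # dead-reckon from the IMU
        angle = alpha * predicted + (1 - alpha) * cam   # pull toward the camera fix
        history.append(angle)
    return history

# A biased gyro reads 1.0 rad/s while the true rate is 0.9 rad/s.
# Integrating the gyro alone would drift; the camera term bounds the error.
rates = [1.0] * 100
camera = [0.009 * (i + 1) for i in range(100)]  # camera observes the true angle
fused = fuse(rates, camera)
```

With the gyro alone, the estimate would overshoot by the full accumulated bias; the periodic camera corrections keep the fused estimate close to the true trajectory, which is exactly why VIO holds virtual objects in place.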
- Plane Detection
As well as keeping virtual objects in place, an AR app must know where, and on what, it is placing them. Plane detection is the technology responsible for that. ARKit and ARCore apps can distinguish horizontal, vertical, and angled planes in the camera’s field of view, so virtual AR items can sit realistically on real surfaces.
ARCore describes this as environmental understanding, whereas Apple names it scene understanding. Beyond the difference in naming, there is only a minor difference in the technologies employed by the two SDK developers.
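The core geometric idea can be sketched as follows; this is a toy illustration with invented helper names, not either SDK’s actual API, and the real frameworks fit planes to dense feature-point clouds rather than three points:

```python
# Toy plane classification: estimate a surface normal from three
# detected feature points and label the plane's orientation.

def cross(u, v):
    """Cross product of two 3-D vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def classify_plane(p0, p1, p2, up=(0.0, 1.0, 0.0), tol=0.2):
    """Return 'horizontal', 'vertical', or 'angled' for the plane
    through three (x, y, z) points, with y as the gravity-aligned axis."""
    u = tuple(a - b for a, b in zip(p1, p0))
    v = tuple(a - b for a, b in zip(p2, p0))
    n = cross(u, v)
    norm = sum(x * x for x in n) ** 0.5
    # |cos| of the angle between the plane normal and "up"
    align = abs(sum(x * y for x, y in zip(n, up))) / norm
    if align > 1 - tol:
        return "horizontal"   # normal points up: a floor or table top
    if align < tol:
        return "vertical"     # normal perpendicular to up: a wall
    return "angled"

table = classify_plane((0, 1, 0), (1, 1, 0), (0, 1, 1))  # a table top
wall = classify_plane((0, 0, 2), (1, 0, 2), (0, 1, 2))   # a wall
```

Comparing the surface normal against the gravity direction is what lets an app place a lamp on a table but hang a painting on a wall.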
- Lighting Estimation
Virtual objects should be subject to the same lighting dynamics as the real-world scene so that AR visuals look realistic and blend with the environment. ARKit and ARCore tackle this challenge through a method called lighting estimation, which analyzes the ambient light sensed by a device’s camera and light sensors to apply photorealistic rendering to virtual objects.
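A minimal sketch of the idea, with invented function names and a flat grayscale frame standing in for real camera data; the actual SDK APIs expose much richer information, such as color temperature and HDR light probes:

```python
# Toy lighting estimation: scale a virtual object's albedo by the
# ambient intensity estimated from a camera frame, so the rendered
# object matches the scene's brightness.

def estimate_ambient_intensity(frame_luma):
    """Mean luminance of a grayscale frame, normalized to 0..1."""
    return sum(frame_luma) / (len(frame_luma) * 255.0)

def shade(albedo_rgb, intensity):
    """Darken or brighten a virtual object's base color to match
    the estimated ambient light."""
    return tuple(round(c * intensity) for c in albedo_rgb)

dim_room = [40] * 16       # mostly dark pixels
bright_room = [220] * 16   # well-lit pixels

red_cube = (200, 30, 30)
dim_shaded = shade(red_cube, estimate_ambient_intensity(dim_room))        # much darker red
bright_shaded = shade(red_cube, estimate_ambient_intensity(bright_room))  # near full brightness
```

A virtual object rendered at full brightness in a dim room immediately reads as fake; dimming it to match the measured ambient light is the simplest version of what lighting estimation does.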
Apart from these fundamentals, every new AR development platform introduces features that rely on proprietary technologies. For example, on the Apple iPhone X it’s the TrueDepth camera, and on Android devices it’s the depth-sensing technology that grew out of Project Tango.
Differences in capability between the two SDKs are fairly subtle. Both tools let developers build games and more practical apps that immerse users in an AR experience, and their tracking, motion, and lighting technologies ensure users won’t feel they are simply looking at visual data overlaid on a camera feed.
Apple wants AR on every device, whether it has dual cameras or not, whereas Google is clear that ARCore is not a repackaged Tango, even though it delivers the Tango experience and reaches millions of devices. When Apple says a feature will be deployed to millions of iOS devices, it happens almost instantly; when Google says the same, it takes time. ARCore is already running on millions of Android devices worldwide, but only on those meeting conditions such as running Android 7.0 Nougat or above. Android’s high level of fragmentation may turn out to be the major problem for Google’s mixed-reality ambitions.