JetRuby Blog

To the Infinity and Beyond! — Implementing Augmented Reality in Android Using ARCore

A long time ago, Star Wars characters called each other using video calls, and it seemed like a real miracle. Now it's a matter of course: we can simply talk with our friends using our smartphones or desktop applications. In this article, we cover another once-futuristic topic that is quickly becoming everyday: AR (Augmented Reality).

Google has released ARCore in response to Apple's ARKit. ARCore is a library that allows you to create AR experiences on Android. The obvious question is: why do we actually need AR or VR at all when there's already a wide range of great applications available?

The answer is immersive computing: interacting with data in a more natural way. Put simply, AR adds digital content to our physical surroundings. Objects created with ARCore are scaled in real time, and the majority of current smartphones support AR, which makes it possible to blend 2D and 3D content with the real world.

Augmented Reality

Does Augmented Reality have practical value in real life? Absolutely: there are many fields where this technology can be used. For example, with AR you can see how the furniture you're going to buy looks in your living room (without buying and moving it!). And that's not all: AR lets you repaint your walls in any color, or install a LEGO app and place digital versions of selected LEGO sets in real-world scenes. These are only a few examples of AR applications.

Android AR grew out of Project Tango. However, Tango never became popular in the commercial market because it required a dedicated depth sensor. That sensor let the device place itself in a virtual environment that mirrored the real one.

Unfortunately, the sensor had limitations, such as a maximum range of about 4 meters, and it could not measure distances to bright or mirrored surfaces. As a result, Google decided to build AR for Android that doesn't depend on special hardware. ARCore was developed from the practical experience gained with Tango, but without relying on depth sensors. Today, ARCore has around 100 million users around the world.

At the heart of ARCore are three main capabilities: motion tracking, environment understanding, and light estimation.

Motion Tracking

As your smartphone moves through the world, ARCore combines visual data from the device's camera with readings from its IMU (inertial measurement unit) to estimate the pose (position and orientation) of the camera relative to the world over time. Two kinds of calibration matter here:

Optical

  • The pinhole model is a mathematical model describing the relationship between a point's coordinates in 3D space and its 2D projection on the image, given the camera's Field of View (FoV); it describes the perspective distortion of the image;
  • Photometric calibration describes how the sensor maps scene brightness to pixel colors.
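To make the pinhole model concrete, here is a minimal sketch. The `Pinhole` class and its intrinsic values are illustrative assumptions, not part of the ARCore API: it projects a 3D point in camera space onto the image plane using focal lengths (fx, fy) and a principal point (cx, cy).

```java
public class Pinhole {
    // Project a 3D point (x, y, z) in camera coordinates to pixel
    // coordinates (u, v). The division by z is the perspective effect:
    // points farther away land closer to the image center.
    public static double[] project(double x, double y, double z,
                                   double fx, double fy, double cx, double cy) {
        double u = fx * (x / z) + cx;
        double v = fy * (y / z) + cy;
        return new double[] { u, v };
    }
}
```

For example, a point half a meter to the right and two meters in front of a camera with fx = fy = 500 and a 640×480 image center lands at u = 500 · 0.25 + 320 = 445, v = 240.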

Inertial calibration

  • The IMU measures acceleration (not distance or velocity), so motion must be recovered by integration;
  • The inertial behavior is characterized statistically for each specific use case.
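A toy example of why working from raw acceleration is tricky: to get position, the samples have to be integrated twice, so any sensor error compounds into position drift. The `DeadReckoning` class below is a hypothetical sketch, not ARCore code.

```java
public class DeadReckoning {
    // Integrate per-step acceleration samples into velocity and then
    // position (semi-implicit Euler). A constant bias in `accel` grows
    // quadratically in the returned position, which is why visual
    // features are needed to correct inertial drift.
    public static double integratePosition(double[] accel, double dt) {
        double v = 0.0, x = 0.0;
        for (double a : accel) {
            v += a * dt;   // acceleration -> velocity
            x += v * dt;   // velocity -> position
        }
        return x;
    }
}
```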

Nevertheless, there’s a temperature issue related to the factory calibration of IMU. Various manufacturers produce devices with diverse temperatures, so the data on these devices differs.

Understanding the Environment

In general, ARCore looks for clusters of feature points appearing to lie on common horizontal surfaces and makes them available to your app as planes.

  • Understanding the environment is based on SLAM technology (simultaneous localization and mapping). A SLAM map is a graph of 3D points which represent a sparse point-cloud where each point corresponds to the coordinates of an Optical Feature in the scene (e.g. the corner of a table).


  • The same applies to the acceleration measurements: SLAM fuses them with the visual features into the same sparse dot map.
  • The main goal of SLAM is to build and update the map of an unknown environment while tracking the location of the agent inside it.
  • Like Tango, this feature-based approach also has issues with mirrored surfaces.
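A sparse SLAM-style map can be pictured as a dictionary from feature IDs to running-mean 3D positions, refined as the same feature is re-observed from new camera poses. The `SparseMap` class below is a simplified illustration of the idea, not ARCore's internal data structure:

```java
import java.util.HashMap;
import java.util.Map;

public class SparseMap {
    private final Map<Integer, double[]> landmarks = new HashMap<>();
    private final Map<Integer, Integer> counts = new HashMap<>();

    // Fuse a new 3D observation of a feature into its running mean
    // position; repeated observations average out measurement noise.
    public void observe(int featureId, double x, double y, double z) {
        int n = counts.merge(featureId, 1, Integer::sum);
        double[] p = landmarks.computeIfAbsent(featureId, k -> new double[3]);
        p[0] += (x - p[0]) / n;
        p[1] += (y - p[1]) / n;
        p[2] += (z - p[2]) / n;
    }

    public int size() { return landmarks.size(); }
    public double[] position(int featureId) { return landmarks.get(featureId); }
}
```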

Light Estimation

ARCore can detect the average intensity of a given image from the camera. This allows you to light your virtual objects under the same conditions as the environment around them.
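Conceptually, applying the light estimate is just scaling your object's base color by the scene's average intensity (ARCore exposes a similar scalar through its `LightEstimate` API). The `LightTint` helper below is a hypothetical sketch of that idea:

```java
public class LightTint {
    // Scale an object's base RGB color by the estimated scene intensity,
    // clamped to [0, 1], so virtual objects dim and brighten with the room.
    public static float[] shade(float[] rgb, float intensity) {
        float k = Math.max(0f, Math.min(1f, intensity));
        return new float[] { rgb[0] * k, rgb[1] * k, rgb[2] * k };
    }
}
```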

If you want to use ARCore in your Android apps, follow these steps:

  1. Download and install the ARCore SDK.
  2. Make sure you have a basic understanding of Android development with OpenGL.
  3. You'll also need a general understanding of the Android framework.

The main ARCore classes are Session, Frame, Plane, and Anchor. Let’s look at them more closely:

Session. First, check whether the device supports ARCore. If it does, create a session and its configuration.

Frame. It’s necessary to define the camera’s object and update the image. The main question is what to use? Blocking or the latest camera image? With the use of Blocking, every pose aligns within every frame. This allows for reduction of rendering to the camera’s frames rate. On the other hand, the unblocking method allows for rendering objects as quickly as possible.

Anchors. In the onDraw() method, we check whether a touch falls within range of a tracked surface. If it does, we place an anchor at that point. However, remember that the maximum number of anchors is 20.
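The 20-anchor budget can be enforced with a small helper that drops the oldest anchor when the limit is hit (in real ARCore code you would call `Anchor.detach()` on the evicted one). `AnchorBudget` is a hypothetical illustration:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class AnchorBudget {
    public static final int MAX_ANCHORS = 20;

    // Poses are simplified to float arrays here; a real implementation
    // would hold com.google.ar.core.Anchor objects instead.
    private final Deque<float[]> anchors = new ArrayDeque<>();

    public void placeAnchor(float[] pose) {
        if (anchors.size() >= MAX_ANCHORS) {
            anchors.pollFirst(); // evict (detach) the oldest anchor
        }
        anchors.addLast(pose);
    }

    public int count() { return anchors.size(); }
}
```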



Projection Matrix.
The camera has a pose giving its location and orientation in the world; the projection matrix is what makes the virtual camera match the real one. In the onDraw() method, we get the projection matrix from the frame, specifying the near and far clipping planes between which objects are rendered. For example, we can track objects starting from 10 cm away and ending at 100 meters.
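For reference, an OpenGL-style perspective matrix with a 0.1 m near plane and a 100 m far plane can be built as follows. `Projection` is a hypothetical plain-Java helper sketching what ARCore's `Camera.getProjectionMatrix()` fills in; it is not the ARCore API itself:

```java
public class Projection {
    // Build a column-major 4x4 perspective matrix (the layout OpenGL and
    // android.opengl.Matrix use). near/far are in meters, e.g. 0.1f and 100f.
    public static float[] perspective(float fovYDeg, float aspect,
                                      float near, float far) {
        float f = (float) (1.0 / Math.tan(Math.toRadians(fovYDeg) / 2.0));
        float[] m = new float[16]; // all other entries stay 0
        m[0]  = f / aspect;
        m[5]  = f;
        m[10] = (far + near) / (near - far);
        m[11] = -1f; // perspective divide by -z
        m[14] = (2f * far * near) / (near - far);
        return m;
    }
}
```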

From the frame object, we get the point cloud (remember SLAM and its feature points?). We update the point-cloud renderer by giving it the projection matrix and the camera's view matrix. The projection matrix holds the camera's optical properties, the view matrix holds the camera's position and orientation, and the model matrix holds the anchor's location in the world; multiplied together, they place each point at the right pixel on the screen.
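The three matrices combine by multiplication: clip = projection × view × model × vertex. A column-major 4×4 multiply, matching the layout `android.opengl.Matrix` uses, can be sketched with this hypothetical `Mvp` helper:

```java
public class Mvp {
    // out = a * b for column-major 4x4 matrices: element (row, col)
    // lives at index col * 4 + row, as in OpenGL.
    public static float[] multiply(float[] a, float[] b) {
        float[] out = new float[16];
        for (int col = 0; col < 4; col++) {
            for (int row = 0; row < 4; row++) {
                float s = 0f;
                for (int k = 0; k < 4; k++) {
                    s += a[k * 4 + row] * b[col * 4 + k];
                }
                out[col * 4 + row] = s;
            }
        }
        return out;
    }

    public static float[] identity() {
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1f;
        return m;
    }
}
```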

Render objects. The next step is to iterate over all anchors and update the objects we want to draw: we update each object's model matrix and the matrix of its shadow, then call their draw() methods. The scene's light intensity is passed as a parameter to each object, so the object and its shadow are lit to match the environment.

The Bottom Line

That’s pretty much everything you need in a nutshell to implement ARCore in an Android application. It’s helpful to understand 3D model rendering, but we’ll save that for a future article.

So where can AR technology be super useful? Unfortunately, there's no single flashy feature yet that makes the technology a crowd-pleaser. Our current task is to build and sustain user interest by focusing on the applications users find most appealing. Among these are apps that can furnish your home or change the color of leaves, furniture, or lipstick using the full potential of Augmented Reality.

Don't forget about the Chinese market, where Google Play is not available. This area has great potential for AR; however, an alternative to ARCore there is yet to be released.

Despite the fact that the AR project was once paused due to various problems (some of them related to legislation), Google has reopened it with a focus on industrial applications, such as production facilities, factories, plants, and other tech-heavy businesses.

 

As you can see, AR technology keeps advancing and growing. Who knows what we can expect from AR in the next few years? Will it be able to unlock its full potential? Leave your thoughts in the comments and stay tuned!


P.S. We hope you enjoyed today's article =) If so, please support it with claps and by hitting the subscribe button below!

This post has been contributed by JetRuby Agency.