Improved perception with AR.
A technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view.
In layman’s terms, imagine seeing the real world through extra sensors: your eyes plus thermal-imaging capabilities, or visualising a furniture set-up in an empty space. AR can be used in almost any field where it is difficult to physically place items to check how they look. As a result, perception becomes less laborious.
What can AR do?
It changes the way
- you shop for furniture
- renovate your house
- select paint colours and combinations for walls
- or even see your food before it is prepared, like a pizza customised for you on an empty dish
Let’s see it in action:
This is Apple’s latest SDK, introduced a month ago; it helps developers add augmented-reality capabilities, as shown in the video above.
Google has its own answer to Apple’s ARKit. Named ARCore, this SDK helps you add an augmented-reality layer to your camera application.
Let’s get our hands dirty playing with the SDK.
- Download the SDK from GitHub
- Add the dependencies and jump into Android Studio
- Voila, there is no third step.
The ARCore SDK is a single dependency that adds a layer for motion tracking and for learning the environment around the device.
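As a sketch, that single-dependency step might look like this in your app module’s `build.gradle`. The artifact coordinates and version below are placeholders; check the release you downloaded for the actual values.

```groovy
dependencies {
    // Hypothetical coordinates and version; the real ones depend on
    // the ARCore SDK release you are using.
    implementation 'com.google.ar:core:1.0.0'
}
```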
Here’s Bugdroid on my desk:
How does it work?
In computer vision, motion tracking is a set of algorithms that maps the video-stream input onto a Cartesian coordinate system, estimating the camera’s position relative to the objects in the scene.
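To make that mapping concrete, here is a toy 2-D sketch (not ARCore’s internals): a camera pose is a rotation plus a translation, and applying it converts a point seen in the camera’s frame into world Cartesian coordinates.

```java
// Toy illustration of pose tracking: a 2-D pose (rotation theta,
// translation tx/ty) maps camera-space points into world space.
public class PoseDemo {
    // Apply the pose to a point (x, y) seen by the camera.
    public static double[] toWorld(double theta, double tx, double ty,
                                   double x, double y) {
        double wx = Math.cos(theta) * x - Math.sin(theta) * y + tx;
        double wy = Math.sin(theta) * x + Math.cos(theta) * y + ty;
        return new double[] { wx, wy };
    }

    public static void main(String[] args) {
        // Camera rotated 90 degrees and moved to (2, 0);
        // a point 1 unit ahead of the camera lands at (2, 1) in the world.
        double[] w = toWorld(Math.PI / 2, 2.0, 0.0, 1.0, 0.0);
        System.out.printf("world = (%.2f, %.2f)%n", w[0], w[1]);
    }
}
```

As the camera moves, the SDK keeps re-estimating this pose frame by frame, which is why virtual objects stay glued to real-world spots.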
The sample shown above
- detects a flat surface using OpenGL libraries
- plots and learns the environment as the camera angle changes
- superimposes a pre-defined object on the detected co-ordinates
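The flat-surface step can be illustrated with toy geometry (not the SDK’s actual algorithm): three detected feature points define a plane, and the cross product of two edge vectors gives its normal. A normal pointing straight up suggests a horizontal surface like a desk or floor.

```java
// Toy illustration of flat-surface detection: compute the unit normal
// of the plane through three 3-D feature points.
public class PlaneDemo {
    public static double[] normal(double[] a, double[] b, double[] c) {
        double[] u = { b[0] - a[0], b[1] - a[1], b[2] - a[2] };
        double[] v = { c[0] - a[0], c[1] - a[1], c[2] - a[2] };
        // Cross product u x v is perpendicular to the plane.
        double nx = u[1] * v[2] - u[2] * v[1];
        double ny = u[2] * v[0] - u[0] * v[2];
        double nz = u[0] * v[1] - u[1] * v[0];
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        return new double[] { nx / len, ny / len, nz / len };
    }

    public static void main(String[] args) {
        // Three points on the floor plane z = 0: the normal points straight up.
        double[] n = normal(new double[] { 0, 0, 0 },
                            new double[] { 1, 0, 0 },
                            new double[] { 0, 1, 0 });
        System.out.printf("normal = (%.1f, %.1f, %.1f)%n", n[0], n[1], n[2]);
    }
}
```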
ARCore is still in preview (alpha stage), but it looks promising enough that it could bring drastic changes to the e-commerce industry.
The key takeaway is user experience: simulating actual products helps you make decisions from the comfort of your home, without the tedium of commuting to a store.