
Apple’s ARKit 3: Tech that you want more of

Though there is no word on an AR device yet, Tim Cook’s comments actively support AR development

Mathures Paul Published 13.07.19, 01:33 PM
A round of AR-powered strike bowling on the iPad Picture: Apple

Imagine this. A room with two men clutching their iPads, pushing a giant striped bowling ball between them, stopping it from crashing into oversized pins kept at either end of the room. In this game of strike bowling, the only things real are the room, the people and the iPads. The rest… augmented reality.

ARKit from Apple has been a game changer. “I see AR as being profound. AR has the ability to amplify human performance instead of isolating humans. So I am a huge, huge believer in AR. We put a lot of energy on AR. We’re moving very fast,” Apple CEO Tim Cook had said during the company’s first quarter earnings call last year. Though there is no word on an AR device yet, his response actively supports AR development.

What was launched at WWDC in 2017 has come a long way. ARKit has already helped the Indian company Designmate win the iPad App of the Year award for 2018 with Froggipedia (a virtual dissection app), and now that the Cupertino-headquartered company has revealed ARKit 3, RealityKit and Reality Composer, we are looking forward to new developments.

What makes ARKit 3 special is its straightforward delivery of complex ideas. If you haven’t worked with 3D before, traditional gaming engines are difficult to master; the learning curve can be pretty steep when it comes to rendering, animation, scenes and lighting effects.

RealityKit and Reality Composer

Since ARKit launched two years ago, the App Store has gained 7,000-plus AR-enabled apps across categories, from enterprise and education to games and healthcare. The point of the kit is not AR for the sake of AR; Apple wants developers to be able to visualise things that would otherwise be difficult, impossible or expensive to show. ARKit is now accompanied by RealityKit and Reality Composer.

RealityKit has brand new physics, animation, rendering and spatial audio engines, designed to take the best of everything Apple has built into ARKit and give developers a way to automatically make content that looks like it is part of the real world, which is not something other engines have done successfully.

Here’s an example. Pretend a phone is part of the AR world. You could place it on the table and believe it is real. How real? The reflections on the phone match those in the real world, and when kept on the table it casts shadows. RealityKit also brings an understanding of physics: tap the virtual phone and it falls over. The lighting comes across as real too. All this, needless to say, makes for great apps.
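
To give a sense of how little code that takes, here is a minimal RealityKit sketch, not taken from Apple’s demo: it anchors a small box on a detected tabletop and gives it a collision shape and a dynamic physics body so it can topple when nudged, much like the virtual phone described above. The sizes and materials are our own illustrative choices.

```swift
import ARKit
import RealityKit

// A minimal sketch: anchor a small box on a detected horizontal plane and give
// it physics so it reacts like a real object. Sizes and materials are illustrative.
let arView = ARView(frame: .zero)

// Anchor content to any horizontal plane ARKit finds, such as a tabletop.
let anchor = AnchorEntity(plane: .horizontal)

// A box with a metallic material so it picks up environment reflections.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .gray, isMetallic: true)]
)

// Collision and physics components let RealityKit's physics engine take over:
// the box responds to gravity and to anything that bumps into it.
box.generateCollisionShapes(recursive: true)
box.components.set(PhysicsBodyComponent(massProperties: .default,
                                        material: nil,
                                        mode: .dynamic))

anchor.addChild(box)
arView.scene.addAnchor(anchor)
```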

The second part is Reality Composer. Anyone can quickly prototype and produce content for AR experiences that is ready to integrate into apps using Xcode or export to AR Quick Look. Reality Composer lets you build animations and interactions on iOS and Mac to enrich your 3D content. All the tools to create scenes are there. You want objects like arrows, signs and charts for enterprise, or games like chess and billiards… it’s there. Then there are textures, lighting effects, colour grading, materials and so on. We tried making a chessboard in which tapping the knight would make it “behave” in a certain way: tap to flip, jiggle and bounce. The jiggle itself comes in basic, wild and playful variants. All it took was five minutes. Even iOS’s new Dark Mode was thrown into the mix.
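
Getting that chessboard into an app is a few lines of Swift, because Xcode generates a loader for each Reality Composer scene. A hedged sketch, assuming a Reality Composer file named Experience.rcproject containing a scene called Chessboard; both names are placeholders of our own, not names Apple uses:

```swift
import ARKit
import RealityKit

// Hypothetical sketch: assumes the Xcode project contains a Reality Composer
// file "Experience.rcproject" with a scene named "Chessboard" whose tap
// behaviours (flip, jiggle, bounce) were authored in the Composer.
let arView = ARView(frame: .zero)

do {
    // Xcode auto-generates this loader from the project and scene names above.
    let chessboard = try Experience.loadChessboard()
    arView.scene.anchors.append(chessboard)
} catch {
    print("Could not load the Reality Composer scene: \(error)")
}
```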

In the coming days, we are going to see more educational experiences using AR. There is also a huge opportunity on the e-commerce side. Say you are in a store and want to see how a TV would look placed on a table at home: you can do just that. Using Reality Composer, you can add a tap that turns on the lights and pops up windows with pricing and shipping details.
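
A store could offer such a preview with nothing more than AR Quick Look. Below is a rough sketch of presenting a USDZ model of a TV through the standard QuickLook preview controller, so the shopper can place it on a real table through the camera; the “tv.usdz” asset name is our own placeholder.

```swift
import ARKit
import QuickLook
import UIKit

// A rough sketch of a retail-style AR Quick Look preview. The "tv.usdz" asset
// is a placeholder; any USDZ file exported from Reality Composer would do.
final class ProductPreview: NSObject, QLPreviewControllerDataSource {
    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let url = Bundle.main.url(forResource: "tv", withExtension: "usdz")!
        // ARQuickLookPreviewItem lets the shopper place the model on a real surface.
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```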

Big steps forward

In the last two years, Apple’s AR tools have gone from understanding horizontal surfaces to vertical surfaces, and then to understanding objects and faces, so you can put AR content in the real world in a variety of contexts. But there has been a catch: when a person walks through AR content, the content usually gets drawn on top of the body, which kills the immersive experience.
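
For developers, that progression shows up as a handful of options on one session configuration. A minimal sketch, assuming the reference images and scanned objects live in an asset catalogue group we have called “AR Resources”:

```swift
import ARKit

// A minimal sketch of the progression described above: one world-tracking
// configuration can look for horizontal and vertical planes and recognise
// known images and scanned objects.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]

// "AR Resources" is our placeholder for an asset catalogue group holding
// reference images and reference objects.
configuration.detectionImages =
    ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) ?? []
configuration.detectionObjects =
    ARReferenceObject.referenceObjects(inGroupNamed: "AR Resources", bundle: nil) ?? []

// Then run it on the session of an ARView or ARSCNView:
// arView.session.run(configuration)
```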

People occlusion is a big update. The camera can now understand where to place AR content, in front of or behind a person. If a person is moving between an AR table and chair, the software can work out whether to draw the furniture in front of them or behind them.
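
In ARKit 3 this is switched on per session, with a check for device support since the feature relies on the newer Bionic chips. A minimal sketch:

```swift
import ARKit

// A minimal sketch of people occlusion: with depth-aware person segmentation
// enabled, virtual furniture is drawn behind a person who steps in front of it
// instead of on top of them.
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// arView.session.run(configuration)
```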

Also, imagine being able to dance with an AR creation and have the movements of the human in the scene drive the movements of the AR creation. The software can analyse movements of the head, torso, legs and wrists. Traditionally, doing this has required one of those expensive motion-capture suits, camera rigs and a green screen. What Apple is doing instead is harnessing the full power of its Bionic processor.
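
That analysis is exposed through ARKit 3’s body tracking. A hedged sketch of reading joint positions from a tracked body, with the printed output standing in for whatever an app would do with the data:

```swift
import ARKit

// A sketch of ARKit 3 motion capture: a body-tracking session reports an
// ARBodyAnchor whose skeleton exposes joint transforms (head, hands, feet and
// so on) that an app could map onto an AR character.
final class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let body = anchor as? ARBodyAnchor else { continue }
            // Transform of one tracked joint, relative to the body's root.
            if let head = body.skeleton.modelTransform(for: .head) {
                print("Head position: \(head.columns.3)")
            }
        }
    }
}

// Usage sketch: keep a strong reference to the delegate, then run a
// body-tracking configuration on the view's session.
// let bodyDelegate = BodyTrackingDelegate()
// arView.session.delegate = bodyDelegate
// arView.session.run(ARBodyTrackingConfiguration())
```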

There is also simultaneous front and back camera support, which means apps can be developed to track the face as well as see the world at the same time. Take an app like Wonderscope, one of the best educational apps around. The front camera can be made to pick up facial signals that help guide the story being told through the rear camera. A secret facial signal can ask a character to go left or right. Instead of having to talk to a character, you can use your face as part of the experience. Something like this is certain to draw people deeper into the immersive experience.
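
In code, the combination looks roughly like this; the “raised eyebrows” signal is our own invented example of the kind of cue a Wonderscope-style app could use, not something from the app itself:

```swift
import ARKit

// A sketch of simultaneous front- and back-camera use: the rear camera tracks
// the world while the front camera delivers face anchors, so an expression can
// steer the AR scene. The eyebrow-raise "secret signal" is illustrative.
final class FaceSignalDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let face = anchor as? ARFaceAnchor else { continue }
            if let browRaise = face.blendShapes[.browInnerUp]?.floatValue,
               browRaise > 0.6 {
                print("Brow raise detected: nudge the character to the left")
            }
        }
    }
}

let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    // Front camera supplies ARFaceAnchors while the rear camera tracks the world.
    configuration.userFaceTrackingEnabled = true
}

// Usage sketch: keep a strong reference to the delegate, assign it, then run.
// let faceDelegate = FaceSignalDelegate()
// arView.session.delegate = faceDelegate
// arView.session.run(configuration)
```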

A big area where AR can make huge progress is healthcare. Last year, Butterfly Network announced that it had developed AR telemedicine technology for Butterfly iQ, its personal whole-body ultrasound imager. Using Butterfly Tele-Guidance technology, an ultrasound expert can remotely guide any user to acquire even the most challenging ultrasound scans. Only a fraction of the millions of healthcare workers have the expertise to capture and interpret ultrasound images; Butterfly Tele-Guidance expands their reach.

It’s apps like these that make AR technology shine. Another example is the Measure app: what was created to measure closets is now being used to measure cooking pots. We have to remember that AR is not a solution in itself; it’s not an answer to every technological problem. It is a fantastic tool for tasks that would otherwise be challenging. And Apple is in the driver’s seat.
