4 AR insights we can take away from the Apple Event

The 12th of September has once again come and gone - and once again we are left in its wake with three new iPhones and a new Apple Watch. Every year we hold our breath awaiting another "one more thing" moment that just never comes. Year after year, disappointment and criticism reign across the internet while Apple stubbornly continues to grow nonetheless - because, ironically, everyone rushes out to buy the new iPhone models they actually wanted and enjoy.

A week has now passed, the dust has settled, and emotions are now out of the mix. Let's take a look at what the Apple event told us about the future of AR.

ARKit2

ARKit2 is sure to be where we see the most short-term improvement in Apple AR development. While none of the announcements were genuinely new - Apple unveiled ARKit2 back in June - the promising news is that developers are picking up the SDK, and a range of creative applications is beginning to roll out. Of course, Apple showcased this in classic form as a multiplayer game. That is a little disappointing, as ARKit2 has the power to do so much more - for example, the Measure app shown at the June unveiling. Measure is a true AR tool that could become a natural part of our everyday lives (much like maps), simply because it is useful.

It really feels like Apple wants us to use AR, but they still haven't figured out how to convince the general public of its strengths and daily usefulness. Instead, they remain stuck on the novelty of AR games. But all is not dismal: there are plenty of applications that showcase the power of ARKit, such as IKEA Place and our own Augmented Repair application. Keep your eyes open over the coming weeks and months as more and more applications begin incorporating AR features that are quick and easy to build with ARKit2, or with Google's ARCore equivalent for Android.

iOS 12 + AR Quick Look

One particularly big announcement was the release of iOS 12, which now features AR Quick Look. This exciting addition makes it possible to view AR content right in the web browser, with no separate application needed. In layman's terms, Apple has made the usdz file format natively readable in iOS 12. Before you ask: usdz is one of several 3D file formats used to place AR objects in the real world. Just imagine having a jpeg, a png, and a usdz in your repertoire. Native support means you can send AR objects the way you would send images, or add them to your web page. Tap one, and the camera opens to display the 3D object in the real world.

Naturally, this is set to have a significant impact on online shopping. However, as with all new tech, we are sure to see it emerge in many other fields. Note that usdz is currently the only format supported. Could this mean that Apple is establishing usdz as a standardized 3D format for the future...?
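To make this concrete, here is a minimal sketch of how a web page could offer a usdz model through AR Quick Look in Safari on iOS 12. Apple's convention is an anchor tagged with rel="ar" wrapping a single preview image; the file and image names below are placeholders, not real assets.

```html
<!-- AR Quick Look link for Safari on iOS 12.
     "chair.usdz" and "chair-preview.jpg" are hypothetical file names. -->
<a rel="ar" href="models/chair.usdz">
  <img src="images/chair-preview.jpg" alt="Armchair preview">
</a>
```

On a supporting device, Safari overlays an AR badge on the preview image, and tapping it launches the camera-based AR Quick Look viewer instead of navigating to the file.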


CoreML

Our favorite announcement of the day was CoreML. This is a feature we believe will gain high relevance in the next 1-2 years. All of the big three (Apple, Facebook, and Google) are currently working on on-device machine learning, and for good reason: it is sure to impact everything we do on our devices, including AR. CoreML is set to accelerate on-device processing, making every aspect of the phone more intelligent - for example, the camera (for recognition).

Just imagine that your device can use your GPS, Image Recognition (via the camera) and online data to locate you and give you a guided tour of the city. Alternatively, point your camera at your coffee machine, and the device will identify the make and model, connect to its IoT troubleshooter to guide you through setup or maintenance. Very exciting stuff!
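The coffee-machine scenario above starts with one building block: recognizing an object in a camera frame on-device. A minimal sketch of that step with CoreML and the Vision framework might look like the following - "MobileNetV2" stands in for whatever compiled .mlmodel an app actually bundles, and the function name is our own.

```swift
import CoreML
import Vision

// Sketch: classify a single camera frame on-device with CoreML + Vision.
// "MobileNetV2" is a placeholder for any image-classification model
// compiled into the app bundle.
func classify(_ frame: CGImage) {
    guard let model = try? VNCoreMLModel(for: MobileNetV2().model) else { return }

    // Vision wraps the CoreML model and hands back ranked classifications.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let best = results.first else { return }
        print("Recognized: \(best.identifier) (confidence \(best.confidence))")
    }

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

From there, the recognized make and model could key a lookup against an IoT troubleshooting service - exactly the kind of glue that turns recognition into the guided experiences described above.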

New and improved Camera

Apple's senior vice president of worldwide marketing, Phil Schiller, devoted a large portion of his stage time to highlighting the prowess of the new camera and its features. Although we could use this point to resume the age-old discussion of whether smart-"phones" have transformed into pimped-out cameras with calling and internet capabilities, let's instead think about what this means for AR.

AR is all done through the camera, so a better camera = better AR, right?

Sadly, this is not the case. A major leap forward would have been the inclusion of AR-specific cameras, such as depth cameras or wide-angle, high-frequency cameras, which could dramatically improve object recognition and tracking. This is a big deal if you think about it: our entire world is in 3D. We therefore need 3D depth information to measure 3D spaces and place augmentations into them effectively. Hence the importance of the depth camera.

The strange part is that this technology is not new at all. We have seen it commercially in the Microsoft Kinect, in Google Tango devices such as the revolutionary Peanut phone (2014), and more recently in the Microsoft HoloLens (2016). Although Tim Cook seems to be wild about AR, the depth camera may well be reserved for the ever-elusive Apple glasses that are sure to appear in the coming years.