The Mobile Diary: Experimentation with ARCore v1.9.0 (Pt. 1)


Introduction

Hello there people of the internet! My name is Michael and I love mobile app development. On top of that, I love augmented reality. I have played around with ARCore in the past, but have always felt that it fell short of what Apple has done with ARKit. My pain points were that it didn’t support image tracking and that the refresh rate was slow, causing objects to stutter and skip around. It just didn’t feel fluid or polished in any way. Not to mention, it sometimes just didn’t work correctly on some supported devices.

Now I shouldn’t be too hard on Google. What they have done with ARCore is very impressive. They are truly moving mountains. Getting AR to run on a whole host of devices, all with different hardware, is no easy feat. Now I know that it seems I’m being tough on ARCore, but it actually does some things really well! One example of this is dynamic lighting. I have always felt that ARCore’s dynamic lighting is superior to Apple’s ARKit implementation. That being said, ARCore has always just left me wanting more.

Fast forward to this year’s Google I/O event (I/O 2019). The ARCore team has just released its latest build, v1.9.0, and Google seems to be giving augmented reality a good push in the right direction. They mentioned augmented reality numerous times throughout the I/O main keynote, and on top of that they hosted a number of talks and sessions all aimed at augmented reality!


I excitedly watched as many of them as I could. The ARCore team touted various performance changes, like faster tick rates (or so I think; I always thought ARCore had a 16 Hz tick rate and now it’s 30 Hz? Not sure if this was updated a while back or in this new release) and faster planar detection. They also announced augmented image tracking and the ability to change how augmented images are tracked. I think it’s time to try out these new features and test Google’s performance claims. Let’s get started!



Creating The Blueprint

For this app, we want to test the improvements Google touted at I/O this year and also try out the new features, such as image tracking. How should we go about this? That’s a great question. I am going to break this app down into a few main pieces.

(Image: Blueprint)

The first part I want to focus on is the motion tracking (not image tracking, but motion tracking). Google claims that they have seen a 30% increase in motion tracking robustness. In other words, ARCore world tracking should be more stable and less jerky. Along with this, Google has added additional tracking failure reasons to better explain to the end user what they are doing wrong and how to correct it. So the first part of this app will be a simple 3D model viewer, where the user places a 3D model of Andy the android! I will be testing to see how well Andy stays in place and will take note of any drifting that occurs.
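
As a preview of what that viewer boils down to in Sceneform, here is a minimal sketch. It is a sketch under assumptions, not final project code: the function name and the preloaded andyRenderable are names I am inventing for illustration.

    import com.google.ar.sceneform.AnchorNode
    import com.google.ar.sceneform.rendering.ModelRenderable
    import com.google.ar.sceneform.ux.ArFragment
    import com.google.ar.sceneform.ux.TransformableNode

    // Hypothetical sketch: when the user taps a detected plane, anchor Andy there.
    fun setUpTapToPlace(arFragment: ArFragment, andyRenderable: ModelRenderable) {
        arFragment.setOnTapArPlaneListener { hitResult, _, _ ->
            // Pin a node to the tapped real-world location...
            val anchorNode = AnchorNode(hitResult.createAnchor()).apply {
                setParent(arFragment.arSceneView.scene)
            }
            // ...and attach a draggable, rotatable Andy to it. Watching how well
            // this node stays put is exactly the drift test described above.
            TransformableNode(arFragment.transformationSystem).apply {
                renderable = andyRenderable
                setParent(anchorNode)
            }
        }
    }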

Part two of this app will feature planar detection. I am going to test how fast ARCore can find a planar surface. Google claims that planar detection time is down by a whopping 50 percent! Along with that, one Google engineer boasted about how ARCore was able to find the floor plane with next to no camera motion needed. If you are familiar with augmented reality, you know all about the sweeping gesture used to find planar surfaces. Rest assured, I will be testing his claims to find out for myself. The second part of this app will be a simple planar surface visualizer. It will scan for planar surfaces (vertical and horizontal) and display a texture on top of them to allow the user to see the detected planar surfaces.
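
The only ARCore-specific knob that part really needs is enabling both horizontal and vertical plane finding. A rough sketch, assuming a Sceneform ArFragment subclass (the class name is just for illustration):

    import com.google.ar.core.Config
    import com.google.ar.core.Session
    import com.google.ar.sceneform.ux.ArFragment

    // Hypothetical sketch: an ArFragment subclass that asks ARCore for both
    // horizontal and vertical planes. Sceneform then draws its default texture
    // on every detected plane, which is the visualizer described above.
    class PlaneScannerFragment : ArFragment() {
        override fun getSessionConfiguration(session: Session): Config =
            Config(session).apply {
                planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
            }
    }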

Finally, the last part of our app will be image tracking. This will be the part of the app I am most interested in! Here, we will try out ARCore’s augmented image tracking capabilities and see how they stack up against ARKit. In one of their developer sessions, Google claimed ARCore’s augmented images have been improved with 30% better tracking precision and 15% better detection recall. To test these claims, we will make a simple augmented image marker for our app. It will find an image marker and render a video on top of it. From there, I will move the image marker all around to see how well the video stays attached to its marker. Additionally, I will be trying out the new tracking modes for augmented images.
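
For reference, those new modes surface in the ARCore 1.9 API as AugmentedImage.getTrackingMethod(). A rough sketch of polling it each frame, again assuming a Sceneform ArFragment (the helper name and log tag are made up):

    import android.util.Log
    import com.google.ar.core.AugmentedImage
    import com.google.ar.sceneform.ux.ArFragment

    // Hypothetical sketch: log how each augmented image is tracked (ARCore 1.9+).
    fun watchTrackingMethod(arFragment: ArFragment) {
        arFragment.arSceneView.scene.addOnUpdateListener {
            val frame = arFragment.arSceneView.arFrame ?: return@addOnUpdateListener
            for (image in frame.getUpdatedTrackables(AugmentedImage::class.java)) {
                when (image.trackingMethod) {
                    AugmentedImage.TrackingMethod.FULL_TRACKING ->
                        Log.d("ARImages", "${image.name}: tracked in view")
                    AugmentedImage.TrackingMethod.LAST_KNOWN_POSE ->
                        Log.d("ARImages", "${image.name}: out of view, holding last pose")
                    else ->
                        Log.d("ARImages", "${image.name}: not tracked")
                }
            }
        }
    }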

Now that we have planned out the blueprint for the app, let’s get started and make some sweet AR content!



Getting Started

NOTE: I just want to quickly mention that I will be developing using the ARCore Android SDK and not the ARCore Unity SDK. If you get stuck during setup, you can refer to the “Getting started with Sceneform and ARCore” section of the Sceneform documentation.

Let’s open up Android Studio and create a new project! I am going to select “Add No Activity” because I will create my own empty activity later.


Next, we have to name our project. Just go ahead and name it whatever you want; this is not really that important of a step. After that, you can select your programming language (Java or Kotlin) and the minimum API level. I am going to be using Kotlin for my programming language and API 27: Android 8.1 (Oreo) for my minimum API level. I am also going to check the “Use androidx.* artifacts” box. It’s important to note that ARCore supports a minimum API level of API 24: Android 7.0 (Nougat).

Next up, we will make sure we have the Sceneform Tools plugin installed in Android Studio. For macOS, go to the top bar and click: Android Studio > Preferences > Plugins. For Windows, go to: File > Settings > Plugins. Next, we want to click Marketplace, then search for and install the Google Sceneform Tools (Beta) plugin. This will allow us to import 3D models into Android Studio.


From here, we will have to import all the required libraries and APIs that ARCore and Sceneform require. Let’s open up our project-level build.gradle file. We need to make sure that our project includes Google’s Maven repository (usually it does by default). It can be found under allprojects > repositories > google():
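
It looks something like this (your project template may generate slightly different repositories):

    // Project-level build.gradle: the template normally generates this block,
    // including Google's Maven repository, for you.
    allprojects {
        repositories {
            google()
            jcenter()
        }
    }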


Next, I am just going to add some extension properties to make it easier to read and update the version numbers for our imports. Feel free to copy my full project-level build.gradle file below:
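
Here is a sketch of the full file. The version numbers are representative of a mid-2019 setup rather than exact, so adjust them to whatever your tooling uses:

    // Project-level build.gradle (sketch; version numbers are representative,
    // not exact; adjust to your own setup).
    buildscript {
        ext {
            kotlin_version    = '1.3.31'
            arcore_version    = '1.9.0'
            sceneform_version = '1.9.0'
        }
        repositories {
            google()
            jcenter()
        }
        dependencies {
            classpath 'com.android.tools.build:gradle:3.4.1'
            classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
        }
    }

    allprojects {
        repositories {
            google()
            jcenter()
        }
    }

    task clean(type: Delete) {
        delete rootProject.buildDir
    }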


That should be all we have to do with the project-level build.gradle file. Now we need to open the app-level build.gradle file and make some changes. First off, we need to import our ARCore and Sceneform dependencies:
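
Assuming the version properties from the project-level sketch above, the dependencies block gains these two lines:

    // App-level build.gradle: ARCore itself plus Sceneform's UX package, which
    // provides ArFragment and the plane and model helpers used later.
    dependencies {
        implementation "com.google.ar:core:$arcore_version"
        implementation "com.google.ar.sceneform.ux:sceneform-ux:$sceneform_version"
    }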


If you set your target SDK version to anything above API 26, we need to add some compile options to let the Sceneform libraries use language constructs from Java 8. To do this, let’s add the following right below buildTypes:
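
This is the same block the Sceneform documentation recommends:

    // Inside the android { } block, right below buildTypes:
    // Sceneform libraries use language constructs from Java 8.
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }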


It’s also important to make sure we have our minSdkVersion set to 24 or higher. Now, if you copied my project-level build.gradle file, make sure you set the version numbers from the version properties I created. Here is how my app-level build.gradle file looks fully completed:
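
Here is a sketch of the completed file; the application ID is a placeholder, and the AndroidX and Kotlin lines are the stock ones the project wizard generates:

    // App-level build.gradle (sketch; applicationId is a placeholder).
    apply plugin: 'com.android.application'
    apply plugin: 'kotlin-android'

    android {
        compileSdkVersion 28
        defaultConfig {
            applicationId "com.example.mobilediary"   // placeholder package name
            minSdkVersion 27    // ARCore itself only needs 24
            targetSdkVersion 28
            versionCode 1
            versionName "1.0"
        }
        buildTypes {
            release {
                minifyEnabled false
                proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
            }
        }
        // Sceneform needs Java 8 language features
        compileOptions {
            sourceCompatibility JavaVersion.VERSION_1_8
            targetCompatibility JavaVersion.VERSION_1_8
        }
    }

    dependencies {
        implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk8:$kotlin_version"
        implementation 'androidx.appcompat:appcompat:1.0.2'
        implementation "com.google.ar:core:$arcore_version"
        implementation "com.google.ar.sceneform.ux:sceneform-ux:$sceneform_version"
    }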


We should be finished with all of our Gradle files now. Android Studio should be asking you to sync your Gradle files; go ahead and do that now. After it finishes syncing, open up our AndroidManifest.xml file. We need to add permissions to use the camera and AR functionality.
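
These entries follow the ARCore documentation for AR-required apps; the rest of the manifest boilerplate is elided:

    <!-- In AndroidManifest.xml, above the <application> element: -->
    <!-- ARCore needs the camera -->
    <uses-permission android:name="android.permission.CAMERA" />
    <!-- Restricts Play Store visibility to ARCore-capable devices -->
    <uses-feature android:name="android.hardware.camera.ar" android:required="true" />

    <!-- Inside <application>: tells Google Play to install ARCore with the app -->
    <meta-data android:name="com.google.ar.core" android:value="required" />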


Now that we have the required permissions and have told Google Play that this is an AR application, we should be ready to start programming!


This will conclude the first part of this series. I will update this article with a link to part 2 when it’s available.

Thanks for reading!

NOTE: I would like to point out that I am requiring ARCore for this app. This is to make the article as simple as possible. If you would like to see a tutorial on handling optional AR functionality, let me know and I can write one!
