Place Objects in Augmented Reality via ARKit - Digital Leaves

Place Objects in Augmented Reality via ARKit

Hello there! In this article, I will explain how to place objects in augmented reality via ARKit.

If you haven’t already, I recommend having a look at the initial article, in which I give an in-depth introduction to ARKit and the possibilities of Augmented Reality in iOS.

Furthermore, if you are not proficient with detecting planes in augmented reality, I suggest you first read the article on detecting planes in ARKit. This post builds directly on it, so it’s important that you understand the concepts behind Augmented Reality and plane detection.

Are you still with me? Good! Let’s start then.

Oh, and as always, you can download the source code from my GitHub repository.

Where Were We?


In the previous article, we managed to detect and update the horizontal planes in the scene as we moved around. To make them easy to spot, we added a semi-transparent node highlighting the position of the plane.

Next, we will allow the user to place coffee mugs with delicious coffee on any plane detected in the scene. We will call this the delici-o-coffee-AR-machine.

In order to do that, first we will get a DAE three-dimensional representation of the coffee mug.

Then, we will just adjust our view controller to place these mugs when the user taps on a plane.

The user should only be able to place objects when tracking is ready and we have detected some planes. Thus, the first thing we’ll do is add a variable to track the status of the ARKit engine.

Caffeine Status

We will define a new variable called currentCaffeineStatus, of type ARCoffeeSessionState. When the app starts, its state will be “initialized”. Then, as soon as we detect at least one plane, it will transition to “ready”.

As I mentioned in the previous articles, there will be situations when tracking is temporarily unavailable. For those situations, we will define a “temporarilyUnavailable” state. Last, if the worst happens, we will set a “failed” state to indicate to the user that something went wrong.

This will be our ARCoffeeSessionState definition:
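The enumeration might look like this (a minimal sketch; the exact status messages are my assumptions, following the four states described above):

```swift
// Sketch of the session state enumeration described in the article.
// The descriptions shown to the user are placeholder assumptions.
enum ARCoffeeSessionState: String, CustomStringConvertible {
    case initialized
    case ready
    case temporarilyUnavailable
    case failed

    var description: String {
        switch self {
        case .initialized:
            return "👀 Move around to detect a plane for your coffee."
        case .ready:
            return "☕️ Tap on a plane to place a coffee mug."
        case .temporarilyUnavailable:
            return "⏳ Tracking temporarily unavailable, please wait."
        case .failed:
            return "⚠️ Something went wrong. Please restart the session."
        }
    }
}
```

Conforming to CustomStringConvertible gives us a ready-made description for each case, which we will use next to update the UI.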

As Apple states in its documentation, we should provide visual feedback to the user on the status of the ARKit session. As a result, in our ViewController, when defining our currentCaffeineStatus variable, we will use the didSet trigger to update a label with the description we have defined for every case of our enumeration.


If the worst happens (status transitions to failed), we will clear all nodes from the augmented reality session.
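Put together, the property observer could be sketched like this (the outlet names statusLabel and sceneView, and the cleanup helper, are my assumptions):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {
    @IBOutlet weak var statusLabel: UILabel!
    @IBOutlet weak var sceneView: ARSCNView!

    var currentCaffeineStatus: ARCoffeeSessionState = .initialized {
        didSet {
            // Give the user visual feedback on the session status,
            // as Apple recommends in its ARKit documentation.
            statusLabel.text = currentCaffeineStatus.description
            if currentCaffeineStatus == .failed {
                cleanupARSession()
            }
        }
    }

    // Remove every node from the AR session if tracking failed.
    func cleanupARSession() {
        sceneView.scene.rootNode.enumerateChildNodes { node, _ in
            node.removeFromParentNode()
        }
    }
}
```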

Let’s Place Objects in Augmented Reality!

Next, let’s take care of placing the coffee mug when we have some planes ready. First thing we need to consider is how to add the mug nodes.

Defining the Mug Node

I have added a .dae file to the project containing a coffee cup. I modified its size, colors, and position to make sure it looks right when placed in the scene.

Due to the coordinate axis differences between SceneKit and ARKit, you will probably need to make some adjustments to your 3D models, namely changes in size, orientation, and distance to the center.

Once we have our model ready, we will create a SceneKit node in our ViewController, and clone it whenever necessary.

Note that we load the whole scene containing the mug, but we only need to extract the exact node, identified by its name. The name can be modified in the inspector when editing the .dae file. Additionally, if the node contains sub-nodes, as it does here (the plate and the spoon), we set the recursively flag to true.
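Loading the node could be sketched as follows (the scene file path "coffee.dae" and the node name "coffeeMug" are assumptions for illustration):

```swift
import SceneKit

// Template node for the coffee mug, loaded once and cloned as needed.
// The file and node names here are illustrative.
let coffeeMugNode: SCNNode? = {
    guard let scene = SCNScene(named: "coffee.dae") else { return nil }
    // We only need the mug node out of the whole scene. Searching
    // recursively also brings along its sub-nodes (plate and spoon).
    return scene.rootNode.childNode(withName: "coffeeMug",
                                    recursively: true)
}()
```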

Detecting Touches

In order to detect touches, we will override the touchesBegan method (inherited from UIResponder) in our view controller. Then, once we extract the touch point, we will use a hit test to determine whether a plane was touched. When this happens, we will store that plane in a selectedPlane variable and add a mug node on top of it.

First, we need to make sure that we have a valid touch, that we are ready to add objects to the scene, and that we have at least one valid plane to do so (currentCaffeineStatus == .ready). Then, we calculate the touch point in our scene view and invoke virtualPlaneProperlySet to check whether that point corresponds to one of our planes. If it does, we add the coffee mug by calling the method addCoffeeToPlane.
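The steps above could be sketched like this inside the view controller (the exact parameter labels of the two helper methods are assumptions):

```swift
// Inside the view controller: handle taps on detected planes.
override func touchesBegan(_ touches: Set<UITouch>,
                           with event: UIEvent?) {
    // 1. Valid touch, and session ready with at least one plane.
    guard let touch = touches.first,
          currentCaffeineStatus == .ready else { return }

    // 2. Translate the touch into scene view coordinates.
    let touchPoint = touch.location(in: sceneView)

    // 3. Did the touch land on one of our detected planes?
    if let plane = virtualPlaneProperlySet(touchPoint: touchPoint) {
        // 4. If so, place a coffee mug there.
        addCoffeeToPlane(plane: plane, atPoint: touchPoint)
    }
}
```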

Am I in Plane?

The virtualPlaneProperlySet method will perform a hitTest on the sceneView, using the .existingPlaneUsingExtent result type to make sure that the touch lies within the extent of one of our detected planes.

Then, if we have a valid hit point, and we are able to extract an anchor that identifies one of our VirtualPlane instances, we set our selectedPlane and return it.
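A possible implementation, assuming a planes dictionary that maps anchor identifiers to the VirtualPlane instances from the previous article:

```swift
// Returns the VirtualPlane under the touch point, if any.
// `planes` is assumed to map ARAnchor identifiers to VirtualPlane.
func virtualPlaneProperlySet(touchPoint: CGPoint) -> VirtualPlane? {
    let hits = sceneView.hitTest(touchPoint,
                                 types: .existingPlaneUsingExtent)
    if let firstHit = hits.first,
       let anchor = firstHit.anchor as? ARPlaneAnchor,
       let plane = planes[anchor.identifier] {
        // Remember the plane the user selected and return it.
        selectedPlane = plane
        return plane
    }
    return nil
}
```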

Adding the Mug Nodes

Finally, we have a valid location point and a valid plane, basically all we need to place objects in augmented reality 😎.

Placing the object is surprisingly easy. We just need to clone our mug node using the clone method of SCNNode. Next, we set its position to an SCNVector3 built from the hit point’s world coordinates, and add it to the root node of the scene.
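A sketch of that method, reusing the names assumed earlier (coffeeMugNode as the template node):

```swift
// Clone the mug template and place it at the touched point on the plane.
func addCoffeeToPlane(plane: VirtualPlane, atPoint point: CGPoint) {
    let hits = sceneView.hitTest(point, types: .existingPlaneUsingExtent)
    guard let hit = hits.first,
          let mugTemplate = coffeeMugNode else { return }

    let mug = mugTemplate.clone()
    // The last column of the world transform holds the hit position.
    let t = hit.worldTransform
    mug.position = SCNVector3Make(t.columns.3.x,
                                  t.columns.3.y,
                                  t.columns.3.z)
    sceneView.scene.rootNode.addChildNode(mug)
}
```

Cloning is cheap because the clone shares geometry with the template node, so we can place as many mugs as we like.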

Voila!

Here’s a video of a coffee spree in action.

Conclusion

In this article, I illustrated how to place objects in augmented reality using ARKit. Of course, this example can be improved a lot to make it more interactive and enjoyable. Thus, I will be writing about how to move the virtual objects and make them interactive in future posts.

Do you have any comments? Requests? Questions? Don’t hesitate to let me know in the comments below.
