Augmented Reality with ARKit: Detecting Planes - Digital Leaves

Augmented Reality with ARKit: Detecting Planes

In our previous article about Augmented Reality with ARKit, we learned the basics of the new augmented reality platform from Apple. In this post, we are going to continue our exploration of its possibilities by learning how detecting planes works, and how to visualize them in our applications. Let’s start!

As always, you can get the full code from my GitHub repository.

Why Detecting Planes is So Important

The whole concept behind Augmented Reality is blending together the reality around us with virtual objects that exist only within our App. In order to be able to do that successfully, we need to be aware of the geometry of our surroundings.

In other words, we need to be able to identify the ceilings, walls, tables and other physical objects.

Then, we can start adding objects to the scene so that they will look real. Of course, size and geometry are just one part of the equation. There are many other important factors, such as the lighting of the scene, that need to be taken into account to place a realistic virtual object.

However, in this article, we are going to focus on detecting planes in our scene and being able to visualize them.

When you have finished reading this, you will be able to build an app like this one that automatically detects and updates the planes of the scene.

How Plane Detection in ARKit Works

Detecting planes from a scene in ARKit is possible thanks to feature point detection and extraction.

During the augmented reality session, ARKit continuously reads video frames from the device’s camera. Then, it tries to extract feature points to identify objects in the scene such as ceilings or furniture.

These feature points can be anything that helps identify objects: corners, structure lines, fabric characteristics, gradients, changes in color or form, edges of objects, etc…

Currently, ARKit can identify horizontal planes. This is done by means of a configuration of the ARKit scene (more about this later). Perhaps in the future, we will be able to identify vertical planes or even objects. However, for now, that’s all we get.

ARKit does an amazing job detecting planes. Nevertheless, some work is required in order to recognize and update these planes properly.

Getting Into the Scene

All the ARKit magic in your application happens in the Scene. As we saw in the previous post, depending on the engine you are using for displaying the virtual objects (SceneKit, SpriteKit, Metal…), you will use a different type of view.

In our case, as we are using SceneKit, we will use an ARSCNView. This view gets initialized with a configuration and a SceneKit scene instance.

Let’s see how to add plane detection to our scene, starting with the scene configuration.

ARKit vs. SceneKit Coordinate Systems

It is important to note that, when using SceneKit to display ARKit virtual objects, the two frameworks use different conventions. An ARKit plane anchor extends horizontally, in the X-Z plane, whereas a SceneKit SCNPlane is defined vertically, in the X-Y plane. Thus, when translating an anchor's dimensions into a SceneKit plane, we will use the Z-axis extent of the anchor as the height (Y dimension) of the SCNPlane, and rotate the node to lay it flat.

We’ll see a concrete example later when defining our planes in response to ARKit events.

Configuring the Scene for World Tracking

First, we will configure our scene to allow world tracking. We can do this in the viewWillAppear method.
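The full listing lives in the repository; a minimal sketch of that configuration could look like this (assuming, as in the rest of the post, that the ARSCNView outlet is named sceneView):

```swift
import ARKit

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a world tracking configuration and enable horizontal plane detection.
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal

    // Start (or resume) the AR session with this configuration.
    sceneView.session.run(configuration)
}
```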

Of course, our view controller will act as the delegate of the scene view, as we can see in viewDidLoad. Also, we will assign the SceneKit scene as the scene of our scene view:
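A sketch of that viewDidLoad, under the same sceneView naming assumption:

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // The view controller will receive the plane detection delegate calls.
    sceneView.delegate = self

    // An empty SceneKit scene acts as the container for our virtual content.
    sceneView.scene = SCNScene()
}
```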

We need to track three different situations in response to delegate calls regarding plane detection. First, the delegate will be called when new nodes are detected and added to the scene via the method renderer(SCNSceneRenderer, didAdd: SCNNode, for: ARAnchor).

Then, every time a node is updated due to the movement of the camera, objects of the world, or any other situation, the delegate will receive a call to renderer(SCNSceneRenderer, didUpdate: SCNNode, for: ARAnchor).

Finally, if a node is deleted from the scene, the delegate will receive a call to renderer(SCNSceneRenderer, didRemove: SCNNode, for: ARAnchor).

Here, ARAnchor represents an “anchor” element indicating the point and dimensions where the node has been detected or modified.

The Virtual Plane Object

We will use a class called VirtualPlane to represent the planes we add, update or remove from the scene in response to the delegate calls.

Our VirtualPlane object will store the anchor and a plane SceneKit node containing the visual representation of the plane.
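As a sketch, the class could be declared like this, subclassing SCNNode so it can be added directly to the scene graph (the exact property names are illustrative; check the repository for the actual ones):

```swift
import ARKit
import SceneKit

class VirtualPlane: SCNNode {
    // The ARKit anchor this plane was created from.
    var anchor: ARPlaneAnchor!
    // The SceneKit geometry that visually represents the plane.
    var planeGeometry: SCNPlane!
}
```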

Initializing a VirtualPlane from an Anchor

When the delegate gets a call to renderer(SCNSceneRenderer, didAdd: SCNNode, for: ARAnchor), we will create a new VirtualPlane with the following init method.
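The exact code is in the repository; a sketch of such an init method, consistent with the steps analyzed below, could look like this:

```swift
init(anchor: ARPlaneAnchor) {
    super.init()
    self.anchor = anchor

    // 1. Geometry: the anchor's extent is horizontal (X-Z), while SCNPlane
    //    is defined in X-Y, so the Z extent becomes the plane's height.
    self.planeGeometry = SCNPlane(width: CGFloat(anchor.extent.x),
                                  height: CGFloat(anchor.extent.z))

    // 2. Material: a semi-transparent surface so the plane is visible.
    self.planeGeometry.materials = [initializePlaneMaterial()]

    // 3. Node: rotate -90º around the X axis to lay the vertical SCNPlane flat,
    //    and center it on the anchor.
    let planeNode = SCNNode(geometry: planeGeometry)
    planeNode.position = SCNVector3Make(anchor.center.x, 0, anchor.center.z)
    planeNode.transform = SCNMatrix4MakeRotation(-Float.pi / 2.0, 1.0, 0.0, 0.0)

    // 4. Material dimensions, in a reusable method for later plane updates.
    updatePlaneMaterialDimensions()

    // 5. Attach the visual node to this VirtualPlane.
    self.addChildNode(planeNode)
}

required init?(coder: NSCoder) {
    fatalError("init(coder:) has not been implemented")
}
```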

Let’s analyze this code step by step. The important concept you must understand is the difference between the physical dimensions (geometry) of the plane, and its representation with a concrete material and visual area in a SCNNode.

Step By Step

First, we will define the geometry for our virtual plane. The extent property of the anchor gives us the dimensions of the detected plane. As we mentioned earlier, SCNPlane geometries are vertical when compared to ARKit planes, so we will use the Z coordinate of the extent as the height (Y dimension) of the SCNPlane.

To give the plane a physical appearance and make it visible, we need to specify a material for our plane node. We will do this with the method initializePlaneMaterial. This simple method just creates a semi-transparent gray surface.
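A minimal sketch of initializePlaneMaterial (the gray tone and alpha value are illustrative choices):

```swift
func initializePlaneMaterial() -> SCNMaterial {
    let material = SCNMaterial()
    // A semi-transparent gray surface, so detected planes are visible
    // without completely hiding the real surface underneath.
    material.diffuse.contents = UIColor.gray.withAlphaComponent(0.5)
    return material
}
```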

Then, we can define a SceneKit node to visually show this plane geometry we just defined. We will use a SCNNode, making sure to adjust the coordinate system to account for the differences between ARKit and SceneKit. Thankfully, we can do this easily with a -90º rotation around the X axis using SCNMatrix4MakeRotation.

Next, we need to make sure we adjust the material of the plane to have the right dimensions. If we do this on a separate method, updatePlaneMaterialDimensions, we will be able to reuse it when updating the planes later.

Similarly to the plane geometry, when scaling the material we must account for the different coordinate systems of ARKit and SceneKit.
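A sketch of updatePlaneMaterialDimensions, scaling the material's diffuse contents to the plane's current width and height:

```swift
func updatePlaneMaterialDimensions() {
    guard let material = self.planeGeometry.materials.first else { return }
    // Scale the material to the plane's current size. As with the geometry,
    // the width maps to X and the height to the (rotated) Z axis.
    let width = Float(self.planeGeometry.width)
    let height = Float(self.planeGeometry.height)
    material.diffuse.contentsTransform = SCNMatrix4MakeScale(width, height, 1.0)
}
```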

Finally, we add this SceneKit node to our VirtualPlane. At that moment, our plane will be ready to display in the ARKit session.

From Plane Detection to Adding our Virtual Plane

So, what do we do when renderer(SCNSceneRenderer, didAdd: SCNNode, for: ARAnchor) gets called?

We will keep a list of virtual planes.

Each time the didAdd method is called, we will add one plane to our list. Every ARAnchor has a UUID identifier.

Then, we can use this identifier to tell if a plane is new and should be added to the scene, or it’s one of our previously detected planes and we should update it.

We can use a dictionary [UUID: VirtualPlane] for that in our UIViewController.

Thus, this will be the implementation of the delegate call.
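A sketch of that implementation, with the planes dictionary living in the view controller:

```swift
// In the view controller: one VirtualPlane per detected anchor.
var planes = [UUID: VirtualPlane]()

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // We only care about plane anchors.
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    let plane = VirtualPlane(anchor: planeAnchor)
    planes[planeAnchor.identifier] = plane
    // Attach our visual plane to the node ARKit created for the anchor.
    node.addChildNode(plane)
}
```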

Updating our Planes

As you move around the scene, go forward or change the position of the camera, ARKit will update the planes. We should respond to the delegate call to update the visual representation of our planes accordingly.

First, we will add an update method to our VirtualPlane class making use of some of the methods we have already defined.
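A sketch of such an update method (the name updateWithNewAnchor is an assumption; check the repository for the actual one):

```swift
func updateWithNewAnchor(_ anchor: ARPlaneAnchor) {
    self.anchor = anchor
    // Resize the geometry to the anchor's new extent.
    self.planeGeometry.width = CGFloat(anchor.extent.x)
    self.planeGeometry.height = CGFloat(anchor.extent.z)
    // Re-center the visual node on the updated anchor center.
    if let planeNode = self.childNodes.first {
        planeNode.position = SCNVector3Make(anchor.center.x, 0, anchor.center.z)
    }
    // Rescale the material to match the new dimensions.
    updatePlaneMaterialDimensions()
}
```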

Next, we just implement the delegate method in our view controller.
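A sketch of that delegate method, looking up the plane by its anchor identifier and reusing the update method on VirtualPlane (here assumed to be called updateWithNewAnchor):

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor,
          let plane = planes[planeAnchor.identifier] else { return }
    plane.updateWithNewAnchor(planeAnchor)
}
```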

And that’s all we need to keep our planes updated as we move through the scene.

Similarly, we just need to delete the planes when ARKit tells us to.
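A sketch of the removal handler:

```swift
func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    // Forget the plane; ARKit removes the associated node for us.
    planes.removeValue(forKey: planeAnchor.identifier)
}
```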

Finally, run the application, move around the scene, and you should see the semi-transparent planes appearing as ARKit detects them.

About Lighting, Textures, and Surfaces when Detecting Planes

Even though ARKit does an excellent job recognizing planes and surfaces, its efficiency greatly depends on several factors:


Lighting

Obviously, lighting has an important influence on ARKit's ability to detect planes. Computer vision algorithms, in general, behave poorly under bad lighting conditions.

You can check that directly in the application. Try the same path with different lighting environments (all lights switched on, only ambient light, etc), and you will see how it affects plane detection.

Textures and Surfaces

ARKit works best when it has some good, rich textures and surfaces with enough points to extract features.

This implies that a white, flat wall without texture or differentiating points will not give you good planes. On the contrary, corners, rich textures and well-defined surfaces will help ARKit do a good job when detecting planes from the scene.

Debugging Feature Points

In order to get an idea of how well features are being recognized in different scenarios, we can activate two debugging options in our sceneView object: “Show feature points” and “Show world origin”.

The first one will ask ARKit to render the “feature points” as they are being identified. They will appear as yellow dots. The second one will show a triple X-Y-Z axis at the world origin.

Experiment a little with these debugging options enabled and check how feature points appear in different scenarios. Check corners, different fabrics, ceilings, and vertical and horizontal surfaces.

Enabling them is easy with just a line of code in viewDidLoad.
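For example, in viewDidLoad:

```swift
// Render detected feature points (yellow dots) and the world origin axes.
sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints,
                          ARSCNDebugOptions.showWorldOrigin]
```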


About Initialization Times, Going into Background and Detecting Planes

It’s important to take into account that ARKit takes some time to initialize. Obviously, during that time, no plane detection will be available, and your UI should clearly indicate that to the user.

Similarly, when the app goes into the background, ARKit stops tracking the world, and plane detection is disabled. As a result, when the app comes into the foreground again, there will also be a delay before ARKit starts recognizing the planes and objects around you.

If, during that time, the user moves or the camera position changes significantly, the previously detected planes will no longer be accurate. Thus, you might want to remove or adjust all the planes when this situation occurs.


Conclusion

In this second article of “Augmented Reality with ARKit”, we learned about plane detection and how to visualize the planes we extract from the scene in SceneKit.

This knowledge will be the foundation for putting virtual objects in the scene. We will address that in a future article.

I hope you have some fun moving around, detecting planes and playing with ARKit. Did you enjoy the post? Did you find it useful? Please don’t hesitate to leave a comment.

  • lei

    November 1, 2017 at 3:50 pm

    This is helpful for me to learn ARKit, thanks very much! :)

  • AntonD

    November 5, 2017 at 1:30 pm

    Thank you very much for your work!! Both tutorials in ARKit give me as a beginner, a good understand, how ARKit works “behind the scenes”. I’m so excited to read more!!
    Great Thanks!!

    • Ignacio Nieto Carvajal

      November 5, 2017 at 4:50 pm

      Thank you Andon, nice to know it helped. I plan on releasing the new episode soon, so stay tuned! 😎

  • Vinod

    November 14, 2017 at 9:58 am

    How to detect vertical plane?

    • Ignacio Nieto Carvajal

      November 14, 2017 at 10:08 am

      Hello Vinod. Right now, it’s not possible to detect vertical planes. Maybe in future versions of ARKit it will be possible.

      • Vinod

        November 14, 2017 at 10:43 am

        Can I detect wall using ARKit?Actually I saw app on app store which can change color of wall virtually with selected color.Thank you!

        • Ignacio Nieto Carvajal

          November 15, 2017 at 4:14 pm

          Currently, ARKit only allows horizontal plane detection. Probably, you can get to detect walls with some effort :). Good luck!
