This is a series of tutorials focused on Augmented Reality with ARKit. Alongside MLKit, this was the framework announcement that impressed me the most. I strongly believe that ARKit is going to be a game changer for the Augmented Reality ecosystem.

This initial tutorial will get you started with ARKit in iOS with examples and little to no theory. First, we will start with some basic concepts. Then, we will learn how to load different types of models, from static objects to animated figures, mainly using SceneKit. Also, I will show you how to import Blender models into ARKit. By the end of this tutorial, you will be able to load and place virtual objects in the real world.

As always, you can download the full source code for the sample project from my GitHub repository.

About ARKit

Before going any further, let’s review the main concepts behind ARKit in iOS. ARKit is a high-level augmented reality framework from Apple. It allows you to place digital or virtual objects in the physical world and interact with them. It has some exciting features, like world tracking and visual inertial odometry, and it works out of the box, without previous setup. It also recognizes the world around your device quickly and efficiently. Finally, plane detection, real-world hit testing, and light estimation make it a top-notch base for any serious augmented reality application.

Obviously, all your iOS device’s sensors are put to good use here. The rendering of virtual objects is done natively via three main frameworks: SpriteKit, SceneKit, and Metal. However, you can also integrate ARKit with powerful engines like Unreal.

In this tutorial, we’ll discuss rendering through SceneKit only. In my opinion, it’s better suited for AR than SpriteKit (being fully 3D) and easier than Metal, which makes it the perfect choice for indie developers.

ARKit Concepts

Let’s see a very short description of the main concepts you need to keep in mind.

Sessions And Tracking

ARKit is a session-based framework. This means that everything happens within a concrete session. The session relates the virtual objects to the real world by means of tracking. Built on top of AVFoundation and CoreMotion, tracking provides not only position and orientation (including rotation) but also real-world scaling, motion control, and distance measurement.

Not all devices support tracking. The framework provides a way to test whether tracking is available on the current device. Then, we can configure our session with the right ARSessionConfiguration:

if ARWorldTrackingSessionConfiguration.isSupported {
   configuration = ARWorldTrackingSessionConfiguration()
} else {
   configuration = ARSessionConfiguration()
}
Then, the session starts by calling the “run” method. You can pause it anytime with the “pause” method and resume it by calling “run” again, with the same or a different configuration.

In order to be aware of changes in the session, you become its delegate. The most interesting delegate methods are:

func session(_: ARSession, didUpdate: ARFrame)
func session(_: ARSession, didFailWithError: Error)

The session delivers an update for every frame, as an ARFrame. This entity contains the captured image, tracking information, and information about the scene.


Scenes And Nodes

Every session has a scene that renders the virtual objects in the real world, which is accessed by means of the iOS device’s sensors (like the camera, accelerometer, etc.) used as part of the tracking. Everything happens inside the scene.

Depending on the renderer you are using, you will be able to load different types of nodes into the scene. These nodes are your virtual objects. In SceneKit, those are SCNNodes. We will see how to load different types of nodes from 3D models and geometric figures.


The Camera

The camera is the point of view of the scene. All objects are rendered according to their position and orientation relative to the camera. Usually, this will be the camera of the device. When defining our objects in a scene and loading them, we will learn how to give them the right initial position, orientation, and size in reference to the camera.

Every camera contains a transform that includes its position, orientation, and rotation. It also has a tracking state and some information about the camera’s intrinsic characteristics (physical properties, focus…).
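To build some intuition about that transform: in ARKit you would read it from ARCamera.transform, a 4x4 column-major matrix (simd_float4x4) whose last column holds the translation. The sketch below mimics that layout in plain Swift so you can see where the position lives; the Transform4x4 type is my own illustration, not an ARKit API.

```swift
// A minimal sketch (plain Swift, no ARKit): a 4x4 column-major
// transform stores the translation in its last column.
struct Transform4x4 {
    // columns[i] is the i-th column (x, y, z, w), as in simd_float4x4.
    var columns: [[Float]]

    // The camera's position in world space is the translation column.
    var position: (x: Float, y: Float, z: Float) {
        (columns[3][0], columns[3][1], columns[3][2])
    }
}

// Identity rotation, camera moved 0.5 m up and 2 m back.
let cameraTransform = Transform4x4(columns: [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0.5, 2, 1]
])
let position = cameraTransform.position  // (0.0, 0.5, 2.0)
```

With a real session, the same idea applies to `frame.camera.transform`: the matrix's fourth column tells you where the camera is.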


Light Estimation

Our virtual objects need to look real in the real world. As a result, we need to adjust the lighting on the object according to the lighting of the real-world scene. In order to do that, ARKit uses light estimation, estimating the ambient intensity based on the captured image. This feature, enabled by default, can be controlled by means of the ARSessionConfiguration variable isLightEstimationEnabled.

configuration.isLightEstimationEnabled = true
// get the results of ambient light intensity.
let intensity = frame.lightEstimate?.ambientIntensity
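Note that the estimate is optional: a frame may not have one yet. A common pattern is to fall back to a neutral default of 1000 lumens, which ARKit documents as roughly standard lighting. Here is a minimal sketch in plain Swift; the lightIntensity helper is my own, not an ARKit API.

```swift
// Ambient intensity is expressed in lumens; 1000 is a neutral,
// "standard lighting" value. Fall back to it when no estimate exists.
func lightIntensity(fromEstimate estimate: Double?) -> Double {
    estimate ?? 1000.0
}

// With a fresh frame and no estimate yet:
let neutral = lightIntensity(fromEstimate: nil)   // 1000.0
// With a dim real-world scene:
let dim = lightIntensity(fromEstimate: 400.0)     // 400.0
```

You would then feed the resulting value into your scene's light nodes so the virtual lighting tracks the real environment.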

Loading Scenes And Nodes

As mentioned before, we will use SceneKit to render objects in this tutorial. Basically, a scene can contain different objects, alongside a camera and one or more sources of light. Thus, we can render a scene that already contains objects, render an empty scene and add objects to it, or combine both approaches. Both scenes and objects are nothing more than 3D models within a tridimensional coordinate space.

3D Models

Even though SceneKit works with scenes and nodes in the .scn format, we can use other, more common formats like .dae (Digital Asset Exchange). In this tutorial, I will also show you how to import Blender models.

Creating Our Project

In order to create an ARKit project, just open Xcode, choose New > Project, and select the “Augmented Reality App” template.

First, we’ll get rid of Apple’s boring sample spaceship node, ship.scn. Next, it’s time to start loading scenes and objects.

Loading A SceneKit Node

We’ll start with a simple .scn scene containing just a cup of coffee node. I already prepared one for you from a Blender model. You can find it here.

Once you drop it into your project and inside the art.scnassets folder, you will be able to open it and see a visual representation of the scene in 3D.

(Video: detecting planes and placing objects in augmented reality.)

You can navigate through the scene, select the objects, and change your perspective by using the mouse and gestures, like the video above depicts.

Take your time and familiarize yourself with the interface. Move some of the nodes.

Understanding The Camera And Geometry


By now you’ve had a glance at the geometric system of the scenes. When using SceneKit to render virtual objects in ARKit, you work in a tridimensional coordinate system, with positive and negative offsets, and rotation. Generally speaking, the X axis goes left to right, the Y axis goes up and down, and the Z axis represents the depth of the object in the scene; it’s especially relevant for the position of the camera.

Also, you can modify the scale of the objects inside the scene, effectively altering their size.

Coordinates in ARKit are expressed in meters. Thus, a cube of 0.2 x 0.2 x 0.2 will have 20 cm per side.
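Unit conversions and scaling are simple multiplications, but getting them wrong makes objects comically huge or invisible. Here is a quick sanity-check sketch in plain Swift; the helper names are mine, just for illustration.

```swift
// SceneKit/ARKit units are meters; convert to centimeters for intuition.
func centimeters(fromMeters meters: Double) -> Double {
    meters * 100
}

// To shrink a model to a target real-world size, divide target by original.
// E.g. a model authored 2.0 units tall, shown 10 cm tall in the real world:
func uniformScale(targetMeters: Double, modelUnits: Double) -> Double {
    targetMeters / modelUnits
}

let side = centimeters(fromMeters: 0.2)                        // 20.0
let scale = uniformScale(targetMeters: 0.1, modelUnits: 2.0)   // 0.05
```

The second helper matters in practice because downloaded 3D models are rarely authored at real-world scale, so you almost always need to apply a uniform scale factor to the node.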

The scene will usually have a camera. Remember that this camera will be the focal point for the initial position of the device and, thus, the user, in your virtual world. It’s important to place the initial static objects appropriately, both in size and in distance.

If the scene doesn’t have a camera, you can easily add one from the Object Library on the bottom right.


I recommend playing around with the mug example. Place the mug closer to the camera, or increase its size, and see the effects when you launch the application. This will help you get acquainted with ARKit’s physical positioning and tracking of the virtual world inside the real world.

Understanding Light And Surface

Initially, our mug has a plain white color and no specific lighting. In order to make it look more real, we need to add a light to the scene and some surface and color properties to our object.

The scene graph view is a good place to examine the nodes and objects present in the scene. If it’s collapsed, you can expand it by clicking on the icon at the bottom left of Xcode while inside the scene file. There, you can easily check if there’s a lamp or light source in the scene. If not, you can add one from the Object Library too.

A light source is a node and behaves like one. It has a position, a rotation, and even a size. Additionally, it has some unique properties, like the type of light (ambient, omni, directional, spot, etc.), intensity, and color.


In the case of directional lights, you will see a fourth arrow vector when selecting them. That’s the direction of the light.

Obviously, the presence, position, and properties of the light source will affect the objects rendered around it. However, each object also has a surface, made of a certain material. This material has its own way of reflecting light, as well as its own surface properties and color.

All these characteristics will certainly modify the visual appearance of the object. Here, you can see how I modified our plain white mug to have a more interesting, yellow-orangeish color using the Phong lighting model.


As always, it’s worth spending some time experimenting with the different surfaces, materials and lighting models to understand how they work. Depending on your application, scene and objects, you may use a different setup.

Adding Geometric Figures

Next, now that we’ve managed to run the application and see our French flying teapot (if any of you know this nerdy musical reference, you get 10 extra points), let’s learn how to add geometric figures to our scenes.

SceneKit has a rich set of ready-to-use geometric nodes, from spheres to cubes.

In the scene editor, it’s really easy to add and configure geometric nodes. All of them are available in the Object Library. Similarly to our mug, you need to take care of the proper placement, size and orientation.


In this example, I am using a cube, a pyramid, and a sphere. By taking care of the correct placement and making the light match the position of the windows, you get an effect similar to real sunlight on your virtual objects.


However, in a real application, you will probably be using the tracking capabilities of the device to make the light of the virtual objects match the luminosity of the environment.

Loading And Adding Nodes To A Scene

For simplicity’s sake, we’ve been loading scenes containing our objects, but what if we want to add nodes later, programmatically?

In order to do that, you just need to create the node that you want to add, whether it’s a geometric node or a tridimensional model, and add it to a node within the scene.

All scenes have a rootNode that you can use to add nodes to the scene (as opposed to adding a node as part of another, more complex node).

This example shows you how to add our cube programmatically in the scene. This is useful for real applications, where the contents of the scene appear dynamically.

let box = SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0.0)
let node = SCNNode(geometry: box)
// Position relative to the scene's origin, where the camera starts.
// Negative Z is in front of the camera, so this places the cube slightly
// ahead, to the right, and below eye level.
node.position = SCNVector3Make(0.2, -0.3, -0.5)

// Finally, add the node to the scene of the ARSCNView
// (sceneView in the Xcode template).
sceneView.scene.rootNode.addChildNode(node)

Of course, you can also add nodes from 3D models with this technique.

Loading dae (Digital Asset Exchange) Nodes

The .dae format (Digital Asset Exchange) is a standard when it comes to representing and rendering tridimensional models. Luckily for us, SceneKit allows you to load .dae files directly as scenes or nodes. Actually, .dae support is built into macOS. As a result, you can even view .dae files using Preview.

In order to load the scene, just specify the .dae file when creating the SCNScene.

let scene = SCNScene(named: "art.scnassets/myModel.dae")!

Textures And Animations

SceneKit nodes can also contain textures and animations.

Textures are used to give an object a more realistic look. They are basically a set of images that are rendered on the surface of the object.

Additionally, nodes can also contain animations, so the object will move and perform those animations indefinitely. Of course, we can control when to launch an animation or keep the figure static, but we will talk about that in a future tutorial.

Luckily for us indie developers, there’s a way of loading Blender files into SceneKit for our ARKit applications by performing a simple export operation. Let’s see how to export an animated, textured Blender model of a spider and create a scene for our ARKit application.

Loading An Animated Spider With Texture From Blender

Blender is an awesome open-source 3D modeling tool. There are lots of resources out there for it, including 3D models and objects ready to use for free. Thus, if your budget is limited, you should really have a look at it and immerse yourself in the Blender community.

As an example, let’s see how to import an animated Blender model of a spider into our augmented reality application.


Exporting The Blender Model to DAE

First, you need to download Blender, if you haven’t already. Then, open the .blend file of your choice. For this tutorial, I chose a very well-known model of a spider.

Next, you need to choose “Export” in the file menu. Select the DAE (Collada) format.

The image on the left shows the recommended options. If your object contains textures, you should check “Include UV Textures” and “Include Material Textures”. You can pretty much leave the rest of the options at their defaults.

The result will be not only a .dae file but also (in case your object had textures) a set of image files containing its textures.

It’s important to note that different formats and tools handle the coordinate axes differently. Blender and .blend files are Z-up. In contrast, all OpenGL-related engines, such as SceneKit and Metal, are Y-up with negative Z forward. This means that if you load files from Blender, the orientation of your nodes will be wrong. Every asset (.dae, .scn, etc.) you drag and drop into a SceneKit assets folder is automatically adjusted, but you will still see it incorrectly oriented in the scene editor. This is something to consider when you are designing your augmented reality apps.
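The adjustment itself amounts to a simple axis swap: a Z-up point (x, y, z) maps to a Y-up point (x, z, -y), since Blender's "up" axis becomes SceneKit's Y and Blender's "forward" axis becomes negative Z. A sketch in plain Swift, with a function name of my own choosing for illustration:

```swift
// Convert a point from Blender's Z-up convention to SceneKit's Y-up,
// negative-Z-forward convention: Blender's up (z) becomes y, and
// Blender's forward (y) becomes -z.
func yUp(fromZUp point: (x: Double, y: Double, z: Double))
    -> (x: Double, y: Double, z: Double) {
    (x: point.x, y: point.z, z: -point.y)
}

// A point 2 units "up" in Blender ends up 2 units up on SceneKit's Y axis.
let converted = yUp(fromZUp: (x: 0, y: 0, z: 2))  // (0.0, 2.0, 0.0)
```

You rarely need to write this yourself, since the asset import does it for you, but knowing the mapping helps when a model comes in lying on its back.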

Adding our DAE And Texture Files To The Project

Next, you need to drag and drop all these files inside your art.scnassets folder.

If everything goes well, you should be able to click on the .dae file and see the object in all of its glory.

Sometimes, the material is not properly recognized, and so the texture files are not properly assigned. If that occurs, you just need to go to the Material Inspector, and set the textures in their right places.

Take into account that an object might have multiple parts, including textures, materials, and surfaces, so textures can be distributed across different parts of the object.

This is the case for our little fella Boris, the Spider (another nerdy musical reference).

Also, depending on the original object, you might want to add a camera or a light source, or modify the ones that are already there.

The figure below shows the textures within the different body parts of our spider.

My recommendation, as always, is to experiment, play with the configuration options and tweak the representation of the objects and scenes until you are happy with the results.


Let’s place our spider right below our camera, so it lands right on our feet. Watch out! It wants to bite you! 😱



If you change the scene from mug.scn to spider.dae, you should see the spider when you point the device’s camera down.

Where To Go Next

This is the initial article in a series of tutorials focused on Augmented Reality with ARKit for iOS. Today, we learned the basic concepts of ARKit, alongside how to load scenes and objects, and put together an app to start interacting with Apple’s new framework. Also, we learned how to use Blender, the open-source tool, to export models into a format we can add to our projects.

In the next tutorial, we’ll explore adding more interactivity between our virtual objects and the real world, starting with plane detection, and hit testing.