Hello there! In this article, I will explain how to place objects in augmented reality with ARKit.

If you haven’t already, I recommend having a look at the initial article, in which I give an in-depth introduction to ARKit and the possibilities of Augmented Reality in iOS.

Furthermore, if you are not familiar with detecting planes in augmented reality, I suggest you first have a look at the article on detecting planes in ARKit. This post picks up where that one left off, so it’s important that you understand the concepts behind Augmented Reality and Plane Detection.

Are you still with me? Good! Let’s start then.

Oh, and as always, you can download the source code from my GitHub repository.

Where Were We?

Augmented Reality with ARKit in iOS. Detecting planes.

In the previous article, we managed to detect and update the horizontal planes in the scene as we moved around. To make them easy to spot, we added a semi-transparent node highlighting the position of the plane.

Next, we will allow the user to place coffee mugs with delicious coffee on any plane detected in the scene. We will call this the delici-o-coffee-AR-machine.

In order to do that, first we will get a DAE three-dimensional representation of the coffee mug.

Then, we will just adjust our view controller to place these mugs when the user clicks on a plane.

The user should only be able to place objects when tracking is ready and we have some planes. Thus, the first thing we’ll do is add a variable to track the status of the ARKit engine.

Caffeine Status

We will define a new variable called currentCaffeineStatus, of type ARCoffeeSessionState. When the app starts, its state will be “initialized”. Then, as soon as we detect at least one plane, it will transition to “ready”.

As I mentioned in the previous articles, there will be situations when tracking is temporarily unavailable. For those situations, we will define a “temporarilyUnavailable” state. Last, if the worst happens, we will set a “failed” state to indicate to the user that something went wrong.

This will be our ARCoffeeSessionState definition:

enum ARCoffeeSessionState: String, CustomStringConvertible {
   case initialized = "initialized"
   case ready = "ready"
   case temporarilyUnavailable = "temporarily unavailable"
   case failed = "failed"

   var description: String {
      switch self {
      case .initialized:
         return "👀 Look for a plane to place your coffee"
      case .ready:
         return "☕ Click any plane to place your coffee!"
      case .temporarilyUnavailable:
         return "😱 Adjusting caffeine levels. Please wait"
      case .failed:
         return "⛔ Caffeine crisis! Please restart App."
      }
   }
}

As Apple states in its documentation, we should provide visual feedback to the user on the status of the ARKit session. Thus, in our ViewController, when defining our currentCaffeineStatus variable, we will use the didSet property observer to update a label with the description we have defined for every case of our enumeration.

var currentCaffeineStatus = ARCoffeeSessionState.initialized {
   didSet {
      DispatchQueue.main.async { self.statusLabel.text = self.currentCaffeineStatus.description }
      if currentCaffeineStatus == .failed {
         cleanupARSession()
      }
   }
}
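
The remaining state transitions can be driven from the ARKit delegate callbacks. Here is a sketch of how this might look, assuming the ViewController adopts ARSCNViewDelegate (these methods come from the ARSessionObserver protocol it inherits; the exact transitions you want may differ):

```swift
// When the first plane anchor is detected, we are ready to place mugs.
// (If you followed the plane detection article, merge this into your
// existing renderer(_:didAdd:for:) implementation.)
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
   if anchor is ARPlaneAnchor, currentCaffeineStatus == .initialized {
      currentCaffeineStatus = .ready
   }
}

// Tracking was interrupted, e.g. the app went to the background.
func sessionWasInterrupted(_ session: ARSession) {
   currentCaffeineStatus = .temporarilyUnavailable
}

// Tracking resumed after an interruption.
func sessionInterruptionEnded(_ session: ARSession) {
   currentCaffeineStatus = .ready
}

// The session failed irrecoverably.
func session(_ session: ARSession, didFailWithError error: Error) {
   currentCaffeineStatus = .failed
}
```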

Place Objects In Augmented Reality Via ARKit

If the worst happens (status transitions to failed), we will clear all nodes from the augmented reality session.

func cleanupARSession() {
   sceneView.scene.rootNode.enumerateChildNodes { (node, stop) -> Void in
      node.removeFromParentNode()
   }
}

Let’s Place Objects in Augmented Reality!

Next, let’s take care of placing the coffee mug when we have some planes ready. The first thing we need to consider is how to add the mug nodes.

Defining the Mug Node

I have added a .dae file containing a coffee cup to the project. Actually, I modified its size, colors and position to make sure it looks OK when placed in the scene.

Due to the coordinate axis differences between SceneKit and ARKit, you will probably need to make some adjustments to your 3D models, namely changes in size, orientation and distance to the center.
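
These adjustments can also be applied in code right after loading the node. The values below are purely illustrative; the right ones depend entirely on how your model was exported:

```swift
import SceneKit

// Hypothetical adjustments for a loaded .dae node; tweak per model.
func adjust(_ node: SCNNode) {
   node.scale = SCNVector3(0.01, 0.01, 0.01)  // many DAE exports are in centimeters, ARKit uses meters
   node.eulerAngles.x = -Float.pi / 2         // rotate if the model's "up" axis differs from SceneKit's Y
   node.position = SCNVector3(0, 0, 0)        // re-center the model on its origin
}
```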

Once we have our model ready, we will create a SceneKit node in our ViewController, and clone it whenever necessary.

var mugNode: SCNNode!

override func viewDidLoad() {
   super.viewDidLoad()
   // ...
   initializeMugNode()
}

func initializeMugNode() {
   let mugScene = SCNScene(named: "mug.dae")!
   self.mugNode = mugScene.rootNode.childNode(withName: "Mug", recursively: true)!
}

Note how we load the whole scene where the mug is contained, but only extract the exact node we need, identified by its name. The name can be modified in the inspector when editing the .dae file. Additionally, if the node is nested among sub-nodes, like in this case (the plate and spoon), we set the recursively flag to true so the search descends into the node tree.

Detecting Touches

In order to detect touches, we will override the touchesBegan method that our view controller inherits from UIResponder. Then, once we extract the touch point, we will use a hit test to determine whether a plane was touched. When this happens, we will store that plane in a selectedPlane variable and add a mug node on top of it.

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
   guard let touch = touches.first else {
      print("Unable to identify touches on any plane. Ignoring interaction...")
      return
   }
   if currentCaffeineStatus != .ready {
      print("Unable to place objects when the planes are not ready...")
      return
   }
   let touchPoint = touch.location(in: sceneView)
   if let plane = virtualPlaneProperlySet(touchPoint: touchPoint) {
      print("Plane touched: \(plane)")
      addCoffeeToPlane(plane: plane, atPoint: touchPoint)
   }
}

First, we need to make sure that we have a valid touch, and that we are ready to add objects to the scene, with at least one valid plane detected (currentCaffeineStatus == .ready). Then, we calculate the touch point in our scene view and invoke virtualPlaneProperlySet to check whether that point corresponds to one of our planes. If it does, we add the coffee mug by calling addCoffeeToPlane.

Am I in Plane?

The virtualPlaneProperlySet method will perform a hitTest on the sceneView, using the .existingPlaneUsingExtent result type to make sure that we test whether the touch lies within the extent of one of our detected planes.

func virtualPlaneProperlySet(touchPoint: CGPoint) -> VirtualPlane? {
   let hits = sceneView.hitTest(touchPoint, types: .existingPlaneUsingExtent)
   if hits.count > 0, let firstHit = hits.first, let identifier = firstHit.anchor?.identifier, let plane = planes[identifier] {
      self.selectedPlane = plane
      return plane
   }
   return nil
}

Then, if we have a valid hit point, and we are able to extract an anchor that identifies one of our VirtualPlane instances, we set our selectedPlane and return it.

Adding the Mug Nodes

Finally, we have a valid location point and a valid plane, basically all we need to place objects in augmented reality 😎.

func addCoffeeToPlane(plane: VirtualPlane, atPoint point: CGPoint) {
   let hits = sceneView.hitTest(point, types: .existingPlaneUsingExtent)
   if hits.count > 0, let firstHit = hits.first {
      if let anotherMugYesPlease = mugNode?.clone() {
         anotherMugYesPlease.position = SCNVector3Make(firstHit.worldTransform.columns.3.x,
                                                       firstHit.worldTransform.columns.3.y,
                                                       firstHit.worldTransform.columns.3.z)
         sceneView.scene.rootNode.addChildNode(anotherMugYesPlease)
      }
   }
}

Placing the object is surprisingly easy. We just need to clone our mug node using the clone method of SCNNode. Next, we set its position to an SCNVector3 built from the hit point’s world coordinates (the translation column of worldTransform), and add it to the root node of the scene.
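
One caveat worth knowing: clone() shares the geometry and materials of the original node among all copies. If you ever want to tint or otherwise modify a single mug without affecting the others, give it its own copies. A sketch, assuming the mugNode from before (and that the geometry lives on the node itself rather than a sub-node):

```swift
import SceneKit

// clone() shares geometry/materials; copy them so changes to this
// mug's material do not affect every other mug in the scene.
let anotherMug = mugNode.clone()
anotherMug.geometry = mugNode.geometry?.copy() as? SCNGeometry
anotherMug.geometry?.firstMaterial = mugNode.geometry?.firstMaterial?.copy() as? SCNMaterial
```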


Here’s a video of a coffee spree in action.

Augmented Reality with ARKit in iOS. Detecting planes and placing objects in augmented reality.


In this article, I illustrated how to place objects in augmented reality using ARKit. Of course, this example can be improved a lot to make it more interactive and enjoyable. In future posts, I will write about how to move the virtual objects and make them interactive.

Do you have any comments? Requests? Questions? Don’t hesitate to let me know in the comments below.