How to Build a Simple AR App with Unity

What Is Augmented Reality (AR)?

Augmented Reality (AR) is a technology that enhances the physical world by overlaying digital information—such as 3D models, sounds, images, or text—onto the user’s real-time view. Unlike Virtual Reality (VR), which creates an entirely immersive digital environment, AR blends virtual elements with the real world, making it interactive and contextually relevant.

This is typically experienced through smartphones, tablets, or smart glasses that use the device’s camera and sensors to detect the environment and position digital elements accurately within it.

In short: AR makes the real world more informative, entertaining, or functional by adding layers of virtual content.

How Are Brands Using Augmented Reality?

Brands across industries are leveraging AR technology to enhance customer experience, boost engagement, and drive sales. Here are some of the most common ways AR is used in business:

  • Virtual Try-On: Fashion and beauty brands use AR to let customers preview products—like clothes, makeup, or glasses—on themselves using their phones.

  • AR Product Visualization: Furniture and home décor companies enable users to place 3D models of products in their real spaces before buying.

  • Interactive Advertising: AR-powered ads and packaging offer immersive experiences, turning passive viewers into active participants.

  • Gamification & Campaigns: Companies launch AR games or challenges tied to product promotions or events, increasing virality and user interaction.

  • Educational Demos: Tech and healthcare brands use AR to explain complex products or processes in an engaging, visual format.

Bottom line: AR is not just a novelty—it’s a strategic tool that helps brands stand out, tell better stories, and create memorable interactions.

How to Build a Simple AR App with Unity (Step-by-Step Guide)

Step 1: Set Up Your Development Environment

1. Unity (Recommended Version: 2021.3 LTS or higher)

Unity is one of the most popular game engines for building real-time 3D applications, and it supports AR development through plugins and packages like AR Foundation.

  • Download Unity Hub: unity.com/download

  • Install Unity Editor with the Android Build Support and/or iOS Build Support modules.

Tip: Always use a long-term support (LTS) version for stability.


2. AR Foundation (Unity Package)

AR Foundation is Unity’s cross-platform framework that allows you to build once and deploy on both ARKit (iOS) and ARCore (Android).

To install:

  • Go to Window → Package Manager

  • Search for and install:

    • AR Foundation

    • ARKit XR Plugin (for iOS)

    • ARCore XR Plugin (for Android)

These packages provide device tracking, surface detection, and camera integration.
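
If you want to fail gracefully on devices without AR support, AR Foundation also lets you query availability before the session starts. A minimal sketch (assuming the AR Session component is assigned in the Inspector and left disabled so this script can enable it):

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: check whether the device supports AR before enabling the session.
public class ARAvailabilityCheck : MonoBehaviour
{
    public ARSession session; // assign the AR Session component in the Inspector

    IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device.");
        }
        else
        {
            session.enabled = true; // safe to start the AR experience
        }
    }
}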


3. Mobile Device (for Testing)

You’ll need a physical device to test AR capabilities:

  • For iOS, you need an iPhone or iPad with ARKit support (iOS 11+).

  • For Android, a device with ARCore support is required (check the ARCore supported devices list).


4. 3D Model or Asset

AR apps often display 3D content — such as furniture, avatars, or branded objects — in the real world.

  • Use .FBX, .GLB, or .OBJ files.

  • You can create them in Blender or Cinema 4D, or download them from libraries like Sketchfab or TurboSquid.

Keep your models optimized for mobile performance (low poly, compressed textures).


5. (Optional) IDEs & Companion Tools

Depending on your target platform:

  • For iOS: Install Xcode (from the Mac App Store) to compile and deploy builds to iPhone/iPad.

  • For Android: Install Android Studio for the Android SDK, USB debugging, and emulator testing; Unity builds the APK itself.

Step 2: Create and Configure Your First AR Scene

Now that your development environment is ready, it’s time to build your first functional AR scene in Unity. This is where the magic starts — where the digital meets the real.


1. Start a New Scene
  • Open Unity and create a new 3D project.

  • Save your scene as ARScene.unity.

  • In the Build Settings, switch to your target platform (iOS or Android) and click Switch Platform.

Tip: Keep your hierarchy clean by grouping your GameObjects under named empty parent objects (e.g., Managers, Models, UI).


2. Import AR Foundation Packages
  • Open Window → Package Manager.

  • Install the following:

    • AR Foundation

    • ARKit XR Plugin (if targeting iOS)

    • ARCore XR Plugin (if targeting Android)

These packages allow Unity to communicate with AR capabilities on your device.


3. Set Up AR Components

In your scene hierarchy:

  1. Create → XR → AR Session

    • Manages the lifecycle of the AR experience.

  2. Create → XR → AR Session Origin

    • Controls the AR camera and its relation to the real world.

The AR Session Origin contains the AR Camera and allows your digital objects to appear anchored in your space.
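
To see that lifecycle in practice, you can subscribe to the session’s state change event and watch it move from initializing to tracking in the console. A small optional sketch:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Optional sketch: log AR session lifecycle changes (initializing, tracking, etc.).
public class ARSessionLogger : MonoBehaviour
{
    void OnEnable()  { ARSession.stateChanged += OnStateChanged; }
    void OnDisable() { ARSession.stateChanged -= OnStateChanged; }

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        Debug.Log("AR session state: " + args.state);
    }
}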


4. Add AR Interaction Components

In the AR Session Origin, add these components:

  • AR Plane Manager:
    Detects flat surfaces in the environment (e.g., floors, tables).

  • AR Raycast Manager:
    Allows the app to detect where the user is touching the screen to place objects.

You’ll also need a prefab (3D model) to instantiate on touch. You can create a simple cube as a placeholder:

GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);

Or import a .glb or .fbx model into your project.
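
Before wiring up touch input, it helps to confirm that plane detection is actually working. This optional sketch logs each newly detected plane (assign the ARPlaneManager in the Inspector):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Optional sketch: log newly detected planes to verify surface detection.
public class PlaneDetectionLogger : MonoBehaviour
{
    public ARPlaneManager planeManager; // assign in the Inspector

    void OnEnable()  { planeManager.planesChanged += OnPlanesChanged; }
    void OnDisable() { planeManager.planesChanged -= OnPlanesChanged; }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log("Detected plane: " + plane.trackableId + " (" + plane.alignment + ")");
    }
}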


5. Create Touch-to-Place Script

Attach this script to an empty GameObject called ARPlacementManager:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using System.Collections.Generic;

public class ARTapToPlace : MonoBehaviour
{
    public GameObject objectToPlace;
    public ARRaycastManager raycastManager;

    private List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);

            // Only place on the first frame of a touch, so one tap spawns one object.
            if (touch.phase == TouchPhase.Began &&
                raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
            {
                // Spawn the prefab at the point where the ray hit the detected plane.
                Pose hitPose = hits[0].pose;
                Instantiate(objectToPlace, hitPose.position, hitPose.rotation);
            }
        }
    }
}

Assign the ARRaycastManager and a prefab to this script in the inspector.


6. Test the Scene
  • Build your app to your test device (iOS or Android).

  • Run it and point your camera at a surface — tap to place your object.

Congrats! You’ve just created your first interactive AR scene!

Step 3: Add Object Placement Interaction

In this step, you’ll make your AR app interactive — allowing users to tap on a detected surface and place a 3D object into the real world. This is one of the core features of most AR apps and a great way to learn how user input and AR raycasting work together.


Goal of This Step:

Enable touch-to-place functionality:
When the user taps a detected plane, your app will place a 3D model on that spot in real time.


Prerequisites

Before starting Step 3, make sure your Unity scene includes:

  • AR Session

  • AR Session Origin with an AR Camera

  • AR Plane Manager

  • AR Raycast Manager

  • A 3D model prefab (e.g., a cube, chair, or logo)


1. Create an Empty GameObject for Interaction Logic
  • Name it ARPlacementManager

  • Attach a new script component called ARTapToPlace.cs


2. Add This Script to Handle Touch Interaction

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARTapToPlace : MonoBehaviour
{
    public GameObject objectToPlace;
    public ARRaycastManager raycastManager;

    private List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);

            // Only place on the first frame of a touch, so one tap spawns one object.
            if (touch.phase == TouchPhase.Began &&
                raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
            {
                // Spawn the prefab at the point where the ray hit the detected plane.
                Pose placementPose = hits[0].pose;
                Instantiate(objectToPlace, placementPose.position, placementPose.rotation);
            }
        }
    }
}

3. Assign Your Object and Raycast Manager in Inspector
  • Drag your 3D object prefab (e.g., CubePrefab) into the Object To Place field.

  • Drag the AR Session Origin (which holds the ARRaycastManager component) into the Raycast Manager field.


4. Build and Test
  • Deploy the app to your mobile device.

  • Move your camera to detect a surface (like a table or floor).

  • Tap the screen — your 3D object should appear exactly where you tapped.

Congrats! You’ve added real-time user interaction to your AR scene.


Pro Tip: Prevent Multiple Placements

If you want to limit placement to a single object, update your code like this:

private GameObject placedObject;

// Inside the block that runs when the raycast hits a plane:
if (placedObject == null)
{
    placedObject = Instantiate(objectToPlace, placementPose.position, placementPose.rotation);
}

Step 4: Add Visual Feedback and Improve User Experience (UX)

Up to now, users can tap to place an object in the AR scene — but without clear visual cues, the interaction can feel uncertain or unintuitive. In this step, we’ll add real-time placement indicators and simple UX improvements to make your AR app more user-friendly and polished.


Why Visual Feedback Matters in AR

Augmented Reality happens in an open, uncontrolled environment — users need visual signals that tell them:

  • The app is working

  • A surface is detected

  • The object is ready to be placed

  • Their touch action did something

Adding a placement indicator helps reduce confusion and makes the experience feel smoother and more responsive.


1. Create a Placement Indicator

You can use a simple flat 3D object like a circular plane or a ring-shaped sprite.

  • In Unity, Create → 3D Object → Plane
    Scale it down (e.g., 0.1, 0.01, 0.1)
    Apply a transparent material or glowing texture

  • Name it PlacementIndicator

You can also use PNG images or particle effects for a more modern look.
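
As optional polish, you can also make the indicator pulse gently so it stands out against the camera feed. A small sketch:

using UnityEngine;

// Optional sketch: gently pulse the indicator's scale. Attach to the PlacementIndicator object.
public class IndicatorPulse : MonoBehaviour
{
    public float pulseAmount = 0.05f; // fraction of the base scale
    public float pulseSpeed = 2f;     // approximate oscillations per second

    private Vector3 baseScale;

    void Awake()
    {
        baseScale = transform.localScale;
    }

    void Update()
    {
        float s = 1f + pulseAmount * Mathf.Sin(Time.time * pulseSpeed * 2f * Mathf.PI);
        transform.localScale = baseScale * s;
    }
}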


2. Add a Script to Update the Indicator in Real Time

Create a new script called ARPlacementIndicator.cs, attach it to a manager object in your scene, and assign the indicator and the ARRaycastManager in the Inspector:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARPlacementIndicator : MonoBehaviour
{
    public GameObject placementIndicator;
    public ARRaycastManager raycastManager;

    private Pose placementPose;
    private bool placementPoseIsValid = false;
    private List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        UpdatePlacementPose();
        UpdatePlacementIndicator();
    }

    private void UpdatePlacementPose()
    {
        // Raycast from the center of the screen against all detected planes.
        // The AR Camera is tagged MainCamera by default, so Camera.main is reliable here.
        var screenCenter = Camera.main.ViewportToScreenPoint(new Vector3(0.5f, 0.5f));
        raycastManager.Raycast(screenCenter, hits, TrackableType.Planes);

        placementPoseIsValid = hits.Count > 0;
        if (placementPoseIsValid)
        {
            placementPose = hits[0].pose;
        }
    }

    private void UpdatePlacementIndicator()
    {
        // Show the indicator at the hit pose while a surface is in view; hide it otherwise.
        if (placementPoseIsValid)
        {
            placementIndicator.SetActive(true);
            placementIndicator.transform.SetPositionAndRotation(placementPose.position, placementPose.rotation);
        }
        else
        {
            placementIndicator.SetActive(false);
        }
    }
}

3. Combine With Touch-Based Object Placement

Once your indicator is working, you can update your object placement script to instantiate the object at the location of the indicator, not wherever the screen is tapped.

This creates a more natural and accurate placement experience.
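
Here is one way that combined version might look, a minimal sketch that assumes PlacementIndicator is the same GameObject driven by the ARPlacementIndicator script above:

using UnityEngine;

// Minimal sketch: when the user taps and the indicator is visible,
// place the object at the indicator's current position and rotation.
public class PlaceAtIndicator : MonoBehaviour
{
    public GameObject objectToPlace;
    public GameObject placementIndicator; // the same object the indicator script moves around

    void Update()
    {
        if (Input.touchCount > 0 &&
            Input.GetTouch(0).phase == TouchPhase.Began &&
            placementIndicator.activeSelf)
        {
            Instantiate(objectToPlace,
                        placementIndicator.transform.position,
                        placementIndicator.transform.rotation);
        }
    }
}

Because the object always lands at the screen-center indicator, users can see exactly where it will appear before they tap.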

Step 5: Add Object Interaction — Move, Rotate, or Reset

Now that you’ve successfully placed a 3D object in AR, let’s make it interactive. Allowing users to manipulate the object (move, rotate, or remove it) creates a dynamic, game-like experience and significantly improves engagement.


1. Add Gesture-Based Controls

Use Unity’s built-in touch input to let users rotate or scale the placed object.

Here’s an example to add basic rotation on drag:

using UnityEngine;

public class RotateOnDrag : MonoBehaviour
{
    private Vector2 startTouchPosition;
    private float rotationSpeed = 0.2f;

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);

            if (touch.phase == TouchPhase.Began)
            {
                startTouchPosition = touch.position;
            }
            else if (touch.phase == TouchPhase.Moved)
            {
                // Rotate around the vertical axis as the finger drags horizontally.
                float deltaX = touch.deltaPosition.x;
                transform.Rotate(0, -deltaX * rotationSpeed, 0);
            }
        }
    }
}

Add this script to the placed object prefab.
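
Rotation covers the drag gesture; for the scaling mentioned above, a two-finger pinch works well. A minimal sketch, also attached to the placed object prefab:

using UnityEngine;

// Minimal sketch: pinch with two fingers to scale the placed object up or down.
public class ScaleOnPinch : MonoBehaviour
{
    public float minScale = 0.1f;
    public float maxScale = 3f;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0);
        Touch t1 = Input.GetTouch(1);

        // Compare the distance between the two touches this frame and last frame.
        Vector2 prev0 = t0.position - t0.deltaPosition;
        Vector2 prev1 = t1.position - t1.deltaPosition;
        float prevDistance = (prev0 - prev1).magnitude;
        float currDistance = (t0.position - t1.position).magnitude;
        if (Mathf.Approximately(prevDistance, 0f)) return;

        // Scale the object by the ratio of the two distances, within sensible limits.
        float factor = currDistance / prevDistance;
        float newScale = Mathf.Clamp(transform.localScale.x * factor, minScale, maxScale);
        transform.localScale = Vector3.one * newScale;
    }
}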


2. Add a Reset Button (UI)

You can also allow users to clear or replace the object with a simple UI button:

  • Add a Canvas and Button in Unity UI

  • On click, destroy the placed object and reset placement indicator state

public GameObject placedObject;

// Wire this method to the Button's On Click () event in the Inspector.
public void ResetObject()
{
    if (placedObject != null)
    {
        Destroy(placedObject);
    }
}

Final Result: What You’ve Built
By completing this guide, you’ve created a fully functional and interactive AR app that includes:

✔️ Real-time surface detection

✔️ Tap-to-place 3D objects

✔️ Visual placement indicator

✔️ Gesture-based object rotation

✔️ Reset and UX enhancements

This is a solid foundation for any commercial or experimental AR experience — from product demos to education, art installations to virtual try-ons.

You can now:

  • Add UI menus and scene navigation

  • Integrate image or object tracking

  • Export your app to TestFlight or Google Play

  • Connect it with a backend or WebAR platform

  • Add analytics to measure usage and engagement

If you’re building AR professionally — for your brand, startup, or portfolio — this prototype is the first step to real impact.

Conclusion

Building AR apps is no longer just for massive tech companies. With tools like Unity and AR Foundation, you can create spatially aware, interactive, and immersive apps that feel like the future.