A guide to
interaction
patterns for AR

Version 1.0

Augmented Reality (AR) offers audiences the ability to interact with the world in entirely new ways. Why is this important, and how is it achieved?

There are over 90 million AR-capable devices in the pockets of audiences around the world, and with Mixed Reality headsets already gaining traction, it’s this acceleration of accessibility that makes AR such a powerful area for exploration.

As with all new technology, understanding its affordances can unlock the potential for mainstream adoption. We’ve created this guide to consolidate our experimentation within this space. The patterns detailed here are embryonic and will continue to evolve through our R&D and client project work.

Manipulation

Interactions offering the ability to manipulate the digital world through physical movements of the device or headset. Pure and simple interactions spark curiosity and encourage audiences to explore more deeply, increasing engagement with the experience.

  • 11
    Manipulation

    Move around object

    Looking at a single object, exploring it from different angles.

  • 01
    Manipulation

    Object/marker manipulation

    (Dynamics) manipulate an object or a marker to affect the AR world.

  • 04
    Manipulation

    Manipulate using rotate

    The orientation of the device affects the AR world.

  • 12MR
    Manipulation

    Move/Panning within an area

Move around and change your point of view to uncover new things out of frame.

  • 14MR
    Manipulation

    Adjust distance to change state

Moving closer to something could reveal extra information; for example, moving closer to an AR house could give an x-ray view of its interior.

  • 06
    Manipulation

    Manipulate using tilt

    Tilt the device to affect the AR world.

  • 18MR
    Manipulation

    World space colliders

Invisible (or potentially visible) AR colliders are mapped to the real world; when the device intersects one, it can trigger an event.
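
Several of these manipulation patterns reduce to simple geometry on the device's position in space: "adjust distance to change state" (14MR) is a distance check, and "world space colliders" (18MR) is a containment test. A minimal sketch in plain Python, independent of any AR framework — the `BoxCollider` type, state names, and threshold values are all illustrative assumptions:

```python
from dataclasses import dataclass
import math

@dataclass
class BoxCollider:
    """Axis-aligned box mapped onto real-world space (illustrative type)."""
    center: tuple  # (x, y, z) in metres
    size: tuple    # (w, h, d) in metres

    def contains(self, point):
        """True when the device position lies inside the box (pattern 18MR)."""
        return all(abs(p - c) <= s / 2
                   for p, c, s in zip(point, self.center, self.size))

def state_for_distance(device_pos, target_pos, thresholds=(3.0, 1.0)):
    """Map device-to-object distance to a detail level (pattern 14MR).
    thresholds: (far, near) in metres -- illustrative values only."""
    d = math.dist(device_pos, target_pos)
    far, near = thresholds
    if d <= near:
        return "xray"      # close enough to see inside the object
    if d <= far:
        return "exterior"  # normal view
    return "hidden"        # too far away to render detail

# A collider placed around a doorway fires when the device walks through it.
door = BoxCollider(center=(0.0, 1.0, 2.0), size=(1.0, 2.0, 0.3))
```

In a real engine the containment test would run every frame against the tracked camera pose, with an edge trigger so the event fires once on entry rather than continuously.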

Perspective

These interactions are categorised by their specific ability to trigger multiple perspectives. Offering new ways of viewing our world, and the objects within it, extends our audience's perception of reality.

  • 08
    Perspective

    Asymmetric information per point of view

Looking at a single object or area from different angles displays different information.

  • 02
    Perspective

    Gaze

An invisible line shoots out from the camera into AR space; this ray can be used as an input.

  • 03MR
    Perspective

    Portals

    Trigger portals and step into 360 dioramas.

  • 07
    Perspective

    Asymmetric information per device

Looking at a single object or area from different connected devices displays different information on each.
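
The gaze pattern above is, at its core, a ray cast along the camera's forward direction. A minimal sketch of the underlying maths in plain Python, independent of any AR framework — the function name and hit radius are illustrative:

```python
import math

def gaze_hit(camera_pos, camera_forward, target_pos, radius):
    """Return True when the camera's forward ray passes within `radius`
    of the target -- the invisible line described by the Gaze pattern.
    All arguments are (x, y, z) tuples; camera_forward need not be unit length."""
    # Vector from camera to target.
    to_target = [t - c for t, c in zip(target_pos, camera_pos)]
    # Normalise the forward direction and project onto it.
    f_len = math.sqrt(sum(f * f for f in camera_forward))
    forward = [f / f_len for f in camera_forward]
    along = sum(v * f for v, f in zip(to_target, forward))
    if along < 0:          # target is behind the camera
        return False
    # Perpendicular distance from the ray to the target centre.
    closest = [c + f * along for c, f in zip(camera_pos, forward)]
    return math.dist(closest, target_pos) <= radius

# Looking straight down -z at an object two metres ahead:
# gaze_hit((0, 0, 0), (0, 0, -1), (0.05, 0, -2), radius=0.2) -> True
```

The same test, run per connected device, is one way to drive the "asymmetric information" patterns: each device resolves its own gaze target and renders its own layer of information.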

Input

Input interactions which offer additional methods for the audience to engage with the experience. Physical interaction with the world, through immersive technologies, opens up opportunities for creativity, decision-making, and other forms of self-expression.

  • 15
    Input

    Manipulate using gesture

    Gestures on the screen affect something already represented in AR, for example a box flipping in the direction of a swipe, or a character jumping.

  • 13
    Input

    Add/Create using gestures

Add things to the world using touch (e.g. draw on it, throw objects into it).

  • 05
    Input

    Audio as input

    The AR world reacts to real world audio input.

  • 16
    Input

    Area obstruction as input

The system recognises that part of an area is being obstructed, for example a finger covering a hole to stop a leak.

  • 09
    Input

    Combine trackers

The device can recognise more than one tracker at the same time; different combinations produce different results.

  • 17
    Input

    Facial recognition

By recognising that it is looking at a face, additional AR experiences become possible, for example a ‘who am I?’ type of game.
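
The "audio as input" pattern typically boils down to measuring the level of the live microphone buffer and reacting when it crosses a threshold. A minimal sketch in plain Python — function names, states, and the threshold value are illustrative assumptions, not any particular audio API:

```python
import math

def audio_level(samples):
    """Root-mean-square level of a buffer of microphone samples in [-1, 1]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def react_to_audio(samples, threshold=0.3):
    """Drive an AR reaction from real-world sound (illustrative threshold),
    e.g. a character flinching when the audience shouts."""
    return "react" if audio_level(samples) >= threshold else "idle"
```

RMS rather than peak amplitude keeps the trigger robust against single-sample clicks; a production version would also smooth the level over several buffers to avoid flicker around the threshold.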

Shared

Patterns which exploit AR’s innate capability to bring people together around a shared experience. These examples explore different variations of social play.

  • shared-10
    Shared

    Freestanding device as viewport

The forward-facing camera is initialised, allowing all players to interact with the physical space simultaneously.

  • shared-05
    Shared

    Synchronous tilt

    Players work together to tilt the device, manipulating AR elements.

  • shared-06
    Shared

    Synchronous rotate

    Players work together to rotate the device, manipulating AR elements.

  • shared-07
    Shared

    Multi-device network

    Multiple devices are linked together through a network, enabling synchronous play across devices.

  • shared-01
    Shared

    Pass and Play

Players take it in turns, passing the device amongst themselves. Game UI/UX would reinforce this.

  • shared-09
    Shared

    Games master

A single device is used by one player to direct the other players in the physical space.

  • shared-02
    Shared

    Synchronous tap

    A simple one touch interaction using a virtual controller where multiple players can use a single device simultaneously.

  • shared-08
    Shared

    Synchronous look

    Players work together to move the device around the world space, looking together.

  • shared-03
    Shared

    Synchronous drag

    Players work together to use drag gestures, manipulating AR elements.

  • shared-04
    Shared

    Synchronous flick

    Players work together to use flick gestures, manipulating AR elements.
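
The synchronous patterns (tilt, rotate, drag, flick) share one underlying mechanic: simultaneous inputs from several players are blended into a single control signal. A minimal sketch, assuming each player contributes an input vector per frame — the function name is illustrative:

```python
def combined_input(player_vectors):
    """Blend simultaneous inputs from several players into one control
    vector, e.g. for synchronous tilt or drag. Averaging means opposing
    inputs cancel out, so players must cooperate to move the AR element."""
    n = len(player_vectors)
    return tuple(sum(axis) / n for axis in zip(*player_vectors))
```

Averaging is only one design choice: summing instead would reward more players pulling together, while taking the minimum magnitude would require everyone to participate before anything moves.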