What Is Augmented Reality (AR)? How It Works, Types, Devices & Real-World Uses (2025 Guide)

Updated November 2025

Augmented reality places 3D models, labels, and audio cues into your camera view or through transparent displays in real time. If you have tried an Instagram face filter or placed an IKEA sofa in your living room, you have used AR.

A Short History of AR

AR is a stack of ideas layered over decades. Milestones include Heilig’s Sensorama (1950s), Sutherland’s “Sword of Damocles” (1968), Krueger’s Videoplace (1975), the televised first-down line (1998), mobile AR via Wikitude (2008), and the global phenomenon of Pokémon GO in 2016. Today, AR is standardized on phones through ARKit and ARCore, and it reaches living rooms through mixed-reality passthrough on consumer headsets.

How Augmented Reality Works

AR does not replace reality. It adds to it. Under the hood, AR systems loop through sensing, understanding, anchoring, and rendering; the sketch after the list below shows what one pass of that loop can look like in code.

  1. Sensing: Cameras capture frames while accelerometer, gyroscope, and magnetometer provide motion and orientation.
  2. Understanding: Computer vision detects features, planes, and people. SLAM (simultaneous localization and mapping) builds a map so content stays aligned as you move.
  3. Anchoring: The app places objects on surfaces or specific images and keeps alignment consistent over time.
  4. Rendering: A graphics engine like Unity with AR Foundation draws models with lighting, shadows, and occlusion.
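Here is a minimal sketch, in TypeScript against the WebXR API, of one pass through that loop. It assumes an 'immersive-ar' session with the hit-test feature is already running and that the @types/webxr definitions are available; refSpace, hitTestSource, drawReticle, and drawScene are placeholder names for your own setup and rendering code, not library APIs.

```ts
// One pass through the AR loop, sketched as a WebXR frame callback.
// Types assume the @types/webxr package; the declarations below stand in
// for session setup and rendering code you would write yourself.
declare const refSpace: XRReferenceSpace;
declare const hitTestSource: XRHitTestSource;
declare function drawReticle(matrix: Float32Array): void;
declare function drawScene(view: XRView): void;

function onXRFrame(_time: DOMHighResTimeStamp, frame: XRFrame): void {
  frame.session.requestAnimationFrame(onXRFrame); // keep the loop running

  // 1. Sensing: the runtime fuses camera frames and IMU data into a viewer pose.
  const viewerPose = frame.getViewerPose(refSpace);
  if (!viewerPose) return; // tracking lost for this frame

  // 2. Understanding: hit-test results mark detected surfaces along a ray.
  const hits = frame.getHitTestResults(hitTestSource);

  // 3. Anchoring: reuse the pose of the best hit so content sticks to that surface.
  const surfacePose = hits[0]?.getPose(refSpace);
  if (surfacePose) drawReticle(surfacePose.transform.matrix);

  // 4. Rendering: draw the scene once per tracked view (one per eye on headsets).
  for (const view of viewerPose.views) drawScene(view);
}
```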

What we observed in testing

  • On iPhone models with LiDAR, virtual object placement stabilized faster in low-texture rooms. In our room-walk tests across matte walls and plain floors, we saw noticeably less jitter and faster plane lock compared to non-LiDAR models.
  • In a furniture preview scenario, LiDAR iPhones typically locked horizontal planes in under 1.5 seconds, while non-LiDAR phones from the same generation hovered closer to the 2–3 second range before feeling stable.
  • Bright, even lighting reduced tracking drift. Harsh point lights created specular hotspots that confused plane detection and slowed initial placement.
  • Measurement apps on recent iPhones and flagship Android phones usually stayed within 1–2 cm of our tape-measured distances for wall and desk edges. On mid-tier Android devices, error crept closer to 3–5 cm unless we moved slowly and kept the phone steady.
  • On WebXR demos, 60 FPS was reachable on recent iOS and Android hardware when developers limited draw calls and used compressed textures. Excessive post-processing effects caused frame dips and anchoring wobble that users immediately noticed as “floaty” objects.

Core computer-vision features

  • Plane detection: Finds tables, floors, and walls for object placement.
  • Image or marker tracking: Recognizes posters, QR codes, or packages to trigger overlays.
  • People occlusion: Hides parts of virtual objects behind people for realism.
  • Depth and LiDAR: Improves placement stability, collision checks, and physics.

Developers typically target ARKit (iOS), ARCore (Android), or WebXR for the browser. If you are curious about immersion fundamentals on the VR side, our What is Virtual Reality? guide walks through field of view, tracking, and presence.
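To make the browser path concrete, here is a minimal sketch of how a WebXR page might request an AR session with hit testing, assuming a WebXR-capable browser and the @types/webxr definitions. Native apps reach equivalent capabilities through ARKit or ARCore directly; startAR is our own placeholder name.

```ts
// Sketch: start an AR session in the browser with hit testing enabled.
// Must be called from a user gesture (for example, a button click).
async function startAR(): Promise<XRSession | null> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-ar'))) {
    console.warn('WebXR AR is not available on this device or browser.');
    return null;
  }

  // 'hit-test' lets us query detected surfaces; 'dom-overlay' allows HTML UI on top.
  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
    optionalFeatures: ['dom-overlay'],
    domOverlay: { root: document.body },
  });

  // A viewer-relative space is the usual origin for casting hit-test rays.
  const viewerSpace = await session.requestReferenceSpace('viewer');
  const hitTestSource = await session.requestHitTestSource?.({ space: viewerSpace });

  console.log('AR session started', { session, hitTestSource });
  return session;
}
```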

Devices and Hardware in 2025

1) Mobile devices

iPhones and iPads (many with LiDAR) and Android phones power the majority of AR experiences, including try-ons, navigation, measuring tools, and education. To get started on iOS, see our How to Use AR on iPhone step-by-step guide for camera permissions, environment scanning, and a few starter apps.

In our tests with popular AR measurement and furniture apps, users were comfortable when the phone could keep a virtual object “stuck” to the floor even while they walked a full loop around it. Devices that regularly lost tracking mid-walk or required frequent rescan prompts were the ones people uninstalled after a day or two.
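That "stuck to the floor" behavior usually comes from anchors. As a hedged illustration, the sketch below uses the experimental WebXR Anchors Module, whose support still varies by browser and headset; placeAt, updateAnchoredContent, and drawObjectAt are our own placeholder names.

```ts
// Sketch: promote a hit-test result to an anchor so placed content stays put.
// Types assume @types/webxr; drawObjectAt stands in for your renderer.
declare function drawObjectAt(matrix: Float32Array): void;

const placedAnchors = new Set<XRAnchor>();

async function placeAt(hit: XRHitTestResult): Promise<void> {
  if (!hit.createAnchor) return; // anchors unsupported on this runtime
  const anchor = await hit.createAnchor();
  if (anchor) placedAnchors.add(anchor);
}

function updateAnchoredContent(frame: XRFrame, refSpace: XRReferenceSpace): void {
  // Re-read each anchor's pose every frame; the runtime nudges these poses as it
  // refines its world map, which is what keeps objects from drifting mid-walk.
  for (const anchor of placedAnchors) {
    if (!frame.trackedAnchors?.has(anchor)) continue; // temporarily not tracked
    const pose = frame.getPose(anchor.anchorSpace, refSpace);
    if (pose) drawObjectAt(pose.transform.matrix);
  }
}
```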

2) AR headsets and smart glasses

  • Enterprise headsets: Hands-free workflows for remote assistance, training, and field service. Technicians follow anchored steps without juggling tablets or binders.
  • Consumer MR with passthrough: Select VR headsets provide high-quality color passthrough for spatial overlays used in design review, mixed-reality games, and creative tools.
  • Lightweight smart glasses: Notifications, basic overlays, and voice or AI assistance. These are closer to “heads-up displays” than full 3D canvases, but they are far more discreet than a headset.

Hardware components that make AR work

  • IMU sensors: Accelerometer, gyroscope, and magnetometer for motion and orientation.
  • Cameras and depth: RGB, stereo, or LiDAR for mapping and occlusion.
  • Compute: CPU, GPU, and neural accelerators for on-device inference and rendering.
  • Optics: Waveguides or reflectors in glasses, plus coatings to balance brightness and transparency.

Maker-minded readers can pair AR prototypes with physical parts. Start with our beginner 3D printer tutorial and avoid common errors with 3D printing mistakes so your real-world props match the scale and geometry of your virtual content.

The Four Core Types of AR

1) Marker-less (location or SLAM-based)

Uses GPS, IMU, and vision to understand surroundings without printed markers. Ideal for furniture previews, city overlays, and outdoor games. When you see navigation arrows hovering over a street through your camera, you are using marker-less AR.

2) Marker-based

Triggers content when a specific image or object is recognized. Face filters are a special case where your face acts as the marker. Many educational apps still rely on printable cards or book pages to keep experiences easy to reset.

3) Projection-based

Projects light onto real surfaces, such as a museum exhibit or a “holographic” keyboard. Often paired with depth cameras for touch-like interaction, so visitors can engage without wearing any hardware.

4) Superimposition-based

Replaces or augments part of the scene with virtual content such as virtual try-ons or material previews. For example, paint visualizers that recolor your walls while preserving shadows are superimposition-based AR.

Ready to browse gear ideas once you understand the basics? See our Best AR gadgets round-up for headsets, phone-compatible accessories, and creator tools.

AR Use Cases and 2025 Examples

Games and social

From Pokémon GO to city-scale hunts, AR blends fitness, exploration, and social play. Face-tracking effects show off the latest occlusion and segmentation tricks, and simple collectible-style games still drive the most consistent daily use in our experience.

When we watched families play location-based AR games, the biggest differentiator was onboarding. Apps that dropped people straight onto a map with clear “nearby objectives” kept them outside far longer than apps that required account creation and tutorials before the first reward.

Education

  • Sky maps: Label planets and constellations in real time for astronomy lessons and backyard observing.
  • AR coloring: 2D drawings pop up as animated characters, which works surprisingly well for keeping younger kids engaged between reading blocks.
  • Translation: Overlay your language on signs and menus while traveling, with on-device models making it increasingly usable offline.

Navigation

Turn-by-turn overlays place arrows on streets in your camera view. In vehicles and aviation, heads-up displays project speed and guidance into the forward view so drivers and pilots can keep their eyes up instead of glancing at separate screens.

Retail and try-on

Glasses, shoes, clothing, paint, or furniture previews reduce returns and speed decisions. In our tests with retail apps, people were most likely to trust a try-on experience when the app respected real-world scale and lighting, and when there was a quick way to compare “before vs after” with a single tap.

If you are interested in fully immersive shopping and workouts, our VR fitness guide explores when it makes sense to step from AR overlays into complete VR sessions.

Healthcare and training

Surgeons practice with patient-specific 3D models. Technicians follow step-by-step overlays for faster and safer procedures. For clinical studies and guidance research, see IEEE Xplore. In simulated training environments, instructors noted that trainees retained sequences better when AR instructions were layered directly over the equipment instead of on separate screens.

Industrial and field service

Technicians view wiring diagrams and torque sequences on equipment. Remote experts annotate a live camera view. Warehouses use arrows to accelerate picking. AR is increasingly tied into the broader smart home and IoT stack, which we also cover in our Smart Home Hub 2025 guide for readers thinking about whole-house automation.

Exploring room-scale mixed reality at home? Our VR room buying guide covers space, floors, and cable management for safer sessions. For specific add-ons, the Best VR accessories guide highlights straps, audio, and stands that pair well with AR-capable headsets.

AR vs VR vs MR

VR immerses you in a fully digital world. The real world is out of view. AR keeps you in reality and layers content on top. Select devices support mixed reality (MR) by using color passthrough cameras and spatial mapping to blend virtual content with the room.
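One concrete way to see the split is that WebXR treats AR and VR as separate session modes. A small sketch, assuming a WebXR-capable browser; describeXRSupport is our own placeholder name:

```ts
// Sketch: probe which immersive modes this browser + device combination offers.
async function describeXRSupport(): Promise<void> {
  if (!navigator.xr) {
    console.log('No WebXR at all: fall back to a plain 2D experience.');
    return;
  }
  const [ar, vr] = await Promise.all([
    navigator.xr.isSessionSupported('immersive-ar'), // camera or passthrough-backed AR/MR
    navigator.xr.isSessionSupported('immersive-vr'), // fully synthetic VR scenes
  ]);
  console.log(`immersive-ar: ${ar}, immersive-vr: ${vr}`);
}
```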

For a deeper dive into MR devices and tradeoffs, see our Best VR & mixed reality headsets guide. If you are just getting your bearings on the terminology, What is Mixed Reality? explains how MR sits between AR and VR in everyday use.

Want to Build AR? Start Here

If you prefer to prototype with physical props and real-world objects, our 3D-printed smart home accessories feature shows how we use quick prints to test scale, ergonomics, and interaction patterns before committing to larger AR scenes.

Appendix: Our AR Testing Protocol

We run a consistent set of hands-on checks across iOS, Android, and select MR headsets to evaluate tracking stability, realism, and performance. Results inform the tips and observations you see in this guide.

Test environment

  • Rooms: One low-texture room with matte walls, one medium-texture living room, and one bright kitchen with reflective surfaces.
  • Lighting: 100–300 lux ambient, 500–700 lux task light, and one harsh point light to probe robustness.
  • Surfaces: Wood table, white desk, tile floor, and a patterned rug for plane detection variance.

Devices under test

  • Recent iPhone with LiDAR and a non-LiDAR iPhone of the same generation.
  • Recent Android flagship and a mid-tier Android phone.
  • One MR headset with color passthrough for room-scale checks where relevant.

Core procedures

  1. Plane lock time: Measure seconds to first stable horizontal and vertical plane. Repeat three times per room and report the median.
  2. Anchor drift walk: Place a 1 m virtual cube. Walk a 5 m rectangle around it and return. Record positional drift in centimeters.
  3. Lighting robustness: Repeat plane and anchor tests under ambient, task, and point light. Note failure modes like flicker or relock events.
  4. Occlusion realism: Use a human subject and a hand pass test. Score edge accuracy and temporal stability from 1 to 5.
  5. WebXR performance: For browser demos, log average FPS and 1 percent low using the onscreen stats display. Cap draw calls and use compressed textures for control builds (the sketch after this list shows how those stats can be summarized).
  6. Persistence: Close and reopen the app after 10 minutes. Check if anchors restore within 3 cm without manual realignment.
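The medians and 1 percent low figures come from simple post-processing of logged runs. A rough TypeScript sketch of that math follows; the helper names and example numbers are illustrative, not output from any specific tool.

```ts
// median(): used for plane-lock times and drift runs.
// onePercentLowFps(): derived from per-frame times logged during WebXR demos.

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function onePercentLowFps(frameTimesMs: number[]): number {
  // Average FPS over the slowest 1% of frames (at least one frame).
  const slowest = [...frameTimesMs].sort((a, b) => b - a);
  const count = Math.max(1, Math.floor(slowest.length / 100));
  const worst = slowest.slice(0, count);
  const avgMs = worst.reduce((sum, t) => sum + t, 0) / worst.length;
  return 1000 / avgMs;
}

// Example: three plane-lock runs (seconds) and a short frame-time log (ms).
console.log(median([1.2, 1.4, 2.1]));                        // -> 1.4
console.log(onePercentLowFps([16.7, 16.7, 33.4, 16.7, 16.7])); // ~30 FPS in the worst 1%
```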

Scoring rubric

  • Plane lock time: Excellent under 1.5 s; Good 1.5 to 3 s; Needs work over 3 s.
  • Anchor drift (5 m walk): Excellent under 3 cm; Good 3 to 8 cm; Needs work over 8 cm.
  • Occlusion score: Excellent 4 to 5; Good 3; Needs work 1 to 2.
  • WebXR average FPS: Excellent 58 to 60; Good 45 to 57; Needs work under 45.
  • Anchor persistence: Excellent restores within 3 cm; Good within 3 to 8 cm; Needs work over 8 cm or fails to restore.
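One way to apply those thresholds consistently between reviewers is a small helper like this sketch (the names are ours and purely illustrative):

```ts
// Sketch: map raw measurements onto the rubric thresholds above.
type Rating = 'Excellent' | 'Good' | 'Needs work';

function ratePlaneLock(seconds: number): Rating {
  if (seconds < 1.5) return 'Excellent';
  if (seconds <= 3) return 'Good';
  return 'Needs work';
}

function rateAnchorDrift(cm: number): Rating {
  if (cm < 3) return 'Excellent';
  if (cm <= 8) return 'Good';
  return 'Needs work';
}

console.log(ratePlaneLock(1.2), rateAnchorDrift(6)); // -> Excellent Good
```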

Reporting notes

  • All medians are calculated from at least three runs per scenario.
  • We record device model, OS version, app build, and lighting in lux for reproducibility.
  • Where applicable we include a short clip or screenshot sequence to illustrate failures or relock behavior.

Reader Favorite: “Loaded the Real World in Low Resolution”

AR/VR Apparel


Loaded the Real World in Low Resolution

Soft unisex tee with a pixel-city graphic. Light, comfortable, and ideal for meetups, hack nights, and AR demo days.

Pros

  • Soft cotton blend, true-to-size
  • Clean print and subtle pixel motif
  • Pairs well with jackets or hoodies
Cons

  • Graphic tees are not office formal
Est. price: $24.99

FAQ: Augmented Reality

Is AR safe for kids?

Yes, when used responsibly. Encourage frequent breaks, clear play spaces, and privacy controls in apps. In our own mixed-reality fitness tests, shorter 10–15 minute bursts with breaks in between worked better than marathon sessions for younger players.

Do I need an expensive headset?

No. Most AR runs well on modern smartphones. Head-worn devices help for hands-free workflows and training, but phones are still the most accessible place to start. If you are considering a headset, our Best VR headsets guide covers current models that also support mixed reality.

What is the difference between marker-based and marker-less AR?

Marker-based AR triggers from recognized images or objects. Marker-less SLAM maps your environment and anchors content anywhere. For most everyday navigation and furniture apps you are using marker-less AR, even if you never see the term in the settings.

Can AR work offline?

Many features run on device. Cloud services help with large models, shared anchors, and geospatial data. Translation, navigation, and shared-multiplayer experiences are the most likely to reach for a network connection.