Getting Started with ARKit with Swift 4 - Xcode 9, iOS 11 - Augmented Reality in Swift

Published: 21 June 2018
on channel: Kodeco
5,273 views · 44 likes

In this video, you'll get started with ARKit. You'll learn about the various framework components and how to enable ARKit in your app.

You can download the course materials here:

https://files.betamax.raywenderlich.c...

You can watch the screencast here:

https://www.raywenderlich.com/5031-in...

For more screencasts on ARKit, check out raywenderlich.com

---

About www.raywenderlich.com:

raywenderlich.com is a website focused on developing high quality programming tutorials. Our goal is to take the coolest and most challenging topics and make them easy for everyone to learn – so we can all make amazing apps.

We are also focused on developing a strong community. Our goal is to help each other reach our dreams through friendship and cooperation. As you can see below, a bunch of us have joined forces to make this happen: authors, editors, subject matter experts, app reviewers, and most importantly our amazing readers!

---

Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game.

Overview

Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device's camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences using either the back camera or front camera of an iOS device.
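In practice, those pieces come together in only a few lines of code. Below is a minimal sketch of enabling ARKit, assuming an ARKit-capable device and an NSCameraUsageDescription entry in Info.plist; the class and view names are illustrative, not taken from the screencast materials.

import UIKit
import ARKit

// Minimal ARKit setup: host an ARSCNView and run a session with the
// default world-tracking configuration.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking requires an A9 or later processor; bail out gracefully otherwise.
        guard ARWorldTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()   // stop camera and motion processing when off screen
    }
}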

Augmented Reality with the Back Camera

The most common kinds of AR experience display a view from an iOS device's back-facing camera, augmented by other visual content, giving the user a new way to see and interact with the world around them.

ARWorldTrackingConfiguration provides this kind of experience: ARKit maps and tracks the real-world space the user inhabits, and matches it with a coordinate space for you to place virtual content. World tracking also offers features to make AR experiences more immersive, such as recognizing objects and images in the user's environment and responding to real-world lighting conditions.
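As a sketch of those world-tracking features, assuming you already have an ARSession to run (for example, the one owned by an ARSCNView): plane detection and light estimation are shown below, and image detection is gated on iOS 11.3 or later. The asset catalog group name "AR Resources" is an illustrative assumption.

import ARKit

// Configure world tracking with the immersive extras described above and
// run it on an existing session.
func runWorldTracking(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal]       // find flat surfaces to place content on
    configuration.isLightEstimationEnabled = true      // respond to real-world lighting

    // Image detection needs iOS 11.3+ and a reference image group in the asset catalog.
    if #available(iOS 11.3, *),
       let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                              bundle: nil) {
        configuration.detectionImages = referenceImages
    }

    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}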

Note

You can display a 3D object in the user's real-world environment without building a custom AR experience. In iOS 12, the system provides an AR view for 3D objects when you use QLPreviewController with USDZ files in an app, or use Safari or WebKit with USDZ files in web content.
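If that system-provided path is all you need, presenting a bundled USDZ file with QLPreviewController might look like the following sketch (iOS 12+). The file name robot.usdz is an assumption for illustration.

import UIKit
import QuickLook

// Quick Look shows its own AR view for USDZ models; the app only supplies a file URL.
class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {

    func showModel() {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        present(previewController, animated: true)
    }

    // MARK: - QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // The hypothetical model "robot.usdz" is assumed to be in the app bundle.
        let url = Bundle.main.url(forResource: "robot", withExtension: "usdz")!
        return url as QLPreviewItem
    }
}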

Augmented Reality with the Front Camera

On iPhone X, ARFaceTrackingConfiguration uses the front-facing TrueDepth camera to provide real-time information about the pose and expression of the user's face for you to use in rendering virtual content. For example, you might show the user's face in a camera view and provide realistic virtual masks. You can also omit the camera view and use ARKit facial expression data to animate virtual characters, as seen in the Animoji app for iMessage.
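A hedged sketch of face tracking follows, assuming a TrueDepth-equipped device; reading the jawOpen blend shape each frame is just an illustrative stand-in for driving a virtual character.

import UIKit
import ARKit
import SceneKit

// Run face tracking when supported and read blend-shape values as the face anchor updates.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking is only available on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever the tracked face anchor updates.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
        // Use jawOpen (0...1) to animate a character's mouth, as Animoji does.
        print("jawOpen:", jawOpen)
    }
}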
