
AUGMENTED REALITY – ARKit iOS 4.3.19

  • Mobile Development
Tue Mar 5 by vinod kumar

What is Augmented Reality (AR)?

Augmented reality is an enhanced version of reality in which live, direct or indirect views of real-world environments are augmented with computer-generated images superimposed over the user’s view of the real world, thereby enhancing one’s perception of reality. AR apps for smartphones typically use the global positioning system (GPS) to pinpoint the user’s location and the compass to detect device orientation. More sophisticated AR programs add machine vision, object recognition, and gesture recognition technologies.

Uses of Augmented Reality (AR)

Augmented reality apps are being developed at a rapid pace to enhance many industries – retail, healthcare, real estate, tourism and navigation, and more. AR enhances natural environments and situations and offers perceptually enriched experiences. With the help of advanced AR technologies (for example, computer vision and object recognition), information about the user’s surrounding real world becomes interactive and digitally manipulable: information about the environment and its objects is overlaid on the real world.

Hardware

The hardware components for augmented reality are a processor, a display, sensors, and input devices. Modern mobile devices such as smartphones and tablets already contain these elements, typically including a camera and MEMS sensors such as an accelerometer, GPS, and a solid-state compass, which makes them suitable AR platforms.

ARKit

What is ARKit?

ARKit is what Apple calls its set of software development tools for building augmented reality apps on iOS. Most of us will never use ARKit directly, but we see its results and interact with ARKit apps on iOS. ARKit integrates the iOS device’s camera and motion features to produce augmented reality experiences in your app or game.

ARKit apps place three-dimensional images in your world using visual-inertial odometry, which takes advantage of the device’s sensors to track the world and sense the device’s orientation and position relative to the scene you are looking at. With the IKEA Place app, built with ARKit, you simply hold your device up and use the app to drop a piece of furniture, such as a chair, into an empty space. You can easily move it around and even see details like the texture or the workmanship.
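To give a sense of how such an experience starts, here is a minimal sketch of a world-tracking session with horizontal plane detection, assuming a SceneKit-based view controller (the class name FurniturePreviewViewController and the sceneView outlet are assumptions for illustration, not code from the IKEA app):

import UIKit
import ARKit

class FurniturePreviewViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!   // assumed storyboard outlet

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking uses visual-inertial odometry to follow the device's
        // position and orientation in the real world.
        let configuration = ARWorldTrackingConfiguration()
        // Look for horizontal surfaces (for example, a floor to place a chair on).
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}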

AR Face Tracking

Apple’s face-tracking sample app presents a simple interface allowing you to choose between five augmented reality (AR) visualizations on devices with a TrueDepth front-facing camera. One visualization is an overlay of x/y/z axes indicating the ARKit coordinate system tracking the face (and, in iOS 12, the position and orientation of each eye); in another, virtual 3D content appears to attach to and interact with the user’s real face.

AR Face Tracking Process
  • ARKit uses the TrueDepth camera to track your face
  • It overlays an emoji on your tracked face
  • It manipulates the emoji based on the facial expressions you make
Configure Face Tracking

Face tracking differs from other uses of ARKit in the class you use to configure the session. To enable face tracking, create an instance of ARFaceTrackingConfiguration, configure its properties, and pass it to the run(_:options:) method of the AR session associated with your view:

// Face tracking is only supported on devices with a TrueDepth camera.
guard ARFaceTrackingConfiguration.isSupported else { return }
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = true
// Start (or restart) the session, discarding anchors from any previous run.
sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

Before offering features that require a face-tracking AR session, check the isSupported property on the ARFaceTrackingConfiguration class to determine whether the current device supports ARKit face tracking.

Track the Position and Orientation of a Face

In a SceneKit-based AR experience, you can add 3D content corresponding to a face anchor in the renderer(_:didAdd:for:) or renderer(_:nodeFor:) delegate method. ARKit manages a SceneKit node for the anchor and updates that node’s position and orientation on each frame, so any SceneKit content you add to that node automatically follows the position and orientation of the user’s face:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    // This class adds AR content only for face anchors.
    guard anchor is ARFaceAnchor else { return nil }

    // Load an asset from the app bundle to provide visual content for the anchor.
    // (SCNReferenceNode(named:) is a convenience initializer defined in Apple's sample project, not part of SceneKit.)
    contentNode = SCNReferenceNode(named: "coordinateOrigin")

    // Add content for eye tracking in iOS 12.
    // (addEyeTransformNodes() is a helper method from the sample project.)
    self.addEyeTransformNodes()

    // Provide the node to ARKit for keeping in sync with the face anchor.
    return contentNode
}
Model the User’s Face

ARKit provides a coarse triangle mesh matching the size, shape, and current expression of the user’s face. Your AR experience can use this mesh to place or draw content that appears to attach to the face. For example, by applying a semi-transparent texture to this geometry you could paint virtual tattoos or makeup onto the user’s skin. To create a SceneKit face geometry, initialize an ARSCNFaceGeometry object with the Metal device your SceneKit view uses for rendering, and assign that geometry to the SceneKit node tracking the face anchor:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let sceneView = renderer as? ARSCNView,
          anchor is ARFaceAnchor else { return nil }

    #if targetEnvironment(simulator)
    #error("ARKit is not supported in iOS Simulator. Connect a physical iOS device and select it as your Xcode run destination, or select Generic iOS Device as a build-only destination.")
    #else
    // Build a face geometry backed by the view's Metal device and give it a visible material.
    let faceGeometry = ARSCNFaceGeometry(device: sceneView.device!)!
    let material = faceGeometry.firstMaterial!
    material.diffuse.contents = #imageLiteral(resourceName: "wireframeTexture")
    material.lightingModel = .physicallyBased

    // Attach the face geometry to the node that tracks the face anchor.
    contentNode = SCNNode(geometry: faceGeometry)
    #endif

    return contentNode
}

ARKit updates its face mesh to conform to the shape of the user’s face, even as the user blinks, talks, and makes various expressions. To make the displayed face model follow the user’s expressions, retrieve an updated face mesh in the renderer(_:didUpdate:for:) delegate callback, then update the ARSCNFaceGeometry object in your scene to match by passing the new face mesh to its update(from:) method:
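A minimal version of that delegate callback, following the pattern in Apple’s face-tracking sample, looks like this:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // Only handle nodes that carry our face geometry, and only for face anchors.
    guard let faceGeometry = node.geometry as? ARSCNFaceGeometry,
          let faceAnchor = anchor as? ARFaceAnchor
    else { return }

    // Push the updated vertex data from ARKit into the SceneKit geometry.
    faceGeometry.update(from: faceAnchor.geometry)
}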

Animate a Character

In addition to the face mesh shown in the earlier examples, ARKit also provides a more abstract representation of the user’s facial expressions. You can use this representation (called blend shapes) to control animation parameters for your own 2D or 3D assets, creating a character that follows the user’s real facial movements and expressions.

To get the user’s current facial expression, read the blendShapes dictionary from the face anchor in the renderer(_:didUpdate:for:) delegate callback. Then examine the key-value pairs in that dictionary to calculate animation parameters for your 3D content and update the content accordingly.

// Read the blend-shape coefficients (0.0–1.0) for the expressions we care about.
let blendShapes = faceAnchor.blendShapes
guard let eyeBlinkLeft = blendShapes[.eyeBlinkLeft] as? Float,
      let eyeBlinkRight = blendShapes[.eyeBlinkRight] as? Float,
      let jawOpen = blendShapes[.jawOpen] as? Float
else { return }
// Blink by flattening each eye, and open the jaw by moving it downward.
eyeLeftNode.scale.z = 1 - eyeBlinkLeft
eyeRightNode.scale.z = 1 - eyeBlinkRight
jawNode.position.y = originalJawY - jawHeight * jawOpen
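The snippet above assumes a simple character whose eyes and jaw are separate SceneKit nodes (eyeLeftNode, eyeRightNode, jawNode). Below is a rough sketch of how such a character could be put together; every name, shape, and dimension is an assumption for illustration, not code from Apple’s sample:

import ARKit
import SceneKit

// A rough sketch of a blend-shape-driven character; geometry and sizes are illustrative only.
final class BlendShapeCharacter {
    let contentNode = SCNNode()          // attach this to the face anchor's node
    let eyeLeftNode: SCNNode
    let eyeRightNode: SCNNode
    let jawNode: SCNNode
    let jawHeight: Float = 0.04
    let originalJawY: Float = -0.06

    init() {
        // Two small spheres for the eyes and a box for the jaw (box height matches jawHeight).
        eyeLeftNode = SCNNode(geometry: SCNSphere(radius: 0.01))
        eyeRightNode = SCNNode(geometry: SCNSphere(radius: 0.01))
        jawNode = SCNNode(geometry: SCNBox(width: 0.08, height: 0.04,
                                           length: 0.04, chamferRadius: 0.005))

        // Rough positions (in meters) relative to the face anchor.
        eyeLeftNode.position = SCNVector3(-0.03, 0.03, 0.06)
        eyeRightNode.position = SCNVector3(0.03, 0.03, 0.06)
        jawNode.position = SCNVector3(0, originalJawY, 0.06)

        [eyeLeftNode, eyeRightNode, jawNode].forEach { contentNode.addChildNode($0) }
    }

    // Called from renderer(_:didUpdate:for:) with the updated face anchor;
    // the body is the blend-shape code shown above.
    func update(withFaceAnchor faceAnchor: ARFaceAnchor) {
        let blendShapes = faceAnchor.blendShapes
        guard let eyeBlinkLeft = blendShapes[.eyeBlinkLeft] as? Float,
              let eyeBlinkRight = blendShapes[.eyeBlinkRight] as? Float,
              let jawOpen = blendShapes[.jawOpen] as? Float
        else { return }
        eyeLeftNode.scale.z = 1 - eyeBlinkLeft
        eyeRightNode.scale.z = 1 - eyeBlinkRight
        jawNode.position.y = originalJawY - jawHeight * jawOpen
    }
}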
Conclusion

Apple’s release of ARKit is huge for brands. They can now create AR apps to promote their products, encourage adoption, and drive more sales. As for users? They simply get to enjoy the technology in the AR-enabled apps they download.
