Object Tracker iOS Tutorial
1. Overview
2. iOS Development
2.1 Create Instances
2.2 Start / Stop Tracker
2.3 Use Tracking Information
2.4 Set Map
2.5 Add / Replace Map
2.6 Change Tracking Mode
3. Reference
3.1 API Reference
3.2 Sample
1. Overview
Start developing the MAXST AR SDK Object Tracker on the iOS platform. Refer to Object Tracker Introduction for detailed information.
Refer to Tracker Coordinate System to better understand the 3D coordinate system of the Object Tracker.
Generate a 3D map file of the target for the Object Tracker via the Visual SLAM Tool.
The Object Tracker loads the map file and renders a 3D object on it. Refer to the Visual SLAM Learning Guide to create a more precise map while scanning 3D space.
| Prerequisites |
| --- |
| Object Tracker Introduction |
| Visual SLAM Tool |
| Tracker Coordinate System |
| Visual SLAM Learning Guide |
2. iOS Development
Start developing in Xcode using Swift. Refer to Requirements & Supports to find out which devices are supported.
The AR SDK has to be properly integrated with an iOS UIViewController. Refer to the Life Cycle document for details.
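The exact integration points depend on your project, but as a rough sketch (assuming the resumeAR() / pauseAR() helpers shown in section 2.2, with placeholder class names), the tracker is typically resumed and paused together with the view controller's life cycle:

```swift
import UIKit

// Minimal life-cycle sketch. resumeAR() / pauseAR() are the helpers from
// section 2.2; the class name here is a placeholder.
class ObjectTrackerViewController: UIViewController {

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        resumeAR()   // start the camera / tracker when the view becomes visible
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        pauseAR()    // stop the tracker when the view goes away
    }

    @objc func resumeAR() { /* see section 2.2 */ }
    @objc func pauseAR() { /* see section 2.2 */ }
}
```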
2.1 Create Instances
```swift
var cameraDevice:MasCameraDevice = MasCameraDevice()
var trackingManager:MasTrackerManager = MasTrackerManager()
```
2.2 Start / Stop Tracker
To start / stop the tracker after loading the map, refer to the following code.
※ To change to a different tracker, destroyTracker() should be called first.
```swift
@objc func resumeAR() {
    ...
    trackingManager.start(.TRACKER_TYPE_OBJECT)
}

@objc func pauseAR() {
    trackingManager.stopTracker()
    ...
}
```
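As a hedged illustration of the note above, switching to a different tracker would look roughly like this. The TRACKER_TYPE_IMAGE constant is an assumption borrowed from the other tracker tutorials, not something this page defines:

```swift
// Sketch only: destroy the current tracker before starting a different one.
func switchToAnotherTracker() {
    trackingManager.stopTracker()
    trackingManager.destroyTracker()             // required before changing trackers
    trackingManager.start(.TRACKER_TYPE_IMAGE)   // assumed constant for another tracker type
}
```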
2.3 Use Tracking Information
To use the tracking information, refer to the following code.
※ The tracker must be started beforehand (see 2.2).
```swift
func draw(in view: MTKView) {
    ...
    let trackingState:MasTrackingState = trackingManager.updateTrackingState()
    let result:MasTrackingResult = trackingState.getTrackingResult()
    let backgroundImage:MasTrackedImage = trackingState.getImage()

    var backgroundProjectionMatrix:matrix_float4x4 = cameraDevice.getBackgroundPlaneProjectionMatrix()
    let projectionMatrix:matrix_float4x4 = cameraDevice.getProjectionMatrix()

    if let cameraQuad = backgroundCameraQuad {
        cameraQuad.setProjectionMatrix(projectionMatrix: backgroundProjectionMatrix)
        cameraQuad.draw(commandEncoder: commandEncoder, image: backgroundImage)
    }

    let trackingCount:Int32 = result.getCount()
    for i in stride(from: 0, to: trackingCount, by: 1) {
        let trackable:MasTrackable = result.getTrackable(i)
        let poseMatrix:matrix_float4x4 = trackable.getPose()

        textureCube.setProjectionMatrix(projectionMatrix: projectionMatrix)
        textureCube.setPoseMatrix(poseMatrix: poseMatrix)
        textureCube.setTranslation(x: 0.0, y: 0.0, z: -0.15)
        textureCube.setScale(x: 0.3, y: 0.3, z: 0.3)
        textureCube.draw(commandEncoder: commandEncoder)
    }
    ...
}
```
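The pose returned by getPose() is a standard matrix_float4x4. As a small sketch (not part of the original tutorial, assuming getPose() returns the model matrix of the tracked object as used in the draw loop above), the translation of the tracked object in the tracker coordinate system can be read from the last column of that matrix:

```swift
import simd

// Illustrative helper: log the position of a tracked object.
func logTrackedPosition(_ trackable: MasTrackable) {
    let poseMatrix: matrix_float4x4 = trackable.getPose()
    let position = poseMatrix.columns.3   // translation component (x, y, z, 1)
    print("Tracked object at x=\(position.x), y=\(position.y), z=\(position.z)")
}
```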
2.4 Set Map
Register the map file by calling addTrackerData, then call loadTrackerData; after that, the target object can be tracked. To set a map, refer to the following code.
```swift
func startEngine() {
    ...
    trackingManager.addTrackerData(objectTrackerMapPath)
    trackingManager.loadTrackerData()
}
```
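If the map file is bundled with the app, objectTrackerMapPath can be resolved from the main bundle. This is a sketch under the assumption that the file produced by the Visual SLAM Tool is named object_target.3dmap; both the file name and the helper name are placeholders:

```swift
import Foundation

// Placeholder resource name; use the file generated by the Visual SLAM Tool.
func objectTrackerMapPathFromBundle() -> String? {
    return Bundle.main.path(forResource: "object_target", ofType: "3dmap")
}
```

The returned path can then be passed to addTrackerData() and loadTrackerData() as shown above.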
2.5 Add / Replace Map
※ You can add multiple 3D maps to recognize one of several objects. We recommend adding up to three maps.
1. Create a map file by referring to the Visual SLAM Tool.
2. Copy the generated map file to the desired path.
3. If a map file is already registered, call addTrackerData and loadTrackerData after calling trackingManager.removeTrackerData(), as shown in the sketch below.
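A minimal sketch of the replace flow, assuming newMapPath points to the copied map file (the helper name is hypothetical):

```swift
// Remove previously registered map data, then register and load the new map.
func replaceTrackerMap(with newMapPath: String) {
    trackingManager.removeTrackerData()
    trackingManager.addTrackerData(newMapPath)
    trackingManager.loadTrackerData()
}
```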
2.6 Change Tracking Mode
Object Tracker has two tracking modes:

- NORMAL_TRACKING
- EXTENDED_TRACKING

EXTENDED_TRACKING is the default setting. It tracks the surrounding environment beyond the trained object.

```swift
trackingManager.setTrackingOption(.EXTENDED_TRACKING)
```

NORMAL_TRACKING tracks only the trained object.

```swift
trackingManager.setTrackingOption(.NORMAL_TRACKING)
```
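As an optional usage sketch (the UISwitch action is hypothetical and not part of the SDK), the mode can be toggled at runtime:

```swift
import UIKit

// Hypothetical UI hook: switch between EXTENDED_TRACKING and NORMAL_TRACKING.
@objc func onExtendedTrackingChanged(_ sender: UISwitch) {
    if sender.isOn {
        trackingManager.setTrackingOption(.EXTENDED_TRACKING)
    } else {
        trackingManager.setTrackingOption(.NORMAL_TRACKING)
    }
}
```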
3. Reference
These are additional references for developing with the Object Tracker.
3.1 API Reference
The following documents explain the classes and functions used to run the Object Tracker.
3.2 Sample
For information on building and running the Object Tracker sample, refer to the Sample page.