Instant Fusion Tracker iOS Tutorial
1. Overview
2. iOS Development
2.1 Create Instants
2.2 Start / Stop Tracker
2.3 Use Tracking Information
2.4 Create Instant Target Data
3. Reference
3.1 API Reference
3.2 Sample
1. Overview
Start developing with the MAXST ARSDK Instant Fusion Tracker on the iOS platform. Refer to Instant Tracker Introduction for detailed information.
Refer to Tracker Coordinate System to better understand the 3D coordinate system of the Instant Fusion Tracker.
After target recognition and initial poses are acquired through the MAXST SDK, ARKit is used for tracking.
※ To use ARKit, you must enter the actual size. (See Start / Stop Tracker)
| Prerequisites |
|---|
| Instant Tracker Introduction |
| Tracker Coordinate System |
2. iOS Development
Start developing in Xcode using Swift. Refer to Requirements & Supports to find out which devices are supported.
The ARSDK has to be properly integrated into an iOS UIViewController. Refer to the Life Cycle documents for details.
2.1 Create Instants
```swift
var cameraDevice:MasCameraDevice = MasCameraDevice()
var trackingManager:MasTrackerManager = MasTrackerManager()
```
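For context, these objects are typically held as properties of the UIViewController that drives the AR scene. The class name below is only illustrative; refer to the Life Cycle documents for where to create and release them.

```swift
import UIKit

// Illustrative only: a view controller that owns the SDK objects used throughout this tutorial.
class InstantFusionViewController: UIViewController {

    // Camera device and tracker manager from the MAXST ARSDK (see 2.1).
    var cameraDevice:MasCameraDevice = MasCameraDevice()
    var trackingManager:MasTrackerManager = MasTrackerManager()
}
```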
2.2 Start / Stop Tracker
trackingManager.isFusionSupported()
This function checks whether or not your device supports Fusion.
The return value is a Bool. If true, the device in use supports Fusion; if false, it does not.
trackingManager.getFusionTrackingState()
This function returns the tracking state of the current Fusion.
The return value is an Int: -1 means that tracking is not working properly, and 1 means that it is working properly.
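As a minimal sketch (assuming trackingManager is the MasTrackerManager created in 2.1), the two functions can be combined to decide whether Fusion tracking results are usable:

```swift
// Check Fusion support once, e.g. before starting the tracker.
if trackingManager.isFusionSupported() {
    // Query the Fusion tracking state, e.g. once per frame.
    let fusionState = trackingManager.getFusionTrackingState()
    if fusionState == 1 {
        // Fusion tracking is working properly; tracking results can be used for augmentation (see 2.3).
    } else {
        // -1: tracking is not working properly yet.
    }
} else {
    // This device does not support Fusion.
}
```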
To start / stop the tracker, refer to the following code.
```swift
func startEngine() {
    ...
    // Start the Instant Fusion Tracker.
    trackingManager.start(.TRACKER_TYPE_INSTANT_FUSION)
    ...
}

@objc func pauseAR() {
    ...
    // Stop the tracker while AR is paused.
    trackingManager.stopTracker()
}

@objc func resumeAR() {
    // Restart the Instant Fusion Tracker.
    trackingManager.start(.TRACKER_TYPE_INSTANT_FUSION)
    ...
}
```
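One way to drive these methods, sketched here as an assumption rather than the SDK's prescribed pattern (see the Life Cycle documents), is to resume and pause the tracker from the view controller's appearance callbacks:

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    resumeAR()   // restart the Instant Fusion Tracker when the view becomes visible
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    pauseAR()    // stop the tracker when the view goes away
}
```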
2.3 Use Tracking Information
To augment an object using Tracking results, refer to the following code.
```swift
func renderer(_ renderer: SCNSceneRenderer, willRenderScene scene: SCNScene, atTime time: TimeInterval) {
    if backgroundCameraQuad == nil {
        backgroundCameraQuad = BackgroundCameraQuad(device: renderer.device, pixelFormat: renderer.colorPixelFormat)
    }

    let commandEncoder:MTLRenderCommandEncoder = renderer.currentRenderCommandEncoder!

    // Get the latest tracking state, result, and camera image.
    let trackingState:MasTrackingState = trackingManager.updateTrackingState()
    let result:MasTrackingResult = trackingState.getTrackingResult()
    let backgroundImage:MasTrackedImage = trackingState.getImage()

    var backgroundProjectionMatrix:matrix_float4x4 = cameraDevice.getBackgroundPlaneProjectionMatrix()
    let projectionMatrix:matrix_float4x4 = cameraDevice.getProjectionMatrix()

    // Draw the camera image as the background.
    if let cameraQuad = backgroundCameraQuad {
        if(backgroundImage.getData() != nil) {
            cameraQuad.setProjectionMatrix(projectionMatrix: backgroundProjectionMatrix)
            cameraQuad.draw(commandEncoder: commandEncoder, image: backgroundImage)
        }
    }

    // Augment the object only while a trackable exists and Fusion tracking is working properly.
    let trackingCount:Int32 = result.getCount()
    if trackingCount > 0 && fusionState == 1 {
        let trackable:MasTrackable = result.getTrackable(0)
        let poseMatrix:matrix_float4x4 = trackable.getPose()

        textureCube!.setProjectionMatrix(projectionMatrix: projectionMatrix)
        textureCube!.setPoseMatrix(poseMatrix: poseMatrix)
        textureCube!.setTranslation(x: panTranslateX, y: 0.0, z: panTranslateY)
        textureCube!.setScale(x: 0.1, y: 0.1, z: 0.00001)
        textureCube!.draw(commandEncoder: commandEncoder)
    }
}
```
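In this render callback, fusionState is assumed to hold the most recent value returned by trackingManager.getFusionTrackingState() (see 2.2), so the textured cube is drawn only while Fusion tracking is working properly and at least one trackable has been recognized.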
2.4 Create Instant Target Data
You can create new instant target data only when the tracker is not in a tracking state. Refer to the following code.
```swift
@IBAction func clickStartButton(_ sender: Any) {
    let button:UIButton = sender as! UIButton
    if button.titleLabel?.text == "START" {
        trackingManager.findSurface()
        button.setTitle("STOP", for: .normal)
    } else if button.titleLabel?.text == "STOP" {
        trackingManager.quitFindingSurface()
        button.setTitle("START", for: .normal)
    }
}
```
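Here findSurface() puts the tracker into surface-finding mode so new instant target data can be created, and quitFindingSurface() ends that mode; the button title is updated to reflect the current mode.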
3. Reference
These are additional references for developing the Instant Fusion Tracker.
3.1 API Reference
The following documents explain the classes and functions used to run the Instant Fusion Tracker.
3.2 Sample
For information on building and running the Instant Fusion Tracker sample, refer to Sample.