This article is reposted from my GitHub Pages blog; it has not been translated yet, so please bear with me.
Overview
In ARKit 1, we have:
- Device positioning from the world-tracking process
- Horizontal and vertical plane detection from the world-tracking process
- Lighting estimation
- AR face tracking
In ARKit 2, we have:
- Saving and loading maps
- Environment texturing
- Image detection and tracking
- 3D object detection
- Improved face tracking
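Most of these features are opted into through ARWorldTrackingConfiguration. As a quick orientation, here is a minimal sketch, assuming iOS 12 and a view controller with an ARSCNView property named sceneView (my own naming), that enables plane detection and the new automatic environment texturing:

```swift
import ARKit

// Minimal sketch: enable ARKit 2-era options on a world-tracking session.
// `sceneView` is an assumed ARSCNView property of the view controller.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical] // ARKit 1.5+
configuration.environmentTexturing = .automatic         // new in ARKit 2 (iOS 12)
sceneView.session.run(configuration)
```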
New Features in ARKit 2
Saving and Loading Maps
World Tracking Recap:
- Position and orientation of the device.
- Physical scale in the scene.
- 3D feature points.
- Relocalization (iOS 11.3): virtual objects can be relocalized when the AR session is interrupted, for example when the app goes to or returns from the background. This is implemented by storing the mapping (an ARWorldMap) between the real world and the coordinate system; however, that mapping is not exposed to developers (a sketch of opting in follows this list).
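Opting into relocalization goes through the session's delegate. A minimal sketch, assuming a hypothetical ViewController class that acts as the ARSession delegate:

```swift
import ARKit

// Sketch: ask ARKit to relocalize after an interruption (iOS 11.3+).
// `ViewController` is a hypothetical class acting as the session delegate.
extension ViewController: ARSessionDelegate {
    func sessionShouldAttemptRelocalization(_ session: ARSession) -> Bool {
        // Returning true tells ARKit to restore the previous coordinate
        // system from its internally stored map instead of starting over.
        return true
    }
}
```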
World Tracking Enhancements:
- Saving and loading maps: exposes the ARWorldMap to developers (a save/load sketch follows this list).
- Faster initialization and plane detection
- Robust tracking and plane detection
- More accurate extent and boundary
- Continuous autofocus
- New 4:3 video formats (the iPad screen is also 4:3)
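A minimal sketch of the save/load round trip, assuming iOS 12 and hypothetical sceneView and savedMap properties on the view controller: getCurrentWorldMap(completionHandler:) captures the map, and initialWorldMap feeds it back into a new session.

```swift
import ARKit

// Sketch: capture the current map, then relocalize a new session against it.
// `sceneView` (ARSCNView) and `savedMap` are assumed properties.
func saveCurrentWorldMap() {
    sceneView.session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable:", error?.localizedDescription ?? "unknown error")
            return
        }
        self.savedMap = map
    }
}

func restoreWorldMap(_ map: ARWorldMap) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map // relocalize against the saved map
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```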
Saving and loading maps:
An ARWorldMap contains:
- Mapping of physical 3D space: for representing 3D feature points in the coordinate system.
- Mutable list of named anchors: for restoring the previous 3D environment (for example, a lighting node anchor) and relocalizing previously added virtual objects.
- Raw feature points and extent: for debugging and visualization.
- Serialization: for storing the map to a file and recovering it later (a round-trip sketch follows this list).
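Since ARWorldMap conforms to NSSecureCoding, persisting it is a plain archive/unarchive round trip. A sketch, where mapFileURL is a hypothetical location in the app's documents directory:

```swift
import ARKit

// Sketch: write an ARWorldMap to disk and read it back.
// `mapFileURL` is a hypothetical file location chosen for this example.
let mapFileURL = FileManager.default
    .urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("session.worldmap")

func writeWorldMap(_ map: ARWorldMap) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                requiringSecureCoding: true)
    try data.write(to: mapFileURL, options: .atomic)
}

func readWorldMap() throws -> ARWorldMap? {
    let data = try Data(contentsOf: mapFileURL)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
}
```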
We can use the map in two different ways:
- Persistence: restore a previous AR scene in a new AR session. For example, you go to another room and come back, or you close the AR app and reopen it some time later.
- Multiuser experience: share the map among devices over Wi-Fi or Bluetooth.
SwiftShot, the multiplayer AR game Apple demoed at WWDC 2018, is a multiuser experience built on this map-sharing mechanism.
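For the multiuser case, the map is just data once archived, so it can travel over any transport. A sketch using MultipeerConnectivity (the framework SwiftShot uses), where mcSession is an assumed, already-connected MCSession:

```swift
import ARKit
import MultipeerConnectivity

// Sketch: broadcast the current world map to connected peers.
// `mcSession` is an assumed, already-connected MCSession.
func shareWorldMap(_ map: ARWorldMap, over mcSession: MCSession) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                requiringSecureCoding: true)
    try mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
}
```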