Get started with building apps for spatial computing - Notes
Get started with building apps for spatial computing
Intro
- blend real and virtual (i.e. not VR)
- natural input (i.e. no controllers)
- privacy-first (i.e. limited access to environmental data)
Fundamentals
Foundational Elements
- By default apps launch into the Shared Space, where apps live side by side, like on a Mac
- Each app can have multiple windows, which are SwiftUI scenes (2D and 3D content both work in a window)
- Shared space apps can also have Volumes - SwiftUI scenes for 3d content inside a bounded space
- Volumes can be viewed from multiple angles
- Volumes use RealityKit
- Full Space makes it so your app is the only app visible
- Full Space uses windows & volumes
- ARKit is available in Full Space
- You can use Passthrough in Full Space
- Fully immersive inside of Full Space gives you more capabilities for sound and "customizing the lighting of virtual objects"
- Windows, Volumes, and Spaces are the foundational elements of Spatial Computing
Input
- Eyes and Hands: look at something, "tap their fingers together"
- Users can also reach out and "physically" tap the button in space
- Taps, long presses, drags, rotations, zooms are available and the system automatically detects them
- The same gestures are integrated with SwiftUI and RealityKit
- ARKit gives you skeletal hand tracking
- Input devices like keyboards, trackpads, and game controllers are also supported
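The system tap gesture described above can be sketched in SwiftUI; `SpatialTapGesture` reports where a look-and-pinch (or direct touch) landed. The view struct and asset name here are hypothetical placeholders:

```swift
import SwiftUI
import RealityKit

// Sketch: handling a spatial tap on 3D content in a SwiftUI view.
// "Globe" is a placeholder asset name.
struct GlobeView: View {
    var body: some View {
        Model3D(named: "Globe")
            .gesture(
                SpatialTapGesture()
                    .onEnded { value in
                        // value.location3D is the tap position in the view's space;
                        // the system never reveals where the user is *looking*,
                        // only where the confirmed tap landed.
                        print("Tapped at \(value.location3D)")
                    }
            )
    }
}
```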
Collaboration
- SharePlay and Group Activities framework
- Share any window
- Shared Context is managed by the system
- WWDC23 video reference: Design spatial SharePlay experiences
- WWDC23 video reference: Build spatial SharePlay experiences
Privacy
- No access to the sensors directly
- The system renders a hover effect on a view when it is looked at, but does not communicate to the app where the person is looking
- Permissions are required for scene understanding like detecting walls and furniture
- Permissions are required for hand tracking data
Xcode
- SwiftUI Preview provider shows 3D for RealityKit
- Previews have an object mode for quick previews of 3D layouts
- Use previews for layouts
- Use Simulator to test interaction with simulated system gestures
- Runtime visualizations (plane estimation, collision shapes, etc.) work in the Simulator and on device
Instruments
- RealityKit trace shows GPU and CPU metrics
- WWDC23 video reference: Meet RealityKit trace
Reality Composer Pro
- Preview and prepare 3D content for your apps
- New feature: particles
- Can add 3D sound
- RealityKit has "physically based material" that will interact with real world lighting conditions
- Create custom materials using MaterialX
- Reality Composer Pro can send scenes directly to device without an app
- WWDC23 video reference: Meet Reality Composer Pro
Unity
- Immersive experiences
- WWDC23 video reference: Bring your Unity VR app to a fully immersive space
- WWDC23 video reference: Create immersive Unity apps
Ways to get started
Brand new app
- New Spatial app template
- Initial Scene: window or volume
- Immersive Scene Type: select Space and a second scene will be added; you have to launch into it from the initial Shared Space scene
- WWDC23 video reference: Develop your first immersive app
- Destination Video: shared space app
- Happy Beam: immersive game
- Hello World: transition between spaces
Existing app
- iPad variants will run on visionOS
- iPhone-only apps are fully supported
- Light mode only
- WWDC23 video reference: Run your iPad and iPhone apps in the Shared Space
Deep dive into details
Hello World Example
- SwiftUI window in the shared space
- 3D content in the window
- Volume contains 3D model of Earth
- View Outer Space opens a Full Space with an immersion style of full
Hello World Concepts: Intro
- Foundational Elements: Windows, Volumes, Spaces
- Spectrum of immersion from more presence to deep immersion
Windows
- Windows serve as a starting point in your app, built in SwiftUI
- Add a `WindowGroup` to your SwiftUI `Scene` to make a Window
- `Model3D` is a view similar to `Image` but displays 3D content; RealityKit renders it
- Existing SwiftUI gestures work: tap, onHover, rotateGesture, etc.
- New SwiftUI gestures: RotateGesture3D, SpatialTapGesture, etc.
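A minimal sketch of the window setup described above, assuming a placeholder app name and a bundled 3D asset called "MoonModel":

```swift
import SwiftUI
import RealityKit

// Sketch: a Shared Space app whose starting point is a single window.
// App and asset names are placeholders.
@main
struct WorldApp: App {
    var body: some Scene {
        WindowGroup {
            VStack {
                Text("Hello, spatial computing")
                // Model3D shows 3D content inside a 2D window;
                // RealityKit does the rendering.
                Model3D(named: "MoonModel")
            }
        }
    }
}
```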
Volumes
- Extension of Window ideal for 3D content
- Can contain multiple views
- Built for Shared Space but can be used in Full Space
- Content must remain within its bounds
- Create a volume by using `WindowGroup` with a `windowStyle(.volumetric)`
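The volume setup is just a window scene with a different style; a sketch with a placeholder asset name:

```swift
import SwiftUI
import RealityKit

// Sketch: a volume is a WindowGroup with the volumetric window style.
// "Globe" is a placeholder asset name.
@main
struct GlobeApp: App {
    var body: some Scene {
        WindowGroup {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)
        // Volumes have fixed bounds; they can be sized in physical units
        .defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
    }
}
```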
RealityView
- `RealityView` is a new SwiftUI view that allows you to interact with RealityKit
- `RealityView` provides coordinate space conversion and SwiftUI attachments to 3D objects
- `RealityView` has 3 parameters: make closure, update closure, and attachments view builder
- Make closure is where you create entities and attach them to the root entity
- Update closure gets content and attachments passed in; you add your attachments to the content there, finding them by tag identifiers
- Attachments view builder is where you create your SwiftUI views for attaching
- WWDC23 video reference: Build spatial experiences with RealityKit
- WWDC23 video reference: Enhance your spatial computing app with RealityKit
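The three `RealityView` parameters above can be sketched as follows; the entity name, attachment id, and view struct are placeholders:

```swift
import SwiftUI
import RealityKit

// Sketch of RealityView's make/update/attachments structure.
// "Earth" and "earthLabel" are placeholder identifiers.
struct EarthView: View {
    var body: some View {
        RealityView { content, attachments in
            // Make closure: create entities and add them to the content
            if let earth = try? await Entity(named: "Earth") {
                content.add(earth)
            }
        } update: { content, attachments in
            // Update closure: runs when SwiftUI state changes.
            // Look up an attachment by its identifier and place it in 3D.
            if let label = attachments.entity(for: "earthLabel") {
                content.add(label)
                label.position = [0, 0.3, 0]
            }
        } attachments: {
            // Attachments view builder: SwiftUI views to embed in the scene
            Attachment(id: "earthLabel") {
                Text("Earth").padding()
            }
        }
    }
}
```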
Spaces
- Full Space hides all other apps
- "Place app content anywhere" (this implies that location of non-full-space windows are bounded?)
- Interact with your surroundings and custom hand interactions (i.e. you only get ARKit in Full Space)
- WWDC23 video reference: Meet ARKit for spatial computing
- `ImmersiveSpace` is a SwiftUI scene
- `.immersionStyle` can be `.mixed` (default), `.progressive`, or `.full`
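Declaring an immersive space next to a window scene might look like the sketch below; the app, ids, and views are placeholders, and the style is selected via the `immersionStyle(selection:in:)` modifier:

```swift
import SwiftUI

// Sketch: a Shared Space window plus an ImmersiveSpace scene.
// The app launches into the window; the space is opened later
// (e.g. via the openImmersiveSpace environment action).
@main
struct SolarApp: App {
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            ContentView()   // placeholder launch view in the Shared Space
        }

        ImmersiveSpace(id: "solarSystem") {
            SolarSystemView()   // placeholder immersive content
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}
```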
Next Steps
- WWDC23 video reference: Principles of spatial design
- WWDC23 video reference: Meet SwiftUI for spatial computing
- WWDC23 video reference: Build spatial experiences with RealityKit
- WWDC23 video reference: Meet Reality Composer Pro