Building tools. Learning to build tools. Learning to build learning tools.
With streaming working, the rest of the app is iOS platform integration: making the layout adapt to portrait and landscape, supporting Picture-in-Picture so you can monitor the printer while doing other things, and correctly configuring the permissions and entitlements that iOS requires before it lets your app touch the local network and run audio in the background.
iOS doesn’t give apps a direct “is the device in landscape?” boolean.
Instead, SwiftUI uses size classes — broad descriptors of how much
screen space is available. verticalSizeClass is the one that reliably
distinguishes portrait from landscape on iPhone:
| Orientation | verticalSizeClass | horizontalSizeClass |
|---|---|---|
| iPhone portrait | .regular | .compact |
| iPhone landscape | .compact | .regular |
| iPad (any orientation) | .regular | .regular |
FredCam reads the vertical size class from the SwiftUI environment and maps it to a simple boolean:
```swift
@Environment(\.verticalSizeClass) var verticalSizeClass

private var isLandscape: Bool { verticalSizeClass == .compact }
```
The streaming layout then chooses between horizontal and vertical composition:
```swift
@ViewBuilder
private var streamingLayout: some View {
    if isLandscape {
        HStack(spacing: 0) {
            videoArea
            landscapeControlBar
                .frame(width: 88)
        }
    } else {
        VStack(spacing: 0) {
            videoArea
            portraitControlBar
                .frame(height: 88)
        }
    }
}
```
In portrait, the control bar sits below the video (an 88-point tall strip). In landscape, it sits to the side (an 88-point wide strip). The video fills the remaining space.
There’s a subtle and important constraint in the layout code: CameraView
must never move to a different position in the SwiftUI view tree.
SwiftUI tracks views by their position in the hierarchy. If a view moves — from inside
an HStack to inside a VStack, for example — SwiftUI treats
it as a destruction of the old view and creation of a new one. For most views, this is fine.
For CameraView, it would destroy the KSPlayer instance, close the RTSP
connection, and start a new one from scratch. Every orientation change would interrupt the stream.
The solution: videoArea is a computed property that always lives at the same
structural position, regardless of orientation. Both the HStack (landscape)
and the VStack (portrait) paths put videoArea as their
first child. The container changes; the video doesn’t move within it.
Separately, the control bars are two distinct views:
portraitControlBar and landscapeControlBar. You might expect
a single controlBar view that rearranges itself. But if the same view
switches from the first position in a VStack to the second position in
an HStack, SwiftUI sees it as a different view. Maintaining separate views
for each orientation gives each a stable, predictable identity.
You can force stable identity with the .id() modifier, but that's
a last resort — it tells SwiftUI "treat this as the same view across
updates," yet it doesn't stop the view from occupying a different position in the tree.
The cleanest solution is structural: position the view identically across all layout branches.
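A minimal sketch of what the two bars might look like as sibling computed properties (the button contents and the showSettings state are placeholders, not the app's actual controls):

```swift
// Two distinct views, one per orientation. Each always appears at the
// same structural position within its own layout branch, so SwiftUI
// never sees it "move" and never tears it down on rotation.
private var portraitControlBar: some View {
    HStack(spacing: 24) {
        // controls laid out left-to-right beneath the video
        Button("PiP") { pipAction?() }
        Button("Settings") { showSettings = true }
    }
}

private var landscapeControlBar: some View {
    VStack(spacing: 24) {
        // the same controls stacked vertically beside the video
        Button("PiP") { pipAction?() }
        Button("Settings") { showSettings = true }
    }
}
```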
The video area uses a ZStack to layer connecting and error overlays on top
of the video:
```swift
private var videoArea: some View {
    ZStack {
        Color.black

        if let url = settings.streamURL {
            CameraView(url: url, streamState: $streamState) { action in
                pipAction = action
            }
            .opacity(streamState.isLive ? 1 : 0)
            .animation(.easeIn(duration: 0.5), value: streamState.isLive)
        }

        if case .connecting = streamState {
            connectingOverlay
                .transition(.opacity)
        }

        if case .error(let msg) = streamState {
            errorOverlay(message: msg)
                .transition(.opacity)
        }
    }
    .animation(.easeInOut(duration: 0.3), value: streamState.phase)
}
```
Layers in a ZStack go back-to-front: black background, then the video,
then the connecting overlay, then the error overlay. The video’s opacity is 0
when not live (so you see the background through it), then fades in to 1 when the
stream is ready. The overlays appear and disappear with .transition(.opacity).
The PulsingIcon used in the connecting overlay is a custom SwiftUI view:
```swift
struct PulsingIcon: View {
    let systemName: String
    let color: Color

    @State private var pulsing = false

    var body: some View {
        ZStack {
            Circle()
                .fill(color.opacity(0.15))
                .frame(width: 90, height: 90)
                .scaleEffect(pulsing ? 1.35 : 1.0)
                .opacity(pulsing ? 0 : 0.6)
                .animation(
                    .easeOut(duration: 1.2).repeatForever(autoreverses: false),
                    value: pulsing
                )

            Circle()
                .fill(color.opacity(0.25))
                .frame(width: 70, height: 70)

            Image(systemName: systemName)
                .font(.system(size: 30, weight: .medium))
                .foregroundColor(color)
        }
        .onAppear { pulsing = true }
    }
}
```
When the view appears, pulsing flips to true, triggering
the animation. The outer circle scales up and fades out repeatedly, giving the radar-ping
effect. repeatForever(autoreverses: false) means the animation repeats
indefinitely without reversing — it scales up and fades, then immediately jumps
back to scale 1 and full opacity before repeating.
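Dropping it into the connecting overlay is then a one-liner (the SF Symbol name and color here are illustrative, not the app's actual choices):

```swift
PulsingIcon(systemName: "printer.fill", color: .orange)
```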
Picture-in-Picture (PiP) lets the video float in a small overlay window while the user switches to another app. KSPlayer handles most of the PiP mechanics; the app just needs to configure it correctly.
Three pieces are required:
```swift
options.canStartPictureInPictureAutomaticallyFromInline = true
```
This tells KSPlayer (and the underlying AVPlayerLayer) that PiP can start
automatically when the app moves to the background. Without it, PiP only starts
on explicit user action.
```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```
The audio background mode is required for any video/audio playback
that should continue when the app is backgrounded, including PiP. Without this
entry, iOS suspends the app when it leaves the foreground and PiP stops.
Finally, an onPipReady closure surfaces an action that calls
layer.isPipActive.toggle(), wiring the player layer up to the PiP button.
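Wired together, the closure plumbing might look like this (a sketch — CameraView's initializer shape is taken from the videoArea snippet; everything else is illustrative):

```swift
// The parent view stores the action that CameraView hands back once the
// player layer exists; tapping the PiP button invokes it, which toggles
// layer.isPipActive inside CameraView.
@State private var pipAction: (() -> Void)?

var body: some View {
    VStack(spacing: 0) {
        CameraView(url: url, streamState: $streamState) { action in
            pipAction = action
        }
        Button {
            pipAction?()
        } label: {
            Image(systemName: "pip.enter")
        }
        .disabled(pipAction == nil) // no-op until the layer is ready
    }
}
```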
iOS 14 introduced strict local network access controls. Apps must declare their intent
to use the local network and provide a user-facing explanation. Two Info.plist
entries handle this:
```xml
<key>NSLocalNetworkUsageDescription</key>
<string>FredCam needs local network access to connect to your Bambu Lab printer.</string>

<key>NSBonjourServices</key>
<array>
    <string>_rtsp._tcp</string>
</array>
```
NSLocalNetworkUsageDescription is the permission prompt shown to the user
the first time the app tries to access the local network. If this key is missing, iOS
silently blocks the connection with no explanation — one of the more confusing
failure modes for new iOS developers.
NSBonjourServices declares which mDNS (Bonjour) service types the app
will browse or advertise. Even though FredCam connects directly by IP and doesn’t
do any mDNS discovery, the _rtsp._tcp declaration is required by iOS to
grant local network socket access for RTSP connections on TCP. Without it, the local
network permission prompt may not appear at the right time.
App Transport Security (ATS) is Apple’s enforce-HTTPS mechanism. Since RTSPS is not HTTP, ATS doesn’t directly apply. But to allow any plaintext local-network communication (including the RTSP control channel, which can have plain sub-connections), the plist includes:
```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
```
NSAllowsArbitraryLoads: true disables ATS globally. This is broader than
necessary — a tighter configuration would set NSAllowsLocalNetworking, which
loosens ATS only for local-network connections (ATS domain exceptions are keyed
by DNS name, so a raw printer IP can't be listed under NSExceptionDomains).
For a personal-use app that always connects
to a known LAN device, the broad exception is acceptable. For an App Store submission
that Apple reviews, the narrower key with a justification in the review notes is
the better approach.
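A narrower configuration would scope the exception to local-network traffic only, via the NSAllowsLocalNetworking sub-key (a standard ATS key; the snippet below is illustrative, not taken from the app's plist):

```xml
<!-- Illustrative: permit plaintext only for local-network connections,
     leaving ATS enforcement intact for everything else. -->
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsLocalNetworking</key>
    <true/>
</dict>
```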
Adaptive layout uses verticalSizeClass to switch between HStack and
VStack compositions. CameraView sits at a stable position in both to
avoid stream disruption on rotation. ZStack overlays handle the connecting and error
states. PiP requires the KSOptions flag plus the audio background mode
in Info.plist. Local network access requires NSLocalNetworkUsageDescription
and NSBonjourServices.