130 Widgets

Semi-random thoughts and tales of tinkering

10. Complete Source Code Reference

This section contains the final version of every source file in the project. Use it as a reference if you get lost during the tutorial, or as a starting point if you want to skip ahead and have the working app in front of you. Each file includes a brief description and cross-references to the tutorial sections where its components were explained.

Project Structure

The complete project has four Swift source files and one asset catalog. Here is the file layout:

VUMeter/
├── VUMeter.xcodeproj
├── VUMeter/
│   ├── VUMeterApp.swift          ← App entry point
│   ├── AudioEngine.swift         ← Audio capture + processing
│   ├── SpectrumAnalyzer.swift    ← FFT analysis
│   ├── ContentView.swift         ← All UI code
│   └── Assets.xcassets/          ← App icon, colors
└── VUMeterTests/
    └── SpectrumAnalyzerTests.swift ← Unit tests
            

There are no storyboards, no XIB files, no Interface Builder artifacts. The entire UI is defined in Swift code via SwiftUI. The entire audio pipeline is pure Swift with calls into Apple's Accelerate framework for the heavy math. No third-party dependencies.

VUMeterApp.swift

The app entry point. This file defines the @main struct and launches the single window containing ContentView. Covered in Section 1.

import SwiftUI

@main
struct VUMeterApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

Nothing to customize here. This is the standard SwiftUI app boilerplate. If you later add features like app-level state (a settings model, for instance), you would inject them into the environment from this file.
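As a sketch of what that injection might look like: the Settings model below is hypothetical (it is not part of the tutorial project), but the pattern of creating app-level state here and passing it down via .environment is the standard one.

```swift
import SwiftUI
import Observation

// Hypothetical app-level settings model -- not part of the tutorial project.
@Observable
class Settings {
    var barCount = 48
}

@main
struct VUMeterApp: App {
    // Owned at the app level so it outlives any individual window/view.
    @State private var settings = Settings()

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(settings)  // child views read it via @Environment(Settings.self)
        }
    }
}
```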

AudioEngine.swift

The audio capture and processing engine. This class owns the AVAudioEngine, installs a tap on the microphone input, computes RMS level, and delegates FFT analysis to SpectrumAnalyzer. The initial version was built in Section 3, optimized with Accelerate in Section 4, and extended with spectrum data in Section 6.

import AVFoundation
import Accelerate
import Observation

@Observable
class AudioEngine {
    var level: Float = 0.0
    var spectrumBars: [Float] = Array(repeating: 0, count: 48)
    var peakHz: Float = 0
    var peakNote: String = "—"
    var isRunning = false

    private let engine = AVAudioEngine()
    private var analyzer: SpectrumAnalyzer?

    func start() {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        // Create analyzer with the device's actual sample rate (48000 on most iPhones, not 44100)
        analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: format.sampleRate)

        input.installTap(onBus: 0, bufferSize: 4096, format: format) { [weak self] buffer, _ in
            guard let self, let analyzer = self.analyzer else { return }

            guard let channelData = buffer.floatChannelData?[0] else { return }
            let samples = Array(UnsafeBufferPointer(start: channelData,
                                                    count: Int(buffer.frameLength)))

            let rms = Self.computeRMS(samples: samples)
            let (bars, peakHz, note) = analyzer.process(buffer: samples)

            DispatchQueue.main.async {
                self.level = Self.normalize(rms)
                self.spectrumBars = bars
                self.peakHz = peakHz
                self.peakNote = note
            }
        }

        do {
            try engine.start()
            isRunning = true
        } catch {
            print("Audio engine failed to start: \(error)")
        }
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
        isRunning = false
        level = 0
        spectrumBars = Array(repeating: 0, count: 48)
        peakHz = 0
        peakNote = "—"
    }

    private static func computeRMS(samples: [Float]) -> Float {
        guard !samples.isEmpty else { return 0 }
        var sum: Float = 0
        vDSP_svesq(samples, 1, &sum, vDSP_Length(samples.count))
        return sqrt(sum / Float(samples.count))
    }

    private static func normalize(_ rms: Float) -> Float {
        let db = 20 * log10(max(rms, 1e-6))
        let clamped = max(-60, min(0, db))
        return (clamped + 60) / 60
    }
}

Key design decisions

- Buffer size of 4096: balances frequency resolution (10.8 Hz per bin at 44.1 kHz) against latency (~93 ms).
- vDSP_svesq: replaces the hand-written RMS loop from Section 3 with a single Accelerate call (see Section 4).
- [weak self]: prevents a retain cycle between the closure and the engine instance (see Section 3).
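The resolution-versus-latency tradeoff is just two divisions, so you can sanity-check the numbers above (and try other buffer sizes) with a few lines:

```swift
// Frequency resolution and latency for a given buffer/FFT size (pure arithmetic).
func tradeoff(bufferSize: Int, sampleRate: Double) -> (hzPerBin: Double, latencyMs: Double) {
    (sampleRate / Double(bufferSize),            // width of one FFT bin in Hz
     Double(bufferSize) / sampleRate * 1000)     // time to fill one buffer, in ms
}

let a = tradeoff(bufferSize: 4096, sampleRate: 44100)
// about 10.8 Hz per bin, about 93 ms of latency

let b = tradeoff(bufferSize: 1024, sampleRate: 44100)
// about 43 Hz per bin, about 23 ms -- snappier, but far too coarse to tell notes apart
```

At 1024 samples a whole semitone near middle C (about 15 Hz wide) fits inside a single bin, which is why the project stays at 4096.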

SpectrumAnalyzer.swift

The FFT-based frequency analysis engine. Takes raw audio samples, applies a Hanning window, computes the FFT via Accelerate, maps frequency bins to 48 logarithmically spaced display bars, and identifies the peak frequency as a musical note. Built in Section 5 and integrated in Section 6.

import Accelerate

struct SpectrumAnalyzer {
    let binCount: Int
    let sampleRate: Double
    private let fftSize: Int
    private let halfSize: Int
    private var fftSetup: vDSP.FFT<DSPSplitComplex>?
    private var window: [Float]

    init(binCount: Int = 48, sampleRate: Double = 44100, fftSize: Int = 4096) {
        self.binCount = binCount
        self.sampleRate = sampleRate
        self.fftSize = fftSize
        self.halfSize = fftSize / 2

        self.window = vDSP.window(ofType: Float.self,
                                  usingSequence: .hanningDenormalized,
                                  count: fftSize,
                                  isHalfWindow: false)

        let log2n = vDSP_Length(log2(Float(fftSize)))
        self.fftSetup = vDSP.FFT(log2n: log2n,
                                 radix: .radix2,
                                 ofType: DSPSplitComplex.self)
    }

    func process(buffer: [Float]) -> (bars: [Float], peakHz: Float, note: String) {
        guard let fftSetup else {
            return (Array(repeating: 0, count: binCount), 0, "—")
        }

        // Pad or truncate to FFT size, then apply window
        var samples = Array(buffer.prefix(fftSize))
        if samples.count < fftSize {
            samples += Array(repeating: 0, count: fftSize - samples.count)
        }
        vDSP.multiply(samples, window, result: &samples)

        // FFT: convert to split complex, transform, compute magnitudes
        var reals = [Float](repeating: 0, count: halfSize)
        var imags = [Float](repeating: 0, count: halfSize)
        let magnitudes: [Float] = reals.withUnsafeMutableBufferPointer { realsBP in
            imags.withUnsafeMutableBufferPointer { imagsBP in
                var splitComplex = DSPSplitComplex(
                    realp: realsBP.baseAddress!,
                    imagp: imagsBP.baseAddress!)
                samples.withUnsafeBytes { ptr in
                    let floatPtr = ptr.bindMemory(to: DSPComplex.self)
                    vDSP_ctoz(floatPtr.baseAddress!, 2,
                              &splitComplex, 1,
                              vDSP_Length(halfSize))
                }

                fftSetup.forward(input: splitComplex, output: &splitComplex)

                var mags = [Float](repeating: 0, count: halfSize)
                vDSP.absolute(splitComplex, result: &mags)
                vDSP.multiply(1.0 / Float(fftSize), mags, result: &mags)
                return mags
            }
        }

        // Map to log-spaced display bars
        let bars = logSpacedBars(magnitudes: magnitudes)

        // Find peak frequency and convert to note name
        let peakBin = magnitudes.indices.max(by: {
            magnitudes[$0] < magnitudes[$1]
        }) ?? 0
        let peakHz = Float(peakBin) * Float(sampleRate) / Float(fftSize)
        let note = peakHz > 30 ? frequencyToNote(peakHz) : "—"

        return (bars, peakHz, note)
    }

    private func logSpacedBars(magnitudes: [Float]) -> [Float] {
        let minFreq: Float = 60
        let maxFreq: Float = 18000
        let logMin = log10(minFreq)
        let logMax = log10(maxFreq)

        return (0..<binCount).map { i in
            let logLow  = logMin + Float(i)     / Float(binCount) * (logMax - logMin)
            let logHigh = logMin + Float(i + 1) / Float(binCount) * (logMax - logMin)
            let freqLow  = pow(10, logLow)
            let freqHigh = pow(10, logHigh)

            let binLow  = Int(freqLow  / Float(sampleRate) * Float(fftSize))
            let binHigh = Int(freqHigh / Float(sampleRate) * Float(fftSize))
            let lo = max(0, binLow)
            let hi = min(magnitudes.count - 1, max(binLow, binHigh))
            let slice = magnitudes[lo...hi]

            let avg = slice.isEmpty ? 0 : slice.reduce(0, +) / Float(slice.count)
            let db = 20 * log10(max(avg, 1e-9))
            let normalized = (db + 80) / 80
            return max(0, min(1, normalized))
        }
    }

    private func frequencyToNote(_ hz: Float) -> String {
        let notes = ["C","C#","D","D#","E","F","F#","G","G#","A","A#","B"]
        let midi = Int(round(12 * log2(hz / 440.0) + 69))
        let note = notes[((midi % 12) + 12) % 12]
        let octave = (midi / 12) - 1
        return "\(note)\(octave)"
    }
}

Anatomy of the FFT pipeline

The signal processing chain in process() follows a standard pattern:

- Pad/truncate to a power-of-two length.
- Window to reduce spectral leakage (see Section 5).
- FFT to transform from the time domain to the frequency domain.
- Magnitude to collapse complex values to real amplitudes.
- Normalize by dividing by the FFT size, so magnitudes are independent of buffer length.

The log-spaced binning then maps the linear FFT output to a perceptual frequency scale for display.
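The bin-to-frequency mapping is where the sample rate mismatch bug lives, and you can trace its numbers by hand. This sketch reproduces the arithmetic behind the regression test's "440 Hz misread as ~409 Hz / G#4" comment:

```swift
import Foundation

let fftSize = 4096.0

// A 440 Hz tone sampled at 48 kHz lands near bin 440 * 4096 / 48000 ≈ 37.55,
// so the magnitude peak shows up at the nearest bin, 38.
let peakBin = (440.0 * fftSize / 48000.0).rounded()        // 38

// Interpreting bin 38 with the WRONG rate (44.1 kHz) gives the buggy reading:
let wrongHz = peakBin * 44100.0 / fftSize                  // ≈ 409.1 Hz

// Same MIDI math as frequencyToNote: ~409 Hz rounds to MIDI 68, which is G#4.
let midi = Int((12 * log2(wrongHz / 440.0) + 69).rounded())  // 68 → G#4
```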

ContentView.swift

The complete UI: note display, spectrum bar chart, VU meter bar, and start/stop button. The initial placeholder was built in Section 1, wired to audio in Section 3, and extended with the spectrum view in Section 7.

import SwiftUI

struct ContentView: View {
    @State private var audio = AudioEngine()

    var body: some View {
        ZStack {
            Color.black.ignoresSafeArea()

            VStack(spacing: 24) {
                // Peak note display
                VStack(spacing: 4) {
                    Text(audio.peakNote)
                        .font(.system(size: 48, weight: .thin, design: .monospaced))
                        .foregroundStyle(.white)
                    Text(audio.peakHz > 30
                         ? String(format: "%.1f Hz", audio.peakHz)
                         : "—")
                        .font(.system(.caption, design: .monospaced))
                        .foregroundStyle(.gray)
                }
                .frame(height: 80)

                // Spectrum bars
                SpectrumView(bars: audio.spectrumBars)
                    .frame(height: 200)
                    .padding(.horizontal, 16)

                // VU meter bar
                GeometryReader { geo in
                    ZStack(alignment: .leading) {
                        RoundedRectangle(cornerRadius: 4)
                            .fill(Color.white.opacity(0.1))
                        RoundedRectangle(cornerRadius: 4)
                            .fill(vuColor(level: audio.level))
                            .frame(width: geo.size.width * CGFloat(audio.level))
                            .animation(.easeOut(duration: 0.05), value: audio.level)
                    }
                }
                .frame(height: 12)
                .padding(.horizontal, 16)

                // Start/stop button
                Button(audio.isRunning ? "Stop" : "Start") {
                    audio.isRunning ? audio.stop() : audio.start()
                }
                .font(.system(size: 16, weight: .medium))
                .foregroundStyle(.black)
                .padding(.horizontal, 48)
                .padding(.vertical, 12)
                .background(audio.isRunning ? Color.red : Color.green)
                .clipShape(Capsule())
            }
            .padding(.vertical, 40)
        }
    }

    func vuColor(level: Float) -> Color {
        switch level {
        case 0..<0.6:    return .green
        case 0.6..<0.85: return .yellow
        default:         return .red
        }
    }
}

struct SpectrumView: View {
    let bars: [Float]

    var body: some View {
        GeometryReader { geo in
            let barWidth = geo.size.width / CGFloat(bars.count)
            let spacing: CGFloat = 1

            HStack(alignment: .bottom, spacing: spacing) {
                ForEach(bars.indices, id: \.self) { i in
                    RoundedRectangle(cornerRadius: 2)
                        .fill(barColor(index: i, count: bars.count))
                        .frame(
                            width: barWidth - spacing,
                            height: max(2, geo.size.height * CGFloat(bars[i]))
                        )
                        .animation(.easeOut(duration: 0.08), value: bars[i])
                }
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .bottom)
        }
    }

    func barColor(index: Int, count: Int) -> Color {
        let t = Double(index) / Double(count)
        return Color(hue: 0.35 - t * 0.35, saturation: 0.9, brightness: 0.9)
    }
}

UI structure at a glance

ContentView owns the AudioEngine via @State and reads five reactive properties from it: level, spectrumBars, peakHz, peakNote, and isRunning. SwiftUI tracks which views depend on which properties and only redraws what changes. The SpectrumView is a separate struct for clarity, but it could live inline. The hue-based coloring creates a green-to-red gradient across the 48 bars (see Section 7).
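To make the per-property tracking concrete, here is a hypothetical view (NoteBadge is not part of the project) that reads only peakNote. Because @Observable tracking is per property, SwiftUI re-evaluates this body only when peakNote changes, not on every level or spectrumBars update:

```swift
import SwiftUI

// Hypothetical extracted view -- reads a single property of AudioEngine.
// SwiftUI only re-renders it when `peakNote` changes, even though `level`
// and `spectrumBars` update on every audio buffer.
struct NoteBadge: View {
    let audio: AudioEngine

    var body: some View {
        Text(audio.peakNote)
            .font(.system(.title, design: .monospaced))
    }
}
```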

SpectrumAnalyzerTests.swift

Unit tests for the spectrum analyzer. These use synthetic sine waves — no microphone required. The sample rate mismatch tests are regression tests for the bug described in Section 8. Covered in Section 9.

import XCTest
@testable import VUMeter

final class SpectrumAnalyzerTests: XCTestCase {

    // MARK: - Helpers

    /// Generates a pure sine wave buffer — no hardware needed.
    private func sineWave(
        frequency: Float,
        sampleRate: Float,
        count: Int,
        amplitude: Float = 1.0
    ) -> [Float] {
        (0..<count).map { i in
            amplitude * sin(2 * .pi * frequency * Float(i) / sampleRate)
        }
    }

    // MARK: - Peak Frequency Detection

    func testA440At44100() {
        let analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 44100)
        let signal = sineWave(frequency: 440, sampleRate: 44100, count: 4096)
        let (_, peakHz, note) = analyzer.process(buffer: signal)

        XCTAssertEqual(peakHz, 440, accuracy: 12, "Peak should be near 440 Hz")
        XCTAssertEqual(note, "A4")
    }

    func testA440At48000() {
        let analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 48000)
        let signal = sineWave(frequency: 440, sampleRate: 48000, count: 4096)
        let (_, peakHz, note) = analyzer.process(buffer: signal)

        XCTAssertEqual(peakHz, 440, accuracy: 12, "Peak should be near 440 Hz")
        XCTAssertEqual(note, "A4")
    }

    func testMiddleC() {
        let analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 44100)
        let signal = sineWave(frequency: 261.63, sampleRate: 44100, count: 4096)
        let (_, peakHz, note) = analyzer.process(buffer: signal)

        XCTAssertEqual(peakHz, 261.63, accuracy: 12, "Peak should be near middle C")
        XCTAssertEqual(note, "C4")
    }

    // MARK: - Sample Rate Mismatch (Regression Test)

    func testSampleRateMismatchProducesWrongNote() {
        // THE BUG: analyzer created with 44100 but device runs at 48000.
        // 440 Hz signal is misinterpreted as ~409 Hz → G#4
        let wrongAnalyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 44100)
        let signal = sineWave(frequency: 440, sampleRate: 48000, count: 4096)
        let (_, _, note) = wrongAnalyzer.process(buffer: signal)

        XCTAssertNotEqual(note, "A4", "Mismatched sample rate should report wrong note")
    }

    func testCorrectSampleRateFixesNote() {
        let correctAnalyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 48000)
        let signal = sineWave(frequency: 440, sampleRate: 48000, count: 4096)
        let (_, _, note) = correctAnalyzer.process(buffer: signal)

        XCTAssertEqual(note, "A4", "Correct sample rate should identify A4")
    }

    // MARK: - Silence and Edge Cases

    func testSilenceReturnsNoNote() {
        let analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 44100)
        let silence = [Float](repeating: 0, count: 4096)
        let (bars, peakHz, note) = analyzer.process(buffer: silence)

        XCTAssertEqual(note, "—", "Silence should show no note")
        XCTAssertEqual(peakHz, 0, accuracy: 1)
        XCTAssertTrue(bars.allSatisfy { $0 >= 0 && $0 <= 1 })
    }

    func testShortBufferPadsToFFTSize() {
        let analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 44100)
        let shortSignal = sineWave(frequency: 440, sampleRate: 44100, count: 1024)
        let (bars, peakHz, note) = analyzer.process(buffer: shortSignal)

        XCTAssertEqual(peakHz, 440, accuracy: 20)
        XCTAssertEqual(note, "A4")
        XCTAssertEqual(bars.count, 48)
    }

    // MARK: - Note Mapping: Chromatic Scale

    func testChromaticScale() {
        let expectedNotes: [(Float, String)] = [
            (261.63, "C4"),  (277.18, "C#4"), (293.66, "D4"),
            (311.13, "D#4"), (329.63, "E4"),  (349.23, "F4"),
            (369.99, "F#4"), (392.00, "G4"),  (415.30, "G#4"),
            (440.00, "A4"),  (466.16, "A#4"), (493.88, "B4"),
        ]

        let analyzer = SpectrumAnalyzer(binCount: 48, sampleRate: 44100)

        for (freq, expectedNote) in expectedNotes {
            let signal = sineWave(frequency: freq, sampleRate: 44100, count: 4096)
            let (_, _, note) = analyzer.process(buffer: signal)
            XCTAssertEqual(note, expectedNote, "\(freq) Hz should map to \(expectedNote)")
        }
    }
}

Running the tests

Press ⌘U in Xcode to run the full test suite. All tests use synthetic sine waves and run in the simulator; no device or microphone is needed. The sample rate mismatch tests are the most important: they guard against the bug from Section 8 silently reappearing.
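If you prefer the command line (for CI, say), the same suite can be run with xcodebuild. The scheme and destination names here are assumptions based on the project layout above; adjust them to match your setup:

```shell
# Run the VUMeter test suite headlessly against a simulator.
xcodebuild test \
  -project VUMeter.xcodeproj \
  -scheme VUMeter \
  -destination 'platform=iOS Simulator,name=iPhone 15'
```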