How to Build Audio Recording in SwiftUI

iOS 17+ · Xcode 16+ · Intermediate · APIs: AVAudioRecorder · Updated: May 11, 2026
TL;DR

Use AVAudioRecorder inside an @Observable class to capture mic input to an .m4a file. On iOS 17+, request permission with the new AVAudioApplication.requestRecordPermission() async API, configure the session category, and call record() / stop().

import AVFoundation

// 1. Permission (iOS 17+)
let granted = await AVAudioApplication.requestRecordPermission()

// 2. Session
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)

// 3. Record
let url = FileManager.default.temporaryDirectory
    .appendingPathComponent(UUID().uuidString + ".m4a")
let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
let recorder = try AVAudioRecorder(url: url, settings: settings)
recorder.record()   // start
// …later…
recorder.stop()     // finish

Full implementation

The implementation wraps AVAudioRecorder inside a Swift @Observable class so SwiftUI's observation system automatically re-renders when recording state changes. A .task modifier on the view handles the async permission request at first appearance, and a pulsing SF Symbol button doubles as both start and stop control. The recording is saved to the system temporary directory as AAC-encoded .m4a, ready to play back or upload.

import SwiftUI
import AVFoundation

// MARK: - Model

@Observable
final class AudioRecorderModel {
    var isRecording    = false
    var recordingURL: URL?
    var hasPermission  = false
    var errorMessage: String?

    private var recorder: AVAudioRecorder?

    // iOS 17+ async permission API
    func requestPermission() async {
        let granted = await AVAudioApplication.requestRecordPermission()
        hasPermission = granted
        if !granted { errorMessage = "Microphone access denied." }
    }

    func startRecording() {
        guard hasPermission else { errorMessage = "No microphone permission."; return }

        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playAndRecord, mode: .default,
                                    options: .defaultToSpeaker)
            try session.setActive(true)
        } catch {
            errorMessage = "Session error: \(error.localizedDescription)"
            return
        }

        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("m4a")

        let settings: [String: Any] = [
            AVFormatIDKey:              Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey:            44_100,
            AVNumberOfChannelsKey:      1,
            AVEncoderAudioQualityKey:   AVAudioQuality.high.rawValue
        ]

        do {
            recorder = try AVAudioRecorder(url: url, settings: settings)
            recorder?.isMeteringEnabled = true
            recorder?.record()
            recordingURL = url
            isRecording  = true
            errorMessage = nil
        } catch {
            errorMessage = "Recorder error: \(error.localizedDescription)"
        }
    }

    func stopRecording() {
        recorder?.stop()
        isRecording = false
        try? AVAudioSession.sharedInstance().setActive(false,
              options: .notifyOthersOnDeactivation)
    }
}

// MARK: - View

struct AudioRecorderView: View {
    @State private var model = AudioRecorderModel()

    var body: some View {
        VStack(spacing: 32) {
            Spacer()

            Text(model.isRecording ? "Recording…" : "Tap to Record")
                .font(.title2.bold())
                .contentTransition(.numericText())

            Button {
                if model.isRecording {
                    model.stopRecording()
                } else {
                    model.startRecording()
                }
            } label: {
                Image(systemName: model.isRecording
                      ? "stop.circle.fill"
                      : "mic.circle.fill")
                    .font(.system(size: 80))
                    .foregroundStyle(model.isRecording ? .red : .accentColor)
                    .symbolEffect(.pulse, isActive: model.isRecording)
            }
            .accessibilityLabel(model.isRecording ? "Stop recording" : "Start recording")
            .disabled(!model.hasPermission)

            if let url = model.recordingURL, !model.isRecording {
                VStack(spacing: 6) {
                    Text("Saved:")
                        .font(.caption)
                        .foregroundStyle(.secondary)
                    Text(url.lastPathComponent)
                        .font(.caption.monospaced())
                        .foregroundStyle(.secondary)
                        .multilineTextAlignment(.center)
                }
                .padding(.horizontal)
            }

            if let error = model.errorMessage {
                Text(error)
                    .font(.caption)
                    .foregroundStyle(.red)
            }

            Spacer()
        }
        .padding()
        .task {
            await model.requestPermission()
        }
    }
}

// MARK: - Preview

#Preview {
    AudioRecorderView()
}

How it works

  1. AVAudioApplication.requestRecordPermission() (iOS 17+) — Replaces the callback-based AVAudioSession.requestRecordPermission from earlier OS versions. It returns a Bool asynchronously and is called inside .task { }, so the request runs once when the view first appears and the await suspends without blocking the UI.
  2. Session category .playAndRecord — Activating the session with this category enables the microphone while still allowing concurrent audio output. The .defaultToSpeaker option routes playback through the main speaker instead of the earpiece, which matters if you add inline playback later.
  3. AAC settings dictionary — kAudioFormatMPEG4AAC produces a small, high-quality .m4a at 44.1 kHz mono. Swap AVNumberOfChannelsKey to 2 for stereo, or lower the sample rate to 22_050 for voice-only recordings that need to be even smaller (see the sketch after this list).
  4. isMeteringEnabled = true — Enables the internal level meter on the recorder so you can call recorder.updateMeters() on a timer and read averagePower(forChannel:) to drive a live waveform without starting a second audio tap.
  5. symbolEffect(.pulse, isActive:) — An iOS 17 SF Symbols animation that breathes the mic icon while recording is active, giving instant visual feedback without any hand-rolled animation state.
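
For voice-only use cases, the same recording flow works with a lighter settings dictionary. A minimal sketch, assuming the flow above; the name voiceSettings is illustrative:

import AVFoundation

// Voice-optimized variant: mono AAC at 22.05 kHz, medium quality.
// Roughly halves file size compared with the 44.1 kHz dictionary above.
let voiceSettings: [String: Any] = [
    AVFormatIDKey:              Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey:            22_050,
    AVNumberOfChannelsKey:      1,
    AVEncoderAudioQualityKey:   AVAudioQuality.medium.rawValue
]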

Variants

Live audio level meter

Poll the recorder's metering API on a repeating timer and map the decibel value to a ProgressView to show a real-time level bar while recording.

// Inside AudioRecorderModel
var audioLevel: Float = 0  // 0.0 – 1.0

private var levelTimer: Timer?

func startRecording() {
    // … existing setup …
    recorder?.isMeteringEnabled = true
    recorder?.record()
    isRecording = true

    levelTimer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
        guard let self else { return }
        recorder?.updateMeters()
        // averagePower returns dB in range -160…0
        let db = recorder?.averagePower(forChannel: 0) ?? -160
        let normalized = max(0, (db + 60) / 60)   // map -60…0 dB → 0…1
        Task { @MainActor in self.audioLevel = normalized }
    }
}

func stopRecording() {
    levelTimer?.invalidate(); levelTimer = nil
    recorder?.stop()
    isRecording = false; audioLevel = 0
}

// In the View body:
ProgressView(value: model.audioLevel)
    .tint(.red)
    .animation(.linear(duration: 0.05), value: model.audioLevel)
    .accessibilityLabel("Recording level: \(Int(model.audioLevel * 100))%")

Time-limited recording

Pass a duration to AVAudioRecorder.record(forDuration:) instead of calling the bare record() method. The recorder stops automatically and calls your AVAudioRecorderDelegate.audioRecorderDidFinishRecording(_:successfully:) method. This is ideal for voice memos with a configurable maximum length — make the model inherit from NSObject and conform to AVAudioRecorderDelegate, set recorder.delegate = self, and update isRecording = false inside the delegate callback.
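
A minimal sketch of that delegate wiring, assuming a hypothetical TimedRecorderModel and reusing the settings dictionary from the full implementation (session setup omitted for brevity):

import SwiftUI
import AVFoundation

// NSObject inheritance is required for AVAudioRecorderDelegate conformance.
@Observable
final class TimedRecorderModel: NSObject, AVAudioRecorderDelegate {
    var isRecording = false
    private var recorder: AVAudioRecorder?

    func startRecording(url: URL, settings: [String: Any],
                        maxDuration: TimeInterval) throws {
        recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder?.delegate = self
        recorder?.record(forDuration: maxDuration)  // stops itself at maxDuration
        isRecording = true
    }

    // Fires when the duration limit is reached or stop() is called manually.
    func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder,
                                         successfully flag: Bool) {
        isRecording = false
    }
}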

Common pitfalls

  1. Missing NSMicrophoneUsageDescription — the system terminates the app the first time the microphone is accessed if this Info.plist key (with a user-facing reason string) is absent. Add it before running the code above.
  2. Recordings left in the temporary directory — FileManager.default.temporaryDirectory can be purged by the system, so move any file you want to keep into the Documents directory after recording finishes.
  3. Skipping session configuration — if the session isn't activated with a category that permits input (.record or .playAndRecord), recording can fail or capture silence. Always call setCategory and setActive(true) before record().

Prompt this with Claude Code

When using Soarias or Claude Code directly to implement this:

Implement audio recording in SwiftUI for iOS 17+.
Use AVAudioRecorder with kAudioFormatMPEG4AAC settings.
Request permission via AVAudioApplication.requestRecordPermission().
Add a live level meter using isMeteringEnabled + updateMeters().
Make it accessible (VoiceOver labels for start/stop button).
Deactivate AVAudioSession properly when recording stops.
Add a #Preview with a realistic demo state (isRecording = true).

In Soarias's Build phase, paste this prompt directly into the Claude Code panel after scaffolding your screen — it generates the model, view, and preview in a single pass so you can move straight to wiring the recorded URL into your data layer.

FAQ

Does this work on iOS 16?

Partially. AVAudioRecorder itself has been available since iOS 3, but the @Observable macro and the AVAudioApplication permission API require iOS 17+. If you need iOS 16 support, replace @Observable with ObservableObject + @Published, swap AVAudioApplication.requestRecordPermission() for the callback-based AVAudioSession.requestRecordPermission(_:), and replace .symbolEffect(.pulse) with a manual withAnimation loop.
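
A minimal sketch of the legacy permission call, assuming the rest of the model is ported to ObservableObject; the function name is illustrative:

import AVFoundation

// iOS 16 fallback: callback-based permission request (deprecated in iOS 17).
func requestPermissionLegacy(completion: @escaping (Bool) -> Void) {
    AVAudioSession.sharedInstance().requestRecordPermission { granted in
        DispatchQueue.main.async { completion(granted) }
    }
}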

How do I play back the recorded file?

Pass the saved URL to AVAudioPlayer(contentsOf:) inside a similar @Observable model, activate the session with the .playback category, and call player.play(). If you want inline waveform scrubbing in SwiftUI, consider wrapping AVPlayer in a UIViewControllerRepresentable, or use third-party packages such as AudioKit or DSWaveformImage for rendered waveform thumbnails.
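
A minimal playback sketch, assuming a hypothetical AudioPlayerModel and the URL produced by the recorder above:

import AVFoundation

@Observable
final class AudioPlayerModel {
    var isPlaying = false
    // Hold the player as a property so it isn't deallocated mid-playback.
    private var player: AVAudioPlayer?

    func play(url: URL) {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback)
            try session.setActive(true)
            player = try AVAudioPlayer(contentsOf: url)
            player?.play()
            isPlaying = true
        } catch {
            isPlaying = false
        }
    }

    func stop() {
        player?.stop()
        isPlaying = false
    }
}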

What's the UIKit equivalent?

The underlying AVAudioRecorder API is identical — there's no SwiftUI-specific wrapper. In UIKit you'd typically hold the recorder on a UIViewController, implement AVAudioRecorderDelegate directly on it, and update UI in the delegate callbacks. The SwiftUI approach above is arguably cleaner: the @Observable model acts as the delegate proxy and publishes state changes that the view layer observes automatically.
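
For comparison, a bare-bones UIKit sketch of that pattern; the class and property names are illustrative:

import UIKit
import AVFoundation

final class RecorderViewController: UIViewController, AVAudioRecorderDelegate {
    private var recorder: AVAudioRecorder?
    private let recordButton = UIButton(type: .system)

    // Delegate callback lands here; update UI directly.
    func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder,
                                         successfully flag: Bool) {
        recordButton.isSelected = false
    }
}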

Last reviewed: 2026-05-11 by the Soarias team.
