How to Build an Ear Training App in SwiftUI
An ear training app teaches musicians to identify musical intervals by ear through short audio quizzes backed by real instrument samples. It's aimed at music students, grade-exam candidates, and anyone building relative pitch on the go.
Prerequisites
- Mac with Xcode 16+
- Apple Developer Program ($99/year) — required for TestFlight and App Store
- Basic Swift/SwiftUI knowledge
- A real iOS device is strongly recommended — AVFoundation audio latency on the simulator is 3–5× higher than on hardware
- A permissively licensed .sf2 soundfont (e.g. GeneralUser GS, free for distribution) bundled in your Xcode project for piano samples
Architecture overview
SwiftData persists each ExerciseSession — interval trained, questions answered, and accuracy — so users can track progress over time. AVAudioEngine with an AVAudioUnitSampler drives melodic playback: two MIDI notes fired sequentially at a configurable delay. QuizView owns all quiz state via @State, and an @Observable IntervalPlayer class is instantiated with @State and passed down. StoreKit 2 gates the full twelve-interval set and the progress dashboard behind a monthly or annual subscription.
EarTraining/
├── Models/
│   ├── Interval.swift            # IntervalName enum + semitone map
│   └── ExerciseSession.swift     # @Model — persisted session stats
├── Audio/
│   └── IntervalPlayer.swift      # AVAudioEngine + AVAudioUnitSampler
├── Views/
│   ├── QuizView.swift            # main quiz loop
│   └── ProgressView.swift        # session history + accuracy chart
└── EarTrainingApp.swift          # modelContainer + StoreKit setup
Step-by-step
1. Data model
Define the interval vocabulary as a CaseIterable enum so SwiftUI can drive ForEach directly, then create a SwiftData model to persist each quiz session's results.
import Foundation
import SwiftData

enum IntervalName: String, Codable, CaseIterable {
    case minorSecond  = "Minor 2nd"
    case majorSecond  = "Major 2nd"
    case minorThird   = "Minor 3rd"
    case majorThird   = "Major 3rd"
    case perfectFourth = "Perfect 4th"
    case perfectFifth  = "Perfect 5th"
    case majorSixth   = "Major 6th"
    case octave       = "Octave"

    var semitones: Int {
        switch self {
        case .minorSecond:   return 1
        case .majorSecond:   return 2
        case .minorThird:    return 3
        case .majorThird:    return 4
        case .perfectFourth: return 5
        case .perfectFifth:  return 7
        case .majorSixth:    return 9
        case .octave:        return 12
        }
    }
}

@Model
final class ExerciseSession {
    var id: UUID = UUID()
    var date: Date = Date.now
    var interval: String = ""
    var total: Int = 0
    var correct: Int = 0
    var accuracy: Double { total > 0 ? Double(correct) / Double(total) : 0 }
}
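The architecture overview says SwiftData persists each session, but the tutorial never shows a session being written. A minimal sketch of how QuizView could do it when a round ends — the `recordSession` helper, its parameters, and the one-row-per-round granularity are my assumptions, not part of the original code:

```swift
import SwiftData

// Sketch: persist one finished round as an ExerciseSession row.
// `ctx` is the @Environment(\.modelContext) value already held by QuizView;
// IntervalName and ExerciseSession are the types defined above.
func recordSession(interval: IntervalName, total: Int, correct: Int,
                   in ctx: ModelContext) {
    let session = ExerciseSession()
    session.interval = interval.rawValue
    session.total = total
    session.correct = correct
    ctx.insert(session)   // app-level ModelContexts autosave by default
}
```

ProgressView can then fetch these rows with a `@Query` sorted by `date` to drive the accuracy chart.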
2. Core UI — QuizView
Render the play button, a 2-column answer grid, and instant colour feedback; regenerate the question after the user taps Next.
import SwiftUI

struct QuizView: View {
    @Environment(\.modelContext) private var ctx
    @State private var player = IntervalPlayer()
    @State private var current: IntervalName = .majorThird
    @State private var choices: [IntervalName] = []
    @State private var selected: IntervalName?
    @State private var revealed = false

    var body: some View {
        VStack(spacing: 24) {
            Text("What interval is this?")
                .font(.title2.weight(.semibold))
            Button("▶ Play") { player.play(interval: current) }
                .buttonStyle(.borderedProminent)
                .controlSize(.large)
            LazyVGrid(columns: [.init(.flexible()), .init(.flexible())], spacing: 12) {
                ForEach(choices, id: \.self) { c in
                    Button(c.rawValue) {
                        guard !revealed else { return }  // lock in the first answer
                        selected = c
                        revealed = true
                    }
                    .buttonStyle(.bordered)
                    .tint(revealed ? (c == current ? .green : c == selected ? .red : nil) : nil)
                }
            }
            if revealed {
                Button("Next") { newQuestion() }
                    .buttonStyle(.borderedProminent)
            }
        }
        .padding()
        .onAppear { newQuestion() }
    }

    private func newQuestion() {
        current = IntervalName.allCases.randomElement()!
        var pool = Array(IntervalName.allCases.shuffled().prefix(4))
        if !pool.contains(current) {   // guarantee the right answer is on screen
            pool[0] = current
            pool.shuffle()
        }
        choices = pool
        selected = nil
        revealed = false
    }
}
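The pool-building logic in newQuestion() is easy to get subtly wrong, so it is worth factoring into a pure function you can verify in isolation. This is a refactoring sketch, not code from the original — the `makeChoices` name is mine, and it is generic so it needs no app types:

```swift
// Pure version of the answer-pool logic: pick `count` random options and
// guarantee the correct answer is among them, in a random position.
func makeChoices<T: Hashable>(answer: T, from all: [T], count: Int = 4) -> [T] {
    var pool = Array(all.shuffled().prefix(count))
    if !pool.contains(answer) {   // swap the answer in if the sample missed it
        pool[0] = answer
        pool.shuffle()
    }
    return pool
}

let options = ["m2", "M2", "m3", "M3", "P4", "P5", "M6", "P8"]
let choices = makeChoices(answer: "P5", from: options)
```

newQuestion() would then reduce to `choices = makeChoices(answer: current, from: IntervalName.allCases)`, and the invariant "the correct answer always appears" becomes trivially testable.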
3. Interval recognition — AVFoundation playback
Use AVAudioEngine with AVAudioUnitSampler to fire two MIDI notes sequentially — the correct approach for melodic interval ear training that avoids file-write overhead.
import AVFoundation

@Observable final class IntervalPlayer {
    private let engine = AVAudioEngine()
    private let sampler = AVAudioUnitSampler()

    init() {
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try? AVAudioSession.sharedInstance().setActive(true)
        engine.attach(sampler)
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
        try? engine.start()
        if let url = Bundle.main.url(forResource: "GeneralUser", withExtension: "sf2") {
            try? sampler.loadSoundBankInstrument(
                at: url, program: 0, bankMSB: 0x79, bankLSB: 0)
        }
    }

    func play(interval: IntervalName, root: UInt8 = 60) {
        let top = root + UInt8(clamping: interval.semitones)
        sampler.startNote(root, withVelocity: 90, onChannel: 0)
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.65) {
            self.sampler.startNote(top, withVelocity: 90, onChannel: 0)
        }
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.8) {
            self.sampler.stopNote(root, onChannel: 0)
            self.sampler.stopNote(top, onChannel: 0)
        }
    }
}
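The monetization section mentions a harmonic (simultaneous) mode as a premium feature. One way to sketch it is an extension on IntervalPlayer, placed in the same file so it can reach the private sampler — the `playHarmonic` name and timings are my assumptions:

```swift
// Sketch: harmonic variant — both notes start together and stop together.
// Must live in IntervalPlayer.swift to access the private `sampler`.
extension IntervalPlayer {
    func playHarmonic(interval: IntervalName, root: UInt8 = 60) {
        let top = root + UInt8(clamping: interval.semitones)
        sampler.startNote(root, withVelocity: 90, onChannel: 0)
        sampler.startNote(top, withVelocity: 90, onChannel: 0)
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.8) {
            self.sampler.stopNote(root, onChannel: 0)
            self.sampler.stopNote(top, onChannel: 0)
        }
    }
}
```

Harmonic intervals are noticeably harder to identify than melodic ones, which is part of what justifies gating this mode behind the subscription.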
4. Privacy Manifest
Apple's automated upload checker rejects any new app missing a PrivacyInfo.xcprivacy file — add it to your app target's Copy Bundle Resources phase.
<?xml version="1.0" encoding="UTF-8"?>
<!-- PrivacyInfo.xcprivacy — add to the app target, not an extension -->
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
    "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSPrivacyTracking</key>
    <false/>
    <key>NSPrivacyTrackingDomains</key>
    <array/>
    <key>NSPrivacyCollectedDataTypes</key>
    <array/>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>CA92.1</string>
            </array>
        </dict>
    </array>
</dict>
</plist>
Common pitfalls
- Wrong AVAudioSession category: Skipping setCategory(.playback) means audio stops when the device is silenced or the screen locks mid-quiz — users will think the app is broken.
- Simulator audio latency: AVAudioUnitSampler timing on the simulator can be 3–5× slower than on a real device. Always profile on hardware before tuning your note-delay values.
- Soundfont not in Copy Bundle Resources: The .sf2 file must be explicitly added to the target's Copy Bundle Resources build phase. Xcode won't warn you if it's missing; loadSoundBankInstrument silently fails at runtime.
- App Store — copyrighted soundfonts: Distributing samples from a commercial soundfont without an explicit distribution licence triggers rejection under guideline 5.2. Use only public-domain or permissively licensed .sf2 files.
- Missing NSMicrophoneUsageDescription: If you add pitch-detection (microphone input) in a later update without this Info.plist key, the app crashes on first microphone access on iOS 17+.
Adding monetization: Subscription
Configure a monthly and annual subscription in App Store Connect, then load them with StoreKit 2's Product.products(for:) at app launch. Gate premium content — all twelve interval types, harmonic (simultaneous) mode, and the accuracy-over-time chart — behind a Transaction.currentEntitlements check. Register a Transaction.updates listener as a long-lived Swift concurrency task so renewals and refunds are handled in real time without polling. On iOS 17+ you can drop in SubscriptionStoreView(groupID:) for a fully compliant, locale-aware paywall sheet — the fastest path to a correctly formatted subscription offer screen that will pass App Store review.
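A sketch of the entitlement check and updates listener described above — the product identifiers are placeholders for whatever you configure in App Store Connect, and error handling is kept minimal:

```swift
import StoreKit
import Observation

@Observable final class SubscriptionModel {
    var isSubscribed = false
    private var updatesTask: Task<Void, Never>?

    // Placeholder IDs — replace with your App Store Connect product IDs.
    static let productIDs = ["eartraining.monthly", "eartraining.annual"]

    func start() {
        // Long-lived listener: renewals and refunds arrive here without polling.
        updatesTask = Task {
            for await update in Transaction.updates {
                if case .verified(let tx) = update {
                    await tx.finish()
                    await refresh()
                }
            }
        }
        Task { await refresh() }
    }

    func refresh() async {
        // currentEntitlements yields only active, unrevoked transactions.
        for await entitlement in Transaction.currentEntitlements {
            if case .verified(let tx) = entitlement,
               Self.productIDs.contains(tx.productID) {
                isSubscribed = true
                return
            }
        }
        isSubscribed = false
    }
}
```

Views can then gate premium content with a simple `if subscriptionModel.isSubscribed` check, falling back to the free eight-interval set otherwise.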
Shipping this faster with Soarias
Soarias scaffolds the full project in one prompt: SwiftData models, the IntervalPlayer class with the correct AVAudioSession category, and QuizView wired together with a working modelContainer. It generates PrivacyInfo.xcprivacy with the correct API-reason codes, sets up fastlane lanes for TestFlight distribution and App Store Connect submission, and automates screenshots for all required device sizes — no manual Simulator gymnastics.
For an intermediate project like this, setup boilerplate, fastlane configuration, and Privacy Manifest debugging typically eat two to three days. Soarias collapses that to under 30 minutes, leaving your week for the audio logic, subscription UX, and interval-set design that actually differentiate the app in the App Store.
FAQ
Do I need a paid Apple Developer account?
Yes. You can build and run the app on a personal device for free with a free Apple ID, but distributing via TestFlight or submitting to the App Store requires an active Apple Developer Program membership ($99/year).
How do I submit this to the App Store?
Archive the build in Xcode (Product → Archive), upload it through Organizer or xcrun altool, then complete the App Store Connect listing — screenshots, age rating, Privacy Manifest, and subscription product setup — before clicking Submit for Review. TestFlight first is strongly recommended.
Can I use MIDI files instead of synthesising notes directly?
AVMIDIPlayer can play a .mid file routed through AVAudioUnitSampler — useful for chord voicings or melodic phrases. For simple two-note intervals, firing startNote(_:withVelocity:onChannel:) directly on the sampler is simpler and gives tighter timing control with no file I/O overhead.
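A minimal sketch of that route, assuming the same bundled soundfont as before — the "Phrase" resource name is a placeholder, and AVMIDIPlayer renders through its own internal sampler given the bank URL rather than through your AVAudioEngine graph:

```swift
import AVFoundation

// Sketch: play a bundled .mid file through a bundled .sf2 soundfont.
// "Phrase" is a placeholder resource name.
func playPhrase() throws -> AVMIDIPlayer? {
    guard let midi = Bundle.main.url(forResource: "Phrase", withExtension: "mid")
    else { return nil }
    let bank = Bundle.main.url(forResource: "GeneralUser", withExtension: "sf2")
    let player = try AVMIDIPlayer(contentsOf: midi, soundBankURL: bank)
    player.prepareToPlay()
    player.play(nil)     // optional completion handler
    return player        // caller must keep a strong reference while playing
}
```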
Last reviewed: 2026-05-12 by the Soarias team.