How to Build a Guitar Tuner App in SwiftUI
A Guitar Tuner app listens through the iPhone microphone, detects the pitch of a plucked string in real time, and tells the player whether they're sharp, flat, or in tune. It's a perfect project for musicians who want a distraction-free, always-offline tool on their phone.
Prerequisites
- Xcode 16 or later installed on a Mac
- A physical iPhone or iPad (AVAudioEngine microphone input does not work on Simulator)
- Basic familiarity with SwiftUI and the Observation framework's @Observable macro
Steps
1. Define the Data Model
Represent each of the six standard guitar strings as a value type with a name and target frequency so the engine can compute cents-offset at runtime.
```swift
import Foundation

struct GuitarNote: Identifiable, Equatable {
    let id = UUID()
    let name: String       // e.g. "E2"
    let frequency: Double  // Hz
    let string: Int        // 1 (high E) – 6 (low E)

    static let standard: [GuitarNote] = [
        .init(name: "E4", frequency: 329.63, string: 1),
        .init(name: "B3", frequency: 246.94, string: 2),
        .init(name: "G3", frequency: 196.00, string: 3),
        .init(name: "D3", frequency: 146.83, string: 4),
        .init(name: "A2", frequency: 110.00, string: 5),
        .init(name: "E2", frequency: 82.41, string: 6),
    ]
}
```
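The cents deviation the engine publishes later follows directly from this model: tuning offsets are logarithmic in the frequency ratio, with 1200 cents per octave. A minimal sketch (the `cents(from:to:)` helper name is ours, not part of the tutorial code):

```swift
import Foundation

// Cents deviation between a detected frequency and a target note:
// 1200 cents per octave, logarithmic in the frequency ratio.
func cents(from detected: Double, to target: Double) -> Double {
    1200 * log2(detected / target)
}

let lowE = GuitarNote.standard.last!.frequency  // E2, 82.41 Hz
print(cents(from: 82.41, to: lowE))  // 0.0, perfectly in tune
print(cents(from: 84.0, to: lowE))   // ≈ +33, about a third of a semitone sharp
```

A positive result means the string is sharp, a negative one flat; the view colours the readout green inside ±5 cents.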
2. Build the Tuner UI
Compose the main screen from a large detected-note label, a colour-coded cents readout, and a row of string buttons that highlight the nearest match.
```swift
import SwiftUI

struct TunerView: View {
    // @State (not @StateObject), because TunerEngine uses the @Observable macro
    @State private var engine = TunerEngine()

    var needleColor: Color {
        abs(engine.centsOff) < 5 ? .green : .orange
    }

    var body: some View {
        VStack(spacing: 28) {
            Text(engine.detectedNote)
                .font(.system(size: 80, weight: .black, design: .rounded))

            Text(engine.centsOff >= 0
                 ? "+\(Int(engine.centsOff))¢"
                 : "\(Int(engine.centsOff))¢")
                .font(.title.monospacedDigit())
                .foregroundStyle(needleColor)

            HStack(spacing: 12) {
                ForEach(GuitarNote.standard) { note in
                    Text(note.name)
                        .padding(8)
                        .background(engine.closestNote == note.name
                                    ? Color.accentColor
                                    : Color.secondary.opacity(0.15))
                        .clipShape(RoundedRectangle(cornerRadius: 8))
                }
            }
        }
        .onAppear { engine.start() }
        .onDisappear { engine.stop() }
    }
}
```
3. Implement Pitch Detection
Tap the AVAudioEngine input node, run a real-time FFT via Accelerate, find the dominant frequency bin, and publish the nearest note plus cents deviation to the view.
```swift
import AVFoundation
import Accelerate
import Observation

@Observable final class TunerEngine {
    var detectedNote = "--"
    var centsOff: Double = 0
    var closestNote = ""

    private let audioEngine = AVAudioEngine()
    private let bufferSize: AVAudioFrameCount = 4096

    func start() {
        // Configure the audio session for microphone capture before starting.
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.record, mode: .measurement)
        try? session.setActive(true)

        let node = audioEngine.inputNode
        let format = node.inputFormat(forBus: 0)
        node.installTap(onBus: 0, bufferSize: bufferSize,
                        format: format) { [weak self] buf, _ in
            self?.analyze(buf, sampleRate: format.sampleRate)
        }
        try? audioEngine.start()
    }

    func stop() {
        audioEngine.inputNode.removeTap(onBus: 0)
        audioEngine.stop()
    }

    private func analyze(_ buf: AVAudioPCMBuffer, sampleRate: Double) {
        guard let data = buf.floatChannelData?[0] else { return }
        let n = Int(buf.frameLength)
        // The radix-2 FFT requires a power-of-two frame count;
        // the tap may deliver fewer frames than requested.
        guard n > 0, n & (n - 1) == 0 else { return }

        var real = [Float](UnsafeBufferPointer(start: data, count: n))
        var imag = [Float](repeating: 0, count: n)

        real.withUnsafeMutableBufferPointer { rPtr in
            imag.withUnsafeMutableBufferPointer { iPtr in
                var split = DSPSplitComplex(realp: rPtr.baseAddress!,
                                            imagp: iPtr.baseAddress!)
                let log2n = vDSP_Length(log2(Double(n)))
                // In production, create the FFT setup once and reuse it;
                // rebuilding it for every buffer is wasteful.
                let fft = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2))!
                vDSP_fft_zip(fft, &split, 1, log2n, FFTDirection(FFT_FORWARD))
                vDSP_destroy_fftsetup(fft)

                var mags = [Float](repeating: 0, count: n / 2)
                vDSP_zvmags(&split, 1, &mags, 1, vDSP_Length(n / 2))

                // Skip the DC bin, then take the strongest bin as the pitch.
                let peak = mags[1...].indices.max(by: { mags[$0] < mags[$1] }) ?? 1
                let freq = Double(peak) * sampleRate / Double(n)
                DispatchQueue.main.async { self.update(freq: freq) }
            }
        }
    }

    private func update(freq: Double) {
        guard freq > 60, freq < 1400 else { return }
        let nearest = GuitarNote.standard.min {
            abs($0.frequency - freq) < abs($1.frequency - freq)
        }!
        centsOff = 1200 * log2(freq / nearest.frequency)
        detectedNote = nearest.name
        closestNote = nearest.name
    }
}
```
Common Pitfalls
- Missing microphone usage description. Add `NSMicrophoneUsageDescription` to your `Info.plist`, or the system will silently deny access without a crash.
- Testing on Simulator. `AVAudioEngine` microphone capture is unavailable on the Simulator; always profile and test pitch detection on a real device.
- FFT bin resolution at low frequencies. A 4096-sample buffer at 44.1 kHz gives ~10 Hz per bin; the low-E string (82 Hz) only spans ~8 bins, so increase the buffer size to 8192 for tighter accuracy on bass strings.
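As a middle ground between larger buffers and latency, the peak bin can be refined to sub-bin precision with parabolic interpolation over its two neighbours, a common DSP trick. A sketch, assuming `mags` is the half-spectrum magnitude array from step 3 (`refinedPeak` is our name, not from the tutorial code):

```swift
import Foundation

// Fit a parabola through the peak bin and its two neighbours and
// return the fractional bin index of the parabola's vertex.
func refinedPeak(_ mags: [Float], peak: Int) -> Double {
    guard peak > 0, peak < mags.count - 1 else { return Double(peak) }
    let a = Double(mags[peak - 1])
    let b = Double(mags[peak])
    let c = Double(mags[peak + 1])
    let denom = a - 2 * b + c
    guard denom != 0 else { return Double(peak) }
    // Vertex offset relative to the peak bin, in [-0.5, 0.5].
    let delta = 0.5 * (a - c) / denom
    return Double(peak) + delta
}

// Inside analyze(), the frequency estimate then becomes:
// let freq = refinedPeak(mags, peak: peak) * sampleRate / Double(n)
```

On the low-E string this recovers a fraction of a bin of accuracy without doubling the buffer, at the cost of three extra array reads per analysis pass.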
Monetization with One-Time Purchase
A Guitar Tuner pairs well with a one-time unlock model: ship a fully functional free tier (standard tuning only) and gate premium tunings — Drop D, Open G, DADGAD — behind a StoreKit 2 Product.purchase() call. Because StoreKit 2 uses Swift concurrency natively, you can call try await product.purchase() directly inside a .task modifier, persist the transaction with Transaction.currentEntitlements, and never touch a server. One-time purchases avoid subscription fatigue and convert well for utility apps where users want to pay once and move on.
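A sketch of that unlock flow with StoreKit 2. The product identifier and class name are placeholders and must match your own App Store Connect configuration; treat this as an outline under those assumptions, not a drop-in implementation:

```swift
import StoreKit

@MainActor
@Observable final class StoreManager {
    // Placeholder ID; configure the real one in App Store Connect.
    static let premiumTuningsID = "com.example.tuner.premiumtunings"
    private(set) var isPremiumUnlocked = false

    // One-time purchase: buy, verify, unlock, finish.
    func purchasePremium() async throws {
        guard let product = try await Product.products(
            for: [Self.premiumTuningsID]).first else { return }
        let result = try await product.purchase()
        if case .success(let verification) = result,
           case .verified(let transaction) = verification {
            isPremiumUnlocked = true
            await transaction.finish()
        }
    }

    // Restore the entitlement on launch, with no server round-trip.
    func refreshEntitlements() async {
        for await entitlement in Transaction.currentEntitlements {
            if case .verified(let transaction) = entitlement,
               transaction.productID == Self.premiumTuningsID {
                isPremiumUnlocked = true
            }
        }
    }
}
```

Calling `refreshEntitlements()` from a `.task` modifier on the root view keeps purchases working across reinstalls and device changes without any backend.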
Ship Faster with Soarias
Soarias scaffolds your entire Guitar Tuner project — SwiftData models, AVAudioEngine boilerplate, StoreKit product configuration, and App Store metadata — from a single prompt, saving several hours of setup. The $79 one-time desktop app runs locally on your Mac alongside Xcode so your source code never leaves your machine, and it integrates directly with Claude Code to iterate on the FFT logic or UI polish through natural conversation.
FAQ
Do I need an Apple Developer account to build this?
You can build and run the app on your own iPhone for free using a personal team in Xcode. You only need a paid Apple Developer account ($99/year) when you're ready to distribute on TestFlight or submit to the App Store.
How do I submit the Guitar Tuner to the App Store?
Archive your app in Xcode (Product › Archive), upload the build via the Organizer or the Transporter app, then complete your App Store Connect listing (screenshots, description, privacy nutrition labels, and your StoreKit in-app purchase) before submitting for review. Soarias automates most of this flow.
Last reviewed: 2026-05-12 by the Soarias team.