How to Build a Drum Machine App in SwiftUI
A drum machine app gives musicians and producers a pocket-sized beat sequencer — tap pads to program rhythms across 8 tracks and 16 steps, then play them back in real time with crisp WAV samples. It's aimed at musicians, hobbyists, and anyone who wants a tactile, lo-fi beat-making tool on iOS.
Prerequisites
- Mac with Xcode 16+
- Apple Developer Program ($99/year) — required for TestFlight and App Store
- Basic Swift/SwiftUI knowledge
- A set of royalty-free drum sample WAV files (kick, snare, hi-hat, etc.) — AVFoundation requires real audio assets; the simulator is fine for UI but test audio on a physical device
- Familiarity with AVAudioEngine and AVAudioPlayerNode is helpful but not required
Architecture overview
The app separates concerns cleanly: SwiftData holds DrumPattern models (tracks, steps, BPM) that persist between sessions, while BeatSequencer — an @Observable class — owns the AVAudioEngine and drives playback via a repeating Timer. Views observe the sequencer's currentStep to animate the playhead. There are no network calls and no third-party dependencies, which keeps the Privacy Manifest simple.
DrumMachineApp/
├── Models/
│   └── DrumPattern.swift      # SwiftData @Model
├── Engine/
│   └── BeatSequencer.swift    # AVFoundation playback
├── Views/
│   ├── ContentView.swift      # Root + navigation
│   ├── DrumGridView.swift     # 8 × 16 step grid
│   └── TransportView.swift    # Play / stop / BPM
└── Resources/
    └── Samples/               # kick.wav, snare.wav …
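Before the step-by-step, it helps to see how these pieces meet at the root. A hypothetical entry point under the architecture above (the @main struct isn't shown in the article, and details like ContentView's sequencer parameter are our assumption):

```swift
import SwiftUI
import SwiftData

@main
struct DrumMachineApp: App {
    // One sequencer for the whole app; views receive it via init.
    @State private var sequencer = BeatSequencer()

    var body: some Scene {
        WindowGroup {
            ContentView(sequencer: sequencer)
        }
        // Persist DrumPattern models between launches.
        .modelContainer(for: DrumPattern.self)
    }
}
```

The `.modelContainer` modifier gives every child view a SwiftData model context, so patterns created in the UI are saved automatically.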
Step-by-step
1. Define the DrumPattern data model
A flat Bool array of 128 elements (8 tracks × 16 steps) keeps SwiftData serialization trivial and makes index math straightforward.
import SwiftData
import Foundation

@Model
final class DrumPattern {
    var name: String
    var bpm: Double
    // 8 tracks × 16 steps = 128 booleans
    var steps: [Bool]
    var createdAt: Date

    init(name: String = "New Pattern", bpm: Double = 120) {
        self.name = name
        self.bpm = bpm
        self.steps = Array(repeating: false, count: 128)
        self.createdAt = .now
    }

    func isActive(track: Int, step: Int) -> Bool {
        steps[track * 16 + step]
    }

    func toggle(track: Int, step: Int) {
        steps[track * 16 + step].toggle()
    }
}
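The index math is easy to sanity-check in isolation. This standalone sketch mirrors the model's flat layout with a plain struct (StepGrid is our name; the real model uses SwiftData, but the arithmetic is identical):

```swift
// Mirrors DrumPattern's flat layout: flat index = track * 16 + step.
struct StepGrid {
    private(set) var steps = Array(repeating: false, count: 128)

    func isActive(track: Int, step: Int) -> Bool {
        steps[track * 16 + step]
    }

    mutating func toggle(track: Int, step: Int) {
        steps[track * 16 + step].toggle()
    }
}

var grid = StepGrid()
grid.toggle(track: 2, step: 5)          // flips flat index 2 * 16 + 5 = 37
print(grid.isActive(track: 2, step: 5)) // true
print(grid.isActive(track: 0, step: 5)) // false: other tracks untouched
```

Because the struct is pure Swift, the same assertions drop straight into a unit-test target.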
2. Build the drum grid UI
Render the 8 × 16 grid with a playhead highlight driven by the sequencer's observable currentStep property.
import SwiftUI

struct DrumGridView: View {
    @Bindable var pattern: DrumPattern
    var sequencer: BeatSequencer  // passed in; @Observable tracking drives updates

    let tracks = ["Kick", "Snare", "Hi-Hat", "Open HH",
                  "Tom 1", "Tom 2", "Clap", "Cowbell"]

    var body: some View {
        ScrollView(.horizontal, showsIndicators: false) {
            VStack(alignment: .leading, spacing: 6) {
                ForEach(0..<8, id: \.self) { track in
                    HStack(spacing: 5) {
                        Text(tracks[track])
                            .font(.caption2).frame(width: 52, alignment: .trailing)
                        ForEach(0..<16, id: \.self) { step in
                            let active = pattern.isActive(track: track, step: step)
                            let playing = sequencer.currentStep == step && sequencer.isPlaying
                            RoundedRectangle(cornerRadius: 5)
                                .fill(active ? Color.orange : Color.secondary.opacity(0.18))
                                .frame(width: 34, height: 34)
                                .overlay(
                                    RoundedRectangle(cornerRadius: 5)
                                        .stroke(playing ? Color.white : .clear, lineWidth: 2)
                                )
                                .animation(.easeOut(duration: 0.06), value: playing)
                                .onTapGesture { pattern.toggle(track: track, step: step) }
                        }
                    }
                }
            }
            .padding()
        }
    }
}
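The project tree lists a TransportView.swift that the article never shows. Here is one plausible sketch under the same assumptions, driving the sequencer's togglePlay(pattern:) API and binding a slider to the pattern's BPM:

```swift
import SwiftUI

struct TransportView: View {
    @Bindable var pattern: DrumPattern
    var sequencer: BeatSequencer

    var body: some View {
        HStack(spacing: 16) {
            Button {
                sequencer.togglePlay(pattern: pattern)
            } label: {
                Image(systemName: sequencer.isPlaying ? "stop.fill" : "play.fill")
                    .font(.title2)
            }
            // Note: the Timer interval is captured in start(pattern:),
            // so BPM changes take effect the next time playback starts.
            Slider(value: $pattern.bpm, in: 60...200, step: 1)
            Text("\(Int(pattern.bpm)) BPM")
                .font(.caption.monospacedDigit())
                .frame(width: 64, alignment: .trailing)
        }
        .padding(.horizontal)
    }
}
```

Restarting playback on BPM change (or recreating the timer inside a didSet) would make the slider feel live; that's left as a design choice.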
3. Implement the beat sequencer engine
Drive AVAudioPlayerNode instances in lock-step with a 16th-note timer, firing WAV sample buffers for every active step.
import AVFoundation
import Observation

@Observable
final class BeatSequencer {
    var currentStep = 0
    var isPlaying = false

    private var engine = AVAudioEngine()
    private var players = [String: AVAudioPlayerNode]()
    private var buffers = [String: AVAudioPCMBuffer]()
    private var timer: Timer?

    let trackNames = ["Kick", "Snare", "Hi-Hat", "Open HH",
                      "Tom 1", "Tom 2", "Clap", "Cowbell"]

    init() { setupEngine() }

    private func setupEngine() {
        for name in trackNames {
            let node = AVAudioPlayerNode()
            engine.attach(node)
            engine.connect(node, to: engine.mainMixerNode, format: nil)
            players[name] = node
            // Sample files are named after the track, lowercased and
            // hyphenated: "Open HH" → open-hh.wav
            if let url = Bundle.main.url(forResource: name.lowercased()
                    .replacingOccurrences(of: " ", with: "-"), withExtension: "wav"),
               let buf = loadBuffer(url) { buffers[name] = buf }
        }
        try? engine.start()
    }

    func togglePlay(pattern: DrumPattern) {
        isPlaying ? stop() : start(pattern: pattern)
    }

    private func start(pattern: DrumPattern) {
        isPlaying = true
        let interval = 60.0 / pattern.bpm / 4.0 // 16th notes
        timer = .scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.tick(pattern: pattern)
        }
    }

    private func stop() {
        isPlaying = false
        timer?.invalidate()
        timer = nil
        currentStep = 0
    }

    private func tick(pattern: DrumPattern) {
        for (i, name) in trackNames.enumerated() {
            guard pattern.isActive(track: i, step: currentStep),
                  let p = players[name], let b = buffers[name] else { continue }
            p.scheduleBuffer(b, at: nil, options: .interrupts)
            if !p.isPlaying { p.play() }
        }
        currentStep = (currentStep + 1) % 16
    }

    private func loadBuffer(_ url: URL) -> AVAudioPCMBuffer? {
        guard let f = try? AVAudioFile(forReading: url),
              let buf = AVAudioPCMBuffer(pcmFormat: f.processingFormat,
                                         frameCapacity: AVAudioFrameCount(f.length))
        else { return nil }
        try? f.read(into: buf)
        return buf
    }
}
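The 16th-note interval math in start(pattern:) is worth a quick sanity check: 60 / BPM gives the quarter-note duration in seconds, and dividing by 4 subdivides it into 16th notes. A pure-Swift sketch (the helper names are ours, not from the code above):

```swift
import Foundation

// Seconds between 16th-note steps, matching start(pattern:)'s formula.
func stepInterval(bpm: Double) -> TimeInterval {
    60.0 / bpm / 4.0
}

// One full 16-step bar is four quarter notes (one 4/4 measure).
func barDuration(bpm: Double) -> TimeInterval {
    stepInterval(bpm: bpm) * 16
}

print(stepInterval(bpm: 120)) // 0.125
print(barDuration(bpm: 120))  // 2.0
```

At 160 BPM the step interval shrinks to about 94 ms, which is why Timer jitter (typically tens of milliseconds) starts to be audible at higher tempos.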
4. Add the Privacy Manifest
App Store Connect rejects uploads without PrivacyInfo.xcprivacy — add it to your app target's resource bundle, not just the project root.
<!-- PrivacyInfo.xcprivacy (XML plist, add to app target → Resources) -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
"http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSPrivacyTracking</key><false/>
    <key>NSPrivacyTrackingDomains</key><array/>
    <key>NSPrivacyCollectedDataTypes</key><array/>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryFileTimestamp</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array><string>C617.1</string></array>
        </dict>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array><string>CA92.1</string></array>
        </dict>
    </array>
</dict>
</plist>
Common pitfalls
- Timer drift at high BPM. Timer jitter becomes audible above roughly 160 BPM. If timing precision matters, replace the Timer with AVAudioTime-based scheduling on the player nodes for sample-accurate playback.
- Audio session not configured. Set the shared AVAudioSession's category to .playback and call setActive(true) before starting the engine; otherwise the default category lets the ring/silent switch silence playback.
- Sample files missing from bundle. Xcode won't warn you if a resource isn't added to the app target's "Copy Bundle Resources" phase, so verify at launch that Bundle.main.url(forResource:withExtension:) returns non-nil for every sample.
- App Store review: missing microphone usage description. Even if your app only plays back audio, reviewers sometimes query the NSMicrophoneUsageDescription key if any AVFoundation import is present. Add a clear string to Info.plist proactively.
- SwiftData migration on step-count change. If you later expand from 16 to 32 steps, the stored [Bool] length changes; write a lightweight VersionedSchema migration or existing installs hit a fatal schema mismatch.
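The audio-session and missing-resource pitfalls can both be headed off at launch. A hedged sketch (the prepareAudio helper is our name; call it once, e.g. from the App struct's init):

```swift
import AVFoundation

// Assumed helper: configure the session and fail fast on missing samples.
func prepareAudio(sampleNames: [String]) {
    // .playback keeps audio audible even with the ring/silent switch on.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }

    // Catch files that never made it into Copy Bundle Resources.
    for name in sampleNames {
        let resource = name.lowercased().replacingOccurrences(of: " ", with: "-")
        assert(Bundle.main.url(forResource: resource, withExtension: "wav") != nil,
               "Missing sample: \(resource).wav")
    }
}
```

The assert only fires in debug builds, which is exactly where you want a missing-resource mistake to surface.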
Adding monetization: One-time purchase
A one-time purchase is the cleanest fit for a drum machine — musicians hate recurring charges for a tool app. Implement it with StoreKit 2's Product.purchase() API: define a non-consumable in-app purchase in App Store Connect (e.g. com.yourapp.fullaccess), then at launch call Product.products(for:) to fetch the product and iterate Transaction.currentEntitlements to check unlock status. Gate premium features — extra track count, pattern export, sample import — behind an @AppStorage("isUnlocked") flag that you set after a verified transaction. StoreKit 2 delivers cryptographically signed transactions and verifies them for you, so there's no backend required.
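A hedged sketch of that flow (the UnlockManager type and product ID are assumptions, not from a real project):

```swift
import StoreKit
import Observation

@Observable
final class UnlockManager {
    var isUnlocked = false
    private let productID = "com.yourapp.fullaccess" // assumed ID

    // Check existing entitlements at launch; this also covers
    // re-installs and family sharing without a "Restore" button.
    func refreshEntitlements() async {
        for await result in Transaction.currentEntitlements {
            if case .verified(let transaction) = result,
               transaction.productID == productID {
                isUnlocked = true
            }
        }
    }

    // Fetch the non-consumable and run the purchase flow.
    func purchase() async throws {
        guard let product = try await Product.products(for: [productID]).first
        else { return }
        let result = try await product.purchase()
        if case .success(let verification) = result,
           case .verified(let transaction) = verification {
            isUnlocked = true
            await transaction.finish()
        }
    }
}
```

Kick off refreshEntitlements() in a .task on your root view, and call purchase() from the paywall button; StoreKit 2's async APIs mean no delegate boilerplate.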
Shipping this faster with Soarias
Soarias scaffolds the SwiftData model, AVAudioEngine boilerplate, and the PrivacyInfo.xcprivacy file in the first pass — you describe the app type and it wires up the audio session, engine start/stop lifecycle, and a basic grid layout ready for you to customise. The fastlane lanes for TestFlight distribution and App Store submission are generated with your bundle ID and ASC credentials pre-filled, so you skip the hour of lane configuration that trips up most first-timers.
For an intermediate project like this, Soarias typically shaves two to three days off the week-long estimate — mostly the audio session plumbing, Privacy Manifest research, and the submission metadata grind. That leaves you free to spend your time on what actually differentiates your drum machine: unique samples, a polished grid UI, and BPM feel.
FAQ
Do I need a paid Apple Developer account?
Yes. The free tier lets you run the app on your own device via Xcode, but distributing on TestFlight or submitting to the App Store requires an active $99/year Apple Developer Program membership.
How do I submit this to the App Store?
Archive your app in Xcode (Product → Archive), then use the Distribute App flow to upload to App Store Connect. From there, fill in metadata (screenshots, description, age rating), attach any in-app purchases, and submit for review. Expect 24–48 hours for a first review on a new app.
Can I let users import their own drum samples?
Yes — use SwiftUI's fileImporter modifier (or UIDocumentPickerViewController wrapped in UIViewControllerRepresentable) to let users pick WAV/AIFF files from Files.app. Copy the selected URL into your app's document sandbox with FileManager and load it into an AVAudioPCMBuffer the same way bundled samples are loaded. You'll need to add NSMicrophoneUsageDescription only if you add live recording; file import alone does not require it.
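A minimal sketch of the fileImporter route (SampleImportButton and the copy logic are our assumptions):

```swift
import SwiftUI
import UniformTypeIdentifiers

struct SampleImportButton: View {
    @State private var showingImporter = false
    var onImport: (URL) -> Void

    var body: some View {
        Button("Import Sample") { showingImporter = true }
            .fileImporter(isPresented: $showingImporter,
                          allowedContentTypes: [.wav, .aiff]) { result in
                guard case .success(let url) = result else { return }
                // Files.app URLs are security-scoped; claim access first.
                guard url.startAccessingSecurityScopedResource() else { return }
                defer { url.stopAccessingSecurityScopedResource() }
                // Copy into the sandbox so the file outlives this scope.
                let dest = URL.documentsDirectory
                    .appending(path: url.lastPathComponent)
                try? FileManager.default.copyItem(at: url, to: dest)
                onImport(dest)
            }
    }
}
```

Pass the copied URL to the same loadBuffer(_:) routine the sequencer uses for bundled samples.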
Last reviewed: 2026-05-12 by the Soarias team.