How to Implement Haptic Feedback in SwiftUI
On iOS 17+, attach the .sensoryFeedback(_:trigger:) modifier to any view and SwiftUI fires the haptic automatically when the trigger value changes — no UIImpactFeedbackGenerator boilerplate required. For fine-grained control you can still call UIImpactFeedbackGenerator directly inside a button action.
// iOS 17+ — declarative
Button("Tap me") { tapped.toggle() }
    .sensoryFeedback(.impact(weight: .medium), trigger: tapped)

// Any iOS — imperative
let g = UIImpactFeedbackGenerator(style: .medium)
g.impactOccurred()
Full Implementation
The example below shows a settings-style toggle row that fires a rigid impact when switched on and a soft impact when switched off, a common pattern that communicates directionality. A second group of buttons demonstrates the notification styles, and a final button uses the classic UIImpactFeedbackGenerator imperative approach, which is useful when you need to prepare the taptic engine in advance or trigger feedback from non-SwiftUI code paths.
import SwiftUI

struct HapticDemoView: View {
    @State private var isEnabled = false
    @State private var successCount = 0
    @State private var warningCount = 0
    @State private var errorCount = 0

    // Imperative generator: prepare() warms up the taptic engine
    private let generator = UIImpactFeedbackGenerator(style: .heavy)

    var body: some View {
        NavigationStack {
            List {
                // MARK: - Declarative (iOS 17+)
                Section("Declarative .sensoryFeedback") {
                    Toggle("Enable feature", isOn: $isEnabled)
                        // Fire different feedback depending on the new value
                        .sensoryFeedback(
                            .impact(flexibility: isEnabled ? .rigid : .soft),
                            trigger: isEnabled
                        )
                        .accessibilityLabel("Enable feature toggle")
                }

                // MARK: - Notification feedback
                Section("Notification styles") {
                    Button("Success") {
                        // Each button increments its own counter, so repeated
                        // taps always fire and buttons don't trigger each other
                        successCount += 1
                    }
                    .sensoryFeedback(.success, trigger: successCount)
                    .accessibilityLabel("Trigger success haptic")

                    Button("Warning") { warningCount += 1 }
                        .sensoryFeedback(.warning, trigger: warningCount)
                        .accessibilityLabel("Trigger warning haptic")

                    Button("Error") { errorCount += 1 }
                        .sensoryFeedback(.error, trigger: errorCount)
                        .accessibilityLabel("Trigger error haptic")
                }

                // MARK: - Imperative (all iOS versions)
                Section("Imperative UIImpactFeedbackGenerator") {
                    Button("Heavy impact") {
                        generator.impactOccurred(intensity: 1.0)
                    }
                    .accessibilityLabel("Trigger heavy haptic feedback")
                }
            }
            .navigationTitle("Haptics Demo")
            .onAppear {
                // Pre-warm so the first tap has no delay
                generator.prepare()
            }
        }
    }
}

#Preview {
    HapticDemoView()
}
How It Works
- .sensoryFeedback(_:trigger:) — Introduced in iOS 17, this modifier watches the trigger (any Equatable value) and fires the haptic engine whenever the value changes. SwiftUI handles the generator lifecycle automatically, so you never allocate or prepare anything manually.
- Conditional feedback via isEnabled — Because the feedback parameter is re-evaluated when the view updates, passing .impact(flexibility: isEnabled ? .rigid : .soft) lets the toggle communicate direction: a firm tap for "on", a soft tap for "off". The trigger is isEnabled itself, so the haptic fires on every toggle change.
- Counter trick for repeated actions — When repeated taps of the same button each need to fire a haptic, an incrementing counter ensures the trigger always transitions to a new value, which is required for .sensoryFeedback to fire again. Give each button its own counter; a single trigger shared across buttons would fire every attached haptic whenever any one button is tapped.
- UIImpactFeedbackGenerator.prepare() — Calling prepare() in onAppear loads the taptic engine into a ready state, reducing latency on the first interaction. The engine only stays primed for a short time (on the order of a second or two), so call prepare() shortly before the expected tap, not minutes earlier.
- Notification styles (.success / .warning / .error) — These map to the system's three distinct vibration patterns and are semantically meaningful to users who rely on haptics in place of sound. Use them purposefully: .success for completed actions, .warning for caution, .error for failure.
Variants
Selection feedback on a Picker or drag gesture
struct PickerHapticView: View {
    @State private var selected = 0
    let options = ["Swift", "Kotlin", "Dart"]

    var body: some View {
        Picker("Language", selection: $selected) {
            ForEach(options.indices, id: \.self) { i in
                Text(options[i]).tag(i)
            }
        }
        .pickerStyle(.wheel)
        // .selection fires a light tick as the wheel spins
        .sensoryFeedback(.selection, trigger: selected)
    }
}
Conditional feedback — only fire when a condition is met
The overload .sensoryFeedback(_:trigger:condition:) accepts a trailing closure that evaluates the old and new trigger values. This lets you gate the haptic — for example, firing only when a value crosses a threshold:
Text("\(score)")
    .sensoryFeedback(.impact(weight: .heavy), trigger: score) { old, new in
        // Only fire when crossing 100
        old < 100 && new >= 100
    }
Common Pitfalls
- .sensoryFeedback requires iOS 17. If your deployment target is iOS 16, you must use UIImpactFeedbackGenerator directly, and gate any .sensoryFeedback usage behind if #available(iOS 17, *) if you want to mix both approaches. Using .sensoryFeedback without an availability check causes a compile-time error when the deployment target is below iOS 17.
- Haptics are silenced when "Haptics" is off in Settings → Sounds & Haptics. Never use haptics as the only signal for important UI state; always pair them with a visual indicator or sound. The UIFeedbackGenerator APIs silently no-op when haptics are disabled, so your code won't crash, but the user won't feel anything.
- Avoid firing haptics in tight loops or on every frame. Continuous haptic output is jarring, drains battery, and can desensitize users. If you're triggering feedback from a drag gesture, debounce it or fire only at meaningful intervals (e.g., when crossing snap points), not on every .onChange call.
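One way to implement the snap-point approach is to quantize the drag offset and use the quantized value as the trigger, so the haptic fires only when the drag crosses a boundary. A minimal sketch (the 50 pt step and view layout are arbitrary choices for illustration):

```swift
import SwiftUI

struct SnapDragView: View {
    @State private var offset: CGFloat = 0
    // Quantized position: changes only when the drag crosses a 50 pt boundary
    private var snapIndex: Int { Int(offset / 50) }

    var body: some View {
        Circle()
            .frame(width: 44, height: 44)
            .offset(x: offset)
            .gesture(
                DragGesture().onChanged { value in
                    offset = value.translation.width
                }
            )
            // Fires once per snap point crossed, not on every frame
            .sensoryFeedback(.selection, trigger: snapIndex)
    }
}
```

Because `snapIndex` is derived state, the `.sensoryFeedback` trigger stays quiet for small movements and ticks exactly once per boundary.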
Prompt This with Claude Code
When using Soarias or Claude Code directly to implement this:
Implement haptic feedback in SwiftUI for iOS 17+. Use UIImpactFeedbackGenerator and the .sensoryFeedback modifier. Make it accessible (VoiceOver labels). Add a #Preview with realistic sample data.
Drop this prompt into the Soarias Build phase after your screen mockups are locked in — Claude Code will wire up haptics across all interactive elements in one pass, keeping your momentum from design to first TestFlight build.
FAQ
Does this work on iOS 16?
The .sensoryFeedback modifier is iOS 17+ only. On iOS 16 you must use UIImpactFeedbackGenerator, UINotificationFeedbackGenerator, or UISelectionFeedbackGenerator directly from within your action closures. Wrap any iOS 17 API in if #available(iOS 17, *) { … } if your deployment target is iOS 16 but you want devices on iOS 17 to use the declarative API.
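One way to bridge both OS versions behind a single call site is a small custom ViewModifier that branches on availability; a sketch, where the HapticTap name and the medium-impact choice are made up for this example (the single-value onChange overload it falls back to is deprecated on iOS 17, but only the iOS 16 branch uses it):

```swift
import SwiftUI

struct HapticTap<T: Equatable>: ViewModifier {
    let trigger: T

    func body(content: Content) -> some View {
        if #available(iOS 17, *) {
            // Declarative path: SwiftUI manages the generator
            content.sensoryFeedback(.impact(weight: .medium), trigger: trigger)
        } else {
            // iOS 16 fallback: fire imperatively when the trigger changes
            content.onChange(of: trigger) { _ in
                UIImpactFeedbackGenerator(style: .medium).impactOccurred()
            }
        }
    }
}

// Usage: Button("Tap") { count += 1 }.modifier(HapticTap(trigger: count))
```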
Can I trigger haptics from inside a SwiftData or ViewModel save operation?
Yes. UIImpactFeedbackGenerator is a UIKit object and must be called on the main thread, so hop to the MainActor from your async save context: await MainActor.run { generator.impactOccurred() }. The .sensoryFeedback modifier is always evaluated on the main thread by SwiftUI, so as long as your trigger @State or @Published property is updated on the MainActor, the haptic fires correctly without any extra threading work.
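That pattern might look like the following sketch, where saveAndNotify and the persistence step are hypothetical stand-ins for your own save logic:

```swift
import UIKit

func saveAndNotify() async throws {
    // ... perform the persistence work (SwiftData / Core Data save) here ...

    // UIFeedbackGenerator is a UIKit object: hop to the main actor to fire it
    await MainActor.run {
        UINotificationFeedbackGenerator().notificationOccurred(.success)
    }
}
```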
What's the UIKit equivalent?
In UIKit you use UIImpactFeedbackGenerator, UINotificationFeedbackGenerator, and UISelectionFeedbackGenerator directly — allocate an instance, call prepare(), then call impactOccurred(), notificationOccurred(.success), or selectionChanged(). SwiftUI's .sensoryFeedback is a declarative wrapper around these same underlying engines; they produce identical taptic patterns, so the user experience is identical across UIKit and SwiftUI.
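The three UIKit generators side by side, as they might appear in a view controller; a minimal sketch (in a real app you would fire only one of these per interaction):

```swift
import UIKit

final class FeedbackViewController: UIViewController {
    private let impact = UIImpactFeedbackGenerator(style: .medium)
    private let notification = UINotificationFeedbackGenerator()
    private let selection = UISelectionFeedbackGenerator()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Warm up the engine shortly before feedback is expected
        impact.prepare()
    }

    @objc private func didTapButton() {
        impact.impactOccurred()                     // single physical tap
        notification.notificationOccurred(.success) // patterned sequence
        selection.selectionChanged()                // light tick
    }
}
```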
Last reviewed: 2026-05-11 by the Soarias team.