
How to Implement Haptic Feedback in SwiftUI

iOS 17+ · Xcode 16+ · Beginner · APIs: UIImpactFeedbackGenerator, .sensoryFeedback · Updated: May 11, 2026
TL;DR

On iOS 17+, attach the .sensoryFeedback(_:trigger:) modifier to any view and SwiftUI fires the haptic automatically when the trigger value changes — no UIImpactFeedbackGenerator boilerplate required. For fine-grained control you can still call UIImpactFeedbackGenerator directly inside a button action.

// iOS 17+ — declarative
Button("Tap me") { tapped.toggle() }
    .sensoryFeedback(.impact(weight: .medium), trigger: tapped)

// Any iOS — imperative
let g = UIImpactFeedbackGenerator(style: .medium)
g.impactOccurred()

Full Implementation

The example below shows a settings-style toggle row that fires a rigid impact when switched on and a soft impact when switched off, using the flexibility variant of .impact. This is a common pattern that communicates directionality. A second button demonstrates the classic UIImpactFeedbackGenerator imperative approach, which is useful when you need to prepare the Taptic Engine in advance or trigger feedback from non-SwiftUI code paths.

import SwiftUI

struct HapticDemoView: View {
    @State private var isEnabled = false
    @State private var successCount = 0
    @State private var warningCount = 0
    @State private var errorCount = 0

    // Imperative generator: prepare() warms up the engine
    private let generator = UIImpactFeedbackGenerator(style: .heavy)

    var body: some View {
        NavigationStack {
            List {
                // MARK: - Declarative (iOS 17+)
                Section("Declarative .sensoryFeedback") {
                    Toggle("Enable feature", isOn: $isEnabled)
                        // Fire different feedback depending on the new value.
                        // .rigid and .soft are Flexibility values; Weight only
                        // offers .light, .medium, and .heavy.
                        .sensoryFeedback(
                            .impact(flexibility: isEnabled ? .rigid : .soft),
                            trigger: isEnabled
                        )
                        .accessibilityLabel("Enable feature toggle")
                }

                // MARK: - Notification feedback
                Section("Notification styles") {
                    // Each button increments its own counter; sharing one
                    // trigger would fire all three haptics on every tap.
                    Button("Success") { successCount += 1 }
                        .sensoryFeedback(.success, trigger: successCount)
                        .accessibilityLabel("Trigger success haptic")

                    Button("Warning") { warningCount += 1 }
                        .sensoryFeedback(.warning, trigger: warningCount)
                        .accessibilityLabel("Trigger warning haptic")

                    Button("Error") { errorCount += 1 }
                        .sensoryFeedback(.error, trigger: errorCount)
                        .accessibilityLabel("Trigger error haptic")
                }

                // MARK: - Imperative (all iOS versions)
                Section("Imperative UIImpactFeedbackGenerator") {
                    Button("Heavy impact") {
                        generator.impactOccurred(intensity: 1.0)
                        generator.prepare()      // re-arm for the next tap
                    }
                    .accessibilityLabel("Trigger heavy haptic feedback")
                }
            }
            .navigationTitle("Haptics Demo")
            .onAppear {
                // Pre-warm so the first tap has no delay
                generator.prepare()
            }
        }
    }
}

#Preview {
    HapticDemoView()
}

How It Works

  1. .sensoryFeedback(_:trigger:) — Introduced in iOS 17, this modifier watches trigger (any Equatable value) and plays the haptic whenever that value changes. SwiftUI manages the generator lifecycle for you, so you never allocate or prepare anything manually.
  2. Conditional flexibility via isEnabled — Because the feedback parameter is re-evaluated whenever the view body is, passing isEnabled ? .rigid : .soft lets the toggle communicate direction: a firm tap for "on", a soft tap for "off". The trigger is isEnabled itself, so the haptic fires on every toggle change.
  3. Counter trick for repeated actions — Incrementing a counter guarantees the trigger transitions to a new value on every tap, which .sensoryFeedback requires in order to fire again; assigning the same value twice (for example, setting a Bool to true on each tap) fires only once. Giving each button its own counter also keeps one tap from firing the other buttons' feedback.
  4. UIImpactFeedbackGenerator.prepare() — Calling prepare() in onAppear puts the Taptic Engine into a ready state, reducing latency on the first interaction. The engine stays prepared only for a short time, so call prepare() shortly before the expected tap, not minutes earlier.
  5. Notification styles (.success / .warning / .error) — These map to the system's predefined notification haptic patterns and carry semantic meaning for users who rely on haptics as a non-visual channel. Use them purposefully: .success for completed actions, .warning for caution, .error for failure.
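
Instead of keeping a counter per action, you can also use the closure-based overload .sensoryFeedback(trigger:_:), which receives the old and new trigger values and returns an optional SensoryFeedback (returning nil plays nothing). A minimal sketch, with an illustrative SaveEvent enum that is not part of the example above:

```swift
import SwiftUI

// Illustrative event enum used only for this sketch
enum SaveEvent: Equatable { case idle, saved, failed }

struct SaveStatusView: View {
    @State private var event: SaveEvent = .idle

    var body: some View {
        Text("Status: \(String(describing: event))")
            // Pick the feedback from the new trigger value;
            // returning nil (for .idle) plays no haptic at all.
            .sensoryFeedback(trigger: event) { _, newValue in
                switch newValue {
                case .saved:  return .success
                case .failed: return .error
                case .idle:   return nil
                }
            }
    }
}
```

This keeps all the mapping logic in one place, at the cost of a slightly less declarative call site.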

Variants

Selection feedback on a Picker or drag gesture

struct PickerHapticView: View {
    @State private var selected = 0
    let options = ["Swift", "Kotlin", "Dart"]

    var body: some View {
        Picker("Language", selection: $selected) {
            ForEach(options.indices, id: \.self) { i in
                Text(options[i]).tag(i)
            }
        }
        .pickerStyle(.wheel)
        // .selection fires a light tick as the wheel spins
        .sensoryFeedback(.selection, trigger: selected)
    }
}

Conditional feedback — only fire when a condition is met

The overload .sensoryFeedback(_:trigger:condition:) accepts a trailing closure that evaluates the old and new trigger values. This lets you gate the haptic — for example, firing only when a value crosses a threshold:

Text("\(score)")
    .sensoryFeedback(.impact(weight: .heavy), trigger: score) { old, new in
        // Only fire when crossing 100
        return old < 100 && new >= 100
    }

Common Pitfalls

The trigger must actually change: .sensoryFeedback fires only when its Equatable trigger transitions to a new value, which is why the examples increment a counter rather than reassigning the same value.

Don't share one trigger across several modifiers: every .sensoryFeedback watching the same value fires at once, so give each interaction its own trigger.

Don't confuse weight and flexibility: .impact(weight:) takes .light, .medium, or .heavy, while .impact(flexibility:) takes .rigid, .solid, or .soft.

Haptics fail silently without a Taptic Engine: iPads and the simulator produce nothing, so verify on a physical iPhone.

Prompt This with Claude Code

When using Soarias or Claude Code directly to implement this:

Implement haptic feedback in SwiftUI for iOS 17+.
Use UIImpactFeedbackGenerator and the .sensoryFeedback modifier.
Make it accessible (VoiceOver labels).
Add a #Preview with realistic sample data.

Drop this prompt into the Soarias Build phase after your screen mockups are locked in — Claude Code will wire up haptics across all interactive elements in one pass, keeping your momentum from design to first TestFlight build.

FAQ

Does this work on iOS 16?

The .sensoryFeedback modifier is iOS 17+ only. On iOS 16 you must use UIImpactFeedbackGenerator, UINotificationFeedbackGenerator, or UISelectionFeedbackGenerator directly from within your action closures. Wrap any iOS 17 API in if #available(iOS 17, *) { … } when your deployment target is below iOS 17.
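
One way to bridge both versions from a single call site is a small availability-gated ViewModifier. This is a sketch, not a canonical pattern, and the impactOnChange name is made up for this example:

```swift
import SwiftUI

// Hypothetical helper: plays a medium impact whenever `trigger` changes,
// falling back to UIImpactFeedbackGenerator before iOS 17.
struct ImpactOnChange<T: Equatable>: ViewModifier {
    let trigger: T

    func body(content: Content) -> some View {
        if #available(iOS 17, *) {
            content.sensoryFeedback(.impact(weight: .medium), trigger: trigger)
        } else {
            // onChange(of:perform:) is deprecated on iOS 17+, but this
            // branch only runs on earlier systems.
            content.onChange(of: trigger) { _ in
                UIImpactFeedbackGenerator(style: .medium).impactOccurred()
            }
        }
    }
}

extension View {
    func impactOnChange<T: Equatable>(of trigger: T) -> some View {
        modifier(ImpactOnChange(trigger: trigger))
    }
}
```

Call sites then stay identical across OS versions: SomeView().impactOnChange(of: isEnabled).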

Can I trigger haptics from inside a SwiftData or ViewModel save operation?

Yes — since UIImpactFeedbackGenerator must be called on the main thread, dispatch to MainActor from your async save context: await MainActor.run { generator.impactOccurred() }. The .sensoryFeedback modifier is always evaluated on the main thread by SwiftUI, so as long as your trigger @State or @Published property is updated on MainActor the haptic will fire correctly without any extra threading work.
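
As a sketch, a store object running its save off the main actor might look like this (the SettingsStore type and the sleep stand-in for persistence work are illustrative):

```swift
import UIKit

final class SettingsStore {
    private let generator = UIImpactFeedbackGenerator(style: .medium)

    func save() async throws {
        // Stand-in for real async persistence (SwiftData, disk, network)
        try await Task.sleep(nanoseconds: 100_000_000)

        // UIFeedbackGenerator should be used on the main thread,
        // so hop to the main actor before firing.
        await MainActor.run {
            generator.impactOccurred()
        }
    }
}
```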

What's the UIKit equivalent?

In UIKit you use UIImpactFeedbackGenerator, UINotificationFeedbackGenerator, and UISelectionFeedbackGenerator directly — allocate an instance, call prepare(), then call impactOccurred(), notificationOccurred(.success), or selectionChanged(). SwiftUI's .sensoryFeedback is a declarative wrapper around these same underlying engines; they produce identical taptic patterns, so the user experience is identical across UIKit and SwiftUI.
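
For reference, a minimal UIKit sketch wiring the three generator types to hypothetical actions:

```swift
import UIKit

final class HapticsViewController: UIViewController {
    private let impact = UIImpactFeedbackGenerator(style: .medium)
    private let notification = UINotificationFeedbackGenerator()
    private let selection = UISelectionFeedbackGenerator()

    override func viewDidLoad() {
        super.viewDidLoad()
        impact.prepare()   // warm the engine before the first interaction
    }

    @objc private func didTapSave() {
        impact.impactOccurred(intensity: 0.8)   // physical "thud"
    }

    @objc private func didFinishExport(_ success: Bool) {
        notification.notificationOccurred(success ? .success : .error)
    }

    @objc private func didChangeSegment() {
        selection.selectionChanged()            // light tick
    }
}
```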

Last reviewed: 2026-05-11 by the Soarias team.
