How to Implement Handwriting Recognition in SwiftUI

Wrap PKCanvasView in a UIViewRepresentable to capture ink strokes as a PKDrawing, rasterize it to a CGImage, then pass it to VNRecognizeTextRequest for offline, on-device text extraction.
import PencilKit
import Vision
// Rasterize a PKDrawing, then run Vision's text recognizer
let bounds = drawing.bounds.insetBy(dx: -20, dy: -20)
let image = drawing.image(from: bounds, scale: 3.0)
guard let cgImage = image.cgImage else { return }

let request = VNRecognizeTextRequest { req, _ in
    let text = (req.results as? [VNRecognizedTextObservation])?
        .compactMap { $0.topCandidates(1).first?.string }
        .joined(separator: " ") ?? ""
    DispatchQueue.main.async { recognizedText = text }
}
request.recognitionLevel = .accurate
request.usesLanguageCorrection = true
try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
Full implementation
The implementation is split into two types: a CanvasView wrapper that bridges PKCanvasView into SwiftUI via a PKDrawing binding, and a HandwritingRecognitionView that owns the state, triggers the Vision pipeline asynchronously, and surfaces results. All recognition runs on-device; no network call is made.
import SwiftUI
import PencilKit
import Vision

// MARK: - UIViewRepresentable wrapper for PKCanvasView
struct CanvasView: UIViewRepresentable {
    @Binding var drawing: PKDrawing

    func makeUIView(context: Context) -> PKCanvasView {
        let canvas = PKCanvasView()
        canvas.drawing = drawing
        canvas.tool = PKInkingTool(.pen, color: .label, width: 3)
        canvas.drawingPolicy = .anyInput // finger + Apple Pencil
        canvas.backgroundColor = .systemBackground
        canvas.delegate = context.coordinator
        return canvas
    }

    func updateUIView(_ canvas: PKCanvasView, context: Context) {
        // PKDrawing is not Equatable, so compare serialized data to avoid
        // reassigning (and re-firing the delegate) on every SwiftUI update.
        if canvas.drawing.dataRepresentation() != drawing.dataRepresentation() {
            canvas.drawing = drawing
        }
    }

    func makeCoordinator() -> Coordinator { Coordinator(drawing: $drawing) }

    class Coordinator: NSObject, PKCanvasViewDelegate {
        @Binding var drawing: PKDrawing

        init(drawing: Binding<PKDrawing>) { _drawing = drawing }

        func canvasViewDrawingDidChange(_ canvas: PKCanvasView) {
            drawing = canvas.drawing
        }
    }
}
// MARK: - Main view
struct HandwritingRecognitionView: View {
    @State private var drawing = PKDrawing()
    @State private var recognizedText = ""
    @State private var isRecognizing = false
    @Environment(\.displayScale) private var displayScale

    var body: some View {
        VStack(spacing: 16) {
            CanvasView(drawing: $drawing)
                .frame(height: 280)
                .clipShape(RoundedRectangle(cornerRadius: 12))
                .overlay(RoundedRectangle(cornerRadius: 12)
                    .stroke(Color.secondary.opacity(0.3), lineWidth: 1))
                .accessibilityLabel("Handwriting canvas. Write text here.")

            HStack(spacing: 12) {
                Button("Recognize") {
                    Task { await recognizeHandwriting() }
                }
                .buttonStyle(.borderedProminent)
                .disabled(drawing.bounds.isEmpty || isRecognizing)

                Button("Clear") {
                    drawing = PKDrawing()
                    recognizedText = ""
                }
                .buttonStyle(.bordered)
                .tint(.secondary)
            }

            if isRecognizing {
                ProgressView("Recognizing…")
            } else if !recognizedText.isEmpty {
                GroupBox("Recognized Text") {
                    Text(recognizedText)
                        .frame(maxWidth: .infinity, alignment: .leading)
                }
                .accessibilityElement(children: .combine)
                .accessibilityLabel("Recognized text: \(recognizedText)")
            }

            Spacer()
        }
        .padding()
        .navigationTitle("Handwriting Recognition")
        .navigationBarTitleDisplayMode(.inline)
    }
    // MARK: - Vision pipeline
    @MainActor
    private func recognizeHandwriting() async {
        guard !drawing.bounds.isEmpty else { return }
        isRecognizing = true
        defer { isRecognizing = false }

        let bounds = drawing.bounds.insetBy(dx: -20, dy: -20)
        let image = drawing.image(from: bounds, scale: displayScale)
        guard let cgImage = image.cgImage else { return }

        recognizedText = await withCheckedContinuation { continuation in
            let request = VNRecognizeTextRequest { req, error in
                guard error == nil,
                      let observations = req.results as? [VNRecognizedTextObservation]
                else { continuation.resume(returning: ""); return }
                let text = observations
                    .compactMap { $0.topCandidates(1).first?.string }
                    .joined(separator: " ")
                continuation.resume(returning: text)
            }
            request.recognitionLevel = .accurate
            request.usesLanguageCorrection = true

            // perform(_:) is synchronous and can take hundreds of
            // milliseconds, so run it off the main actor; the completion
            // handler above resumes the continuation.
            DispatchQueue.global(qos: .userInitiated).async {
                try? VNImageRequestHandler(cgImage: cgImage, options: [:])
                    .perform([request])
            }
        }
    }
}
#Preview {
    NavigationStack {
        HandwritingRecognitionView()
    }
}
How it works
- UIViewRepresentable bridge. CanvasView wraps PKCanvasView and installs a Coordinator as its PKCanvasViewDelegate. Every stroke fires canvasViewDrawingDidChange, which writes the latest PKDrawing back into the binding so SwiftUI state stays in sync.
- Drawing rasterization. drawing.image(from:scale:) converts the vector ink strokes into a bitmap at the display's native scale, captured via @Environment(\.displayScale). The bounds are inset by -20 pt on each side to add padding that improves Vision's context window around edge strokes.
- Vision text request. VNRecognizeTextRequest with the .accurate recognition level runs a neural-network OCR pipeline entirely on-device. Setting usesLanguageCorrection = true applies a language-model pass that corrects ambiguous letter shapes using dictionary context.
- Swift Concurrency bridge. withCheckedContinuation wraps Vision's callback-based API so the result can be await-ed cleanly on the @MainActor. The defer { isRecognizing = false } guarantees the spinner disappears even if recognition fails or the function returns early.
- Top candidate extraction. Each VNRecognizedTextObservation can return multiple candidate strings ranked by confidence. Calling topCandidates(1).first?.string picks the highest-confidence reading per line; all lines are joined with a space to form the final output.
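As a sketch of that last point, the ranked candidates and their confidence scores can be inspected directly. The helper name below is illustrative, not part of the guide's code:

```swift
import Vision

// Illustrative helper: log the top three candidates per observation
// before settling on the best one. Confidence is a Float in 0.0...1.0.
func bestReadings(from observations: [VNRecognizedTextObservation]) -> [String] {
    observations.compactMap { observation in
        for candidate in observation.topCandidates(3) {
            print("\(candidate.string) (confidence: \(candidate.confidence))")
        }
        // Keep only the highest-confidence reading for this line.
        return observation.topCandidates(1).first?.string
    }
}
```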
Variants
Real-time recognition with debounce
Trigger recognition automatically after the user pauses drawing, rather than requiring a button tap, by attaching a debounced task to the drawing change.
// Inside HandwritingRecognitionView.body, add:
// (PKDrawing is not Equatable, so observe the stroke count instead
// of the drawing value itself.)
.onChange(of: drawing.strokes.count) { _, _ in
    debounceTask?.cancel()
    debounceTask = Task {
        try? await Task.sleep(for: .milliseconds(600))
        guard !Task.isCancelled else { return }
        await recognizeHandwriting()
    }
}

// Add the state property:
@State private var debounceTask: Task<Void, Never>?
Multi-language recognition
Vision can recognize text in dozens of languages. Restrict or expand the set by setting request.recognitionLanguages to a prioritized array of BCP-47 locale codes before performing the request; for example, ["zh-Hans", "en-US"] for simplified Chinese with English fallback. Use the request's supportedRecognitionLanguages() method (a throwing instance method, available from iOS 15) to query which locales are available on the current device.
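A minimal sketch of that configuration, validating the requested locales against what the device actually supports:

```swift
import Vision

// Prioritize simplified Chinese with English fallback, keeping only the
// locales this device and Vision revision actually support.
let request = VNRecognizeTextRequest()
request.recognitionLevel = .accurate // .accurate supports the widest language set

if let supported = try? request.supportedRecognitionLanguages() {
    let preferred = ["zh-Hans", "en-US"].filter(supported.contains)
    if !preferred.isEmpty {
        request.recognitionLanguages = preferred
    }
}
```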
Common pitfalls
- 🚫 Empty bounds crash. PKDrawing.bounds returns CGRect.null when the canvas is blank. Passing a null rect to drawing.image(from:scale:) produces a zero-size image that crashes Vision. Always guard with drawing.bounds.isEmpty before rasterizing, as shown in the recognizeHandwriting() guard above.
- 👆 Finger drawing disabled by default. On a real device, PKCanvasView ignores finger touches unless drawingPolicy = .anyInput is set. Without it, users without an Apple Pencil will see no ink appear, making the feature seem broken.
- ⚡ Block the background, not the main thread. VNImageRequestHandler.perform(_:) is synchronous and can take 200–800 ms on complex handwriting. Dispatch perform to a background queue and resume the continuation from Vision's completion handler; never call perform directly on the main actor or you'll drop frames and freeze the UI.
- ♿ Don't forget VoiceOver. PKCanvasView exposes no inherent accessibility label. Always set an explicit .accessibilityLabel on the CanvasView wrapper, and announce the recognized result with UIAccessibility.post(notification: .announcement, argument:) after recognition completes.
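The VoiceOver announcement from the last pitfall can be as small as this; the helper name is ours:

```swift
import UIKit

// Announce the recognized text to VoiceOver once recognition completes.
// Call on the main thread after `recognizedText` has been updated.
func announceResult(_ text: String) {
    guard !text.isEmpty else { return }
    UIAccessibility.post(notification: .announcement, argument: text)
}
```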
Prompt this with Claude Code
When using Soarias or Claude Code directly to implement this:
Implement handwriting recognition in SwiftUI for iOS 17+. Use PencilKit (PKCanvasView, PKDrawing) and Vision (VNRecognizeTextRequest). Set recognitionLevel to .accurate and usesLanguageCorrection to true. Wrap the Vision callback with withCheckedContinuation for async/await. Make it accessible (VoiceOver labels on canvas and result text). Add a #Preview with NavigationStack and realistic sample data.
Drop this prompt into Soarias during the Build phase; Claude Code will scaffold both the CanvasView wrapper and the recognition pipeline as separate files, keeping your feature modules clean as your project grows.
FAQ
Does this work on iOS 16?
PKCanvasView and VNRecognizeTextRequest are both available from iOS 14+, so the core pipeline compiles on iOS 16. However, this guide targets iOS 17+ to take advantage of the #Preview macro and the two-parameter .onChange(of:) modifier used in the debounce variant. If you need iOS 16 support, swap #Preview for a PreviewProvider and use the single-parameter .onChange(of:) closure.
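A sketch of those fallbacks, assuming the view names from the guide:

```swift
import SwiftUI

// iOS 16 fallback: PreviewProvider in place of the #Preview macro.
struct HandwritingRecognitionView_Previews: PreviewProvider {
    static var previews: some View {
        NavigationStack {
            HandwritingRecognitionView()
        }
    }
}

// iOS 16 fallback for the debounce variant: the single-parameter
// onChange(of:) closure form.
// .onChange(of: drawing.strokes.count) { _ in /* debounce as before */ }
```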
How accurate is Vision on messy handwriting?
With recognitionLevel = .accurate and usesLanguageCorrection = true, Vision achieves strong results on clearly formed letters. Accuracy drops significantly on highly cursive or stylized scripts because the model is optimized for print-style handwriting. For cursive-heavy use cases, consider prompting the user to write in block letters, or fall back to on-device ML models trained on your specific script style via Core ML.
What is the UIKit equivalent?
In UIKit you use PKCanvasView directly (it is a UIView subclass), eliminating the UIViewRepresentable wrapper entirely. The Vision pipeline is identical; UIKit has no special advantage here. SwiftUI requires the wrapper because PKCanvasView has no native SwiftUI counterpart; Apple has not shipped a first-party Canvas view with PencilKit backing as of iOS 17.
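A minimal UIKit sketch of the same canvas setup; the controller name is ours:

```swift
import UIKit
import PencilKit

// PKCanvasView is a plain UIView subclass, so no wrapper is needed.
final class HandwritingViewController: UIViewController {
    private let canvas = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvas.drawingPolicy = .anyInput
        canvas.tool = PKInkingTool(.pen, color: .label, width: 3)
        canvas.frame = view.bounds
        canvas.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(canvas)
        // canvas.drawing feeds into the same Vision pipeline shown above.
    }
}
```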
Last reviewed: 2026-05-12 by the Soarias team.