How to Build a Kanji Learning App in SwiftUI
A Kanji Learning app teaches Japanese through interactive stroke order exercises and spaced repetition, covering the full JLPT N5–N1 syllabus. It's for iOS developers who want to ship a focused study tool for learners who need hands-on writing practice, not just flashcard drilling.
Prerequisites
- Mac with Xcode 16+
- Apple Developer Program ($99/year) — required for TestFlight and App Store
- Basic Swift/SwiftUI knowledge, including the @Model macro and SwiftData containers
- A physical iPhone for testing — the Simulator's mouse-drag input produces unnaturally smooth strokes that mask real gesture UX problems
- A licensed kanji stroke dataset: KanjiVG (CC BY-SA 3.0) covers ~6,000 kanji; verify attribution requirements before shipping
Architecture overview
SwiftData is the single local store for kanji records, study sessions, and per-character mastery scores. A lightweight SRS engine in SRSEngine.swift computes each kanji's next review date using an SM-2–style interval. The view hierarchy is a NavigationStack: a dashboard filtered by JLPT level leads to a StrokePracticeView built on SwiftUI Canvas and DragGesture. Reference stroke paths are bundled as JSON (parsed from KanjiVG SVG at build time) and loaded once into a lightweight in-memory cache — never stored in SwiftData — to keep the model layer lean.
KanjiApp/
├── Models/
│ ├── Kanji.swift # @Model — character, readings, JLPT, mastery
│ └── StudySession.swift # @Model — date, kanji reviewed, score
├── Views/
│ ├── KanjiDashboardView.swift
│ ├── StrokePracticeView.swift
│ └── FlashcardReviewView.swift
├── Services/
│ └── SRSEngine.swift # SM-2 interval scheduler
└── Resources/
└── kanji_strokes.json # N1–N5 reference paths (from KanjiVG)
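The in-memory cache described above can be a thin Codable wrapper. This is a minimal sketch: the `StrokeCache` type and the JSON layout (character → array of strokes → array of [x, y] pairs) are assumptions about your build-time output, not the KanjiVG format itself.

```swift
import Foundation

// Assumed JSON layout: { "一": [[[x, y], [x, y], …], …] } — one point list per stroke.
struct StrokeCache {
    private let strokes: [String: [[CGPoint]]]

    init(jsonData: Data) throws {
        // Decode raw nested number arrays, then map each [x, y] pair to a CGPoint.
        let raw = try JSONDecoder().decode([String: [[[Double]]]].self, from: jsonData)
        strokes = raw.mapValues { strokeList in
            strokeList.map { stroke in stroke.map { CGPoint(x: $0[0], y: $0[1]) } }
        }
    }

    func strokes(for character: String) -> [[CGPoint]]? { strokes[character] }
}
```

Load it once at app launch (e.g. from `Bundle.main.url(forResource:withExtension:)`) and pass it down via the environment, keeping reference paths out of SwiftData entirely.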
Step-by-step
1. Data model
Define a Kanji SwiftData model with readings, JLPT level, mastery score, and a scheduled review date so the SRS engine can prioritise what to study next.
import SwiftData
import Foundation
@Model
final class Kanji {
    var character: String
    var meaning: String
    var onyomi: [String]
    var kunyomi: [String]
    var strokeCount: Int
    var jlptLevel: Int       // 1 = N1 … 5 = N5
    var masteryScore: Double // 0.0 – 1.0
    var nextReviewDate: Date
    var reviewInterval: Int  // days

    init(character: String, meaning: String, onyomi: [String],
         kunyomi: [String], strokeCount: Int, jlptLevel: Int) {
        self.character = character
        self.meaning = meaning
        self.onyomi = onyomi
        self.kunyomi = kunyomi
        self.strokeCount = strokeCount
        self.jlptLevel = jlptLevel
        self.masteryScore = 0.0
        self.nextReviewDate = .now
        self.reviewInterval = 1
    }
}
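The SRSEngine referenced in the architecture overview can start as two pure functions over these fields, which keeps the scheduler unit-testable without a SwiftData container. A sketch, assuming a 0–5 self-reported recall grade; the constants are illustrative defaults, not the canonical SM-2 values.

```swift
import Foundation

// SM-2-style scheduling as pure functions, testable without SwiftData.
// `grade` is a self-reported recall score from 0 (blackout) to 5 (perfect).
enum SRSEngine {
    static func nextInterval(current: Int, grade: Int) -> Int {
        guard grade >= 3 else { return 1 }               // failed recall: see it again tomorrow
        if current <= 1 { return 6 }                     // first successful review: 6-day jump
        return Int((Double(current) * 2.5).rounded())    // then grow geometrically
    }

    static func updatedMastery(_ score: Double, grade: Int) -> Double {
        let delta = grade >= 3 ? 0.1 : -0.2              // nudge toward 1.0 on success
        return min(1.0, max(0.0, score + delta))
    }
}
```

A review handler would call these after each session and write `reviewInterval`, `masteryScore`, and `nextReviewDate` (via `Calendar.current.date(byAdding: .day, …)`) back onto the model.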
2. Core UI — Kanji dashboard
Build the root NavigationStack with a segmented JLPT picker and a live-filtered kanji list that navigates into the stroke practice view.
struct KanjiDashboardView: View {
    @Query(sort: \Kanji.jlptLevel) private var allKanji: [Kanji]
    @State private var selectedLevel = 5

    var filtered: [Kanji] { allKanji.filter { $0.jlptLevel == selectedLevel } }

    var body: some View {
        NavigationStack {
            VStack(spacing: 0) {
                Picker("JLPT Level", selection: $selectedLevel) {
                    ForEach(1...5, id: \.self) { Text("N\($0)").tag($0) }
                }
                .pickerStyle(.segmented)
                .padding()

                List(filtered) { kanji in
                    NavigationLink(value: kanji) {
                        KanjiRowView(kanji: kanji)
                    }
                }
            }
            .navigationTitle("Kanji")
            .navigationDestination(for: Kanji.self) {
                StrokePracticeView(kanji: $0)
            }
        }
    }
}
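KanjiRowView is referenced above but not defined. A minimal sketch — the layout choices here are assumptions, not a prescribed design:

```swift
import SwiftUI

// A compact dashboard row: character, meaning, stroke count, and a mastery gauge.
struct KanjiRowView: View {
    let kanji: Kanji

    var body: some View {
        HStack(spacing: 12) {
            Text(kanji.character)
                .font(.title2)
            VStack(alignment: .leading) {
                Text(kanji.meaning)
                Text("\(kanji.strokeCount) strokes")
                    .font(.caption)
                    .foregroundStyle(.secondary)
            }
            Spacer()
            ProgressView(value: kanji.masteryScore)  // 0.0 – 1.0 from the model
                .frame(width: 60)
        }
    }
}
```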
3. Core feature — Canvas stroke order practice
Capture freehand strokes with DragGesture and draw them in real time on a Canvas, layered over a ghost-guide character whose opacity fades as mastery increases.
struct StrokePracticeView: View {
    let kanji: Kanji
    @State private var strokes: [[CGPoint]] = []
    @State private var live: [CGPoint] = []
    @State private var showGuide = true

    var body: some View {
        VStack(spacing: 16) {
            Text(kanji.character)
                .font(.system(size: 80))
                .opacity(showGuide ? max(0.08, 1 - kanji.masteryScore) : 0)

            Canvas { ctx, _ in
                let style = StrokeStyle(lineWidth: 9, lineCap: .round, lineJoin: .round)
                for pts in strokes { ctx.stroke(stroke(pts), with: .color(.primary), style: style) }
                if !live.isEmpty { ctx.stroke(stroke(live), with: .color(.blue), style: style) }
            }
            .frame(width: 280, height: 280)
            .background(Color(.systemGray6))
            .clipShape(RoundedRectangle(cornerRadius: 16))
            .gesture(DragGesture(minimumDistance: 0)
                .onChanged { live.append($0.location) }
                .onEnded { _ in strokes.append(live); live = [] })

            HStack {
                Button("Clear") { strokes = []; live = [] }
                    .buttonStyle(.bordered)
                Toggle("Guide", isOn: $showGuide)
                Spacer()
                Text("\(strokes.count) / \(kanji.strokeCount)")
                    .foregroundStyle(.secondary)
            }
            .padding(.horizontal)
        }
        .navigationTitle("Stroke Practice")
        .navigationBarTitleDisplayMode(.inline)
    }

    private func stroke(_ pts: [CGPoint]) -> Path {
        var p = Path()
        guard let f = pts.first else { return p }
        p.move(to: f)
        pts.dropFirst().forEach { p.addLine(to: $0) }
        return p
    }
}
4. Privacy Manifest setup
Apple requires a PrivacyInfo.xcprivacy file to declare any required-reason API usage; a missing manifest triggers an automatic rejection during App Store processing. Create it via File › New › File › App Privacy:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
"http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>NSPrivacyTracking</key><false/>
<key>NSPrivacyTrackingDomains</key><array/>
<key>NSPrivacyCollectedDataTypes</key><array/>
<key>NSPrivacyAccessedAPITypes</key>
<array>
<dict>
<key>NSPrivacyAccessedAPIType</key>
<string>NSPrivacyAccessedAPICategoryUserDefaults</string>
<key>NSPrivacyAccessedAPITypeReasons</key>
<array><string>CA92.1</string></array>
</dict>
</array>
</dict>
</plist>
Common pitfalls
- Stroke validation is harder than stroke count. Pixel-perfect path comparison fails for natural handwriting. Start with stroke count matching and rough directional checks (top→bottom, left→right). Full DTW-based similarity scoring is a v2 feature, not a launch requirement.
- Shipping KanjiVG without attribution. KanjiVG is CC BY-SA 3.0 — attribution is mandatory. App Store Review guideline 5.2 covers intellectual property; add the credit to your app's Settings or About screen before submission.
- Canvas performance on re-renders. Appending to strokes on every DragGesture.onChanged triggers a full Canvas redraw per point. Accumulate live points in a separate @State array and only commit to the completed-strokes array on onEnded — exactly as shown in step 3.
- Hard paywall on core features. App Store Review guideline 3.1.1 requires users to experience basic functionality before hitting a subscription wall. Gate N1/N2 kanji sets and analytics behind the paywall; keep N5 practice free.
- Missing Privacy Manifest for third-party SDKs. If you add an analytics or crash-reporting SDK, its PrivacyInfo.xcprivacy must also be present. Xcode's Privacy Report (Product › Archive › Generate Privacy Report) lists any gaps before you upload.
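The "rough directional check" from the first pitfall can be as small as comparing a stroke's net displacement. A sketch, assuming screen coordinates where y grows downward; the thresholds are illustrative:

```swift
import Foundation

enum StrokeDirection { case up, down, left, right, diagonal, unclear }

// Classify a drawn stroke by its net displacement (screen coords: +y is down).
// Crude, but enough to catch a bottom→top or right→left stroke drawn backwards.
func dominantDirection(of points: [CGPoint]) -> StrokeDirection {
    guard let first = points.first, let last = points.last else { return .unclear }
    let dx = last.x - first.x, dy = last.y - first.y
    let adx = abs(dx), ady = abs(dy)
    if max(adx, ady) < 5 { return .unclear }                // too short to judge
    if ady > adx * 2 { return dy > 0 ? .down : .up }        // mostly vertical
    if adx > ady * 2 { return dx > 0 ? .right : .left }     // mostly horizontal
    return .diagonal
}
```

Most kanji strokes run top→bottom or left→right, so simply flagging `.up` or `.left` catches the most common beginner reversal before you invest in full similarity scoring.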
Adding monetization: Subscription
Use StoreKit 2 to offer a monthly or annual "Kanji Pro" subscription that unlocks the full N1–N4 character set, per-stroke accuracy analytics, and offline audio readings. Create the subscription group in App Store Connect under In-App Purchases, then present Apple's built-in SubscriptionStoreView(groupID:) — available since iOS 17 — as a sheet; it renders the free-trial badge, pricing, and the required "Manage Subscription" link automatically. Gate premium views with a @State var isPro: Bool driven by Transaction.currentEntitlements checked on launch inside a task modifier. Listen to Transaction.updates in a background task so renewals and family-sharing grants re-enable access without a manual "Restore Purchases" tap.
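A sketch of the entitlement gate described above. The subscription group ID is a placeholder, and treating any verified current entitlement as "Pro" is a simplification for a single-subscription-group app:

```swift
import SwiftUI
import StoreKit

// Tracks whether the user holds the "Kanji Pro" entitlement (iOS 17+).
@Observable final class ProStatus {
    var isPro = false

    // Check current entitlements, e.g. from a .task on the root view at launch.
    func refresh() async {
        for await result in Transaction.currentEntitlements {
            if case .verified = result { isPro = true; return }
        }
        isPro = false
    }

    // Run in a background task so renewals and family-sharing grants
    // re-enable access without a manual "Restore Purchases" tap.
    func listenForUpdates() async {
        for await _ in Transaction.updates { await refresh() }
    }
}

struct PaywallSheet: View {
    var body: some View {
        SubscriptionStoreView(groupID: "21500000")  // placeholder group ID
    }
}
```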
Shipping this faster with Soarias
Soarias scaffolds the full SwiftData model layer from your app description, generates a correctly populated PrivacyInfo.xcprivacy with the right reason codes for UserDefaults access, configures fastlane match for code signing, and wires up App Store Connect submission — skipping the two-to-three hour provisioning setup that stalls most intermediate projects. It also writes the StoreKit configuration file and the SubscriptionStoreView paywall boilerplate based on your chosen monetization model, so your subscription gate is ready to test on day one.
For an intermediate app like this, most developers burn two to three days on scaffolding, signing, and Privacy Manifest research before they write a single line of stroke logic. Soarias compresses that to under an hour — leaving the full week for what actually differentiates your app: the SRS algorithm, stroke evaluation quality, and kanji content curation.
FAQ
Do I need a paid Apple Developer account?
Yes. The free tier lets you sideload to your own device only. TestFlight distribution and App Store submission both require the $99/year Apple Developer Program membership.
How do I submit this to the App Store?
Archive the app in Xcode (Product › Archive) and upload via the Organizer. In App Store Connect, complete the metadata — screenshots for every required device size, a description, age rating, privacy nutrition labels, and your subscription IAP — then submit for review. First-time submissions typically take 24–48 hours.
Where do I get kanji stroke order data?
KanjiVG (kanjivg.github.io) is the most complete open dataset, with SVG stroke paths for roughly 6,000 kanji under CC BY-SA 3.0. Parse the <path d="…"> attributes at build time into a JSON bundle, then convert to CGPath at runtime for Canvas rendering. Include the required attribution notice in your app's About or Settings screen — the license is not optional.
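The build-time extraction can be a small script. A sketch in Swift using plain string scanning rather than a full XML parser — fine for KanjiVG's regular output, but switch to XMLParser if you hit edge cases:

```swift
import Foundation

// Pull every `d="…"` attribute out of an SVG string, in document order.
// The leading space in `" d=\""` avoids matching `id="…"` attributes.
func strokePaths(fromSVG svg: String) -> [String] {
    var paths: [String] = []
    var rest = Substring(svg)
    while let pathTag = rest.range(of: "<path") {
        rest = rest[pathTag.upperBound...]
        guard let dStart = rest.range(of: " d=\""),
              let dEnd = rest[dStart.upperBound...].range(of: "\"") else { break }
        paths.append(String(rest[dStart.upperBound..<dEnd.lowerBound]))
        rest = rest[dEnd.upperBound...]
    }
    return paths
}
```

Run this over each KanjiVG file in a build phase, serialize the results to kanji_strokes.json, and the app never needs to parse SVG at runtime.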
Last reviewed: 2026-05-12 by the Soarias team.