How to build a photo editing extension in SwiftUI
A Photo Editing Extension requires a UIViewController subclass that conforms to
PHContentEditingController. Embed a SwiftUI view via
UIHostingController, apply your edits with CIFilter,
then write a PHContentEditingOutput with serialised
PHAdjustmentData so Photos can revert the edit later.
// PhotoEditingViewController.swift — minimum viable shell
import UIKit
import Photos
import PhotosUI
final class PhotoEditingViewController: UIViewController, PHContentEditingController {
var shouldShowCancelConfirmation: Bool { true }
// Required by PHContentEditingController: tell Photos whether we can
// reinterpret prior adjustment data (false here; the full version below does more)
func canHandle(_ adjustmentData: PHAdjustmentData) -> Bool { false }
private var input: PHContentEditingInput?
func startContentEditing(with input: PHContentEditingInput,
placeholderImage: UIImage) {
self.input = input
// load your SwiftUI editor here
}
func finishContentEditing(completionHandler: @escaping (PHContentEditingOutput?) -> Void) {
guard let input else { completionHandler(nil); return }
let output = PHContentEditingOutput(contentEditingInput: input)
// write edited JPEG + adjustment data, then:
completionHandler(output)
}
func cancelContentEditing() { }
}
Full implementation
The extension is a separate Xcode target. The view controller conforms to
PHContentEditingController and hosts a SwiftUI
FilterEditorView via
UIHostingController. An
@Observable model owns the filter intensity slider and
the original CIImage. When the user taps Done in Photos,
finishContentEditing encodes parameters into
PHAdjustmentData and writes a JPEG to the output URL.
// MARK: - FilterModel.swift (shared between VC and SwiftUI view)
import Foundation
import CoreImage
import CoreImage.CIFilterBuiltins // needed for the CIFilter.vignette() builder
import UIKit
import Observation
@Observable
final class FilterModel {
var intensity: Double = 0.5 // drives the slider
var preview: UIImage? // live preview shown in editor
var sourceImage: CIImage?
private let context = CIContext()
func applyVignette() {
guard let source = sourceImage else { return }
let filter = CIFilter.vignette()
filter.inputImage = source
filter.intensity = Float(intensity * 2) // 0-2 range
filter.radius = 2.0
guard let output = filter.outputImage,
let cgImage = context.createCGImage(output, from: source.extent)
else { return }
preview = UIImage(cgImage: cgImage)
}
func renderedJPEG() -> Data? {
guard let source = sourceImage else { return nil }
let filter = CIFilter.vignette()
filter.inputImage = source
filter.intensity = Float(intensity * 2)
filter.radius = 2.0
guard let output = filter.outputImage,
let cgImage = context.createCGImage(output, from: source.extent)
else { return nil }
return UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.92)
}
}
// MARK: - FilterEditorView.swift
import SwiftUI
struct FilterEditorView: View {
@Bindable var model: FilterModel
var body: some View {
VStack(spacing: 24) {
Group {
if let preview = model.preview {
Image(uiImage: preview)
.resizable()
.scaledToFit()
.clipShape(RoundedRectangle(cornerRadius: 12))
} else {
RoundedRectangle(cornerRadius: 12)
.fill(Color.secondary.opacity(0.2))
.overlay { ProgressView() }
}
}
.frame(maxHeight: 320)
.padding(.horizontal)
VStack(alignment: .leading, spacing: 8) {
Label("Vignette intensity", systemImage: "circle.dashed")
.font(.subheadline.weight(.medium))
Slider(value: $model.intensity, in: 0...1)
.tint(.primary)
.accessibilityLabel("Vignette intensity")
.accessibilityValue(String(format: "%.0f%%", model.intensity * 100))
.onChange(of: model.intensity) { model.applyVignette() }
}
.padding(.horizontal)
Spacer()
}
.padding(.top, 16)
.background(Color(.systemBackground))
}
}
// MARK: - PhotoEditingViewController.swift
import UIKit
import Photos
import PhotosUI
import SwiftUI
final class PhotoEditingViewController: UIViewController, PHContentEditingController {
// MARK: PHContentEditingController
var shouldShowCancelConfirmation: Bool { model.intensity != 0.5 }
// Required by the protocol: returning true for our own format identifier
// lets Photos hand back prior adjustment data so the edit can be resumed
func canHandle(_ adjustmentData: PHAdjustmentData) -> Bool {
adjustmentData.formatIdentifier == adjustmentFormatID
}
private let model = FilterModel()
private var input: PHContentEditingInput?
private var hostingChild: UIHostingController<FilterEditorView>?
// MARK: Lifecycle
override func viewDidLoad() {
super.viewDidLoad()
view.backgroundColor = .systemBackground
embedSwiftUI()
}
private func embedSwiftUI() {
let hosting = UIHostingController(rootView: FilterEditorView(model: model))
addChild(hosting)
hosting.view.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(hosting.view)
NSLayoutConstraint.activate([
hosting.view.topAnchor.constraint(equalTo: view.topAnchor),
hosting.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
hosting.view.trailingAnchor.constraint(equalTo: view.trailingAnchor),
hosting.view.bottomAnchor.constraint(equalTo: view.bottomAnchor),
])
hosting.didMove(toParent: self)
hostingChild = hosting
}
// MARK: PHContentEditingController — called by Photos
func startContentEditing(with input: PHContentEditingInput,
placeholderImage: UIImage) {
self.input = input
// Restore prior adjustment if the photo was edited before
if let data = input.adjustmentData,
data.formatIdentifier == adjustmentFormatID,
let params = try? JSONDecoder().decode(AdjustmentParams.self,
from: data.data) {
model.intensity = params.intensity
}
// Load full-size image asynchronously
if let url = input.fullSizeImageURL {
Task.detached(priority: .userInitiated) { [weak self] in
guard let self else { return }
var ci = CIImage(contentsOf: url)
// fullSizeImageOrientation is an Int32 matching CGImagePropertyOrientation raw values
let orientation = CGImagePropertyOrientation(
rawValue: UInt32(input.fullSizeImageOrientation)) ?? .up
ci = ci?.oriented(orientation)
await MainActor.run {
self.model.sourceImage = ci
self.model.applyVignette()
}
}
}
}
func finishContentEditing(completionHandler: @escaping (PHContentEditingOutput?) -> Void) {
guard let input else { completionHandler(nil); return }
Task.detached(priority: .userInitiated) { [weak self] in
guard let self, let jpeg = self.model.renderedJPEG() else {
await MainActor.run { completionHandler(nil) }
return
}
let output = PHContentEditingOutput(contentEditingInput: input)
// Write JPEG to the renderedContentURL
try? jpeg.write(to: output.renderedContentURL)
// Encode adjustment parameters so Photos can show a Revert option
let params = AdjustmentParams(intensity: self.model.intensity)
if let encoded = try? JSONEncoder().encode(params) {
output.adjustmentData = PHAdjustmentData(
formatIdentifier: self.adjustmentFormatID,
formatVersion: "1.0",
data: encoded
)
}
await MainActor.run { completionHandler(output) }
}
}
func cancelContentEditing() {
model.intensity = 0.5
}
// MARK: Helpers
private let adjustmentFormatID = "com.example.vignetteEditor"
}
// MARK: - AdjustmentParams.swift
struct AdjustmentParams: Codable {
var intensity: Double
}
// MARK: - Preview (editor UI only — extension itself isn't previewable)
#Preview {
let m = FilterModel()
m.sourceImage = CIImage(image: UIImage(systemName: "photo.fill")!)
m.applyVignette()
return FilterEditorView(model: m)
}
How it works
- Protocol conformance. PHContentEditingController is the bridge between the Photos app and your extension. Photos calls startContentEditing(with:placeholderImage:) with a PHContentEditingInput containing the full-size image URL and any prior PHAdjustmentData — you restore previous state from that data.
- SwiftUI embedding. Because the extension entry point must be a UIViewController, the embedSwiftUI() method wraps FilterEditorView in a UIHostingController and pins it edge-to-edge. The @Observable FilterModel is shared across the boundary — changes to the slider immediately re-render the preview without any binding glue.
- Live preview via CIFilter. Every onChange on the slider calls model.applyVignette(), which runs CIFilter.vignette() — the type-safe filter builder from CoreImage.CIFilterBuiltins (iOS 13+). The rendered CGImage is wrapped in UIImage and stored in model.preview, which SwiftUI picks up automatically.
- Writing output. finishContentEditing renders the full-size image, writes JPEG bytes to output.renderedContentURL (Photos requires JPEG for .image assets), and attaches PHAdjustmentData carrying JSON-encoded parameters under a reverse-DNS format identifier. Without this data, Photos cannot offer a Revert to Original option.
- Revert support. On re-open, startContentEditing checks whether input.adjustmentData's formatIdentifier matches your own format identifier and, if so, decodes the saved AdjustmentParams to pre-populate the slider — giving the user a round-trip editing experience.
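The revert round-trip in the last two points is plain Codable JSON and needs nothing from the Photos framework; a minimal standalone sketch of the encode/decode cycle:

```swift
import Foundation

struct AdjustmentParams: Codable {
    var intensity: Double
}

// Encode on Done — finishContentEditing stores these bytes inside PHAdjustmentData...
let saved = try JSONEncoder().encode(AdjustmentParams(intensity: 0.8))

// ...and decode on re-open — startContentEditing reads them back to restore the slider.
let restored = try JSONDecoder().decode(AdjustmentParams.self, from: saved)
print(restored.intensity) // 0.8
```

Because the payload is opaque bytes to Photos, you can version it freely — bump formatVersion when the schema changes.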
Variants
Multiple filters with a picker
enum PhotoFilter: String, CaseIterable, Identifiable {
case vignette = "Vignette"
case noir = "Noir"
case chrome = "Chrome"
var id: String { rawValue }
}
// In FilterEditorView:
Picker("Filter", selection: $model.selectedFilter) {
ForEach(PhotoFilter.allCases) { f in
Text(f.rawValue).tag(f)
}
}
.pickerStyle(.segmented)
.accessibilityLabel("Choose filter style")
.onChange(of: model.selectedFilter) { model.applyFilter() }
// In FilterModel.applyFilter():
switch selectedFilter {
case .vignette:
let f = CIFilter.vignette()
f.inputImage = sourceImage
f.intensity = Float(intensity * 2)
outputImage = f.outputImage
case .noir:
let f = CIFilter.photoEffectNoir()
f.inputImage = sourceImage
outputImage = f.outputImage
case .chrome:
let f = CIFilter.photoEffectChrome()
f.inputImage = sourceImage
outputImage = f.outputImage
}
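The fragment above elides rendering and assumes FilterModel gains a selectedFilter property. A fuller sketch of applyFilter(), following the same render pattern as applyVignette() (property names are assumptions, not from the original model):

```swift
// Inside FilterModel, assuming `var selectedFilter: PhotoFilter = .vignette`
func applyFilter() {
    guard let source = sourceImage else { return }
    let output: CIImage?
    switch selectedFilter {
    case .vignette:
        let f = CIFilter.vignette()
        f.inputImage = source
        f.intensity = Float(intensity * 2)
        output = f.outputImage
    case .noir:
        let f = CIFilter.photoEffectNoir()
        f.inputImage = source
        output = f.outputImage
    case .chrome:
        let f = CIFilter.photoEffectChrome()
        f.inputImage = source
        output = f.outputImage
    }
    // Render the filtered image into the live preview, as applyVignette() does
    guard let result = output,
          let cgImage = context.createCGImage(result, from: source.extent)
    else { return }
    preview = UIImage(cgImage: cgImage)
}
```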
Video asset support
When input.mediaType == .video, the editing input provides an
audiovisualAsset (an AVAsset)
instead of an image URL. Export with
AVAssetExportSession targeting
AVAssetExportPresetHighestQuality and write the result to
output.renderedContentURL — the URL is read-only, and Photos supplies
one with the appropriate extension for the asset type.
Declare video support in your extension's Info.plist under
PHSupportedMediaTypes with value Video.
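A sketch of that video path — exportVideo is a hypothetical helper you would call from finishContentEditing when the media type is video:

```swift
import AVFoundation
import Photos

// Hypothetical helper: export the (possibly composed) asset to the output URL
func exportVideo(from input: PHContentEditingInput,
                 to output: PHContentEditingOutput,
                 completion: @escaping (Bool) -> Void) {
    guard let asset = input.audiovisualAsset,
          let session = AVAssetExportSession(
              asset: asset,
              presetName: AVAssetExportPresetHighestQuality)
    else { completion(false); return }

    session.outputURL = output.renderedContentURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}
```

To actually apply an effect to the video you would attach an AVVideoComposition (for example via AVMutableVideoComposition with a CIFilter handler) to the session before exporting.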
Common pitfalls
- Missing NSPhotoLibraryUsageDescription in the host app. The extension itself doesn't request library access — the containing app must declare the key in its Info.plist or Photos will silently refuse to invoke your extension.
- Forgetting that renderedContentURL must be JPEG for photos. Photos rejects outputs that aren't JPEG for image assets. PNG will compile and run, but the edit will fail at commit time with a cryptic "The operation couldn't be completed" error. Always use jpegData(compressionQuality:).
- Blocking the main thread during rendering. CIContext.createCGImage is CPU-intensive on high-resolution images. Always dispatch it via Task.detached(priority: .userInitiated) and return to the MainActor only to update the UI — otherwise Photos' Done button will freeze.
- Not handling image orientation. input.fullSizeImageOrientation is an Int32 matching CGImagePropertyOrientation raw values. Failing to apply it via CIImage.oriented(_:) produces rotated output on landscape or selfie photos.
- SwiftUI previews don't run the extension lifecycle. Use #Preview only for the FilterEditorView in isolation. The full extension flow must be tested from the Photos app on a real device or simulator.
Prompt this with Claude Code
When using Soarias or Claude Code directly to implement this:
Implement a photo editing extension in SwiftUI for iOS 17+. Use Photo Editing Extension (PHContentEditingController), PHAdjustmentData, PHContentEditingInput/Output, and CIFilter. Embed the SwiftUI UI in a UIHostingController inside the view controller. Support revert-to-original by encoding/decoding adjustment params as JSON. Make it accessible (VoiceOver labels on all sliders and pickers). Add a #Preview with realistic sample data for the SwiftUI editor view.
In Soarias' Build phase, paste this prompt into the editor after scaffolding your extension target — Claude Code will generate all four files (model, SwiftUI view, view controller, adjustment params) wired together and ready for a first run in the Photos simulator.
FAQ
Does this work on iOS 16?
The PHContentEditingController protocol has been available since iOS 8, so the core mechanism works back to iOS 16.
However, this guide uses the @Observable macro (iOS 17+) and the
#Preview macro (Xcode 15 / iOS 17+). To target iOS 16,
conform the model to the ObservableObject protocol with
@Published properties (observing it via
@StateObject or @ObservedObject), and use a
PreviewProvider instead of
#Preview.
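A minimal sketch of that iOS 16 fallback, keeping the property names from the FilterModel above:

```swift
import UIKit
import Combine

// iOS 16 fallback: ObservableObject + @Published instead of the @Observable macro
final class FilterModel: ObservableObject {
    @Published var intensity: Double = 0.5
    @Published var preview: UIImage?
    // applyVignette() / renderedJPEG() can stay unchanged from the iOS 17 version
}

// In FilterEditorView, replace `@Bindable var model` with:
// @ObservedObject var model: FilterModel
```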
Can I access the full Photos library from inside the extension?
No. A Photo Editing Extension is sandboxed: Photos passes you exactly one
PHContentEditingInput and you write back exactly one
PHContentEditingOutput. You cannot browse or fetch other assets
from within the extension process. If you need broader access, ship a separate in-app editor that uses
PHPickerViewController or
PhotosUI in the main app target.
What is the UIKit equivalent?
The UIKit approach is identical at the protocol level — you still subclass
UIViewController and conform to
PHContentEditingController. The only difference is your UI:
instead of UIHostingController, you lay out
UIImageView and
UISlider directly using Auto Layout.
The rendering and output writing code is identical between UIKit and SwiftUI-hosted implementations.
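For illustration, a minimal UIKit layout sketch of the same editor chrome — the type and wiring names here are hypothetical, not part of the SwiftUI implementation above:

```swift
import UIKit

// Hypothetical UIKit counterpart to FilterEditorView: an image preview over a slider
final class EditorContentView: UIView {
    let imageView = UIImageView()
    let slider = UISlider()

    override init(frame: CGRect) {
        super.init(frame: frame)
        imageView.contentMode = .scaleAspectFit
        slider.minimumValue = 0
        slider.maximumValue = 1
        slider.accessibilityLabel = "Vignette intensity"

        let stack = UIStackView(arrangedSubviews: [imageView, slider])
        stack.axis = .vertical
        stack.spacing = 24
        stack.translatesAutoresizingMaskIntoConstraints = false
        addSubview(stack)
        NSLayoutConstraint.activate([
            stack.topAnchor.constraint(equalTo: safeAreaLayoutGuide.topAnchor, constant: 16),
            stack.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            stack.trailingAnchor.constraint(equalTo: trailingAnchor, constant: -16),
        ])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```

The view controller would observe slider value changes (e.g. via addTarget(_:action:for:)) and call the same FilterModel methods shown earlier.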
Last reviewed: 2026-05-12 by the Soarias team.