SwiftUI: How to Build a Photo Editing Extension (iOS 17+, 2026)


iOS 17+ · Xcode 16+ · Advanced APIs: Photo Editing Extension · Updated: May 12, 2026
TL;DR

A Photo Editing Extension requires a UIViewController subclass that conforms to PHContentEditingController. Embed a SwiftUI view via UIHostingController, apply your edits with CIFilter, then write a PHContentEditingOutput with serialised PHAdjustmentData so Photos can revert the edit later.

// PhotoEditingViewController.swift — minimum viable shell
import UIKit
import Photos
import PhotosUI

final class PhotoEditingViewController: UIViewController, PHContentEditingController {

    var shouldShowCancelConfirmation: Bool { true }
    private var input: PHContentEditingInput?

    func startContentEditing(with input: PHContentEditingInput,
                             placeholderImage: UIImage) {
        self.input = input
        // load your SwiftUI editor here
    }

    func finishContentEditing(completionHandler: @escaping (PHContentEditingOutput?) -> Void) {
        guard let input else { completionHandler(nil); return }
        let output = PHContentEditingOutput(contentEditingInput: input)
        // write edited JPEG + adjustment data, then:
        completionHandler(output)
    }

    func cancelContentEditing() { }
}

Full implementation

The extension is a separate Xcode target. The view controller conforms to PHContentEditingController and hosts a SwiftUI FilterEditorView via UIHostingController. An @Observable model owns the filter intensity slider and the original CIImage. When the user taps Done in Photos, finishContentEditing encodes parameters into PHAdjustmentData and writes a JPEG to the output URL.

// MARK: - FilterModel.swift  (shared between VC and SwiftUI view)
import Foundation
import CoreImage
import CoreImage.CIFilterBuiltins   // needed for CIFilter.vignette() and friends
import UIKit
import Observation

@Observable
final class FilterModel {
    var intensity: Double = 0.5          // drives the slider
    var preview: UIImage?                // live preview shown in editor
    var sourceImage: CIImage?

    private let context = CIContext()

    /// Runs the vignette filter over the source image; returns nil if rendering fails.
    private func renderVignette() -> CGImage? {
        guard let source = sourceImage else { return nil }
        let filter = CIFilter.vignette()
        filter.inputImage = source
        filter.intensity = Float(intensity * 2)   // slider 0-1 mapped to 0-2
        filter.radius    = 2.0
        guard let output = filter.outputImage else { return nil }
        return context.createCGImage(output, from: source.extent)
    }

    func applyVignette() {
        guard let cgImage = renderVignette() else { return }
        preview = UIImage(cgImage: cgImage)
    }

    func renderedJPEG() -> Data? {
        guard let cgImage = renderVignette() else { return nil }
        return UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.92)
    }
}

// MARK: - FilterEditorView.swift
import SwiftUI

struct FilterEditorView: View {
    @Bindable var model: FilterModel

    var body: some View {
        VStack(spacing: 24) {
            Group {
                if let preview = model.preview {
                    Image(uiImage: preview)
                        .resizable()
                        .scaledToFit()
                        .clipShape(RoundedRectangle(cornerRadius: 12))
                } else {
                    RoundedRectangle(cornerRadius: 12)
                        .fill(Color.secondary.opacity(0.2))
                        .overlay { ProgressView() }
                }
            }
            .frame(maxHeight: 320)
            .padding(.horizontal)

            VStack(alignment: .leading, spacing: 8) {
                Label("Vignette intensity", systemImage: "circle.dashed")
                    .font(.subheadline.weight(.medium))
                Slider(value: $model.intensity, in: 0...1)
                    .tint(.primary)
                    .accessibilityLabel("Vignette intensity")
                    .accessibilityValue(String(format: "%.0f%%", model.intensity * 100))
                    .onChange(of: model.intensity) { model.applyVignette() }
            }
            .padding(.horizontal)

            Spacer()
        }
        .padding(.top, 16)
        .background(Color(.systemBackground))
    }
}

// MARK: - PhotoEditingViewController.swift
import UIKit
import Photos
import PhotosUI
import SwiftUI
import CoreImage   // CIImage is used in startContentEditing

final class PhotoEditingViewController: UIViewController, PHContentEditingController {

    // MARK: PHContentEditingController
    var shouldShowCancelConfirmation: Bool { model.intensity != 0.5 }

    private let model  = FilterModel()
    private var input: PHContentEditingInput?
    private var hostingChild: UIHostingController<FilterEditorView>?

    // MARK: Lifecycle
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        embedSwiftUI()
    }

    private func embedSwiftUI() {
        let hosting = UIHostingController(rootView: FilterEditorView(model: model))
        addChild(hosting)
        hosting.view.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(hosting.view)
        NSLayoutConstraint.activate([
            hosting.view.topAnchor.constraint(equalTo: view.topAnchor),
            hosting.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            hosting.view.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            hosting.view.bottomAnchor.constraint(equalTo: view.bottomAnchor),
        ])
        hosting.didMove(toParent: self)
        hostingChild = hosting
    }

    // MARK: PHContentEditingController — called by Photos
    func startContentEditing(with input: PHContentEditingInput,
                             placeholderImage: UIImage) {
        self.input = input

        // Restore prior adjustment if the photo was edited before
        if let data = input.adjustmentData,
           data.formatIdentifier == adjustmentFormatID,
           let params = try? JSONDecoder().decode(AdjustmentParams.self,
                                                  from: data.data) {
            model.intensity = params.intensity
        }

        // Load full-size image asynchronously
        if let url = input.fullSizeImageURL {
            Task.detached(priority: .userInitiated) { [weak self] in
                guard let self else { return }
                var ci = CIImage(contentsOf: url)
                // fullSizeImageOrientation is a raw Int32, not an enum
                let orientation = CGImagePropertyOrientation(
                    rawValue: UInt32(input.fullSizeImageOrientation)) ?? .up
                ci = ci?.oriented(orientation)
                await MainActor.run {
                    self.model.sourceImage = ci
                    self.model.applyVignette()
                }
            }
        }
    }

    func finishContentEditing(completionHandler: @escaping (PHContentEditingOutput?) -> Void) {
        guard let input else { completionHandler(nil); return }

        Task.detached(priority: .userInitiated) { [weak self] in
            guard let self, let jpeg = self.model.renderedJPEG() else {
                await MainActor.run { completionHandler(nil) }
                return
            }

            let output = PHContentEditingOutput(contentEditingInput: input)

            // Write the JPEG to renderedContentURL; give up if the write fails
            do {
                try jpeg.write(to: output.renderedContentURL)
            } catch {
                await MainActor.run { completionHandler(nil) }
                return
            }

            // Encode adjustment parameters so Photos can show a Revert option
            let params = AdjustmentParams(intensity: self.model.intensity)
            if let encoded = try? JSONEncoder().encode(params) {
                output.adjustmentData = PHAdjustmentData(
                    formatIdentifier: self.adjustmentFormatID,
                    formatVersion: "1.0",
                    data: encoded
                )
            }

            await MainActor.run { completionHandler(output) }
        }
    }

    func cancelContentEditing() {
        model.intensity = 0.5
    }

    // MARK: Helpers
    private let adjustmentFormatID = "com.example.vignetteEditor"
}

// MARK: - AdjustmentParams.swift
struct AdjustmentParams: Codable {
    var intensity: Double
}

// MARK: - Preview (editor UI only — extension itself isn't previewable)
#Preview {
    let m = FilterModel()
    // Symbol images have no bitmap backing, so use a solid-color CIImage for the demo
    m.sourceImage = CIImage(color: .gray)
        .cropped(to: CGRect(x: 0, y: 0, width: 300, height: 200))
    m.applyVignette()
    return FilterEditorView(model: m)
}

How it works

  1. Protocol conformance. PHContentEditingController is the bridge between the Photos app and your extension. Photos calls startContentEditing(with:placeholderImage:) with a PHContentEditingInput containing the full-size image URL and any prior PHAdjustmentData — you restore previous state from that data.
  2. SwiftUI embedding. Because the extension entry point must be a UIViewController, the embedSwiftUI() method wraps FilterEditorView in a UIHostingController and pins it edge-to-edge. The @Observable FilterModel is shared across the boundary — changes to the slider immediately re-render the preview without any binding glue.
  3. Live preview via CIFilter. Every onChange on the slider calls model.applyVignette(), which runs CIFilter.vignette() — a named Core Image filter available since iOS 7. The rendered CGImage is wrapped in UIImage and stored in model.preview, which SwiftUI picks up automatically.
  4. Writing output. finishContentEditing renders the full-size image, writes JPEG bytes to output.renderedContentURL (Photos requires JPEG for .image assets), and attaches PHAdjustmentData carrying JSON-encoded parameters under a reverse-DNS format identifier. Without this data, Photos cannot offer a Revert to Original option.
  5. Revert support. On re-open, startContentEditing checks whether input.adjustmentData.formatIdentifier matches your bundle's identifier and, if so, decodes the saved AdjustmentParams to pre-populate the slider — giving the user a round-trip editing experience.
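One piece the Swift files can't express is the extension target's Info.plist wiring. A minimal sketch for a code-only setup follows; note that the Xcode template instead sets NSExtensionMainStoryboard, and the principal-class string assumes the PhotoEditingViewController name used above lives in the extension's module.

```xml
<key>NSExtension</key>
<dict>
    <!-- Marks this target as a Photo Editing Extension -->
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.photo-editing</string>
    <!-- Code-only entry point; the template uses NSExtensionMainStoryboard instead -->
    <key>NSExtensionPrincipalClass</key>
    <string>$(PRODUCT_MODULE_NAME).PhotoEditingViewController</string>
    <key>NSExtensionAttributes</key>
    <dict>
        <!-- Add "Video" here to opt into video editing -->
        <key>PHSupportedMediaTypes</key>
        <array>
            <string>Image</string>
        </array>
    </dict>
</dict>
```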

Variants

Multiple filters with a picker

enum PhotoFilter: String, CaseIterable, Identifiable {
    case vignette = "Vignette"
    case noir     = "Noir"
    case chrome   = "Chrome"
    var id: String { rawValue }
}

// In FilterEditorView:
Picker("Filter", selection: $model.selectedFilter) {
    ForEach(PhotoFilter.allCases) { f in
        Text(f.rawValue).tag(f)
    }
}
.pickerStyle(.segmented)
.accessibilityLabel("Choose filter style")
.onChange(of: model.selectedFilter) { model.applyFilter() }

// In FilterModel — this variant assumes a new stored property and method:
var selectedFilter: PhotoFilter = .vignette

func applyFilter() {
    guard let source = sourceImage else { return }
    let output: CIImage?
    switch selectedFilter {
    case .vignette:
        let f = CIFilter.vignette()
        f.inputImage = source
        f.intensity  = Float(intensity * 2)
        f.radius     = 2.0
        output = f.outputImage
    case .noir:
        let f = CIFilter.photoEffectNoir()
        f.inputImage = source
        output = f.outputImage
    case .chrome:
        let f = CIFilter.photoEffectChrome()
        f.inputImage = source
        output = f.outputImage
    }
    guard let output,
          let cgImage = context.createCGImage(output, from: source.extent)
    else { return }
    preview = UIImage(cgImage: cgImage)
}

Video asset support

When input.mediaType == .video, the editing input provides an audiovisualAsset (an AVAsset) instead of an image URL. Apply your edit as an AVVideoComposition, export with AVAssetExportSession targeting AVAssetExportPresetHighestQuality, and write the movie to output.renderedContentURL (Photos supplies that URL; you don't choose the file name or extension yourself). Declare video support by adding Video to the PHSupportedMediaTypes array in the extension's Info.plist.
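As a sketch of that flow (the exportVideo helper name is mine, and it assumes a videoComposition was built earlier, e.g. via AVVideoComposition(asset:applyingCIFiltersWithHandler:)):

```swift
import AVFoundation
import Photos

// Hypothetical helper: export an edited video into the extension's output.
func exportVideo(asset: AVAsset,
                 videoComposition: AVVideoComposition?,
                 into output: PHContentEditingOutput,
                 completion: @escaping (Bool) -> Void) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality)
    else { completion(false); return }

    session.outputURL = output.renderedContentURL   // URL provided by Photos
    session.outputFileType = .mov                   // QuickTime container
    session.videoComposition = videoComposition     // carries the CIFilter edit, if any
    session.exportAsynchronously {
        completion(session.status == .completed)
    }
}
```

As with the image path, attach PHAdjustmentData to the output before calling the completion handler so the edit remains revertible.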

Prompt this with Claude Code

When using Soarias or Claude Code directly to implement this:

Implement a photo editing extension in SwiftUI for iOS 17+.
Use Photo Editing Extension (PHContentEditingController),
PHAdjustmentData, PHContentEditingInput/Output, and CIFilter.
Embed the SwiftUI UI in a UIHostingController inside the view controller.
Support revert-to-original by encoding/decoding adjustment params as JSON.
Make it accessible (VoiceOver labels on all sliders and pickers).
Add a #Preview with realistic sample data for the SwiftUI editor view.

In Soarias' Build phase, paste this prompt into the editor after scaffolding your extension target — Claude Code will generate all four files (model, SwiftUI view, view controller, adjustment params) wired together and ready for a first run in the Photos simulator.

FAQ

Does this work on iOS 16?

The PHContentEditingController protocol has been available since iOS 8, so the core mechanism works on iOS 16 and earlier. However, this guide uses the @Observable macro (iOS 17+) and the #Preview macro (Xcode 15+). To target iOS 16, make FilterModel conform to the ObservableObject protocol with @Published properties, hold it in the view with @ObservedObject instead of @Bindable, and use PreviewProvider instead of #Preview.
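A minimal sketch of that back-port, assuming the same slider-driven model (the Legacy-suffixed names are placeholders, not part of the main listing):

```swift
import SwiftUI
import UIKit
import CoreImage

// iOS 16 fallback: ObservableObject + @Published instead of @Observable
final class FilterModelLegacy: ObservableObject {
    @Published var intensity: Double = 0.5
    @Published var preview: UIImage?
    var sourceImage: CIImage?
}

struct FilterEditorViewLegacy: View {
    // @ObservedObject replaces @Bindable on the iOS 16 side
    @ObservedObject var model: FilterModelLegacy

    var body: some View {
        VStack {
            if let preview = model.preview {
                Image(uiImage: preview).resizable().scaledToFit()
            }
            Slider(value: $model.intensity, in: 0...1)
        }
    }
}
```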

Can I access the full Photos library from inside the extension?

No. A Photo Editing Extension is sandboxed: Photos passes you exactly one PHContentEditingInput and you write back exactly one PHContentEditingOutput. You cannot browse or fetch other assets from within the extension process. If you need broader access, ship a separate in-app editor that uses PHPickerViewController or PhotosUI in the main app target.
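For the in-app route, a sketch using PhotosUI's SwiftUI picker (PhotosPicker, iOS 16+; the view name here is illustrative):

```swift
import SwiftUI
import PhotosUI

// Main-app sketch: the full library is reachable here, unlike in the extension
struct LibraryPickerView: View {
    @State private var selection: PhotosPickerItem?
    @State private var image: UIImage?

    var body: some View {
        VStack {
            if let image {
                Image(uiImage: image).resizable().scaledToFit()
            }
            PhotosPicker("Choose a photo", selection: $selection, matching: .images)
        }
        .onChange(of: selection) {
            Task {
                // loadTransferable pulls the picked asset's bytes asynchronously
                if let data = try? await selection?.loadTransferable(type: Data.self) {
                    image = UIImage(data: data)
                }
            }
        }
    }
}
```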

What is the UIKit equivalent?

The UIKit approach is identical at the protocol level — you still subclass UIViewController and conform to PHContentEditingController. The only difference is your UI: instead of UIHostingController, you lay out UIImageView and UISlider directly using Auto Layout. The rendering and output writing code is identical between UIKit and SwiftUI-hosted implementations.
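For orientation, a sketch of that UIKit layer (the view name is illustrative; FilterModel's rendering code would be reused unchanged):

```swift
import UIKit

// UIKit sketch: the same editor UI without UIHostingController
final class LegacyEditorView: UIView {
    let imageView = UIImageView()   // shows the filtered preview
    let slider = UISlider()         // drives vignette intensity

    override init(frame: CGRect) {
        super.init(frame: frame)
        imageView.contentMode = .scaleAspectFit
        slider.minimumValue = 0
        slider.maximumValue = 1
        slider.value = 0.5

        let stack = UIStackView(arrangedSubviews: [imageView, slider])
        stack.axis = .vertical
        stack.spacing = 24
        stack.translatesAutoresizingMaskIntoConstraints = false
        addSubview(stack)
        NSLayoutConstraint.activate([
            stack.topAnchor.constraint(equalTo: topAnchor, constant: 16),
            stack.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            stack.trailingAnchor.constraint(equalTo: trailingAnchor, constant: -16),
            stack.bottomAnchor.constraint(equalTo: bottomAnchor),
        ])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```

The view controller would wire slider.addTarget to the model exactly where the SwiftUI version uses onChange.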

Last reviewed: 2026-05-12 by the Soarias team.
