How to Build a Photo Editor App in SwiftUI

A photo editor app lets users import images from their library, apply preset artistic filters, and fine-tune brightness, contrast, saturation, and sharpness before saving back to Photos. This guide is for Swift developers who want to ship a polished, App Store–ready image editing tool using PhotosUI, CoreImage, and SwiftData — no third-party dependencies required.

iOS 17+ · Xcode 16+ · SwiftUI · Complexity: Intermediate · Estimated time: 1 week · Updated: May 12, 2026

Prerequisites

You'll need Xcode 16 or later with the iOS 17 SDK, an iOS 17 device or Simulator, and working knowledge of Swift and SwiftUI (state, bindings, NavigationStack). A paid Apple Developer account is only required for TestFlight and App Store distribution — see the FAQ.

Architecture overview

The app is split into three layers. The persistence layer uses SwiftData's @Model macro to store edit sessions, recording which PHAsset was edited (by its localIdentifier) alongside the filter name and per-slider values. The filter pipeline lives in a plain Swift struct that chains CIFilter objects — a preset artistic filter, then CIColorControls, then CISharpenLuminance — into a single GPU pass by deferring GPU execution until CIContext.createCGImage is called. The view layer is driven by an @Observable view model that loads the full-resolution UIImage from PHImageManager on a background task and triggers re-renders whenever adjustment sliders change.

PhotoEditorApp/
├── PhotoEditorApp.swift           # @main, .modelContainer(for: EditSession.self)
├── Models/
│   └── EditSession.swift          # @Model — assetID, filterName, b/c/s/sharpness
├── ViewModels/
│   └── EditorViewModel.swift      # @Observable — loads asset, runs pipeline
├── Views/
│   ├── ContentView.swift          # NavigationStack + LazyVGrid of sessions
│   ├── EditorView.swift           # Full-screen canvas + toolbar Save button
│   ├── FilterStripView.swift      # Horizontal scroll of preset filter chips
│   └── AdjustmentPanel.swift      # Sliders for brightness/contrast/saturation/sharpness
├── Filters/
│   └── CIFilterPipeline.swift     # Pure func: UIImage in → UIImage out
└── PrivacyInfo.xcprivacy          # Required for App Store — declares Photos access

Step-by-step

1. Set up the Xcode project

Create a new SwiftUI App in Xcode 16. No capability toggle is needed for photo library access on iOS — what matters are two Info.plist keys: NSPhotoLibraryUsageDescription (for read access) and NSPhotoLibraryAddUsageDescription (for saving back). Missing either key causes a silent authorization failure at runtime and an App Store rejection at upload.

// PhotoEditorApp.swift
import SwiftUI
import SwiftData

@main
struct PhotoEditorApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .modelContainer(for: EditSession.self)
    }
}

// Info.plist — add both keys (Xcode target editor → Info tab):
// Key: NSPhotoLibraryUsageDescription
// Value: "Choose a photo to edit."
//
// Key: NSPhotoLibraryAddUsageDescription
// Value: "Save your edited photo back to your library."

2. Define the data model

Store the PHAsset localIdentifier rather than raw image data — this keeps the SwiftData store lean and lets you re-fetch at full resolution on demand. Each EditSession records which filter and which slider values were applied so users can revisit and tweak past edits.

// Models/EditSession.swift
import SwiftData
import Foundation

@Model
final class EditSession {
    var assetLocalIdentifier: String
    var filterName: String     // CIFilter name, or "none" for no preset
    var brightness: Double     // -1.0 … 1.0  (CIColorControls default: 0)
    var contrast: Double       // 0.25 … 4.0  (CIColorControls default: 1)
    var saturation: Double     // 0.0 … 2.0   (CIColorControls default: 1)
    var sharpness: Double      // 0.0 … 2.0   (CISharpenLuminance default: 0.4)
    var createdAt: Date

    init(
        assetLocalIdentifier: String,
        filterName: String = "none",
        brightness: Double = 0,
        contrast: Double = 1,
        saturation: Double = 1,
        sharpness: Double = 0
    ) {
        self.assetLocalIdentifier = assetLocalIdentifier
        self.filterName = filterName
        self.brightness = brightness
        self.contrast = contrast
        self.saturation = saturation
        self.sharpness = sharpness
        self.createdAt = .now
    }
}

3. Build the core photo grid UI

The root view presents a LazyVGrid of past edit sessions and a PhotosPicker in the toolbar to start a new edit. NavigationStack with a .navigationDestination pushes to the editor — no deprecated NavigationLink(destination:isActive:) needed.

// Views/ContentView.swift
import SwiftUI
import SwiftData
import PhotosUI

struct ContentView: View {
    @Environment(\.modelContext) private var modelContext
    @Query(sort: \EditSession.createdAt, order: .reverse)
    private var sessions: [EditSession]

    @State private var pickerItem: PhotosPickerItem?

    var body: some View {
        NavigationStack {
            Group {
                if sessions.isEmpty {
                    ContentUnavailableView(
                        "No edits yet",
                        systemImage: "photo.on.rectangle.angled",
                        description: Text("Tap + to import a photo.")
                    )
                } else {
                    ScrollView {
                        LazyVGrid(
                            columns: [GridItem(.adaptive(minimum: 110))],
                            spacing: 4
                        ) {
                            ForEach(sessions) { session in
                                NavigationLink(value: session) {
                                    Rectangle()
                                        .fill(Color(.systemGray5))
                                        .aspectRatio(1, contentMode: .fit)
                                        .overlay(
                                            Text(session.filterName == "none"
                                                 ? "Original" : session.filterName)
                                                .font(.caption2)
                                                .foregroundStyle(.secondary)
                                        )
                                }
                            }
                        }
                        .padding(4)
                    }
                }
            }
            .navigationTitle("Photo Editor")
            .toolbar {
                ToolbarItem(placement: .primaryAction) {
                    PhotosPicker(
                        selection: $pickerItem,
                        matching: .images,
                        // Required — without photoLibrary: .shared(),
                        // item.itemIdentifier is always nil
                        photoLibrary: .shared()
                    ) {
                        Label("Import", systemImage: "plus")
                    }
                }
            }
            .navigationDestination(for: EditSession.self) { session in
                EditorView(session: session)
            }
            .onChange(of: pickerItem) { _, item in
                guard let item else { return }
                Task { await createSession(from: item) }
            }
        }
    }

    private func createSession(from item: PhotosPickerItem) async {
        // itemIdentifier maps to PHAsset.localIdentifier, but only when the
        // PhotosPicker was created with photoLibrary: .shared()
        guard let id = item.itemIdentifier else { return }
        let session = EditSession(assetLocalIdentifier: id)
        modelContext.insert(session)
        pickerItem = nil
    }
}

#Preview {
    ContentView()
        .modelContainer(for: EditSession.self, inMemory: true)
}

4. Load the full-resolution asset

Fetch the full-resolution UIImage from PHImageManager inside the @Observable view model. Note that fetching a PHAsset by identifier requires photo library read authorization — request it with PHPhotoLibrary.requestAuthorization(for: .readWrite) before the first fetch, or the fetch result will come back empty. Wrap the callback-based request API in withCheckedContinuation to bridge it cleanly into async/await. Create the CIContext once and reuse it — allocating a new context per render creates a new GPU command queue each time and is a common performance regression.

// ViewModels/EditorViewModel.swift
import SwiftUI
import Photos
import CoreImage
import CoreImage.CIFilterBuiltins

@Observable
final class EditorViewModel {
    var sourceImage: UIImage?
    var renderedImage: UIImage?
    var isRendering = false

    // Allocate once — reused across all renders
    private let ciContext = CIContext(options: [.useSoftwareRenderer: false])

    func loadAsset(localIdentifier: String) async {
        let result = PHAsset.fetchAssets(
            withLocalIdentifiers: [localIdentifier],
            options: nil
        )
        guard let asset = result.firstObject else { return }

        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat
        options.isNetworkAccessAllowed = true

        let image: UIImage? = await withCheckedContinuation { continuation in
            PHImageManager.default().requestImage(
                for: asset,
                targetSize: PHImageManagerMaximumSize,
                contentMode: .aspectFit,
                options: options
            ) { image, info in
                // Skip the degraded thumbnail delivered first
                let isDegraded = (info?[PHImageResultIsDegradedKey] as? Bool) ?? false
                if !isDegraded {
                    continuation.resume(returning: image)
                }
            }
        }
        sourceImage = image
    }

    func renderPreview(session: EditSession) async {
        guard let src = sourceImage else { return }
        isRendering = true
        // Copy plain values out of the SwiftData model — @Model classes are
        // not Sendable, so don't capture `session` in a detached task
        let filterName = session.filterName
        let brightness = session.brightness
        let contrast = session.contrast
        let saturation = session.saturation
        let sharpness = session.sharpness
        let ctx = ciContext  // CIContext is thread-safe; reuse it
        // Run the GPU-bound pipeline off the main thread
        let result = await Task.detached(priority: .userInitiated) {
            FilterPipeline.apply(
                to: src,
                presetFilterName: filterName,
                brightness: brightness,
                contrast: contrast,
                saturation: saturation,
                sharpness: sharpness,
                in: ctx
            )
        }.value
        renderedImage = result
        isRendering = false
    }
}
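Before the first fetch, request read authorization. A minimal sketch of a helper you could call from EditorView's .task before loadAsset — the function name and call site are assumptions, not part of the project above:

```swift
import Photos

/// Returns true when the app may read from the photo library.
/// Sketch only — wire this in wherever your loading flow starts.
func ensurePhotoLibraryAccess() async -> Bool {
    switch PHPhotoLibrary.authorizationStatus(for: .readWrite) {
    case .authorized, .limited:
        return true
    case .notDetermined:
        // Presents the system prompt backed by NSPhotoLibraryUsageDescription
        let granted = await PHPhotoLibrary.requestAuthorization(for: .readWrite)
        return granted == .authorized || granted == .limited
    default:
        return false  // .denied / .restricted — direct the user to Settings
    }
}
```

With .limited access the fetch still works for the photos the user selected, which is exactly what a picker-driven flow needs.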

5. Implement filters and adjustments (core feature)

Build the CoreImage filter chain as a pure function: input a UIImage, output a UIImage. Chaining is essentially free — CoreImage compiles the entire graph into a single GPU pass and only executes it when createCGImage is called. Offer a set of preset filter names (Chrome, Fade, Instant, Noir, Transfer) and expose them in a horizontal strip below the canvas.

// Filters/CIFilterPipeline.swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

enum FilterPipeline {
    static let presets: [(name: String, ciName: String)] = [
        ("None",     "none"),
        ("Chrome",   "CIPhotoEffectChrome"),
        ("Fade",     "CIPhotoEffectFade"),
        ("Instant",  "CIPhotoEffectInstant"),
        ("Noir",     "CIPhotoEffectNoir"),
        ("Transfer", "CIPhotoEffectTransfer"),
    ]

    static func apply(
        to source: UIImage,
        presetFilterName: String,
        brightness: Double,
        contrast: Double,
        saturation: Double,
        sharpness: Double,
        in context: CIContext
    ) -> UIImage? {
        guard var ciImage = CIImage(image: source) else { return nil }

        // Stage 1 — Preset artistic filter
        if presetFilterName != "none",
           let preset = CIFilter(name: presetFilterName) {
            preset.setValue(ciImage, forKey: kCIInputImageKey)
            if let out = preset.outputImage { ciImage = out }
        }

        // Stage 2 — Brightness / contrast / saturation
        let colorControls = CIFilter.colorControls()
        colorControls.inputImage  = ciImage
        colorControls.brightness  = Float(brightness)
        colorControls.contrast    = Float(contrast)
        colorControls.saturation  = Float(saturation)
        guard let colorOut = colorControls.outputImage else { return nil }
        ciImage = colorOut

        // Stage 3 — Sharpness (skip when 0 to avoid no-op GPU work)
        if sharpness > 0 {
            let sharpen = CIFilter.sharpenLuminance()
            sharpen.inputImage = ciImage
            sharpen.sharpness  = Float(sharpness)
            if let sharpOut = sharpen.outputImage { ciImage = sharpOut }
        }

        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        return UIImage(
            cgImage: cgImage,
            scale: source.scale,
            orientation: source.imageOrientation
        )
    }
}

// Views/FilterStripView.swift
import SwiftUI

struct FilterStripView: View {
    @Binding var selectedCIName: String
    let onSelect: () -> Void

    var body: some View {
        ScrollView(.horizontal, showsIndicators: false) {
            HStack(spacing: 10) {
                ForEach(FilterPipeline.presets, id: \.ciName) { preset in
                    Button(preset.name) {
                        selectedCIName = preset.ciName
                        onSelect()
                    }
                    .buttonStyle(.bordered)
                    .tint(selectedCIName == preset.ciName ? .indigo : .secondary)
                    .controlSize(.small)
                }
            }
            .padding(.horizontal)
        }
    }
}

6. Export the edited photo and build the editor view

Wire the view model, filter strip, and adjustment sliders together in EditorView. Save with PHPhotoLibrary.shared().performChanges — always behind an explicit user-tapped Save button. Apple's review team has flagged apps that write to the photo library silently without a clear user action.

// Views/EditorView.swift
import SwiftUI
import Photos

struct EditorView: View {
    @Bindable var session: EditSession
    @State private var vm = EditorViewModel()
    @State private var showSavedAlert = false
    @State private var saveError: String?

    var body: some View {
        VStack(spacing: 0) {
            // Canvas
            ZStack {
                Color.black.ignoresSafeArea()
                if let img = vm.renderedImage {
                    Image(uiImage: img)
                        .resizable()
                        .scaledToFit()
                } else {
                    ProgressView().tint(.white)
                }
                if vm.isRendering {
                    ProgressView().tint(.white)
                }
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity)

            // Filter strip
            FilterStripView(selectedCIName: $session.filterName) {
                Task { await vm.renderPreview(session: session) }
            }
            .padding(.vertical, 8)
            .background(.ultraThinMaterial)

            // Adjustment sliders
            AdjustmentPanel(session: session) {
                Task { await vm.renderPreview(session: session) }
            }
        }
        .navigationTitle("Edit")
        .navigationBarTitleDisplayMode(.inline)
        .toolbarBackground(.black, for: .navigationBar)
        .toolbarColorScheme(.dark, for: .navigationBar)
        .toolbar {
            ToolbarItem(placement: .primaryAction) {
                Button("Save") { Task { await savePhoto() } }
                    .disabled(vm.renderedImage == nil || vm.isRendering)
            }
        }
        .task {
            await vm.loadAsset(localIdentifier: session.assetLocalIdentifier)
            await vm.renderPreview(session: session)
        }
        .alert("Saved to Photos!", isPresented: $showSavedAlert) {
            Button("OK", role: .cancel) {}
        }
        .alert(
            "Save failed",
            isPresented: Binding(get: { saveError != nil }, set: { if !$0 { saveError = nil } })
        ) {
            Button("OK", role: .cancel) {}
        } message: {
            Text(saveError ?? "")
        }
    }

    private func savePhoto() async {
        guard let image = vm.renderedImage else { return }
        do {
            try await PHPhotoLibrary.shared().performChanges {
                PHAssetChangeRequest.creationRequestForAsset(from: image)
            }
            showSavedAlert = true
        } catch {
            saveError = error.localizedDescription
        }
    }
}

// Views/AdjustmentPanel.swift
import SwiftUI

struct AdjustmentPanel: View {
    @Bindable var session: EditSession
    let onChange: () -> Void

    var body: some View {
        VStack(spacing: 10) {
            AdjustmentSlider(label: "Brightness", value: $session.brightness,
                             range: -1...1,    onChange: onChange)
            AdjustmentSlider(label: "Contrast",   value: $session.contrast,
                             range: 0.25...4,  onChange: onChange)
            AdjustmentSlider(label: "Saturation", value: $session.saturation,
                             range: 0...2,     onChange: onChange)
            AdjustmentSlider(label: "Sharpness",  value: $session.sharpness,
                             range: 0...2,     onChange: onChange)
        }
        .padding()
        .background(.regularMaterial)
    }
}

struct AdjustmentSlider: View {
    let label: String
    @Binding var value: Double
    let range: ClosedRange<Double>
    let onChange: () -> Void

    var body: some View {
        HStack {
            Text(label)
                .font(.caption)
                .foregroundStyle(.secondary)
                .frame(width: 82, alignment: .leading)
            Slider(value: $value, in: range) { editing in
                if !editing { onChange() }
            }
            Text(String(format: "%.2f", value))
                .font(.caption2.monospacedDigit())
                .foregroundStyle(.tertiary)
                .frame(width: 40, alignment: .trailing)
        }
    }
}

#Preview {
    NavigationStack {
        EditorView(session: EditSession(assetLocalIdentifier: "preview-id"))
    }
}

7. Add a Privacy Manifest

Since spring 2024, App Store Connect enforces privacy manifests: apps must declare any "required reason" API usage and any data collection in a PrivacyInfo.xcprivacy file. In Xcode, go to File → New → File and choose the App Privacy template, then add it to your app target's membership. Photo library access itself is covered by the two Info.plist usage strings and the privacy nutrition labels you complete at submission — it is not one of the required-reason API categories — so for this dependency-free app the manifest simply declares no tracking, no data collection, and no required-reason API use.

<!-- PrivacyInfo.xcprivacy — add this file to the app target, not an extension -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>

  <!-- This app uses none of the "required reason" APIs
       (file timestamps, system boot time, disk space,
        active keyboard list, UserDefaults) -->
  <key>NSPrivacyAccessedAPITypes</key>
  <array/>

  <!-- No data is collected or transmitted externally -->
  <key>NSPrivacyCollectedDataTypes</key>
  <array/>

  <key>NSPrivacyTracking</key>
  <false/>

</dict>
</plist>

Common pitfalls

Missing either NSPhotoLibrary usage key causes a silent authorization failure at runtime and a rejection at upload. A PhotosPicker created without photoLibrary: .shared() always returns a nil itemIdentifier, so sessions silently fail to map back to a PHAsset. Allocating a new CIContext per render spins up a new GPU command queue each time — allocate once and reuse. And when PHImageManager delivers both a degraded thumbnail and the final image, always check PHImageResultIsDegradedKey before resuming a continuation; resuming a checked continuation twice traps at runtime.

Adding monetization: One-time purchase

Implement a StoreKit 2 non-consumable product to unlock premium filter presets or the full adjustment panel. Define the product in App Store Connect under your app's In-App Purchases tab (type: Non-Consumable), then use Product.products(for:) to fetch it at launch and product.purchase() when the user taps an upgrade button. On app start, iterate Transaction.currentEntitlements — an async sequence — to restore entitlement without requiring the user to tap "Restore Purchases" manually. Store the result in a lightweight @Observable store manager and read it wherever you need to gate UI. Because there is no subscription renewal to handle, you only need the .success, .userCancelled, and .pending cases of Product.PurchaseResult plus a verification check on the transaction, which keeps the StoreKit surface area small and makes App Store review straightforward.
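A sketch of that store manager. The product ID "com.example.photoeditor.pro" is a placeholder — substitute the identifier you created in App Store Connect:

```swift
import StoreKit
import Observation

@MainActor
@Observable
final class StoreManager {
    private(set) var isProUnlocked = false
    private(set) var proProduct: Product?

    private let productID = "com.example.photoeditor.pro"  // placeholder

    func loadAndRestore() async {
        // Fetch the non-consumable defined in App Store Connect
        let products = (try? await Product.products(for: [productID])) ?? []
        proProduct = products.first

        // currentEntitlements restores past purchases with no user action
        for await entitlement in Transaction.currentEntitlements {
            if case .verified(let transaction) = entitlement,
               transaction.productID == productID {
                isProUnlocked = true
            }
        }
    }

    func purchase() async {
        guard let product = proProduct,
              let result = try? await product.purchase() else { return }
        switch result {
        case .success(let verification):
            if case .verified(let transaction) = verification {
                isProUnlocked = true
                await transaction.finish()  // always finish delivered transactions
            }
        case .userCancelled, .pending:
            break  // leave locked; .pending resolves via Transaction.updates
        @unknown default:
            break
        }
    }
}
```

Gate UI by reading isProUnlocked — for example, dim the premium filter chips in FilterStripView when it's false.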

Shipping this faster with Soarias

Soarias automates the most time-consuming non-coding tasks in this guide. It scaffolds the Xcode project with the correct entitlements, both NSPhotoLibrary usage description keys pre-filled, and the PrivacyInfo.xcprivacy file already embedded in the target. It wires up fastlane with deliver and snapshot configured for your bundle ID, generates App Store screenshots at every required resolution (6.9" and 6.1" are mandatory in 2026), and uploads all metadata — app description, keywords, age rating, support URL, and review notes — directly to App Store Connect without you opening a browser.

For an intermediate project like this photo editor, setup tasks typically consume 3–5 hours: project configuration, Privacy Manifest research, fastlane lane authoring, and ASC metadata entry. Soarias compresses that to under 30 minutes on first run, and subsequent re-submissions take a few minutes from your terminal. That leaves the rest of the week's budget for the work that actually differentiates your app — filter quality, UI transitions, and the preset library that keeps users coming back.


FAQ

Does this work on iOS 16?

The guide uses @Observable (iOS 17+) and PhotosPickerItem.itemIdentifier (iOS 17+). Backporting to iOS 16 requires replacing @Observable with ObservableObject and @Published, and adopting a different technique to map a PhotosPickerItem to a PHAsset — doable but adds meaningful boilerplate. With iOS 17+ adoption above 90% as of 2026, targeting iOS 17 minimum is the practical choice for new submissions.
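For comparison, a minimal sketch of the iOS 16 shape of the view model — the type name is hypothetical, and the property list mirrors EditorViewModel above:

```swift
import SwiftUI
import Combine

// iOS 16 fallback: ObservableObject + @Published instead of @Observable.
// Views hold it with @StateObject / @ObservedObject rather than @State.
@MainActor
final class LegacyEditorViewModel: ObservableObject {
    @Published var sourceImage: UIImage?
    @Published var renderedImage: UIImage?
    @Published var isRendering = false
}

// In the view:
// @StateObject private var vm = LegacyEditorViewModel()
```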

Do I need a paid Apple Developer account to test?

You can sideload the app on a personal device with a free Apple ID, but you cannot distribute via TestFlight, submit to the App Store, or test real StoreKit transactions without the $99/year Apple Developer Program. Use a Products.storekit configuration file (File → New → File → StoreKit Configuration) for local in-app purchase testing in the Simulator — no live ASC product needed during development.

How do I add this to the App Store?

Archive the app in Xcode (Product → Archive), then open the Organizer and click Distribute App → App Store Connect. After upload, complete the metadata in App Store Connect: screenshots at 6.9" and 6.1" (mandatory since 2026), privacy nutrition labels, age rating questionnaire, and a support URL. First submissions typically take 24–48 hours for review; updates to existing apps are often faster.

Can I apply multiple preset filters in sequence without a performance penalty?

Yes — this is one of CoreImage's key design strengths. Chaining additional CIFilter objects before calling createCGImage adds negligible cost because CoreImage defers all GPU work until that final call, then compiles the entire filter graph into a single optimized GPU pass. You pay for exactly one createCGImage call regardless of how many filters are in the chain, so only call it more than once if you genuinely need intermediate rasters.
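A minimal standalone sketch of that deferral, with no UI involved — the solid-color source is just a stand-in for a real photo:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

let context = CIContext()

// Any CIImage works as a source; a generated solid color avoids needing a file
var image = CIImage(color: .gray)
    .cropped(to: CGRect(x: 0, y: 0, width: 64, height: 64))

// Chain two presets — nothing executes yet, only the graph is built
for name in ["CIPhotoEffectNoir", "CIPhotoEffectFade"] {
    if let filter = CIFilter(name: name) {
        filter.setValue(image, forKey: kCIInputImageKey)
        image = filter.outputImage ?? image
    }
}

// Add a third stage; still no GPU work has happened
let sharpen = CIFilter.sharpenLuminance()
sharpen.inputImage = image
sharpen.sharpness = 0.5
image = sharpen.outputImage ?? image

// The single GPU pass happens here, at rasterization
let rendered = context.createCGImage(image, from: image.extent)
```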

Last reviewed: 2026-05-12 by the Soarias team.