
How to Build a Video Editor App in SwiftUI

A video editor app lets users import clips from their photo library, trim and reorder footage on a timeline, apply effects, and export a finished video — all on-device without sending footage to a server. This guide is for iOS developers comfortable with Swift who want to ship a serious, professional-grade video editing tool to the App Store.

iOS 17+ · Xcode 16+ · SwiftUI · Complexity: Advanced · Estimated time: 2–4 weeks · Updated: May 12, 2026


Architecture overview

The app follows a layered architecture: SwiftData stores project metadata and clip descriptors (URLs, trim ranges, applied effect identifiers) while the actual video bytes live on disk in the app's Documents directory. A CompositionEngine actor owns all AVFoundation work — building AVMutableComposition objects, evaluating AVVideoComposition for effects, and running exports — keeping the main thread free. SwiftUI views observe a lightweight @Observable ProjectStore that mediates between SwiftData and the engine; the timeline scrubber talks directly to an AVPlayer instance held in the store. PhotosUI's PhotosPicker handles import and writes copies to Documents so the app owns the files independently of the photo library.

VideoEditorApp/
├── App/
│   ├── VideoEditorApp.swift          # @main, .modelContainer
│   └── AppContainer.swift            # DI: CompositionEngine, ProjectStore
├── Models/
│   ├── VideoProject.swift            # @Model: title, clips, createdAt
│   └── VideoClip.swift               # @Model: url, trimStart, trimEnd, effectID
├── Engine/
│   ├── CompositionEngine.swift       # actor: build, preview, export
│   ├── EffectProcessor.swift         # CIFilter pipeline per clip
│   └── ExportSession.swift           # wraps AVAssetExportSession
├── Store/
│   └── ProjectStore.swift            # @Observable: player, currentProject
├── Views/
│   ├── ProjectListView.swift         # CRUD list of projects
│   ├── EditorView.swift              # root editor: preview + timeline
│   ├── TimelineView.swift            # horizontal scroll, clips, playhead
│   ├── ClipThumbnailView.swift       # AVAssetImageGenerator frames
│   ├── EffectPickerView.swift        # grid of CIFilter presets
│   └── ExportProgressView.swift      # progress sheet
├── Utilities/
│   └── PHPickerBridge.swift          # UIViewControllerRepresentable
└── PrivacyInfo.xcprivacy
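The one file in this tree the guide never shows in full is ProjectStore. A minimal sketch of what the architecture described above implies — the names and shape here are assumptions for illustration, not the original implementation:

```swift
// Store/ProjectStore.swift — hypothetical sketch, not the original source
import AVFoundation
import Observation

@Observable
final class ProjectStore {
    var currentProject: VideoProject?
    let player = AVPlayer()          // the timeline scrubber talks to this directly

    private let engine: CompositionEngine

    init(engine: CompositionEngine) {
        self.engine = engine
    }

    // Rebuild the preview whenever clips are added, trimmed, or reordered
    func refreshPreview() async {
        guard let project = currentProject,
              let composition = try? await engine.buildComposition(from: project.clips)
        else { return }
        player.replaceCurrentItem(with: AVPlayerItem(asset: composition))
    }
}
```

Views observe this store directly; because `@Observable` tracks property access, only views that read `currentProject` re-render when it changes.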

Step-by-step

1. Project setup and entitlements

Create a new Xcode project (SwiftUI, SwiftData storage), then add the required Info.plist keys and enable the Background Modes capability with "Audio, AirPlay, and Picture in Picture" — this lets AVPlayer keep playing when the user backgrounds the app during export. Exported files stay inside the sandbox: write them to the app's Documents or temporary directory, then offer to save them to the photo library. If you also want users to browse exports in the Files app, set UIFileSharingEnabled and LSSupportsOpeningDocumentsInPlace in Info.plist.

// Info.plist keys to add (Xcode Target → Info tab)
// NSPhotoLibraryUsageDescription
//   "We need access to your photo library to import video clips."
// NSPhotoLibraryAddUsageDescription
//   "We save your exported videos back to your photo library."

// VideoEditorApp.swift
import SwiftUI
import SwiftData

@main
struct VideoEditorApp: App {
    var body: some Scene {
        WindowGroup {
            ProjectListView()
        }
        .modelContainer(for: [VideoProject.self, VideoClip.self])
    }
}

2. Data model with SwiftData

Define two @Model classes. VideoProject is the top-level container; VideoClip stores bookmark data for its file URL (so access survives app restarts and container-path changes across app updates) plus the trim range encoded as two Doubles representing CMTime seconds.

// Models/VideoProject.swift
import SwiftData
import Foundation

@Model
final class VideoProject {
    var title: String
    var createdAt: Date
    @Relationship(deleteRule: .cascade) var clips: [VideoClip]

    init(title: String) {
        self.title = title
        self.createdAt = .now
        self.clips = []
    }
}

// Models/VideoClip.swift
import SwiftData
import Foundation

@Model
final class VideoClip {
    var bookmarkData: Data        // bookmark for the clip's file URL
    var trimStartSeconds: Double  // CMTime.seconds
    var trimEndSeconds: Double
    var effectIdentifier: String? // e.g. "CIColorMonochrome"
    var orderIndex: Int

    init(bookmarkData: Data, duration: Double, orderIndex: Int) {
        self.bookmarkData = bookmarkData
        self.trimStartSeconds = 0
        self.trimEndSeconds = duration
        self.effectIdentifier = nil
        self.orderIndex = orderIndex
    }

    var resolvedURL: URL? {
        var isStale = false
        return try? URL(
            resolvingBookmarkData: bookmarkData,
            options: .withoutUI,
            relativeTo: nil,
            bookmarkDataIsStale: &isStale
        )
    }
}

3. Photo library picker with PhotosUI

Wrap PHPickerViewController in a UIViewControllerRepresentable so SwiftUI can present it. On selection, copy the video file into the app's Documents directory and store a bookmark so the reference survives restarts and app updates. Because the copy lives inside your own sandbox, a standard (non-security-scoped) bookmark is sufficient — security-scoped bookmarks only matter for files outside your container.

// Utilities/PHPickerBridge.swift
import PhotosUI
import SwiftUI
import UniformTypeIdentifiers

struct PHPickerBridge: UIViewControllerRepresentable {
    var onPick: ([URL]) -> Void

    func makeCoordinator() -> Coordinator { Coordinator(onPick: onPick) }

    func makeUIViewController(context: Context) -> PHPickerViewController {
        var config = PHPickerConfiguration(photoLibrary: .shared())
        config.filter = .videos
        config.selectionLimit = 10
        config.preferredAssetRepresentationMode = .current
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) {}

    final class Coordinator: NSObject, PHPickerViewControllerDelegate {
        let onPick: ([URL]) -> Void
        init(onPick: @escaping ([URL]) -> Void) { self.onPick = onPick }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            picker.dismiss(animated: true)
            guard !results.isEmpty else { return }
            let group = DispatchGroup()
            let lock = NSLock()   // loadFileRepresentation callbacks arrive on arbitrary queues
            var urls: [URL] = []
            let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]

            for result in results {
                guard result.itemProvider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { continue }
                group.enter()
                result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
                    defer { group.leave() }
                    guard let url else { return }
                    // The provided URL is only valid inside this closure — copy before returning
                    let dest = docs.appendingPathComponent(UUID().uuidString + ".mov")
                    try? FileManager.default.copyItem(at: url, to: dest)
                    lock.lock()
                    urls.append(dest)
                    lock.unlock()
                }
            }
            group.notify(queue: .main) { self.onPick(urls) }
        }
    }
}

// Usage in EditorView
// .sheet(isPresented: $showPicker) {
//     PHPickerBridge { urls in Task { await store.addClips(from: urls) } }
// }

4. Timeline UI

Build a horizontal ScrollView with thumbnail strips for each clip, a draggable playhead, and trim handles at each clip's leading and trailing edges. Attach gestures with simultaneousGesture(_:) where the scrubber and the scroll view must coexist without stealing priority from each other, and read drag locations in a named coordinate space so x-positions map to timeline seconds no matter which subview the gesture starts in.

// Views/TimelineView.swift
import SwiftUI
import AVFoundation

struct TimelineView: View {
    @Binding var clips: [VideoClip]
    @Binding var playheadSeconds: Double
    let pixelsPerSecond: CGFloat = 60

    var body: some View {
        ScrollView(.horizontal, showsIndicators: false) {
            ZStack(alignment: .leading) {
                // Clip strips
                HStack(spacing: 2) {
                    ForEach(clips.sorted(by: { $0.orderIndex < $1.orderIndex })) { clip in
                        ClipThumbnailView(clip: clip, pixelsPerSecond: pixelsPerSecond)
                            .frame(
                                width: CGFloat(clip.trimEndSeconds - clip.trimStartSeconds) * pixelsPerSecond,
                                height: 64
                            )
                            .clipShape(RoundedRectangle(cornerRadius: 8))
                            .overlay(TrimHandles(clip: clip, pixelsPerSecond: pixelsPerSecond))
                    }
                }

                // Playhead
                Rectangle()
                    .fill(Color.white)
                    .frame(width: 2, height: 80)
                    .offset(x: CGFloat(playheadSeconds) * pixelsPerSecond - 1)
                    .gesture(
                        // Read locations in the ZStack's named space, not the
                        // playhead's own local space (which moves with the offset)
                        DragGesture(coordinateSpace: .named("timeline"))
                            .onChanged { value in
                                let candidate = Double(value.location.x / pixelsPerSecond)
                                playheadSeconds = max(0, candidate)
                            }
                    )
            }
            .coordinateSpace(name: "timeline")
            .padding(.horizontal, 16)
        }
        .frame(height: 96)
        .background(Color(white: 0.1))
    }
}

struct TrimHandles: View {
    let clip: VideoClip
    let pixelsPerSecond: CGFloat

    var body: some View {
        GeometryReader { _ in
            // Leading trim handle
            RoundedRectangle(cornerRadius: 3)
                .fill(Color.yellow)
                .frame(width: 6)
                .frame(maxHeight: .infinity)
                .frame(maxWidth: .infinity, alignment: .leading)
            // Trailing trim handle
            RoundedRectangle(cornerRadius: 3)
                .fill(Color.yellow)
                .frame(width: 6)
                .frame(maxHeight: .infinity)
                .frame(maxWidth: .infinity, alignment: .trailing)
        }
    }
}

#Preview {
    TimelineView(clips: .constant([]), playheadSeconds: .constant(0))
}
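The trim handles above are purely visual; wiring them up needs clamping logic so a drag can never invert a clip or shrink it below a minimum length. A small self-contained helper — TrimRange and its 0.5 s minimum are illustrative assumptions, not part of the original source:

```swift
// Hypothetical trim-clamping helper for the handles' DragGesture callbacks
struct TrimRange {
    var start: Double
    var end: Double
    let minLength = 0.5   // assumed minimum clip length, in seconds

    // Clamp a dragged leading handle: never below 0, never within minLength of end
    mutating func dragStart(to seconds: Double) {
        start = min(max(0, seconds), end - minLength)
    }

    // Clamp a dragged trailing handle: never past the asset, never within minLength of start
    mutating func dragEnd(to seconds: Double, assetDuration: Double) {
        end = max(min(assetDuration, seconds), start + minLength)
    }
}
```

In EditorView you would call dragStart/dragEnd from the handles' gestures and write the results back to the VideoClip's trimStartSeconds/trimEndSeconds.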

5. Clip trimming with AVFoundation

The CompositionEngine actor assembles an AVMutableComposition from all clips in order, inserting for each clip the CMTimeRange that corresponds to its trim values. This composition drives both the live preview through AVPlayer and the final export.

// Engine/CompositionEngine.swift
import AVFoundation
import Foundation

actor CompositionEngine {

    // Build a mutable composition from the current clip list.
    func buildComposition(from clips: [VideoClip]) async throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        guard
            let videoTrack = composition.addMutableTrack(
                withMediaType: .video,
                preferredTrackID: kCMPersistentTrackID_Invalid),
            let audioTrack = composition.addMutableTrack(
                withMediaType: .audio,
                preferredTrackID: kCMPersistentTrackID_Invalid)
        else {
            throw CompositionError.trackCreationFailed
        }

        var insertCursor = CMTime.zero
        let sorted = clips.sorted { $0.orderIndex < $1.orderIndex }

        for clip in sorted {
            guard let url = clip.resolvedURL else { continue }
            let asset = AVURLAsset(url: url)

            // Load tracks asynchronously (modern replacement for the deprecated
            // synchronous track accessors)
            let assetVideoTracks = try await asset.loadTracks(withMediaType: .video)
            let assetAudioTracks = try await asset.loadTracks(withMediaType: .audio)
            guard let srcVideo = assetVideoTracks.first else { continue }

            let trimStart = CMTime(seconds: clip.trimStartSeconds, preferredTimescale: 600)
            let trimEnd   = CMTime(seconds: clip.trimEndSeconds,   preferredTimescale: 600)
            let trimRange = CMTimeRange(start: trimStart, end: trimEnd)

            try videoTrack.insertTimeRange(trimRange, of: srcVideo, at: insertCursor)
            if let srcAudio = assetAudioTracks.first {
                try audioTrack.insertTimeRange(trimRange, of: srcAudio, at: insertCursor)
            }

            let clipDuration = CMTime(
                seconds: clip.trimEndSeconds - clip.trimStartSeconds,
                preferredTimescale: 600
            )
            insertCursor = CMTimeAdd(insertCursor, clipDuration)
        }

        return composition
    }

    enum CompositionError: Error {
        case trackCreationFailed
    }
}
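The preferredTimescale of 600 used above is the conventional choice because it divides evenly by the common frame rates, so frame boundaries land on exact ticks. A quick pure-Swift check of that arithmetic:

```swift
// CMTime models time as value / timescale; 600 divides evenly by common rates
let timescale = 600.0

for fps in [24.0, 25.0, 30.0, 60.0] {
    let ticksPerFrame = timescale / fps
    assert(ticksPerFrame == ticksPerFrame.rounded(), "\(fps) fps would not be exact")
}

// 2.5 s of trimmed content round-trips exactly through the rational form
let ticks = (2.5 * timescale).rounded()   // 1500 ticks
let seconds = ticks / timescale           // back to exactly 2.5
```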

6. Effects pipeline with AVVideoComposition

Apply per-clip CIFilter effects by implementing a custom AVVideoCompositing class. The compositor receives an AVAsynchronousVideoCompositionRequest for every frame, looks up which clip segment is active, applies the matching CIFilter chain, and calls finish(with:). Register it on an AVMutableVideoComposition that you attach to your player item and export session. Note that AVFoundation instantiates the compositor class itself, so in a production app you would carry per-clip effect data on custom video composition instructions (read via request.videoCompositionInstruction) rather than mutating a property as the simplified example below does.

// Engine/EffectProcessor.swift
import AVFoundation
import CoreImage

// Map of clip time ranges → CIFilter identifier
struct ClipEffect {
    let timeRange: CMTimeRange
    let filterName: String?
}

final class VideoEffectCompositor: NSObject, AVVideoCompositing {
    var sourcePixelBufferAttributes: [String: Any]? = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    var requiredPixelBufferAttributesForRenderContext: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]

    private let ciContext = CIContext()
    var clipEffects: [ClipEffect] = []

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard
            let sourceBuffer = request.sourceFrame(byTrackID: request.sourceTrackIDs[0].int32Value),
            let outputBuffer = request.renderContext.newPixelBuffer()
        else {
            request.finish(with: CompositorError.missingBuffer)
            return
        }

        let time = request.compositionTime
        var image = CIImage(cvPixelBuffer: sourceBuffer)

        // Find the active clip effect for this frame's timestamp
        if let effect = clipEffects.first(where: { CMTimeRangeContainsTime($0.timeRange, time: time) }),
           let filterName = effect.filterName,
           let filter = CIFilter(name: filterName) {
            filter.setValue(image, forKey: kCIInputImageKey)
            if let output = filter.outputImage {
                image = output
            }
        }

        ciContext.render(image, to: outputBuffer)
        request.finish(withComposedVideoFrame: outputBuffer)
    }

    enum CompositorError: Error { case missingBuffer }
}

// Attach to an AVPlayerItem in ProjectStore:
// let videoComp = try await AVMutableVideoComposition.videoComposition(withPropertiesOf: composition)
// videoComp.customVideoCompositorClass = VideoEffectCompositor.self
// playerItem = AVPlayerItem(asset: composition)
// playerItem.videoComposition = videoComp
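The compositor needs clipEffects expressed in composition time, which means converting each clip's trim length into cumulative ranges. Sketched with plain Doubles for clarity — the real code would wrap these in CMTimeRange, and ClipInfo is an illustrative stand-in for VideoClip:

```swift
// Illustrative sketch: building composition-time effect ranges from clips
struct ClipInfo {
    let trimStart: Double
    let trimEnd: Double
    let filterName: String?
}

func effectRanges(for clips: [ClipInfo]) -> [(start: Double, end: Double, filterName: String?)] {
    var cursor = 0.0
    var ranges: [(start: Double, end: Double, filterName: String?)] = []
    for clip in clips {
        // Each clip occupies [cursor, cursor + trimmed length) in composition
        // time, mirroring insertCursor in buildComposition(from:)
        let length = clip.trimEnd - clip.trimStart
        ranges.append((start: cursor, end: cursor + length, filterName: clip.filterName))
        cursor += length
    }
    return ranges
}
```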

7. Export with AVAssetExportSession

Wrap AVAssetExportSession in an AsyncStream so you can stream progress updates back to the UI. After export completes, save the output file to the photo library using PHPhotoLibrary.shared().performChanges.

// Engine/ExportSession.swift
import AVFoundation
import Photos

actor ExportSession {

    func export(
        composition: AVMutableComposition,
        videoComposition: AVVideoComposition,
        outputURL: URL
    ) -> AsyncStream<Double> {
        AsyncStream { continuation in
            guard let session = AVAssetExportSession(
                asset: composition,
                presetName: AVAssetExportPresetHighestQuality
            ) else {
                continuation.finish()
                return
            }

            session.videoComposition = videoComposition
            session.outputURL = outputURL
            session.outputFileType = .mp4
            session.shouldOptimizeForNetworkUse = true

            // DispatchSourceTimer needs no run loop — Timer.scheduledTimer would
            // never fire here, since this closure runs off the main thread
            let timer = DispatchSource.makeTimerSource(queue: .main)
            timer.schedule(deadline: .now(), repeating: 0.1)
            timer.setEventHandler { continuation.yield(Double(session.progress)) }
            timer.resume()

            session.exportAsynchronously {
                timer.cancel()
                if session.status == .completed {
                    continuation.yield(1.0)
                    PHPhotoLibrary.shared().performChanges {
                        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL)
                    } completionHandler: { _, _ in }
                }
                continuation.finish()
            }
        }
    }
}

// In EditorView, observe progress:
// for await progress in await exportSession.export(...) {
//     await MainActor.run { exportProgress = progress }
// }
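The consumption pattern in the comment above can be exercised without AVFoundation at all — AsyncStream is plain Swift concurrency. A runnable stand-in (fakeExportProgress is a made-up stub, not part of the app):

```swift
// Stand-in for the export progress stream: same AsyncStream shape, fake source
func fakeExportProgress() -> AsyncStream<Double> {
    AsyncStream { continuation in
        for step in 1...4 {
            continuation.yield(Double(step) / 4.0)
        }
        continuation.finish()
    }
}

// Mirrors the for-await loop EditorView would run against ExportSession
func collectProgress() async -> [Double] {
    var seen: [Double] = []
    for await progress in fakeExportProgress() {
        seen.append(progress)   // in the app: hop to MainActor and update exportProgress
    }
    return seen
}
```

Because the stream finishes when the closure calls continuation.finish(), the for-await loop ends cleanly and the UI can dismiss the progress sheet.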

8. Testing on device

Write unit tests that inject known assets (short bundled test clips) into CompositionEngine and assert the resulting composition duration matches the expected trimmed lengths. For the trim gesture, write a UI test that drags the trailing handle and verifies the clip's trimEndSeconds updates in the model. Always run these on a physical device — the Simulator's AVFoundation pipeline uses different rendering and codec paths than real hardware, so timing and frame behavior there is not representative.

// VideoEditorTests/CompositionEngineTests.swift
import Testing
import AVFoundation
@testable import VideoEditor

// Swift Testing suites are structs, so Bundle(for:) needs a class token
private final class BundleToken {}

@Suite("Composition Engine")
struct CompositionEngineTests {

    @Test("trimmed composition has correct duration")
    func trimmedDuration() async throws {
        // Bundle a 5-second test clip (Resources/test_clip_5s.mov)
        let url = Bundle(for: BundleToken.self)
            .url(forResource: "test_clip_5s", withExtension: "mov")!

        let bookmark = try url.bookmarkData()
        let clip = VideoClip(bookmarkData: bookmark, duration: 5.0, orderIndex: 0)
        clip.trimStartSeconds = 1.0
        clip.trimEndSeconds = 3.5 // 2.5 seconds of active content

        let engine = CompositionEngine()
        let composition = try await engine.buildComposition(from: [clip])

        let duration = try await composition.load(.duration)
        #expect(abs(duration.seconds - 2.5) < 0.05)
    }
}

9. Privacy Manifest (required for App Store)

Add a PrivacyInfo.xcprivacy file to your app target. Declare the required reason APIs you use — UserDefaults for settings, or FileManager modification dates for sorting — and any data you collect. Note that in Apple's definition "collected" means transmitted off-device, so a fully on-device editor may legitimately leave NSPrivacyCollectedDataTypes empty; the example below shows the fuller form for apps that do send photo or video data to their own services. Missing this file will cause App Store Connect to reject your binary at validation.

<!-- PrivacyInfo.xcprivacy -->
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>NSPrivacyTracking</key><false/>
  <key>NSPrivacyTrackingDomains</key><array/>
  <key>NSPrivacyCollectedDataTypes</key>
  <array>
    <dict>
      <key>NSPrivacyCollectedDataType</key>
      <string>NSPrivacyCollectedDataTypePhotosorVideos</string>
      <key>NSPrivacyCollectedDataTypeLinked</key><false/>
      <key>NSPrivacyCollectedDataTypeTracking</key><false/>
      <key>NSPrivacyCollectedDataTypePurposes</key>
      <array>
        <string>NSPrivacyCollectedDataTypePurposeAppFunctionality</string>
      </array>
    </dict>
  </array>
  <key>NSPrivacyAccessedAPITypes</key>
  <array>
    <dict>
      <key>NSPrivacyAccessedAPIType</key>
      <string>NSPrivacyAccessedAPICategoryFileTimestamp</string>
      <key>NSPrivacyAccessedAPITypeReasons</key>
      <array><string>C617.1</string></array>
    </dict>
    <dict>
      <key>NSPrivacyAccessedAPIType</key>
      <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
      <key>NSPrivacyAccessedAPITypeReasons</key>
      <array><string>CA92.1</string></array>
    </dict>
  </array>
</dict>
</plist>


Adding monetization: Subscription

Use StoreKit 2's Product.products(for:) API to fetch your subscription offerings — a monthly and annual tier works well for video apps. Gate premium effects, 4K export, and watermark removal behind a subscription check using Transaction.currentEntitlements, which automatically reflects family sharing, billing grace periods, and renewals. Present your paywall with SubscriptionStoreView (iOS 17+), which uses Apple's standard paywall layout and tends to reduce review friction. Always implement a Transaction.updates listener in your App struct to handle renewals and revocations in real time. For pricing, a $3.99/month or $24.99/year model is common in the video editor category; a seven-day free trial is a sensible default, since many users need a weekend project to properly evaluate an editing app.
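The entitlement check described above, sketched with StoreKit 2 — the product identifiers here are placeholders; substitute your own App Store Connect IDs:

```swift
// Sketch of a StoreKit 2 subscription check (product IDs are assumptions)
import StoreKit

enum Entitlements {
    static let subscriptionIDs: Set<String> = [
        "com.example.videoeditor.monthly",
        "com.example.videoeditor.annual"
    ]

    // True if any verified current entitlement matches a subscription product
    static func isSubscribed() async -> Bool {
        for await result in Transaction.currentEntitlements {
            if case .verified(let transaction) = result,
               subscriptionIDs.contains(transaction.productID) {
                return true
            }
        }
        return false
    }
}
```

Call this once at launch and again from your Transaction.updates listener, caching the result in ProjectStore so views can gate premium features synchronously.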

Shipping this faster with Soarias

Soarias automates the parts of an advanced project where most developers lose days to paperwork. It scaffolds the full file tree above — including the CompositionEngine actor stub, PhotosUI bridge, and SwiftData model — from a prompt describing your app. It generates and validates your PrivacyInfo.xcprivacy against the current App Store Connect required-reason API list so you won't get a binary rejection on upload. It configures fastlane match for certificate management and deliver for metadata, screenshots (via snapshot), and ASC submission in one command. StoreKit 2 subscription product identifiers are wired into the scaffolded paywall view automatically.

For an advanced app at this complexity level, the scaffolding, Privacy Manifest, and fastlane setup alone typically cost two to three days of focused work. With Soarias, those steps collapse to under an hour, letting you spend your 2–4 week budget on the hard creative problems: the trim gesture feel, effect quality, and export performance — the things that actually earn five-star reviews.


FAQ

Does this work on iOS 16?

The architecture targets iOS 17+ because @Observable and SubscriptionStoreView are iOS 17-only; the AVFoundation async load(_:) APIs used here only require iOS 15. If you need iOS 16 support, replace SubscriptionStoreView with a custom StoreKit 2 paywall and swap @Observable for a class conforming to ObservableObject with @Published properties. The AVFoundation code in this guide compiles back to iOS 16 without changes.

Do I need a paid Apple Developer account to test?

You need a free Apple ID to sideload onto your own device via Xcode. However, AVFoundation's video composition and export pipeline behaves differently in the Simulator, so you will need a physical device early. TestFlight and App Store submission require the $99/year Apple Developer Program membership. StoreKit 2 sandbox testing also requires a paid membership to create test accounts in App Store Connect.

How do I add this to the App Store?

Archive your app in Xcode (Product → Archive), validate the archive, then upload via the Xcode Organizer or the Transporter app (the old altool upload path has been retired). In App Store Connect, create a new app record, complete the required metadata (description, screenshots for all device sizes, privacy nutrition labels), attach your binary, and submit for review. Review times for video apps average two to four days. Ensure you have the photo library add permission string and your Privacy Manifest in place before submitting — these are two of the most common rejection reasons for media apps.

How do I keep the effects pipeline from dropping frames during live preview?

For live preview at 30fps, keep your CIFilter chain under three filters per clip and avoid CPU-based rendering paths — pass CIContextOption.useSoftwareRenderer: false when creating your CIContext to ensure Metal is used. For complex effects that can't hit 30fps in real time, show a "processing" indicator during scrubbing and render a proxy resolution (720p) for preview, saving full-resolution rendering for export only. Profile with Instruments' Metal System Trace template to find frame budget overruns before submission.
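The CIContext setup mentioned above, as a snippet — this mirrors Core Image's default GPU path; being explicit simply guards against an accidental software fallback:

```swift
import CoreImage

// Create once and reuse app-wide — CIContext construction is expensive.
// useSoftwareRenderer: false keeps rendering on the Metal/GPU path.
let previewContext = CIContext(options: [.useSoftwareRenderer: false])
```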

Last reviewed: 2026-05-12 by the Soarias team.
