How to Build a Photo Collage App in SwiftUI
A Photo Collage app lets users arrange multiple photos into a single composed image using preset layout templates — ideal for sharing memories on social media or printing. This guide is for iOS developers who want to ship a polished collage maker to the App Store using SwiftUI, PhotosUI, and SwiftData.
Prerequisites
- Mac with Xcode 16+ installed
- Apple Developer Program ($99/year) — required for TestFlight and App Store submission
- Basic Swift/SwiftUI knowledge — familiarity with `@State`, `@Binding`, and view composition
- A physical iPhone or iPad for testing photo library access (the Simulator has limited PhotosUI support)
- `NSPhotoLibraryUsageDescription` and `NSPhotoLibraryAddUsageDescription` entries in Info.plist — add these up front; App Store review rejects photo-library apps that ship without them
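In Info.plist source form, the two entries look like this (the description strings below are placeholders; write your own user-facing copy):

```xml
<key>NSPhotoLibraryUsageDescription</key>
<string>Select photos to place in your collages.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Save finished collages to your photo library.</string>
```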
Architecture overview
The app uses SwiftData as the persistence layer, storing `Collage` and `CollageItem` model objects that encode photo data, position, size, and rotation. The UI layer is split into a list view (browse saved collages), an editor view (a `ZStack` canvas where items are dragged and resized), and a layout picker sheet. PhotosUI handles photo selection; `ImageRenderer` converts the live SwiftUI canvas into a `UIImage` for export. There are no external network calls — everything runs fully on-device.
```
CollageApp/
├── CollageApp.swift              ← @main, .modelContainer
├── Models/
│   ├── Collage.swift             ← @Model (title, layout, date)
│   └── CollageItem.swift         ← @Model (imageData, x, y, w, h, rotation)
├── Views/
│   ├── CollageListView.swift     ← browse / create
│   ├── CollageEditorView.swift   ← main editor shell + toolbar
│   ├── LayoutPickerView.swift    ← template grid sheet
│   └── ExportButton.swift        ← ImageRenderer + ShareLink
├── Canvas/
│   ├── CollageCanvasView.swift   ← ZStack renderer
│   ├── CollageItemTile.swift     ← per-photo drag/rotate tile
│   └── LayoutTemplate.swift      ← frame math per template
├── Store/
│   └── StoreManager.swift        ← StoreKit 2 one-time purchase
└── PrivacyInfo.xcprivacy
```
Step-by-step
1. Project setup
Create a new Xcode project from the iOS App template, selecting SwiftUI for the interface and SwiftData for storage. Add `NSPhotoLibraryUsageDescription` and `NSPhotoLibraryAddUsageDescription` to your Info.plist immediately — `PhotosPicker` itself runs out of process, but any direct photo-library access (such as saving an exported collage) crashes without the matching usage string.
```swift
// CollageApp.swift
import SwiftUI
import SwiftData

@main
struct CollageApp: App {
    var body: some Scene {
        WindowGroup {
            CollageListView()
        }
        .modelContainer(for: [Collage.self, CollageItem.self])
    }
}
```
2. Data model with SwiftData
Two `@Model` classes capture everything needed to reconstruct a collage across app launches. Image data is stored as raw `Data` on each `CollageItem`; for large libraries you may want to store file URLs instead, but inline `Data` keeps the implementation simple and reliable at moderate scale.
```swift
// Models/Collage.swift
import SwiftData
import Foundation

@Model
final class Collage {
    var id: UUID
    var title: String
    var createdAt: Date
    var layoutTemplate: String // e.g. "grid2x2"
    @Relationship(deleteRule: .cascade) var items: [CollageItem]

    init(title: String = "New Collage", layoutTemplate: String = "grid2x2") {
        self.id = UUID()
        self.title = title
        self.createdAt = Date()
        self.layoutTemplate = layoutTemplate
        self.items = []
    }
}
```

```swift
// Models/CollageItem.swift
import SwiftData
import Foundation

@Model
final class CollageItem {
    var id: UUID
    var imageData: Data?
    var xPosition: Double
    var yPosition: Double
    var width: Double
    var height: Double
    var rotation: Double // degrees
    var zIndex: Int

    init(frame: CGRect) {
        self.id = UUID()
        self.xPosition = frame.origin.x
        self.yPosition = frame.origin.y
        self.width = frame.width
        self.height = frame.height
        self.rotation = 0
        self.zIndex = 0
    }
}
```
3. Photo picker integration
Use `PhotosPicker` from PhotosUI with `maxSelectionCount` set to match the chosen template's slot count. Load each `PhotosPickerItem` as `Data` via the Transferable API — this is the modern, permission-respecting path that supersedes the older `UIImagePickerController` flow.
```swift
// Views/CollageEditorView.swift (photo picker section)
import SwiftUI
import PhotosUI

struct PhotoPickerButton: View {
    let maxCount: Int
    let onImagesLoaded: ([UIImage]) -> Void
    @State private var selectedItems: [PhotosPickerItem] = []

    var body: some View {
        PhotosPicker(
            selection: $selectedItems,
            maxSelectionCount: maxCount,
            matching: .images
        ) {
            Label("Add Photos", systemImage: "photo.badge.plus")
                .font(.subheadline.weight(.medium))
        }
        .onChange(of: selectedItems) { _, newItems in
            Task {
                var images: [UIImage] = []
                for item in newItems {
                    if let data = try? await item.loadTransferable(type: Data.self),
                       let image = UIImage(data: data) {
                        images.append(image)
                    }
                }
                onImagesLoaded(images)
                selectedItems = [] // reset so picker re-opens cleanly
            }
        }
    }
}

#Preview {
    PhotoPickerButton(maxCount: 4) { _ in }
}
```
4. Canvas layout engine
The canvas is a fixed-size `ZStack` that renders each `CollageItem` as a draggable, rotatable photo tile. Using `.position()` instead of `.offset()` keeps the coordinate math straightforward — items store their centre point in absolute canvas coordinates, not as deltas.
```swift
// Canvas/CollageCanvasView.swift
import SwiftUI

struct CollageCanvasView: View {
    let items: [CollageItem]
    let canvasSize: CGSize
    @Binding var selectedID: UUID?
    let onCommitMove: (CollageItem, CGPoint) -> Void

    var body: some View {
        ZStack {
            Color.white
            ForEach(
                items.sorted { $0.zIndex < $1.zIndex },
                id: \.id
            ) { item in
                CollageItemTile(item: item, isSelected: selectedID == item.id)
                    .onTapGesture { selectedID = item.id }
                    .gesture(dragGesture(for: item))
            }
        }
        .frame(width: canvasSize.width, height: canvasSize.height)
        .clipped()
    }

    private func dragGesture(for item: CollageItem) -> some Gesture {
        DragGesture()
            .onEnded { value in
                let newCenter = CGPoint(
                    x: item.xPosition + item.width / 2 + value.translation.width,
                    y: item.yPosition + item.height / 2 + value.translation.height
                )
                onCommitMove(item, newCenter)
            }
    }
}
```

```swift
// Canvas/CollageItemTile.swift
import SwiftUI

struct CollageItemTile: View {
    let item: CollageItem
    let isSelected: Bool

    var body: some View {
        ZStack {
            if let data = item.imageData, let ui = UIImage(data: data) {
                Image(uiImage: ui)
                    .resizable()
                    .scaledToFill()
            } else {
                Rectangle()
                    .fill(Color.gray.opacity(0.25))
                    .overlay {
                        Image(systemName: "photo")
                            .foregroundStyle(.gray)
                    }
            }
        }
        .frame(width: item.width, height: item.height)
        .clipped()
        .rotationEffect(.degrees(item.rotation))
        .overlay {
            if isSelected {
                RoundedRectangle(cornerRadius: 2)
                    .stroke(Color.accentColor, lineWidth: 2)
            }
        }
        .position(
            x: item.xPosition + item.width / 2,
            y: item.yPosition + item.height / 2
        )
    }
}

#Preview {
    CollageCanvasView(
        items: [],
        canvasSize: CGSize(width: 400, height: 400),
        selectedID: .constant(nil),
        onCommitMove: { _, _ in }
    )
    .border(Color.gray)
}
```
5. Layout templates
Named templates produce pre-calculated `CGRect` frames for each photo slot, so when a user taps "Magazine" the existing items snap to the new positions automatically. Adding a new template is a single case in the enum plus a `frames(in:)` implementation — no other code changes required.
```swift
// Canvas/LayoutTemplate.swift
import CoreGraphics

enum LayoutTemplate: String, CaseIterable, Identifiable {
    case grid2x2, magazine, strip, triptych

    var id: String { rawValue }

    var displayName: String {
        switch self {
        case .grid2x2: return "2×2 Grid"
        case .magazine: return "Magazine"
        case .strip: return "Strip"
        case .triptych: return "Triptych"
        }
    }

    var slotCount: Int {
        switch self {
        case .grid2x2: return 4
        case .magazine: return 3
        case .strip: return 3
        case .triptych: return 3
        }
    }

    var systemImage: String {
        switch self {
        case .grid2x2: return "square.grid.2x2"
        case .magazine: return "rectangle.split.2x1"
        case .strip: return "rectangle.split.3x1"
        case .triptych: return "square.split.2x2"
        }
    }

    func frames(in size: CGSize) -> [CGRect] {
        let p: CGFloat = 4 // padding between slots
        let w = size.width
        let h = size.height
        switch self {
        case .grid2x2:
            let sw = (w - p * 3) / 2
            let sh = (h - p * 3) / 2
            return [
                CGRect(x: p, y: p, width: sw, height: sh),
                CGRect(x: sw + p * 2, y: p, width: sw, height: sh),
                CGRect(x: p, y: sh + p * 2, width: sw, height: sh),
                CGRect(x: sw + p * 2, y: sh + p * 2, width: sw, height: sh),
            ]
        case .magazine:
            let lw = w * 0.55 - p * 1.5
            let rw = w - lw - p * 3
            let rh = (h - p * 3) / 2
            return [
                CGRect(x: p, y: p, width: lw, height: h - p * 2),
                CGRect(x: lw + p * 2, y: p, width: rw, height: rh),
                CGRect(x: lw + p * 2, y: rh + p * 2, width: rw, height: rh),
            ]
        case .strip:
            let sw = (w - p * 4) / 3
            return (0..<3).map { i in
                CGRect(x: p + CGFloat(i) * (sw + p),
                       y: p,
                       width: sw,
                       height: h - p * 2)
            }
        case .triptych:
            let th = h * 0.6 - p * 1.5
            let bh = h - th - p * 3
            let bw = (w - p * 3) / 2
            return [
                CGRect(x: p, y: p, width: w - p * 2, height: th),
                CGRect(x: p, y: th + p * 2, width: bw, height: bh),
                CGRect(x: bw + p * 2, y: th + p * 2, width: bw, height: bh),
            ]
        }
    }
}
```
6. Export and sharing
Use `ImageRenderer` (available from iOS 16) to convert the live canvas view into a `UIImage`, then present it via `ShareLink`. Rendering at `scale = 3.0` produces a 3× retina image suitable for printing or full-resolution social sharing without any additional Core Graphics work.
```swift
// Views/ExportButton.swift
import SwiftUI

struct ExportButton: View {
    let items: [CollageItem]
    let canvasSize: CGSize
    @State private var exportedImage: UIImage?
    @State private var isShowingShare = false

    var body: some View {
        Button {
            Task { @MainActor in
                exportedImage = renderCanvas()
                isShowingShare = exportedImage != nil
            }
        } label: {
            Label("Export", systemImage: "square.and.arrow.up")
        }
        .sheet(isPresented: $isShowingShare) {
            if let ui = exportedImage {
                let image = Image(uiImage: ui)
                ShareLink(
                    item: image,
                    preview: SharePreview("My Collage", image: image)
                )
                .presentationDetents([.medium])
            }
        }
    }

    @MainActor
    private func renderCanvas() -> UIImage {
        let renderer = ImageRenderer(
            content: CollageCanvasView(
                items: items,
                canvasSize: canvasSize,
                selectedID: .constant(nil),
                onCommitMove: { _, _ in }
            )
            .frame(width: canvasSize.width, height: canvasSize.height)
        )
        renderer.scale = 3.0
        return renderer.uiImage ?? UIImage()
    }
}

#Preview {
    ExportButton(items: [], canvasSize: CGSize(width: 400, height: 400))
}
```
7. Privacy Manifest (required for App Store)
Any app using PhotosUI must include a `PrivacyInfo.xcprivacy` file; add it in Xcode via File → New → File and choose the App Privacy template. The App Store review pipeline will reject builds submitted without it if your app accesses photos or the file system, or uses any Required Reason APIs such as `UserDefaults`.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- PrivacyInfo.xcprivacy -->
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSPrivacyTracking</key>
    <false/>
    <key>NSPrivacyTrackingDomains</key>
    <array/>
    <key>NSPrivacyCollectedDataTypes</key>
    <array>
        <dict>
            <key>NSPrivacyCollectedDataType</key>
            <string>NSPrivacyCollectedDataTypePhotosorVideos</string>
            <key>NSPrivacyCollectedDataTypeLinked</key>
            <false/>
            <key>NSPrivacyCollectedDataTypeTracking</key>
            <false/>
            <key>NSPrivacyCollectedDataTypePurposes</key>
            <array>
                <string>NSPrivacyCollectedDataTypePurposeAppFunctionality</string>
            </array>
        </dict>
    </array>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryUserDefaults</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>CA92.1</string> <!-- store user preferences -->
            </array>
        </dict>
    </array>
</dict>
</plist>
```
Common pitfalls
- Storing raw image `Data` in SwiftData at scale. Each full-resolution photo can be 5–15 MB, so four photos per collage balloons your model container fast. For production, write images to the app's `Documents` directory and store file URLs in the model instead of inline `Data`.
- `ImageRenderer` must run on the main actor. Calling it from a background `Task` crashes at runtime. Always annotate your render function `@MainActor` or wrap the call in `Task { @MainActor in … }`.
- Missing Info.plist photo usage strings. Both `NSPhotoLibraryUsageDescription` and `NSPhotoLibraryAddUsageDescription` are required. The App Store review team will reject your app on first submission if either is missing, adding days to your release cycle.
- App Store review: "app is similar to a built-in app." Apple occasionally flags collage apps for duplicating iOS functionality. Make sure your app has a distinct value proposition (e.g. custom templates, brand kit, watermarks, print ordering) visible in screenshots and the app description.
- `PhotosPicker` on the Simulator returns low-resolution placeholder assets. Always test photo loading and export on a physical device — the Simulator's asset library does not reflect real-world image sizes or EXIF orientation quirks.
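The file-URL fix from the first pitfall can be sketched as follows. This is a minimal illustration, not code from the app above: the `ImageStore` type and its method names are invented for this example.

```swift
import Foundation

// Sketch: persist image bytes under Documents and keep only a short
// file-name string in the SwiftData model instead of inline Data.
enum ImageStore {
    // Documents directory, created on first use.
    static var directory: URL {
        let base = FileManager.default.urls(for: .documentDirectory,
                                            in: .userDomainMask).first
            ?? FileManager.default.temporaryDirectory
        try? FileManager.default.createDirectory(at: base,
                                                 withIntermediateDirectories: true)
        return base
    }

    // Write the bytes once, return the name to store on CollageItem.
    static func saveImageData(_ data: Data) throws -> String {
        let name = UUID().uuidString + ".jpg"
        try data.write(to: directory.appendingPathComponent(name), options: .atomic)
        return name
    }

    // Re-read the bytes on demand when rendering the tile.
    static func loadImageData(named name: String) -> Data? {
        try? Data(contentsOf: directory.appendingPathComponent(name))
    }
}
```

With this in place, `CollageItem.imageData: Data?` would become something like `imageFileName: String?`, keeping the model container small regardless of photo resolution.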
Adding monetization: One-time purchase
Implement a single non-consumable in-app purchase using StoreKit 2. Create the product in App Store Connect (type: Non-Consumable, e.g. `com.yourapp.unlock`), then use `Product.products(for:)` to fetch it at launch and `product.purchase()` on tap. Listen to `Transaction.updates` in a background `Task` at app start to restore purchases automatically — this satisfies App Store Review Guideline 3.1.1 without requiring a separate "Restore" button, though adding one is good practice. Gate premium templates (magazine, triptych) behind the purchase flag stored in `UserDefaults` or a SwiftData preference record, and keep two free templates available so new users can evaluate the app before buying.
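A sketch of that flow, roughly what the `Store/StoreManager.swift` slot in the architecture tree would hold. The product ID `com.yourapp.unlock` is a placeholder; create the real non-consumable in App Store Connect first.

```swift
import StoreKit

@MainActor
final class StoreManager: ObservableObject {
    @Published private(set) var isUnlocked = false
    private let productID = "com.yourapp.unlock" // placeholder ID
    private var updatesTask: Task<Void, Never>?

    // Call once at app start.
    func start() {
        // Transactions arriving outside a purchase flow: restores,
        // family sharing, Ask to Buy approvals.
        updatesTask = Task {
            for await result in Transaction.updates {
                if case .verified(let transaction) = result {
                    if transaction.productID == productID { isUnlocked = true }
                    await transaction.finish()
                }
            }
        }
        Task { await refreshEntitlements() }
    }

    // Re-check what the user already owns.
    func refreshEntitlements() async {
        for await result in Transaction.currentEntitlements {
            if case .verified(let transaction) = result,
               transaction.productID == productID {
                isUnlocked = true
            }
        }
    }

    // Tap handler for the "Unlock" button.
    func purchaseUnlock() async throws {
        guard let product = try await Product.products(for: [productID]).first
        else { return }
        let result = try await product.purchase()
        if case .success(.verified(let transaction)) = result {
            isUnlocked = true
            await transaction.finish()
        }
    }
}
```

Views can then gate premium templates on `storeManager.isUnlocked` instead of reading a raw `UserDefaults` flag directly.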
Shipping this faster with Soarias
Soarias automates the parts of an intermediate iOS project that eat the most calendar time without teaching you anything new. For a collage app specifically, it scaffolds the full SwiftData model container, generates the PrivacyInfo.xcprivacy with the correct photo-library data-type entries, wires up a StoreKit 2 non-consumable purchase flow, sets up fastlane deliver with App Store Connect API key auth, and runs pilot upload to push your first TestFlight build — all before you write your first layout template.
At intermediate complexity, the manual path typically spends two to three days on scaffolding, metadata, and CI plumbing before any real feature work begins. With Soarias, that overhead compresses to under an hour, so your one-week estimate becomes four to five focused days on the canvas engine and templates — the parts that actually differentiate your app in the App Store.
FAQ
Does this work on iOS 16?
`ImageRenderer` and `PhotosPicker` were both introduced in iOS 16, so the core export and picker flows work there. However, SwiftData requires iOS 17 — if you need to target iOS 16, replace SwiftData with Core Data or a plain JSON file store. The guide is written for iOS 17+, which is the recommended minimum for new App Store submissions in 2026.
Do I need a paid Apple Developer account to test?
No — you can sideload to a personal device with a free Apple ID via Xcode. However, a free account limits you to three apps simultaneously and the install expires after seven days, requiring re-signing. For testing photo library access, StoreKit sandbox purchases, and TestFlight distribution, a paid Apple Developer Program membership ($99/year) is effectively required.
How do I add this to the App Store?
Create your app record in App Store Connect, add at least one screenshot per supported device size (6.9" and 6.5" cover most modern iPhones), fill in the description, keywords, and privacy nutrition labels, set your pricing (free with one-time IAP unlock), then archive and upload from Xcode via Product → Archive → Distribute App → App Store Connect. First-time submissions typically take one to three business days for review.
What's the best way to handle large batches of photos without blocking the UI?
Load `PhotosPickerItem` transfers inside an async `Task` (as shown in Step 3) so the main thread stays free. For additional safety with many photos, process items sequentially with `await` in a plain `for` loop rather than firing concurrent tasks, and show a `ProgressView` overlay during loading. If you want true background processing, move the `Data` writes to a separate actor — but for a four-slot collage this is typically unnecessary.
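The one-at-a-time pattern can be sketched as follows. The `loadSequentially` helper and its `onProgress` callback are illustrative names, not part of the code in Step 3:

```swift
import SwiftUI
import PhotosUI

// Sketch: load picker selections sequentially so memory stays bounded
// and the main thread stays free. `onProgress` can drive a ProgressView.
func loadSequentially(
    _ items: [PhotosPickerItem],
    onProgress: @MainActor (Double) -> Void
) async -> [UIImage] {
    var images: [UIImage] = []
    for (index, item) in items.enumerated() {
        // Each transfer finishes before the next begins, so at most one
        // photo's Data is in flight at a time.
        if let data = try? await item.loadTransferable(type: Data.self),
           let image = UIImage(data: data) {
            images.append(image)
        }
        await onProgress(Double(index + 1) / Double(items.count))
    }
    return images
}
```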
Last reviewed: 2026-05-12 by the Soarias team.