How to implement a cropping tool in SwiftUI

iOS 17+ · Xcode 16+ · APIs: UIGraphicsImageRenderer · Updated: 2026-05-12

TL;DR

Layer a DragGesture over an Image to track a crop rectangle in view-space, then convert those coordinates to image-space and render the cropped region with UIGraphicsImageRenderer.

// Minimal crop call (rect is in the image's point coordinates)
func crop(_ image: UIImage, to rect: CGRect) -> UIImage {
    // Match the source's scale so the crop keeps its pixel density;
    // the renderer otherwise defaults to the main screen's scale.
    let format = UIGraphicsImageRendererFormat.default()
    format.scale = image.scale
    let renderer = UIGraphicsImageRenderer(size: rect.size, format: format)
    return renderer.image { _ in
        image.draw(at: CGPoint(x: -rect.minX, y: -rect.minY))
    }
}

Full implementation

import SwiftUI

struct CropTool: View {
    let source: UIImage
    @State private var cropRect: CGRect = .zero
    @State private var dragStart: CGPoint = .zero
    @State private var croppedImage: UIImage?
    @State private var renderedSize: CGSize = .zero

    var body: some View {
        VStack(spacing: 20) {
            GeometryReader { geo in
                let fitted = fittedRect(for: source.size, in: geo.size)
                ZStack(alignment: .topLeading) {
                    Image(uiImage: source)
                        .resizable()
                        .scaledToFit()
                        // Pin the image to its fitted size so view
                        // coordinates map onto the rendered image
                        // from the top-leading corner of the ZStack.
                        .frame(width: fitted.width, height: fitted.height)
                        .onAppear { renderedSize = fitted.size }

                    if cropRect != .zero {
                        Rectangle()
                            .stroke(Color.yellow, lineWidth: 2)
                            .background(Color.black.opacity(0.25))
                            .frame(width: cropRect.width,
                                   height: cropRect.height)
                            .offset(x: cropRect.origin.x,
                                    y: cropRect.origin.y)
                            .allowsHitTesting(false)
                    }
                }
                .gesture(
                    DragGesture(minimumDistance: 4)
                        .onChanged { v in
                            // Anchor on the gesture's start point;
                            // translation is already nonzero by the
                            // first onChanged call when
                            // minimumDistance is nonzero.
                            dragStart = v.startLocation
                            let x = min(v.location.x, dragStart.x)
                            let y = min(v.location.y, dragStart.y)
                            let w = abs(v.location.x - dragStart.x)
                            let h = abs(v.location.y - dragStart.y)
                            cropRect = CGRect(x: x, y: y, width: w, height: h)
                        }
                        .onEnded { _ in
                            croppedImage = performCrop(
                                viewRect: cropRect,
                                renderedSize: renderedSize)
                        }
                )
            }
            .frame(height: 320)

            if let result = croppedImage {
                Image(uiImage: result)
                    .resizable()
                    .scaledToFit()
                    .frame(maxHeight: 160)
                    .clipShape(RoundedRectangle(cornerRadius: 10))
                    .overlay(RoundedRectangle(cornerRadius: 10)
                        .stroke(Color.yellow, lineWidth: 1))
            }

            Button("Reset") { cropRect = .zero; croppedImage = nil }
                .buttonStyle(.borderedProminent)
                .tint(.yellow)
        }
        .padding()
    }

    // MARK: - Helpers

    private func fittedRect(for imageSize: CGSize, in viewSize: CGSize) -> CGRect {
        let scale = min(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height)
        let w = imageSize.width * scale
        let h = imageSize.height * scale
        let x = (viewSize.width - w) / 2
        let y = (viewSize.height - h) / 2
        return CGRect(x: x, y: y, width: w, height: h)
    }

    private func performCrop(viewRect: CGRect,
                              renderedSize: CGSize) -> UIImage? {
        // Clamp the selection to the displayed image's bounds.
        let clamped = viewRect.intersection(
            CGRect(origin: .zero, size: renderedSize))
        guard clamped.width > 4, clamped.height > 4 else { return nil }
        let scaleX = source.size.width  / renderedSize.width
        let scaleY = source.size.height / renderedSize.height
        let imageRect = CGRect(x: clamped.minX * scaleX,
                               y: clamped.minY * scaleY,
                               width: clamped.width  * scaleX,
                               height: clamped.height * scaleY)
        // Preserve the source's scale so the crop keeps its pixel density.
        let format = UIGraphicsImageRendererFormat.default()
        format.scale = source.scale
        let renderer = UIGraphicsImageRenderer(size: imageRect.size,
                                               format: format)
        return renderer.image { _ in
            source.draw(at: CGPoint(x: -imageRect.minX,
                                    y: -imageRect.minY))
        }
    }
}

#Preview {
    CropTool(source: UIImage(systemName: "photo.artframe")!
        .withTintColor(.systemIndigo, renderingMode: .alwaysOriginal))
}

How it works

  1. Coordinate tracking. DragGesture.onChanged anchors the selection at the gesture's start location and continuously recomputes a CGRect in view space as the finger moves, which drives the live yellow selection overlay.
  2. Scale mapping. fittedRect computes the aspect-fit scale applied by .scaledToFit; multiplying the view-space crop rect by the source-to-rendered size ratio converts it to image space before it reaches the renderer.
  3. UIGraphicsImageRenderer crop. The renderer is initialized at the cropped region's size in image points, then source.draw(at:) is called with a negative offset so only the desired region falls inside the canvas; the renderer handles color space and scale for you.
  4. Result display. onEnded calls performCrop, which returns the cropped UIImage; storing it in @State causes SwiftUI to re-render the preview thumbnail below the canvas.
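Step 2's mapping is pure geometry, so it can be unit-tested without any UI. A minimal sketch, extracted as a standalone function (the function name and parameters are illustrative, not part of the view above):

```swift
import Foundation  // provides CGRect/CGSize geometry (CoreGraphics on Apple platforms)

/// Maps a crop rectangle from view points to image points, given the
/// size the image is rendered at and the image's intrinsic size.
func viewToImageRect(viewRect: CGRect,
                     renderedSize: CGSize,
                     imageSize: CGSize) -> CGRect {
    // Ratio between the intrinsic image size and its on-screen size.
    let scaleX = imageSize.width / renderedSize.width
    let scaleY = imageSize.height / renderedSize.height
    return CGRect(x: viewRect.minX * scaleX,
                  y: viewRect.minY * scaleY,
                  width: viewRect.width * scaleX,
                  height: viewRect.height * scaleY)
}
```

Because both axes use the same aspect-fit scale, scaleX and scaleY are equal in practice; keeping them separate makes the function reusable for non-uniform scaling.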

Common pitfalls

  1. Anchoring the drag on translation == .zero never fires when minimumDistance is nonzero; read value.startLocation instead.
  2. UIGraphicsImageRenderer defaults to the main screen's scale, so a crop can silently change pixel density; set the renderer format's scale to the source image's scale.
  3. A drag that leaves the image produces a crop rect outside its bounds; clamp the selection with intersection(_:) before mapping it to image space.

Claude Code prompt to generate this

Create a SwiftUI view called CropTool that lets the user drag a rectangle over a UIImage to select a crop region. Show a live yellow border overlay during the drag. On drag end, use UIGraphicsImageRenderer to crop the image to the selected region and display the result below the canvas. Include a Reset button. Use iOS 17+ APIs, the #Preview macro, and no third-party dependencies.

FAQ

Does this work on iOS 16?

The UIGraphicsImageRenderer crop logic is compatible with iOS 13+, as is DragGesture(minimumDistance:). Only the #Preview macro used here requires iOS 17. To target iOS 16, replace #Preview with a PreviewProvider conformance; the rest compiles unchanged.
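The iOS 16 preview fallback is a few lines; a sketch, placed alongside the view in the same SwiftUI file and reusing the same placeholder image as the #Preview above:

```swift
// iOS 16 fallback: PreviewProvider conformance instead of #Preview.
struct CropTool_Previews: PreviewProvider {
    static var previews: some View {
        CropTool(source: UIImage(systemName: "photo.artframe")!
            .withTintColor(.systemIndigo, renderingMode: .alwaysOriginal))
    }
}
```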

What is the UIKit equivalent?

In UIKit you would use UIScrollView with a nested UIImageView for pan/zoom, overlay a custom UIView drawing the crop rect via CAShapeLayer, and call the same UIGraphicsImageRenderer snippet on a UIButton tap. SwiftUI's DragGesture + GeometryReader pipeline replaces roughly 120 lines of UIKit gesture recognizer boilerplate with about 20.
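A condensed sketch of that UIKit pipeline, assuming a hypothetical CropViewController (layout, zoom, and the final crop call are omitted; all names are illustrative):

```swift
import UIKit

// Pan gesture draws the selection into a CAShapeLayer over the image.
final class CropViewController: UIViewController {
    private let imageView = UIImageView(image: UIImage(named: "photo"))
    private let shapeLayer = CAShapeLayer()
    private var dragStart: CGPoint = .zero

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        imageView.isUserInteractionEnabled = true
        view.addSubview(imageView)

        // Yellow stroke plus dimmed fill, matching the SwiftUI overlay.
        shapeLayer.strokeColor = UIColor.yellow.cgColor
        shapeLayer.fillColor = UIColor.black.withAlphaComponent(0.25).cgColor
        shapeLayer.lineWidth = 2
        imageView.layer.addSublayer(shapeLayer)

        imageView.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(pan)))
    }

    @objc private func pan(_ gesture: UIPanGestureRecognizer) {
        let point = gesture.location(in: imageView)
        if gesture.state == .began { dragStart = point }
        let rect = CGRect(x: min(point.x, dragStart.x),
                          y: min(point.y, dragStart.y),
                          width: abs(point.x - dragStart.x),
                          height: abs(point.y - dragStart.y))
        shapeLayer.path = UIBezierPath(rect: rect).cgPath
        // On .ended, map `rect` to image space and run the same
        // UIGraphicsImageRenderer crop shown earlier.
    }
}
```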

Last reviewed: 2026-05-12 by the Soarias team.