How to Build a Ringtone Maker App in SwiftUI

A Ringtone Maker app lets users pick any song from their library, trim a clip up to 30 seconds, and export it as an .m4r file they can use as a custom ringtone. It's a proven App Store category that serves music fans and personalisation enthusiasts who want unique tones without paying per-clip.

iOS 17+ · Xcode 16+ · SwiftUI · Complexity: Intermediate · Estimated time: 1 week · Updated: May 12, 2026

Prerequisites

- A Mac running Xcode 16 or later
- An iPhone or simulator on iOS 17 or later
- Working knowledge of SwiftUI and Swift concurrency (async/await)
- An Apple Developer account (a free Apple ID suffices for on-device testing; see the FAQ for distribution requirements)

Architecture overview

The app uses SwiftData to persist ringtone metadata (name, source URL, trim start/end), AVFoundation's AVAssetExportSession to trim and write audio segments, and a PHPickerViewController wrapper (PhotosUI) for media selection. Note that PHPicker only surfaces photo-library assets, such as audio extracted from videos; picking songs from the user's Music library requires MPMediaPickerController from the MediaPlayer framework, and DRM-protected Apple Music tracks cannot be exported at all. State flows from a SwiftData @Query in the list view down to an @Observable AudioTrimmer service that drives the async export. There is no network layer — everything is local-first.

RingtoneApp/
├── Models/
│   └── Ringtone.swift          # SwiftData @Model
├── Views/
│   ├── RingtoneListView.swift  # Main list + navigation
│   ├── RingtoneTrimView.swift  # Waveform + trim handles
│   └── AudioPickerView.swift   # PHPicker wrapper
└── Services/
    └── AudioTrimmer.swift      # AVFoundation export

Step-by-step

1. Data model

Persist ringtone metadata with SwiftData so clips survive app restarts and can be listed, renamed, or deleted.

import SwiftData
import Foundation

@Model
final class Ringtone {
    var id: UUID
    var name: String
    var sourceFileURL: URL
    var trimStart: Double   // seconds
    var trimEnd: Double     // max 30 s for standard ringtones
    var createdAt: Date

    init(name: String, sourceFileURL: URL,
         trimStart: Double = 0, trimEnd: Double = 30) {
        self.id            = UUID()
        self.name          = name
        self.sourceFileURL = sourceFileURL
        self.trimStart     = trimStart
        self.trimEnd       = min(trimEnd, trimStart + 30)
        self.createdAt     = Date()
    }

    var duration: Double { trimEnd - trimStart }
}
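The @Query used in the next step only resolves if the model is registered with a container. A minimal app entry point might look like this (the RingtoneMakerApp name is illustrative):

```swift
import SwiftUI
import SwiftData

@main
struct RingtoneMakerApp: App {
    var body: some Scene {
        WindowGroup {
            RingtoneListView()
        }
        // Registers the Ringtone model and injects a ModelContext
        // into the environment, which @Query and @Environment(\.modelContext)
        // in the views below rely on.
        .modelContainer(for: Ringtone.self)
    }
}
```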

2. Core UI

A NavigationStack with a query-driven List handles browsing saved clips and routing into the trim editor sheet.

struct RingtoneListView: View {
    @Query(sort: \Ringtone.createdAt, order: .reverse)
    private var ringtones: [Ringtone]
    @Environment(\.modelContext) private var modelContext
    @State private var showingPicker     = false
    @State private var selectedRingtone: Ringtone?

    var body: some View {
        NavigationStack {
            List {
                ForEach(ringtones) { ringtone in
                    RingtoneRow(ringtone: ringtone)
                        .contentShape(Rectangle())
                        .onTapGesture { selectedRingtone = ringtone }
                }
                .onDelete { offsets in
                    offsets.forEach { modelContext.delete(ringtones[$0]) }
                }
            }
            .navigationTitle("My Ringtones")
            .toolbar {
                ToolbarItem(placement: .primaryAction) {
                    Button("Add", systemImage: "plus") { showingPicker = true }
                }
            }
            .sheet(isPresented: $showingPicker)  { AudioPickerView() }
            .sheet(item: $selectedRingtone) { RingtoneTrimView(ringtone: $0) }
        }
    }
}
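RingtoneRow is referenced above but never defined. A minimal sketch, assuming only the Ringtone model from step 1:

```swift
import SwiftUI

struct RingtoneRow: View {
    let ringtone: Ringtone

    var body: some View {
        HStack {
            VStack(alignment: .leading) {
                Text(ringtone.name)
                    .font(.headline)
                Text(ringtone.createdAt, style: .date)
                    .font(.caption)
                    .foregroundStyle(.secondary)
            }
            Spacer()
            // duration is the computed property on the model (trimEnd - trimStart)
            Text(String(format: "%.0f s", ringtone.duration))
                .monospacedDigit()
                .foregroundStyle(.secondary)
        }
    }
}
```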

3. Audio clip creation

Use AVAssetExportSession with a CMTimeRange to write the trimmed segment. The export produces an .m4a file; because .m4r is the same MPEG-4 audio container under a different extension, renaming the file is all that is needed before sharing it as a ringtone.

import AVFoundation

@Observable
final class AudioTrimmer {
    var isExporting = false

    func export(ringtone: Ringtone, to outputURL: URL) async throws {
        isExporting = true
        defer { isExporting = false }

        // AVAssetExportSession fails if a file already exists at outputURL.
        try? FileManager.default.removeItem(at: outputURL)

        let asset = AVURLAsset(url: ringtone.sourceFileURL)
        guard let session = AVAssetExportSession(
            asset: asset,
            presetName: AVAssetExportPresetAppleM4A
        ) else { throw TrimError.noSession }

        let start = CMTime(seconds: ringtone.trimStart, preferredTimescale: 600)
        let end   = CMTime(seconds: ringtone.trimEnd,   preferredTimescale: 600)
        session.outputURL      = outputURL
        session.outputFileType = .m4a
        session.timeRange      = CMTimeRange(start: start, end: end)

        try await withCheckedThrowingContinuation { (cont: CheckedContinuation<Void, Error>) in
            session.exportAsynchronously {
                // Inspect status: error can be nil even when the export
                // did not complete (e.g. on cancellation).
                switch session.status {
                case .completed: cont.resume()
                case .cancelled: cont.resume(throwing: TrimError.cancelled)
                default:         cont.resume(throwing: session.error ?? TrimError.unknown)
                }
            }
        }
    }

    enum TrimError: Error { case noSession, cancelled, unknown }
}
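A hypothetical call site for the trimmer, e.g. an Export button action in RingtoneTrimView (the exportTapped name and the ShareLink hand-off are illustrative):

```swift
import Foundation

// Exports the trimmed clip to a temporary .m4a file, then renames it to
// .m4r — the same MPEG-4 audio container, so no re-encoding is needed.
func exportTapped(ringtone: Ringtone, trimmer: AudioTrimmer) {
    Task {
        let tmp    = FileManager.default.temporaryDirectory
        let m4aURL = tmp.appendingPathComponent(UUID().uuidString)
                        .appendingPathExtension("m4a")
        let m4rURL = tmp.appendingPathComponent(ringtone.name)
                        .appendingPathExtension("m4r")
        do {
            try await trimmer.export(ringtone: ringtone, to: m4aURL)
            try? FileManager.default.removeItem(at: m4rURL) // clear stale copy
            try FileManager.default.moveItem(at: m4aURL, to: m4rURL)
            // Hand m4rURL to ShareLink / the share sheet here.
        } catch {
            print("Export failed: \(error)")
        }
    }
}
```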

4. Privacy Manifest

Add PrivacyInfo.xcprivacy to your app target — since May 2024, App Store Connect rejects uploads that use required-reason APIs (file-timestamp access falls into this category) without declaring them in a privacy manifest.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
    "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSPrivacyTracking</key><false/>
    <key>NSPrivacyCollectedDataTypes</key><array/>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryFileTimestamp</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array><string>C617.1</string></array>
        </dict>
    </array>
    <!-- Add NSMicrophoneUsageDescription in Info.plist only
         if you add live recording; file-based trimming alone
         does not require microphone permission -->
</dict>
</plist>

Common pitfalls

- Standard ringtones are capped at 30 seconds; the model's initializer clamps trimEnd for this reason, so enforce the same limit in the trim UI.
- AVAssetExportSession fails if a file already exists at its outputURL; remove any stale file before exporting.
- URLs handed back by document or media pickers may be security-scoped; copy the file into your app's sandbox before passing it to AVFoundation.
- There is no API to set the system ringtone from inside your app (see the FAQ), so make the manual install step explicit in your export flow.

Adding monetization: One-time purchase

Use StoreKit 2's Product.purchase() to gate the export feature behind a non-consumable in-app purchase. Define the product (e.g. com.yourapp.unlock) in App Store Connect, then call Transaction.currentEntitlement(for:) at app launch to restore access across reinstalls. Keep the free tier genuinely useful — let users import tracks, scrub the waveform, and preview clips at full quality — but require the purchase to export and share the .m4r file. This free-to-preview approach minimises refund requests because users know exactly what they are unlocking before they pay.
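A sketch of that StoreKit 2 flow, using the com.yourapp.unlock product ID from above; the PurchaseManager type and its method names are assumptions, not part of StoreKit:

```swift
import StoreKit

@Observable
final class PurchaseManager {
    static let unlockID = "com.yourapp.unlock"  // must match App Store Connect
    var isUnlocked = false

    // Call at app launch: restores access across reinstalls, because a
    // non-consumable entitlement follows the Apple ID, not the install.
    func refreshEntitlement() async {
        if let result = await Transaction.currentEntitlement(for: Self.unlockID),
           case .verified = result {
            isUnlocked = true
        }
    }

    // Call from the paywall's buy button.
    func purchase() async throws {
        guard let product = try await Product.products(for: [Self.unlockID]).first
        else { return }
        let result = try await product.purchase()
        if case .success(let verification) = result,
           case .verified(let transaction) = verification {
            isUnlocked = true
            await transaction.finish()
        }
        // .userCancelled and .pending fall through without unlocking.
    }
}
```

Gate the export path on `isUnlocked` while leaving import, scrubbing, and preview available in the free tier, as described above.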

Shipping this faster with Soarias

Soarias scaffolds the complete project from your description — SwiftData model, AudioTrimmer service, PHPickerViewController wrapper, StoreKit paywall, and the PrivacyInfo.xcprivacy file — in a single step. It also generates fastlane lanes for screenshot capture and handles the App Store Connect metadata form including age rating, export compliance declaration, and content rights, which typically takes two to three hours the first time through.

At intermediate complexity, most solo developers spend five to seven days going from a blank Xcode project to a first TestFlight build. With Soarias handling scaffolding, StoreKit wiring, and ASC submission automation, that window typically narrows to two to three days, freeing you to concentrate on the trim-editor UX — the feature users actually pay for.

FAQ

Do I need a paid Apple Developer account?

Yes. A free Apple ID lets you sideload onto your own device for development testing, but you need the $99/year Apple Developer Program membership to distribute on the App Store or invite external testers via TestFlight.

How do I submit this to the App Store?

Archive your build in Xcode via Product → Archive, then use the Organizer to upload to App Store Connect. Fill in your app metadata, add screenshots for every required device size, set your pricing, and submit for review. First submissions typically take one to three days to clear review.

Can my app set the iPhone ringtone automatically?

No. Apple does not provide a public API to programmatically set the system ringtone. Your app exports the .m4r clip; the user installs it manually by syncing with Finder on macOS or using AirDrop. Be explicit about this workflow in your App Store description to avoid low ratings from users who expected one-tap setting.

Last reviewed: 2026-05-12 by the Soarias team.