The Rise of AI Coding Agents: Building an iOS App with OpenAI Codex in Xcode

Published: February 18, 2026

I’ve been using OpenAI’s Codex for the past few weeks inside the latest version of Apple’s Xcode (26.3).

Something has shifted.

For the first time in my career, writing software feels less like typing code — and more like directing it.

Codex isn’t just autocomplete. It’s an AI-powered software engineering agent that can write, understand, test, refactor, and reason about code across an entire project. And that difference — reasoning across context — changes everything.

We’ve had autocomplete for years.
We’ve had snippets.
We’ve all had Stack Overflow open on a second monitor.

But this feels different.

This feels collaborative.


From Code Completion to AI Code Collaboration

Traditional AI coding tools focus on the line you’re typing.

Codex 3.5 thinks in terms of features, files, and workflows.

Inside Xcode, I didn’t just ask it to write a function. I asked it to:

  • Create a SwiftUI view hierarchy
  • Implement a playback engine using AVFoundation
  • Handle background audio and lock screen controls
  • Add AirPlay support
  • Refactor models into a clean MVVM structure
  • Debug build errors
  • Suggest performance improvements

Instead of giving fragmented answers, it reasoned about the entire project.

It read my files.
It understood context.
It suggested structural improvements.
It caught mistakes before I even ran the simulator.

At times it felt less like autocomplete and more like programming with a team member who never takes lunch breaks and never loses context.


Building a Custom MP3 Player with SwiftUI and OpenAI Codex

Recently I decided to move away from subscription streaming services and take back control of my music library.

I’ve been converting my old CD collection, organising MP3 purchases, and adding a few new and second-hand albums along the way. Rather than just dumping everything into Apple Music, I decided to build my own digital audio player for iPhone using SwiftUI and GPT-5.3-Codex.

Partly as an experiment.
Partly as a challenge.
Partly because I wanted something that was completely mine.

App Goals

The app needed to:

  • Import and manage local MP3 and FLAC files
  • Display album artwork
  • Group tracks by album and artist
  • Provide a clean SwiftUI interface
  • Support background playback
  • Integrate AirPlay
  • Create and manage playlists
  • Work entirely offline with local content and zero subscriptions

Simple. Fast. Private. Personal.


Designing the iOS App Before Writing Code

Before writing a single line of code, I created a design brief.

I documented:

  • What I wanted to achieve
  • The core features the app would need
  • How the experience should look and feel
  • Visual layouts for key screens
  • Navigation flow
  • Playback behaviour and interactions

Sharing that brief with Codex changed the entire workflow.

Instead of asking for isolated snippets, I anchored development around a clear vision. Every conversation referenced the intent behind the app — not just the syntax required to build it.

That early clarity mattered more than I expected.


iOS App Architecture: SwiftUI, MVVM and AVFoundation

I started with a basic SwiftUI template. From there, Codex helped structure the app into:

  • Track and Album models
  • A playback manager built around AVPlayer
  • A ViewModel layer using modern observation patterns
  • A clean separation between UI and audio logic

When I asked, “Is this the right architecture for what I’m building?” it didn’t just answer yes or no. It suggested refinements based on the goals in the design brief.

Normally, I’d spend time second-guessing architectural decisions. This time, it felt like discussing trade-offs with another engineer.


Handling Audio Playback in iOS (The Hard Part)

Handling audio on iOS isn’t trivial. In this app you need to:

  • Configure the audio session for playback.
  • Handle interruptions (pause on begin, resume if allowed).
  • Handle AirPlay and audio route changes (pause if the output device disappears).
  • Update lock screen metadata via MPNowPlayingInfoCenter.
  • Support remote transport controls via MPRemoteCommandCenter.
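A condensed sketch of what that checklist looks like in code (a minimal version of the pattern, not the app’s actual playback manager):

```swift
import AVFoundation
import MediaPlayer

final class PlaybackManager {
    let player = AVPlayer()

    // Configure the audio session for playback. Combined with the
    // "audio" background mode in Info.plist, this keeps audio
    // running when the app is backgrounded.
    func configureSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    }

    // Handle interruptions: pause when one begins, resume only if
    // the system says resuming is appropriate.
    func observeInterruptions() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: nil, queue: .main
        ) { [weak self] note in
            guard let info = note.userInfo,
                  let raw = info[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: raw)
            else { return }
            switch type {
            case .began:
                self?.player.pause()
            case .ended:
                if let optRaw = info[AVAudioSessionInterruptionOptionKey] as? UInt,
                   AVAudioSession.InterruptionOptions(rawValue: optRaw)
                       .contains(.shouldResume) {
                    self?.player.play()
                }
            @unknown default:
                break
            }
        }
    }

    // Update lock screen metadata.
    func updateNowPlaying(title: String, artist: String) {
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPMediaItemPropertyArtist: artist
        ]
    }

    // Wire up remote transport controls (lock screen, headphones).
    func configureRemoteCommands() {
        let center = MPRemoteCommandCenter.shared()
        center.playCommand.addTarget { [weak self] _ in
            self?.player.play()
            return .success
        }
        center.pauseCommand.addTarget { [weak self] _ in
            self?.player.pause()
            return .success
        }
    }
}
```

Route-change handling follows the same notification pattern with `AVAudioSession.routeChangeNotification`, pausing when the reason is `.oldDeviceUnavailable`.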

This is typically where documentation tabs multiply.

Instead, when something failed to compile or behave correctly, I pasted the error directly into Codex. It reasoned about the issue in the context of my entire project — not just the snippet in isolation.

In one case, it even warned me about iPad multitasking validation requirements before I hit the App Store error.

That proactive awareness is new territory.


Debugging with an AI Coding Agent

One of the biggest changes wasn’t writing code — it was debugging. Over the course of the project I hit the usual iOS suspects:

  • File URL permissions
  • Info.plist configuration
  • Background audio modes
  • AirPlay routing issues

Normally, this means browser searches, half-relevant forum threads, and trial and error.

Instead, I had a conversation.

The difference wasn’t just speed — it was continuity. The AI remembered the architecture, the files, and the goals. That context made the suggestions sharper and more precise.


What AI Coding Agents Mean for Developers

Using Codex doesn’t remove the need to understand code.

If anything, it raises the bar for architectural thinking.

You still need to know what good structure looks like.
You still need to make decisions.
You still need to recognise when something feels wrong.

But the time spent wiring boilerplate, chasing documentation, and fixing minor syntax errors shrinks dramatically.

Instead of spending:

  • 30 minutes wiring setup code
  • 20 minutes searching documentation
  • 15 minutes debugging small mistakes

You spend time deciding what to build and how it should work.

The role shifts from implementer to director.

And that’s powerful.

The developers who thrive in this new environment won’t necessarily be the fastest typists — they’ll be the clearest thinkers.


From Dusty CDs to a Custom iOS Music App

Now the project has come full circle.

My CD collection is converted.
My albums are organised.
Everything sits locally on my iPhone — no streaming dependency, no subscription required.

I open the app.
I scroll through artwork I imported myself.
I tap play.

The music starts instantly.

In the lounge, I send it to the HiFi via AirPlay.
In the kitchen, it plays through smart speakers.
Out walking, it streams straight to my headphones.

All powered by an app I built myself — with Codex as my co-engineer.

From dusty CDs on a shelf to a custom digital player in my pocket.

AI didn’t just help me write code.

It helped me build something personal.

And this feels less like a novelty — and more like the beginning of a new way to build software.