
I’ve been using OpenAI’s Codex for the past few weeks inside Apple’s latest version of Xcode (26.3).
Something has shifted.
For the first time in my career, writing software feels less like typing code — and more like directing it.
Codex isn’t just autocomplete. It’s an AI-powered software engineering agent that can write, understand, test, refactor, and reason about code across an entire project. And that difference — reasoning across context — changes everything.
We’ve had autocomplete for years.
We’ve had snippets.
We’ve all had Stack Overflow open on a second monitor.
But this feels different.
This feels collaborative.
Traditional AI coding tools focus on the line you’re typing.
Codex thinks in terms of features, files, and workflows.
Inside Xcode, I didn’t just ask it to write a function. I gave it feature-level tasks that spanned multiple files.
Instead of giving fragmented answers, it reasoned about the entire project.
It read my files.
It understood context.
It suggested structural improvements.
It caught mistakes before I even ran the simulator.
At times it felt less like autocomplete and more like programming with a team member who never takes lunch breaks and never loses context.
Recently I decided to move away from subscription streaming services and take back control of my music library.
I’ve been converting my old CD collection, organising MP3 purchases, and adding a few new and second-hand albums along the way. Rather than just dumping everything into Apple Music, I decided to build my own digital audio player for iPhone using SwiftUI and GPT-5.3-Codex.
Partly as an experiment.
Partly as a challenge.
Partly because I wanted something that was completely mine.
What did the app need to be?
Simple. Fast. Private. Personal.
Before writing a single line of code, I created a design brief.
I documented the intent behind the app: what it should do, and how it should feel to use.
Sharing that brief with Codex changed the entire workflow.
Instead of asking for isolated snippets, I anchored development around a clear vision. Every conversation referenced the intent behind the app — not just the syntax required to build it.
That early clarity mattered more than I expected.
I started with a basic SwiftUI template. From there, Codex helped structure the app into cleanly separated components.
When I asked, “Is this the right architecture for what I’m building?” it didn’t just answer yes or no. It suggested refinements based on the goals in the design brief.
Normally, I’d spend time second-guessing architectural decisions. This time, it felt like discussing trade-offs with another engineer.
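To give a flavour of the kind of separation we settled on, here is a minimal sketch of a model layer for a local music library. The names (`Track`, `Album`, `Library`) are my own illustration, not Codex’s actual output, but they show the idea: keep the data model free of any UI code so the SwiftUI views stay thin.

```swift
import Foundation

// Hypothetical model layer for a local music library.
// No UIKit/SwiftUI imports here: the views observe this, never the reverse.
struct Track {
    let title: String
    let duration: TimeInterval
    let fileURL: URL
}

struct Album {
    let title: String
    let artist: String
    let tracks: [Track]
}

struct Library {
    private(set) var albums: [Album] = []

    mutating func add(_ album: Album) {
        albums.append(album)
        // Keep the shelf ordered the way a CD rack would be:
        // by artist first, then album title.
        albums.sort { ($0.artist, $0.title) < ($1.artist, $1.title) }
    }
}
```

Because the model is plain Swift, it can be unit-tested without launching the simulator at all.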
Handling audio on iOS isn’t trivial, and this app touches several of its trickier corners: audio sessions, background playback, and routing to AirPlay and headphones.
This is typically where documentation tabs multiply.
Instead, when something failed to compile or behave correctly, I pasted the error directly into Codex. It reasoned about the issue in the context of my entire project — not just the snippet in isolation.
In one case, it even warned me about iPad multitasking validation requirements before I hit the App Store error.
That proactive awareness is new territory.
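For readers who haven’t fought this battle: before any playback works properly on iOS, the audio session has to be configured. Below is a sketch of the minimal setup an app like this needs; it is an assumption about my setup rather than a drop-in recipe, and the `.playback` category is what keeps audio running when the screen locks and lets the system route it to AirPlay or Bluetooth.

```swift
import AVFoundation

// Configure the shared audio session for music playback.
// .playback keeps audio alive in the background (with the "Audio, AirPlay,
// and Picture in Picture" background mode enabled in the target's capabilities)
// and allows the system to route output to AirPlay, speakers, or headphones.
func configureAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default, options: [])
    try session.setActive(true)
}
```

This is effectively platform configuration, so it only runs on a real device or simulator; forgetting the background-mode capability is exactly the kind of thing Codex flagged for me before the App Store did.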
One of the biggest changes wasn’t writing code — it was debugging.
Normally, this means browser searches, half-relevant forum threads, and trial and error.
Instead, I had a conversation.
The difference wasn’t just speed — it was continuity. The AI remembered the architecture, the files, and the goals. That context made the suggestions sharper and more precise.
Using Codex doesn’t remove the need to understand code.
If anything, it raises the bar for architectural thinking.
You still need to know what good structure looks like.
You still need to make decisions.
You still need to recognise when something feels wrong.
But the time spent wiring boilerplate, chasing documentation, and fixing minor syntax errors shrinks dramatically.
You spend less of that time on mechanics, and more of it deciding what to build and how it should work.
The role shifts from implementer to director.
And that’s powerful.
The developers who thrive in this new environment won’t necessarily be the fastest typists — they’ll be the clearest thinkers.
Now the project has come full circle.
My CD collection is converted.
My albums are organised.
Everything sits locally on my iPhone — no streaming dependency, no subscription required.
I open the app.
I scroll through artwork I imported myself.
I tap play.
The music starts instantly.
In the lounge, I send it to the HiFi via AirPlay.
In the kitchen, it plays through smart speakers.
Out walking, it streams straight to my headphones.
All powered by an app I built myself — with Codex as my co-engineer.
From dusty CDs on a shelf to a custom digital player in my pocket.
AI didn’t just help me write code.
It helped me build something personal.
And this feels less like a novelty — and more like the beginning of a new way to build software.