When people describe a career after the fact, they make it sound linear. Title follows title, company follows company, and the throughline feels obvious. The truth is usually less tidy. From the inside, the path looks fragmented and occasionally incoherent; only later does it become clear what it was training you to see.
That has been true for me across military service, defense intelligence, building a company, operating at scale, and advising leaders through transformation. The advantage was never the novelty of the resume. It was what those environments taught me about how systems fail, and how they can be rebuilt.
The Value of a Non-Linear Path
The real asset in a non-linear path is not that you've done unusual things. It's that you've seen the same structural problems show up in very different places. Over time, you start recognizing:
- The same leadership failures in different uniforms.
- The same accountability gaps behind different org charts.
- The same decision traps, whether the stakes are technical, financial, or mission-critical.
That pattern recognition changes how you work. It becomes easier to distinguish what is structural from what is situational. You stop overreacting to surface noise and start looking for the underlying operating model: how decisions get made, who owns what, and where reality actually shows up in the metrics.
In the context of AI-native organizations, that range matters. You're not just applying a framework to one sector; you're noticing how AI, as delegated labor, breaks and repairs similar seams across government, enterprise, and PE-backed companies.
Seeing Integration Problems Earlier
People who have lived in more than one system tend to see integration problems earlier. They've watched what happens when:
- Strategy disconnects from execution because the operating model was never updated.
- Product decisions ignore governance until risk or regulators force a correction.
- Confidence outruns evidence, and nobody is clear on who is accountable when it fails.
When you've spent time in defense, startups, Big Tech, and PE environments, you start noticing that the failure modes rhyme. The labels change; the seams do not. That perspective is useful when you're redesigning organizations for AI: you're effectively integrating a new class of worker, digital labor, into an already complex human system. The ability to see across silos isn't a nice-to-have; it's the work.
Hindsight Is Cleaner Than Reality
Looking back, it's easy to tell a clean story. Military precision led to systems thinking. Startup chaos taught resourcefulness. Big Tech scale taught how operating models break under load. PE and board work made governance and accountability non-negotiable.
But that coherence only appears in hindsight. At the time, the moves felt like good bets, not a master plan. The useful move isn't to pretend the path was linear. It's to recognize what the path has compounded into:
- Judgment under uncertainty.
- Comfort with complexity and constraint.
- A habit of looking at systems, not just roles or tools.
In an environment where AI is reshaping work, markets, and operating models, those are the traits that let you do more than bolt new technology onto old structures. They let you redesign the structures themselves.
What To Do Next
If your path feels unconventional, stop trying to defend it as if it were supposed to be straight. Ask a different question: what patterns has this path trained you to see that more conventional paths may miss?
If you're on the other side of the table, as a hiring manager, board member, or investor, ask where you might be undervaluing that kind of range. In a world where organizations are being rebuilt around AI, the people who have seen multiple systems from the inside often have the clearest view of where to start.