

Winning in the AI era is an organizational design problem.

Most data and AI strategy work confuses adoption with capability. Buying tools, running pilots, and shipping use cases is not a strategy. It is the visible exhaust of one.

The harder strategic question is this: how does an organization become the kind of place that compounds value from data and AI over decades, in a regulated environment, without breaking what already works?

North star

A north star, not a slogan.

"AI-first" and "data-driven" are weak strategy statements because they are hard to prosecute. You cannot run an investment committee, operating review, or architecture trade-off against a slogan.

A useful north star answers a sharper question: what kind of company do we want to be when AI is no longer the differentiator, but table stakes?

"Customers trust our recommendations because the data, model, decision path, and approval evidence are traceable end to end."

"Routine decisions are automated and non-routine decisions are augmented, lowering cost to serve without weakening accountability."

"Analysts, engineers, and domain experts compete on judgment, not on assembling information that the platform should already provide."

Assessment

Honest assessment beats ambitious slides.

You cannot plan a credible route until you know where you are. In regulated enterprises, the useful assessment names the gap, the cost of closing it, and the sequence in which closing it becomes realistic.

Usually overestimated

Data quality

AI-ready is a higher bar than BI-ready. AI exposes ambiguity, missing ownership, weak semantics, and shortcuts that dashboards can hide.

Platform readiness

A lakehouse and notebooks are not the operating system. Production AI needs retrieval, evaluation, monitoring, drift detection, grounding, feedback loops, and release discipline.
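
To make that list concrete, here is a minimal sketch of one of those capabilities, release discipline, expressed as an evaluation gate: a candidate only ships if it clears a scored evaluation set and a grounding check. Every name in it (EvalCase, evaluate_candidate, the thresholds) is illustrative rather than a reference to any specific platform, and the scoring is deliberately naive.

```python
from dataclasses import dataclass

@dataclass
class EvalCase:
    """One evaluation example: a prompt, the expected answer, and the
    source documents a correct answer must be grounded in."""
    prompt: str
    expected: str
    sources: list[str]

@dataclass
class EvalResult:
    quality_score: float   # 0..1 answer correctness
    grounded: bool         # answer traceable to the provided sources

def evaluate_candidate(model_fn, cases: list[EvalCase]) -> list[EvalResult]:
    """Run a candidate model (any callable prompt -> answer) over the set."""
    results = []
    for case in cases:
        answer = model_fn(case.prompt)
        # Stub scoring; a production gate would use graded rubrics,
        # semantic similarity, or a human review queue.
        quality = 1.0 if case.expected.lower() in answer.lower() else 0.0
        grounded = any(s.lower() in answer.lower() for s in case.sources)
        results.append(EvalResult(quality, grounded))
    return results

def release_gate(results: list[EvalResult],
                 min_quality: float = 0.90,
                 min_grounding: float = 0.95) -> bool:
    """Block the release unless both averages clear their thresholds."""
    if not results:
        return False
    avg_quality = sum(r.quality_score for r in results) / len(results)
    avg_grounding = sum(r.grounded for r in results) / len(results)
    return avg_quality >= min_quality and avg_grounding >= min_grounding
```

The scoring logic is beside the point; what matters is that the gate runs on every release and that its output survives as evidence, which the risk section below returns to.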

Workforce literacy

AI literacy is not whether people have used a chatbot. It is knowing when to trust a model, when not to, and how to frame the work so that human judgment improves.

Usually underestimated

Operating model fragility

Many organizations were designed for software and reporting delivery: committees, RACI, stage gates, annual budgets. AI changes faster than that system was built to absorb.

Evidence debt

If controls, approvals, evaluations, lineage, and exceptions are not generated as work happens, the organization accumulates trust debt faster than technical debt.

Decision cost

The important question is not only what the gap is. It is what closing that gap costs in people, time, platform investment, governance effort, and opportunity delay.

Strategic choices

The decisions that decide the next five years.

Strategy is a small number of near-irreversible choices. Avoid them, and the organization gets a wish list instead: many pilots, many tools, many slides, and very little compounding advantage.

Centralized vs federated capability

A central platform with thin domain teams scales differently than thick domain teams on shared foundations. The right answer depends on regulation, talent density, domain maturity, and business-unit alignment.

Build vs buy vs customize

Foundation models are usually bought. The strategic asset is often the data, retrieval, evaluation, orchestration, workflow integration, and feedback system around them.
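
One way to hold that position in practice is to keep the bought model behind a narrow interface, so that everything the organization owns depends only on the seam. A hedged sketch, assuming nothing about any real provider SDK; CompletionModel and both adapters are illustrative names:

```python
from typing import Callable, Protocol

class CompletionModel(Protocol):
    """The narrow seam: retrieval, evaluation, orchestration, and
    evidence all depend on this and on nothing vendor-specific."""
    def complete(self, prompt: str) -> str: ...

class VendorAModel:
    """Adapter for a hypothetical external provider; in reality this
    would wrap that vendor's SDK call."""
    def complete(self, prompt: str) -> str:
        return f"[vendor-a] answer to: {prompt}"

class InHouseModel:
    """A self-hosted fallback behind the same interface."""
    def complete(self, prompt: str) -> str:
        return f"[in-house] answer to: {prompt}"

def answer_with_retrieval(model: CompletionModel, question: str,
                          retrieve: Callable[[str], list[str]]) -> str:
    """The orchestration layer is the durable asset; the model is a
    swappable dependency injected at the call site."""
    context = "\n".join(retrieve(question))
    return model.complete(f"Context:\n{context}\n\nQuestion: {question}")
```

Swapping VendorAModel for InHouseModel changes one constructor call and leaves the retrieval, evaluation, and evidence machinery untouched, which is also the practical hedge against the vendor-concentration choice below.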

Speed of adoption vs depth of evidence

Internal meeting summaries and customer-facing financial advice are not the same risk class. Each use case needs an intentional position on speed, evidence, review, and rollback.

Vendor concentration vs resilience

AI lock-in is not only cloud lock-in. Model-provider dependency has business continuity, contractual, data, cybersecurity, and regulatory implications.

Internal productivity vs customer-facing AI

Both matter, but they need different governance, value cases, risk thresholds, and operating metrics. Mixing them in one roadmap creates noise.

Data and platform foundation

AI strategy is data strategy promoted to executive level.

Every weakness in the data foundation is amplified when a model trains on it, retrieves from it, reasons over it, or explains a decision using it. The board does not need every architecture detail, but it does need to understand the investment logic behind the platform.

This is where data engineering, data lineage, AI architecture, and governance become one operating conversation.

The strategic platform positions

  • Lineage, quality, policy, and governance should be platform features, not project deliverables that die when the project closes.
  • A Data OS mindset treats the data and AI estate as a sovereign control plane across warehouses, lakehouses, catalogs, engines, AI workflows, and hybrid environments.
  • Federated ownership needs contracts, service levels, decision rights, and data-product accountability that survive reorganization.
  • AI-grade observability must track cost, latency, quality, grounding, drift, safety, fairness where relevant, and operational exceptions (a sketch of one such record follows this list).
  • There should be one governed entry point for AI capability: clear intake, approval path, reusable components, evidence, and operational support.
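
A minimal sketch of what one observability record can carry, per model interaction. The field names are assumptions for illustration, not a standard schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIObservabilityEvent:
    """One record per model interaction; emitted by the platform,
    not reimplemented per project."""
    use_case_id: str          # which governed use case produced this
    model_version: str        # exact model / prompt-pattern version
    latency_ms: float
    cost_usd: float
    quality_score: float      # from the online evaluation harness
    grounding_score: float    # share of claims traceable to sources
    drift_flag: bool          # input distribution outside baseline
    exception: str | None = None   # operational failure, if any
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
```

Because the record is a platform feature rather than a per-project convention, every use case inherits the same fields, and board-level aggregates such as cost per decision or drift incidents roll up from one source.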

Operating model

The operating model shift is the strategy.

The hardest part of an AI strategy is not AI. It is the way the organization changes shape to deliver, govern, fund, staff, and absorb AI without losing accountability.

From service function to product organization

Internal customers should not simply raise tickets. They should consume governed data and AI products with owners, roadmaps, service levels, controls, and clear improvement loops.

From temporary AI roles to durable capability

Roles such as AI product owner, evaluation engineer, AI governance officer, model-risk liaison, retrieval specialist, and AI platform engineer will keep evolving, but the capabilities behind them cannot stay informal.

From request-deliver-close to continuous co-creation

AI work changes through interaction with domain experts. The engagement model must allow iteration while preserving accountability, cost control, evidence, and delivery cadence.

From generic engineering ladders to AI career paths

If senior technical careers do not recognize AI specialization with comparable status and compensation, the best people leave for organizations that do.

From hiring-first to ownership-first

The hiring plan should follow what the organization has decided to own internally. Talent strategy is downstream of operating-model strategy, not a separate HR exercise.

Risk and regulation

Regulation is a strategic thread, not a compliance afterthought.

In Europe, Switzerland, and global regulated industries, AI sits inside a multidimensional risk space: the EU AI Act, GDPR, FINMA-style expectations, sector regulation, model risk, third-party risk, cyber risk, audit, and reputation.

This is strategy framing, not legal advice. The leadership point is that risk cannot be bolted on after the roadmap is approved.

Continuous assurance

The right posture is evidence-producing delivery: every important model, dataset, prompt pattern, workflow, approval, exception, evaluation, monitoring signal, and rollback path creates usable evidence as work ships.

Done well, regulatory rigor accelerates the strategy because each new use case starts further down the path. Done badly, it becomes the reason every initiative slips.
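
A hedged sketch of the same idea in code, assuming an append-only evidence log; EvidenceRecord, emit_evidence, and the example artifact name are all illustrative:

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class EvidenceRecord:
    """One entry in an append-only evidence log, written as work ships."""
    artifact: str          # e.g. "credit-summary-model:1.4.2"
    event: str             # "evaluation", "approval", "rollback", ...
    detail: dict           # eval scores, approver id, exception notes
    timestamp: str
    checksum: str = ""

def emit_evidence(log_path: str, artifact: str, event: str, detail: dict) -> None:
    """Append a tamper-evident record: each line carries a hash of its
    own content (computed before the checksum field is filled), so
    later edits are detectable."""
    record = EvidenceRecord(
        artifact=artifact,
        event=event,
        detail=detail,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    payload = json.dumps(asdict(record), sort_keys=True)
    record.checksum = hashlib.sha256(payload.encode()).hexdigest()
    with open(log_path, "a") as log:
        log.write(json.dumps(asdict(record), sort_keys=True) + "\n")

# Example: a release gate records its outcome as evidence.
emit_evidence(
    "evidence.jsonl",
    artifact="credit-summary-model:1.4.2",
    event="evaluation",
    detail={"quality": 0.93, "grounding": 0.97, "gate": "passed"},
)
```

By the time an internal audit or a regulator asks, the evidence already exists in queryable form; nobody reconstructs it from meeting minutes.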

Transformation sequence

Sequencing matters more than design.

Strategy fails most often not in the design, but in the order of execution. Many organizations try to execute scaled adoption in month three. The trust debt this creates can take longer to repair than the original capability gap.

Foundation

Months 0-9

Data ownership, lineage, governance, platform basics, AI policy, control paths, and the first evidence-producing delivery lanes.

Lighthouse use cases

Months 6-12

Two or three high-evidence use cases that prove the foundation, build credibility, and show how delivery, risk, and business teams will work together.

Capability building

Months 9-18

Hire, restructure, train, and formalize the operating model. This is where strategy becomes the organization chart, delivery cadence, and career path.

Scaled adoption

Months 12-24

Many use cases on a stronger shared foundation, with repeatable governance, reusable architecture, and measurable business outcomes.

Continuous refresh

Permanent

Strategy updates against regulation, model capability, cost, vendor change, risk posture, and competitive movement.

Expectation management

Expectation management, up and down.

Senior data and AI executives spend more time on expectation management than most strategy decks admit. The work is to keep three audiences aligned without flattening their different concerns.

Board and ExCo

They need leading indicators that connect AI to cost structure, revenue, risk posture, customer trust, and capability maturity. A use-case leaderboard is not enough.

Delivery and operating teams

They need a stable set of priorities, a known refresh cadence, and a clear escalation path when reality and strategy disagree.

Legal, risk, audit, product, and security

They need to be inside the design, not invited to block it later. A good strategy makes their work easier through clearer ownership, better evidence, and fewer surprises.

The credibility of a strategy executive lives in the gap between these audiences. All three notice when it slips.

Continuous strategy

Continuous strategy is the only kind that survives.

A 2026 data and AI strategy cannot behave like a three-year plan. The model landscape, regulatory landscape, vendor landscape, and competitive landscape all move on shorter cycles.

The operating principle is stability of intent, agility of execution.

The cadence I trust

  • Annual strategy refresh: north star, assessment, strategic choices, investment posture, and operating model.
  • Quarterly priority recalibration: what changed, what stops, what accelerates, and what must be escalated.
  • Monthly delivery review: OKRs, value, risk, evidence quality, adoption, and blockers.
  • Weekly horizon scan: regulation, technology, vendor, competitor, and security moves filtered for strategic implication.

Where this comes from

Twenty-five years in the gap between board ambition and engineering reality.

I have spent most of my career building, scaling, and modernizing data platforms, teams, and delivery systems where strategy had to survive cost, regulation, legacy constraints, architecture decisions, and execution pressure.

If this is how you think about your organization's data and AI future, the conversation will be useful. Open to permanent senior Data & AI leadership positions in Switzerland and globally.