The Oakland Ballers, the independent club that won over Bay Area fans after the A's left, handed the dugout to an algorithm for an entire game. The stunt was supposed to be a tech-forward lark. Instead, it struck a nerve, raising real questions about trust, accountability, and the limits of "Moneyball 2.0."
On paper, the AI did fine. In practice, the experiment showed how quickly sports technology stops being cute once it meets culture, labor, and the lived experience of fans who have watched corporate logic push their teams out of town.

How the AI Seized Control of In-Game Decisions
The Ballers worked with Distillery to build a decision system powered by OpenAI's models and trained on years of historical baseball data, internal scouting reports, and the club's recent tendencies. The intention was not to replace manager Aaron Miles's brain so much as to replicate it: the same bullpen moves, lineup tweaks, and pinch-hit calls he would make, with the same data at his disposal.
According to club accounts, that's more or less what happened. Miles said the model's recommendations, which covered pitching changes and late-game at-bats, matched the calls he would have made, with one human override when his starting catcher wasn't feeling well. Think of it as a shadow manager promoted to acting manager once the playoff berth was locked up.
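The club has not published its implementation, so as a rough illustration only, here is what handing game state to a general-purpose model for a recommendation might look like, using the OpenAI Python client. The game-state fields, prompt, and model name are placeholders, not anything Distillery has confirmed.

```python
# Hypothetical sketch only: the Ballers' actual system is not public. This shows
# the general shape of asking a general-purpose model for a dugout recommendation.
import json
from openai import OpenAI  # requires the `openai` package (v1+) and an API key

client = OpenAI()

game_state = {  # invented fields for illustration
    "inning": "top 8",
    "score": {"Ballers": 3, "Opponent": 2},
    "pitcher": {"name": "RHP A", "pitch_count": 78},
    "bullpen_available": ["LHP B", "RHP C"],
    "due_up": ["LHB", "RHB", "switch hitter"],
}

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; the club's actual model choice is not public
    messages=[
        {"role": "system",
         "content": "You are a baseball bench coach. Recommend one move and a one-sentence rationale."},
        {"role": "user", "content": json.dumps(game_state)},
    ],
)
print(response.choices[0].message.content)  # a recommendation only; a human still decides
```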
None of this comes entirely out of left field. MLB has been running its own data revolution for two decades: Statcast has tracked every pitch and batted ball for years, and the automated ball-strike system has been tested across the minor leagues. Independent leagues often serve as proving grounds. But letting an AI call the game in real time crosses a different line.
What Actually Might Go Wrong With On-Field AI
Speed and context are everything. With a pitch clock compressing decisions, any delay in model inference, data syncing to the field, or coach-to-dugout communication can mean missed signals or burned timeouts. The typical dugout has seconds, not minutes, to weigh bullpen readiness, batter splits, and defensive alignment.
Data gaps matter. Independent-league parks do not always get the same telemetry MLB parks do. When tracking is incomplete or noisy, the model inherits blind spots. Garbage in, brittle out. That is especially hazardous with injuries, where subtle signs such as fatigue in a catcher's throws or a wince during a swing rarely show up in a database.
Black-box reasoning complicates accountability. If a model greenlights a reliever two days in a row, who owns the injury risk? Sports medicine research, including work published in BMJ Open Sport & Exercise Medicine, links workload spikes to soft-tissue injuries. A manager can sense when the spreadsheet is lying; an AI cannot, unless it is given hard constraints that it must enforce through auditable logic.
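What such a constraint might look like in practice: a minimal, hypothetical sketch of an auditable workload guardrail that sits between a model's recommendation and the dugout. The thresholds, field names, and rules here are illustrative, not the Ballers' or Distillery's actual logic.

```python
# Hypothetical workload guardrail: every veto returns a reason, so the decision
# trail stays auditable. Thresholds are placeholders, not real club policy.
from dataclasses import dataclass
from datetime import date

@dataclass
class Appearance:
    day: date
    pitches: int

def reliever_available(log: list[Appearance], today: date,
                       max_consecutive_days: int = 2,
                       max_pitches_3day: int = 45) -> tuple[bool, str]:
    """Return (available, reason); the model's recommendation is blocked if False."""
    days_ago = {(today - a.day).days for a in log}
    # Pitched on each of the last N days -> no further back-to-back appearance.
    if all(d in days_ago for d in range(1, max_consecutive_days + 1)):
        return False, f"pitched {max_consecutive_days} consecutive days"
    recent_pitches = sum(a.pitches for a in log if (today - a.day).days <= 3)
    if recent_pitches > max_pitches_3day:
        return False, f"{recent_pitches} pitches in the last 3 days exceeds the club limit"
    return True, "within workload limits"

ok, reason = reliever_available(
    [Appearance(date(2024, 9, 8), 18), Appearance(date(2024, 9, 9), 22)],
    today=date(2024, 9, 10))
print(ok, reason)  # False: pitched 2 consecutive days
```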
Then there's the attack surface. Dugouts are not air-gapped labs. A botched update, a dead battery, or radio interference can derail a system mid-game. NIST's AI Risk Management Framework notes robustness and security among the key considerations for real-time systems, but a ballclub is not going to run red-team exercises the way a bank or a hospital would.
And the human element persists. Clubhouse buy-in is fragile. A player who thinks he's one line of code away from being benched may react differently than one who trusts a manager's read. The Premier League's high-profile VAR misfires show how technology that "helps" can undermine credibility once outcomes start to feel automated or opaque.

Fans, Labor and the Trust Gap in Oakland Baseball
Oakland's reaction wasn't just about the numbers and the code. The city watched the Raiders leave, the Warriors cross the bay, and the A's plan their exit. "Tech solutionism" lands differently here. When the Ballers, the supposed people's team, lean on a generative model, some supporters see one more case of novelty being put ahead of community.
Labor questions trail the hype. When an AI-driven tactic backfires, who is held accountable? What happens if a player challenges a role change the model recommended? No Major League Baseball Players Association agreement covers the Pioneer League, but the union's broader posture on data rights and algorithmic profiling is clear: transparency and consent are non-negotiable.
Symbolism sticks, even when it's offered in jest. A team that built goodwill by leaning into Oakland's identity could lose it if fans feel like props in a demo day. In branding terms, a clever one-off can cost far more than a single loss in the standings.
Smart Guardrails for On-Field AI in Baseball Clubs
There's a responsible path forward. Run the AI in "shadow mode" first, generating recommendations without acting on them, and benchmark it against human calls, win probability added, and injury outcomes. Record when and why humans override the model, then iterate.
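A minimal sketch of what that record-keeping could look like, assuming a simple CSV log and invented field names; nothing here reflects the club's actual tooling.

```python
# Shadow-mode logging sketch: the model's call is written next to the manager's
# actual decision and never executed. Field names are hypothetical.
import csv
from datetime import datetime

SHADOW_LOG = "shadow_decisions.csv"
FIELDS = ["timestamp", "game_id", "situation", "model_call",
          "manager_call", "agreed", "override_reason"]

def log_shadow_decision(game_id: str, situation: str, model_call: str,
                        manager_call: str, override_reason: str = "") -> None:
    """Append one decision for later benchmarking; nothing here touches the game."""
    with open(SHADOW_LOG, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # new file: write the header once
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "game_id": game_id,
            "situation": situation,
            "model_call": model_call,
            "manager_call": manager_call,
            "agreed": model_call == manager_call,
            "override_reason": override_reason,
        })

# Example: the model wanted a pinch hitter; the manager stayed with his starter.
log_shadow_decision("2024-09-10-OAK", "bottom 7, runner on 2nd, lefty pitching",
                    model_call="pinch hit with right-handed bat",
                    manager_call="let the starter hit",
                    override_reason="starter has good history vs this reliever")
```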
Clarify decision rights: the AI informs; the manager decides. Demand explainability for high-risk decisions such as deploying a pitcher, log the model's inputs, and run pregame checklists to verify data quality. Add basic cybersecurity hygiene: offline fallbacks, authenticated devices, and change control.
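In practice, a pregame checklist could be as simple as a gating function like the hypothetical sketch below; the specific checks and thresholds are illustrative, not any club's real pipeline.

```python
# Hypothetical pregame data-quality gate: if any check fails, the dugout runs
# human-only that night. Checks and thresholds are illustrative.
from datetime import datetime, timedelta

def pregame_data_checks(feed: dict) -> list[str]:
    """Return a list of failures; an empty list means the model may be consulted."""
    failures = []
    if datetime.now() - feed["last_sync"] > timedelta(minutes=10):
        failures.append("tracking feed stale (last sync over 10 minutes ago)")
    if feed["missing_pitch_pct"] > 0.05:
        failures.append("more than 5% of recent pitches untracked")
    if not feed["roster_confirmed"]:
        failures.append("active roster not confirmed by coaching staff")
    if not feed["device_authenticated"]:
        failures.append("dugout device not on the authenticated list")
    return failures

failures = pregame_data_checks({
    "last_sync": datetime.now() - timedelta(minutes=3),
    "missing_pitch_pct": 0.02,
    "roster_confirmed": True,
    "device_authenticated": True,
})
print("AI assist cleared for tonight" if not failures
      else f"Human-only tonight: {failures}")
```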
Communicate what the system does and doesn't do. Fans don't need the source code, but they deserve a plain-English playbook: where the AI helps, which metrics it optimizes, and where human judgment stays in control. The Partnership on AI's transparency guidelines and NIST's AI RMF offer useful templates, even if they weren't written for sports.
The Bottom Line on Oakland’s AI-Managed Game
The Ballers' AI game did not go wrong, which is exactly the point, and the warning. Most failures in sports tech aren't this clean; they're small, ambiguous shortfalls against promises that chip away at trust, safety, and accountability. If clubs keep human leadership in place, prove their rigor away from the spotlight, and speak frankly to their communities without jargon, AI can be more than a gimmick: it can be a sharp tool.
Baseball has always balanced numbers against nerve. The challenge now is making sure the algorithm works for the clubhouse and the city, not the other way around.
