A senior artificial intelligence researcher on Apple's robotics team, Zhang, has departed to join Meta's robotics group, according to Bloomberg. The move highlights a growing talent war in embodied AI, the field where AI software meets physical hardware, just as Apple seeks to reposition itself around both generative models and intelligent assistants.
At Apple, Zhang's work focused on automation technologies and on bringing machine learning to real-world tasks, an area of research that spans perception, motion planning, on-device inference and the other steps needed to move the technology out of the cloud and onto devices. His team worked separately from a group experimenting with a robot "virtual companion," but both efforts fell under a broader aspiration: making Apple's devices more adept at understanding and operating in the physical world.
Meta pushes further into embodied AI
Meta has been hiring aggressively in AI, combining open-source model momentum with increased robotics work. The company's research units have invested in embodied AI tooling, including Habitat, a simulation platform for training agents to navigate and manipulate objects in realistic 3D spaces, and egocentric perception research with datasets such as Ego4D. Hiring a veteran robotics AI lead from Apple gives Meta more depth in turning these research assets into product-ready systems.
Compensation has been a powerful lever. Meta earlier reached out to Ruoming Pang, Apple's head of foundation models, with an offer in the $200 million range, Bloomberg previously reported. Top-of-market AI offers now depend largely on outsized equity grants and long-vesting stock refreshers, a structure rivals struggle to match when they appear uncertain about their strategies or product timelines.
Nor is this an isolated case. Bloomberg and other outlets have counted at least a dozen AI experts who have left Apple for Meta, OpenAI and other companies since the beginning of the year. Meanwhile, three of Apple's foundation models researchers are understood to be moving to OpenAI and Anthropic, raising questions about continuity within Apple's AI team.
Apple under fire over AI direction
The Financial Times has reported that these recent departures amount to a "crisis of confidence" in Apple's machine learning direction. Apple has been developing a more advanced, LLM-based version of Siri internally after the assistant's most ambitious Apple Intelligence features slipped from an earlier software release. Senior software leadership has told employees that a second-generation Siri architecture is on its way, with concrete improvements expected at a later date.
Apple has also explored powering parts of Siri and system intelligence features with third-party models, and it has spoken with OpenAI, Anthropic and Google. The approach is pragmatic: industry analysts put the compute cost of a single frontier training run in the hundreds of millions of dollars. But it has also reportedly created tension within Apple's large language model group, where engineers weigh the clarity of owning the platform against reliance on external stacks.
The immediate impact of Zhang's departure is felt more acutely in robotics than in chatbots. Embodied AI requires a tight loop between model design, sensor fusion, simulation and real-world experimentation, plus relentless optimization for on-device silicon. Apple's strengths in custom chips, power-efficient inference and privacy-oriented design are well matched to that work. Losing a lead at precisely that nexus risks slowing progress in a field where iteration speed is a competitive moat.
What this means for Meta and Apple
For Meta, the hire is a step toward connecting underlying models, AR/VR platforms and embodied agents. The company's long-term bet is that assistants won't just answer questions; they will see the world around them, manipulate objects and assist with real-world tasks via glasses, cameras and, ultimately, home or office robots. Veteran talent with consumer device experience can help turn research insights into reliable, shipping features.
For Apple, the watchwords have to be coherence and cadence. Clear choices between in-house and partner models, demonstrable momentum on Siri's next generation, and a focused robotics roadmap could help stabilize recruiting and retention. There is precedent: when Apple drew a line around custom silicon, it unlocked a decade of performance improvements. A similarly firm stance on where Apple intends to lead in AI (on-device agents, multimodal perception, household robotics) could counter Meta's pitch of outsized upside and open research.
What to watch next
Things to watch include the broader talent and leadership picture across frontier AI labs; whether Apple forms a formal third-party model partnership around Siri; Apple's progress on on-device multimodal capabilities; and how quickly Meta can turn robotics research into real consumer-facing assistants. The winner won't simply ship the biggest model; it will ship the smartest stack that gets the job done in the messy, physical world.