UK-based chip designer Arm Holdings is stepping up its focus on physical AI as demand grows for intelligent systems capable of operating in real-world environments, from autonomous vehicles and robotics to industrial machines and connected infrastructure.
Speaking at embedded world 2026 in Nuremberg this week, Chris Bergey, Executive Vice President of Arm’s Edge AI Business Unit, described physical AI as a key strategic priority for the company as connected devices become more capable and autonomous. “Physical AI is where intelligence meets the real world. Devices don’t just need to compute — they need to perceive, reason, and act safely in real time,” he said.
In January, Arm reorganised its business units to create a dedicated Physical AI division, reflecting growing industry interest in technologies that combine AI with real-world action and interaction. According to Bergey, organisations developing robotics, autonomous vehicles, and industrial systems increasingly require predictable, low-latency computing platforms capable of running AI workloads at the edge.
“We’ve seen companies increasingly demand predictable, low-latency systems,” he said. “That’s exactly the space Arm is uniquely positioned to serve — from autonomous vehicles to robotics to industrial automation.”

The shift reflects broader changes across the embedded and IoT landscape, where devices are evolving from static, single-purpose endpoints into intelligent, connected systems capable of localised “sensemaking”: interpreting context, processing data, and responding autonomously in real time.
“The pace of innovation is unlike anything I’ve seen in my career,” Bergey said. “What was difficult a year ago is becoming practical now. Edge AI isn’t just enabling smarter devices — it’s redefining what embedded intelligence means across industries.”
He added that the role of edge computing itself is also changing as AI capabilities become more efficient and more widely deployed. “The Edge is no longer just an extension of the Cloud. It’s becoming a place where AI is the foundation of the product itself — making decisions locally, collaborating with other devices, and acting instantly.”
The company’s expanded physical AI focus is already bearing fruit in high-profile collaborations. In late February, Tensor and Arm announced a multi-year strategic partnership to deliver the foundational compute architecture behind what Tensor calls the world’s first agentic AI personal Robocar. Each vehicle integrates more than 400 safety-capable, power-efficient Arm-based cores, the highest concentration of Arm technology in a consumer vehicle today, powering a Level 4 autonomous system with a comprehensive sensor suite including 37 cameras, five lidars, 11 radars, and triple-channel 5G connectivity.
Bergey said that AI-enabled devices are increasingly able to discover and coordinate with one another, forming distributed systems that share context and adapt in real time without relying on a central gateway.
“Robots, cameras, and controllers are working together as one unified system,” he said. “That’s a game-changer for industrial and autonomous applications, where speed and reliability are critical.”
According to Bergey, improvements in NPU acceleration and energy-efficient processing are making persistent AI capabilities more practical for OEMs.
“Persistent voice recognition used to be limited to high-power systems,” he said. “Now, with NPU acceleration and low-power Arm cores, OEMs can deploy always-on AI without breaking energy or cost constraints. Ambient intelligence becomes practical, durable, and repeatable.”
Other Arm demonstrations at the show highlighted on-device multimodal AI, in which cameras and NPUs run vision and language models together to deliver personalised experiences entirely on-device, reducing latency while keeping sensitive data local. The company also showed how pre-trained models can be deployed rapidly on Arm-based embedded hardware using validated runtimes and development toolchains, in collaboration with EmbedUR.
Speaking about what customers are asking for, Bergey said: “A lot of them are asking us, ‘what’s possible?’ Everyone is aware of how disruptive these technologies are, and nobody wants to be left behind. There’s a huge desire to move faster and make things possible. I’ve seen people’s careers reinvigorated because they don’t want to miss out on this technology wave. For many, it’s an enabler, not a negative disruption.”