Assembly of deformable materials, like clothing, can be automated in ways other than duplicating manual sewing techniques. Source: CreateMe
For over 200 years, the sewing machine has been the core of how clothes are produced. It automated the handwork of artisans, but it also trapped the industry in a single approach: pushing thread through material. Even with improvements in robotics and automation, most clothing still follows that same principle. Humans still provide the dexterity, alignment, and problem-solving for deformable materials that machines find difficult to replicate.
The limitation isn’t a lack of trying. It’s that most methods are trying to automate a process that was never made for machines.
Traditional automation excels at rigid, predictable tasks like welding, assembly, and other stable material-handling jobs. Fabric behaves differently. It stretches, wrinkles, folds, and shifts during a job. When materials deform, robots struggle not because they can't move accurately, but because they can't reliably estimate material condition or adapt to changing states.
That gap shows a bigger challenge in manufacturing: building systems that can sense, reason about contact, and adjust instantly instead of just repeating pre-recorded motions. That is the goal of physical AI.
From deformable demos to real production
Progress is happening. Advances in vision, simulation, sensing, and robot intelligence are pushing dexterous manipulation from lab experiments toward real use. But the bar for commercialization isn't whether a robot can finish a task once. It's whether it can work continuously, across variations, with acceptable throughput, yield, and recovery.
These methods are now being tested in real production settings, where success is measured in uptime, cycle time, and the engineering work needed to keep systems working. Deformable materials quickly expose the difference between a good demo and a deployable system.
Why clothing manufacturing is a tough test
Apparel is one of the hardest business tests for physical AI. Few product categories combine this much physical variation—fabric type, drape, stretch, geometry, layering, and construction—with this degree of global scale and cost pressure.
If a system can reliably sense, predict, and control cloth, it creates a transferable base for handling flexible materials more widely. Fabric handling isn’t a small problem. It’s a practical test of physically aware manipulation.
The challenge is that many efforts begin by trying to automate sewing itself—keeping the most difficult parts of the problem instead of removing them.
Redesign the method, not just automate it
A more scalable approach is to redesign manufacturing around what robots do well.
Instead of copying needle-and-thread processes, garments can be treated as forms to be shaped and bonded instead of pierced and stitched. This changes the structure of the problem.
In practice, the challenge is less "train the robot to handle fabric" and more "make fabric behave in a way a robot can learn from."
Deformable materials are inherently unstable. Learning-based manipulation becomes dependable only when the system introduces constraints and stable reference geometries.
Single-sided access reduces occlusion and coordination complexity. Three-dimensional molds and fixtures stabilize geometry and improve visibility. Purpose-built grippers offer finer control over soft, porous materials. Bonded assembly removes several constraints imposed by needles and thread.
Together, these choices create a more controlled setting where sensing, planning, and learning can generalize. This is the central insight: for deformable assembly, process design and intelligence are inseparable.
These systems work not because AI is bolted onto existing workflows, but because robotics, joining methods, and learning-based control are designed as one unified system.
Bonding also introduces a different kind of flexibility. Adhesive patterns can directly build stretch, durability, and performance into the joint. In effect, the joint becomes programmable, not just mechanical. With closed-loop feedback, placement and curing can adapt to the material in front of the system instead of an idealized standard. Each operation becomes both a production step and a source of data.
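To make the closed-loop idea concrete, here is a minimal sketch of how a bonding controller might scale a nominal adhesive recipe from a sensed material state rather than replaying a fixed program. All names, parameters, and numbers are hypothetical illustrations, not CreateMe's actual process.

```python
# Hypothetical sketch: adapt adhesive volume and cure time to the
# sensed state of the fabric in front of the system, clamped to a
# safe process window. Every constant here is illustrative.
from dataclasses import dataclass

@dataclass
class MaterialState:
    thickness_mm: float   # sensed fabric thickness at the bond line
    porosity: float       # 0.0 (film-like) .. 1.0 (open knit)

@dataclass
class BondParams:
    adhesive_ul: float    # microliters dispensed per cm of seam
    cure_s: float         # curing time in seconds

def plan_bond(state: MaterialState,
              base_ul: float = 10.0,
              base_cure_s: float = 4.0) -> BondParams:
    """Scale a nominal recipe by the sensed state, then clamp to limits."""
    # Porous fabrics absorb more adhesive; thicker stacks need more.
    ul = base_ul * (1.0 + 0.8 * state.porosity) * (state.thickness_mm / 0.5)
    # Larger adhesive volume needs proportionally longer curing.
    cure = base_cure_s * (ul / base_ul)
    # Clamp so the controller never leaves the validated process window.
    ul = min(max(ul, 5.0), 30.0)
    cure = min(max(cure, 2.0), 20.0)
    return BondParams(adhesive_ul=round(ul, 2), cure_s=round(cure, 2))

# Each operation yields a (state, params) pair that can feed learning.
log = []
for state in [MaterialState(0.5, 0.2), MaterialState(1.0, 0.7)]:
    params = plan_bond(state)
    log.append((state, params))
    print(state, "->", params)
```

The point of the sketch is the structure, not the numbers: sensing feeds a parameter plan, the plan is bounded by process limits, and every operation is logged so the same loop doubles as a data pipeline.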
Editor’s note: Physical AI will be among the topics covered at the Robotics Summit & Expo this month in Boston. Register now to attend.

When learning grows in production
In this model, capability comes less from fixed motion sequences and more from learned behavior. Skills like alignment, flattening, and placement can transfer across products and materials. Over time, performance improves through data instead of repeated retooling.
This doesn’t remove the need for hardware or process discipline. But it shifts how systems adjust. Instead of rebuilding workflows for each change, systems can generalize within set limits.
That shift has effects for manufacturing architecture. When improvement is software-driven, production can become more responsive to demand, with shorter lead times and less dependence on large, fixed production runs.
Robotic handling of deformables goes beyond clothing
Apparel is a helpful proving ground, but the implications extend well beyond clothing. The same challenges appear in car interiors, medical textiles, furniture, and aerospace composites, where variable materials, complex shapes, and tight tolerances are common.
Deformable assembly isn't a niche application. It's a foundational capability for industries working with soft goods, technical textiles, laminates, and other variable materials.
From demonstration to real production
The field is now being judged on production terms: uptime, yield, cycle time, and the engineering effort required to keep systems running. That shift is necessary. It's what turns physical AI from an experimental approach into a practical one.
The next phase of automation will be shaped not only by faster machines but by systems that can estimate material condition, adapt to variation, and improve with use.
The next wave of manufacturing will be won not by automating old processes, but by redesigning them for intelligence.
About the author

Cam Myers, founder and CEO of CreateMe.
Cam Myers is founder, CEO, and a board member of CreateMe, which is building the infrastructure for automated manufacturing of soft materials, beginning with apparel. The company replaces traditional sewing with digitally joined construction powered by robotics, proprietary adhesives, and AI-driven manufacturing systems, built on the idea that the “future of fashion is bonded.”
Myers holds 25 patents in apparel automation technologies developed at CreateMe.
Before starting the company, he was on the founding executive team of Group Commerce, a venture-backed ecommerce platform eventually acquired by Blackhawk Network. Myers earlier held roles at DoubleClick and Allen & Co.



