Amazon just announced a successful trial of its Vulcan robot in a German distribution center. This is notable because the robot arm handles dexterity very well. Unlike a human hand, a machine cannot feel an item; it is easy to crush something or to drop it. How did Amazon figure it out? Let's take a look.

Vulcan is designed to stow and pick items in Amazon's mobile robotic inventory system. The retailer started a few years ago with its Sparrow system, which has since evolved to handle over half a million different items. Amazon stows 14 billion items in its warehouses each year and aims to handle 80% of them with robots, at 300 items per hour and 20 hours per day. Problems such as maximizing bin density remain, but the company is progressing: its robots already work faster than humans.

On the picking side, as mentioned, the challenge has always been how to grab something gently. Vulcan's dexterity is based on a combination of force-feedback sensors, physical AI, and specialized end-of-arm tooling that, taken together, provide a sense of touch. Essentially, it handles a wide range of items with human-like finesse. The sensors measure force and adjust grip pressure accordingly. The end-of-arm tool uses a ruler-like attachment to nudge items aside and make room in crowded bins. Vulcan also leverages a camera and a suction cup: the camera identifies a target item and the best spot to grip, then monitors the process to ensure only the correct item is picked. Lastly, the system continuously learns, so each mistake yields lessons that are propagated to all machines for future picks.

Other companies, including Tesla (Optimus), Google (ALOHA), and Boston Dynamics (Atlas), are also making quick progress in this area. Dexterity is a necessary capability enabling most use cases for robots. Once we reach it, things may well change quickly in factories, warehouses, and eventually homes. #supplychain #truckl #innovation
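The force-feedback idea described above can be sketched in a few lines: ramp the grip force up gradually while reading the sensor, and stop as soon as the reading says the hold is firm enough, staying under a crush limit. Everything here (function names, thresholds, and the toy sensor model) is invented for illustration; this is a minimal sketch of the control concept, not Amazon's actual stack.

```python
def grip_with_feedback(read_force, target_n=2.0, max_n=5.0, step_n=0.25, max_steps=100):
    """Increase applied grip force in small steps until the force sensor
    reports at least target_n newtons, never commanding more than max_n.
    Returns the commanded force at which the grip was judged secure."""
    applied = 0.0
    for _ in range(max_steps):
        if read_force(applied) >= target_n:
            return applied  # firm enough to lift, gentle enough not to crush
        applied = min(applied + step_n, max_n)
    return applied  # gave up at the safety limit


def toy_sensor(applied):
    """Toy sensor model: the item only registers force after the fingers
    make contact, here assumed at 1.0 N of commanded drive."""
    return max(0.0, applied - 1.0)


commanded = grip_with_feedback(toy_sensor)
```

With this toy sensor, the loop settles at 3.0 N commanded (2.0 N measured at the item): the feedback loop, not a pre-scripted force, decides when to stop squeezing, which is what lets the same code handle both an egg and a brick.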
How Amazon is Driving Robotics Innovation
Summary
Amazon is revolutionizing the robotics space by integrating cutting-edge technologies to enhance warehouse operations and exploring humanoid robots for future delivery solutions. With innovations like dexterous robot arms and obstacle courses for humanoid testing, Amazon is driving advancements in robotics that could transform industries beyond logistics.
- Focus on dexterity: Amazon is using sensors, physical AI, and specialized tools to develop robots like Vulcan that can handle items with human-like precision.
- Test in real-world scenarios: The company is experimenting with humanoid robots in obstacle courses to prepare them for navigating complex delivery tasks in unstructured environments.
- Commit to continuous learning: By leveraging machine learning, Amazon ensures its robots improve with each task, sharing insights across systems to advance their capabilities.
-
Amazon is building a humanoid robot obstacle course to test robots for package delivery. If this sounds like science fiction, that's because all tech revolutions do, right up until the moment they become mainstream.

For decades, robotics lagged AI not because we lacked the hardware or the ambition, but because we lacked scalable data. You can train a language model on the internet, but there’s no web-scale corpus for touch. There’s no dataset of “how hard to grip an egg” or “how to angle your wrist to tie a shoelace.” Each new skill meant hours of demonstration, labeling, and hardware resets. The result: a field bottlenecked not by compute, but by data, and ultimately by human bandwidth. A bad recipe for scale.

But over the past 24 months, something changed. Robotics quietly entered its foundation model era. A new generation of models - trained on thousands of demos, rich simulations, and increasingly across multiple robot forms - is showing early signs of what once seemed impossible: ✨generalization✨

It’s a subtle but profound shift. Robots are no longer being trained to perform specific actions in brittle scripts. They’re being taught to recognize context, reason about goals, and apply learned skills to unfamiliar settings, just like we do. Call it general physical intelligence: the ability not just to move through the world, but to adapt to it. Once you can copy-paste a physical skill - laundry folding, warehouse sorting, house cleaning - you can scale labor like software.

Delivery is just the gateway drug. Once a robot can safely walk a sidewalk, open a gate, and navigate stairs, the entire unstructured world opens up: elder care, retail restocking, construction, hospitality, you name it. Today: humanoid obstacle courses. Tomorrow: the open world. And after that: a physical workforce that learns like software and updates over the air.

Amazon's humanoid field trips may seem like a stunt today. But so did self-driving car test loops in parking lots a decade ago.
Now I’m typing this from the back of a Waymo.
-
Amazon trying to use humanoid robots for deliveries makes perfect sense:
1) Amazon thinks long term; they are OK if this takes years.
2) Amazon has already spent more than a decade and billions of dollars on drones. Drone delivery may work great in suburbs, but not in cities.
3) Amazon has also owned Kiva Systems, now Amazon Fulfillment Technologies & Robotics, for more than a decade.

Amazon is deeply experienced in using robots inside controlled environments like a warehouse, and has long worked with autonomous robots in the form of drones. This news fits nicely in the middle: it tries to bring robots out of the warehouse and onto the street, covering the gap between the warehouse and the suburbs.

As for using a truck as a rolling base, I helped Amazon patent that idea back in 2017, so it has been a known concept there for at least 8 years. I think we are a long way from actual humanoid delivery (do you agree?), but I am not surprised they are testing it. Thoughts? https://lnkd.in/gXd5i9KK