Amazon is opening new automation opportunities by deploying its first robots that leverage force and touch sensing to improve material handling tasks.
One of the classic robotics applications in Amazon warehouses centers on the “goods to person” (G2P) solution built around the Kiva robots. The Kiva mobile robots present movable shelves, stocked with inventory, to a stationary human picker. The human associate picks a specific item for a specific customer order and singulates it for shipment.

Over time, the shelves are depleted of inventory and need to be replenished. The replenishment task is currently done manually. To automate it, Amazon developed a new robot called Vulcan, designed to pick items from bulk and place them onto the movable shelves.
What makes Vulcan unique is that it is equipped with force feedback sensors and AI, giving it a sense of touch. This “sense of touch” allows Vulcan to manipulate objects with greater precision and dexterity. According to Amazon, Vulcan can pick and stow approximately 75% of the items in Amazon warehouses, moving them at speeds comparable to human workers.
The robot’s capabilities are expected to improve operational efficiency and workplace safety while reducing physically demanding tasks for human employees. Vulcan’s end-of-arm tooling and sensors enable it to manage a wide range of products, from small gadgets to larger items, by applying the appropriate amount of force.
Vulcan uses an arm, camera and suction cup gripper to pick items from storage pods. | Credit: Amazon Robotics
Aaron Parness, Director of Applied Science at Amazon Robotics, joined Steve Crowe, Executive Editor of The Robot Report, to discuss the technology behind Vulcan in a keynote at last week’s Robotics Summit & Expo in Boston. Parness explained the importance of touch and force sensing to the future of robotics at Amazon.
Parness’ team has said “force is the language of manipulation.”
“(Force sensing) is essential to how we interact with the world. It’s one of the big limitations in our field right now,” Parness said during his Robotics Summit keynote. “If you look at mobility, robots are doing back flips, but manipulation is still a very unsolved challenge. We get confused sometimes between digital intelligence and physical intelligence. We are rightly impressed when robots beat grandmasters at chess. They are amazing at playing chess, but robots still kind of suck at moving the pieces on the board. And that’s the physical intelligence. That’s where (the people in this room have) lots of opportunity to make advances.”
Aaron Parness (left) discussed how force sensing improves robotic manipulation at Robotics Summit & Expo 2025. Credit: Jeff Pinette
Parness believes a number of new applications will be enabled by touch, including densely packing items into a padded mailer, handling groceries, and loading packages into delivery bags. These are tasks that involve a lot of physical contact, and they are where the next wave of robotics is needed.
“(A sense of touch) allows us to go faster so we don’t have to be as cautious, because we can move quickly and then respond when we make contact, as opposed to watching and watching and watching,” Parness said at Robotics Summit. “And it’s a faster response rate. It also allows us to fill the bins to a higher level of growth cube because we can compress items. You can squeeze the pillow or the t-shirt over to the side. You can’t know that ahead of time always. So you need to have that force feedback to know if what you’re pushing on is rigid or compliant.
“It also helps us avoid damaging items and dropping items. It helps us with item eligibility. You don’t grip a physics textbook that’s very heavy with the same amount of force as you do a thin cardboard box that’s got some medicine in it. So it’s part of everything we do. I had an old mentor at NASA JPL, Brett Kennedy, who used to say industrial robots 1.0 were dumb and numb. They didn’t feel anything, and they didn’t have a brain.
“That’s OK for a lot of tasks, right? If you are a welding robot, you can do that dull, dangerous, dirty, repetitive task without needing to feel the world. But we want them to interact in highly cluttered environments. You should see my kids’ play area. If we want to sort through that pile of junk, you have to have a sense of touch. That’s my fundamental hypothesis.”
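Parness’ point about rigid versus compliant items lends itself to a simple illustration. The sketch below is a hypothetical example, not Amazon’s implementation: it estimates contact stiffness from force-versus-displacement readings logged during a push and classifies the item accordingly. The function name, stiffness threshold, and sample data are all assumptions made for illustration.

```python
import numpy as np

def classify_compliance(displacements_mm, forces_n, stiffness_threshold=5.0):
    """Hypothetical sketch: estimate contact stiffness (N/mm) from a push.

    A steep force-vs-displacement slope suggests a rigid item (a textbook);
    a shallow slope suggests a compliant one (a pillow or t-shirt).
    """
    # Least-squares slope of force vs. displacement approximates stiffness.
    slope, _ = np.polyfit(displacements_mm, forces_n, 1)
    return "rigid" if slope >= stiffness_threshold else "compliant"

# Example: push a few millimeters into an item while logging contact force.
# A pillow deflects easily under little force (low stiffness)...
pillow = classify_compliance([0.5, 1.0, 2.0, 3.0], [0.2, 0.4, 0.7, 1.0])
# ...while a heavy textbook barely deflects at all (high stiffness).
book = classify_compliance([0.1, 0.2, 0.3, 0.4], [2.0, 4.1, 6.2, 8.0])
print(pillow, book)  # -> compliant rigid
```

This is the kind of signal a robot cannot get from vision alone: the estimate only exists once the end effector is already in contact and pushing.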
Amazon already has a number of other robotic picking applications deployed. Sparrow picks from totes, but only from the top layer. Sparrow has a lot of intelligence to identify items and plan trajectories, but it currently doesn’t require a sense of touch.
Amazon has another robot called Cardinal, designed to fill a cart with packages. The key for Cardinal is to get the cart as full as possible. Parness believes Cardinal could benefit from a sense of touch to help it maximize the cart load in the future.
Vulcan aims to automate the stowing of items in upper bin rows, which are hard for people to access, according to Parness. This focus on the top rows means human workers would primarily stow items on mid-level shelves, the “power zone,” potentially reducing worker injuries, Parness noted. Amazon’s injury rates have historically been higher than those at other warehouses, although the company states these rates have decreased considerably.
Vulcan represents the first of the low-hanging-fruit applications for better force and touch sensing. The Amazon Robotics team built its understanding of touch-sensing integration through Vulcan’s development and is now looking to extend it to other target applications in the warehouse.
For now, Vulcan is only in full operation at Amazon’s warehouses in Spokane, Washington, and Hamburg, Germany.