In this episode, Abate De Mey interviews two speakers from the Agricultural track of the RoboUniverse 2016 conference in San Diego: Dan Harburg of Soft Robotics Inc. and Matthew Borzage of SynTouch, makers of the BioTac sensor. Borzage and Harburg discuss their distinct approaches to advancing gripping technology in agriculture: Borzage stresses the importance of tactile sensing, while Harburg pushes for low-cost, soft grippers with no on-board sensors.
Jana Witt: RoboUniverse with Robots: The podcast for news and views on robotics. Hi and welcome to the Robots podcast. This will be the first of two episodes covering our visit to last year’s RoboUniverse 2016 conference in San Diego. Our interviewer Abate De Mey was there for us and met up with some of the attendees to bring us the latest on robotics technologies that seek to improve the way people work, learn, and live. The RoboUniverse conference is the leading professional robotics conference and exposition that promotes practical application of robots and intelligent systems. In today’s episode, we’ll hear Abate’s conversations with Dan Harburg and Matthew Borzage. Dan Harburg, director of business development at Soft Robotics Inc., spoke to Abate about the advance of robotic gripping technology in the agricultural field.
Abate De Mey: Hello. Welcome to the Robots podcast. Could you please introduce yourself?
Dan Harburg: Sure. My name’s Dan Harburg. I’m the director of business development at Soft Robotics Incorporated.
Abate De Mey: What do you guys produce at Soft Robotics?
Dan Harburg: Soft Robotics is an end-of-arm tooling company. We make adaptive grippers that can go on the ends of industrial robots and be able to handle products that have a variety of different shapes and weights and sizes. We build the tools themselves, and then we also build pneumatic control systems that interface to those tools to provide a really easy package for companies that want to automate processes that they can’t automate using traditional gripping technology.
Abate De Mey: Can you describe the way your grippers look and the way they function?
Dan Harburg: Yeah, sure. People often say, and our founding professor will say, that the technology was inspired by an octopus. You can see what looks like a biomimetic structure when you look at the tools. They open and close the way a tentacle might, through pressurized air and through vacuum. They’re configured in a variety of different shapes: we’ve got two-finger, three-finger, four-finger, five-finger, and six-finger versions, all the way up to the biggest gripper we’ve built to date, which has 12 fingers for handling big bags of laundry detergent and chicken wings and stuff like that. Each finger is a pneumatic actuator with a bit of an accordion-like structure, so that when you inflate it with air it wraps around the product and uses that to grip. They look a little bit like a tentacle.
Abate De Mey: Why do some of the grippers have 12 fingers and other ones have two or three or however many it is?
Dan Harburg: When we first started the company, we thought we’d be able to go out and find the super gripper that would be exactly like the human hand and be able to handle all kinds of different stuff. What we found was that part of what makes a human capable of handling a variety of different things is having two hands. Many things you need to pick with both hands; it’s not all single-hand operations. As you look at automating picking up a strawberry versus a bag of rice versus an apple or a dough ball, the requirement for every one of those things is different in terms of how you want to grip it. That’s led us to basically build the finger and then allow the finger to be reconfigured into different hands. That’s why we have two-finger hands up to 12-finger hands.
Abate De Mey: Do your grippers have sensors on them?
Dan Harburg: They don’t. We believe in a dumb gripper. At the end of the day, our tool responds with air pressure to allow the gripper to wrap around whatever is presented in front of it. We’ve found that that kind of system, with no sensor technology inside of it, relying just on its materials and fundamental principles to adapt to the environment, can be significantly lower cost and lower complexity, and is also significantly faster. That’s allowed us to go into environments where people who’ve put tactile sensors in grippers need to slow down and deal with all this complex wiring; we can go more quickly into those environments and help automate those operations.
Abate De Mey: Grippers with sensors at the end, are they slower because there’s some sort of latency between getting the data and then computing it and understanding what to do?
Dan Harburg: Sure. Part of it is definitely the latency. Another part of it is just the complexity of the system. If you think about being able to handle agricultural products like we’ve been discussing here at the event today, if you have electronic sensors in a tool, you have it in an environment where there might be sanitation requirements. You have to figure out, “How do we encapsulate the sensors so that they’re food-safe? How do we make sure that the electrical wiring running into the tool is totally encapsulated and doesn’t present pockets for bacteria to grow and develop? How do we make sure that it can work for millions of cycles without failure, and, when it does fail, how can we quickly replace it at very low cost?” With sensors in there, the sensors have to get replaced all the time and the wiring has to be figured out. If it’s a stupidly simple system, then that complexity goes away, the cost stays lower, and the speed can be faster. All of that comes together to allow a system to work for a customer.
Abate De Mey: Without some sort of feedback, how does the gripper know when it’s actually grasping the object, and how does it know where the object is?
Dan Harburg: It knows where the object is from input from vision sensors. Our system typically works in conjunction with a conveyor-tracking vision system or a 3D or 2D scanning vision system that sees the environment, locates the objects and their X, Y, and Z coordinates, and then tells the robot to go and move to that object and pick it up. Our tool, though, doesn’t often know exactly what the object is that it’s picking, so it relies on the response of the actuator to the object as it inflates and wraps around it to be able to handle a variety of different objects. We don’t know what we’re picking, and often we don’t know whether we’ve picked it or haven’t picked it. We rely on the feedback from vision systems to know where the object is and allow the tool to grasp the object.
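The open-loop cycle Harburg describes (vision locates the object, the robot moves there, the gripper inflates with no feedback of its own) can be sketched roughly as follows. This is a toy illustration with invented class and method names, not Soft Robotics’ actual software:

```python
from dataclasses import dataclass


@dataclass
class Target:
    """X, Y, Z coordinates reported by the 2D/3D vision system."""
    x: float
    y: float
    z: float


class PickCell:
    """Minimal open-loop pick cell: vision supplies coordinates;
    the gripper simply inflates, with no tactile sensing at all."""

    def __init__(self, targets):
        self.queue = list(targets)  # objects the vision system has located
        self.log = []               # record of commanded actions

    def pick_and_place(self, place):
        target = self.queue.pop(0)                       # vision: next object
        self.log.append(("move", target.x, target.y, target.z))
        self.log.append(("inflate",))                    # fingers wrap blindly
        self.log.append(("move",) + tuple(place))        # carry to drop point
        self.log.append(("deflate",))                    # release
        return target
```

Note that nothing in the loop checks whether the grasp succeeded; as Harburg says, the system relies entirely on the vision input and the actuator’s passive adaptation.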
Abate De Mey: In today’s panel, you mentioned this gripper has agricultural applications as well. I was wondering if you could tell me what farmers or people working in agriculture expect out of these grippers. What types of tasks do they expect out of robots in agriculture?
Dan Harburg: The agriculture market’s interesting because the customer base is highly demanding of the products they use. If you look at a tractor, the things people are pulling through the field, the equipment installed in their production plants and packing plants, this stuff gets beat up all the time. It’s sanitized at the end of the day. It’s washed. It’s got dirt all over it. It’s being banged around and thrown around. We have to build extremely reliable products, and we need things that can last for millions of cycles without any failures. That reliability requirement has forced us to really focus on building products that can maintain consistent performance over very long cycles. For us, that’s a combination of cost and reliability that we’ve pushed to have in our products.
Abate De Mey: If you have these dumb grippers that don’t have sensors on them, doesn’t that mean you could also just replace the grippers, replace the tips, instead of worrying about making them last millions of cycles?
Dan Harburg: We do. You actually need to be able to do both. You need things that can last for millions of cycles, because if you consider an application where somebody might be picking a product at 100 picks per minute, that means they’re doing 100 pick-and-place operations every minute. If that’s running consistently for an hour, and then 18 hours a day, seven days a week, you get to a million cycles really quickly. You need to have products that can last for quite a number of cycles, but that can also be easily replaced and serviced as needed, for either sanitary requirements or wear and tear. If you had something that only lasted for 10,000 cycles, it might not even get through a single day.
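The arithmetic behind “you get to a million cycles really quickly” is easy to check with the figures Harburg quotes:

```python
# Back-of-the-envelope check of the cycle counts quoted above.
picks_per_minute = 100
hours_per_day = 18

cycles_per_day = picks_per_minute * 60 * hours_per_day   # 108,000 cycles/day
days_to_million = 1_000_000 / cycles_per_day             # under 10 days

# And a part rated for only 10,000 cycles at this pace:
hours_to_10k = 10_000 / (picks_per_minute * 60)          # well under a shift
```

At that rate a gripper crosses a million cycles in under ten days of production, and a 10,000-cycle part would wear out within a couple of hours, consistent with “might not even get through a single day.”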
Abate De Mey: How would you say farmers would go from having these human-oriented farms to more robotics-oriented farms that are optimized for robots to perform well?
Dan Harburg: There’s definitely a complete change in thinking that’s going to be required from the farmers, because today, as I mentioned in the panel, everything has been designed and optimized around the availability of human labor to be able to pick and pack and harvest and all of these complex tasks. They’ve tried their very best to drive as much efficiency as they can through the other parts of the system while still relying on human labor to pack.
A great example of that would be product packaging, where peppers today, bell peppers, might be packed inside of a Ziploc bag. The way a human packer gets the peppers into the bag is by wedging one pepper into one side, pushing it all the way over, and then taking a second pepper and wedging it into the other side. That’s a task that’s almost impossible for a robot to do. If you look instead at a tray, where you’re just feeding two peppers into a tray and the tray flows through a flow wrapper where the thing is packaged in an automated system, now you start to imagine, “Okay, that’s something we could actually automate.”
The grocery stores, the consumers, and the processing houses have become reliant upon meeting the demands and requirements of what people would optimally like to see in their packaging. That’s driven them to say, “Okay, we need to have human workers who can do these very complex tasks to pack into the packaging that the stores expect.” If you say, “We need to automate this whole sequence,” you probably have to reinvent the packaging with different types of packaging. That’s just one example, but it speaks to how mindsets are going to have to change at every level if you’re going to really imagine fully automated packing and processing environments.
Abate De Mey: It sounds like on these human-oriented farms, you have humans doing multiple tasks at once: they’re wedging to one side, wedging to the other side, closing the Ziploc bag, whereas the move to a more robot-oriented farm would be to break these tasks down even further, into simpler pieces. How would you educate farmers, or the agricultural leaders, to break down these tasks that we never thought about before into these simpler, robot-oriented tasks?
Dan Harburg: I was visiting a greenhouse grower of tomatoes and peppers just yesterday in Santa Maria, and what he was saying was, “Look, if we could get consistent packaging across many different types of products, then we could imagine a single robotic station where your tool could handle either a tomato or a pepper or a sweet pepper or any of the products we grow here in our greenhouse. You could package it into the same kinds of trays. We can start to standardize the type of packaging we have on the output. Now we can imagine bringing in automation.” That’s the kind of thinking, and those are the kinds of changes, that are going to be required. Today, there are tens of different packaging types for one type of product a farmer might produce, and because they’re using human labor, they’re able to switch between packaging on the fly and on demand depending on what their customers are asking for. There will have to be more standardization, at least in the kinds of machines used for packaging, in order for automation to be accepted in that environment.
Abate De Mey: Thank you very much.
Jana Witt: Now, let’s hear Abate’s discussion with Matthew Borzage, founder of SynTouch, with whom he spoke about the BioTac sensors they produce and the importance of tactile sensing in gripping technology.
Abate De Mey: Hello and welcome to the Robots Podcast. Could you introduce yourself?
Matthew Borzage: My name is Matt Borzage. I’m the co-founder of SynTouch and head of business development.
Abate De Mey: Thank you. Can you tell us what you do at SynTouch?
Matthew Borzage: Well, in a small robotics company you tend to do a little bit of everything. As a co-founder, I have an intimate appreciation for the technologies, and in the role of business development, I find that I spend a lot of my time talking with people who have potential applications for our technology.
Abate De Mey: What technology do you produce?
Matthew Borzage: We’re somewhat immodest in our definition of what we do at SynTouch: we own the world of touch, machine touch specifically, which we define as “a field analogous to machine vision.” We invented and hold the patents for tactile sensors that can detect everything that a human fingertip can. We understand the algorithms required to process and use that information, and we’ve also come to understand the applications for commercializing that technology into useful products.
Abate De Mey: What are your sensors called and what do they look like?
Matthew Borzage: Excellent question. Our sensors are called the BioTac and BioTac SP. Those are the green fingertips; if you see a robot with green fingertips, it’s probably ours. We also make the NumaTac sensor, which is a different class of sensor. All three of these are technologies that we developed based on our understanding of human touch.
Abate De Mey: Can you describe how your fingers work? Your sensors?
Matthew Borzage: We have two different sensing technologies. The BioTac sensors emulate the three main modalities of human fingertips. Humans can detect deformations of their fingertip, vibrations that occur as you slide your finger across an object or as the object vibrates, and also the transfer of heat between your finger and an object that you’re touching. Our sensors use different features to replicate each of those senses. The BioTac is a rigid epoxy core with an array of electrodes over it. We place a green silicone skin over the top of that core, with a layer of conductive fluid trapped between the core and the skin. When the skin is deformed, it causes the fluid layer to deform, and that essentially acts as a fluid-filled strain gauge, with the array of electrodes telling you about the deformation of the fingertip.
The finger also has fingerprints, which are very important for amplifying vibrations as you slide your finger, or our BioTac, over the top of a material. Those vibrations are propagated through the skin and fluid layer into a pressure sensor that acts essentially as a hydrophone, detecting the vibrations of the skin sliding over materials. Then, like humans, our BioTacs are deliberately kept slightly warmer than the world around them. When you touch an object with your hand, you exchange heat with that object. When our BioTac touches an object, it exchanges heat: the heat is typically drawn out of the fingertip, and the rate at which it’s drawn tells you whether the material will feel warm or cool.
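The thermal modality Borzage describes can be illustrated with a toy cooling model. All temperatures and time constants below are invented for illustration; they are not BioTac specifications:

```python
import math


def fingertip_temp(t, ambient=22.0, start=32.0, tau=5.0):
    """Temperature of a heated fingertip after t seconds of contact,
    modeled as exponential cooling toward ambient with a time constant
    tau that depends on how readily the touched material draws heat."""
    return ambient + (start - ambient) * math.exp(-t / tau)


# A metal surface draws heat quickly (small tau), so the fingertip
# cools fast and the material "feels cool"; foam draws heat slowly
# (large tau), so the material "feels warm".
metal_temp = fingertip_temp(2.0, tau=1.0)
foam_temp = fingertip_temp(2.0, tau=20.0)
```

The sensor infers material properties from exactly this kind of cooling-rate difference rather than from the contact temperature alone.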
The NumaTac is a greatly simplified sensor that is suitable for extremely dirty and dangerous applications where sensors might become damaged. It’s compliant and is able to tell you about contact with an object based on a pressure sensor connected to what is essentially a matrix of foam.
Abate De Mey: Can these sensors experience all of the same sensations that a human fingertip could sense?
Matthew Borzage: Yes. The BioTac and BioTac SP sensors are able to detect essentially everything that a human fingertip can feel, and to detect it more sensitively. The NumaTac is deliberately designed as a simplified sensor, more analogous to what you would feel if you put a heavy work glove over your hand.
Abate De Mey: Can you list off all of the different sensations that the sensors can feel?
Matthew Borzage: It’s a good question, and actually one of the struggles we have in the world of touch is that there isn’t a great vocabulary for describing the things that people feel. In the world of color and vision, we have very good, well-defined, universal terms: you can specify the red, green, blue value of something, and a web developer or print shop will know pretty much exactly the color you intend. In the world of touch, people tend to use really sloppy vocabulary that is highly subjective, or they’ll use measurements based on classic engineering techniques that really don’t describe the way something feels to a human, for example, “it has a static coefficient of friction of blank.”
In the world of touch, we have come up with a vocabulary that we call the SynTouch Standard. It comprises 15 dimensions of touch, in broad categories which are as follows: the micro and macro texture of an object, its frictional properties, and its adhesive, thermal, and compliance properties. You might have noticed that’s only five; we have subcategories within those that describe slightly different things.
Abate De Mey: Do you get feedback from the sensors? You get it in the form of numbers I assume? Numerical feedback?
Matthew Borzage: That’s correct.
Abate De Mey: How do you translate that into something that a human could understand?
Matthew Borzage: That’s a great question. I really hope that somebody listening to your podcast has a device that you could essentially plug our sensor into and create those sensations. In the field, we call those either haptic displays or tactors. It’s analogous, in the world of vision, to saying that we’ve invented a video camera and we’d love for somebody out there to have a monitor. There isn’t currently a corresponding technology where you could just plug the BioTac in. I hope that somebody develops one.
What we’re doing as an intermediate measure right now is providing those SynTouch standard parameters, those 15 numbers that describe how materials feel, so that if you’re interested in knowing exactly how it feels, at least you can have that set of numbers. We hope that in time that people who are interested in knowing how things feel will be able to grow to appreciate those just like if I tell you an RGB value of 255, 128, 0, you can have a rough idea about what that might look like.
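As a rough illustration of how those 15 numbers could be used the way an RGB triple is, two materials can be compared by the distance between their feel vectors. All names and values below are invented for illustration; they are not actual SynTouch Standard measurements:

```python
import math

# Two hypothetical materials, each described by a 15-dimensional
# "feel" vector, analogous to an RGB triple for color.
# These numbers are made up purely for illustration.
suede = [0.82, 0.41, 0.15, 0.66, 0.30, 0.72, 0.58, 0.11,
         0.47, 0.25, 0.90, 0.38, 0.19, 0.61, 0.44]
velvet = [0.79, 0.44, 0.12, 0.70, 0.28, 0.75, 0.55, 0.14,
          0.50, 0.22, 0.88, 0.40, 0.17, 0.63, 0.46]


def feel_distance(a, b):
    """Euclidean distance between two feel vectors: a small distance
    means the materials should feel similar, the way nearby RGB
    values look similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

With a shared numeric vocabulary like this, a designer could specify a target feel and a manufacturer could check how far a sample deviates from it, just as color tolerances are checked today.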
Abate De Mey: What applications are your sensors used in?
Matthew Borzage: Great question. The applications for our sensors are really bimodal, and they mirror very closely how humans use tactile information. There’s action for perception and perception for action. On one side, you might want to feel an object because you’re interested in manipulating it. You don’t cognitively care whether it feels warm or fuzzy or smooth or slippery, but that tactile information might be useful as you adjust your grip on the object. That would obviously be a very useful set of information for a robot. If you think about it, lacking that information in your hand makes your hand very cumbersome. If you’ve ever had numb or cold skin, you know how clumsy you feel; that’s essentially the best a robot can be without the sense of touch.
Moving away from manipulation, we have found there’s actually an incredible market for being able to quantify how objects feel for the world of design and production. If you’re producing a piece of luxury apparel or an automobile and you’re trying to do it efficiently, or the next consumer electronic device, touch screen, or tablet, and you want it to have a particular feel, you would like to be able to define that. While you can do that right now for color, the only way to do it for texture or feel is to use our SynTouch Standard.
Abate De Mey: Since many of these companies lack the vocabulary to describe touch, and maybe they’re not very knowledgeable of the SynTouch Standard, how do they portray to you what they want from the touch? For example, with these luxury goods, how do they portray to you exactly what they want their product to feel like?
Matthew Borzage: Talking with companies and learning about their internal vocabulary has been a very interesting experience. I have literally sat in the board room of a major automotive manufacturer and listened to a debate over what the word “soft” meant. To an engineer, soft might have something to do with the spring constant of the material or maybe its viscoelastic properties. To a designer, soft might mean it should feel fuzzy like a puppy’s belly. That’s a quote I’ve actually heard. Every group we’ve talked with has had its own internal vocabulary. Every industry has terms that they bandy about, but there’s actually very little consensus over what the words mean. One of the jobs I have is to go into a company, learn what they call the various physical attributes a material has, and then help them map between what they’ve been calling it and what we can quantify objectively and repeatably for them.
You can use an analogy from the world of color. We have very specific understandings of what red, green, and blue are: certain wavelengths of light. If we didn’t, we might have trouble: say, when I say “red,” you think crimson and somebody else pictures maroon. Those are very different shades of red, and you couldn’t then use the word to define a color without some misunderstandings.
Abate De Mey: I know you spoke today on the agricultural panel at RoboUniverse. Could you tell us some agricultural applications of the sensor?
Matthew Borzage: The agricultural applications for tactile sensing in robotics and manipulation essentially come down to, “Do you need to pick something that’s fragile?” If there’s a piece of produce that needs to be handled and that will be damaged by handling it inappropriately, we have a sensor that can be mounted on essentially any robotic gripper and provide a sensitive, gentle touch. On the panel today, I also pointed out that it’s entirely possible that a lot of applications can be done without this technology. Personally, one of the things I find interesting about agriculture is how diverse the products and crops that need to be handled are, and the variety of ways people are looking at replacing the human hand when interacting with those materials is really remarkable.
Abate De Mey: How is your company dealing with the very expensive and slow process of engineering design and development?
Matthew Borzage: Working with robotics and with hardware can certainly be a lot more difficult at times than purely software applications. Since our inception, the focus of SynTouch has deliberately remained on the world of touch, and I think we’ve done an excellent job of that. The implication is that we focus on something we know is useful to humans. We know a human with the sense of touch is capable of amazing feats of manipulation and object identification, and we know that if you remove that sense of touch, the human becomes very clumsy and unable to understand what they’re interacting with. Instead of trying to follow every trend in the market or in the other technologies available, we’ve continued to refine our technologies and make them more durable, less expensive, and more robust in terms of the information they provide. When the other markets essentially settle, if there is utility in a human hand and its sense of touch, we will have the technology to provide.
Abate De Mey: Thank you very much.
Matthew Borzage: Thank you. It was a pleasure.
Jana Witt: That’s it for today, but tune in again in a fortnight, when we’ll have more interviews from Abate’s visit to RoboUniverse. If you can’t wait that long, simply visit our website at RoboHub.org for all our past episodes and loads more about robotics. We’ll be back in two weeks’ time. Until then, goodbye. RoboUniverse with Robots: the podcast for news and views on robotics.
All audio interviews are transcribed and edited for clarity with great care, however, we cannot assume responsibility for their accuracy.