At this year’s Consumer Electronics Show, the world’s foremost tech conference, farming machinery giant Kubota introduced Flash, a vehicle-mounted camera that creates ultra-detailed images to help farmers care for their crops.
It was an idea born in a Carnegie Mellon University robotics lab and another example of the revolutionary technology being developed in labs scattered across at least six buildings on the school’s Oakland campus.
The research has helped Pittsburgh earn a reputation as a central hub in the robotics world, with Carnegie Mellon serving as a pool of talent for more than 100 local robotics companies. The university partners with a variety of industries hungry to see what new robotics research has to offer.
More than 30 robotics companies got their start on Carnegie Mellon’s campus, and the university’s research helped attract the attention of large corporations like Intel, Caterpillar and Disney. When Google began expanding its presence in Pittsburgh in 2006, it was followed quickly by Apple, Uber and SAP, who recognized the pipeline of computer science talent streaming from Carnegie Mellon and other regional universities.
“What we have here is a global anomaly and one of the top robotics ecosystems in the world,” said Jennifer Apicella, executive director for the Pittsburgh Robotics Network. “There are only two other regions, San Francisco and Boston, with this breadth and depth of robotics companies. There are more than 125 companies here with more than 8,000 people working for them.”
Southwestern Pennsylvania’s robotics companies are creating solutions used across 18 industries, Apicella said.
“Astrobotic is one. They’re in the space-industry vertical,” she said. “They’re creating automation to go to the moon. We have agriculture robotics companies like Bloomfield (Robotics) and others working on greenhouse picking of tomatoes. Kubota just acquired another (agricultural) company that’s focusing on precision farming for picking vineyard grapes.”
Pittsburgh’s Gecko Robotics is working with the U.S. Navy to deploy AI-powered robots to assess and maintain its ships, submarines and aircraft carriers. It recently signed a $100 million deal with power operator North American Energy Services to deploy robots as part of the modernization of power plants across the U.S.
Robots should be viewed as the power tools of modern business, Apicella said.
“They aren’t taking jobs away as much as they’re creating ways for humans to be more efficient, to be safer and to perform dull, repetitive tasks that people do not want to do,” she said.
Robots in farming
George Kantor, a research professor who came to Carnegie Mellon in 1999, dived headlong into robotics after receiving a grant from the American Nursery and Landscape Association to work on automated watering control.
In 2008, he secured a $6 million grant from the U.S. Department of Agriculture’s Specialty Crops Research Initiative and partnered with Washington State University and other schools to develop technology for Washington’s tree fruit industry.
“They wanted a robot that could harvest apples, and we had to explain something like that was a much longer-term goal than they realized,” Kantor said. “So we set out on a path that could lead to eventually having those robots, and one of those ways was a method for monitoring fruit in the apple orchards.”
Between 2008 and 2014, the research team developed a specialized camera that could size apples, monitor for disease and relay data back to growers. A second project, funded by the U.S. Department of Energy, researched ways to phenotype crops — to collect the most data possible in areas such as plant quality, photosynthesis, development, architecture, growth and productivity.
“That program had a very strong tech-transition component, and it incentivized us to start a company,” Kantor said. “(Co-founder) Tim Mueller-Sim decided he wanted to go for it, and Bloomfield Robotics was founded in 2017.”
Bloomfield Robotics is part of “Robotics Row,” where more than 20 robotics companies dot the banks of the Allegheny River from Downtown to Lawrenceville. By last September, the start-up had attracted the attention of a Kubota team in California’s Silicon Valley looking for innovative tech that could be applied to agriculture.
“We got to know the team, worked on some proofs-of-concept, introduced that technology to our customers and found out it’s just the right thing,” said Brett McMickell, Kubota’s chief technology officer.
Kubota has historically provided agricultural hardware — tractors and physical equipment.
“We’ve strengthened machine control of the tractor, but how do you task it?” McMickell asked. “We needed to be able to ‘digitize’ the farm.”
Take blueberries.
“Flash will go through fields, scanning all the fruit,” McMickell said. “It can do counts, it can look at coloration and the quality of the berry. A lot of blueberries are still harvested by hand, but this system can help you be more efficient in how you deploy those pickers. You can look at what fields have the most ripe blueberries and get more consistent quality in your product.”
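The blueberry example McMickell describes comes down to simple per-field arithmetic once the camera has detected the fruit. The sketch below is purely illustrative, not Kubota’s actual pipeline; the function, the threshold and the “blueness” score are all invented here:

```python
# Hypothetical sketch of the kind of per-field summary a crop-scanning camera
# could produce: count the berries and flag ripeness from a color reading.
# The threshold and score are made up for illustration.

RIPE_BLUE_THRESHOLD = 0.7  # assumed: fraction of "blue" in a berry's color

def summarize_field(berry_colors):
    """berry_colors: list of blueness values (0.0-1.0), one per detected berry."""
    ripe = sum(1 for blue in berry_colors if blue >= RIPE_BLUE_THRESHOLD)
    total = len(berry_colors)
    return {
        "total": total,
        "ripe": ripe,
        "ripe_fraction": ripe / total if total else 0.0,
    }

# Five detected berries, three at or above the ripeness threshold.
print(summarize_field([0.9, 0.8, 0.4, 0.75, 0.2]))
```

A grower comparing these summaries across fields could send pickers to the field with the highest ripe fraction first, which is the kind of deployment decision McMickell describes.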
Learning to help
“Hey, Obi, get me a spoon of pretzels, please.”
That verbal command elicits a quiet beep of acknowledgment from a nearby computer, as Obi — a small robotic arm that can dip a spoon into four small bowls — slowly scoops into the correct bowl and brings the spoon within a couple of inches of the person who gave the command. The person can then lean forward and eat.
“That’s as close as it gets,” Carnegie Mellon junior Janavi Gupta said. “We don’t want it actually going into a person’s mouth.”
Obi is a commercially available robot. But the speech system Gupta and others designed is an add-on, developed at the university’s Robot-Human Interaction Lab, which is focusing on ways robotics can be used to help people with everyday tasks.
Before any of those robots can help, however, they need to learn.
Zheyuan Hu was hard at work in the Robot-Human Interaction Lab in early March trying to get two robotic arms to pick up a shirt, put it on a hanger and place it on a rack, one of several home health care applications being researched.
The primary piece of equipment on many of Carnegie Mellon’s robots, a sizable arm-like appendage, has seven degrees of freedom: seven joints that can each rotate, giving the arm its range of motion in all directions. Even with that flexibility, simple actions present complex problems, according to Jason Liu, 23, of Toronto, a first-year Ph.D. student.
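In code, “seven degrees of freedom” just means seven joint angles that together determine where the arm’s tip ends up. The toy sketch below flattens the arm into a two-dimensional chain to show the idea; the link length and function names are invented for illustration, and real arms rotate about axes in three dimensions:

```python
import math

# Hypothetical planar sketch: a chain of 7 revolute joints, each link 0.1 m long.
# The tip position follows from accumulating joint angles along the chain
# (forward kinematics).
LINK_LENGTH = 0.1

def forward_kinematics(joint_angles):
    """Return the (x, y) position of the arm tip for 7 joint angles in radians."""
    x = y = 0.0
    heading = 0.0
    for angle in joint_angles:
        heading += angle          # each joint rotates everything after it
        x += LINK_LENGTH * math.cos(heading)
        y += LINK_LENGTH * math.sin(heading)
    return x, y

# Fully extended arm: all joints at zero stretches 7 * 0.1 = 0.7 m along x.
print(forward_kinematics([0.0] * 7))
```

The hard part, and one reason simple actions are complex, is the inverse problem: given a target position for the tip, many different combinations of the seven angles reach it, and the robot must pick one that avoids obstacles and its own body.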
“We’re collecting human data and training the robot on it,” Liu said as he worked with a hand rig that uses three fingers to control the robot arm. Liu can remotely control the robot, and the hand rig is designed to provide force feedback.
“When I touch the box, you get a small amount of pushback pressure,” he said. “When I’m picking something up, I can feel the ‘force’ of gravity pulling down.”
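The pushback Liu describes can be modeled very simply. A common textbook approach, assumed here rather than confirmed as what the CMU rig uses, is a virtual spring: the deeper the remote fingertip presses into a surface, the harder the hand rig pushes back on the operator:

```python
# Hypothetical spring model of haptic force feedback: pushback proportional
# to how far the remote fingertip has pressed into a surface.
# The stiffness value is an assumption for illustration.

STIFFNESS = 200.0  # N/m, assumed spring constant of the virtual surface

def feedback_force(fingertip_depth_m):
    """Force in newtons sent back to the operator's hand rig."""
    if fingertip_depth_m <= 0.0:   # not touching the surface: no pushback
        return 0.0
    return STIFFNESS * fingertip_depth_m

# Pressing 5 mm into a box produces a small, perceptible pushback.
print(feedback_force(0.005))
```

The same idea, run in reverse with a constant downward term, is one way a rig can let an operator “feel” the weight of an object being lifted.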
Some of that technology is already at work commercially in applications like the da Vinci surgical system, in which surgeons use tiny high-definition cameras and remotely controlled robotic arms to perform minimally invasive surgery. Force feedback allows surgeons to feel when they’re nearing nerves, arteries and bones. The system has been in use in Independence Health System hospitals since 2009, and UPMC has it in more than 20 of its hospitals.
Those systems, however, require a skilled human to control the robot via a hands-on system.
Liu and others are working to create what they call “policies” for their robots: autonomous behavior that can be generalized and safely applied in a real-world setting without physical human control.
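A “policy” in this sense is just a function from what the robot observes to what it should do, fit from recorded human demonstrations. The toy version below uses a nearest-neighbor lookup in place of the neural networks researchers actually train; every name and number in it is invented for illustration:

```python
# Hypothetical illustration of a learned robot "policy": a mapping from what
# the robot senses (observation) to what it should do (action), built from
# human demonstrations. A toy nearest-neighbor lookup stands in for the
# neural networks labs actually train.

def fit_policy(demonstrations):
    """demonstrations: list of (observation, action) pairs recorded from a human."""
    def policy(observation):
        # Act as the human did in the most similar recorded situation.
        nearest = min(demonstrations,
                      key=lambda pair: abs(pair[0] - observation))
        return nearest[1]
    return policy

# Toy 1-D example: observation = distance to object, action = what to do next.
demos = [(0.10, "close_gripper"), (0.50, "reach_forward"), (1.00, "wait")]
policy = fit_policy(demos)
print(policy(0.45))  # closest demo is at 0.50, so the policy says "reach_forward"
```

The generalization problem the researchers describe shows up immediately in a sketch like this: an observation unlike anything in the demonstrations still gets mapped to the nearest recorded one, which may be the wrong thing to do.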
Another CMU robotics station has two of the arms working in tandem, each with a flat area allowing them to apply pressure from both sides and pick up a small plastic basket. But a demonstration reveals a weakness that robotics students are working to overcome: When the arms apply too much pressure, the basket gives way and slips out of their grasp. Sometimes the arms apply pressure to opposite corners and just push the basket around.
That is because a robot only knows what it is taught. It can’t think like a person, making adapting to new situations a challenge, Liu said.
In another lab, Jiahui Yang demonstrates how one of the robot arms can avoid obstacles. He puts a cardboard box in its way, and the arm’s joints rotate to steer it around the box. If he moves the box too quickly, though, the robot can’t adapt in time.
Learning from human data
Unlike their colleagues involved in recent artificial intelligence developments such as ChatGPT or the Midjourney AI-powered graphics generator, robotics developers can’t simply feed their machines internet data to teach them. These robots aren’t creating pictures or writing stories. They’re trying to do physical tasks.
“Our main focus is how can we use AI and other data sources to train robots to do more,” said Kenny Shaw, 26, of Hopewell, N.J., a fifth-year Ph.D. student at Carnegie Mellon.
One way to do that is to find affordable ways to collect the type of human-driven data necessary for robots to learn and adapt at even a fraction of the pace humans do.
“We designed a bunch of different low-cost, efficient and anthropomorphic (LEAP) robot hands that are in the thousands-of-dollars range,” Shaw said. “They’re 3D printable and open source, and something like that can really help us scale up data collection.”
Shaw and others are hoping the LEAP robot hands are affordable enough to be used widely, and the data they collect can be centralized to create a host of policies robots can use to adapt and advance.
In the LEAP lab, Tony Tao and Mohan Kumar Srirama are working on a project called DexWild (short for “dexterity in the wild”), where they’re also looking for new ways to collect human data to help train their robots. One such rig is worn on both hands: a glove outfitted with sensors on one hand and a large cube on the other.
“You can wear these, and they capture everyday human data,” Srirama said. “That can be fed to the robot to help train it. The next step would be glasses that you could wear that are less cumbersome and will capture data as you do physical tasks. This (hand-worn) rig is a step in that direction.”
In the Robot-Human Interaction Lab, Gupta, Hu and others are working to bring that technology into the real world through applications like a robot designed to help people get dressed.
“It’s a surprisingly complex operation,” CMU spokesman Aaron Aupperlee said. “It has to grip the cloth of the shirt, which is difficult. It has to pull the sleeve up. But areas like home health care are where robotics is looking to expand.”
Said Kantor: “We’re trying to learn how the robots can do things by watching people do them. Robots can be used for a lot of real-world applications, but they’re going to require training by people — and those people won’t all be scientists.”
The agriculture industry specifically could provide a world of opportunity for adaptive, generalized robotics, Kantor said.
“The segmentation is huge, and the idea that you could make one robot to solve all those problems is not going to work,” he said. “But you can create a tool that growers can use to solve their individual problems. I want to be able to give a robotic platform to a farmer, and the farmer can teach it what he does and the way he does it.”
The future
The focus on robotics in Pittsburgh led Carnegie Mellon last year to launch a new undergraduate degree in robotics, which CMU Robotics Institute Director Matthew Johnson-Roberson said is the most exciting recent development for him.
“I think it really signals that there are careers in robotics you can get with an undergrad degree that didn’t exist five years ago,” he said. “Where entering that field used to mean a career path in computer science or mechanical engineering, I think this is a big step forward for the field.”
Robotics has the potential to change how people work and live, Apicella said.
“We’re outpacing every other major city when it comes to robotics companies,” she said. “This is Southwestern Pennsylvania’s future.”