New tool gives anyone the ability to train a robot

Teaching a robot new skills used to require coding expertise. But a new generation of robots could potentially learn from just about anyone.

Engineers are designing robotic helpers that can “learn from demonstration.” This more natural training strategy enables a person to lead a robot through a task, typically in one of three ways: via teleoperation, such as using a joystick to remotely maneuver the robot; by physically moving the robot through the motions; or by performing the task themselves while the robot watches and mimics.

Learning-by-doing robots usually train in just one of these three demonstration approaches. But MIT engineers have now developed a three-in-one training interface that allows a robot to learn a task through any of the three training methods. The interface is in the form of a handheld, sensor-equipped tool that can attach to many common collaborative robotic arms. A person can use the attachment to teach a robot to carry out a task by remotely controlling the robot, physically manipulating it, or demonstrating the task themselves — whichever style they prefer or best suits the task at hand.

The MIT team tested the new tool, which they call a “versatile demonstration interface,” on a standard collaborative robotic arm. Volunteers with manufacturing expertise used the interface to perform two manual tasks that are commonly carried out on factory floors.

The researchers say the new interface offers increased training flexibility that could expand the type of users and “teachers” who interact with robots. It may also enable robots to learn a wider set of skills. For instance, a person could remotely train a robot to handle toxic substances, while further down the production line another person could physically move the robot through the motions of boxing up a product, and at the end of the line, someone else could use the attachment to draw a company logo as the robot watches and learns to do the same.

“We are trying to create highly intelligent and skilled teammates that can effectively work with humans to get complex work done,” says Mike Hagenow, a postdoc at MIT in the Department of Aeronautics and Astronautics. “We believe flexible demonstration tools can help far beyond the manufacturing floor, in other domains where we hope to see increased robot adoption, such as home or caregiving settings.”

Hagenow will present a paper detailing the new interface at the IEEE Intelligent Robots and Systems (IROS) conference in October. The paper’s MIT co-authors are Dimosthenis Kontogiorgos, a postdoc at the MIT Computer Science and Artificial Intelligence Lab (CSAIL); Yanwei Wang PhD ’25, who recently earned a doctorate in electrical engineering and computer science; and Julie Shah, MIT professor and head of the Department of Aeronautics and Astronautics.

Training together

Shah’s group at MIT designs robots that can work alongside humans in the workplace, in hospitals, and at home. A main focus of her research is developing systems that enable people to teach robots new tasks or skills “on the job,” as it were. Such systems would, for instance, help a factory-floor worker quickly and naturally adjust a robot’s maneuvers to improve its performance of a task in the moment, rather than pausing to reprogram the robot’s software from scratch — a skill that a worker may not necessarily have.

The team’s new work builds on an emerging strategy in robot learning called “learning from demonstration,” or LfD, in which robots are designed to be trained in more natural, intuitive ways. In looking through the LfD literature, Hagenow and Shah found that the training methods developed so far generally fall into three main categories: teleoperation, kinesthetic training, and natural teaching.

One training method may work better than the other two for a particular person or task. Shah and Hagenow wondered whether they could design a tool that combines all three methods to enable a robot to learn more tasks from more people.

“If we could bring together these three different ways someone might want to interact with a robot, it may bring benefits for different tasks and different people,” Hagenow says.

Tasks at hand

With that goal in mind, the team engineered a new versatile demonstration interface (VDI). The interface is a handheld attachment that can fit onto a typical collaborative robotic arm. The attachment is equipped with a camera and markers that track the tool’s position and movements over time, along with force sensors to measure the amount of pressure applied during a given task.

When the interface is attached to a robot, a person can remotely control the robot while the interface’s camera records the robot’s movements, which the robot can use as training data to learn the task on its own. Similarly, with the interface attached, a person can physically move the robot through a task. The VDI can also be detached and held by a person to perform the desired task themselves; the camera records the VDI’s motions, which the robot can likewise use to mimic the task once the VDI is reattached.

To test the attachment’s usability, the team brought the interface, along with a collaborative robotic arm, to a local innovation center where manufacturing experts learn about and test technology that can improve factory-floor processes. The researchers set up an experiment where they asked volunteers at the center to use the robot and all three of the interface’s training methods to complete two common manufacturing tasks: press-fitting and molding. In press-fitting, the user trained the robot to press and fit pegs into holes, similar to many fastening tasks. For molding, a volunteer trained the robot to push and roll a rubbery, dough-like substance evenly around the surface of a center rod, similar to some thermomolding tasks.

For each of the two tasks, the volunteers were asked to use each of the three training methods, first teleoperating the robot using a joystick, then kinesthetically manipulating the robot, and finally, detaching the robot’s attachment and using it to “naturally” perform the task as the robot recorded the attachment’s force and movements.

The researchers found the volunteers generally preferred the natural method over teleoperation and kinesthetic training. The users, who were all experts in manufacturing, did offer scenarios in which each method might have advantages over the others. Teleoperation, for instance, may be preferable in training a robot to handle hazardous or toxic substances. Kinesthetic training could help workers adjust the positioning of a robot that is tasked with moving heavy packages. And natural teaching could be beneficial in demonstrating tasks that involve delicate and precise maneuvers.

“We imagine using our demonstration interface in flexible manufacturing environments where one robot might assist across a range of tasks that benefit from specific types of demonstrations,” says Hagenow, who plans to refine the attachment’s design based on user feedback and will use the new design to test robot learning. “We view this study as demonstrating how greater flexibility in collaborative robots can be achieved through interfaces that expand the ways that end-users interact with robots during teaching.”

Read the full article by Jennifer Chu / MIT
