
ARTIFICIAL INTELLIGENCE

MIT's Groundbreaking Method Revolutionizes Robot Training with Large Language Model Inspiration!

The Silicon Review
04 November, 2024

MIT's Heterogeneous Pretrained Transformers (HPT) empower robots with adaptable, human-like learning using vast data, promising universal robotic flexibility for diverse tasks and environments

Inspired by the success of large language models (LLMs), researchers at the Massachusetts Institute of Technology (MIT) have unveiled a new approach to teaching robots new skills, marking a significant advancement in robotics. The technique, called Heterogeneous Pretrained Transformers (HPT), goes beyond conventional approaches, which frequently rely on small datasets, by using vast volumes of heterogeneous data to accelerate robot training.

The strategy addresses a key issue in robotic learning: when robots encounter small variations in their surroundings, such as new obstacles or altered lighting, imitation learning can break down. The MIT researchers have created a system that integrates input from multiple sensors and settings into a single framework by mimicking the data-rich pretraining procedures used for models such as GPT-4. This enables robots to quickly adjust to new tasks with little assistance from humans.
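The idea of funneling heterogeneous sensor inputs into a single framework can be sketched roughly as follows. This is an illustrative toy, not MIT's implementation: the names (`make_stem`, `trunk`, `head`), the token dimension, and the simple linear math are all assumptions standing in for the modality-specific encoders, shared transformer trunk, and task-specific action head that the HPT design describes.

```python
import random

TOKEN_DIM = 4  # shared token width consumed by the trunk (illustrative)

def make_stem(input_dim, seed):
    """Modality-specific 'stem': a toy linear map into the shared token space."""
    rng = random.Random(seed)
    weights = [[rng.uniform(-1, 1) for _ in range(input_dim)]
               for _ in range(TOKEN_DIM)]
    def stem(observation):
        return [sum(w * x for w, x in zip(row, observation)) for row in weights]
    return stem

def trunk(tokens):
    """Shared trunk: mean-pools tokens here; the real design uses a transformer."""
    n = len(tokens)
    return [sum(t[i] for t in tokens) / n for i in range(TOKEN_DIM)]

def head(latent, action_dim=2):
    """Task-specific head mapping the shared latent to an action vector."""
    return latent[:action_dim]

# Heterogeneous inputs: a 6-value "camera" feature and 3-value proprioception,
# each passed through its own stem before the shared trunk sees them.
camera_stem = make_stem(6, seed=1)
proprio_stem = make_stem(3, seed=2)

tokens = [camera_stem([0.1] * 6), proprio_stem([0.5, -0.2, 0.3])]
action = head(trunk(tokens))
print(len(action))
```

The key point the sketch captures is that only the stems know about each sensor's raw format; everything downstream of the shared token space is modality-agnostic, which is what lets one pretrained trunk be reused across robots and tasks.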

The researchers' long-term goal is a "universal robot brain" that can be downloaded and used without any task-specific training. As part of this ambitious drive, the HPT architecture promises greater robotic efficiency and flexibility, which could lead to innovations in how robots carry out challenging tasks across many settings. The Toyota Research Institute, which partially funded this research, has signaled a strong commitment to advancing robotic capabilities.
