“Google is teaching robots to think for themselves” is the headline with which Harry McCracken opens his article in Fast Company. He was able to see first-hand how the technology firm’s robots are becoming increasingly useful, and how their behavior strikes him as more human.
The Fast Company article gives the example of a Google robot tasked with performing chores in a small office kitchen in Mountain View, California. As can be seen in the video McCracken shared, this white-bodied machine moves on wheels and has a single arm with a gripping mechanism at the end that allows it to do its job.
The robot, made by Alphabet’s Everyday Robots, has cameras where human eyes would be, which, according to McCracken, gives the device a certain anthropomorphism.
Beyond its appearance, its creators at Everyday Robots are working on a new language model that gives robots a broader understanding of the world, to help them better understand humans. It is called PaLM-SayCan, where PaLM stands for ‘Pathways Language Model’.
The system McCracken describes is used by Google’s robots to interpret both spoken and written instructions, so that the machines can find appropriate ways to help humans.
According to Alphabet engineers, the PaLM-SayCan study marks the first time robots have had access to large-scale language models. Compared with the software Google used before, the new system makes the robots 14% better at planning their tasks and 13% better at completing them successfully.
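Based on Google’s public description of SayCan, the planning idea can be sketched roughly as follows: a language model scores how relevant each low-level skill is to the instruction, a separate “affordance” model scores how feasible that skill is in the robot’s current state, and the robot executes the skill with the best combined score. The code below is a toy illustration of that idea, not Google’s implementation; all skill names and scores are invented for the example.

```python
# Toy sketch of SayCan-style planning: combine a language model's
# relevance score with an affordance (feasibility) score and pick
# the skill that maximizes the product. Purely illustrative.

def choose_skill(lm_scores, affordance_scores):
    """Pick the skill maximizing LM relevance x current feasibility."""
    combined = {
        skill: lm_scores[skill] * affordance_scores.get(skill, 0.0)
        for skill in lm_scores
    }
    return max(combined, key=combined.get)

# Hypothetical example: the instruction is "bring me a sponge".
lm_scores = {          # LM: how useful is each skill for the request?
    "pick up sponge": 0.8,
    "go to counter": 0.6,
    "pick up apple": 0.1,
}
affordance_scores = {  # how likely is each skill to succeed right now?
    "pick up sponge": 0.2,   # sponge not yet within reach
    "go to counter": 0.9,    # navigation is easy from here
    "pick up apple": 0.5,
}

print(choose_skill(lm_scores, affordance_scores))  # → go to counter
```

The key design point is that neither score alone suffices: the language model alone would pick “pick up sponge”, but the combined score first sends the robot to the counter, where that skill becomes feasible.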
This Google technology is still far from a commercial launch. For now, PaLM-SayCan is in the testing phase, so consumers will not see it in the near future.