September 18, 2022
Google Robots, YT Trending - Latest Technology News

Google Robots Are More Human-Like and Capable of Performing Complex Tasks.

These days, robots can be seen everywhere, including in factories all over the world. Most of them can only carry out simple, pre-programmed tasks and follow narrowly specified instructions. Google's robots appear to be different, however: they can follow plain-language instructions and carry out more complex tasks, and they reportedly behave in a more human-like way.

Researchers at Google Labs recently demonstrated the new robotic skills by having a robot assemble burgers from a variety of plastic toy ingredients. The robot understood the assembly process and knew to add the ketchup after the meat and before the lettuce, though it decided the correct way to do so was to place the entire bottle inside the burger. While the Google robot won't be a smart cook anytime soon, Google claims it represents a major breakthrough.


Using recently developed artificial intelligence (AI) software known as large language models, the researchers were able to design robots that can assist humans with a broader range of everyday tasks. Rather than being fed a series of disjointed instructions and guided through each action one by one, such robots can now respond to complete commands, much like humans.

“I’m hungry, can you get me a snack?” Google researchers asked the robot during last week’s demonstration. The robot then went searching through the cafeteria, opened a drawer, located a bag of potato chips, and delivered it to the person who asked. According to Google executives and researchers, this is the first time a language model has been embedded in a robot. “It’s fundamentally a very different model for training robots,” said Brian Ichter, a Google research scientist.
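The planning idea behind the demo can be sketched in a few lines of Python. Everything below is illustrative: the skill names and the score table are invented stand-ins for the scores a real language model would supply, and the greedy loop is only a cartoon of the approach, not Google's actual system.

```python
# Illustrative sketch of language-model-guided planning for a robot.
# A real system would query a language model for these scores; here a
# hand-written table stands in so the example runs on its own.

SKILLS = [
    "go to the cafeteria",
    "open the snack drawer",
    "pick up a bag of chips",
    "bring it to the person",
    "done",
]

# Invented scores: how plausible each skill is as the *next* step for the
# request "I'm hungry, can you get me a snack?", given the previous step.
NEXT_SKILL_SCORES = {
    None:                     {"go to the cafeteria": 0.9},
    "go to the cafeteria":    {"open the snack drawer": 0.8},
    "open the snack drawer":  {"pick up a bag of chips": 0.9},
    "pick up a bag of chips": {"bring it to the person": 0.9},
    "bring it to the person": {"done": 0.95},
}

def plan(max_steps: int = 6) -> list[str]:
    """Greedily chain skills by picking the highest-scoring next step."""
    steps, last = [], None
    for _ in range(max_steps):
        scores = NEXT_SKILL_SCORES.get(last, {})
        best = max(SKILLS, key=lambda s: scores.get(s, 0.0))
        if best == "done" or not scores:
            break
        steps.append(best)
        last = best
    return steps

print(plan())
# ['go to the cafeteria', 'open the snack drawer',
#  'pick up a bag of chips', 'bring it to the person']
```

The point of the sketch is the division of labor: the language model proposes which known skill should come next, while the robot's existing low-level controllers actually execute each skill.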

Language modeling (LM) is the use of various statistical and probabilistic techniques to determine the probability of a given sequence of words occurring in a sentence. Language models analyze bodies of text data to provide a basis for their word predictions. They are used in natural language processing (NLP) applications, particularly ones that generate text as an output. Some of these applications include machine translation and question answering.
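As a toy illustration of that idea, here is a minimal bigram model in Python built over a three-sentence corpus invented for this example. Real language models are large neural networks trained on vastly more text, but the underlying notion of assigning a probability to a word sequence is the same.

```python
# Minimal bigram language model over a tiny invented corpus.
# It approximates P(sentence) as the product of P(word | previous word).
from collections import Counter

corpus = [
    "the robot gets a snack",
    "the robot opens the drawer",
    "the human eats a snack",
]

# Count single words and adjacent word pairs across the corpus.
unigrams, bigrams = Counter(), Counter()
for sentence in corpus:
    words = sentence.split()
    unigrams.update(words)
    bigrams.update(zip(words, words[1:]))

def sentence_probability(sentence: str) -> float:
    """Approximate P(w1..wn) by multiplying bigram probabilities."""
    words = sentence.split()
    prob = 1.0
    for prev, word in zip(words, words[1:]):
        if unigrams[prev] == 0:
            return 0.0  # never saw the previous word at all
        prob *= bigrams[(prev, word)] / unigrams[prev]
    return prob

print(sentence_probability("the robot gets a snack"))  # 0.25: a likely order
print(sentence_probability("snack a gets robot the"))  # 0.0: an unseen order
```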

Google robots will carry out complex tasks

Today’s robots typically specialize in one or two tasks, such as moving a product on an assembly line or welding two pieces of metal together. Creating robots that can perform a variety of everyday tasks and learn autonomously on the job is far more difficult. For years, large and small technology companies have been working to develop such general-purpose robots.


Language models work by using large amounts of text uploaded to the internet to train AI software to predict what kind of response is likely to follow a given question or comment. These models have become so accurate at predicting responses that interacting with them often feels like speaking with a knowledgeable person. Google and others, including OpenAI and Microsoft, have invested significant resources in developing and training better language models.
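That prediction step can be illustrated with a deliberately tiny sketch in Python: count which reply followed each prompt in an invented dialog log, then answer new prompts with the most frequent continuation. Real models predict one word piece at a time and generalize far beyond exact matches, so this is only a cartoon of the idea.

```python
# Toy "predict what follows" model: remember which replies followed which
# prompts in an invented log, then return the most frequent one.
from collections import Counter, defaultdict

dialog_log = [
    ("how are you", "fine thanks"),
    ("how are you", "doing well"),
    ("how are you", "fine thanks"),
    ("what time is it", "about noon"),
]

followers = defaultdict(Counter)
for prompt, reply in dialog_log:
    followers[prompt][reply] += 1

def predict_reply(prompt: str) -> str:
    """Return the reply seen most often after this exact prompt."""
    if prompt not in followers:
        return "(no prediction)"
    return followers[prompt].most_common(1)[0][0]

print(predict_reply("how are you"))      # fine thanks
print(predict_reply("what time is it"))  # about noon
```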

However, this work is not without controversy. In July, Google fired an engineer who claimed that one of the company's AI language models had become sentient. The consensus among AI experts is that these models are not sentient, and many fear they can exhibit bias: some language models have shown racist or sexist tendencies, or can be manipulated into producing hate speech or falsehoods. In general, language models help robots understand higher-level planning steps, but they don't give robots all the information they need, says Deepak Pathak, an assistant professor at Carnegie Mellon University.

Nonetheless, Google is pressing ahead and has integrated language models into many of its robots. Researchers can now speak to a robot in everyday language rather than writing specific technical instructions for each task it can perform. Furthermore, the new software helps the robot autonomously interpret complex, multi-step instructions. Robots can now interpret commands they have never heard before, respond meaningfully, and act on them.

Robots will take and create jobs

According to Zac Stewart Rogers, an assistant professor of supply chain management at Colorado State University, robots that can use language models could change how manufacturing and distribution facilities operate. “Collaboration between humans and robots has always been extremely efficient,” he said, with robots typically handling the heavy lifting and humans handling the nuanced troubleshooting.

If robots can handle more complex tasks, distribution centers may become smaller, relying on more robots and fewer humans, which could also mean fewer job opportunities. Rogers points out, however, that when one industry shrinks because of automation, other industries tend to create more jobs.

Training general-purpose robots may still be a long way off. AI techniques such as neural networks and reinforcement learning have been used to train robots for many years. While many breakthroughs have been made, progress is still slow, and Google robots are far from ready for real-world service. Google researchers and executives have repeatedly said they are just running a research lab and have no plans to commercialize the technology.

But it’s clear that Google and other big tech companies have a serious interest in robotics. Amazon, which already uses a range of robots in its warehouses, is experimenting with drone deliveries and earlier this month agreed to buy Roomba maker iRobot for $1.7 billion. Tesla, which has developed several self-driving features for its cars, says it is also working on a general-purpose, friendly robot that can perform everyday tasks and won’t fight back.


Google invests heavily in robotics companies

In 2013, Google began investing heavily in robotics, acquiring several robotics companies, including Boston Dynamics, whose machines regularly sparked social media buzz. But the executive in charge of the effort was accused of sexual misconduct and left the company shortly afterward, and in 2017 Google sold Boston Dynamics to the Japanese tech giant SoftBank. Since then, the hype around increasingly intelligent robots built by the most powerful tech companies has cooled.

For the language model project, Google researchers collaborated with the Everyday Robots team, a Google-owned but independently operated unit that specializes in building robots for a range of “repetitive and tedious” tasks. Its robots are already at work in Google’s cafeterias, wiping down counters and sorting trash.

