Google system excels at playing complex games


A computer program has learned to play some classic Atari games. The program, inspired by the human brain, was found to perform as well as or better than a professional human player on more than half of the games.


Google DeepMind researchers claimed this was the first time a system had learned to master such a wide range of complex tasks. The study describing the new program was published in the journal Nature. DeepMind vice president Dr. Demis Hassabis said, “Up until now, self-learning systems have only been used for relatively simple problems. For the first time, we have used it in a perceptually rich environment to complete tasks that are very challenging to humans.”

As the technology has improved, more and more companies are investing in machine learning. Google purchased DeepMind Technologies in 2014 for a reported £400 million. However, this is not the first time a machine has become a star at performing complex tasks.

Earlier systems relied on artificial intelligence that was pre-programmed with an instruction manual to help them excel at a task. DeepMind claims its machine is given only basic information before it is handed a video game to play.
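The idea of learning from only basic information can be illustrated with a toy sketch. The example below is not DeepMind's actual method (which uses a deep neural network reading raw screen pixels); it is a minimal tabular Q-learning stand-in, where an agent knows only the available actions and the score it receives, and improves purely by trial and error. The environment, states, and parameters here are all hypothetical.

```python
# Hypothetical toy example: an agent learns a 5-state corridor "game"
# knowing only its actions and the score feedback, via tabular Q-learning.
import random

random.seed(0)

N_STATES = 5          # positions 0..4; reaching state 4 "wins"
ACTIONS = [-1, +1]    # move left or move right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

# Q[state][action index] starts at zero: the agent knows nothing at first.
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, action):
    """Environment: reveals only the new state and the score change."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

for episode in range(200):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit what was learned, sometimes explore.
        if random.random() < EPS:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        nxt, r = step(s, ACTIONS[a])
        # Q-learning update: nudge the estimate toward the observed reward
        # plus the discounted value of the best follow-up action.
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[nxt]) - Q[s][a])
        s = nxt

# After training, the greedy policy moves right (action index 1) everywhere.
policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES - 1)]
print(policy)
```

After a couple of hundred practice games the agent discovers on its own that moving right wins, with no rulebook supplied, which is the same learning principle DeepMind scaled up to Atari games.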

Photo Credits: Zimbio