Digital ‘brain’ learned to spot cats by watching YouTube stills

Google is experimenting with getting computers to simulate the learning process of the human brain, one of the unusual projects for researchers at its X Lab.

Computers programmed with algorithms intended to mimic neural connections “learned” to recognise cats after being shown a sampling of YouTube videos, Google Fellow Jeff Dean and visiting faculty member Andrew Ng said in a blog post.

“Our hypothesis was that it would learn to recognise common objects in those videos,” the researchers said.

“Indeed, to our amusement, one of our artificial neurons learned to respond strongly to pictures of... cats. Remember that this network had never been told what a cat was, nor was it given even a single image labelled as a cat.”

The computer, essentially, discovered for itself what a cat looked like, according to Mr Dean and Mr Ng.

The computations were spread across an “artificial neural network” of 16,000 processors and a billion connections in Google data centres.

The small-scale “newborn brain” was shown YouTube images for a week to see what it would learn.

“It ‘discovered’ what a cat looked like by itself from only unlabeled YouTube stills,” the researchers said.

“That’s what we mean by self-taught learning.”
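In broad terms, self-taught learning of this kind means training a network to find recurring patterns in unlabelled data, so that individual units end up responding to objects nobody pointed out. The sketch below is a heavily simplified illustration of that idea, assuming a tiny autoencoder trained on random stand-in data rather than YouTube frames; the sizes, names and numbers here are placeholders and do not describe Google's actual system.

```python
# Illustrative sketch of "self-taught" (unsupervised) feature learning:
# a tiny autoencoder is trained to reconstruct unlabelled image patches,
# and its hidden units end up responding to recurring visual patterns
# without ever being shown a label. All values are placeholders.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for unlabelled video stills: 1,000 random 8x8 grayscale patches.
patches = rng.random((1000, 64))

n_hidden = 16          # far smaller than the real network's billion connections
lr = 0.1               # learning rate
W_enc = rng.normal(0, 0.1, (64, n_hidden))
W_dec = rng.normal(0, 0.1, (n_hidden, 64))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(50):
    # Encode each patch into hidden activations, then try to reconstruct it.
    hidden = sigmoid(patches @ W_enc)
    recon = hidden @ W_dec
    error = recon - patches

    # Plain gradient descent on the reconstruction error.
    grad_dec = hidden.T @ error / len(patches)
    grad_hidden = (error @ W_dec.T) * hidden * (1 - hidden)
    grad_enc = patches.T @ grad_hidden / len(patches)

    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# After training, each column of W_enc acts as a learned "feature detector":
# a unit that fires strongly for whatever pattern it discovered on its own.
print("reconstruction error:", float(np.mean(error ** 2)))
```

In the Google experiment the same principle was applied at vastly larger scale, which is how one unit came to fire strongly for cat images despite never having been given a labelled example.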

Google researchers are building a larger model and are working to apply the artificial neural network approach to improve technology for speech recognition and natural language modelling, according to Mr Dean and Mr Ng.
