WATCH: Mark Zuckerberg showcases his very own Jarvis AI

And disses Nickelback while he's at it...

Much has been made of Mark Zuckerberg's "run 365 miles in a year" project – particularly when he was jogging mask-free and beaming through a smoggy Tiananmen Square – but he's been up to something else in 2016 that's altogether more intriguing.

Now he's presented the fruits of his 100 hours of labour to the world, coming across as a pretty wooden host and getting a shark-vaulting "Nickelback's music isn't very good, is it?" gag in for good measure.

Essentially, he's come over all Tony Stark and created his own home "Jarvis" AI. And he's one-upped Iron Man and made sure it's voiced by Morgan "God" Freeman... for the video, at least.

After a quick humblebrag ("it's Saturday, so you only have five meetings," Freeman deadpans to a recently-conscious and fresh-as-a-daisy Zuck), we learn that the Facebook founder started work on a simple AI that could be controlled via an app and run his family home. We then get an idea of how it all works. Take a look...

For a more detailed understanding of what went into the project and how exactly the technology works, you're best heading for Zuckerberg's extensive Facebook post on it all, 'Building Jarvis'. Right off the bat, he gives a summary of what it can actually do:

"So far this year, I've built a simple AI that I can talk to on my phone and computer, that can control my home, including lights, temperature, appliances, music and security, that learns my tastes and patterns, that can learn new words and concepts, and that can even entertain [his son] Max. It uses several artificial intelligence techniques, including natural language processing, speech recognition, face recognition, and reinforcement learning, written in Python, PHP and Objective C."

Connectivity was an early stumbling block. Even when he could hook certain "dumb" appliances up to internet-connected power switches that let you turn the power on and off remotely, he ran into problems. For example, he had to resort to an old 1950s toaster, because he found it nigh-on impossible to find a newer model that lets you push the bread down while the power is off, so that toasting starts automatically when the power comes back on. He called for the development of common application programming interfaces (APIs) and standards so that all devices can talk to each other.
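For a sense of what the "smart switch, dumb appliance" trick looks like, here's a hypothetical Python sketch of toggling a connected power switch over a local HTTP API. The address and payload format are invented; every real switch has its own API, which is exactly the fragmentation Zuckerberg was complaining about.

```python
# Hypothetical sketch: driving a connected power switch over a local HTTP API.
# The URL and JSON payload are assumptions, not a real product's interface.
import requests

SWITCH_URL = "http://192.168.1.50/api/power"  # hypothetical address of the switch

def set_power(on: bool) -> bool:
    """Turn the switch (and whatever is plugged into it) on or off."""
    response = requests.post(
        SWITCH_URL,
        json={"state": "on" if on else "off"},
        timeout=5,
    )
    return response.ok

# With a 1950s toaster the lever stays down while unpowered,
# so simply restoring power is enough to start the toast.
if __name__ == "__main__":
    set_power(True)
```

A common standard would mean one call like this working across switches, lights and thermostats, rather than a different integration per vendor.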

In conclusion, Zuckerberg restates his earlier prediction that, within five to ten years, we'll all have AI systems that are more accurate than people for each of our senses, as well as for language.

"It's impressive how powerful the state of the art for these tools is becoming," he writes, "and this year makes me more confident in my prediction."

Despite this, he believes we are still "far off" from being able to teach AI how to learn. Zuckerberg notes:

"I spent about 100 hours building Jarvis this year, and now I have a pretty good system that understands me and can do lots of things. But even if I spent 1,000 more hours, I probably wouldn't be able to build a system that could learn completely new skills on its own – unless I made some fundamental breakthrough in the state of AI along the way."
