At LEAP 2023, Jack McCauley (Co-Founder of Oculus, Professor of Engineering at UC Berkeley) spoke about his experience of playing with the potential of AI, and developing different capabilities that could change the world we live in.
He described his private R&D facility, equipped with state-of-the-art technologies and highly skilled engineers, where he explores and tests ideas – including ideas he feels sceptical about: “if I doubt something, or something seems like hysteria, I will go and do the reading and the work myself.”
McCauley told us how, around five years earlier, he and his team had built a homegrown neural network. They wanted to find out if they could track something in flight using AI.
It started with an understanding of the human inner ear
McCauley asked the LEAP audience to hold a finger out in front of them. He told them to focus on the fingernail of that hand and move their heads from side to side – noticing that as the head moved, the eyes stayed focused on the fingernail by moving in the opposite direction to the head.
And it was this basic reflex (the vestibulo-ocular reflex) that he and his team used as the starting point for creating inner ear electronics when he was leading the engineering team at Oculus.
“The inner ear has structures in it,” he said, “and one of the structures is a series of tubes that has a fluid in it, and that fluid tickles hair inside these tubes, and gives the brain feedback on what the head position is and what the body position is. And it feeds this cranial nerve into a cluster of neurons — about a thousand of them, out of 86 billion, are dedicated to balance. So it seemed easy to us at the time to replicate that.”
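The reflex the audience just demonstrated can be stated very compactly. As a rough illustration (not McCauley's implementation – the function name, gain parameter, and numbers here are hypothetical), the eyes counter-rotate against head motion so that gaze stays fixed on a target:

```python
# Illustrative sketch of the vestibulo-ocular reflex: the eyes rotate
# opposite to the head so the gaze stays locked on a target.
# (Hypothetical example -- not Oculus's actual code.)

def vor_eye_velocity(head_angular_velocity_dps: float, gain: float = 1.0) -> float:
    """Eye angular velocity (deg/s) that compensates a head rotation.

    A gain of 1.0 means perfect compensation; the biological reflex
    operates close to that in normal viewing conditions.
    """
    return -gain * head_angular_velocity_dps

# Head turns right at 30 deg/s -> eyes rotate left at 30 deg/s
print(vor_eye_velocity(30.0))
```

In a headset, the same idea runs in reverse: gyroscope readings of head motion tell the renderer how to shift the displayed scene so the virtual world appears to stay still.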
Using a combination of hardware purchased from online retailers, and hardware that had to be built from scratch, the team created a neural network that could track a ball being bounced up and down – much like a human eye can track a ball.
They were doing the inner ear research to find a solution to the nausea that 10% of people experience when they use a VR headset. But McCauley explained that the results of this exploration can also be translated to AI-powered autonomous vehicles – taking the ability to track an object and using it to map the road ahead of a vehicle.
The neural network has the capacity to learn
A neural network can be trained, or conditioned, to do certain things – by feeding it sets of example inputs that act like teachers.
“We wanted to teach this thing we built to track something in flight,” McCauley said. To do this, he and his team had to throw a ball in the air in front of a little eyeball they’d built; “and we’d throw the ball over and over again, and eventually the eyeball started catching it and then tracking it.”
It was a slow process – they bounced that ball 50,000 times or more – but eventually, the neural network did something amazing: it learned.
“The beauty of AI like this is I didn’t really have to sit down at a keyboard and type into the keyboard what to do — it sort of taught itself what to do. It was able to do this without my intervention.”
This device, McCauley added, can be used as part of a cluster of devices on the front of a car to track objects ahead of the vehicle; and even determine exactly where and how far away an object is.
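How could a cluster of trackers determine how far away an object is? One standard approach (an assumption for illustration – McCauley did not specify the method) is triangulation between two trackers a known distance apart, as in a stereo camera pair:

```python
# Hypothetical illustration of ranging with two trackers on a vehicle,
# using the standard stereo-triangulation relation:
#   depth = focal_length * baseline / disparity
# (McCauley did not describe his method; this is one common technique.)

def stereo_depth(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Distance (m) to an object from the pixel disparity between two views."""
    return focal_px * baseline_m / disparity_px

# Trackers 0.5 m apart, 800 px focal length, 20 px disparity:
print(stereo_depth(0.5, 800.0, 20.0))  # -> 20.0 metres
```

The nearer the object, the larger the disparity between what the two trackers see – which is why a pair of devices can recover distance that a single tracker cannot.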
But putting use cases aside for a moment, there’s something else to highlight here. The neural network can learn. Seemingly impossible feats of technology can become possible – through experimentation, play, and the power of curiosity. And that makes the future of technology an incredibly exciting prospect.
To find out more about AI...👀
Head over to DeepFest, the dedicated AI stream at LEAP from 4-7 March 2024. DeepFest is where the global AI community comes together to learn, connect, and explore the latest innovations in AI.
Register now to attend LEAP 2024.