Key Takeaways
- Embodied cognition shows how physical presence influences decision-making.
- Giving AI software senses and a body moves its learning from the abstract to the experiential.
- Virtual simulations let AI learn quickly and cheaply before it is trusted with physical robot hardware.
When you think about “AI” these days, you’re probably picturing an app on your phone that you can chat with, or that creates pictures from a text prompt. However, AI is a broad and varied field, and with advances in robotics, we’re about to give our AI friends real physical bodies, which could be a game-changer for their development.
A Body Shapes a Mind
Imagine for a second that your brain was transplanted into a dolphin, or a bear, or any animal you like. The way you see the world, how your body works, and the nature of your physical presence would have a psychological effect on you, wouldn’t it? Even using VR to “swap” your gender for a few minutes can have an effect on how you make decisions.
These are examples of what psychologists refer to as embodied cognition. This approach to understanding how we (and other living organisms) think is rooted in the idea that the nature of your physical body and your senses has an effect on the sort of mind you develop. It fundamentally shapes how you think, how you make decisions, and how you see the world.
Now, I don’t want to draw hard parallels between biological animals such as ourselves and nascent AI software, but there is good reason to think that “embodying” AI software will change the nature of that AI.
Brains in Jars
If you take something like an LLM (Large Language Model), it’s an enormous artificial neural net that’s been shaped and trained on abstract concepts such as words, grammar, logic, and other symbolic things. It’s like a brain in a jar without any senses or the means to interact with the outside world, apart from the prompts that we put into it.
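To make the “brain in a jar” idea concrete, here’s a minimal sketch of how we interact with an LLM today: text goes in, text comes out, and that prompt is the model’s only window onto the world. It assumes the openai Python client and an illustrative model name, which may differ from what you actually use.

```python
# A "brain in a jar": the only sense this model has is the text we feed it.
# Assumes the `openai` Python package and an API key in OPENAI_API_KEY;
# the model name is illustrative and may differ for you.
from openai import OpenAI

client = OpenAI()

prompt = "Describe the room you are in right now."
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": prompt}],
)

# The model can only confabulate an answer -- it has no eyes, ears, or body,
# so everything it "knows" about rooms comes from its training text.
print(response.choices[0].message.content)
```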
Now, slowly, we’ve been giving these simulated neural networks “senses” so they can process sights and sounds. A very rudimentary example of this is how Boston Dynamics combined their Spot robot with ChatGPT, allowing the robot to look around its environment and talk to the people it meets. This is just a parlor trick, of course, but it’s highly successful at creating the illusion of a thinking machine. If you watch the video below, there’s even evidence of some pretty cool “emergent” behavior, where the GPT-infused Spot exhibits unexpected but logical behaviors its creators did not foresee.
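Boston Dynamics hasn’t published their integration, but the general pattern is easy to sketch: grab a camera frame, hand it to a vision-capable model, and speak whatever comes back. In the sketch below, capture_image and speak are hypothetical placeholders for the robot side; only the chat call follows the shape of a real API.

```python
# Rough sketch of an LLM-driven "tour guide" loop, in the spirit of the
# Spot + ChatGPT demo. The robot functions are hypothetical placeholders;
# the multimodal chat call follows the shape of the OpenAI Python API.
import base64
from openai import OpenAI

client = OpenAI()

def capture_image() -> bytes:
    """Hypothetical: grab a JPEG frame from the robot's camera."""
    raise NotImplementedError

def speak(text: str) -> None:
    """Hypothetical: send text to the robot's text-to-speech system."""
    print(f"[robot says] {text}")

def perception_to_speech() -> None:
    frame = base64.b64encode(capture_image()).decode()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed vision-capable model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "You are a robot tour guide. Describe what you see "
                         "and say one friendly sentence to nearby people."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{frame}"}},
            ],
        }],
    )
    speak(response.choices[0].message.content)
```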
Living in a Simulation
Before we get to real physical robots and why they’re going to be a big deal soon, there’s actually an intermediate step to giving AI real physical bodies. It’s now common practice to let AI software learn inside a virtual simulation of the real world.
By putting simulated robot bodies under the control of AI software, you can get years and years of learning done in hours. This not only massively accelerates how quickly AI software learns to move a body around an environment, it’s also much cheaper than putting $100K of robotic hardware at risk just to learn how not to fall down.
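Here’s a rough sketch of what that looks like in practice, using the Gymnasium simulation API with random actions standing in for a learned policy. The “Humanoid-v4” environment name assumes the MuJoCo extras are installed; a real setup would plug in an RL algorithm and run many simulated bodies in parallel, far faster than real time.

```python
# Minimal sketch of "learning to move a body" in simulation, using the
# Gymnasium API. A real project would plug an RL algorithm (e.g. PPO from a
# library like Stable-Baselines3) into this loop instead of random actions,
# and run many simulated environments in parallel.
import gymnasium as gym

# "Humanoid-v4" assumes the MuJoCo extras are installed; any robot-like
# environment works for the purpose of this sketch.
env = gym.make("Humanoid-v4")

obs, info = env.reset(seed=0)
total_reward = 0.0

for step in range(1_000):
    action = env.action_space.sample()       # stand-in for a learned policy
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    if terminated or truncated:               # the simulated body fell over
        obs, info = env.reset()

env.close()
print(f"accumulated reward (random policy): {total_reward:.1f}")
```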
Becoming Present in the World
For the most part, modern AI models have been trained on massive, abstract data sets. It’s as if everything you’d ever learned in your life were vicarious, picked up from books or videos. Once you embody learning software, you change that learning from vicarious to experiential. Suppose you had thousands of household robots, all gathering data from the real world to improve the software that runs them. In aggregate, that new data is based on first-hand “experience”: potentially hundreds or thousands of years of embodied learning from the real world, reshaping the AI models that drive the hardware.
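A conceptual sketch of that fleet-learning loop might look like this. Every name here is a hypothetical placeholder; the point is the shape of the pipeline: robots log episodes, the logs are pooled, a shared model is fine-tuned, and the improved weights go back out to the fleet.

```python
# Conceptual sketch of "fleet learning": every household robot logs real-world
# episodes, the logs are pooled centrally, and a shared model is fine-tuned on
# the aggregate experience. All names here are hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class Episode:
    observations: list        # camera frames, joint angles, etc.
    actions: list             # what the robot actually did
    outcome: str              # e.g. "task_succeeded", "dropped_object"

@dataclass
class FleetDataset:
    episodes: list = field(default_factory=list)

    def add(self, robot_id: str, episode: Episode) -> None:
        # In practice this would be an upload to central storage.
        self.episodes.append((robot_id, episode))

def fine_tune(base_model, dataset: FleetDataset):
    """Hypothetical: update the shared policy on pooled real-world episodes,
    e.g. via behavior cloning or RL fine-tuning."""
    return base_model

def nightly_update(robots, base_model):
    dataset = FleetDataset()
    for robot in robots:
        for episode in robot.flush_logged_episodes():   # hypothetical method
            dataset.add(robot.id, episode)
    improved = fine_tune(base_model, dataset)
    for robot in robots:
        robot.load_weights(improved)                     # hypothetical method
    return improved
```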
I just can’t see this qualitative change in data having no effect on the nature of the model. Moving it from the abstract to the concrete, and making the AI something that’s shaped by its environment (just like we and every other living creature on Earth are) will alter the nature of that AI.
We are only a few years away from capable robotic hardware being affordable for households, and versatile autonomous robots like the Figure 02 are being tested in factories as I write this.
So, don’t be surprised if you start seeing machines out in the wild doing all sorts of odd jobs, learning from life’s hard knocks just like you and me.