The human brain has long eluded scientists trying to understand how it actually works. Granted, we’ve got a general idea thanks to MRI machines, but a study led by the University of Oregon has brought us even closer to the truth by using AI to read a person’s mind… sort of.
This study, led by Brice Kuhl and Hongmi Lee, used artificial intelligence that analysed brain activity in an attempt to recreate one of a series of faces that participants were seeing. It’s not an exact science, but as you can see below, the computer got close.
So how do you get AI to recreate faces? Well, you start with a training phase… Test participants were put in an MRI and shown hundreds of pictures of faces. The program processed this MRI data in real time, analysing an insane 300 data points describing every face!
An MRI detects the movement of blood around the brain, which is taken as a proxy for brain activity - so the program analysed these blood-flow responses to each face to have a crack at guessing which one you’re looking at.
The test itself saw the participants being shown a face, but this time one the program had never seen before. The only thing it could go on was the processed MRI data. And while the results were not exact, they were pretty damn close…
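To make the idea concrete, here’s a minimal sketch of that train-then-predict pipeline. Everything here is synthetic and assumed for illustration - the voxel count, the noise level, and the linear model are guesses, and only the 300 face features come from the article - but it shows the shape of the trick: learn a mapping from brain activity to face features on known faces, then apply it to the activity for an unseen face.

```python
import numpy as np

# Synthetic stand-in for the study's data. Dimensions are assumptions,
# except the 300 face features mentioned in the article.
rng = np.random.default_rng(0)
n_train, n_voxels, n_features = 1200, 600, 300

# Pretend the brain responds to a face as a (noisy) linear mix of its features.
true_map = rng.normal(size=(n_voxels, n_features))
features_train = rng.normal(size=(n_train, n_features))          # face descriptors
activity_train = (features_train @ true_map.T
                  + rng.normal(scale=5.0, size=(n_train, n_voxels)))

# Training phase: least-squares fit from activity patterns to face features.
weights, *_ = np.linalg.lstsq(activity_train, features_train, rcond=None)

# Test phase: a face the model has never seen, only its brain response.
features_new = rng.normal(size=(1, n_features))
activity_new = (features_new @ true_map.T
                + rng.normal(scale=5.0, size=(1, n_voxels)))
predicted = activity_new @ weights

# The guess won't be exact, but it should correlate with the real features.
r = np.corrcoef(predicted.ravel(), features_new.ravel())[0, 1]
print(round(r, 2))
```

In the real study the predicted feature vector would then be rendered back into a face image; here we just check that the recovered features track the true ones.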
The top row of images shows the faces seen by the person, and the bottom two are the AI guesses based on two different areas of the brain:
OTA (occipitotemporal cortex) processes visual input
ANG (angular gyrus) is the part of the brain that sparks to life when we relive vivid memories
To put some numbers on the program’s success rate, Kuhl and Lee showed these reconstructed faces to a separate group of people and asked them questions relating to skin tone, gender and emotion. Unsurprisingly, the group responded correctly at a higher rate than random chance - which suggests the AI renderings capture genuinely useful information.
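That “better than random chance” check is a standard statistical test. The sketch below uses made-up numbers (the article doesn’t report the actual counts) to show how you’d decide whether raters guessing a two-option attribute like gender really beat a 50/50 baseline, using a normal approximation to the binomial distribution.

```python
import numpy as np

# Hypothetical numbers for illustration - the study's real counts aren't given here.
n_trials = 200    # attribute judgements collected from the separate group
n_correct = 130   # how many matched the original face

accuracy = n_correct / n_trials
chance = 0.5      # two-option question, so pure guessing averages 50%

# Normal approximation to the binomial: how surprising is this accuracy
# if the raters were only guessing?
se = np.sqrt(chance * (1 - chance) / n_trials)
z = (accuracy - chance) / se
print(round(accuracy, 2), round(z, 2))
```

A z-score above about 1.96 means the result would be very unlikely under pure guessing, which is the sense in which the raters performed “above chance”.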
But the team didn’t stop there… What if they took the visual aid away and worked entirely from memory? Participants popped inside the MRI again and were shown a face picture, but this time were told to keep thinking about it after the picture was hidden. The test here is simple - can artificial intelligence read your mind? And based on that memory alone (and your MRI data) you can see that the software is not quite there yet...
The lovingly named mind-reading program (which isn’t actually much of a mind reader, but a really effective tool for understanding how the brain works) was able to identify the variables - however, the faces were far too different from the originals to be recognisable.
So breathe a sigh of relief, because Skynet hasn’t channelled its inner-Xavier just yet, but this fascinating work has a bright future of helping us understand more about the human mind.