I can read your mind. I know what you are looking at. You are about to reach the end of the first paragraph. Ta-da! Telepathy, I confess, is not my strong point. But something disconcertingly close to it is emerging from the laboratory. Your perceptions of the outside world arise from brain activity. Scientists in China have managed to reverse-engineer this process, using brain activity to guess what people are looking at. Their algorithm, which analyses functional MRI brain scans collected while volunteers gaze at digits and letters, can furnish uncannily clear depictions of the original images.
It has been termed a mind-reading algorithm; a more accurate, though less catchy, description would be a "visual-field reconstruction" algorithm. The work, led by Changde Du at Beijing's Research Centre for Brain-inspired Intelligence, represents an important step for machine learning, because the algorithm knits together information derived from the visual cortex as it builds up the reimagined image.
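To give a flavour of what such a reconstruction model involves, here is a minimal sketch in Python, using the PyTorch library. It trains a small neural network to map a vector of fMRI measurements to a pixel image. Everything in it, from the number of inputs and the image size to the architecture and the synthetic data, is an illustrative assumption, not a description of Mr Du's actual system, which is considerably more sophisticated.

```python
import torch
from torch import nn

# Illustrative dimensions (assumptions, not the study's real numbers):
# each scan is flattened to 500 voxel activations (a voxel is the 3D
# pixel of a brain scan); each target image is 28x28 pixels.
N_VOXELS, IMG_SIDE = 500, 28

class VoxelDecoder(nn.Module):
    """A small fully connected network mapping voxel activations to pixels."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_VOXELS, 256),
            nn.ReLU(),
            nn.Linear(256, IMG_SIDE * IMG_SIDE),
            nn.Sigmoid(),  # pixel intensities in [0, 1]
        )

    def forward(self, voxels):
        return self.net(voxels).view(-1, IMG_SIDE, IMG_SIDE)

# Synthetic stand-in data: 64 scans paired with the images being viewed.
scans = torch.randn(64, N_VOXELS)
images = torch.rand(64, IMG_SIDE, IMG_SIDE)

model = VoxelDecoder()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train the decoder to reproduce each viewed image from its scan.
for epoch in range(100):
    optimiser.zero_grad()
    loss = loss_fn(model(scans), images)
    loss.backward()
    optimiser.step()
```

In a real experiment the random tensors would be replaced by genuine scan-image pairs, and the network would be judged on how closely its reconstructions resemble what the volunteers actually saw.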
Functional MRI produces notoriously "noisy" scans. Since the brain is three-dimensional, the data emerge as 3D pixels, called voxels. Using deep-learning techniques, Mr Du's algorithm pieced those voxels together while stripping out much of the noise. The next step