Mal McCallion

AI Rocks On: Recreating Pink Floyd Using Brain Activity

Updated: Dec 11, 2023


Ever wondered what your brainwaves sound like as a rock anthem?


Well, an artificial intelligence (AI) has turned this sci-fi fantasy into a reality by recreating a Pink Floyd song based on brain activity! This breakthrough not only hits the right notes for music lovers but also enhances our understanding of sound perception, potentially improving devices for people with speech difficulties.


The maestro behind this symphony of science is Robert Knight and his team at the University of California, Berkeley. They studied brain activity recordings from 29 individuals who had electrodes surgically implanted on their brains as part of epilepsy treatment. The recordings were made while participants were grooving to Pink Floyd's "Another Brick in the Wall, Part 1".


The team identified brain signals strongly linked to the song's pitch, melody, harmony, and rhythm. They then trained an AI to understand these links, excluding a 15-second segment of the song from the training data. The AI then predicted this unseen song snippet based on the participants' brain signals, achieving a 43% similarity to the actual song clip. Quite a performance for a non-human, wouldn't you agree?
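To make the approach a little more concrete, here is a minimal sketch in Python of the general "train on most of the song, hold out a segment, predict it from brain signals" idea. This is not the Berkeley team's actual pipeline; the array shapes, the ridge-regression decoder and the correlation score are all illustrative assumptions.

```python
# A minimal sketch of the decoding idea, using made-up stand-in data.
# X: brain features (time frames x electrodes), Y: song spectrogram (time frames x frequency bins).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.standard_normal((3000, 128))   # stand-in for electrode activity over time
Y = rng.standard_normal((3000, 64))    # stand-in for the song's spectrogram

# Hold out a contiguous slice of frames (the "15-second segment") for testing.
test = slice(1000, 1150)
train_idx = np.setdiff1d(np.arange(len(X)), np.arange(test.start, test.stop))

model = Ridge(alpha=1.0)               # a simple linear decoder
model.fit(X[train_idx], Y[train_idx])
Y_pred = model.predict(X[test])

# Score how close the predicted spectrogram frames are to the real ones with a
# Pearson correlation - one crude way to arrive at a figure like "43% similarity".
r = np.corrcoef(Y_pred.ravel(), Y[test].ravel())[0, 1]
print(f"similarity (correlation): {r:.2f}")
```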

Click here to listen to the original song clip after some simple processing that makes it a fair comparison with the AI-generated clip, which loses some quality when it is converted from a spectrogram back into audio.
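That quality loss happens because a magnitude spectrogram throws away phase information, which has to be approximated when the audio is rebuilt. Below is a minimal sketch of that step using the Griffin-Lim algorithm in librosa; the file names and settings are hypothetical, not those used in the study.

```python
# A minimal sketch of spectrogram-to-audio reconstruction and why it degrades sound.
import librosa
import soundfile as sf

y, sr = librosa.load("another_brick_clip.wav", sr=None)   # hypothetical 15-second clip

# Magnitude spectrogram: phase is discarded, much like a decoder's prediction.
S = abs(librosa.stft(y, n_fft=2048, hop_length=512))

# Rebuild audio from magnitude only; Griffin-Lim estimates the missing phase,
# which is where the audible degradation comes from.
y_rebuilt = librosa.griffinlim(S, n_iter=32, hop_length=512)

sf.write("clip_reconstructed.wav", y_rebuilt, sr)
```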

The study also revealed that the right hemisphere of the brain is more crucial for processing music than the left, confirming previous research. Furthermore, one brain region, the superior temporal gyrus, was identified as processing the rhythm of the song's guitar.


This deeper understanding of how the brain perceives music could potentially enhance devices that assist people with speech difficulties, such as those with amyotrophic lateral sclerosis or aphasia. Knight envisions a future where these devices sound less robotic and more human.


While the brain implants used in this study are unlikely to be used for non-clinical applications, other researchers have used AI to generate song clips from brain signals recorded using MRI scans. This technology could even be used to compose music based on imagined, not just heard, songs.


As this technology progresses, it could raise interesting questions around copyright infringement, depending on the similarity between the AI's recreation and the original music. Who would be the author of the AI-generated song? The person recording the brain activity, the AI, or the listener? For now, we'll have to wait and see how the courts decide to play this tune.


So, next time you're listening to your favourite song, just remember - your brain might be jamming along too!



Made with TRUST_AI - see the Charter: https://www.modelprop.co.uk/trust-ai
