Auditory perception comes into focus in undergraduate research project
(August 5, 2013)
BUZZ… ZOOM… WHOOSH…
These sounds come from a small side room in the Sherrill Center at UNC Asheville, as part of a summer neuroscience research project. Each sound is played through headphones and controlled by a computer just outside the door. Inside the room, the electrodes and wires of electroencephalography (EEG) equipment record the listener’s brain responses and send them to the computer, which displays the data on screen.
That’s where psychology major Matthew Underwood is piecing together how the brain distinguishes continuously moving sounds from jumping or stationary ones, under the guidance of Associate Professor Michael Neelon.
“We are generally trying to find out how auditory perception works,” said Underwood. “More specifically, we are trying to determine how we recognize a moving sound source. In this case, we are linking the electrical activity your neurons produce when you hear a moving sound to the subjective perception of a sound that moves continuously. And the purpose is basically to learn more about how the auditory pathways in your brain produce that perception of ‘continuousness,’ which is one step toward understanding the auditory processing system in the brain.”
“One goal, and a scary thought of what will happen soon, is to develop computer programs that can look at these inputs and re-create what is going on in your mind…." —Associate Professor Michael Neelon
They designed the experiment from scratch, building on Neelon’s dissertation and introducing Underwood to the kind of theoretical research he will be doing in graduate school. The senior from Chapel Hill will graduate with a neuroscience minor and plenty of hands-on experience (or heads-on, in this case).
To measure the perception of sound motion, the researchers use an EEG cap with 64 electrodes that record the duration and location of many brain signals. The data appear on screen as hundreds of jagged lines measured in microvolts (millionths of a volt). The researchers average across many trials to cancel out random activity and isolate “event-related potentials,” the responses the brain produces when it registers a particular sound. Multicolored graphics that flare up in response also let them determine which region of the brain responds to the sound. The goal is to link brain responses to personal sensory experiences of the world. They are starting with the sensory experiences and looking at the brain responses, but in the future, it could work in reverse as well.
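The averaging step described above can be sketched in a few lines of code. The following is a hypothetical illustration, not the lab’s actual analysis: all numbers (trial counts, noise levels, the shape of the evoked waveform) are invented for the demo. The idea is simply that single-trial EEG is dominated by random background activity, but averaging many time-locked trials cancels the noise and leaves the stimulus-evoked response, the event-related potential.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 200, 500           # 200 sound presentations, 500 time points
t = np.linspace(0.0, 0.5, n_samples)     # 0-500 ms after sound onset

# A toy evoked response: a deflection peaking around 100 ms after onset,
# on the order of microvolts (hypothetical shape and amplitude).
evoked = -5e-6 * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))

# Each trial = the evoked response buried in much larger random background EEG.
noise = rng.normal(0.0, 20e-6, size=(n_trials, n_samples))
trials = evoked + noise

# Averaging across trials: random noise shrinks roughly as 1/sqrt(n_trials),
# so the event-related potential emerges from the background.
erp = trials.mean(axis=0)

# The averaged waveform tracks the true evoked response far better
# than any single noisy trial does.
residual_single = np.abs(trials[0] - evoked).mean()
residual_avg = np.abs(erp - evoked).mean()
print(residual_avg < residual_single)    # prints True
```

With 200 trials, the residual noise in the average is roughly fourteen times smaller than in a single trial, which is why ERP studies repeat each sound many times.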
“Neurons produce responses on or off. If you do millions of them together and connect them in the right way, you stop seeing neural responses and start seeing moving sounds, music, language, or people you recognize – all of those things that are your interpretation of neural signals,” explained Neelon. “One goal, and a scary thought of what will happen soon, is to develop computer programs that can look at these inputs and re-create what is going on in your mind…. We are just at the beginning of that, and as close as we can get is to say you heard a sound. Maybe you heard a moving sound. Maybe we can say you heard a sound moving to the left. But in 20 years, we might be able to say you are listening to music and you are listening to a song you like, and we can program your device to keep playing that song.”
That might be music to Underwood’s ears, but for now, he’s concentrating on this first experiment and its research methods, which are themselves breaking new ground in the study of sound and the brain.
“Most auditory motion research has not had this range in the type of sounds used. So we are using different motions and paths of sounds and determining the best methods for doing this kind of research,” Underwood said.