Multidisciplinary artist Jason Snell, who composes music directly from his brain waves, will perform and discuss his exploration of mental music composition during a Feb. 5 seminar at St. Ambrose University.
During performances, Snell wears an electroencephalographic (EEG) probe to read the electrical activity of his brain. The EEG signal is then transmitted via Bluetooth to an iPhone app he developed, which manipulates sound – all without him touching a synthesizer.
The technology has enabled him to explore the space between consciousness and deep areas of the subconscious, including an intensely visual hypnagogic state where manipulating objects in his mind has an immediate parallel in the audio output. This powerful biofeedback system erases the line between cause and effect as the brain and its immediate auditory output influence and shape each other in real-time.
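The article does not detail how Snell's app turns brain activity into sound, but a common approach in EEG-driven audio is to estimate power in a frequency band (such as the 8–12 Hz alpha band) and map it to a synthesis parameter. The following is a minimal illustrative sketch of that general idea, not Snell's actual system; the function names, the alpha-to-pitch mapping, and the simulated signal are all hypothetical.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate power in a frequency band via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def alpha_to_pitch(signal, fs, base_hz=220.0, span_hz=220.0):
    """Map relative alpha-band (8-12 Hz) power to a synth pitch.

    Hypothetical mapping for illustration: more alpha -> higher pitch.
    """
    alpha = band_power(signal, fs, 8.0, 12.0)
    total = band_power(signal, fs, 1.0, 40.0)
    ratio = alpha / total if total > 0 else 0.0
    return base_hz + span_hz * ratio

# Simulated one-second EEG frame dominated by a 10 Hz (alpha) rhythm,
# with a weaker 25 Hz (beta) component mixed in.
fs = 256
t = np.arange(fs) / fs
frame = np.sin(2 * np.pi * 10 * t) + 0.2 * np.sin(2 * np.pi * 25 * t)
pitch = alpha_to_pitch(frame, fs)
print(round(pitch, 1))
```

In a real-time biofeedback loop like the one described above, a frame of samples would be analyzed many times per second, so the performer hears the audio shift almost immediately as their mental state changes.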
Anyone interested in sensation and perception, cognition, music, and how information can be creatively processed is encouraged to attend. The one-hour seminar begins at 6:30 p.m. in Madsen Hall of the Galvin Fine Arts Center. The event is free and open to the public, and is sponsored by the SAU departments of Psychology, Music, and Engineering.
Based between New York, Iowa, and Berlin, Snell has expertise in several fields, including computer programming, artificial intelligence, motion design, music production, and generative composition. For more than 20 years, he's fused his talents into multimedia projects, most recently with his EEG system that composes music with thought.
Snell's music, motion design, and filmmaking have been featured at Sundance, SXSW, Slamdance DIG, the Berlin Independent, San Francisco Independent, Eufonia Berlin, and Seoul Net film festivals.