An electroencephalogram, or EEG, is a noninvasive brain-monitoring test that involves placing electrodes along the scalp to send signals to a computer for analysis. EEGs have been widely used to study swallowing, classify mental states, and diagnose neuropsychiatric disorders such as neurogenic pain and epilepsy, but some researchers believe they have unexplored potential.
In a paper (“Emotion Recognition with Machine Learning Using EEG Signals”) published on the preprint server arXiv.org, a team hailing from Texas Tech University, the University of Tabriz in Iran, and Akrham Hospital describes an AI system that recognizes emotions from EEG results alone.
“Emotion states are associated with [a] wide variety of human feelings, thoughts, and behaviors; hence, they affect our ability to act rationally, in cases such as decision-making, perception, and human intelligence,” they wrote. “In recent years, developing emotion recognition systems based on EEG signals [has] become a popular research topic among cognitive scientists.”
EEG signals, the team notes, are challenging to analyze because they’re nonlinear, somewhat random, and “buried into various sources of noise.” To reduce that noise, the researchers applied an average mean reference method and then settled on a decomposition approach to feature extraction. Through wavelet transforms — a mathematical means of analyzing a signal whose frequency content varies over time — each EEG signal was split into its gamma, beta, alpha, and theta band components, from which statistical features of the source signal were derived.
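To make that decomposition step concrete, here is a minimal sketch in Python using the PyWavelets library. The paper’s exact wavelet family, decomposition depth, and feature set aren’t spelled out above, so the “db4” wavelet, four levels, and the mean/standard deviation/power statistics below are illustrative assumptions (the frequency ranges in the comments assume EEG sampled at 128 Hz, as in DEAP’s preprocessed data).

```python
# A hedged sketch of wavelet-based band decomposition and feature
# extraction for one EEG channel. Wavelet choice, level, and features
# are assumptions, not the paper's confirmed settings.
import numpy as np
import pywt


def band_features(eeg_signal, wavelet="db4", level=4):
    """Decompose one EEG channel and return per-band statistical features.

    For a 128 Hz signal, a 4-level discrete wavelet transform yields
    detail coefficients roughly covering gamma (32-64 Hz), beta
    (16-32 Hz), alpha (8-16 Hz), and theta (4-8 Hz).
    """
    # wavedec returns [approximation, D4, D3, D2, D1]
    coeffs = pywt.wavedec(eeg_signal, wavelet, level=level)
    bands = {"theta": coeffs[1], "alpha": coeffs[2],
             "beta": coeffs[3], "gamma": coeffs[4]}
    features = {}
    for name, c in bands.items():
        features[f"{name}_mean"] = np.mean(c)
        features[f"{name}_std"] = np.std(c)
        # Mean squared coefficient as a simple band-power proxy.
        features[f"{name}_power"] = np.mean(np.square(c))
    return features


# Example: one simulated 60-second channel sampled at 128 Hz.
signal = np.random.randn(60 * 128)
print(band_features(signal))
```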
The researchers sourced DEAP, an annotated corpus for emotion analysis using physiological signals, to train their emotion classifiers. This comprises EEG data from 32 participants, who were told to watch 40 one-minute music videos and grade them on a scale of 1 to 9 in several categories, including valence (a given video’s intrinsic attractiveness/”good”-ness or averseness/”bad”-ness), arousal (the intensity of the physiological response it provoked), dominance, and emotion. Grades greater than 4.5 were considered “high,” while grades less than 4.5 were labeled “low.”
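That labeling rule translates directly into code; a minimal sketch, with made-up ratings for illustration:

```python
# Binarizing DEAP-style ratings on the 1-to-9 scale at the 4.5
# threshold described above. The rating values here are invented.
import numpy as np

valence_ratings = np.array([7.2, 3.1, 4.4, 8.9, 2.0])
labels = (valence_ratings > 4.5).astype(int)  # 1 = "high", 0 = "low"
print(labels)  # [1 0 0 1 0]
```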
With that data in hand, the paper’s authors trained three types of classifiers to distinguish among emotions: a k-nearest neighbor algorithm, a support vector machine, and an artificial neural network. All three were fed features from the EEG signals of 10 electrode channels near the left and right frontal lobes — the regions most closely associated with positive and negative emotions. Compared with the baseline, the best-performing classifier of the three achieved 91.3 percent accuracy for arousal and 91.1 percent accuracy for valence, both in the beta frequency band.
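A hedged sketch of that three-way comparison, using scikit-learn stand-ins (a kNN, an RBF-kernel SVM, and a multilayer perceptron for the artificial neural network). The feature matrix, labels, and hyperparameters below are placeholders, not the paper’s:

```python
# Comparing kNN, SVM, and a small neural network on placeholder
# features, as a stand-in for the paper's three-classifier setup.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1280, 40))    # e.g. 32 subjects x 40 trials, 40 features
y = rng.integers(0, 2, size=1280)  # placeholder high/low labels

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf")),
                  ("ANN", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f} mean cross-validated accuracy")
```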
The researchers posit that ensemble learning, an AI paradigm in which a combination of machine learning systems work together to produce a single prediction, could further improve the models’ performance. But they claim the current accuracy is higher than existing algorithms applied to the DEAP data set.
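The simplest version of that ensemble idea is a majority vote over the three classifiers above; the scikit-learn sketch below is one assumed way such a combination might look, not the paper’s method:

```python
# A majority-vote ensemble of the three placeholder classifiers,
# illustrating the ensemble-learning suggestion above.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1280, 40))    # same placeholder data as before
y = rng.integers(0, 2, size=1280)

ensemble = VotingClassifier(
    estimators=[("knn", KNeighborsClassifier(n_neighbors=5)),
                ("svm", SVC(kernel="rbf")),
                ("ann", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))],
    voting="hard",  # majority vote over the three hard predictions
)
print(cross_val_score(ensemble, X, y, cv=5).mean())
```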
“[S]tudies on emotion recognition using emotional signals enhance the brain-computer interface (BCI) systems as an effective subject for clinical applications and human social interactions,” the researchers say. Systems like these could be used “to investigate emotional states while considering natural aspects of emotions to elucidate therapeutics for psychological disorders such as autism spectrum disorder (ASD), attention deficit hyperactivity disorder (ADHD), and anxiety disorder.”