Researchers at the Massachusetts Institute of Technology have developed a face emotion algorithm that monitors minuscule facial expressions to measure how much pain a person is experiencing.
The algorithm was trained using videos of people wincing, grimacing and showing other expressions indicating pain, as reported by New Scientist. The videos showed people with shoulder pain performing movements and then rating the resulting level of pain. Based on this input, the algorithm was able to detect slight changes in expression and correlate them with specific degrees of pain.
The technology could play a useful role in distinguishing between real and faked displays of agony. It could also provide a more accurate and objective way to measure pain than asking patients to self-report their degree of discomfort. As a result, it could help prevent doctors from overprescribing opioids and other medications.
Any technique that could help stem the level of opioid-related drug usage would find a warm welcome in the United States, where abuse of such medications has reached a crisis point. The number of deaths in the US related to opioid overdoses increased by nearly a factor of three from 2002 to 2015, according to the National Center for Health Statistics. Nearly 35,000 deaths were attributed to these types of drugs in 2016, according to the center.
Other computer-based technologies have achieved success in distinguishing between real and contrived demonstrations of pain. For example, a system developed by the University of California, San Diego, could flag fake pain expressions 85 percent of the time—compared to only 55 percent for human assessments.
The MIT system can be calibrated to detect face emotions with greater accuracy. For example, the algorithm can take into account a patient’s age, sex and skin complexion, according to Dianbo Liu, who created the system with his colleagues at MIT.
The algorithm addresses some of the more nagging issues related to evaluating pain. Individuals experience and express pain differently. Furthermore, some patients may exaggerate their level of pain to obtain prescriptions for medications.
The MIT algorithm determines face emotion using a neural network and Gaussian process regression models.
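The article does not detail how the neural network and Gaussian process regression fit together, but a common pattern is to use network-derived features as inputs to a GP regressor that predicts a continuous pain score. The sketch below illustrates only the Gaussian process regression step, using synthetic feature vectors as hypothetical stand-ins for neural-network embeddings of facial expressions; the feature dimensions, pain-score scale and kernel choice are all assumptions, not details from the MIT system.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential (RBF) kernel between the rows of A and B.
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * length_scale ** 2))

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    # Posterior mean of standard Gaussian process regression:
    # mean = K(X*, X) [K(X, X) + noise*I]^(-1) y
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = rbf_kernel(X_test, X_train)
    return K_star @ np.linalg.solve(K, y_train)

# Synthetic data: 50 "video clips", each reduced to an 8-dimensional
# feature vector (a stand-in for neural-network facial features),
# paired with a made-up self-reported pain score.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 8))
y_train = 2.0 * X_train[:, 0] + 5.0   # synthetic pain scores
X_test = X_train[:5]                  # predict on seen clips as a sanity check
predicted_pain = gp_predict(X_train, y_train, X_test)
```

In this setup the GP interpolates pain scores for new feature vectors based on their similarity (under the kernel) to the training clips, which is one plausible way to map continuous facial-expression features to a continuous pain estimate.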
Liu said the technology could eventually be turned into a smartphone app for use by doctors. However, he cautioned that it could never completely replace a doctor's diagnosis.
The system was trained with videos that had optimal lighting and photography conditions. Under less ideal circumstances, the algorithm's accuracy might be diminished.
Nirel Marofsky is project analyst for the cognitive engine and application ecosystem at Veritone. She acts as a liaison to strategic partners, integrating developers and their capabilities into the Veritone Platform.