Three students at Cornell University have devised a mobile keyboard app that senses users’ emotions by combining typing-behavior signals with sentiment analysis.
The app, called Keymochi, was developed as part of a research project by Cornell Tech students Hsiao-Ching Lin, Huai-Che Lu, and Claire Opila, according to an article in Digital Trends. Keymochi runs on both iOS and Android.
As users type on the keyboard, Keymochi gathers information such as typing speed, the time of day, changes in punctuation, and the number of backspaces. The app also collects motion data from the smartphone’s sensors. In addition, users can report their mood by picking one of 16 pictures of people in different emotional states, a feature called the photographic affect meter (PAM).
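The typing signals described above can be pictured as a small feature-extraction step. The sketch below is purely illustrative: Keymochi's actual event schema and feature set are not public, so the `Keystroke` type and `extract_features` function here are hypothetical stand-ins for the kinds of signals the article names (typing speed, backspaces, time of day).

```python
from dataclasses import dataclass
from typing import List

# Hypothetical keystroke event; Keymochi's real data schema is not published.
@dataclass
class Keystroke:
    timestamp: float      # seconds since the typing session began
    is_backspace: bool

def extract_features(events: List[Keystroke], hour_of_day: int) -> dict:
    """Aggregate one typing session into article-style signals:
    typing speed, backspace count, and time of day."""
    if len(events) < 2:
        return {"chars_per_sec": 0.0, "backspaces": 0, "hour": hour_of_day}
    duration = events[-1].timestamp - events[0].timestamp
    return {
        "chars_per_sec": len(events) / duration if duration > 0 else 0.0,
        "backspaces": sum(e.is_backspace for e in events),
        "hour": hour_of_day,
    }
```

A real implementation would also fold in the punctuation changes and sensor motion data the article mentions; they are omitted here to keep the sketch minimal.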
All of these data points are aggregated into an emotional profile of the user. The data is sent to the Keymochi database, which combines it with the self-reported mood information from the PAM to train a user-specific machine-learning model.
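The train-on-self-reports idea can be sketched with a toy classifier: the PAM picks serve as labels, and the aggregated typing features serve as inputs. The nearest-centroid model below is an assumption chosen for brevity; the article does not describe which algorithm Keymochi actually uses.

```python
import math
from collections import defaultdict
from typing import Dict, List, Tuple

class MoodModel:
    """Toy per-user classifier mapping feature vectors to PAM mood labels
    by nearest centroid. Illustrative only; not Keymochi's actual model."""

    def __init__(self) -> None:
        self.centroids: Dict[str, List[float]] = {}

    def fit(self, samples: List[Tuple[List[float], str]]) -> None:
        # Group feature vectors by their self-reported mood label,
        # then average each group into a centroid.
        by_label: Dict[str, List[List[float]]] = defaultdict(list)
        for features, label in samples:
            by_label[label].append(features)
        self.centroids = {
            label: [sum(col) / len(vectors) for col in zip(*vectors)]
            for label, vectors in by_label.items()
        }

    def predict(self, features: List[float]) -> str:
        # Return the mood whose centroid is closest in Euclidean distance.
        return min(
            self.centroids,
            key=lambda label: math.dist(features, self.centroids[label]),
        )
```

Because the model is trained only on one person's typing sessions and mood reports, it is user-specific in the sense the article describes: the same typing pattern could map to different moods for different users.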
Keymochi could help reduce miscommunications in texts that arise from misjudgments of the author’s intentions and emotional state, according to Cornell. The technology could eventually expand to provide more connected experiences based on people’s moods. For example, Keymochi could adjust the lighting or music in people’s homes based on its reading of their emotions.
While Keymochi is not yet commercially available, its creators are considering offering a software development kit (SDK) that developers could use to add the keyboard’s capabilities to their own apps.
Tyler Schulze is vice president, strategy & development at Veritone. He serves as general manager for developer partnerships, cognitive engine ecosystem, and media ingestion for the Veritone Artificial Intelligence Platform™ (VAIP). Learn more about our platform and join the Veritone developer ecosystem today.