Created by three Cornell Tech students, the Keymochi keyboard app detects users’ emotions, helping provide appropriate support at home and in customer service.
Using affective computing, the Cornell Tech team’s Keymochi emotion-predicting keyboard app is being developed for eventual commercial and healthcare use. A working prototype, trained on the development team’s own exaggerated emotional data, already achieves 82 percent predictive accuracy. The app works by monitoring a user’s typing speed, phone movements, punctuation, and the general sentiment of the message, whether text or email. Data is encrypted, and the app saves only the way a message was typed, not its content.
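The approach described above — turning typing speed, phone movement, and punctuation into numeric features while discarding the message text itself — can be sketched roughly as follows. The feature names, weights, and threshold here are illustrative assumptions, not Keymochi’s actual model:

```python
# Hypothetical sketch of emotion prediction from typing behavior.
# None of the names or numbers below come from Keymochi itself.
from dataclasses import dataclass

@dataclass
class TypingSample:
    text: str                  # used only to derive features, then discarded
    chars_per_second: float    # typing speed
    motion_magnitude: float    # assumed phone-accelerometer reading

def extract_features(sample: TypingSample) -> dict:
    """Reduce one message to numbers; the raw text never leaves the device."""
    length = max(len(sample.text), 1)
    punctuation = sum(sample.text.count(c) for c in "!?")
    return {
        "speed": sample.chars_per_second,
        "motion": sample.motion_magnitude,
        "punct_density": punctuation / length,
    }

def predict_mood(features: dict) -> str:
    """Toy linear rule standing in for the real machine-learning model."""
    score = (0.5 * features["speed"]
             + 2.0 * features["motion"]
             + 10.0 * features["punct_density"])
    return "agitated" if score > 4.0 else "calm"

sample = TypingSample("Where IS my order?!",
                      chars_per_second=6.2, motion_magnitude=0.8)
print(predict_mood(extract_features(sample)))  # fast, shaky, punctuated typing
```

A real system would replace the hand-set weights with a trained classifier, but the privacy property is the same: only the derived features, not the message content, are stored.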
Because even small changes in emotional state affect how someone types, the team hopes to use the app’s predictions to support mental healthcare and commercial customer service. Future development will focus on strengthening the app’s machine-learning algorithm, using the app’s analysis to adjust a person’s home environment (such as changing the lighting or playing a favorite song), and providing customer support personalized to the caller’s mood.
With so much of life lived online, a number of projects are working to improve the efficacy and efficiency of keyboards. There is an iPhone keyboard that lets sellers send images and links to products, and a connected LED keyboard that can be programmed for a range of tasks. What other chores or tasks could be sped up through connected keyboards?