University research project aims to highlight potential dangers of facial analysis technology.
Springwise has noted various examples of smart technology capable of analysing human behaviour, from recruitment software to advertisement monitoring. While such innovations can improve existing systems, another project reveals the potential harm in placing too much trust in this kind of technology.
Artist Lucy McRae, together with researchers from the University of Melbourne, has devised an interactive project, titled Biometric Mirror, to illustrate the potential implications of AI facial analysis. Participants step in front of a screen and receive a scan from AI software. The algorithm then identifies a range of characteristics. Some seem quite normal and harmless, such as gender, age and ethnicity. Others are far more subjective, like kindness, emotional stability, responsibility or attractiveness.
The software learnt to identify these subjective features from crowd-sourced ratings of people’s facial appearances, and it therefore inherits the biases of those who supplied the ratings. Niels Wouters, Digital Media Specialist at the University of Melbourne, explains, “the algorithm is correct…but the information that it feeds back is not accurate because it’s based on subjective information.”
Technology such as this is already being used for purposes such as personalised advertising campaigns or predicting outcomes. What Biometric Mirror attempts to expose are the potential implications of such technology being used to predict aggression levels, sexual preferences or other highly personal characteristics. Relying on potentially biased systems to influence decisions about security, employment or health could contribute to social discrimination. For example, recruitment software might identify a candidate as emotionally unstable and therefore not recommend them for a managerial role, leaving the individual no opportunity to contest that decision. Such ethical questions demand ever more consideration as we incorporate more and more technology into our lives.
The Biometric Mirror project is a collaboration between Lucy McRae, the University of Melbourne’s Microsoft Research Centre for Social Natural User Interface and Science Gallery Melbourne. The event opens at 10am on 25 September at the State Library Victoria and the Science Gallery, and runs until 30 September. You can stay up to date with the project through the Biometric Mirror newsletter.