New Delhi: Amid the debate over its controversial facial recognition technology, Microsoft has announced that it is limiting public access to several of its AI-powered facial analysis tools, including those that claim to infer emotional states and identity attributes such as gender, age, smile, facial hair, hair and makeup.
The tech giant said it will not provide an open-ended API (application programming interface) access to technology that can scan people’s faces and claims to infer their emotional state based on their facial expressions or movements.
The decision was made as part of Microsoft’s “Responsible AI Standard,” a framework to guide how it builds AI systems.
“AI is increasingly becoming a part of our lives, yet our laws are lagging behind. They have not caught up with the unique risks of AI or the needs of society,” said Natasha Crampton, Chief Responsible AI Officer at Microsoft.
“While we see signs that government action on AI is increasing, we also recognize our responsibility to act. We believe we need to work to ensure AI systems are accountable by design,” she said in a statement.
Microsoft also introduced similar restrictions to the Custom Neural Voice feature, which allows users to create AI voices based on recordings of real people.
Building on what Microsoft has learned from Custom Neural Voice, it will apply similar controls to its facial recognition services.
“After a transition period for existing customers, we are restricting access to these services to managed customers and partners, limiting use cases to predefined acceptable ones, and leveraging technical controls built into the services,” the company announced.
Microsoft said it will stop offering these features to new customers from June 21, while existing customers will lose access on June 30, 2023.
“As part of our work to align our ‘Azure Face’ service with the requirements of the Responsible AI Standard, we are also eliminating capabilities that infer emotional states and identity traits, such as gender, age, smile, facial hair, hair, and makeup,” the company added.
The company said it worked with internal and external researchers to understand the limitations and potential benefits of this technology and to weigh the tradeoffs.
“Particularly in the case of emotion classification, these efforts raised important questions about privacy,” said Sarah Bird, Principal Group Product Manager, Azure AI.