India Requires Approval for Release of ‘Unreliable’ AI Tools

In a bid to regulate the use of artificial intelligence (AI) tools, India has mandated that tech firms obtain governmental approval before releasing any AI systems deemed “unreliable” or still under trial. The directive, issued by India’s IT ministry, also requires that these tools be labeled accordingly, warning users of the potential for inaccuracies in query…

Read the full story at Cryptopolitan.