Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programmes can engage with internet users in casual conversation. Tay was designed to mimic the speech of a teenage girl, but it quickly learned to parrot the offensive language that Twitter users fed it. Microsoft was forced to take Tay offline just a day after its launch. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."

