
Taylor Swift 'tried to sue' Microsoft over racist chatbot Tay

BBC News | Tuesday, 10 September 2019
The singer's legal team claimed the Holocaust-denying AI's name implied a link with her.
News video (credit: Wochit News): Taylor Swift Reportedly Threatened to Sue Microsoft Over Racist Twitter Bot (00:32)

When an artificially intelligent chatbot that learned to talk from Twitter unsurprisingly turned into a bigot bot, Taylor Swift reportedly threatened legal action because the bot's name was Tay. Microsoft would probably rather forget the experiment, in which Twitter trolls took advantage of the...
