
Microsoft takes offensive bot 'Tay' offline - NZ Herald

Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET

HuffPost Tech on Twitter: "Microsoft's chat bot "Tay" went on a racist Twitter rampage within 24 hours of coming online https://t.co/znOQ7ubTBN https://t.co/THwECB7gna" / Twitter

Microsoft chatbot is taught to swear on Twitter - BBC News

Microsoft artificial intelligence 'chatbot' taken offline after trolls tricked it into becoming hateful, racist

Kotaku on Twitter: "Microsoft releases AI bot that immediately learns how to be racist and say horrible things https://t.co/onmBCysYGB https://t.co/0Py07nHhtQ" / Twitter

Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR

In 2016, Microsoft's Racist Chatbot Revealed the Dangers of Online Conversation - IEEE Spectrum

Microsoft: Twitter bot Tay - from hipster girl to Hitler bot - DER SPIEGEL

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft

Microsoft's racist robot: "Chatbot" taken offline as Tweets turn off-colour - YouTube

Microsoft launches an artificially intelligent profile on Twitter - it doesn't go according to plan - Mirror Online

Microsoft Nixes AI Bot for Racist Rant

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times

How Twitter taught a robot to hate - Vox

Microsoft apologizes for hijacked chatbot Tay's 'wildly inappropriate' tweets | TechCrunch

Requiem for Tay: Microsoft's AI Bot Gone Bad - The New Stack

Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours

Microsoft's Tay is an Example of Bad Design | by caroline sinders | Medium

Microsoft shuts down AI chatbot, Tay, after it turned into a Nazi - CBS News