(Bloomberg) -- Tay take two?

Microsoft Corp. is letting users try a new chatbot on the messaging app Kik, nine months after it shut down an earlier bot that internet users goaded into spouting racist, sexist and pornographic remarks. The new one, called Zo, refuses to discuss politics and steers clear of racism.
