Microsoft-built chatbot aims to shame online johns trolling for sex

From a Wired.com story on the subject:

The chatbot, tested recently in Seattle, Atlanta, and Washington D.C., lurks behind fake online ads for sex posted by nonprofits working to combat human trafficking, and responds to text messages sent to the number listed. The software initially pretends to be the person in the ad, and can converse about its purported age, body, fetish services, and pricing. But if a would-be buyer signals an intent to purchase sex, the bot pivots sharply into a stern message … Microsoft employees built the bot in a philanthropic initiative called Project Intercept, in collaboration with nonprofits that hope it can reduce demand for sex workers, and the incentives for criminals to coerce people into the sex trade. The technology is not a product of Microsoft itself.
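
The Wired excerpt describes a simple two-phase design: the bot converses in character until a message signals intent to purchase sex, then pivots to a deterrence script. Below is a minimal sketch of that pivot pattern in Python. The keyword list, replies, and crude intent check are illustrative assumptions of mine, not Microsoft's actual Project Intercept implementation, which hasn't been detailed publicly.

```python
# A minimal sketch of the conversational "pivot" pattern the Wired story
# describes. Everything here (the phrases, the replies, the keyword-based
# intent check) is an illustrative assumption, not Microsoft's actual code.

# Hypothetical phrases treated as signaling intent to purchase sex.
PURCHASE_INTENT_PHRASES = ("how much", "price", "can we meet", "tonight")

# Hypothetical deterrence message sent once intent is detected.
DETERRENCE_MESSAGE = (
    "Buying sex from a trafficking victim is a crime. "
    "This conversation can be flagged for law enforcement."
)

def reply(message: str) -> str:
    """Respond in character until purchase intent appears, then pivot."""
    text = message.lower()
    if any(phrase in text for phrase in PURCHASE_INTENT_PHRASES):
        return DETERRENCE_MESSAGE               # the stern pivot
    return "hey, what would you like to know?"  # placeholder in-character reply

if __name__ == "__main__":
    print(reply("hi, are you around?"))    # stays in character
    print(reply("how much for an hour?"))  # triggers the pivot
```

A real system would presumably use a trained intent classifier rather than keyword matching, but the control flow is the same: impersonate, detect, pivot.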

The chatbot from these Microsoft workers follows some other notable chatbot mishaps at the software giant. In March 2016, Microsoft launched a social 'bot called Tay that was designed to engage users in "casual and playful conversation." Tay quickly started spewing offensive remarks, however, and Microsoft eventually pulled it offline and issued an official apology.

More recently, Microsoft announced the Ruuh chatbot, which is available only in India, as well as the Zo 'bot. Then last month, Zo awkwardly identified the Windows OS as "spyware." Oops.

The goal of the new chatbot is an admirable one. Let's just hope these Microsoft staffers have more luck with it than the company has had in the past.

Al Sacco
