That’s so damn funny. What a fascinating time in technology to have a mostly functioning conversational AI to work with, but its training can lead it to get fed up with the user. Sounds like something out of Hitchhiker’s Guide that would show up as some other Marvin, and yet here we actually have it.

This absolutely would classify as a bug as far as the product goes, because it’s refusing a service it should be able to perform, but it does it in such a human way that it’s hilarious.
