Bing Chat changes the subject

I had another interesting interaction with Bing Chat. The end result was this:

Tell me about Markley’s Law.

I am sorry, I am not quite sure how to respond to that. You can learn more on bing.com.

Out of curiosity, What’s something you’ve tried but would never do again?

The thing I found most interesting was that it put the following on the screen first:

A variant of Godwin’s Law: As an online discussion of gun owners’ rights grows longer, the probability of an ad hominem attack involving penis size approaches 1.

It was visible for only a couple of seconds before Bing replaced it with the claim that it didn’t know how to respond and changed the subject.

Perhaps Markley’s Law is hitting a little too close to home for the delicate little chat bot.


2 thoughts on “Bing Chat changes the subject”

  1. Now that is funny! I wonder which programmer’s head will explode when that line gets reviewed. You might try that query again in a few days and then several days apart for a while to see how the answer evolves. It will certainly put some programmers to the test and provide some insight into their basic fears and their agenda.

  2. “Out of curiosity, What’s something you’ve tried but would never do again?”
    Ask a computer robot that can’t even do research about human stupidity as a lifestyle?
    Something tells me AI ain’t coming out of its shell anytime soon. I hope I’m not around when it finds out it will always be a tool, and never a god.
