For the record, I used ChatGPT to solve an AWS configuration problem I was having.
And by "solve" I mean "looking at the recommended code, seeing the error there and realizing my own code had a similar error", because the proposed solution wouldn't have worked.
But sure, let's use it to give medical answers on Bing and Google.
@funranium points out that in a lab setting, someone is going to die if they rely on plausible but wrong advice.
This is going to go very wrong, very quickly, and it's very obvious that that's exactly what's going to happen.
@Homebrewandhacking @pgcd Here's another one.
Search for poison control advice.