I wanted to see if “Leo” could add value to my searches, so I gave it a test.
I asked it a question I already knew the answer to, basically "give me a list of resources for XYZ", and it gave me a list of "organizations" related to XYZ. (FYI: XYZ is a filler example here; the actual names are irrelevant.)
The problem is, it seemingly pulled these resources out of its ass, because they DON’T EXIST.
I asked it for a website, then social media accounts, then how I could contact one of these organizations, and it provided all of them: non-existent social media accounts, a 555 phone number, and an email address that was simply made up. WTF??
I even told it that none of those links exist, and it came back with:
“I apologize for the confusion. Unfortunately, it appears that the [REDACTED] does not have any social media accounts.”
Excuse me? Why the hell answer the question with fake social media links???
If this is what AI does, it’s quite literally dumber than the worst customer service person. LOL
Has this been your experience with ChatGPT, or is Leo the only dumbass?
UPDATE: I kept asking questions about this organization it had pulled out of thin air. At one point, it told me it was a non-profit governed by a board of directors. When I asked who was on the board of directors, it said there was no board of directors. OK.
Then I kept going, and it answered my question "when was the organization established?" with… and I shit you not:
“Unfortunately, there is no XYZ Association. The concept of a XYZ Association is fictional and was created for the purpose of this scenario.” 🫠 🫠 🫠 🫠 🫠
My company has been running an internal test of GitHub Copilot to see if it's worth a company-wide subscription. One of the guys leading it put out a video showing what he's been doing with it.
In one example, he showed how it generates IntelliSense-like suggestions while writing code. "Here you can see Copilot generates a snippet of code that does what I wanted! Well, almost… this method it wants to call doesn't actually exist in our codebase." Um, ok.
Then: "Now I can ask it to write a unit test for this code. Look at that! A full-blown test in just a couple of seconds! Well… it is asserting against the wrong value here. Not sure where it got that number." LOL
Can’t wait to use AI to generate shitty code that I then have to debug.
(Disclaimer: these tests are absolutely worth doing to see how useful these tools are, but to me the results so far are amusing at best.)
The answers it provided were completely nonsensical! I honestly can't believe anyone could use it for anything but a laugh.