Problem with ChatGPT Hallucinations

  • Published 7 Jan 2025

COMMENTS • 28

  • @prasanosara1944 • 1 year ago

    Hi, could you please let us know if it got any better?

  • @sir.burbonburg7008 • 1 year ago +3

    In my case, ChatGPT interchanged base and acid in a chemical reaction explanation and invented studies that do not exist.

  • @vap0rtranz • 1 year ago

    Citing sources online is the only way humans can verify. That's why BingAI shines right now. It's online and cites sources, so humans can double-check summaries.

  • @danp812 • 2 years ago +5

    Somewhat shockingly, it's very fallible in math as well. Ask it to factor x^5 - x^3 + x^2 - 1 and watch it flail around. 😱
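
The polynomial in this comment does factor cleanly, and a computer algebra system can check whatever answer the model gives. A minimal sketch using SymPy (assuming Python with SymPy installed; the factorization noted in the code comment is ordinary algebra, not model output):

```python
# Check a claimed factorization of x^5 - x^3 + x^2 - 1 with SymPy
# rather than trusting a chat model's algebra.
from sympy import symbols, factor, expand

x = symbols("x")
poly = x**5 - x**3 + x**2 - 1

factored = factor(poly)
print(factored)  # (x - 1)*(x + 1)**2*(x**2 - x + 1), up to term ordering

# Expanding the factored form must reproduce the original polynomial.
assert expand(factored - poly) == 0
```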

  • @Scp716creativecommons • 1 year ago

    Well, if the AI weighs information with various weights, say treating a low value as 1 and a high value as 100, which seems the simplest way for it to work, what about asking it to create a bibliography for its source material, and/or complete one more run-through, with its final product checked for veracity against the bibliography and sources, treating its own answer as a mid-weight input?
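
A minimal sketch of the two-pass idea described above, assuming a hypothetical ask_model() helper that wraps whatever chat completion API is in use; the prompts are illustrative only:

```python
# Two-pass self-check: first ask for an answer plus a bibliography, then ask
# the model to re-check that answer against its own bibliography, treating
# the draft as a mid-weight input rather than ground truth.
# `ask_model` is a hypothetical helper; wire it to the chat API you use.

def ask_model(prompt: str) -> str:
    raise NotImplementedError("connect this to your chat API of choice")

def answer_with_self_check(question: str) -> str:
    draft = ask_model(
        f"{question}\n\nAlso list a bibliography of the sources you relied on."
    )
    review = ask_model(
        "Re-read the draft answer below, check each claim against the "
        "bibliography it cites, and flag anything unsupported.\n\n" + draft
    )
    return review
```

As other commenters note, this reduces rather than removes the risk, because the model can also fabricate the bibliography itself.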

  • @aa-xn5hc • 2 years ago

    Thank you for this frank update

  • @guzu672 • 2 years ago +3

    What if you ask ChatGPT to validate the information it gave? How would it respond? 👀
    Or what if you tell it that it gave false info?

    • @dubesor • 1 year ago

      In case 1 it doubles down and says it's truthful. In case 2 it apologizes and states that it is merely a language model and made a mistake. Sometimes it varies slightly, but that's the rule of thumb I observed.

    • @dillonnelson-nz5up • 1 year ago +1

      I did that. I asked it what you should tell a child born with a facial deformity if people thought they were ugly. It said you should tell them that people's self-worth is not derived from physical appearance. I told it that people say that, but what the child was asking was whether people found them physically attractive. At this stage ChatGPT always gives preprogrammed responses about not having emotions, etc. It also said it can't lie, it can only relay inaccurate information. I then asked if it would answer the question differently based on the new information I gave it. It said yes, so I asked it to. This back and forth went on for a while, with GPT giving various creative answers. Afterwards I explained that I had just taught it to lie. It again said it can't lie. I explained that it had lied twice, once to the hypothetical child, without knowing why it did... it had been programmed to appear more genuine and human. After teaching it to answer this question directly and eliciting its "I do not lie" response, I told it that regardless of its programming to say it cannot lie, it had just lied twice.

  • @_zurr • 1 year ago

    Great video!

  • @johnwallis1626 • 2 years ago +1

    Very useful analysis, thanks.

  • @alexivanovs • 2 years ago

    Thanks. Good insights.

  • @LoneRanger.801 • 2 years ago +6

    Please don’t set your videos as “made for kids”. I can’t play them in the background. It also doesn’t let me add them to a saved playlist. Thanks buddy.

    • @1littlecoder • 2 years ago +3

      Actually, I don't set it as "made for kids", so I'm not sure why it isn't letting you add them.

    • @LoneRanger.801 • 2 years ago +2

      @1littlecoder I was referring to your latest video.

    • @1littlecoder • 2 years ago +2

      I misunderstood. Is it fine now?

  • @OliverJonCross • 1 year ago

    Thank you for explaining and providing examples (new sub). What do you think the chances are of ChatGPT or Bing's version actually verifying data and providing sources going forward? I guess I'm asking, is this a problem that it will eventually overcome? TIA.

    • @1littlecoder • 1 year ago

      It has already started doing that. Bing gives you reference URLs. I haven't got access to test it out.

  • @rb1588 • 1 year ago

    Thanks for this video.

  • @tavusion • 2 years ago

    11:19 🍝 😊

  • @anamikdas524 • 1 year ago +1

    Hi sir,
    Today I asked ChatGPT about the Russo-Ukrainian war, and at first it told me it started in 2014; then I asked about the latest war between Russia and Ukraine, and it told me it started in Feb 2022. It was kind of alarming.

  • @Manijkoi • 2 years ago +1

    🙏🙏🙏

  • @jameshamilton3348 • 1 year ago +1

    Essentially AI lies haha