• sircac@lemmy.world
    6 days ago

An LLM can also “reveal” that water ice melts into maple syrup, given the right prompts. If people can already lie (consciously or not) in proportion to their biases, I don’t understand why anybody would treat LLM output as fact…

    • Someone8765210932@lemmy.world
      6 days ago

I agree, but in this case I don’t think it really matters whether it is true. Either way, it is hilarious. If it is false, it shows how bad AI hallucination is and the sorry state of AI.

Should the authors who publish this mention how likely it is that this is all just a hallucination? Sure, but Musk is such a big spreader of misinformation that he shouldn’t get any protection from it.

      Btw. Many people are saying that Elon Musk has (had?) a small PP and a botched PP surgery.

      • bstix@feddit.dk
        6 days ago

It’s usually possible to ask the AI for its sources. A proper journalist should always question the validity of their sources.

        Unfortunately, journalism is dead. This is just someone writing funny clickbait, but it’s quite ironic how they use AI to discredit AI.

        It makes sense for a journalist to discredit AI because AI took their jobs. This is just not the way to do it, because AI is also better at writing clickbait.

        • Petter1@lemm.ee
          5 days ago

If an AI isn’t in web search mode, it will just invent the most likely answer to a question about its sources. Chances are very high that such sources don’t even exist.

          • bstix@feddit.dk
            5 days ago

            That’s why you ask for the sources, so you can check them.

            I think this kind of prompting is an important part of how to use it in any meaningful manner.

You can also input your own sources and ask it to use only those. For instance, upload a PDF of a law and ask it to figure out how to do something legally, then have it show where in the law it says so. You’ll obviously still need to check that the law actually says that and that it isn’t hallucinating.
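That last verification step can even be partially automated. A minimal sketch (hypothetical helper and sample text, assuming you prompt the model to return verbatim quotes alongside its answer) that checks whether each quoted passage actually appears in the uploaded source text:

```python
import re

def verify_quotes(source_text: str, quotes: list[str]) -> dict[str, bool]:
    """Check whether each quote the model cited actually appears in the source.

    Whitespace is normalized, since PDF extraction often breaks lines
    mid-sentence; matching is case-insensitive.
    """
    normalized_source = re.sub(r"\s+", " ", source_text).lower()
    results = {}
    for quote in quotes:
        normalized_quote = re.sub(r"\s+", " ", quote).strip().lower()
        results[quote] = normalized_quote in normalized_source
    return results

# Hypothetical example: a snippet of "law" text and two quotes a model
# claimed to cite from it.
law_text = """Section 3. A permit is required
for any structure taller than four meters."""

claimed_quotes = [
    "A permit is required for any structure taller than four meters.",
    "Structures under two meters are always exempt.",  # hallucinated
]

checked = verify_quotes(law_text, claimed_quotes)
```

This only catches fabricated verbatim quotes, not paraphrases that twist the meaning, so reading the cited passage in context is still on you.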