Abspeckgeflüster – Forum für Menschen mit Gewicht(ung)

Free. Ad-free. Human. Your weight-loss forum.
FOUND IT

82 posts · 66 commenters · 0 views
• adrianriskin@kolektiva.social

    @lokeloski

    It's the Gell-Mann amnesia effect all over again.

    -----------
    The Gell-Mann amnesia effect is a claimed cognitive bias describing the tendency of individuals to critically assess media reports in a domain they are knowledgeable about, yet continue to trust reporting in other areas despite recognizing similar potential inaccuracies.

    https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect

at1st@mstdn.ca
    #32

    @AdrianRiskin @lokeloski It's one of the reasons that, when pointing out that I do not like Generative LLMs for the work they output, I do emphasize that it's not just *my* programming expertise that I feel this for.

Like, I feel the same way about books: if you wrote it with an LLM, and we can see that because a prompt made it into the printed version, that tells me you did not read what you claimed to have "written" with an LLM. Why should I read it then, when I know it can do the same thing it does for math, or coding, or images?

geeeero@mastodon.social
      #33

@lokeloski Very well put. To me, this is similar to the Gell-Mann amnesia effect, where for subjects we have deep knowledge about, we see all the flaws in media reports, but tend to assume that for all other subjects, the media reports are basically fine. @davidgerard

      https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect?wprov=sfla1

• gkrnours@mastodon.gamedev.place

@lokeloski Devs seem to be the only ones thinking they can replace their own job with AI and everything will be fine

mathieugenois@fediscience.org
        #34

        @gkrnours
Some mathematicians are also on this "let's automate our own job" path…
        @lokeloski

bovaz@misskey.social
          #35
          @lokeloski@mastodon.social I just shared this at work.
          With some of the people pushing for AI integration everywhere.
bartholin@fops.cloud
            #36
            @lokeloski or programmers asking LLMs to generate code for them, because they cannot code
pi_rat@shitposter.world
              #37
@lokeloski rediscovering https://en.wikipedia.org/wiki/Michael_Crichton#Gell-Mann_amnesia_effect all over again, are we
arnotron@noc.social
                #38

                @lokeloski I call that Mount Stupid

davidgerard@circumstances.run
                  #39

                  @geeeero @lokeloski important to note the Gell-Mann effect is made up trash. It's literally something Crichton said once. So imagine how cognitive psychologists feel about it.

woo@fosstodon.org
                    #40

                    @lokeloski Fortunately for AI pushers, most people are ignorant about most things. Optimistically, the Inverse 80/20 rule applies.

xerge@mastodon.nl
                      #41

                      @lokeloski I’ve seen this attitude even in some highly skilled people.

The idea being that what they're doing is obviously complex and requires deep knowledge and skills, but the work that others are doing is obviously trivial. Very surprising.

                      It’s not uncommon for undergraduates to assume some field is easy, because the introductory course they had on it was, but for accomplished professors to have similar ideas about fields outside of their expertise? Why? Is there a psychologist in the house?

beemoh@mastodonapp.uk
                        #42

@lokeloski An extra step to this I saw elsewhere: "People think it can do everything except the things they personally are competent to do. Which is why the C-suite thinks it can do everything."

jigmedatse@social.openpsychology.net
                          #43

@lokeloski@mastodon.social Why do I always find it at best questionable for any field I look at? Like, "yeah, that kind of feels like it's maybe decent, but I'd have to check to see if it's actually stupid..." Ah well, because it's always stupid when I have the slightest bit of a clue.

wolf@helvede.net
                            #44

                            @lokeloski The strange thing about AI is that it generates great answers to everything I don't know much about, yet in my field of expertise it seems to be incredibly dumb.

• denofearth@mas.to

                              @lokeloski
                              I recently went to an opera where the composer was not only present but also performing as one of the soloists, among five other vocalists, along with a men's choir, accompanied by a full orchestra.

                              The backdrop to this rich contribution to human musical art was AI visuals projected onto a screen.

shaulaevans@zirk.us
                              #45

                              @DenOfEarth @lokeloski 😬

nerthos@shitposter.world
                                #46
                                @lokeloski So what I'm getting from that post is "generative AI is like an indian, but it takes a lot of electricity instead of shitting on the street"
• distractal@hachyderm.io

                                  @lokeloski Really kind of gets back to cultural acceptance of the idea of "unskilled labor", labor as just something that can be swapped out and dehumanized, merely a resource, a tool, not a high-context manifestation of human effort.

remove_huilo@mas.to
                                  #47

                                  @distractal @lokeloski we need to call CEO-level jobs unskilled labor for precisely the reasons you just listed.

drhyde@fosstodon.org
                                    #48

                                    @lokeloski my biggest takeaway from this is that YOU CAN BE A COMICS PROFESSOR?!?!!?!?

• mynameistillian@plush.city

                                      @lokeloski the lack of artist solidarity stings here...those people think everyone else but them can be replaced by an LLM...foolish creatures

deborahh@cosocial.ca
                                      #49

@mynameistillian @lokeloski ah, I see it now: *this* is at the root of why mandated AI use is so corrosive. Someone up the hierarchy, not understanding the complexity of the work of their subordinates, thinks they are replaceable by the machine. Hmm. I need to think on this.

• philwill@aus.social

                                        @ratsnakegames @lokeloski
                                        Absolutely, what we do not know intimately we make assumptions about...
                                        They 'just' do their thing, how could it possibly be as important, complex and difficult as the work that I am doing?

deborahh@cosocial.ca
                                        #50

                                        @PhilWill @ratsnakegames @lokeloski

ah, I see it now: *this* is at the root of why mandated AI use is so corrosive. Someone up the hierarchy, not understanding the complexity of the work of their subordinates, declares they are replaceable by the machine.

                                        Hmm. I need to think on this.

toerror@mastodon.gamedev.place
                                          #51

@lokeloski In both cases it sounds like the advice is that you can't use it for things you are going to be marked on; the justification given is, of course, wrong.

                                          Copyright (c) 2025 abSpecktrum (@abspecklog@fedimonster.de)

Built with insomnia, coffee, broccoli & ♥
