This blogpost makes an astoundingly good case about LLMs I hadn't considered before.

27 posts · 19 commenters · 0 views

cwebber@social.coop

This blogpost makes an astoundingly good case about LLMs I hadn't considered before. The collapse of public forums (like Stack Overflow) for programming answers coincides directly with the rise of programmers asking for answers from chatbots *directly*. Those debugging sessions become part of a training set that now *only private LLM corporations have access to*. This is something that "open models" seemingly can't easily fight. https://michiel.buddingh.eu/enclosure-feedback-loop

dave@rascalking.com #8

@cwebber Dear god, I knew SO was taking a beating, but that chart the post linked to was _bleak_.

mayintoronto@beige.party

@cwebber Sure didn't help that Stack Overflow was killing itself too.

shom@gts.shom.dev #9

@mayintoronto @cwebber
True, that decline was on its way. But at least the information was publicly available.
This is also why I was ranting about it two years ago when people were encouraging others to delete their Stack Overflow contributions (as if that meant the AI companies wouldn't have access to it; SO made a deal for direct sharing rather than scraping). You're just denying it to other human beings at this point. Build a better web, post on personal blogs and community forums, don't lock up knowledge in opaque systems!

r343l@freeradical.zone #10

@cwebber Extra painful: bigger tech companies can afford to pay for plans that limit the use of context data to train future versions of an LLM service's models, so THEIR work is "protected" while their employees consume the commons. But smaller companies and individual users will be giving up their data.

dandean@indieweb.social #11

@cwebber enclosure strikes again

tseitr@mastodon.sdf.org #12

@cwebber This article has some good predictions about the knowledge silos LLMs might be able to create, but it does not address the fact that the business model is not profitable. When the price hikes hit to make LLMs generate profits, people will have to balance the price of the subscription against the real value provided... just like any purchase. We might lose a few years of knowledge down the LLM silos before the collapse, but personally I think that is OK; we have plenty of well-documented 1/2

tseitr@mastodon.sdf.org #13

@cwebber tech from 2010 to 2023 that will still be fully usable. People choosing to use said tech will decrease costs and might fast-forward the downfall of the "LLM everywhere" trend we are currently in.

jens@social.finkhaeuser.de #14

@cwebber I keep repeating this in such contexts, apologies if I sound like a broken record: AI is a fascist project.

The purpose isn't merely enclosure of the commons. Making public stuff private is more of a means to an end.

There are centuries of historic precedent showing that when a state has natural resources, it needs fewer people to extract wealth from them and so pay for what keeps rulers in power.

If a state has fewer resources, it has to rely on a large, healthy population's...

jens@social.finkhaeuser.de #15

@cwebber ... labour. A population that needs to be educated and mobile enough to fulfil its task. Such a population tends to demand more say in the affairs of state.

So natural resources lead to tyrannies, and the lack thereof to democracies.

Privatisation of knowledge is a way of creating artificial resources that can be extracted with fewer labourers. Plus, the more that extraction is automated, the smaller the population a ruler needs - or the more precarious its existence.

That is the goal.

svelmoe@hachyderm.io #16

@cwebber
One thing I've noticed along the same lines is that I talk much less about code with colleagues now and have far fewer interactions with them for working through problems, which limits my exposure to alternative ways of solving them.

Whenever I want to discuss a problem, it more often than not boils down to some LLM answer, meaning I might as well cut out the middleman and ask the LLM itself if all I get are LLM answers anyway.
That truly sucks.
"Have you asked Claude/Copilot/ChatGPT?"...

mahadevank@mastodon.social #17

@cwebber Of course - it was never about the LLM, it was about crowd-sourcing intelligence at epic scale. Every piece of code a developer writes and fixes becomes training data. Same with every conversation. I'm surprised people don't see the danger in having one single overlord and gatekeeper of all the information in the world. It's crazy.

People seem to have forgotten what democracy and multilateralism really mean.

dianshuo@mstdn.io #18

@cwebber This isn't really new. For all the things on Stack Overflow, there was a huge domain of knowledge that was just not on there.

For most of my corporate developer life, the knowledge and bug fixes would not be found on public forums but in internal collective knowledge, documentation, or simply knowing a person in the same field. Most of this was not in the public domain.

The biggest issue now is that those firms commit their group knowledge to LLMs, and we will not get it back.

michiel@social.tchncs.de #19

@cwebber you calling it an 'astoundingly good case' makes me feel insightful in a way no LLM has been able to accomplish. I'm going to be insufferably smug for the rest of the day 🙂

vfrmedia@social.tchncs.de

@cwebber Already the forums for VoIP software and embedded stuff (Arduino etc.) are fully enshittified (they were toxic enough pre-AI), and folk have simply stopped contributing there (I think this happened just *before* LLMs became popular, so the quality of whatever does go into the training sets isn't going to be much good).

I suspect another factor is that when people are getting paid for their work *and* depend on their employers upselling SaaS or other commercial services, they are less inclined to share stuff with the competition (I had to figure out my PJSIP trunk and how to secure it for myself; most of my findings are tooted here on Fedi, as I'm not even sure where else to put them).

lfzz@mastodon.social #20

@vfrmedia Make a small static site or blog to share your experience? Put something like Anubis in front of it to stop/slow scraping.

kats@chaosfem.tw #21

@jens @cwebber Thus, our only effective defense is mass co-operation among peers, in a way that's resistant to further disruption by said fascists.

We really are in trouble, aren't we?

futzle@old.mermaid.town

@cwebber The number of times someone would DM me on a forum asking me for private help, and I would always tell them to ask on the public forum so that everyone else could benefit … and they never did.

The “fuck the community, I've got mine” attitude is stronger than ever.

bornach@fosstodon.org #22

@futzle @cwebber
When newbies encounter toxicity for asking their question on a public forum, you can't really blame them for turning to an LLM.
https://youtu.be/N7v0yvdkIHg

futzle@old.mermaid.town #23

@bornach @cwebber Yeah, that was not the case on the forum I was referring to, but Stack Overflow dropped the ball by not moderating that kind of sanctimony.

reneschwietzke@foojay.social #24

@cwebber I agree. There is less public information for future training for anyone, as well as more and more similar code, due to more AI-written software also appearing in the open space. I expect an input standstill until someone invents non-LLM AI for coding.

mhartle@mastodon.online #25

@cwebber Well, people went to Stack Overflow with a question and looked forward to answers based on the experience of others. While one can still ask an LLM and give its provider a rubber-duck training session, I still fail to see the influx of answers based on experience.

leeloo@chaosfem.tw #26

@cwebber
Wait, if AI caused the collapse of wrong-answers-only sites like Stack Overflow, doesn't that mean it has positive uses?

martijn@scholar.social #27

@cwebber But also, as uninviting as the Stack Overflow culture may have been, the moderators were there to try to get people to ask better questions. I doubt LLMs will handle things like X/Y-problem issues, so to me it seems things will get worse for people able/willing to pay as well.