I feel like people who are using LLMs for their work are not considering the motives and impact of the corporations hosting and serving the model. The goal isn't to make you a better developer, or even a more productive developer; it's to make you dependent on their service and then start driving up the rents to access it.
The goal is to extract your time, your money, and your knowledge by feeding these models your plans for the work you want to achieve. They are not benevolent.
-
@davesh that's true, but then you should stop using most cloud providers too; most just want to gate your data and rent it back to you at 10x the cost
-
@cursedsql Yes, this is true too. That should be the goal of most people and organizations.
The industry capture by these companies is ridiculous, and a future where there are alternatives to the rent-seeking tech landlords is becoming harder and harder to see.
-
@davesh I don't disagree with you, and it is one of my many reasons for refusing to use these tools. But to be fair, this is true of a lot of the tools we use because they are convenient and seem to have benefits; it is not exclusive to LLMs.
The original motivation to create GitHub may have been benevolent, but since the MS acquisition it has been optimized to be an extraction tool as well. It has just now become really obvious.
-
@davesh I’m hoping workers en masse will ask their employers / IT departments to disable it. A polite no.
-
@davesh it is astounding to me that there are people who pay hundreds of dollars a month to RLHF an LLM. My brother, they should be paying YOU to do that.
-
@davesh None of our vendors are benevolent, so *that's* certainly not the standard anyone expects.
-
@chris__martin haha very good point. I just see a lot of people talk about their use of Claude as though it is some benevolent tool to _help_ them.
-
This is why I like to run things locally at home. Free from any service.
-
@davesh The lack of any thought put by technologists into the ethical consequences of using a particular technology is…breathtaking. Apparently the term "UX" was a joke, because the experience of real human beings be damned.
-
@davesh And then again: I'd like to offer the on-prem models, and point out the need to review what the LLM has created after the fact for any production-quality result.
They're not going to go away, so we might as well make the best of them.