Is anyone else experiencing this thing where your fellow senior engineers seem to be lobotomised by AI?
I've had four different senior engineers in the last week come up with absolutely insane changes or code that AI told them to make. Things that, if you used your brain for a few minutes, you'd realise just don't work.
They also can rarely explain why they made these changes or what the code actually does.
I feel like I'm absolutely going insane, and it also means I can't trust anyone's answers or analyses, because I /know/ there is a high chance they just asked AI and passed the output off as their own.
I think the effect AI has had on our industry's knowledge is really significant, and it's honestly very scary.
-
To be honest, the process of making things sometimes frustrates me, but I don't see how you could be proud of something that was just hastily thrown together by a chatbot!
I'm proud of the things I've made, because I've put my own brain to use to create something I had in mind. I thought it worked this way for others too, but like you say, I think we might be in the minority here.
-
@Purple @kitcat To say nothing of when you've faced a problem, figured out a fix, and can *actually explain why that fix is correct* and *apply the same reasoning in other situations*. Not just the same fix, but the same *reasoning*.
Maybe I'm old-fashioned like that, but actually having figured something out brings me joy. Even if it's stuff that lots of other people know. Learning how the pieces fit together to produce the result I get out of the thing I made.
-
@Purple yea i know a team lead who just had to fire one guy like that, who wouldn't stop trying to push broken ai slop despite it being very obvious. it's legit a cult that breaks people's brains
-
@Purple As someone who basically refuses to use AI for anything, I'm likely unaware of just how much its use has pervaded people's working behaviors.
My understanding is that AI and LLMs are meant to be a tool. As with any tool, used properly it can be helpful and effective in completing a task. But there's a reason people are taught to do things manually before they're allowed to use tools: they need the intuition that tells them whether something seems correct or not.
What you're describing seems to be an outsourcing of critical thinking to these tools. It's like putting someone in the driver's seat of a self-driving car and then watching them panic when they have to take manual control.
-
@Purple I think people are burned out and just find AI an easier way to keep earning money while doing less.
source: me, coding makes me wanna kms
-
@Purple I think there would be fewer stories like these if the industry focused more on just programming rather than on overengineered, cringe solutions built to suck off investors. Then people could do things that are actually interesting and have a real, positive impact, instead of the stress and pressure making engineers hate their hobby.
-
@Purple Using AI is just like managing a fairly incompetent junior. Welcome to management. This is the fear, dread, and uncertainty you feel for the first couple of years after somebody puts you in charge of a software team, where productivity can't be managed and the results are not really within your control. How long do you wait before firing your AI assistant and looking for another?
-
@Purple I think it’s more exposing than changing.
AI coding won’t make you able to do something you can’t already do; it can only give you code that you still have to understand and verify.
It can’t replace someone with knowledge, but used sensibly it can augment some parts of the job, like boring refactoring and generating simple boilerplate.
-
@Purple my colleague wrote a question in one of our threads the other day and then, not 10 seconds later, wrote "wait, that was a really lazy question... I can figure that out myself".

He uses chatbots a lot, and I think he had hit a question at the intersection of "company-specific knowledge a chatbot doesn't know" and "question he would usually ask a chatbot out of laziness".
I had already answered, because I had the thread open when he asked and knew the answer offhand, but I saved him maybe a minute of work, so it was indeed rather overkill to disturb someone else for that.
-
I think AI is incredibly weak at actually understanding problems of any moderate complexity.
People forget it's a language model that is just very advanced at predicting what a likely, or seemingly sensible, answer to a prompt would look like. That's inherent to its design; they try to overcome it by having it write its own prompt ("thinking mode"), but even then it still doesn't grasp the full idea.
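For anyone curious what "predicting a likely answer" means in practice, here's a minimal illustrative sketch of greedy next-token generation. It assumes the Hugging Face transformers library and the small "gpt2" model purely as examples, not anything a particular team actually runs:

```
# Toy sketch: an LLM only ever extends text with whichever token it
# scores as most likely to come next. Assumes `transformers` and
# `torch` are installed; "gpt2" is just a small example model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The fix for this bug is", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):
        logits = model(ids).logits        # scores for every possible next token
        next_id = logits[0, -1].argmax()  # greedily pick the single most likely one
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
```

Every step is just "append the most plausible-looking next token"; a "thinking mode" wraps extra generated text around the same loop, it doesn't change the underlying mechanism.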
-
@Purple
yes, and I got fired shortly after, probably because I was not "productive" enough compared to them