He should've asked ChatGPT instead. That's how "modern developers" seem to get by.
I’m not a developer and I don’t work in a technology field anymore, but I used to. I have a background in Linux sysadmin work and security, and I know a very small amount of Python.
ChatGPT has allowed me to “write” code that I use every day in my business. So, I’m not a developer, but it lets me do things I otherwise would not be able to do.
My business is still too small to even consider hiring a developer, so it’s allowing me to grow.
I’m just writing this to point out that “devs” are not the only people using ChatGPT to write code.
ChatGPT and other LLMs are fantastic technical task assistants, but, and this is a big but, you need to treat their work the same way you'd treat work from a new intern: verify the output before you trust it.
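To make that concrete: the cheapest form of "verify" is a couple of tests against whatever the model hands you, before it touches anything real. A minimal sketch in Python; `slugify` here is a made-up stand-in for LLM output, not anything from this thread:

```python
import re

def slugify(title: str) -> str:
    """Hypothetical LLM-generated helper: make a URL-safe slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# Intern-style review: don't trust it until it passes a few checks.
def test_basic():
    assert slugify("Hello, World!") == "hello-world"

def test_empty_and_symbols():
    assert slugify("") == ""
    assert slugify("!!!") == ""

if __name__ == "__main__":
    test_basic()
    test_empty_and_symbols()
    print("all checks passed")
```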
It’s just making some front-end stuff for other people to use to access PDFs on my server that need some level of protection and access control.
So, it’s been pretty easy to verify.
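The shape of it is roughly this (a minimal sketch, assuming a Flask-style setup; the folder, route, and API-key check are hypothetical stand-ins, not the actual code):

```python
from pathlib import Path
from flask import Flask, abort, request, send_file

app = Flask(__name__)
PDF_DIR = Path("/srv/protected-pdfs")  # hypothetical location

def is_authorized(req) -> bool:
    # Stand-in check; a real deployment would verify a session or token.
    return req.headers.get("X-Api-Key") == "change-me"

@app.route("/pdf/<name>")
def serve_pdf(name: str):
    if not is_authorized(request):
        abort(403)
    # Resolve the path and refuse anything that escapes PDF_DIR
    # (e.g. "../" tricks) or isn't a PDF.
    target = (PDF_DIR / name).resolve()
    if PDF_DIR.resolve() not in target.parents or target.suffix != ".pdf":
        abort(404)
    if not target.is_file():
        abort(404)
    return send_file(target)
```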
I’m too paranoid about trusting it or even myself to write code that could have irreversible effects.
Thanks for the advice🙏
Could just go full Luddite and pull his tooth by tying string to it like developers who refuse to use AI.
Not using the nonsense machine doesn't mean we have to abandon the whole rest of the toolbox.
Jumping on the shiny new thing and relying on it over all the other tried-and-tested tools is the core recurring mistake in development.
What? This fantastical scenario has never happened. Name one other new development tool that has led to the sort of issues you seem to think will happen. Debuggers? Auto-complete? Syntax highlighting? Better build tooling?
Hey man, maybe re-read my comment. It's not long.
I never said a single tool causes issues. I said abandoning existing tools to only use the new thing is a problem.
See people who want to only use the newest frameworks, to the point of rebuilding projects whenever a new one comes out.
See people who fixate on a single design pattern and insist on using it in every application.
And I said: when the hell has that ever happened? Ever?
I'm talking about development tools not platforms and libraries. An LLM is not replacing a framework. It's not replacing... Anything really.
So googling how to do a root canal is cool with you?
Go sit in his chair. Moron.
It's not a panacea like they're saying it is. This is going to kill the software developer space.
No.
Not even close.