
No Stupid Questions

44477 readers
1815 users here now

No such thing. Ask away!

!nostupidquestions is a community dedicated to being helpful and answering each others' questions on various topics.

The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:

Rules (interactive)


Rule 1- All posts must be legitimate questions. All post titles must include a question.

All posts must be legitimate questions, and all post titles must include a question. Questions that are joke or trolling questions, memes, song lyrics as title, etc. are not allowed here. See Rule 6 for all exceptions.



Rule 2- Your question subject cannot be illegal or NSFW material.

Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.



Rule 3- Do not seek mental, medical and professional help here.

Do not seek mental, medical and professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.



Rule 4- No self promotion or upvote-farming of any kind.

That's it.



Rule 5- No baiting or sealioning or promoting an agenda.

Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.



Rule 6- Regarding META posts and joke questions.

Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.

On fridays, you are allowed to post meme and troll questions, on the condition that it's in text format only, and conforms with our other rules. These posts MUST include the [NSQ Friday] tag in their title.

If you post a serious question on friday and are looking only for legitimate answers, then please include the [Serious] tag on your post. Irrelevant replies will then be removed by moderators.



Rule 7- You can't intentionally annoy, mock, or harass other members.

If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.

Likewise, if you are a member, sympathiser or a resemblant of a movement that is known to largely hate, mock, discriminate against, and/or want to take lives of a group of people, and you were provably vocal about your hate, then you will be banned on sight.



Rule 8- All comments should try to stay relevant to their parent content.



Rule 9- Reposts from other platforms are not allowed.

Let everyone have their own content.



Rule 10- Majority of bots aren't allowed to participate here. This includes using AI responses and summaries.



Credits

Our breathtaking icon was bestowed upon us by @Cevilia!

The greatest banner of all time: by @TheOneWithTheHair!

founded 2 years ago
MODERATORS
 

"garbage account"

[–] pulsewidth@lemmy.world 9 points 18 hours ago (1 children)

When people say "I fucking hate AI", 99% of the time they mean "I fucking hate AI™©®". They don't mean the technology behind it.

To add to your good points, I'm a CS grad who studied neural networks and machine learning years back, and every time I read some idiot claiming something like "this scientific breakthrough has got scientists wondering if we're on the cusp of creating a new species of superintelligence" or "90% of jobs will be obsolete in five years" it annoys me, because it's not real, and it's always someone selling something. Today's AI is the same tech they've been working on and incrementally building upon for 30+ years, but as Moore's Law has marched on we now have the storage pools and computing power to run very advanced models and networks. There is no magic breakthrough, just hype.

The recent advancements are all driven by the $1500 billion spent on grabbing as many resources as they could - all because some idiots convinced them it's the next gold rush. What has that $1500 bil got us? Machines that can answer general questions correctly around 40% of the time, plagiarize art for memes, create shallow corporate content that nobody wants, and write some half-decent code cobbled together from StackOverflow and public GitHub repos.

What a fucking waste of resources.

What's real are the social impacts, the educational impacts, the environmental impacts, the effect on artists and others who have had their work stolen for training, and the hit to the usability of the Internet (search is fucked now). What will be very real soon is the global recession/depression it causes as businesses realize more and more that it's not worth the cost to implement or maintain (in all but a very few scenarios).

[–] devedeset@lemmy.zip 5 points 18 hours ago* (last edited 18 hours ago) (2 children)

I'm really split on it. I'm not a 10x "rockstar" programmer, but I'm a good programmer. I've always worked at small companies with small teams. I can figure out how to parse requirements, choose libraries/architecture/patterns, and develop apps that work.

Using Copilot has sped my work up by a huge amount. I do have 10 YoE from before Copilot existed, so I can use it to help write good code much faster. The code may not be perfect, but it wouldn't have been perfect without it either. The thing is, I have enough experience to know when it is leading me down the wrong path, and that still happens pretty often. What it helps with is implementing common patterns, especially with common libraries. It basically automates the "google the library docs/StackOverflow and use the code there as a starting point" aspect of programming. (edit: it also helps a lot with logging, writing tests, and rewriting existing code as long as it isn't too wacky, and even then you really need to understand the existing code to avoid a mess of bugs)
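
To give a concrete (and entirely made-up) example of the kind of common pattern I mean - hypothetical names, Python purely for illustration - a retry-with-backoff helper plus basic logging is exactly the sort of well-trodden boilerplate it will autocomplete almost verbatim:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger(__name__)

def fetch_with_retry(fetch, retries=3, backoff=1.0):
    """Call fetch(), retrying with exponential backoff on failure."""
    for attempt in range(1, retries + 1):
        try:
            return fetch()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                raise
            # Wait 1s, 2s, 4s, ... before the next attempt.
            time.sleep(backoff * 2 ** (attempt - 1))
```

None of that is hard, but typing it out (or digging up the StackOverflow answer) is exactly the time it saves.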

But yeah, search is completely fucked now. I don't know for sure, but I would guess StackOverflow use is way down. It does feel like many people are being pigeonholed into using the LLM tools because they are the only things that sort of work. There's also the vibe coding phenomenon, where people without experience just YOLO out pure tech debt, especially with the latest and greatest languages/libraries/etc. where the LLMs don't work very well because there isn't enough training data.

[–] Metju@lemmy.world 3 points 17 hours ago (1 children)

LLMs are an okay-ish tool if your code style doesn't veer from what 99% of open-source code looks like. Use any fringe concept in a language (for example, treating errors as values in a language riddled with exceptions, or using functional concepts in an OOP language) and you will have problems.
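
To sketch what I mean by that first one (a rough, made-up example, not from any real codebase): returning errors as values in Python, which is normally exception-driven, looks something like this, and the suggestions will keep trying to drag you back to try/except:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")

@dataclass
class Ok(Generic[T]):
    value: T

@dataclass
class Err:
    message: str

# A Result is either a successful value or an error value - no raising.
Result = Union[Ok[T], Err]

def parse_port(raw: str) -> Result[int]:
    # Return an Err value instead of raising ValueError.
    if not raw.isdigit():
        return Err(f"not a number: {raw!r}")
    port = int(raw)
    if not (0 < port < 65536):
        return Err(f"out of range: {port}")
    return Ok(port)

match parse_port("8080"):
    case Ok(value):
        print("port:", value)
    case Err(message):
        print("bad input:", message)
```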

Also, this crap tends to behave like automated copy-paste, which is especially bad when it skips abstracting away a concept you would have noticed if you had written the code yourself.
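
A contrived illustration of that (again Python, hypothetical code, nothing from a real project): the pasted-in version repeats the same validation inline, where writing it by hand you would almost certainly pull the concept out into a helper:

```python
# What the copy-paste style tends to produce: the same check repeated inline.
def handle_user(payload: dict) -> str:
    if "name" not in payload or not str(payload["name"]).strip():
        raise ValueError("missing field: name")
    if "email" not in payload or not str(payload["email"]).strip():
        raise ValueError("missing field: email")
    return f'{payload["name"]} <{payload["email"]}>'

# What you'd likely write yourself: the concept ("require a non-empty field")
# abstracted into one place.
def require(payload: dict, field: str) -> str:
    value = str(payload.get(field, "")).strip()
    if not value:
        raise ValueError(f"missing field: {field}")
    return value

def handle_user_abstracted(payload: dict) -> str:
    return f'{require(payload, "name")} <{require(payload, "email")}>'
```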

Source: own experience 😄

[–] devedeset@lemmy.zip 2 points 16 hours ago

Totally agree. In my day-to-day work, I'm not dealing with anything groundbreaking. Everything I want/need to code has already been done.

If you have a Copilot license and are using the newest Visual Studio, it enables the agentic capabilities by default. It will actually write the code into your files directly. I have not done that and will not do that: I want to see and understand what it is trying to do.

[–] pulsewidth@lemmy.world 1 points 14 hours ago* (last edited 14 hours ago)

I agree it's great at writing and scaffolding parts of code and selecting libraries - it definitely has value for coding. Whether it's $1500 bil of value, though, I doubt.

My main concern there lies with the next generation of programmers. The output of ChatGPT (and Claude etc.) requires significant prior programming experience to make sense of and to adjust (or correct) to suit the scope and requirements of the project, and it will be much harder for junior devs to learn that skill with LLMs doing all the groundwork - essentially the same problem wider education is facing now, with kids/teens just using LLMs to write their homework and essays. The consequences will be long term and significant. In addition (for coding), it's taking away the entry-level work that junior devs would usually do and then have cleaned up for prod by senior devs - and that's not theory; the job market for junior programmers is dying already.