No Stupid Questions
No such thing. Ask away!
!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.
The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:
Rules
Rule 1- All posts must be legitimate questions. All post titles must include a question.
All posts must be legitimate questions, and all post titles must include a question. Joke or trolling questions, memes, song lyrics as titles, etc. are not allowed here. See Rule 6 for exceptions.
Rule 2- Your question subject cannot be illegal or NSFW material.
Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.
Rule 3- Do not seek mental, medical, or professional help here.
Do not seek mental, medical, or professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.
Rule 4- No self promotion or upvote-farming of any kind.
That's it.
Rule 5- No baiting, sealioning, or promoting an agenda.
Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.
Rule 6- Regarding META posts and joke questions.
Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.
On Fridays, you are allowed to post meme and troll questions, on the condition that they are in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.
If you post a serious question on a Friday and are looking only for legitimate answers, then please include the [Serious] tag in your post title. Irrelevant replies will then be removed by moderators.
Rule 7- You can't intentionally annoy, mock, or harass other members.
If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.
Likewise, if you are a member of, a sympathiser with, or closely aligned with a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you were provably vocal about that hate, then you will be banned on sight.
Rule 8- All comments should try to stay relevant to their parent content.
Rule 9- Reposts from other platforms are not allowed.
Let everyone have their own content.
Rule 10- The majority of bots aren't allowed to participate here. This includes using AI responses and summaries.
Credits
Our breathtaking icon was bestowed upon us by @Cevilia!
The greatest banner of all time: by @TheOneWithTheHair!
I don't hate AI. LLMs are incredibly powerful tools with a wide range of uses, and the technology itself is exciting and promising.
What I do hate is how they're being used by large corporations. A small handful of big tech companies (Google, Microsoft, Facebook, OpenAI, etc.) decided to take this technology and pursue it in the greediest ways possible:
They took open source code, built on top of it, and closed it off so they could sell it
They scraped all the data on the internet without consent and used it to train their models
They made their models generate stuff based on copyrighted works without permission or giving credit, thus basically stealing the content
But that wasn't enough for them, so they decided to train their models on every interaction you have with their LLM services, meaning all your private conversations are stored and recycled even if you don't want that to happen
They use the data from the conversations you've had with the chatbots to build customer profiles about you, which they sell to advertisers so they can send you hordes of personalized ads
They started integrating their LLMs into their other products as much as they could so they could artificially increase their stock prices
They aggressively campaign for other companies to buy and integrate their models so both parties could artificially increase their stock prices
In order to meet their artificially induced demand, they're sucking the life out of the electricity grid, which is screwing over everybody else
They're also taking over the hardware industry and killing off consumer electronics, since it's more profitable for manufacturers to sell to AI companies than to consumers
They're openly bribing, lobbying, and pressuring governments to give them grants and tax breaks and to keep regulations at a minimum, so they can do whatever they want and have society pay for the privilege
They're using these LLMs to cut as many jobs as possible so they can penny-pinch just a little more, hence the recent massive waves of layoffs. This is being done even when the LLM replacements perform far worse than humans.
All of this is being done with zero regard for the environmental damage caused by their monstrous data centers and electricity consumption
All of this is being done with zero regard for the harm caused to people and society. These LLMs frequently lie and spread misinformation, they feed into delusions and bad habits of mentally unwell people, and they're causing great damage to schools since students can use them to cheat easily and little can be done about it
When you put all of this together, it's easy to understand why people hate AI. This is what people oppose, and rightfully so. These corporations created a massive bubble and put our economy at risk of a major recession; they're destabilizing our infrastructure, destroying our environment, corrupting our government, forcing tens of thousands of people into dire financial situations by laying them off, eroding our privacy and rights, and harming our mental health... and for what? I'll tell you: all of this is done so a few greedy billionaires can squeeze a few more dollars out of everything to buy their 5th yacht, 9th private jet, or 7th McMansion. Fuck them all.
When people say "I fucking hate AI", 99% of the time they mean "I fucking hate AI™©®". They don't mean the technology behind it.
To add to your good points, I'm a CS grad who studied neural networks and machine learning years back, and every time I read some idiot claiming something like "this scientific breakthrough has got scientists wondering if we're on the cusp of creating a new species of superintelligence" or "90% of jobs will be obsolete in five years", it annoys me because it's not real, and it's always someone selling something. Today's AI is the same tech they've been working on and incrementally building upon for 30+ years, but as Moore's Law has marched on, we now have the storage pools and computing power to run very advanced models and networks. There is no magic breakthrough, just hype.
The recent advancements are all driven by the $1500 billion spent on grabbing as many resources as they could - all because some idiots convinced them it's the next gold rush. What has that $1500 bil gotten us? Machines that can answer general questions correctly around 40% of the time, plagiarize art for memes, create shallow corporate content that nobody wants, and write some half-decent code cobbled together from Stack Overflow and public GitHub repos.
What a fucking waste of resources.
What's real are the social impacts, the educational impacts, the environmental impacts, the effect on artists and others who have had their work stolen for training, and the usability of the Internet (search is fucked now). What will be very real soon is the global recession/depression it causes as businesses realize more and more that it's not worth the cost to implement or maintain (in all but very few scenarios).
I'm really split with it. I'm not a 10x "rockstar" programmer, but I'm a good programmer. I've always worked at small companies with small teams. I can figure out how to parse requirements, choose libraries/architecture/patterns, and develop apps that work.
Using Copilot has sped my work up by a huge amount. I do have 10 YoE from before Copilot existed, and I can use it to help write good code much faster. The code may not be perfect, but it wouldn't have been perfect without Copilot either. The thing is, I have enough experience to know when it is leading me down the wrong path, and that still happens pretty often. What it helps with is implementing common patterns, especially with common libraries. It basically automates the "google the library docs/Stack Overflow and use code there as a starting point" aspect of programming. (edit: it also helps a lot with logging, writing tests, and rewriting existing code as long as it isn't too wacky, and even then you really need to understand the existing code to avoid a mess of bugs)
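To give an idea of the kind of routine pattern I mean, here's roughly the sort of logging boilerplate it fills in almost word for word, because it shows up all over public code (a made-up sketch, not from any real project):

```python
import logging

# Standard module-level logger setup: exactly the kind of repetitive glue
# code an assistant autocompletes reliably.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s: %(message)s",
)
logger = logging.getLogger(__name__)


def fetch_user(user_id: int) -> dict:
    """Hypothetical function, only here to show the logging pattern."""
    logger.info("fetching user %s", user_id)
    try:
        return {"id": user_id, "name": "example"}  # placeholder for the real lookup
    except Exception:
        logger.exception("failed to fetch user %s", user_id)
        raise


if __name__ == "__main__":
    fetch_user(42)
```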
But yeah, search is completely fucked now. I don't know for sure, but I would guess Stack Overflow use is way down. It does feel like many people are being funneled into using the LLM tools because they are the only things that sort of work. There's also the vibe coding phenomenon, where people without experience will just YOLO out pure tech debt, especially with the latest and greatest languages/libraries/etc. where the LLMs don't work very well because there isn't enough data.
LLMs are an okay-ish tool if your code style doesn't veer from what 99% of the open-source codebase looks like. Use any fringe concept in a language (for example, treating errors as values in languages ridden with exceptions, or using functional concepts in an OOP language) and you will have problems.
Also, this crap tends to produce automated copy-paste, which is especially bad when it skips abstracting away a concept you would have noticed if you had written the code yourself.
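To illustrate the errors-as-values style I mean, here's a rough, made-up Python sketch of returning a result instead of raising (the names are hypothetical, just to show the pattern an LLM tends to fight against):

```python
from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")
E = TypeVar("E")


@dataclass
class Ok(Generic[T]):
    value: T


@dataclass
class Err(Generic[E]):
    error: E


# A Result is either an Ok wrapping a value or an Err wrapping an error description.
Result = Union[Ok[T], Err[E]]


def parse_port(raw: str) -> Result[int, str]:
    """Return the failure as a value instead of raising ValueError."""
    if not raw.isdigit():
        return Err(f"not a number: {raw!r}")
    port = int(raw)
    if not 1 <= port <= 65535:
        return Err(f"port out of range: {port}")
    return Ok(port)


# Callers handle both cases explicitly (match requires Python 3.10+).
match parse_port("8080"):
    case Ok(value):
        print("listening on", value)
    case Err(error):
        print("bad config:", error)
```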
Source: own experience 😄
Totally agree. In my day-to-day work, I'm not dealing with anything groundbreaking. Everything I want/need to code has already been done.
If you have a Copilot license and are using the newest Visual Studio, it enables the agentic capabilities by default. It will actually write the code into your files directly. I have not done that and will not do that; I want to see and understand what it is trying to do.
I agree it's great at writing and framing out parts of code and selecting libraries - it definitely has value for coding. $1500 bil of value, though? I doubt it.
My main concern lies with the next generation of programmers. The output from ChatGPT (and Claude, etc.) requires significant prior programming experience to make sense of and to adjust (or correct) to suit the scope and requirements of the project - and it will be much harder for junior devs to learn that skill with LLMs doing all the groundwork. It's essentially the same problem now facing wider education, with kids/teens just using LLMs to write their homework and essays. The consequences will be long term and significant. In addition (for coding), it's taking away the entry-level work that junior devs would usually do and then have cleaned up for prod by senior devs - and that's not theory, the job market for junior programmers is already dying.