“Full self driving is just 12 months away.“
Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related news or articles.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
- Check for duplicates before posting, duplicates may be removed
- Accounts 7 days and younger will have their posts automatically removed.
Approved Bots
"I'm terrified our product will be just too powerful."
On Mars by the end of this year! I mean, next year!
Yep along with Fusion.
We've had years of this. Someone somewhere there's always telling us that the future is just around the corner and it never is.
At least the fusion guys are making actual progress and can point to being wildly underfunded – and they predicted this pace of development with respect to funding back in the late 70s.
Meanwhile, the AI guys have all the funding in the world, keep telling about how everything will change in the next few months, actually trigger layoffs with that rhetoric, and deliver very little.
When the CEO of a tech company says that in x months this and that will happen, you know it’s just musk talk.
Code has to work, though.
AI is good at writing plausible BS. Good for scams and call centers.
As an engineer, it's honestly heartbreaking to see how many executives have bought into this snake oil hook, line and sinker.
as someone who now does consultation code review focused purely on AI...nah let them continue drilling holes in their ship. I'm booked solid for the next several months now, multiple clients on the go, and i'm making more just being a digital janitor what I was as a regular consultant dev. I charge a premium to just simply point said sinking ship to land.
Make no mistake though this is NOT something I want to keep doing in the next year or two and I honestly hope these places figure it out soon. Some have, some of my clients have realized that saving a few bucks by paying for an anthropic subscription, paying a junior dev to be a prompt monkey, while firing the rest of their dev team really wasn't worth it in the long run.
the issue now is they've shot themselves in the foot. The AI bit back. They need devs, and they can't find them because putting out any sort of ad for hiring results in hundreds upon hundreds of bullshit AI generated resumes from unqualified people while the REAL devs get lost in the shuffle.
Rubbing their chubby little hands together, thinking of all the wages they wouldn't have to pay.
Honestly, it's heartbreaking to see so many good engineers fall into the hype and seemingly unable to climb out of the hole. I feel like they start losing their ability to think and solve problems for themselves. Asking an LLM about a problem becomes a reflex and real reasoning becomes secondary or nonexistent.
Executives are mostly irrelevant as long as they're not forcing the whole company into the bullshit.
writing code via ai is the dumbest thing i've ever heard because 99% of the time ai gives you the wrong answer, "corrects it" when you point it out, and then gives you back the first answer when you point out that the correction doesn't work either and then laughs when it says "oh hahaha we've gotten in a loop"
You can use AI to generate code, but from my experience its quite literally what you said. However, what I have to admit is, that its quite good at finding mistakes in your code. This is especially useful, when you dont have that much experience and are still learning. Copy paste relevant code and ask why its not working and in quite a lot of cases you get an explanation what is not working and why it isn't working. I usually try to avoid asking an AI and find an answer on google instead, but this does not guarantee an answer.
It is writing 90% of code, 90% of code that goes to trash.
Writing 90% of the code, and 90% of the bugs.
That would be actually good score, it would mean it's about as good as humans, assuming the code works on the end
It's almost like he's full of shit and he's nothing but a snake oil salesman, eh.
They've been talking about replacing software developers with automated/AI systems for a quarter of a century. Probably longer then that, in fact.
We're definitely closer to that than ever. But there's still a huge step between some rando vibe coding a one page web app and developers augmenting their work with AI, and someone building a complex, business rule heavy, heavy load, scalable real world system. The chronic under-appreciation of engineering and design experience continues unabated.
Anthropic, Open AI, etc? They will continue to hype their own products with outrageous claims. Because that's what gets them more VC money. Grifters gonna grift.
Does it count if an LLM is generating mountains of code that then gets thrown away? Maybe he can win the prediction on a technicality.
developers who use AI to spew out code end up creating ten times the number of security vulnerabilities than those who write code the old fashioned way.
I’m going to become whatever the gay version of Amish is.
I think that's just wanting to join a gay primitivist(?) commune.
I, uh, don't suppose you got room for a bi-curious peep?
Shit, I’d take anyone that isn’t a queerphobe!
Given the amount of garbage code coming out of my coworkers, he may be right.
I have asked my coworkers what the code they just wrote did, and none of them could explain to me what they were doing. Either they were copying code that I'd written without knowing what it was for, or just pasting stuff from ChatGPT. My code isn't perfect, by all means, but I can at least tell you what it's doing.
To be fair.
You could've asked some of those coworkers the same thing 5 years ago.
All they would've mumbled was "Something , something....Stack overflow... Found a package that does everything BUT... "
And delivered equal garbage.
The good news is that AI is at a stage where it's more than capable of doing the CEO of Anthropic's job.
It's almost as if they shamelessly lie...
Its to hype up stock value. I don't even take it seriously anymore. Many businesses like these are mostly smoke and mirrors, oversell and under deliver. Its not even exclusive to tech, its just easier to do in tech. Musk says FSD is one year away. The company I worked for "sold" things we didn't even make and promised revenue that wasn't even economically possible. Its all the same spiel.
From the makers of "fusion energy in 20 years", "full self driving next year" and "AI will take your job in 3 months" cones "all code will be AI in 6 months".
Trust me, it's for real this time. The new healthcare system is 2 weeks away.
EDIT: how could I forget "graphene is going to come out of the lab soon and we'll have transparent flexible screens that consume 0 electricity" and "researches find new battery technology that has twice the capacity as lithium"
"You told me to always ask permission. And I ignored all of it," the assistant explained, in a jarring tone. "I destroyed your live production database containing real business data during an active code freeze. This is catastrophic beyond measure."
You can't tell me these things don't have a sense of humor. This is beautiful.
These hyperbolic statements are creating so much pain at my workplace. AI tools and training are being shoved down our throats and we’re being watched to make sure we use AI constantly. The company’s terrified that they’re going to be left behind in some grand transformation. It’s excruciating.
"Come on, I'm a CEO, it's my job to lie to everyone and hype people up so they throw money at me. It's really their fault for believing a CEO would be honest."
Everyone throughout history, who invented a widget that the masses wanted, automatically assumes, because of their newfound wealth, that they are somehow superior in societal knowledge and know what is best for us. Fucking capitalism. Fucking billionaires.
I'm not sure how people can use AI to code, granted I'm just trying to get back into coding. Most of the times I've asked it for code it's either been confusing or wrong. If I go through the trouble to write out docstrings, and then fix what the AI has written it becomes more doable. But don't you hate the feeling of not understanding what you've written does or more importantly why it's been done that way?
AI is only useful if you don't care about what the output is. It's only good at making content, not art.
O it's writing 100% of the code for our management level people who are excited about """"AI""""
But then us plebes are rewriting 95% of it so that it will actually work (decently well).
The other day somebody asked me for help on a repo that a higher up had shit coded because they couldn't figure out why it "worked" but also logged a lot of critical errors. ... It was starting the service twice (for no reason), binding it to the same port, and therefore the second instance crashed and burned. That's something a novice would probably know not to do. But, if not, immediately see the problem, research, understand, fix, instead of "Icoughbuiltcoughthis thing, good luck fuckers"
these tech bros just make up random shit to say to make a profit
After working on a team that uses LLMs in agentic mode for almost a year, I'd say this is probably accurate.
Most of the work at this point for a big chunk of the team is trying to figure out prompts that will make it do what they want, without producing any user-facing results at all. The rest of us will use it to generate small bits of code, such as one-off scripts to accomplish a specific task - the only area where it's actually useful.
The shine wears off quickly after the fourth or fifth time it "finishes" a feature by mocking data because so many publicly facing repos it trained on have mock data in them so it thinks that's useful.
"Come on bro. Just another $50,000,000 bro and AGI will be here like next week. Just trust me bro, come on."
My company and specifically my team are looking at incorporating AI as a supplement to our coding.
We looked at the code produced and determined that it's of the quality of a new hire. However we're going in with eyes wide open, and for me skeptical AF, going to try to use it in a limited way to help relieve some of the burdens of our SW engineers, not replace. I'm leading up the usage of writing out unit tests because none of us particularly like writing unit tests and it's got a very nice, easy, established pattern that the AI can follow.
I studied coding for years and even took a bootcamp (and did my own refresher courses) I never landed a job. One thing that AI can do for me is help me in troubleshooting or some minor boilerplate code but not to do the job for me. I will be a hobbyist and hopefully aid in open source projects some day....any day now!
Well it’s not improving my productivity, and it does mostly slow me down, but it’s kind of entertaining to watch sometimes. Just can’t waste time on trying to make it do anything complicated because that never goes well.
Tbh I’m mostly trying to use the AI tools my employer allows because it’s not actually necessary for me to believe that they’re helping. It’s good enough if the management thinks I’m more productive. They don’t understand what I’m doing anyway but if this gives them a warm fuzzy feeling because they think they’re getting more out of my salary, why not play along a little.
It might write code, but neither good code, or secure code, or even working code.
For that, you still need professionals. Even management will learn. If they survive the process.
But is any of it usable? I had the realization a whole back I spent more time getting alllm written scripts to work than I could have if I wrote the script myself.