This post was submitted on 18 Dec 2025.
63 points (95.7% liked)

Ask Lemmy


A Fediverse community for open-ended, thought-provoking questions


Rules:


1) Be nice and have fun. Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.


2) All posts must end with a '?'. This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with '?'.


3) No spam. Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.


4) NSFW is okay, within reason. Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com. NSFW comments should be restricted to posts tagged [NSFW].


5) This is not a support community.
It is not a place for 'how do I?'-type questions. If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions, check our partnered communities list or use the search function.


6) No US Politics.
Please don't post about current US politics. If you need to, try !politicaldiscussion@lemmy.world or !askusa@discuss.online.


Reminder: The terms of service apply here too.

Partnered Communities:

Tech Support

No Stupid Questions

You Should Know

Reddit

Jokes

Ask Ouija


Logo design credit goes to: tubbadu



Every industry is full of technical hills that people plant their flag on. What is yours?

[–] dfyx@lemmy.helios42.de 58 points 23 hours ago (3 children)

For any non-trivial software project, spending time on code quality and a good architecture is worth the effort. Every hour I spend on that saves me two hours when I have to fix bugs or implement new features.

Years ago I had to review code from a different team and it was an absolute mess. They (and our boss) defended it with "That way they can get it done faster. We can clean up after the initial release". Guess what, that initial release took over three years instead of the planned six months.

[–] Flax_vert@feddit.uk 16 points 22 hours ago (3 children)

The joys of agile programming....

[–] dfyx@lemmy.helios42.de 15 points 22 hours ago

What they did was far beyond "agile". They didn't care about naming conventions, documentation, keeping commented-out code out of commits, or using existing solutions (both in-house and third-party) instead of reinventing the wheel...

In that first review I had literally hundreds of comments that each on their own would be a reason to reject the pull request.

[–] flamingo_pinyata@sopuli.xyz 36 points 22 hours ago* (last edited 22 hours ago) (2 children)

Not strictly technical, although organizational science might be seen as a technical field in its own right.

Regularly rotating people between teams is desirable.

Many companies just assign you to a team and that's where you're stuck forever until you quit. In slightly better places they will try to find a "perfect match" for you.

What I'm saying is that moving people around is even better:
You spread institutional knowledge around.
You keep everyone engaged. Typically, at a new job you spend the first few months learning, then hit a peak of productivity while you still have all the new ideas. After about two years you reach either a plateau or complacency.

[–] Kyle_The_G@lemmy.world 9 points 22 hours ago

I'm in health sciences and I wish we would do more education days/conferences. I'm a med lab tech, and I feel like no one knows what the lab actually does; they just send samples off and the magic lab gremlins divine these numbers/results. I feel the same way when another discipline discusses what they do; it's always interesting!

[–] slazer2au@lemmy.world 5 points 22 hours ago

I'll allow it. Institutional knowledge, while it sounds good, does cause business continuity problems.

[–] rowinxavier@lemmy.world 35 points 19 hours ago (4 children)

I work in disability support. People in my industry fail to understand the distinction between duty of care and dignity of risk. When I go home after work I can choose to drink alcohol or smoke cigarettes. My clients who are disabled are able to make decisions including smoking and drinking, not to mention smoking pot or watching porn. It is disgusting to intrude on someone else's life and shit your own values all over them.

I don't drink or smoke but that is me. My clients can drink or smoke or whatever based on their own choices and my job is not to force them to do things I want them to do so they meet my moral standards.

My job is to support them in deciding what matters to them and then help them figure out how to achieve those goals and to support them in enacting that plan.

The moment I start deciding what is best for them is the moment I have dehumanised them and made them lesser. I see it all the time but my responsibility is to treat my clients as human beings first and foremost. If a support worker treated me the way some of my clients have been treated there would have been a stabbing.

[–] RebekahWSD@lemmy.world 7 points 17 hours ago

Disabled people are so often treated like children and it just sucks.

[–] DasFaultier@sh.itjust.works 34 points 21 hours ago (1 children)

Not everything needs to be deployed to a cluster of georedundant K8s nodes, not everything needs to be a container, Docker is not always necessary. Just run the damn binary. Just build a .deb package.

(Disclaimer: yes, all those things can have merit and reasons. Doesn't mean you have to shove them into everything.)

[–] slazer2au@lemmy.world 9 points 21 hours ago (4 children)

But then how will I ship my machine seeing as it works for me?

[–] Godnroc@lemmy.world 30 points 20 hours ago (3 children)

Cleaning, organizing, and documentation are high priorities.

Every job I've worked at has had mountains of "the last guy didn't..." that you walk into, and it's always a huge pain in the ass. They didn't throw out useless things, they didn't bother consolidating storage rooms, and they never wrote down any of their processes, procedures, or rationales. I've spent many hours at each job just untangling messes because the other person was too busy, or thought it unimportant and didn't bother to spend the time.

Make it a priority, allocate the time, and think long-term.

[–] NOT_RICK@lemmy.world 9 points 19 hours ago

Starting a new job soon, and I’m paying for some holes in documentation as I prep my offboarding documentation for my current team. Definitely making it a priority to do better going forward! Being lazy in the moment is nice but the “stitch in time” adage is definitely true

[–] mech@feddit.org 6 points 13 hours ago

Make it a priority, allocate the time, and think long-term.

In many jobs, someone with the power to fire you makes the priorities, allocates your time and does not think long-term.

[–] kescusay@lemmy.world 25 points 22 hours ago

React sucks. I'm sorry, I know it's popular, but for the love of glob, can we not use a technology that results in just as much goddamn spaghetti code as its closest ancestor, jQuery? (That last bit is inflammatory. I don't care. React components have no opinionated structure imposed on them, just like jQuery.)
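
To make that concrete, here is a contrived TypeScript/React sketch (component name, endpoint, and fields are all hypothetical, not from the comment) of what "no imposed structure" allows: data fetching, business logic, event handling, and markup all tangled into one component, and React is perfectly happy with it.

```tsx
// Contrived sketch: everything-in-one-function React, which the framework
// happily accepts because it imposes no architecture of its own.
import React, { useEffect, useState } from "react";

export function OrdersPanel() {
  const [orders, setOrders] = useState<{ id: number; customer: string; total: number }[]>([]);
  const [filter, setFilter] = useState("");

  useEffect(() => {
    // Network call living directly in the view layer (hypothetical endpoint).
    fetch("/api/orders")
      .then((res) => res.json())
      .then(setOrders);
  }, []);

  return (
    <div>
      <input value={filter} onChange={(e) => setFilter(e.target.value)} />
      <ul>
        {orders
          .filter((o) => o.customer.includes(filter)) // business rule inline in the markup
          .map((o) => (
            // Side-effecting click handler inline as well.
            <li key={o.id} onClick={() => fetch(`/api/orders/${o.id}/ship`, { method: "POST" })}>
              {o.customer}: {o.total}
            </li>
          ))}
      </ul>
    </div>
  );
}
```

Nothing in React pushes this apart into data, logic, and presentation layers; that discipline has to come from the team.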

[–] jordanlund@lemmy.world 24 points 21 hours ago (2 children)

AI is a fad, and when it collapses, it's going to do more damage than any perceived good it has done to date.

[–] tal@lemmy.today 8 points 20 hours ago* (last edited 15 hours ago) (1 children)

I can believe that LLMs might wind up being a technical dead end (or not; I could also imagine them being a component of a larger system). My own guess is that language, while important to thinking, won't be the base unit of how thought is processed the way it is on current LLMs.

Ditto for diffusion models used to generate images today.

I can also believe that there might be surges and declines in funding. We've seen that in the past.

But I am very confident that AI is not, over the long term, going to go away. I will confidently state that we will see systems that will use machine learning to increasingly perform human-like tasks over time.

And I'll say with lower, though still pretty high confidence, that the computation done by future AI will very probably be done on hardware oriented towards parallel processing. It might not look like the parallel hardware today. Maybe we find that we can deal with a lot more sparseness and dedicated subsystems that individually require less storage. Yes, neural nets approximate something that happens in the human brain, and our current systems use neural nets. But the human brain runs at something like a 90 Hz clock and definitely has specialized subsystems, so it's a substantially-different system from something like Nvidia's parallel compute hardware today (1,590,000,000 Hz and homogenous hardware).

I think that the only real scenario where we have something that puts the kibosh on AI is if we reach a consensus that superintelligent AI is an unsolvable existential threat (and I think that we're likely to still go as far as we can on limited forms of AI while still trying to maintain enough of a buffer to not fall into the abyss).

EDIT: That being said, it may very well be that future AI won't be called AI, and that we'll think of it differently, not as some kind of special category based around a set of specific technologies. For example, OCR (optical character recognition) software and speech recognition software both typically make use of machine learning today; those are established, general-use product categories that get used every day, but we typically don't call them "AI" in popular use in 2025. When I call my credit card company, say, and navigate a menu system driven by speech recognition, I don't say that I'm "using AI". Same sort of way that we don't call semi trucks or sports cars "horseless carriages" in 2025, though they derive from devices that were once called that. We don't use the term "labor-saving device" any more; I think of a dishwasher or a vacuum cleaner as distinct devices and don't really think of them as related. But back when they were being invented, the idea of machines in the household that could automate human work using electricity did fall into a sort of bin like that.

[–] Tar_alcaran@sh.itjust.works 6 points 19 hours ago* (last edited 19 hours ago)

I'm a bit more pessimistic. I fear that LLM pushers calling their bullshit generators "AI" is going to drag other applications down with them. I'm pretty sure that when LLMs all collapse into a heap of unprofitable e-waste and take most of the stock market with them, the funding and capital for the rest of AI is going to die right along with LLMs.

And there are lots of useful AI applications in every scientific field; data interpretation with AI is extremely useful, and I'm very afraid it's going to suffer from OpenAI's death.

[–] kboos1@lemmy.world 7 points 19 hours ago* (last edited 19 hours ago)

The issue I take with AI is that it's having a similar effect on ignorance to the one the Internet created, but worse. It's information without understanding. Imagine a high school dropout who is a self-proclaimed genius and a Google wizard; that is AI, at least at the moment.

Since people imagine AI as the superintelligence from the movies, they believe that it's some kind of supreme being. It's really not. It's good at a few things, and you should still take its answers with skepticism and proofread them before copy/pasting its results into something.

[–] slazer2au@lemmy.world 23 points 23 hours ago* (last edited 22 hours ago) (3 children)

They should stop teaching the OSI model and stick to the DoD TCP/IP model.

In the world of computer networking you are constantly hammered with the OSI model and how computer communication fits into it. But outside of specific legacy uses, nothing runs the OSI protocol suite; everything runs TCP/IP.
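
For readers who only know the seven-layer diagram, here is a small sketch of the usual textbook mapping between the two models (the layer groupings follow the common convention; the example protocols are just illustrative):

```typescript
// Rough mapping of the four DoD TCP/IP layers onto the seven OSI layers,
// with a few example protocols per layer.
const tcpIpToOsi: Record<string, { osiLayers: string[]; examples: string[] }> = {
  Application: { osiLayers: ["Application", "Presentation", "Session"], examples: ["HTTP", "DNS", "SMTP"] },
  Transport:   { osiLayers: ["Transport"],                              examples: ["TCP", "UDP"] },
  Internet:    { osiLayers: ["Network"],                                examples: ["IPv4", "IPv6", "ICMP"] },
  Link:        { osiLayers: ["Data Link", "Physical"],                  examples: ["Ethernet", "Wi-Fi"] },
};

for (const [layer, { osiLayers, examples }] of Object.entries(tcpIpToOsi)) {
  console.log(`${layer}: OSI ${osiLayers.join(" + ")} (e.g. ${examples.join(", ")})`);
}
```

The point stands either way: the protocols you actually run fit the four-layer model, and the OSI suite itself is effectively a museum piece.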

[–] flamingo_pinyata@sopuli.xyz 9 points 22 hours ago (8 children)

Understanding that other protocols are possible is important. Sure, reality doesn't fit neatly into the OSI model, but it gives you a conceptual idea of everything that goes into a networking stack.

[–] Flax_vert@feddit.uk 7 points 22 hours ago (2 children)

Don't you mean the OSI model? ISO means International Standards Organisation lol

[–] MudMan@fedia.io 18 points 21 hours ago (19 children)

Is there anybody on Lemmy that isn't a software engineer of some description? No? Anyone?

[–] buttmasterflex@piefed.social 9 points 21 hours ago

I'm a geologist!

[–] slazer2au@lemmy.world 8 points 21 hours ago

Yes, me. I am a network engineer with an expired CCNA

[–] FridaySteve@lemmy.world 6 points 20 hours ago

Just because I'm not in a technical job doesn't mean I'm not a technology user.

[–] Mostly_Gristle@lemmy.world 5 points 20 hours ago

I'm a machinist.

[–] unknownuserunknownlocation@kbin.earth 14 points 19 hours ago (4 children)

IT restrictions should be much more conservatively applied (at least in comparison to what's happening in my neck of the woods). Hear me out.

Of course, if you restrict something in IT, you get a theoretical increase in security. You're reducing the attack surface in some way, shape or form. Usually at the cost of productivity. But also at the cost of the employees' good will towards the IT department and IT security. Which is an important aspect, since you will never be able to eliminate your attack surface, and employees with good will can be your eyes and ears on the ground.

At my company I've watched restrictions getting tighter and tighter. And yes, it's reduced the attack surface in theory, but holy shit has it ruined my colleagues' attitude towards IT security. "They're constantly finding things to make our job harder." "Honestly, I'm so sick of this shit, let's not bother reporting this; it's not my job anyway." "It will be fine, IT security is taking care of it anyway." "What can go wrong when our computers are so nailed shut?" It didn't use to be this way.

I'm not saying all restrictions are wrong, some definitely do make sense. But many of them have just pissed off my colleagues so much that I worry about their cooperation when shit ends up hitting the fan. "WTF were all these restrictions for that castrated our work then? Fix your shit yourself!"

[–] Tar_alcaran@sh.itjust.works 5 points 19 hours ago

you will never be able to eliminate your attack surface, and employees with good will can be your eyes and ears on the ground.

All the good will in the world won't make up for ignorance. Most people know next to nothing about IT security and will just randomly click shit to make the annoying box go away and/or get to where they think they want to go. And if that involves installing a random virus, they'll happily do it and be annoyed that it requires their password.

[–] myfunnyaccountname@lemmy.zip 5 points 18 hours ago (2 children)

You pay me to admin 400 servers on a couple million dollars worth of hardware. Let me install a fucking app on my own machine without 4 levels of bullshit.

[–] Tar_alcaran@sh.itjust.works 12 points 18 hours ago (2 children)

Workplace safety is quickly turning from a factual and risk-based field into a vibes-based field, and that's a bad thing for 95% of real-world risks.

To elaborate a bit: the current trend in safety is "Safety Culture", meaning "Getting Betty to tell Alex that they should actually wear that helmet and not just carry it around". And at that level, that's a great thing. On-the-ground compliance is one of the hardest things to actually implement.

But that training is taking the place of actual, risk-based training. It's all well and good that you feel comfortable talking about safety, but if you don't know what you're talking about, you're not actually making things safer. This is also a form of training that's completely useless at any level above the worksite. You can't make management-level choices based on feeling comfortable; you need to actually know some stuff.

I've run into numerous issues where people feel safe when they're not, and feel at risk when they're safe. Safety Culture is absolutely important, and feeling safe to talk about your problems is a good thing. But that should come AFTER being actually able to spot problems.

[–] jode@pawb.social 11 points 14 hours ago (1 children)

Any tolerance on a part tighter than +/- 0.001 isn't real. If I can change the size of the part enough to blow it out of tolerance just by putting my hand on it and letting some of my body heat into it, then it's just not real.
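
To put a rough number on that, linear thermal expansion is ΔL = α · L · ΔT, and a few degrees of hand warmth on a mid-sized part already eats a large chunk of a one-thou band. A back-of-the-envelope sketch with assumed values (the comment gives no units; inches are assumed):

```typescript
// Back-of-the-envelope thermal growth from handling a part.
// All numbers below are assumptions for illustration, not from the comment.
const alphaPerDegF: Record<string, number> = {
  steel: 6.5e-6,     // in/in/°F, approximate
  aluminum: 12.3e-6, // in/in/°F, approximate
};

const lengthInches = 6; // assumed part length
const deltaTempF = 10;  // assumed warming from hand contact

for (const [material, alpha] of Object.entries(alphaPerDegF)) {
  const growth = alpha * lengthInches * deltaTempF; // ΔL = α · L · ΔT
  console.log(`${material}: grows ~${growth.toFixed(5)}" (~${(growth * 1000).toFixed(2)} thou)`);
}
// steel:    ~0.00039" (~0.39 thou)
// aluminum: ~0.00074" (~0.74 thou)
```

Either result is a sizeable slice of a ±0.001" band before the part ever reaches a gauge, which is the commenter's point.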

[–] EponymousBosh@awful.systems 11 points 13 hours ago (1 children)

Cognitive behavioral therapy and dialectical behavioral therapy are not the universal cure for everything, and they need to stop being treated as such.

[–] corsicanguppy@lemmy.ca 6 points 13 hours ago

I'll join you on this hill, soldier.

CBT is the only one they've tested, they tested it themselves, and of course it looks great. It offloads all success and failure 100% onto the victim, so the many failures never reflect on the process. It resembles a massive sham.

My counsellor friend calls it "Six Sigma for mental health" and notes that it's often not covered by insurance (even outside America's mercenary system), so it's a nice cash cow for the indu$try.

[–] philpo@feddit.org 10 points 16 hours ago* (last edited 16 hours ago) (1 children)

Greater use of technology and standardisation are good for the EMS sector.

The whole "it was better when we could do what we wanted, and back then we only had real calls with sicker people, and everything was good" attitude is fucking awful and is hurting the profession.

Look, you fucking volunteer dick, I know you've been doing this 10 years longer than me (and I've been doing it for 25 now), but unlike you I did it full-time and probably had more shifts in one year than you've had in your life. Now my back is fucked, because back then there was no electrohydraulic stretcher and no stair chair, the ventilator was twice as heavy (and could do basically nothing), and the defibrillator weighed so much we often had to switch who was carrying it after two flights of stairs.

And we had just as many shit calls, but we actually got attacked worse, because the shitty 2 kg radios had next to zero coverage indoors, and so did cellphones, which could leave you unable to even call for backup.

And of course we had longer shifts, needed to work more hours, and the whole job market was even more fucked.

"But we didn't need this and that,we looked at the patient". Yeah,go fuck yourself. MUCH more people died or took damage from that. So many things were not seen. And it was all accepted as "yeah, that's how life is".

So fuck everyone in this field and their nostalgia.

[–] chunes@lemmy.world 9 points 21 hours ago (3 children)

If people used a language that actually leverages the strengths of dynamic typing, they wouldn't dislike it so much.

I encourage every programmer to build a Smalltalk program from the ground up while it's running the entire time. It really is a joy

[–] KokusnussRitter@discuss.tchncs.de 8 points 12 hours ago (2 children)

I fucking hate AI in HR/hiring. I try so hard not to spread my personal data to LLMs/AI ghouls, and the moment I apply for a job I need to survive, I have to accept that the HR department's AI sorting hat now knows a shit ton about me. I just hope these are closed systems. If anyone from an HR department knows more, please let me know.

[–] 0x0@lemmy.zip 7 points 11 hours ago (1 children)

Weird, I haven't seen this one yet: the cloud is just someone else's computers.

[–] untorquer@lemmy.world 6 points 13 hours ago

People are idiots, and it's the designers' duty to remove opportunities for an idiot to hurt themselves, up to but just short of impacting function.

[–] kboos1@lemmy.world 6 points 18 hours ago

Take the time to do it right the first time but also don't waste time if it doesn't add value.

Having a process is great, but if the cost of the process exceeds its value, then the process not only harms profit margins but also erodes morale. If the reason a process exists is to counter bad behavior, then it's an employee problem, not a process problem.

Open office floorplans are a terrible idea!

Work from home shouldn't be considered a given based on the job tasks, but a privilege and benefit extended to those employees who have shown the discipline and reliability to work from home. But the in-office requirement shouldn't be forced on everyone just to satisfy a "butts in seats" policy or a manager's insecurity.

[–] FridaySteve@lemmy.world 6 points 21 hours ago (3 children)

Your favorite AI-enabled LLM does a very, very good job of simulating language tests based on previous tests, and there's no reason at all not to use it to study and prepare.

[–] ethaver@kbin.earth 5 points 22 hours ago

Abilify is a beautiful long term maintenance med but wholly inappropriate for an acutely agitated and combative patient.

[–] jjjalljs@ttrpg.network 5 points 18 hours ago

Snapshot tests suck. That's a test that stores the DOM (or, I guess, any JSON-serializable thing) and, when you run the test again, compares what you have now to what it has saved.

No one is going to carefully examine a 300-line JSON diff. They're just going to say "well, I updated the file, so it makes sense that it changed" and slap the update button.

Theoretically you could feed it only very small things, but in that case you could also just assert on what's important yourself.

Snapshots don't encode intent. They make everything look just as important as everything else. And then hotshot developers think they have 100% coverage.
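
To make the contrast concrete, here is a minimal Jest-style sketch (buildNavItems is a stand-in for real code under test, not anything from the comment): the first test pins an entire structure with a snapshot, the second asserts only the behavior someone actually cares about.

```typescript
// Minimal Jest sketch: snapshot assertion vs. intent-revealing assertion.
type NavItem = { label: string; href: string };

// Stand-in for the real code under test.
function buildNavItems(opts: { loggedIn: boolean }): NavItem[] {
  const items: NavItem[] = [{ label: "Home", href: "/" }];
  if (opts.loggedIn) items.push({ label: "Account", href: "/account" });
  return items;
}

test("nav items snapshot", () => {
  // Pins whatever the function returned the first time it ran; a later
  // 300-line diff is easy to wave through with `jest -u` without reading it.
  expect(buildNavItems({ loggedIn: true })).toMatchSnapshot();
});

test("logged-in users see the account link", () => {
  // Encodes the actual intent: a failure points at the behavior that matters.
  const labels = buildNavItems({ loggedIn: true }).map((item) => item.label);
  expect(labels).toContain("Account");
});
```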
