Deestan

joined 2 years ago
[–] Deestan@lemmy.world 39 points 11 hours ago (5 children)

The energy required to capture and store one ton of CO2 is many, many times higher than the energy gained from generating one ton of CO2 (by burning oil, gas, coal, or biofuel).

So each MWh spent "storing CO2" would be ten times more efficient if used instead to offset oil extraction, getting one MWh less out of the ground in the first place.
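A back-of-envelope sketch of that comparison (the tenfold energy ratio is the assumption from the paragraph above, not a measured figure, and the numbers are purely illustrative):

```python
# Illustrative sketch only: the 10x ratio is assumed, not measured.
capture_energy_per_ton = 10.0    # MWh assumed to capture and store 1 ton of CO2
generation_energy_per_ton = 1.0  # MWh assumed gained while emitting 1 ton of CO2

# Option A: spend 1 MWh on capture and storage
tons_removed = 1.0 / capture_energy_per_ton        # 0.1 tons of CO2 removed

# Option B: spend that same 1 MWh displacing fossil generation instead
tons_avoided = 1.0 / generation_energy_per_ton     # 1.0 tons of CO2 never emitted

print(f"Avoiding beats removing by {tons_avoided / tons_removed:.0f}x per MWh spent")
```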

This is wasteful greenwashing. If it wasn't, we'd have broken physics on the level of making perpetual motion machines.

[–] Deestan@lemmy.world 9 points 23 hours ago

I have briefly scanned the headline and am ready to share my immediate opinion on this significant political issue that I did zero research on. Where do I put it?

[–] Deestan@lemmy.world 51 points 1 day ago* (last edited 1 day ago)

Here's an expensive thing!

What value does it have?

...you figure it out!

I am not impressed.

:o

[–] Deestan@lemmy.world 12 points 2 days ago (1 children)

The lemmyverse isn't huge, but it is broad.

Unless you give some examples, people will probably respond based on what they've seen somewhere other than the place you were thinking of.

My best attempt at an answer is that lemmy has fewer long-term, well-behaved residents, so the people who came here from other platforms because they kept getting banned for bad behavior, or just kept not getting along, stick out more.

[–] Deestan@lemmy.world 4 points 3 days ago

Sir, this is a Wendy's

[–] Deestan@lemmy.world 39 points 3 days ago (3 children)

This reads like OpenAI's fanfic about what happened, retconning decisions they didn't make, things they didn't (couldn't!) do, and thoughts that didn't occur to them. All of it suggesting that being infinitely better is not only possible, but right there for the taking.

For the one in April, engineers created many new versions of GPT-4o — all with slightly different recipes to make it better at science, coding and fuzzier traits, like intuition.

Citation needed.

OpenAI did not already have this test. An OpenAI competitor, Anthropic, the maker of Claude, had developed an evaluation for sycophancy

This reality does not exist: Claude tries to lick my ass clean every time I ask it a simple question, and while the sycophantic language can be toned down, the behavior of coming up with a believable, positive answer to whatever the user brings is the foundational core of LLMs.

“We wanted to make sure the changes we shipped were endorsed by mental health experts,” Mr. Heidecke said.

As soon as they found experts who were willing to say something other than "don't make a chatbot". They now have a sycophantically motivated system with an ever-growing list of sticky notes on its desk: "if sleep deprivation then alarm", "if suicide then alarm", "if ending life then alarm", "if stop living then alarm", hoping there are enough of them to catch the most obvious attempts.

The same M.I.T. lab that did the earlier study with OpenAI also found that the new model was significantly improved during conversations mimicking mental health crises.

The study was basically rigged: it used 18 known and identified crisis chat logs from ChatGPT - meaning exactly the stuff OpenAI had already hard coded "plz alarm" for - plus thousands of "simulated mental health crises" generated by FUCKING LLMs, meaning they only tested whether ChatGPT could identify mental health problems in texts that encode its own understanding of what a mental health crisis looks like. For fuck's sake, of course it did perfectly at guessing its own card.

TL;DR: bullshit damage control

[–] Deestan@lemmy.world 72 points 5 days ago* (last edited 5 days ago) (5 children)

Business idiots are killing jobs. Generative AI is just their excuse for doing it, and a threat to make people feel more replaceable.

It's on the verge of pedantic, but I feel it important that we blame people for lying and causing harm, and not let them hide behind the imagined inevitability of tech and progress.

Generative AI can't replace shit, but the lie that it can and does is wielded as a weapon more than the tech itself.

[–] Deestan@lemmy.world 9 points 6 days ago* (last edited 6 days ago) (1 children)

This platform would have to have all the same functions

This expectation comes from inertia, not need. No system, thing, or product can ever satisfy every little checkbox someone carries over from another product.

Also, both you and your friends are older, different people now. The old magic will not come back. Figure out what you actually need and find something new that will be good at that.

Facebook was new and confusing once. If you can face that again, you'll find beautiful things.

[–] Deestan@lemmy.world 12 points 1 week ago (1 children)

A $10 charity donation in his name

[–] Deestan@lemmy.world 3 points 1 week ago* (last edited 1 week ago) (2 children)

Take at least three old socks and fill each with a fistful of dried peas, lentils, or beans. Sew them shut and cut off the empty part. The end result should be roughly ball-shaped, soft objects that fit in your hand.

Now spend at least two hours every day practicing juggling them. Start with one: while staring straight ahead, throw it in an arc from your left hand to your right so that it passes in front of your face. Use circular motions.

The daily physical movement will do wonders for your mood, but most importantly: In a few months you are going to be impressively good at something cool that people around you suck at.

Self-esteem from skill and personal development is more healthy and sustainable than... this.

[–] Deestan@lemmy.world 3 points 1 week ago (1 children)

I always heard their little quips as "My wife for Aiur!"

[–] Deestan@lemmy.world 1 points 1 week ago

You aren't a psycopath with a business degree, I see!

The sphere isn't fully opaque, so we can redirect sunlight to paying subscribers.


Current LLM-in-coding trends keep bringing this post up in my head.

I see a lot of the mentioned "King David fetish" in companies trying to cut salary costs by using more AI tools.

Like, the idea that you can make something worthwhile without strong skill, by just adding enough mediocrity to make up for it.
