FauxLiving

joined 9 months ago
[–] FauxLiving@lemmy.world 1 points 8 hours ago

For how good the game looks, it runs amazingly.

[–] FauxLiving@lemmy.world 2 points 8 hours ago (1 children)

Are you using GE-Proton?

I had stability issues with Proton and Proton Experimental.

[–] FauxLiving@lemmy.world 3 points 8 hours ago

It works fine.

Arch

The setup was just "install it and press play."

I did add the ENVs for HDR.

If anyone knows an easier way to manage all of my Steam games' Wine versions and command-line arguments, lmk

[–] FauxLiving@lemmy.world 7 points 9 hours ago (2 children)

It's okay to be scared; still do the right thing and make jokes about it.

[–] FauxLiving@lemmy.world 5 points 10 hours ago (1 children)

They're overestimating the costs. 4x H100s and 512GB of DDR4 will run the full DeepSeek-R1 model; that's about $100k in GPUs and $7k in RAM. It's not something you're going to have in your homelab (for a few years at least), but it's well within the budget of a hobbyist group or a moderately sized local business.

Since it's an open-weights model, people have made quantized and distilled versions of it. Quantization stores each weight in fewer bits, and the distills have far fewer parameters, so the RAM requirements drop a lot.
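Rough back-of-the-envelope math, as a sketch (the parameter counts are my assumptions from the published model cards, and this only counts weight storage, not KV cache or runtime overhead):

```python
# Ballpark memory arithmetic for model weights only (ignores KV cache,
# activations, and runtime overhead).
def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GB for a given parameter count and precision."""
    return n_params * bits_per_weight / 8 / 1e9

models = {
    "DeepSeek-R1 (671B, full)": 671e9,  # assumption: ~671B total parameters
    "8B distill": 8e9,                  # e.g. the Qwen3-8B-based distill
}

for name, params in models.items():
    for bits in (8, 4):
        print(f"{name} @ {bits}-bit: ~{weight_memory_gb(params, bits):.0f} GB")

# Prints roughly 671 GB and 336 GB for the full model, 8 GB and 4 GB for the distill.
```

That ~671GB figure is why the full model needs the multi-GPU box above, while an 8B distill at 4-bit fits comfortably next to a 12GB consumer card.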

You can run quantized versions of DeepSeek-R1 locally. I'm running deepseek-r1-0528-qwen3-8b on a machine with an NVIDIA RTX 3080 (12GB) and 64GB of RAM. Unless you're paying for an AI service and using its flagship models, it's pretty hard to tell apart from the full model.

If you're coding or doing other tasks that push the model, it'll stumble more often, but for a 'ChatGPT'-style interaction you couldn't tell the difference between it and ChatGPT.
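For anyone who wants to try it, here's a minimal sketch using llama-cpp-python with a GGUF quantization of the 8B distill; the file name, context size, and GPU-layer setting are placeholders for my setup, so adjust them for your hardware.

```python
# Minimal local-inference sketch (pip install llama-cpp-python).
# Assumptions: you've downloaded a GGUF quantization of an R1 distill; the path
# below is a placeholder, and n_gpu_layers/n_ctx should be tuned to your VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="models/deepseek-r1-0528-qwen3-8b-Q4_K_M.gguf",  # placeholder filename
    n_gpu_layers=-1,  # offload as many layers as fit on the GPU (-1 = all)
    n_ctx=8192,       # context window; bigger costs more RAM/VRAM
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what quantization does to an LLM."}],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```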

[–] FauxLiving@lemmy.world 1 points 10 hours ago

Yeah, I used it until they rolled it into the business accounts (which I upgraded to in order to dodge data caps and have a symmetrical connection, because bittorrent).

[–] FauxLiving@lemmy.world 3 points 23 hours ago (1 children)

It's because, historically, humanity as a whole is a bunch of subtle and devious con artists wearing different hats and masks. Naturally, anything trained on the output of such a species would adopt its traits.

[–] FauxLiving@lemmy.world 4 points 23 hours ago (1 children)

Everything is poorly lit, the dialogue is inaudible, and all the other sound is way too loud.

The thing you're noticing is that they're mastering movies for home theater setups and then everyone else gets a bad re-encode.

When you're watching a non-HDR 1080p version with stereo sound over a streaming service's low-quality codec, you're missing a lot more than if you had an HDR1400 4K OLED and a 7.1 Atmos setup with a Blu-ray encode of the movie.

The problem is that now there is just such a large gap between 'smartphone on a slow connection' and '$80,000 home theater' that it's hard to make content that pushes the latter while still being viewable on the former.

[–] FauxLiving@lemmy.world 21 points 1 day ago

Oh no, where will I go now to have 12-year-olds tell me to kill myself while shouting racial slurs at my base.

[–] FauxLiving@lemmy.world 1 points 1 day ago (3 children)

I'm 37. I'm not old.

Well, I can't just go around calling you man.
