AnyOldName3

joined 2 years ago
[–] AnyOldName3@lemmy.world 6 points 2 days ago

That's not the conclusion the study's authors drew. The particles being airborne for longer means they can float further and contaminate things further away from the toilet, and also are more likely to end up inhaled. That could be a bigger problem than the number of particles initially released, so the study didn't make a recommendation of whether the lid should be up or down. More research is required before anyone should be issuing definitive commands in bold to strangers on the internet.

[–] AnyOldName3@lemmy.world 4 points 2 days ago (2 children)

My comment was explicitly pointing out that closing the lid can have the opposite of the intuitive effect and make things worse even though you'd expect it to make them better. It seems I misrepresented the study's findings, though: closing the lid does make particles remain airborne for much longer, so my overall point is sound, but it also reduces the number of particles that initially become airborne.

[–] AnyOldName3@lemmy.world 16 points 2 days ago

I can't get the full text, but https://www.microbiologyresearch.org/content/journal/acmi/10.1099/acmi.fis2019.po0192 has the abstract. It looks like I misremembered its findings (or remembered an article that oversimplified them), though - having the lid down does something to the released particles to make more of them stay airborne for much longer, but it does reduce the number that escape, like you'd expect.

[–] AnyOldName3@lemmy.world 8 points 2 days ago (7 children)

There's a University of Cork study showing that putting the lid down aerosolises more material and so spreads bacteria etc. over the whole room, whereas having the lid open produces a smaller number of larger droplets that nearly all just fall straight back into the toilet. The lid is not sealing the toilet and removing the need to clean the bathroom.

[–] AnyOldName3@lemmy.world 2 points 4 days ago

With how the law is written, if you think anyone might ever make a mistake (likely), think the government might ever bother going through the hassle of enforcing it (probably less likely if you're not running a big website), and don't have loads of spare money to pay huge fines or an age verification service (likely), then blocking the UK is the only way to be compliant. It doesn't require a technicality. The law just doesn't have any leeway for honest minor mistakes or small hobbyist websites.

[–] AnyOldName3@lemmy.world 3 points 4 days ago

Some instances use the image proxy and others don't. It seems that mine doesn't.

[–] AnyOldName3@lemmy.world 20 points 5 days ago (13 children)

Fun fact: I can't see the screenshot as I'm in the UK and your instance has taken the maximally paranoid literal meaning of the OSA and blocked access just in case anything's accidentally not flagged as NSFW.

[–] AnyOldName3@lemmy.world 18 points 6 days ago (2 children)

The Enterprise D crew was selected specifically to avoid seduction by Riker (otherwise they'd never get anything done), so it takes more than a holodeck malfunction to make them start an orgy.

[–] AnyOldName3@lemmy.world 1 point 1 week ago

You not mentioning LLMs doesn't mean the post you were replying to wasn't talking about LLM-based AGI. If someone responds to an article about the obvious improbability of LLM-based AGI with a comment about the obviously make-believe genie, the only obviously make-believe genie they could be referring to is the one from the article. If they're referring to something outside the article, there's nothing more to suggest it's non-LLM-based AGI than there is Robin Williams' character from Aladdin.

[–] AnyOldName3@lemmy.world 2 points 1 week ago (2 children)

AGI being possible (potentially even inevitable) doesn't mean that AGI based on LLMs is possible, and it's LLMs that investors have bet on. It's been pretty obvious for a while that certain problems that LLMs have aren't getting better as models get larger, so there are no grounds to expect that just making models larger is the answer to AGI. It's pretty reasonable to extrapolate that to say LLM-based AGI is impossible, and that's what the article's discussing.

[–] AnyOldName3@lemmy.world 89 points 1 week ago (2 children)

He'd already contributed to humanity by writing the definition of wanker for the Oxford English Dictionary by the age of 28, though.

[–] AnyOldName3@lemmy.world 8 points 1 week ago (4 children)

Plenty of people lack confidence or have an anxiety disorder, so would be predisposed to assuming that hanging out with them was a burden without any grounds to think so, and potentially feel like they need to include an apology in any invitation. It's obviously not healthy, but it doesn't mean that they're right and that the advances are unwanted.
