wischi

joined 2 years ago
[–] wischi@programming.dev 7 points 1 month ago

Especially to "differentiate yourself" 🤣 it's basically the exact opposite.

[–] wischi@programming.dev 4 points 2 months ago

Probably because a lot of them (especially _iel) use ~~literal~~ letteral translations that nobody in their right mind would use in everyday conversation. Like in this post with "michmichs".

[–] wischi@programming.dev 2 points 2 months ago

No, we don't say that, and I was completely confused until I read the English line.

[–] wischi@programming.dev 11 points 2 months ago

I'm from Europe and I always assumed that America does that, because it's the cheapest option by far.

[–] wischi@programming.dev 4 points 2 months ago

It's not called Meta data by accident 🤣

[–] wischi@programming.dev 3 points 2 months ago (1 children)

Shut that section down and ground the wires. Not really that dangerous. It's only dangerous if you don't follow protocol.

[–] wischi@programming.dev 3 points 2 months ago

Maybe the "Office of" part should be dropped 🤣

[–] wischi@programming.dev 3 points 2 months ago

"Amazingly" fast for bio-chemistry, but insanely slow compared to electrical signals, chips and computers. But to be fair the energy usage really is almost magic.

[–] wischi@programming.dev 2 points 2 months ago (2 children)

But by that definition, passing the Turing test might be the same as superhuman intelligence. There are things that humans can do but computers can't, yet there is nothing a computer can do that it does more slowly than a human. That's because our biological brains are insanely slow compared to computers. So once a computer is as accurate as a human at a task, it's almost instantly superhuman at that task because of its speed. So if we have something that's as smart as humans (which is practically implied, because it's indistinguishable), we would have superhuman intelligence, because it's as smart as humans but (numbers made up) can do 10 days of cognitive human work in just 10 minutes.
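
Just to make those made-up numbers concrete, a quick back-of-the-envelope sketch (the "10 days" and "10 minutes" are the arbitrary figures from above, counted as round-the-clock minutes):

```python
# Hypothetical numbers from the comment above, only to illustrate the throughput argument:
# a system matching human accuracy but needing 10 minutes for 10 days of human work.
human_minutes = 10 * 24 * 60   # 10 days expressed in minutes (round-the-clock assumption)
machine_minutes = 10
print(f"speedup: {human_minutes / machine_minutes:.0f}x")  # ~1440x
```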

[–] wischi@programming.dev 10 points 2 months ago

AI isn't even trained to mimic human social behavior. Current models are all trained by example, so they produce output that would score high in their training process. We don't even know what their goals are (and they're likely not even expressible in language), but (anthropomorphised) they're probably more like "answer something that the humans who designed and oversaw the training process would approve of".

[–] wischi@programming.dev 11 points 2 months ago (4 children)

To be fair, the Turing test is a moving goalpost, because if you know that such systems exist you'd probe them differently. I'm pretty sure that even the first public GPT release would have fooled Alan Turing personally, so I think it's fair to say that these systems have passed the test at least since that point.

[–] wischi@programming.dev 24 points 2 months ago (6 children)

We don't know how to train them to be "truthful" or make that part of their goal(s). Almost every AI we train is trained by example, so we often don't even know what the goal is, because it's implied in the training. In a way, AI "goals" are pretty fuzzy because of the complexity. A tiny bit like in real nervous systems, where you can't just state in language what the "goals" of a person or animal are.
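
A minimal toy sketch of what I mean by "trained by example" (just my own illustration with a tiny logistic-regression model, not how any real lab sets things up): the only training signal is agreement with the examples, so nothing like "be truthful" ever appears explicitly anywhere in the objective.

```python
# Toy "training by example": the loss only rewards matching the example answers,
# so whatever "goal" the model ends up with is implicit in the data, never stated.
import numpy as np

rng = np.random.default_rng(0)

# Made-up dataset: inputs X and example answers y that someone approved of.
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)

w = np.zeros(3)
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # model's current answers
    grad = X.T @ (p - y) / len(y)        # push answers toward the examples
    w -= lr * grad                       # the only objective: imitate the examples

# The trained weights encode whatever regularities were in the examples;
# the "goal" was never written down, it emerged from the training data.
print(w)
```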
