echodot

joined 2 years ago
[–] echodot@feddit.uk 1 points 4 days ago (1 children)

That's not what we're talking about.

The assertion was that even text completion constitutes AI. Which is a mad claim because if you're going to say that text completion is AI then basically everything is AI.

[–] echodot@feddit.uk 2 points 4 days ago* (last edited 4 days ago)

People who lived in the 1960s did not, by definition, live in the 21st century, so their definitions of what things may or may not be are immaterial.

We know what we mean by AI, and attempting to redefine it in the service of some kind of "all sides have a point" fence-sitting is a brainless argument and is definitively unhelpful. Defining AI strictly as "a system that does a thing based on an input" is both overly broad and demonstrably unhelpful. It's like arguing that a building that has been reduced to ash by a fire still contains the same constituent elements. Intellectually it's correct; practically it's ridiculous.

Broadly, you are attempting to define AI as anything that any computerised system does. How can you not see that that is an overly broad definition that skirts anything remotely close to the realm of helpfulness?

[–] echodot@feddit.uk 1 points 4 days ago* (last edited 4 days ago) (4 children)

I'm saying that code completion does not constitute AI and certainly isn't LLMs.

I then provided an example of why that isn't the case.

You decided to respond to this by pointing out that some LLM may be involved in some code completion. Although you didn't provide an example, so who knows if that's actually true, it seems sort of weird to use an LLM for code completion, as it's completely unnecessary and entirely inefficient, so I kind of doubt it.

I just want to point this out for a minute, because it sort of feels like you don't know this: code completion is basically autocomplete for programmers. It's doing basic string matching, so that if you type fnc it completes to function(). Hardly the stuff of AI.
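For what it's worth, the whole mechanism can be sketched in a few lines of plain string matching (the trigger names and snippet bodies here are made up for illustration, not taken from any real editor):

```python
# Minimal sketch of snippet-style code completion: a trigger table
# plus prefix matching. No model involved, just string comparison.

SNIPPETS = {
    "fnc": "function() {}",
    "cls": "class {}",
    "for": "for (let i = 0; i < n; i++) {}",
}

def complete(prefix):
    """Return snippet bodies whose trigger starts with the typed prefix."""
    return [body for trigger, body in SNIPPETS.items()
            if trigger.startswith(prefix)]

print(complete("fn"))  # ['function() {}']
```

Real editors layer fuzzy matching and symbol tables on top of this, but the core lookup is exactly this kind of string comparison.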

[–] echodot@feddit.uk 1 points 4 days ago

My tuxedo cat does not have the advantage of the shared brain cell. He has to get by on zero brain cells.

He has fallen in the pond at least three times, because learning that liquid water does not support his immense weight is, for him, physiologically impossible.

[–] echodot@feddit.uk 1 points 4 days ago* (last edited 4 days ago) (1 children)

I feel like you have never actually developed a game. Because what you're arguing is just weird. It makes no logical sense.

A grey box is the most basic form a game will ever take; it never bears any resemblance to the finished product. It is the barest, most fundamental interpretation of the game's mechanics and systems. The grey box has no bearing on the final result of the game.

No grey box contains any aspect of artistic intent; the art team are never even involved in its creation, it's always just developers doing things. Go look up some game blogs.

[–] echodot@feddit.uk 1 points 4 days ago (2 children)

That's my point. These random definitions of AI, dreamed up by the most pedantic people in existence, are not in any way helpful. We should ignore them.

They seek to redefine AI as basically anything that a computer does. This is entirely unhelpful and is only happening because they need to be right on the internet.

These irritating idiots need to go away for they serve no purpose.

[–] echodot@feddit.uk 1 points 5 days ago (3 children)

The AI label needs to be present if the finished product contains AI generated assets. So AI generated code, or AI generated art.

In the example above you grey boxed with AI but then replaced all the assets with human-made ones. There is no distinction between doing that and just having literal grey boxes.

You couldn't require an AI label in that scenario because it would be utterly unenforceable. How would a developer prove if they did or did not use AI for temporary art?

So yes, you can draw a line: does the finished product contain AI-generated assets? You don't like that definition because you're being pedantic, but your pedantic interpretation isn't enforceable, so it's useless.

[–] echodot@feddit.uk 2 points 5 days ago (4 children)

By that definition a calculator is AI.

[–] echodot@feddit.uk 1 points 5 days ago (6 children)

Emmet has been around since 2015. So it was definitely not LLM backed.

[–] echodot@feddit.uk 0 points 5 days ago (1 children)

No, because AI replaces a human role.

Code completion does not replace a human role, that's like saying that spell check is AI.

[–] echodot@feddit.uk 0 points 5 days ago (5 children)

Since you would never see it, that's pretty much irrelevant. Clearly this is about AI-generated art and AI-generated assets.

Whether or not you use AI to grey box something is a pointless distinction given the fact that there's no way to prove it one way or the other.

[–] echodot@feddit.uk 35 points 5 days ago* (last edited 5 days ago) (16 children)

I'm sure everyone has already explained this to you, given the number of downvotes, but algorithms aren't equal to AI.

Ever since the advent of AI, people seem to have lost the ability to recall things prior to 2019.
