this post was submitted on 15 May 2025
888 points (98.8% liked)

This week YouTube hosted Brandcast 2025, at which it showed marketers how they could make better use of the platform to connect with customers.

A few new so-called innovations were announced at the event, but one has caught the attention of the internet: Peak Points. This new product uses Gemini to detect "the most meaningful, or 'peak', moments within YouTube's popular content to place your brand where audiences are the most engaged".

Essentially, YouTube will use Gemini, and probably the replay heatmap generated on videos as viewers skip to popular moments, to determine where to place advertising. Anybody who grew up watching terrestrial television, where ad breaks arrive just as the suspense builds, will understand how annoying Peak Points could become.
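For the curious, here is a minimal sketch of what "place the ad at the peak" could look like if it were driven by the replay heatmap alone. Everything here (the heatmap format, the function names, the five-second lead) is an assumption for illustration, not YouTube's actual implementation:

```python
# Hypothetical sketch: choose an ad slot from a replay heatmap.
# The data format and names are illustrative assumptions,
# not YouTube's actual system.

def pick_ad_slot(heatmap: list[float], lead_seconds: int = 5) -> int:
    """Return a timestamp (in seconds) just before the most-replayed moment.

    heatmap[i] is the relative replay intensity at second i of the video.
    Cutting to an ad right before the peak interrupts viewers at maximum
    engagement, which is exactly what has people annoyed.
    """
    peak_second = max(range(len(heatmap)), key=lambda i: heatmap[i])
    return max(0, peak_second - lead_seconds)

# Example: a 10-second clip whose replay activity spikes at second 7.
heatmap = [0.1, 0.1, 0.2, 0.3, 0.4, 0.5, 0.8, 1.0, 0.6, 0.2]
print(pick_ad_slot(heatmap))  # -> 2, i.e. 5 seconds before the peak
```

The point of the sketch is that the heatmap alone already identifies the peak; where Gemini could plausibly add something is in scoring videos that have no replay data yet, as one commenter notes below.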

[–] Ulrich@feddit.org 124 points 1 day ago (6 children)

I don't understand why this needs AI. I'm guessing this is just more marketing nonsense. You can already see the "most engaged moments" by simply hovering over the timeline.

[–] eager_eagle@lemmy.world 67 points 1 day ago (1 children)

That's because it doesn't. Just don't tell the investors.

[–] Ulrich@feddit.org 23 points 1 day ago (2 children)

At some point you would think the investors would get upset about all the lying...

[–] thejoker954@lemmy.world 8 points 1 day ago

Too greedy. They want all the money so bad they will believe any conman.

[–] dustyData@lemmy.world 3 points 1 day ago

Doesn't matter, because they get a cut every time they let their friends lie to the board, and executives get a cut every time they appear to be approving something. No one is personally liable for the lie, and those selling it collect bonuses on every contract until they can sell the company to the next bag holder. It's all imaginary power plays to funnel money.

[–] brucethemoose@lemmy.world 24 points 1 day ago* (last edited 1 day ago)

Google’s been deploying engagement models since before anyone even knew the name OpenAI.

This is old-school machine learning, driven by viewing metrics from users. Gemini is just a brand.

[–] LastYearsIrritant@sopuli.xyz 6 points 1 day ago

That only works after the video is out and has usage statistics.

This could theoretically start to identify those moments before the video is public.
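That prediction step is the plausible value-add: scoring content before any replay data exists. A purely hypothetical sketch of the idea (the scoring function and features are invented for illustration, standing in for whatever model Google might actually use):

```python
# Purely hypothetical: predict the peak moment of an unreleased video
# from its content alone, with no usage statistics.

def predicted_engagement(segment_text: str) -> float:
    # Toy heuristic stand-in for a real learned model: pretend
    # exclamations and questions correlate with engagement.
    return segment_text.count("!") + segment_text.count("?")

def predict_peak(segments: list[str]) -> int:
    """Return the index of the segment predicted to be most engaging."""
    return max(range(len(segments)),
               key=lambda i: predicted_engagement(segments[i]))

segments = [
    "Welcome back to the channel.",
    "Today we're testing something wild!",
    "No way... did that actually work?!",
    "Thanks for watching.",
]
print(predict_peak(segments))  # -> 2
```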

[–] mctoasterson@reddthat.com 4 points 1 day ago (2 children)

They already do it in podcasts and it is usually extremely ham-fisted. The presenter will be mid-sentence talking about something and suddenly IMPROVE YOUR DIET WITH FACTOR

[–] TwistedCister@lemm.ee 2 points 1 day ago

Or stand-up specials. So much comedy on YouTube, and they just drop Dick Pills ^TM^ commercials in the middle of punchlines.

[–] half@lemy.lol 3 points 1 day ago (2 children)

I mean... an "if" statement is technically AI, so investors can see the buzzword and all Google has to code is "if most engaged moment, then play ad" lol
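Taken literally, the joke really is a one-liner. A tongue-in-cheek sketch, with every name made up for illustration:

```python
# Hypothetical one-liner "AI", per the joke above.
def maybe_play_ad(current_second: int, most_engaged_second: int) -> None:
    if current_second == most_engaged_second:  # "if most engaged moment"
        print("playing ad")                    # "then play ad"

maybe_play_ad(7, 7)  # -> playing ad
```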

[–] acosmichippo@lemmy.world 5 points 1 day ago* (last edited 1 day ago)

conditional and logical expressions have been the foundation of computing since the very beginning. you are using a definition of "AI" that is completely divorced from that history.

[–] Ulrich@feddit.org 0 points 1 day ago (1 children)
[–] Fiery@lemmy.dbzer0.com 4 points 1 day ago (2 children)

It's a computer (Artificial) making a choice that is better than random (Intelligence)

What it is not is an LLM (aka chatbot). It isn't even any type of neural network. That doesn't make it any less AI, though.

[–] acosmichippo@lemmy.world 5 points 1 day ago* (last edited 1 day ago)

No, it's a computer making a computation. The programmer is the one using intelligence by choosing the appropriate computation for the situation at hand.

[–] Ulrich@feddit.org 4 points 1 day ago* (last edited 1 day ago) (1 children)

> It's a computer (Artificial) making a choice that is better than random (Intelligence)

That's not what AI is, that's just programming. AI implies the software was trained on a dataset in order to make pseudo-decisions on its own about the best way to do things.

[–] dr_robotBones@reddthat.com 1 points 1 day ago (2 children)

That's machine learning; AI just means artificial intelligence.

[–] Ulrich@feddit.org 4 points 1 day ago (1 children)

Machine learning is artificial intelligence...

[–] dr_robotBones@reddthat.com 4 points 1 day ago (1 children)

It's a subset of artificial intelligence, not the only type.

[–] Ulrich@feddit.org 1 points 1 day ago (1 children)

Right, but basic automation is not artificial intelligence.

[–] dr_robotBones@reddthat.com -1 points 1 day ago (1 children)

I wonder where we can draw the line. It's weird because the definition of AI is continuously changing, even though it's two self-evident words: artificial intelligence.

[–] aim_at_me@lemmy.nz 4 points 1 day ago

If we took the words at face value, I don't think we could label anything we've built AI.

[–] acosmichippo@lemmy.world 4 points 1 day ago

And artificial intelligence is still not basic programming.