Technology


This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below are allowed; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

  • X, the former Twitter, experienced a worldwide outage as of 12 PM on Monday (CAT).
  • This is likely the company's first major outage since Musk took ownership in 2022.
  • The outage seems to have lasted only about half an hour.

Introduction

Why does Google insist on making its assistant situation so bad?

In theory, Assistant should be the best it's ever been. It's better at "understanding" what I ask for, and yet it's less capable than ever of actually doing it.

This post is a rant about my experience using modern assistants on Android, and why I no longer bother with features I used actively in the mid-to-late 2010s.

The task

Back in the late 2010s, I could hold the home button and ask the Google Assistant to create an event based on the email on my screen. It would grab the context from my screen and do exactly that. As far as I can tell, this has been impossible for years now.
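
To be clear, the underlying plumbing still exists on Android; what is gone is the assistant wiring it up for you. As a rough illustration (a sketch of the general mechanism, not how Assistant actually implemented it), here is a minimal Kotlin snippet that hands a pre-filled event to the calendar app via the standard CalendarContract insert intent; the helper name, title, and times are made-up placeholders:

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.CalendarContract

// Hypothetical helper: opens the calendar app's "new event" screen,
// pre-filled with a title and start/end times (milliseconds since the epoch).
fun insertCalendarEvent(context: Context, title: String, startMillis: Long, endMillis: Long) {
    val intent = Intent(Intent.ACTION_INSERT)
        .setData(CalendarContract.Events.CONTENT_URI)
        .putExtra(CalendarContract.Events.TITLE, title)
        .putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, startMillis)
        .putExtra(CalendarContract.EXTRA_EVENT_END_TIME, endMillis)
    context.startActivity(intent)
}
```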

Trying to find the "right" assistant

At some point, my phone stopped responding to "OK Google". I still don't know why it won't work.

Holding down the Home bar (the home button went the way of the dodo) brings up an assistant-style UI, but it's dumb as bricks and only Googles the web. Useless.

[Screenshot: Home Bar Assistant]

So, I installed Gemini. I asked it to perform a basic task, and it responded that "in live mode, I cannot do that". When I asked how I could get it to create a calendar event, it couldn't answer the question, telling me instead to open my calendar app and create a new event. I know how to use a calendar. I want it to justify its existence by providing more value than a Google search, and it was ultimately unable to do that.

[Screenshot: Gemini Live]

Searching the internet, I found that both of the ways I had been using assistant features were apparently the wrong way to do it. You have to hold down the power button; that's how you launch the proper one. My internal response was:

No, that's for the power menu. I don't want to dedicate it to Assistant.

Well, apparently, that's the only way to do it now, so there I went, sacrificing another convenience to turn it on.
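
For anyone fighting the same battle: the digital-assistant choice lives under the system's default apps settings, and you can jump straight to that screen programmatically. A tiny Kotlin sketch, with the caveat that the exact screen and wording vary by Android version and OEM:

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.Settings

// Shortcut: open the system "Default apps" screen, where the
// "Digital assistant app" picker usually lives on recent stock Android builds.
fun openDefaultAppsSettings(context: Context) {
    context.startActivity(Intent(Settings.ACTION_MANAGE_DEFAULT_APPS_SETTINGS))
}
```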

Pulling teeth with Gemini

So I asked this power-menu version of Gemini to do the same simple task. I tried four separate times.

First, it created a random event called "Meeting with a client" on a completely different day (what?).

The second time, it just crashed with an error.

[Screenshot: Gemini crashes]

The third time, it asked me which email to use, giving me a list, but that list did not contain the email I was interested in. I asked it to find the Royal Mail one. No success.


So, quite clearly, it wasn't using screen content.

I rephrased the question: "Please create an event from the content on my screen". It replied "Sure, when's this for?"

[Screenshot: "Sure, when's it for?"]

I shouldn't have to tell you. That's the point. It's right there.
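
The frustrating part is that Android has had a dedicated mechanism for exactly this for years: the app holding the assistant role can receive a snapshot of the foreground screen when it is invoked. Below is a rough Kotlin sketch of that mechanism, using the older VoiceInteractionSession callback; it's an illustration of how screen context can be collected, not a claim about how Gemini does (or fails to do) it, and apps can opt out of providing assist data:

```kotlin
import android.app.assist.AssistContent
import android.app.assist.AssistStructure
import android.content.Context
import android.os.Bundle
import android.service.voice.VoiceInteractionSession

// Sketch of an assistant session that collects the visible text of the
// foreground app at the moment the user invokes the assistant.
class ScreenContextSession(context: Context) : VoiceInteractionSession(context) {

    override fun onHandleAssist(
        data: Bundle?,
        structure: AssistStructure?,
        content: AssistContent?
    ) {
        // The structure can be null if the foreground app opted out of assist
        // data or the user disabled screen context.
        structure ?: return

        val visibleText = mutableListOf<CharSequence>()
        for (i in 0 until structure.windowNodeCount) {
            collectText(structure.getWindowNodeAt(i).rootViewNode, visibleText)
        }
        // A real assistant would now parse visibleText (e.g. the email on
        // screen) to extract an event title, date, and time.
    }

    private fun collectText(node: AssistStructure.ViewNode, out: MutableList<CharSequence>) {
        node.text?.let { out.add(it) }
        for (i in 0 until node.childCount) {
            collectText(node.getChildAt(i), out)
        }
    }
}
```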

Conclusion

There are too many damn assistant versions, and they are all bad. I can't even imagine what it's like to also have Bixby in the mix as a Samsung user. (Feel free to let me know below.)

It seems like none of them are able to pull context from what you are doing anymore, and you'll spend more time fiddling and googling how to make them work than it would take for you to do the task yourself.

In some ways, assistants have gotten worse than they were almost 10 years ago, despite billions in investment.

As a little bonus, the internet is filled with AI slop that makes finding real facts and real studies from real people harder than ever.

I write this all mostly to blow off steam, as this stuff has been frustrating me for years now. Let me know what your experience has been like below; I could use some camaraderie.

submitted 10 hours ago* (last edited 10 hours ago) by Tea@programming.dev to c/technology@lemmy.world
 
 

Companies are turning to tech solutions to screen candidates. Critics and job seekers have concerns.

 
 

Not much info yet, but I grew up on Digg, so I’m cautiously optimistic. Probably no Fediverse support, but honestly, any Reddit alternative is a win. Really hoping for real API access and third-party apps.

 
 
  • CR researchers were able to easily create a voice clone based on publicly available audio in four of the six products in the test set. These products did not employ any technical mechanisms to ensure researchers had the speaker's consent to generate a clone or to limit the cloning to the user's own voice. These companies (ElevenLabs, Speechify, PlayHT, and Lovo) required only that researchers check a box confirming that they had the legal right to clone the voice, or make a similar self-attestation.
  • Descript and Resemble AI took steps to make it more difficult for customers to misuse their products to create a non-consensual voice clone.
  • Four of the six companies (Speechify, Lovo, PlayHT, and Descript) required only a customer's name and/or email address to make an account.
 
 

cross-posted from: https://lemmy.ca/post/40385572

If you're getting "Untrusted device" on your Chromecast today, you're not alone. It looks like an expired cert.
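
If you want to check this sort of thing yourself, a certificate's validity window is easy to inspect from any JVM language. Here is a small Kotlin sketch that connects to a TLS endpoint and prints when each certificate in the presented chain expires; the hostname is just an example, and a Chromecast's device certificate is validated by the casting client rather than fetched this way, so this only illustrates the general idea:

```kotlin
import java.security.cert.X509Certificate
import javax.net.ssl.SSLSocket
import javax.net.ssl.SSLSocketFactory

// Connects to a TLS endpoint and prints the validity window of each
// certificate in the chain the peer presents.
fun printCertificateExpiry(host: String, port: Int = 443) {
    val socket = SSLSocketFactory.getDefault().createSocket(host, port) as SSLSocket
    socket.use {
        it.startHandshake()
        it.session.peerCertificates
            .filterIsInstance<X509Certificate>()
            .forEach { cert ->
                println("${cert.subjectX500Principal.name}: valid ${cert.notBefore} to ${cert.notAfter}")
            }
    }
}

fun main() {
    // Example host only; substitute whatever endpoint you want to check.
    printCertificateExpiry("example.com")
}
```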

 
 

Pinterest has updated its privacy policy to reflect its use of platform user data and images to train AI tools.

A new clause, published this week on the company's website, outlines that Pinterest will use its patrons' "information to train, develop and improve our technology such as our machine learning models, regardless of when Pins were posted." In other words, it seems that any piece of content, published at any point in the social media site's long history — it's been around since 2010 — is subject to being fed into an AI model.

In the update, Pinterest claims its goal in training AI is to "improve the products and services of our family of companies and offer new features." Pinterest has promoted tools like a feature that lets users search by body type and its AI-powered ad suite, which according to Pinterest's most recent earnings report has boosted ad spending on the platform. The company is also building a text-to-image "foundational" AI model, dubbed Pinterest Canvas, which it says is designed for "enhancing existing images and products on the platform."

The platform has stressed that there is an opt-out button for the AI training, and says it doesn't train its models on data from minor users.

...

Soon after we reached out to Pinterest with questions about the update, we were contacted by a spokesperson who insisted that it wasn't newsworthy because it simply codifies things Pinterest was already doing. Later, the company provided us with an emailed statement.

"Nothing has changed about our use of user data to train Pinterest Canvas, our GenAI model," read the statement. "Users can easily opt out of this use of their data by adjusting their profile settings."

Pinterest was already training its AI tools with user data, as the company touches on in this Medium post about Canvas, but the practice is now codified in the platform's terms of service.
