Zagorath


Most of the threads I've found on other sites (both Reddit and the Synology forums) have basically said "go with Docker". But what do you actually gain from this?

People suggest it's more up-to-date, and maybe for some packages that's true? But for Nextcloud specifically, the package looks pretty current. 32.0.3 came out 1 day ago and isn't yet supported, but the version immediately preceding it, from 3 weeks ago, is.

I've never done Nextcloud before, but I would assume the Package Center version would be way easier to install and keep up-to-date than Docker. So what's the reason everyone recommends Docker? Is it easier to extend?
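For context, my mental model of the Docker workflow people keep recommending is something like this (hypothetical commands, assuming a compose-based setup with a pinned image tag already in place):

```shell
# Hypothetical update flow for a compose-based Nextcloud install.
# Assumes a docker-compose.yml in the current directory pinning an
# image tag like nextcloud:32.0.2.
docker compose pull      # fetch the newer image for the pinned tag
docker compose up -d     # recreate the container on the new version
docker image prune -f    # optionally clean up the superseded image
```

If that's roughly it, I can see the appeal, but it still doesn't seem obviously easier than clicking "update" in Package Center.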

[–] Zagorath@aussie.zone 1 points 15 hours ago

The fact is, right now we know that Facebook has at times made a deliberate, conscious choice to leave in aspects of their algorithm that were causing harm. Their own studies have shown this. Making that practice illegal—knowingly causing harm with your algorithm—would be a good place to start with regulation.

[–] Zagorath@aussie.zone 1 points 16 hours ago (2 children)

no age can go on the internet.

I don't think anyone has ever suggested anything like that.

[–] Zagorath@aussie.zone 1 points 1 day ago

it’s way too obvious that this law ain’t gonna achieve its stated aim

Absolutely. See my much longer comment elsewhere in the thread for all the real problems with this bill. We don't need conspiracy theories. Hanlon's razor very much applies here. It's incompetence, not malice.

However, I think we can look at the worst part of this Bill—the nature of its passage through Parliament—for a clue as to its underlying purpose. It passed in just a week, right before Christmas last year, but didn't actually come into effect until yesterday. The goal was good PR. I suspect not rattling cages with the big social media companies was part of it too. They wanted to look like they were doing something to protect kids, and hopefully win the election off the back of it (not that they needed much help with that, with how incompetent the LNP were), but they didn't want to put up the fight that would be necessary to force the social media companies into actually making their algorithms less harmful...to children and adults. It's lazy, it's cowardly, it won't work. But it's not a secret ploy to spy on you.

[–] Zagorath@aussie.zone 2 points 1 day ago (2 children)

with Labor getting over half the lower house seats from about a third of the votes

Yikes. This is some really dangerous misinformation. Labor received 55% of the votes. Because we use an actual democratic system, not the FPTP farce that America and the UK have. You cannot compare first preferences in IRV to votes in FPTP.

[–] Zagorath@aussie.zone 7 points 1 day ago (5 children)

Step one is stuff like this, require id to verify your age

Right, but the law doesn't do that. In fact it was specifically forbidden from doing that. Here's the full text of the Bill. Section 63DB specifically says:

(1) A provider of an age-restricted social media platform must not:
(a) collect government-issued identification material; ...

(2) Subsection (1) does not apply if:
(a) the provider provides alternative means...for an individual to assure the provider that the individual is not an age-restricted user

In plain language: you can only accept ID to verify age if you also have some other method of verifying age instead.

So far, it looks like most sites are relying on data they already have: the age of your account, the type of content you post, etc. I have not heard of a single adult being hit with a request to verify their age anywhere other than Discord, and even on Discord, it's only when trying to view NSFW-tagged channels. (Which is an 18+ thing, and completely unrelated to this law, which is 16+ for all social media. And Discord has been officially classified as a chat app rather than social media, so the law doesn't apply to it anyway.)

It also says, in 63F:

(1) If an entity:
(a) holds personal information about an individual that was collected for the purpose of, or for purposes including the purpose of, taking reasonable steps to prevent age-restricted users having accounts with an age-restricted social media platform; and
(b) uses or discloses the information otherwise than:
(i) for the purpose of determining whether or not the individual is an age-restricted user; or ...
(iii) with the consent of the individual, which must be in accordance with subsection (2);
the use or disclosure of the information is taken to be:
(c) an interference with the privacy of the individual for the purposes of the Privacy Act 1988; ...

(2) For the purposes of subparagraph (1)(b)(iii): (a) the consent must be:
[(i–v) voluntary, informed, current, specific, and unambiguous]; and
(b) the individual must be able to withdraw the consent in a manner that is easily accessible to the individual.

(3) If an entity holds personal information about an individual that was collected for the purpose of, or for purposes including the purpose of, taking reasonable steps to prevent age-restricted users having accounts with an age-restricted social media platform, then:

(a) the entity must destroy the information after using or disclosing it for the purposes for which it was collected

In other words, whatever information you collect to do the age verification, unless you already have it, with the user's consent, for some other purpose, you must not store their information.

It would not have been hard to just not include that part of the law. Some privacy advocates would have spoken up about it, but the general public would have probably brushed it off. No, they included that because this isn't about information harvesting. It's a misguided but genuine attempt to protect kids. And, if you're looking for a more cynical spin on it, it's to win some good PR with people for being able to say they're protecting kids, while also not doing anything that would substantially hurt big tech's bottom line...like regulating the algorithms themselves.

But again, you mentioned the US government. What does that have to do with this? This is a law passed in Australia, by the Australian government. An entirely different country, and one with an actually functioning government and legislature.

[–] Zagorath@aussie.zone 17 points 1 day ago (8 children)

First of all, what does the US government have to do with this?

Second, I made quite a detailed comment. Which bits do you disagree with and why?

[–] Zagorath@aussie.zone 2 points 1 day ago (4 children)

Most people are viewing this as a good way to prevent misinformation brainwashing

If only the government had actually made that the law. Crack down on the harmful algorithms that commercial social media use, instead of this shit.

[–] Zagorath@aussie.zone 3 points 1 day ago

My server just took a vote and changed our one NSFW channel to no longer be marked NSFW. All we used that channel for was posting slightly sex-themed memes anyway.

[–] Zagorath@aussie.zone 3 points 1 day ago (1 children)

I personally see zero downsides to this

I would encourage you to read my comment in another thread. There's the beginning of a good idea in this legislation, but nearly everything about how it's actually done is awful.

[–] Zagorath@aussie.zone 9 points 1 day ago (2 children)

No kids getting fined or arrested for using VPNs or buying accounts off others

It's actually explicitly not going to do that. The social media companies are the only ones with any legal burden here. That's the intent, and you don't need to go into cooker nonsense to justify it. It's no different from how a harm reductionist approach to drugs involves targeting dealers, not people buying for personal use.

[–] Zagorath@aussie.zone 30 points 1 day ago (10 children)

What do you guys think about this? I think its a big positive.

It's not. But not for the reason you say.

I get why they do it. Its to make people upload identification documents

This is just some conspiracy theory nonsense. The law specifically says that photo ID cannot be the only way users can verify themselves. And it also says that any uploaded documents must not be used for any other purpose. No, the reason behind the law is exactly what they say it is: to protect kids. They're just really bad at their job and don't understand the ways this law will not accomplish that goal.

I'll repost some of my comments from elsewhere:

The ultimate goal is a good one. Keep kids safe from dangerous social media algorithms. The method used to arrive at it...the Government did the wrong thing at pretty much every opportunity they possibly could.

Step 1: the government should have considered regulating the actual algorithms. We know that Facebook has commissioned internal studies which told them certain features of their algorithm were harmful, and they decided to keep it that way because it increased stickiness a little bit. Regulate the use of harmful algorithms and you fix this not just for children, but for everyone.

Step 2: if we've decided age verification must be done, it should be done in a way that preserves as much privacy and exposes people to as little risk as possible. The best method would be laws around parental controls. Require operating systems to support robust parental controls, including an API that web browsers and applications can access. Require social media sites and apps to access that API. Require parents to set up their children's devices correctly.

Step 3: if we really, really do insist on doing it by requiring each and every site do its own age verification, require it be done in privacy-preserving ways. There are secure methods called "zero-knowledge proofs" that could be used, if the government supported it. Or they could mandate that age verification is done using blinded digital signatures. This way, at least when you upload your photo or ID to get your age verified, the site doesn't actually get to know who you are, and the people doing the age verification don't get to know which sites you're accessing.
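As a rough sketch of what I mean by blinded signatures (my own illustration, not anything in the Bill): in Chaum-style RSA blinding, the verifier holds an RSA key $(n, e, d)$ and signs a token it cannot read, so it never learns which site the token is for:

```latex
% User picks a random blinding factor r coprime to n, blinds token m,
% gets it signed, then unblinds. The signer never sees m itself.
\begin{aligned}
\text{blind:}   \quad m' &\equiv m \cdot r^{e} \pmod{n} \\
\text{sign:}    \quad s' &\equiv (m')^{d} \equiv m^{d} \cdot r \pmod{n} \\
\text{unblind:} \quad s  &\equiv s' \cdot r^{-1} \equiv m^{d} \pmod{n}
\end{aligned}
```

The site receiving $(m, s)$ can check $s^{e} \equiv m \pmod{n}$ against the verifier's public key without the verifier ever being able to link that token back to the signing session.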

Step 4: make it apply to actually-harmful uses of social media, not a blanket ban on literally everything. Pixelfed is not harmful in the way Instagram is. It just isn't. It doesn't have the same insidious algorithms. Likewise Mastodon compared to Xitter. And why does Roblox, the site that has been the subject of multiple reports into how it facilitates child abuse, get a pass, while Aussie.Zone has to do some ridiculous stuff to verify people's ages? Not to mention Discord, which is clearly social media, and 4chan, which is...4chan.

Step 5: consider the positive things social media can do. Especially for neurodiverse and LGBTQ+ kids, finding supportive communities can be a literal life-saver, and social media is great at that.

Step 3.5: look at the UK. Their age restriction has been an absolute failure. People using footage from video games to prove they're old enough. Other people having their documents leaked because of insecure age verification processes and companies keeping data they absolutely should not be holding on to.

And perhaps most importantly:

Step 0: Transparent democratic processes

Don't put up legislation and pass it within 1 week. Don't restrict public submissions to a mere 24 hours. Don't spend just 4 hours pretending to consider those public submissions that did manage to squeeze into your tight timeframe. There is literally no excuse for a Government to ever act that fast (with a possible exception for quick responses to sudden, unexpected, acute crises, which this definitely is not). Good legislation takes time. Good democratic processes require listening to and considering a broad range of opinions. Even if everything the legislation delivered actually ended up perfect, this would be an absolutely shameful piece of legislation for the untransparent, anti-democratic way in which it was passed into law.

And that's not to mention the fact that in some ways, not having an account is making things more dangerous. Like how porn bans in other countries have basically just amounted to PornHub bans, with people able to ignore it by going to shadier sites with far worse content on them and less content moderation. And I've seen a number of parents point to YouTube in particular, saying that when their kids had an account, they were able to see the kids' watch history, and could tell the YouTube algorithm to stop recommending specific channels or types of content. Without an account, you can't do that.

And, naturally, we're already seeing cases of kids passing despite being under-age. 11 year-olds who get told they look 18. A 13 year-old whose parent said they could pass for 10, who—just by scrunching his face up a bit—got the facial recognition to say he's 30+. Shock-horror, facial recognition is not a reliable determiner of age. It never should have been allowed.

[–] Zagorath@aussie.zone 1 points 1 day ago

in most countries that would be solved by good regulations

Quite likely both, actually. Good regulations help reduce the chance of it happening, but if it does happen, damage is done. Regulations might mean they receive a fine, but that doesn't make the victim of their negligence whole. Medical bills aren't all there is to it. There's the cost of pain and suffering. Probably time off work. (And having good leave policies doesn't necessarily help, because now that's leave she's used for this that she can't use if she later needs to for another reason.) Cost of repair/cleaning the car. Lawsuits would still happen.

And anyway, I'm not defending American anti-regulation bs. I'm defending people's right to sue companies that wronged them. In the absence of good regulations protecting consumers, suing a company that did the wrong thing isn't "absolute chaos". There is no "absolute chaos of lawsuit nonsense". That is corporate propagandistic bullshit.

 

Text transcription: A series of Tweets, each a reply to the previous.

  1. ABC News @ABC: Scientists have discovered a giant new species of stick insect in Australia, which is over 15 inches long and researchers say may be the heaviest insect in the country. [With a picture of a brown stick insect among some green leaves.]
  2. mary @theoceanblooms: can I ask a question: how does something like this go undiscovered until now
  3. soul nate @MNateShyamalan: Entomologist here 🙋‍♂️🤓🐜 Great question! It may seem surprising that the scientific community could miss an entire bug species after all this time, especially when it's THIS big. The answer might surprise you more 👀 Let's dive in 👇🧵 (1/?)
  4. soul nate @MNateShyamalan: he look like stick (2/2)
 

I'm currently trying to install Docker on my old Raspberry Pi (3 Model B+) to host some personal projects. When I run docker run hello-world, I get:

Unable to find image 'hello-world:latest' locally
docker: Error response from daemon: Get "https://registry-1.docker.io/v2/library/hello-world/manifests/sha256:ec153840d1e635ac434fab5e377081f17e0e15afab27beb3f726c3265039cfff": dial tcp [2600:1f18:2148:bc00:eff:d3ae:b836:fa07]:443: connect: network is unreachable

My Internet connection does not support IPv6 at all, which would explain why this error occurs. But how do I force docker pull to only use IPv4?
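In case it helps: I haven't found a docker pull flag for this, so the two workarounds I'm aware of (both hypothetical, both needing root on the Pi) are disabling IPv6 on the host entirely, or pinning an IPv4 address for the registry in /etc/hosts:

```shell
# Option 1 (hypothetical): disable IPv6 on the host so connections
# fall back to IPv4.
sudo sysctl -w net.ipv6.conf.all.disable_ipv6=1
# Persist across reboots:
echo 'net.ipv6.conf.all.disable_ipv6=1' | sudo tee -a /etc/sysctl.conf

# Option 2 (hypothetical): pin an IPv4 address for the registry.
# Look up a current A record first; the address below is only a
# placeholder and will go stale.
getent ahostsv4 registry-1.docker.io
echo '198.51.100.1 registry-1.docker.io' | sudo tee -a /etc/hosts
```

Option 2 is brittle because the registry's addresses rotate, so Option 1 seems like the safer bet if nothing on the network actually needs IPv6.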

 

Transcription: The GM: *Makes a clearly overpowered monster, intending for the party to flee.*

The Party:

[Picture with the text "Hit him with your crossbow Steve!" overlaid, of a large octopus/squid-like creature with tentacles raised out of the ocean. It towers over a pair of humanoid figures, one holding a staff in one hand and pointing at the squid with the other, the other person aiming a crossbow at it.]

 

Transcription: Fighter: So uhhhhh... you got a pretty neat weapon there.

Artificer: Thanks, designed "Ol Buzzy" myself!

Fighter: Mind if I give her a go?

Artificer: Sure, but you need any pointers?

Fighter: Naaaaaw, I can figure it out.

Artificer: *To the DM* CAN he figure it out?

DM: *To fighter* ...roll me a wisdom check.

Fighter: *Nat 1*

DM, Artificer, and Fighter in unison: Hoo boy.

[A picture of a man starting a chainsaw while the blade is placed between his legs, resting on his crotch.]

 

It's been down for me most of today, as far as I can see. Have its admins made any public statements?

 

I realise this is a very niche question, but I was hoping someone here either knows the answer or can point me to a better place to ask.

My @DailyGameBot@lemmy.zip uses Puppeteer to take screenshots of the game for its posts. I want to run the bot on my Synology NAS inside of a Docker container so I can just set it and forget it, rather than needing to ensure my desktop is on and running the bot. Unfortunately, the Synology doesn't seem to play nicely with Puppeteer's use of the Chrome sandbox. I need to add the --no-sandbox and --disable-setuid-sandbox flags to get it to run successfully. That seems rather risky and I'd rather not be running it like that.

It works fine on my desktop, including if run in Docker for Windows on my desktop. Any idea how to set up Synology to have the sandbox work?
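For what it's worth, the workaround I've been considering (untested on DSM, adapted from Puppeteer's general advice for running in Docker) is to run the container with a Chrome-compatible seccomp profile instead of disabling the sandbox:

```shell
# Hypothetical: run the bot's container with a seccomp profile that
# allows the syscalls Chrome's sandbox needs, instead of --no-sandbox.
# chrome.json here is a placeholder for a Chrome seccomp profile you
# would need to download onto the NAS first.
docker run -d \
  --security-opt seccomp=/volume1/docker/chrome.json \
  my-dailygame-bot
```

No idea yet whether DSM's Docker UI exposes --security-opt, or whether you'd have to launch it over SSH like this instead.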

 

I've written a bot for !dailygames@lemmy.zip that I'm currently just running on my desktop. But I'd like to be able to set and forget it (except for when I do updates) by running it on my Synology NAS.

How can I best pull the node app from GitHub and run it on my Synology, preferably starting automatically if the Synology is restarted?
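The rough flow I have in mind (hypothetical paths and repo name, run over SSH on the NAS) is:

```shell
# Hypothetical setup on the NAS, over SSH. Paths and repo URL are
# placeholders, not my actual ones.
cd /volume1/docker
git clone https://github.com/example/dailygame-bot.git
cd dailygame-bot
npm ci   # install dependencies pinned by package-lock.json

# For start-on-boot, I'm guessing DSM's Task Scheduler can run a
# "Triggered" task at Boot-up with a user-defined script like:
#   cd /volume1/docker/dailygame-bot && node index.js
```

Is Task Scheduler the right tool for that last part, or is there a cleaner way?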
