TLJR: Basic Online Safety Expectations

The Australian government has opened consultation on its proposed set of Basic Online Safety Expectations (BOSE) that form part of the recently passed Online Safety Act 2021.

I read through it and did a Too Long; Justin Read thread on it, which I’ve captured here for posterity.

I encourage you to make a submission to the consultation.

Let’s Read!

Fine. Let’s #tljr the Basic Online Safety Expectations (BOSE) the government wants to pass. Docs are here if you want to read along: https://www.communications.gov.au/have-your-say/draft-online-safety-basic-online-safety-expectations-determination-2021-consultation

First of all, this is a Determination, which is something the Minister just gets to do whenever they feel like it. There’s no voting on this. It’s “delegated legislation”. The power to do this comes from s45 of the Online Safety Act which was rushed through earlier this year over the top of lots of objections.

Recall that this Act was ‘bipartisan’: https://twitter.com/jpwarren/status/1405688061254533124

“Labor Senator Nita Green summed up the party’s position on the bill, saying they would be supporting it but weren’t happy about that.” #timelesstweet

Anyhow, this determination applies to s45(1), (2), and (3), which cover (1) a “social media service”, (2) a “relevant electronic service”, and (3) a “designated internet service”. The definition of “social media service” is in s13 of the #OnlineSafetyAct. s13A defines “relevant electronic service”. s14 defines “designated internet service”. It’s worth looking at those definitions, and I’ll come back to them in a bit.

The expectations start with this: “The provider of the service will take reasonable steps to ensure that end-users are able to use the service in a safe manner.” WTF does “safe” even mean? Safe from what? Or whom? Let’s look at how the government describes the #OnlineSafetyAct:

“A key principle underlying the Act is that the rules and protections we enjoy offline should also apply online.”

The doors in my house aren’t safe because I can jam my fingers in them. Same with all the cupboards. So could any 12-year-old. The #OnlineSafetyAct requires a much greater level of “safety” than exists in the offline world. It is absolutely not trying to create parity between online and offline, but it helps to confuse people by repeating this nonsense.

Example? Easy. s6 (2) of the determination: “The provider of the service will take reasonable steps to proactively minimise the extent to which material or activity on the service is or may be unlawful or harmful.”

If offline was held to this standard, certain Ministers in the government would have been forced to resign by the PM for spouting the same kind of medical misinformation that got Sky News a one-week suspension from YouTube.

s6 (3) provides a bunch of suggestions for what “reasonable” might look like. One of them is that people providing the services should be “trained in online safety” which is definitely* a thing with a clear definition and objective threshold for competence.

Certified Online Safety Black Belt™ training coming soon to your favourite overpriced consulting firms.

s7 tells us that figuring out what “reasonable” means for s6 involves consulting the eSafety Commissioner, who was still figuring out “how the sausage was made” when the #OnlineSafetyAct became law. I’m sure* it’ll be fine.

s8 is the encryption backdoor we warned you about.

“If the service uses encryption, the provider of the service will take reasonable steps to develop and implement processes to detect and address material or activity on the service that is or may be unlawful or harmful.”

*IF* the service uses encryption? It’s 2021. Every service uses encryption! Encryption is good! It’s one of the only tools in infosec we have that works! Stop trying to break it! FFS, infosec is enough of a binfire without actively trying to make things *worse*, you muppets!

s9 is the “ban anonymous accounts” section.

“If the service permits the use of anonymous accounts, the provider of the service will take reasonable steps to prevent those accounts being used to deal with material, or for activity, that is or may be unlawful or harmful.”

Let’s see you apply this standard to the “staff writer” OpEds in The Australian first. lulz

Further lulz in s9 (2) where a suggestion for a “reasonable step that could be taken” is “(b) having processes that require verification of identity or ownership of accounts.”

The great thing about this advice is that it means none of the accounts are anonymous any more, so you don’t have to follow s9’s advice. Wait… that’s a bit of a paradox…

[The Jon English production of Pirates of Penzance is the best one. Fight me.]

s10 tells providers to collude with each other on online safety, so I hope they get a note from the ACCC to say this isn’t violating any competition laws. Hahahahaahaha. The ACCC enforcing competition laws. Ah, such japes.

s11 sounds reasonable, but the details will cause havoc. e.g. Does s11(h) “material that depicts abhorrent violent conduct” cover recordings of police brutalising minorities? That’s violent and abhorrent but we need to know it’s going on.

Internet Filter Rises Again

s12 is the internet filter, and it’s really fucked.

“The provider of the service will take reasonable steps to ensure that technological or other measures are in effect to prevent access by children to class 2 material provided on the service.”

Remember the classes of services this determination applies to that I mentioned at the beginning? It’s basically the whole Internet that touches Australia.

[Screenshot: s13A of the Basic Online Safety Expectations determination.]

The determination applies to:

“(a) a social media service; (b) a relevant electronic service of any kind; (c) a designated internet service of any kind.”

ANY KIND

The suggestions for what “reasonable steps” might look like are: (a) implementing age assurance mechanisms; (b) conducting child safety risk assessments.

Age verification for internet services is a really tricky problem, as the UK discovered when it tried to require age verification before you could look at porn. The UK abandoned its efforts when it discovered it wasn’t feasible, but AusGov is undaunted. #tljr

I note that we don’t make newspapers or broadcast television conduct child safety risk assessments before letting overpaid columnists talk at length about “cultural Marxism”.

We also let Play School teach kids how to make a drum from household items while their parents are trying to work at home during lockdown, and I want to see that child safety risk assessment.

s13 is the complaints mechanism, which will get online services to pre-emptively take down LGBT content when gronks brigade the reporting mechanism. An obvious outcome that has already happened in lots of places but that AusGov will ignore. Again.

Division 5 is the “swamp people with pages of opaque legalese constantly” section, which the determination refers to as “making information accessible”. This will train people to blindly click Accept whenever a popup appears in an app.

It’s called Safety by Design, according to eSafety, apparently. They’re experts, you see.

s19 requires providers to keep records on reports and complaints for 5 years, so keep an eye out for some epic doxxings when that data gets breached.

Abdication by Government

Other general observations from the government’s consultation paper on this:

“‘Other harmful material’ is intended to capture emerging forms of harmful material and behaviours that are not already specified in the Act or Expectations.”

This is the catchall “I’ll know it when I see it” discretion that keeps getting put into these things.

AusGov says “Service providers are best placed to identify these emerging forms of harmful end-user conduct or material” and I want you to read that again and think about it for a bit.

This is the government explicitly abdicating its responsibility to consult with the public on what community standards are and to wrestle with the difficult question of what “harmful end-user conduct or material” actually is.

Instead of doing its job, the government wants Facebook and Google and other private companies to define what constitutes acceptable content. And tries to claim this is treating online the same as offline.

Imagine putting Sky News Australia in charge of defamation law.

Those are the main points as I see them.

Take Action

If you have enjoyed* this #tljr, join or donate to @efa_oz and @DRWaus and @apf_oz, and start turning up to help us protect digital rights before they’re all gone. https://www.efa.org.au/

Make a submission to the consultation on this hamfisted bollocks of a determination: https://www.communications.gov.au/have-your-say/draft-online-safety-basic-online-safety-expectations-determination-2021-consultation

If you’re not sure how, get in touch, as some of us are planning a session to teach people how to do this.
