🤷🏻‍♀️ Forking Hell
Just do the right thing...

Hello fellow surfers of the infinite cyberverse. If you pay to receive this content: well done, and I think I'm in love with you?

If you don't pay, and you have financial stability: Each instalment of Horrific/Terrific takes five minutes to read, but much more time to write. Okay sure, time is a human construct, but do you know what? So is money (because time is money). Please click here to donate £4 a month and feel better about yourself.

This week was sort of okay, I guess? 🤷. Ah, the soft gooey centre of my five-point scale. Why?

  • Apple want to crack down on child abuse, because it's their job to do that apparently
  • Twitter take bug bounties to an 'ethical' level
  • Stick a fork in Ethereum — it's DONE.

👩‍💻 If the bug bounty thing wasn't dumb enough already...

We all know what a bug bounty is: software engineers spend hours finding vulnerabilities in services provided by huge, faceless corporations, so that they can report their findings on a site like HackerOne and maybe get paid.

Well now Twitter are doing this for algorithmic bias! The new Machine Learning Ethics, Transparency, and Accountability team (META) ran a 'spot the bias' competition the other day, and would you look at that: it was in fact hosted on HackerOne — how FUNNY. The challenge was to demonstrate potential harms that may come out of Twitter's saliency algorithm for cropping photos.

☝️ So basically: Twitter use a machine learning model to auto-crop photos ready for your timeline, so that the most 'important' part is still visible after the crop (there's a toy sketch of the idea after this list). Shocking news: this algorithm contains racial and gender biases. There are some things here which just don't quite work for me:

  • META have identified that their algorithm is cutting people of colour out of photos, and they want to throw money at the problem — but only enough money for a silly competition, and not, like, an actual set of employees (or is that what META is meant to be??)
  • Can I just ask... consider the kinds of people who frequent HackerOne: are they qualified to identify potential harms in tech? What if everyone who entered the competition sucks at this? Aren't technologists to blame for all these bias problems?
  • This is what happens when you do things at scale that you shouldn't need to do in the first place. I wrote about META when it was announced, and to quote my past self (a person who I RESPECT): "They are attempting to shine a light into a black box when really they should just be throwing the black box out of the fucking window."
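
To make the 'saliency' bit concrete, here's a toy sketch of what a cropper like this does in principle. All the names are mine, and the real thing involves an actual trained neural network rather than a few lines of numpy, but the shape of it is roughly:

```python
import numpy as np

def saliency_crop(image: np.ndarray, saliency: np.ndarray,
                  crop_h: int, crop_w: int) -> np.ndarray:
    """Toy saliency crop: centre the crop on whichever pixel the model
    scored as most 'important'. The bias lives in how the saliency map
    is produced, not in this cropping arithmetic."""
    # Coordinates of the highest-scoring pixel in the saliency map
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    # Centre the crop window there, clamped to stay inside the image
    top = min(max(y - crop_h // 2, 0), image.shape[0] - crop_h)
    left = min(max(x - crop_w // 2, 0), image.shape[1] - crop_w)
    return image[top:top + crop_h, left:left + crop_w]
```

The point being: if the model systematically scores some faces as less 'salient' than others, those faces systematically end up outside the window.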

But look, it's fine. At least they're doing something. Of course in this case, 'something' = running a competition for software engineers who are probably either underpaid, between jobs, or desperate — or all of the above.


🍏 Apple are trying to do the quiet part quietly and the loud part loudly

But it's not working. Context: Apple have announced some new features which will — apparently — protect children. Ahh, children... my most favourite of the doomed out-groups.

What are the new features? The link I provided will give you the saucy details, but what I consider the most interesting and important part is the way they will scan iCloud photos in search of Child Sexual Abuse Material (CSAM). How this works (there's a toy code sketch after this list):

  • Before your phone deposits new photos into iCloud, it first checks to see if they contain any CSAM
  • It does this by generating a hash of your image, and checking that hash against a database of other image hashes — all of which were generated from known CSAM.
  • These 'other hashes' are provided by the National Center for Missing & Exploited Children in the US, and other such organisations. Crucial point: the database of hashes is stored on your phone.
  • If an image of yours matches any in the database, the hash gets uploaded along with your image. If you get too many matches, some humans are brought in, and your images may be shared with law enforcement.
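
If code is easier for you to parse than bullet points, here's a very rough sketch of that flow. Big caveats: Apple's real system uses a perceptual hash called NeuralHash plus some clever cryptography so their servers learn nothing about non-matching photos; all the names below are mine, SHA-256 is standing in for the perceptual hash, and the threshold number is illustrative:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for NeuralHash. A real perceptual hash survives resizing
    and re-encoding; a cryptographic hash like SHA-256 does not."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical on-device copy of the NCMEC-supplied database of CSAM hashes
known_csam_hashes = {"hash-of-known-image-1", "hash-of-known-image-2"}

REVIEW_THRESHOLD = 30  # illustrative: 'too many matches' triggers review

def scan_before_upload(photos: list[bytes]) -> None:
    """Check every photo on the phone itself, before it reaches iCloud."""
    matches = 0
    for photo in photos:
        h = image_hash(photo)
        if h in known_csam_hashes:
            matches += 1
            print(f"match: photo uploaded along with flagged hash {h}")
        else:
            print("no match: photo uploaded as normal")
    if matches >= REVIEW_THRESHOLD:
        print("threshold crossed: humans review, maybe law enforcement next")

scan_before_upload([b"fake holiday photo bytes", b"fake cat photo bytes"])
```

Note that everything interesting happens on your device, against a database you can't inspect.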

💆‍♀️ I'm explaining this in such detail because I would like you to understand it so that you can form an opinion about whether or not this is problematic — I am still thinking about this. Hit reply to let me know your thoughts!

Apple's messaging has also been pretty confusing — my good friend Josh very aptly pointed this out on Twitter the other day.

I asked Josh why he thought Apple were doing this now. Here's his heavy-brained answer:

"They might just wanna save money on their cloud infrastructure, because they're not having to do the scanning server-side. Or they might be about to encrypt iCloud backups, which are so conspicuously unencrypted, given their all privacy in their marketing."

We agreed that his second theory was more likely. It's fairly embarrassing that Apple can just see every photo that you stick in their cloud — so they probably want to find a way of encrypting those, without making it obvious that they weren't doing that already and while still being able to catch potential CSAM.

As you all may already know, I am wary of any process which 'makes it okay' to hand data over to law enforcement (EFF also have a lot to say about this). This is exactly why I hate it when people conflate privacy and security. If these new features prove anything, it's that if you want more security, you have to give up more privacy — and vice versa.


🍴 Well fork me, Ethereum

Ethereum has just gone through its London Hard Fork, which apparently has nothing to do with London. I honestly can't be bothered to go into the technical details, because we've had enough of that today. Instead, I'll guide you sensually through what this means in practice:

  • Up until now, the transaction fees have been unpredictable, expensive, and given only to miners — this problem was aggravated by the gruesome NFT boom of early 2021.
  • Instead of just casting Ethereum into oblivion where it belongs, the people in charge have decided to make it so that transaction fees are algorithmically determined, as opposed to humanly determined via closed auction (there's a sketch of the fee algorithm after this list).
  • This makes it 'fairer' because you can't just pay more to prioritise your transaction — yes, an algorithm will somehow make things fairer. You read correctly.
  • This will also lead to deflation apparently. Which sounds like the opposite of inflation so... you figure out what it means based on that.
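
Since I said 'algorithmically determined', here's roughly what that algorithm is: a simplified sketch of the EIP-1559 base fee update (the actual spec has more edge cases). The base fee every transaction pays gets burned rather than handed to miners, which is where the deflation talk comes from:

```python
BASE_FEE_MAX_CHANGE_DENOMINATOR = 8  # caps the move at ±12.5% per block

def next_base_fee(parent_base_fee: int, gas_used: int, gas_target: int) -> int:
    """Simplified EIP-1559 update: the base fee rises when blocks are
    fuller than the target and falls when they're emptier. This fee is
    burned, not paid to miners."""
    if gas_used == gas_target:
        return parent_base_fee
    delta = (parent_base_fee * abs(gas_used - gas_target)
             // gas_target // BASE_FEE_MAX_CHANGE_DENOMINATOR)
    if gas_used > gas_target:
        return parent_base_fee + max(delta, 1)
    return max(parent_base_fee - delta, 0)

# A completely full block (2x the target) bumps the fee by 12.5%
print(next_base_fee(100, 30_000_000, 15_000_000))  # -> 112
```

So nobody is bidding in a blind auction any more: the fee just drifts up and down with demand, plus an optional tip to hurry things along.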

Thank you for traversing my meaty, well-seasoned content with your ravenous eyes. I wrote most of this issue while on a wobbly train to Devon, so I hope it makes sense.


💌 Seeing as you're at the end of this week's Horrific/Terrific, I guess you enjoyed it. Please express gratitude by donating money to me so I can keep doing this. You can share opinions/submit news stories any time on my Twitter or email.