Not Your AI, Not In Your Interest
On the Idea of Personal AIs Safeguarding Our Data and Activity Streams
A Regurgitation — After Thinking About This for a Few Days, I Got a Crazy Idea
You know how you get on YouTube or TikTok or any such social media platform (with a very few notable exceptions), and it “suggests” content you might like? They own the algorithm, and the data you create through your online behavior is used to generate these “suggestions,” with the ultimate aim of shaping your behavior online and off, and making you more predictable to them. What if you owned your data, and that data could only be shared by you? In this scenario, you also own an algorithm that you train (let’s set aside the question of how to train your personal algo), and you lease access to your attention and data to these platforms. So you arrive at, say, whatever decentralized or federated version of YouTube the future brings us, and you tell it what you want to see and hear. If we play our cards right, we could control not only our data but also own the algos through which we interact with the collective unconscious-but-seemingly-more-conscious-every-day of the internet.
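To make the “leasing” part a bit more concrete, here is a minimal sketch of what a user-owned data lease might look like. To be clear, nothing here is a real protocol or API; the names (DataLease, the platform URL, the field tags) are all made up for illustration, under the assumption that access would be scoped to specific signals and time-boxed rather than perpetual:

```python
# Hypothetical sketch of a user-owned "data lease".
# All names are illustrative; no real platform or protocol is assumed.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass(frozen=True)
class DataLease:
    platform: str            # who gets access
    fields: frozenset[str]   # exactly which signals you agree to share
    expires: datetime        # access is time-boxed, not perpetual

    def permits(self, platform: str, field: str) -> bool:
        # Access is granted only to the named platform, only for the
        # named fields, and only until the lease expires.
        return (
            platform == self.platform
            and field in self.fields
            and datetime.now() < self.expires
        )


lease = DataLease(
    platform="federated-video.example",
    fields=frozenset({"watch_history:music", "language"}),
    expires=datetime.now() + timedelta(days=30),
)
print(lease.permits("federated-video.example", "watch_history:music"))  # True
print(lease.permits("federated-video.example", "location"))             # False
```

The point of the sketch is just the inversion of defaults: the platform starts with nothing and gets only what the lease explicitly grants, instead of starting with everything and letting you opt out.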
I think this may be a variation on Griffin’s vision in her piece? If so, let’s figure it out. It would basically be a layer between you (or your device) and the internet, like a VPN service, that shapes back against the shaping these platforms do when they decide what content to serve us. Our personal, private algo does battle with their algo, and they bid for our attention. We could say: no thanks, YT, I don’t want any more right-wing conspiracy theories; yes, I know I watched Loose Change many years ago, but it’s not what I want. No, I am not a guy just because I like philosophy, so stop sending me guy things. You do your little clickity-clackity work and find me something I really want to see, or I won’t accept your landing page. So there.
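Here is a rough sketch of that “does battle with their algo” idea, with the same caveat as before: the Candidate and PersonalAlgo names, the topic tags, and the scoring are all hypothetical. It just shows the shape of a user-owned gatekeeper that vetoes what the platform bids and can refuse the landing page outright when the bid isn’t good enough:

```python
# Minimal sketch of a personal algo as gatekeeper between you and a
# platform's recommendation feed. Everything here is hypothetical.

from dataclasses import dataclass


@dataclass
class Candidate:
    title: str
    topics: set[str]


@dataclass
class PersonalAlgo:
    blocked_topics: set[str]   # curated/trained by the user, not the platform
    wanted_topics: set[str]
    min_matches: int = 1       # how many acceptable items before we render a page

    def score(self, c: Candidate) -> float:
        if c.topics & self.blocked_topics:
            return -1.0        # hard veto: never surfaces on your screen
        return float(len(c.topics & self.wanted_topics))

    def negotiate(self, platform_feed: list[Candidate]) -> list[Candidate] | None:
        accepted = [c for c in platform_feed if self.score(c) > 0]
        # Refuse the landing page entirely if the platform's bid falls short.
        return accepted if len(accepted) >= self.min_matches else None


me = PersonalAlgo(
    blocked_topics={"conspiracy"},
    wanted_topics={"philosophy", "music"},
)
feed = [
    Candidate("Loose Change retrospective", {"conspiracy"}),
    Candidate("Stoicism in 10 minutes", {"philosophy"}),
]
print(me.negotiate(feed))  # only the philosophy video survives the veto
```

Note the design choice: the filter runs on your side of the line, so the veto never has to be explained to the platform, and a rejected item leaks no signal back to their model.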
Speaking of which, check out Kwaai. I have no relationship to them; I just discovered them a few days ago through a family member who sent this to me: