Sunday, 1 June 2025

Algorithmic Identity: When “Me” Is No Longer Mine

In an age when our preferences, behaviours and identities are relentlessly harvested for profit, SaaS: Self-as-a-Service provides a much-needed satirical lens on what it feels like to live inside the algorithm. As the author quips, “You no longer search for something you already desire. You pick up your phone, and it subtly, or not so subtly, tells you what you wanted all along.” It’s a darkly funny truth: we’re no longer choosing; we’re being nudged, channelled, optimized.

The piece may provoke laughter, but it also prompts unease. What happens when identity becomes a digital product, pre-shaped by data patterns and AI predictions? When selfhood is no longer created through reflection and experience, but assembled through targeted ads and recommender systems? This is no longer science fiction; it's everyday life.

Erosion of Choice and the Rise of Datafied Selves

The piece’s dark humour points to a deeper issue: the collapse of the boundary between identity and data. When the author remarks that “you’re not ‘you’ anymore, you’re the you that exists for the ad network,” they capture a truth explored in surveillance studies. David Lyon (2007) calls this your data double: a constructed version of you, assembled from clicks, swipes, purchases and pauses. It’s this version that platforms interact with, profile and sell.

Shoshana Zuboff (2019) goes further, describing how these systems aim “not only to know our behaviour but to shape it.” The predictive logic of digital platforms isn’t just descriptive, it’s prescriptive. When the author sarcastically notes that “Your past decisions are just training data for future manipulation,” we begin to see that personal autonomy is less personal than we’d like to believe.

In that sense, your digital identity is less about who you are and more about how valuable you are to advertisers. The author's anecdote about unplugging devices and holding a garage sale for smart tech isn’t just a joke, it’s a metaphor for reclaiming agency. When even your refrigerator has data-driven preferences about your lifestyle, opting out feels like a radical act. It reminds us of Zuboff’s warning: “The goal now is to automate us, not just to know us.” The humour masks a serious truth: we are being subtly domesticated by our technology.


Personalization as Erasure

While the article riffs on the universal experience of algorithmic overload, it also hints at how personalization can become a form of erasure. When everything is tailored “just for you,” it can reduce complex identities to narrow behavioural profiles. As the author puts it, “The algorithm doesn’t see you. It sees a cluster of traits that are statistically correlated with some conversion funnel.”

This is especially troubling for those whose identities fall outside normative data sets. Safiya Noble’s Algorithms of Oppression (2018) details how algorithmic systems replicate structural biases, noting that “search results reflect the interests, assumptions, and biases of the people who create them — and those biases disproportionately harm marginalized groups” (p. 4). When personalization is based on incomplete or biased data, it not only misrepresents but also invisibilizes.

The article never says this outright, but the implications are there. If you’re a person of colour, queer, disabled, or otherwise underrepresented in the training data, your algorithmic “self” is either inaccurately modelled or erased altogether. Personalization becomes exclusion by design.


Resisting Algorithmic Colonization

The image of a garage sale in the article becomes a poignant metaphor, not just for physical decluttering, but for reclaiming agency. It’s a rare act of un-choosing, of reasserting control over what stays and what goes. The algorithm doesn’t like unpredictability. It thrives on patterns. To defy those patterns, to reject a suggestion, to value something without a five-star review, is a radical and defiant act.

This echoes Indigenous digital sovereignty movements, which advocate for “control over the creation, collection, ownership, and application of data about Indigenous peoples” (First Nations Information Governance Centre, 2020). These efforts challenge the extractive, one-size-fits-all models of Western datafication and call for tech systems that respect Indigenous laws, knowledge systems and identities. Such movements expose the colonial underpinnings of dominant tech infrastructures. 

Western data systems often treat all users as interchangeable nodes in a revenue-generating matrix. In contrast, Indigenous frameworks insist on relationality, consent and community accountability. In this context, the garage sale is more than decluttering; it is a symbolic refusal to be rendered as a commodity. It echoes these broader struggles for digital self-determination, reminding us that personal resistance and collective decolonization are interlinked. Whether in the form of an unplugged smart device or a data sovereignty policy, both reject the idea that our lives must be legible and lucrative to be valued.

This reframing urges us to rethink what resistance looks like in a world where personalization is power. It’s not always a protest or a policy. Sometimes, it’s refusing to rate, review, or click. Sometimes, it’s putting the smart speaker in the garage sale pile and remembering that identity can’t, and shouldn’t, be optimized.


Conclusion: A Call for Critical Digital Literacy

The article’s satirical tone delivers a serious message: we must reclaim the space between impulse and purchase, between identity and algorithm. Digital literacy today is not just about navigating interfaces, it's about understanding power, resisting manipulation and designing equitable futures.

To challenge algorithmic identity is to reassert human identity, in all its messiness, contradiction and autonomy. As the author wryly notes, “My ‘self,’ it seems, is a constantly evolving, AI-curated SaaS (Self-as-a-Service) platform, and I keep forgetting to read the terms and conditions.” Maybe it’s time we read those terms more closely, or better yet, write our own.


References

  • Ferrie, C. (2025, May 15). SaaS: Self-as-a-Service. Medium. https://csferrie.medium.com/saas-self-as-a-service-062003a9a568

  • First Nations Information Governance Centre (FNIGC). (2020). The First Nations Principles of OCAP®. https://fnigc.ca/ocap-training/

  • Lyon, D. (2007). Surveillance Studies: An Overview. Polity Press.

  • Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press.

  • Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs.
