Something we already knew has become even clearer: social media platforms are really f*cking powerful.
Long gone are the days when social media existed merely to supplement our in-person lives — social media is now deeply integrated into our lives. One day last week I participated in a virtual protest, signed petitions among millions of other names, started a book club, sent messages to my state legislators, viewed powerful and moving art, and discussed with friends possible ways to address racism at our summer camp, all from the comfort of my living room couch. That is far more than I could’ve done in a day if I had had to do it all IRL.
But this is nothing new. We’ve been witnessing the expansion of social media’s role in our day-to-day lives for years now. Much of it can be attributed to monetization: social media is how millions of individuals make the money they need to survive, and millions more aspire to reach that level.
In a series of blog posts, Li Jin, a partner at Andreessen Horowitz (a16z), has described this as a shift into what she calls “The Passion Economy.”
Pursuing the Passion Economy Dream
The idea is that in a “Passion Economy,” people can make a career out of “doing what they love.” In 2017, nearly 17 million Americans earned income posting their personal creations across nine different platforms. It doesn’t matter what they do — shopping, putting on makeup, shooting basketball trick shots, dancing, painting, or activism. Individuality is something that is welcomed and — if you can get over a certain threshold of followers and engagement metrics — potentially highly profitable.
Therefore, it is important to highlight that, in the Passion Economy, individuality is also labor. Expressing yourself — and branding yourself — to the point of being able to monetize takes a lot of hard work. Not only is this labor often overlooked in the glorification of the “Passion Economy dream,” but it is also unpaid, or at least not paid at any life-sustaining level for a while.
This is a challenge for all creators but especially for creators from marginalized communities. The labor of individuality is going to vary based on the characteristics of the individual, and we cannot forget that we have historically demanded a lot of unnecessary, unpaid physical and emotional labor from women*, Black, Indigenous and people of color, and LGBTQIA+-identifying folks.
This labor has been necessary for overcoming structural inequalities that, for decades, have created barriers to capital and required entrepreneurs from these groups to do more with less.
And this is labor that cannot continue to go unaccounted for if we really want to see a healthy and equitable economy in the post-Covid-19 recession and the future beyond.
So, here’s how we begin to account for it…
It’s Not (Just) About The Money, Money, Money
We need to be paid more, of course. But we also need better tools to help us efficiently and effectively grow and scale our businesses — especially technological tools, because the ones we currently have are not serving us.
The technologies behind the platforms most of us are currently using don’t seem to care so much about the success of the creators who use them. Of course, they care a lot about the existence of successful creators because their existence supports the platforms’ ad-driven business models (*we elaborate more on this later on*). But they don’t seem to care one bit about who these creators are and what they might need — especially the ones who struggle most to succeed.
This is the case for most of the technology we’ve seen so far in the world. It’s built to best serve people most similar to the people who build it — mostly heterosexual cis white men.
But without technologies that understand and support the success of creators who are women*, LGBTQIA+, Black, Indigenous and people of color, we will continue to see the reproduction of structural inequalities. For example…
- Facebook’s ad approval process has been shown to be sexist
- Instagram’s “Community Guidelines” hide and delete queer content
- TikTok’s algorithm apparently has a race problem
- See more of our research on the Lips Instagram page
Without better options, these entrepreneurs will continue to be stricken with additional labor that is both wasted time and dollars lost; as just one example, hundreds of business owners and creators have vented to us about the hours they’ve spent personally contacting Facebook and Instagram reps about unfairly rejected ads and deleted content.
We need to start designing new technology in new ways — ways that actually meet the needs of our underserved yet most promising entrepreneurs by correcting for the undue burdens placed on us as a result of the political, economic and social inequality that exists in our society.
As a group of entrepreneurs ourselves who, together, represent all three categories (female*, POC, LGBTQIA+), we know this is the future we need to work towards. And here’s where to begin…
Towards Liberty and Design Justice for All
Passion Economy platforms must commit themselves to the theory and practice of design justice. Design justice is “an approach to design that is led by marginalized communities and that aims explicitly to challenge, rather than reproduce, structural inequalities. It has emerged from a growing community of designers in various fields who work closely with social movements and community-based organizations around the world.” You can learn more about Design Justice by reading the book by design scholar Sasha Costanza-Chock (they/them or she/her).
As a practice rooted in Black Feminist thought, design justice also focuses on the effects of belonging to multiple of these groups at once: the compounding (and sometimes entirely new) challenges faced by members of intersectionally marginalized communities.
So, for example, it is widely known that many women* deal with self-image issues and that social media has mostly made them worse. This study from 2018 shows that after spending just one hour on social media, women feel more insecure about the way they look, especially feeling pressure to be thin and toned. Another study conducted by FEM Inc. in partnership with Google showed that viewing just a single sexualized online display ad resulted in significantly higher “Benevolent Sexism” scores in men, as well as a range of significant negative emotional reactions to sexualized ads among women.
A design justice approach to addressing this problem would be to learn that viewing body-positive content improves young women’s body image and build an algorithm that spreads body-positive content far and wide across the platform, thereby encouraging self-love in women* everywhere. But that is not in fact what we’re seeing… (*we’ll show you what we are seeing later on, keep reading*).
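As a purely illustrative sketch of what that might mean in practice — none of this reflects any real platform’s code, and every name here (`Post`, `body_positive`, the boost factor) is a hypothetical stand-in — a feed-ranking step could explicitly boost body-positive content instead of suppressing it:

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    base_score: float            # engagement-based relevance score
    body_positive: bool = False  # e.g., tagged #bodylove or #curvy

def rank_feed(posts, boost=1.5):
    """Re-rank a feed, multiplying the score of body-positive posts
    by `boost` so they surface more often rather than being hidden."""
    def score(p):
        return p.base_score * (boost if p.body_positive else 1.0)
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("a", base_score=0.9),
    Post("b", base_score=0.7, body_positive=True),
    Post("c", base_score=0.5),
]
ranked = rank_feed(feed)
# "b" now outranks "a" because 0.7 * 1.5 = 1.05 > 0.9
print([p.id for p in ranked])  # ['b', 'a', 'c']
```

The point of the sketch is the sign of the intervention: the same machinery platforms already use to demote “suggestive” content could just as easily promote content that research shows improves body image.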
Some of these major flaws in technology could be prevented or solved by simply taking the time to research and speak directly with the folks who normally tend to have the least influence on design decisions.
For Lips, this means centering the experiences of BIPOC, POC, LGBTQIA+-identifying folks, and women* to understand the particular challenges these creators face online.
5 Tips for Designing Digital Platforms for Marginalized Groups, According to Lips
Lips was founded in 2008 as a university campus zine for women* and the LGBTQIA+ community. Today, our community has grown to 10 other university chapters, almost 15K followers on Instagram, 1,500 digital creatives, 50 brand partnerships, and 1 youth mental health-focused accelerator. We are currently designing and building a social commerce app for our community.
The following observations come from the Lips team’s rigorous academic study, thousands of individual conversations, and dozens of community-led workshops, in which we’ve prioritized the needs of women*, POC, and the LGBTQIA+ community in digital spaces.
As a brief introduction to our extensive body of research, here is our list of Top 5 Tips for Designing Digital Platforms for Marginalized Groups.
- Invest in the financial success and wellbeing of users as creators
- Cultivate a supportive community
- Provide a safe space for genuine and authentic self-representation
- Prove to your users that they can trust you and your technology
- Be transparent about guidelines and how they are enforced
1. Invest in the financial success and wellbeing of users as creators
Jennifer, Mara, Astrid & Aniyia said it best in Zebras Fix What Unicorns Break:
In short: “The business model is the message.” From that business model flows company culture and beliefs, strategies for success, end-user experiences, and, ultimately, the very shape of society.
We believe that developing alternative business models to the startup status quo has become a central moral challenge of our time. These alternative models will balance profit and purpose, champion democracy, and put a premium on sharing power and resources. Companies that create a more just and responsible society will hear, help, and heal the customers and communities they serve.
98% of Facebook and Instagram’s revenue comes from advertising.
Large corporations are paying them large sums of money to bombard our feeds with ads targeted specifically at us based on 1) data they collect from us and sell with the bare minimum of our consent and 2) judgments they make about who we are and what we might be inclined to buy, which come from predictive algorithms carrying loads of biases and prejudices. The technology behind these platforms was built to support this business model and make these corporations (and especially their CEOs) even richer.
Small business owners aren’t even benefiting from this model, with 62% of them saying Facebook ads miss their targets.
Although Instagram recently decided to share some of this revenue with creators, we think it’s pretty safe to say that they care much more about us spending money than they do about us making money. And the platforms themselves surely reflect that.
TLDR; Use your business model to show your creators that your success is contingent upon theirs (aka that you actually give a sh*t about them)!
2. Cultivate a supportive community
“Community Guidelines” are where a platform sets the tone for the kind of community it wishes to create. While setting the intention to be a safe space is important, the responsibility does not end there. Many platforms enforce these guidelines inconsistently and often at the expense of their marginalized community members.
For example, just last week, conflict arose over Facebook’s decision not to do anything about Trump’s inflammatory posts in response to the Black Lives Matter protests in Minneapolis. Written into the platform’s guidelines is a promise to ban speech that promotes violence. Yet when the President posted “when the looting starts, the shooting starts” — clearly threatening to authorize police use of deadly force on protestors, exactly the threat that had caused nationwide outrage in the first place — for some reason, these guidelines were not applied and the post continued circulating on the site.
Several movement leaders, politicians, civil rights groups, and even Facebook employees expressed outrage, confusion, and disappointment in Mark Zuckerberg’s decision. Twitter, on the other hand, chose to flag and hide the post “based on the historical context of the last line, its connection to violence, and the risk it could inspire similar actions today.”
Hundreds of Facebook employees conducted a virtual walkout in protest of their company’s decision, viewing it as a choice to uphold the status quo of discrimination and injustice that perpetuates violence against the Black community.
TLDR; A platform specifically designed for marginalized groups will always stand in solidarity with marginalized groups.
3. Provide a safe space for genuine and authentic self-representation
Ok, this one’s a head-turner. Prepare yourselves.
We believe the next generation of platforms should be sex- and body-positive.
Currently, platforms are unable to differentiate between sexual expression and sexual exploitation. Therefore, they have chosen to take a ruthlessly combative approach to sex, which is doing significant damage to female and LGBTQIA+ creators as individuals and to their businesses.
Sexuality is a crucial component of our self-expression.
In a study called Defining and Designing Trans Technologies, Oliver Haimson argues that for a platform to be considered a “trans technology,” it must give people space for genuine self-expression, including the queer aspects of multiplicity, fluidity, and ambiguity necessary during gender transition. This also includes upholding policies and an economic model that embrace adult or erotic content without characterizing and removing it as pornographic. Tumblr was deemed a trans technology by this study’s account, until it issued its infamous “porn ban” in 2018. RIP.
Similarly, in a study titled “(Not) Getting Paid to Do What You Love” by Brooke Erin Duffy, queer women who exhibited sexuality/intimacy on social media said doing so helped them 1) better connect with users, 2) build their personal brand, and 3) connect with fellow like-minded influencers. Expressing sexuality was actually helping queer women do the work that social media was requiring of them, and all of the restrictions around it were just making that work more laborious for them.
TLDR; Build sex-positive platforms. Sexuality is a crucial component of individuality for many women* and LGBTQIA+ creators. Let’s embrace and celebrate it on our platforms (and use humans + technology to moderate it)!
4. Prove to your users that they can trust you and your technology
Unfortunately, many features of the platforms we use today are guilty of enabling discrimination, harassment, and ultimately violence that is, of course, mostly targeted at marginalized groups. This creates a lack of trust between creators and platforms.
Hate groups and trolls have unfortunately become inescapable on social media — trans people being one of the most vulnerable populations — and sadly, most platforms have done little to control or prevent harmful antics. In fact, their features often reinforce the behavior: a creator’s account can be removed when it is mass-reported, with nothing in place to detect that the reports are motivated purely by hate.
Making it possible to turn features such as messaging, commenting, and tagging on and off can be a good option for some — and many platforms do some form of this already. But think about it this way: an approach that requires creators to turn off their messaging is basically a form of virtual victim-blaming. Creators have the choice of continuing to receive hate or disabling messaging, which usually comes at the expense of their businesses.
Messaging is important to marginalized creators — especially those with “complicated” identities and who perhaps might also do “confusing” work. They rely on engaging more with their audiences through explaining themselves, educating compassionately, answering questions — usually all before a single sale is made. And, again, most platforms are not equipped to keep people behaving nicely.
The answer is not necessarily getting rid of those features entirely but thinking about how we might be able to make them better or offer alternatives.
For example, my friend Lorrae Jo is a sex, love and empowerment coach who has accrued over 273K followers on Instagram. For her business she markets, sells, and delivers 1:1 coaching sessions; reviews and promotes products on IG from brand sponsorships; and publicizes the e-book she wrote — which I highly recommend.
She also receives hundreds of messages from her followers and spends lots of time answering as many as she can. Some are nice and come from potential customers inquiring about different aspects of her business. Others are not nice and express an animosity she knows comes from living in such a sex-negative society that is especially controlling of women and resentful about what they choose to do with their bodies.
During the quarantine, however, Lorrae Jo made an account on OnlyFans — a platform that gives her the ability to monetize her content and services — and she describes the experience of interacting with customers there as much more positive. Not only does the paywall mean she is paid for the labor of responding to messages, but it also makes people nicer (or simply weeds out the mean ones).
But while I mention OnlyFans, might I also say that establishing trust looks like the opposite of suddenly kicking sex workers off your platform once it becomes mainstream.
TLDR; Think about how the features of your platform distribute risks, harms, and benefits across your community. Build features that protect vulnerable populations from abuse and correct for the unnecessary, unpaid labor they do every day to defend themselves and their businesses.
5. Be transparent about guidelines and how they are enforced
You already know that we believe the next generation of platforms need to be sex-positive. Here is further proof as to why…
Most platforms today that ban nudity permit nipples if they appear on a male body but ban them on a female one. We believe this actively props up the sexist system that teaches us — including and especially our youth — that it’s okay to sexualize the female body. Photographs and illustrations of topless female bodies are still pervasive on the platforms, of course, with the scribbled-over nipples doing absolutely nothing but serving as an omnipresent reminder that our bodies cannot be anything but objects of sexual predilection.
These nudity bans are also enforced incorrectly, further harming women* and LGBTQIA+-identifying folks. Many photographs women* post of themselves in an attempt to celebrate their bodies, hashtagged #bodylove and #curvy, are constantly hidden and removed by moderation algorithms that incorrectly label them as inappropriate, “sexually suggestive” content in violation of the guidelines.
This phenomenon is a result of — going back to Tip #3 — the technology of today’s platforms being built with biases that make it unable to understand many marginalized identities. Imagine how the algorithm might handle trans bodies, disabled bodies, or bodies healing from surgery or trauma, for example.
It is not trained to know what these bodies are, nor cater to the identities of the humans who live in them, which affects users’ experiences in tremendously detrimental and costly ways.
Here is where the added labor comes in. In many cases, Instagram does not even inform creators that their content or account is being demoted or hidden — this has been dubbed the “shadowban.” On a platform where growth of followers and engagement are key success metrics, visibility is crucial and having your content shadowbanned for weeks or even months can be an extremely challenging obstacle to overcome.
It takes many of these creators a lot of extra, unnecessary, unpaid labor to try to get themselves back in front of users — from persistently trying to contact Instagram to posting more content, more frequently — not to mention the harmful, emotional toll it takes on a person when photographs of their bodies have been sexualized by the platform without their consent.
TLDR; A platform’s “Community Guidelines” should reflect the values of the communities who use it and be enforced in fair and unbiased ways.
A Better (Digital) World Is Possible
Digital platforms are now the location of so much of the world — tweets constitute “events” that become news stories, TikTok videos can become your career. Thus, our discussions about equal opportunity and justice must turn their attention to the way these platforms are designed — by whom, and for whom.
While the Passion Economy brings exciting new opportunities, it also makes us worried that, as women*, BIPOC, POC and LGBTQIA+-identifying folks, our labor will continue to go under-recognized, under-valued, and under-paid.
But if we had technology that, instead of marginalizing us, understood us, catered to our needs, and actually helped us to better navigate these blurry lines — more equitable technology — we would be able to grow and scale our businesses in extraordinary ways.
Given what we do with the resources we get, imagine a world where we have equal resources.
We are ready to see that world. So, we’ve gone ahead and designed it ourselves.
This piece originally appeared on Medium, and was published here with permission.