Could unfair partnerships be harming your online peer support group?
1. Conflicts of Interest
A conflict of interest exists when the best interests of a person or group could be affected by relationships with other people, groups, or businesses. When the data a community generates becomes an “asset” for companies without any checks on power, we create an incentive to monetize that data without first asking for consent. Left unchecked, conflicts of interest between companies and online patient communities create opportunities for exploitation and harm to patients.
2. “The Matthew Effect”
The Matthew Effect is a social phenomenon often summed up as “the rich get richer and the poor get poorer.” In essence, those who already have status are often placed in situations where they gain more, while those who do not have status typically struggle to achieve more.
3. Erosion of Trust
When vulnerable communities lose trust, we stop engaging on the tech platforms where we reside. We delete our accounts. We stop trusting that it is safe to share things with one another on social media. We become silent. As we become silent, the value of ‘listening’ to our engagement is lost for everyone involved. That’s what we call a lose/lose situation.
“Partnership.” We keep saying this word. We don’t think it means what the tech industry thinks it means.
Fair partnerships are about balancing assets and vulnerabilities.
The Light Collective is working with peer support groups to fundamentally change the relationship between vulnerable communities and the tech platforms where they reside. As a first step in this process, we are developing resources to help groups that formed online understand how to negotiate fair partnerships.
1. The Age of Surveillance Capitalism
This book by Shoshana Zuboff brings to life the consequences as surveillance capitalism advances from Silicon Valley into every economic sector. Vast wealth and power are accumulated in ominous new “behavioral futures markets,” where predictions about our behavior are bought and sold, and the production of goods and services is subordinated to a new “means of behavioral modification.”
2. When You’re Not Just The Product on Facebook, but the Manager
Facebook wants more of its users to join “meaningful groups,” but there can be questionable incentives for the people running them. Excerpt: “Facebook didn’t create the opioid crisis, nor did it invent the shady process of ‘pay per head’ marketing for addiction treatment centers. But the addiction support groups herd vulnerable people together, creating a big target for marketers. And the labor that the groups require from administrators leads to some perverse incentives.”