But unlike traditional broadcasters, the social media platforms are not held accountable for their decisions. The choices that they make to amplify certain voices over others, or whether and how to police content, have had grave consequences, ranging from the radicalization of young men to the proliferation of hate speech on Facebook that fanned the flames of genocide in Myanmar.
And most importantly, no one can even see what choices these algorithms are making across a community, because no two people see exactly the same content. Each user sees only what gets amplified in their own algorithmically curated news feed.
More fantastic work from The Markup (see also: their Blacklight tool) to account for how the black boxes of social media actually work. It's interesting that they are starting with a representative sample of 1,200 people rather than just releasing an open-source browser plug-in. Frankly, I appreciate and trust that methodological approach more.