Thing of the day: well, we'll post today's thing a bit later. Before that, we'll share this. Have you noticed your art, or your posts about art, disappearing? A couple of my life drawings have completely vanished, along with that photo of that performance artist, and that link to that band who expressed that political opinion…
Nudity in art has been around for thousands of years, but Facebook still can’t take it. The social media site has deleted pics of artworks by people like Kate Durbin and Erika Ordosgoitto. It has blocked users like Frédéric Durand-Baïssas for sharing paintings including Gustave Courbet’s “L’Origine du Monde.” And though some people have protested by creating Facebook groups like Artists Against Art Censorship, recording every instance of censorship — let alone fighting back — is next to impossible.
The Electronic Frontier Foundation (EFF) hopes to change that with a new site called Onlinecensorship.org. Created with Visualizing Impact, the website invites people to privately report instances of censorship not just on Facebook, but also on Twitter, Flickr, YouTube, Instagram, and Google+. EFF then studies the data to understand what gets censored, why, and how it affects people around the world. It's all in the hopes of encouraging companies to tread more lightly in making decisions that impact free speech. Read more in the piece on the Hyperallergic website.
Michelangelo’s “David” (photo by Justin Ennis/Flickr)
Sites like Facebook, Twitter, Instagram, YouTube, and Google+ have an outsized impact on our social lives. We treat these platforms as a “public sphere”, using them to discuss issues both controversial and mundane, to connect with friends far and near, and to engage in activism and debate. But while these platforms may be used by the public, they’re ultimately owned by private companies with their own rules and systems of governance that control—and in some cases, censor—users’ content.
Onlinecensorship.org seeks to encourage companies to operate with greater transparency and accountability toward their users as they make decisions that regulate speech. We know they’re big fans of data—so we’re collecting reports from their users in an effort to shine a light on what content is taken down, why companies make certain decisions about content, and how content takedowns are affecting communities of users around the world.
By collecting these reports, we’re not just looking for trends. We’re also looking for context, to build an understanding of how the removal of content affects users’ lives. Often the communities most impacted by online censorship are also the most marginalized—so the people who are censored are also those least likely to be heard. Our aim is to amplify those voices and help them advocate for change.