Nudes and N00dz
Formal Metadata
Title: Nudes and N00dz
Part Number: 34
Number of Parts: 188
License: CC Attribution - ShareAlike 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose, as long as the work is attributed to the author in the manner specified by the author or licensor, and the work or content is shared, also in adapted form, only under the conditions of this license.
Identifiers: 10.5446/20564 (DOI)
re:publica 2016, part 34 / 188
Transcript: English(auto-generated)
00:21
I founded a group called DeepLab, and this is... I'm Jillian York, and I'm a member of DeepLab, and also the Director for International Freedom of Expression at the EFF, the Electronic Frontier Foundation. So what we're talking about today is how social media, and specifically Facebook, because we've found that they have the strictest policies around
00:41
this topic, how these social media companies censor art, and specifically nude art. We believe that nude art is an important part of our culture, an important part of our history, and an important part of our present. So this piece, just as an example, is something that I successfully posted on Facebook just a few days ago.
01:01
Now, of course, because this is the Venus de Milo, this is a famous painting, sorry, sculpture, accepted throughout the world as high art. Therefore, Facebook thinks that it's okay for you to post this. Oh, we have a mic. So a good counterexample of this is a piece that was posted in February 2016 by the Philadelphia Museum of Art.
01:24
It's a piece that was on display during a pop art retrospective. This piece was banned and deleted off of Facebook due to, quote, the excessive amount of skin. So we found that that was a painting that was too sexy for Facebook.
01:45
Oops. Double mic. Double click. So this is another one again, just to give you an example of what's being blocked, this is from the Centre Pompidou, they had a retrospective of this female photographer. And because the nipples were shown, the piece was removed,
02:02
as well as the museum's page. It was actually the second time the museum had been banned, and when they were reinstated after 24 hours they left a note saying, quote, we will not publish nudes in the future. So what is a nude and what is a n00d?
02:21
Well, so Facebook says that they remove photographs of people displaying genitals or focusing in on fully exposed buttocks. They also restrict some images of female breasts if they include the nipple, but they say we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. Okay, so they've laid out the rules clearly and they've told us what we
02:41
can and we cannot post. Except, they're lying, because here is an image by a tattoo artist, Amy Black. The tattoo and the photograph are both by her. Now, Amy Black, she's incredible. She does tattoos for women who've had mastectomies. And so she will do whatever you like. I've seen one that's a beautiful grapevine crossing where the scars
03:02
were, winding across them and around them. But this is a photorealistic tattoo of a nipple that she did for a woman who had had a mastectomy. Amy Black has found that her content is regularly taken down from Facebook. Other communities that post it, as well as her, have received bans from the website ranging from 24 hours to 30 days, depending on the content
03:22
that they post and how many previous offenses they've had. And I'll get to the bans a little bit later because there are some more interesting things there. But I think that this is a really good example of how the rules don't really actually matter even after they've been spelled out. So again, this is another example of a photographer who did a project
03:40
and wanted to start a community for women to accept their bodies post pregnancy and postpartum. It was specifically around breastfeeding and the act of breastfeeding. So she established this community, received multiple pieces of hate mail from people within the community, and the community was ultimately taken down by Facebook as a whole.
04:00
Another similar example is an artist named Petra Collins who works primarily on Instagram. She had images removed that were not of rape or violence or hate but rather pubic hair because she showed a picture of herself in a bikini where she hadn't properly waxed and they took this down. What's interesting to think about is kind of the contrast
04:22
and the dichotomy that you look at when you see people like, for example, Justin Bieber, who's a major pop star in the States. We know who he is. That guy. So he did a campaign for Calvin Klein. The ads were put all over New York City, major billboards, online,
04:41
but with the difference that they actually photoshopped the pubic hair in. So again, it comes down to this relationship women have with their bodies and what is considered acceptable or not acceptable based off of these corporate censorship rules and norms. And these rules certainly affect women disproportionately as well as transgender people and other people who fit outside of the gender norm
05:02
because Facebook decides based on what the breast looks like, not who it actually belongs to. And so we've seen examples where trans men and trans women have had photos taken down. We've even seen examples where prosthetic nipples that looked real enough got a photo taken down. Now, this is a really fantastic campaign.
05:21
It was from Pink Ribbon, Germany, and it was called Check It Before It's Removed, which, just in case anyone didn't catch that, they're talking about both checking your breasts for lumps and for possible cancer, but also before it's removed from social media. And so they put this up there intentionally, designed these photos that fit perfectly on your Facebook wall or Twitter,
05:44
and encouraged people to actually break the rules. And what happened was that I posted this very image to Facebook, and because it was my second offense, I was banned for 24 hours from the site. Now, let me tell you what that's like because, you know, we've spoken to the media quite a bit the last couple of days, and I made the mistake of reading their comments.
06:03
And one of the things that I found was people saying: you don't understand, Facebook is not the internet, it doesn't really matter, you can go somewhere else. But what I found when I was banned was not only that I couldn't post to Facebook or send messages to my contacts, which might seem fairly insignificant, I also couldn't administer pages that I run for my job.
06:21
I couldn't use Spotify, which I pay for. I couldn't use Tinder. I couldn't comment on the Huffington Post, which uses Facebook's comments. And so an entire world was cut off for me, not just my Facebook network. Now, for me, it was only 24 hours, but I run a project called onlinecensorship.org that collects reports from individuals, and if you're ever censored, please come to us.
06:42
We collect reports from individuals who have experienced takedowns or account deactivations on social networks. And from those reports, we've learned that these bans can extend to 30 days and that they can be issued repeatedly. And so while support for terrorism might get your account taken down immediately,
07:01
violating the rules against nudity seems to result in ban after ban after ban. And so people who regularly break this rule will find themselves cut off for 30 days at a time just over and over again indefinitely. There's no reset button. So this is another example of an artist and a project that was actually removed from Instagram.
07:21
Again, Instagram is owned by Facebook. So she was looking at how you demystify a period. 50% of the world experiences this. We plan dates and weddings and vacations around these things, but it's nothing that anyone really talks about. So again, it's a fully clothed woman, but because of something in the image,
07:41
the entire image was removed as well as her account. And Instagram is owned by Facebook, and one of the other things that I've seen through my work and over the years is that different bodies are treated in different ways. Many people post their bikini shots on Instagram, but some of the content that I've seen taken down has been larger women wearing bikinis. So they'll have their photos taken down,
08:00
but skinny women will not. There's quite a bias there. Now Facebook also says that they allow photographs of paintings, sculptures, and other art that depicts nude figures. As we saw, the Venus de Milo, perfectly fine, high art, classic, everyone knows what it is. But what happens when you post lesser known art? Anyone recognize this person?
08:24
I'm guessing you do if you're American or maybe over 30. So this is Bea Arthur. She starred in a TV show called The Golden Girls. She also had a wonderful career before that, but this was about that era. This is a nude painting from 1991 called Nude Bea Arthur by John Currin. And this was my first offense.
08:43
So last fall at the Netzpolitik conference, I gave a talk just after this had happened. I posted this image, and I asked my friends to please report me, because the way that Facebook works is that it encourages you to snitch on your friends. So I asked my friends to do this. They reported it, and the result was
09:00
that the photo was taken down. I wasn't banned for any period of time, but the photo was gone and I couldn't repost it. So I was able to successfully appeal that decision, but it still counted against me as an offense. So Facebook has approximately 1.4 billion users. And when you think about this in the context of countries or states, they typically also have
09:21
more power and social capital than any country or king or state or anything in the world. So this is a project I did, if you can cue the video, in 2012 and 2013, in collaboration with someone named Pablo Garcia, who is an artist and professor
09:42
at the School of the Art Institute of Chicago. So how many of you have been to porn sites? Nice, thank you, come on, there's more, there's more. We've all done it. So with this project, we actually were really interested
10:00
in exploring this kind of taboo place of the internet, one that a lot of people don't admit to visiting, but that the majority of us frequent quite often. So we spent about two months on a site called Cam4. It's a site where it's not just read-only porn, as I like to call it, but read-write porn. So you can actually interact with people through chat windows.
10:22
So we started to ask them to pose, and we would give them images of really iconic, well-known pieces of art, and ask them if they could emulate and recreate them for us for a small amount of money. So we were trying to get at these questions about what is porn versus art,
10:41
and what is beauty versus manufactured beauty or real beauty, and how do these things juxtapose when you see what is considered by the masses as art versus porn. So this piece was also shown in 2013 in New York during a thing called Internet Week.
11:02
So you'd think New York is a very liberal part of the US, but the piece was put up, and within a few days, Google, who was exhibiting next to us, decided to have it taken down. So it comes down to this question of, it's 2013, it's the future, we're supposed to have jet packs, all this cool stuff should be happening,
11:22
but we're still debating this core moral question of what is art, what is porn, is there a difference, and how do you delineate those things? So I guess it really came down to that question of who makes the decisions, and how can you appeal when there's no one to hear you? Because when Google pulled down the work in the exhibition, I tried to fight it, I argued it,
11:43
and there was ultimately no one I could speak to, and that was while being physically present. So it was really a wake-up call about how the online transitions to the real. So to that question, who does decide? I think it's worth looking at the statistics of these companies to see who's working there and who's making these decisions.
12:02
Now, some of you may have seen Sarah Roberts' talk yesterday, and she talked about content moderation and who the people actually making the judgments are. And I think that that's a big part of it, but so is whoever makes the decisions at the top. And in this case, as you can see, this is Facebook's, I don't know if you can actually see it that well, so I'll just read it a little bit. This is the US staff of Facebook,
12:22
and the breakdown of ethnicity. 55% are white, 36% are Asian, and everyone else is under 4%. In terms of men and women, we don't have that slide here, but it's over 70% men in leadership positions. And that's on both policy and technology as far as we can tell.
12:41
And so what that says to me, now, I don't think necessarily that these are prudes, that these are people who believe that nudity is wrong, but this is so ingrained in American culture. We have it in the way that our films are regulated, the way that our television is regulated. And the problem here is that Facebook, as an American company, is making these decisions for the entire world.
13:02
But this is interesting because what they say about that is that they restrict the display of nudity because some audiences within our global community may be sensitive to this type of content, particularly because of their cultural background or age. Now, if I can read between the lines here, what I think they're saying is, we don't want the government of Saudi Arabia to block us.
13:21
It's a real threat, and I think economically what these companies see is we have to create this flat, dull global standard for speech that allows everyone to be happy. But what does that say when we're teaching our children that nudity is wrong, that women can't be topless but men can? I just wanna leave you with this slide here.
13:41
This is a picture from Berlin about three days ago, near Rosenthalerplatz. If you've walked around recently, you've probably seen this poster. It's for a local art exhibition. And I was sitting at a cafe, and I noticed the poster and I decided to watch. And you know what I didn't see is I didn't see a single parent cover their child's eyes.
14:01
Because it's fine, it's acceptable. So why is it okay on a street in Berlin but not on Facebook, a global platform that we all use? Why do we wanna teach people this double standard of women's bodies? Why do we wanna treat bodies like pornography? And I would even go a step further with what we've been talking about and say, why is pornography even wrong?
14:22
So as an artist, it's also, I think, really important, again, like what Jillian was bringing up, as artists and cultural capital creators, you start to think about the implications of this. And do you make nudes? Do you put them online? Do you expose them to a community of 1.4 billion? And what are the implications of that being deleted
14:41
to your personal practice, to your social circles and everything? And of course we recognize that there are concerns around security as well. We know that there's a talk tomorrow that we recommend, by Joana Varon and, sorry, the other speaker, from Coding Rights. Yes. Right there. There we go, excellent. So you should definitely see their talk, Send Safe Nudes.
15:01
But if you'd like to talk to us a little bit more, you are welcome to and we'd like to open up the floor for questions if that's all right with you. Anybody? No, Sean? Is there a mic? I don't know. Okay, thank you, thank you.
15:21
That was really interesting. What would you recommend as an alternative, alternative guidelines, or should there be guidelines at all? I know, Addie, I know you're a parent, and me as a parent, I wouldn't want the internet to be the one
15:40
that moderates, for better or worse, the whole concept of nudity and sexuality for my children. So do you have an idea of an alternative, alternative guidelines, or is that more of a provocation?
16:00
I mean, for me, so my son's six, and it's starting to become a question because he is becoming fluent enough in language to start navigating the web on his own. But ultimately, I don't see necessarily a difference between the online and the real. So it's a question of, do his friends online show it to him,
16:22
or does he discover it with his friends socially in real life? Because when I was younger, we would find the porn in our friends' parents' drawers, and that was how you discovered sex. So I don't really, for me, see a huge difference between the two. So from my perspective, I think that there are a couple things
16:41
that Facebook could realistically do to make this a little bit better. I think the first one is, treat women's and men's bodies the same. It's fine if they decide, for example, that they don't want to show anything below the waist. If they do that, fine, treat men and women's bodies the same. But right now, they're not doing that, and they're essentially perpetuating this sexism and teaching it to the next generation.
17:01
The other thing that I would say is that companies like YouTube, rather than completely banning nudity, put up an interstitial that says you must be 18 to click through and you have to have a YouTube account. Presumably for a parent, that would be enough of a barrier, provided you make sure that your child has the right account and has these settings turned on. They could also have child settings
17:20
that parents can set up. I would be fine with all of those things. I don't find that to be censorship, but I think that blanket banning it is what's really problematic to me. Hi, I was wondering whether you thought it would be a good idea to have a nationally-based algorithm
17:44
that tried to implement local norms so that you don't have one global standard. So I'm sort of torn on this, because I am sensitive to the fact
18:01
that there are some places where no one wants to see, well, I actually don't believe that. I think everyone wants to see this. But I believe that there are some places where it's treated differently. At the same time, I believe in one internet. I believe in a global internet that is for everyone. And I know that not everyone will agree with that,
18:20
and that's fine, but once you institute an algorithm like that, who's to say it stops there? Okay, now we're going to ban images of women entirely in this country? That would not be acceptable to me. And so I would be skeptical of treating cultures differently in that sense, rather than giving people the autonomy and responsibility for themselves and their children and families.
18:43
Are there more questions? Okay, then I say thank you, Jillian and Addie. A little applause.