Highest Rated Comments


DanielleCitron (427 karma)

Great question. That is what Bobby Chesney and I call the Liar's Dividend--the likelihood that liars will leverage the phenomenon of deep fakes and other altered video and audio to escape accountability for their wrongdoing. We have already seen politicians try this. Recall that a year after the release of the Access Hollywood tape, the US President claimed that the audio was not him talking about grabbing women by the genitals. So we need to fight against this possibility as well as the possibility that people will believe fakery.

DanielleCitron (113 karma)

Great questions. The social risks of deep fakes are many. They include believing fakery and the mischief that can ensue, as well as disbelieving the truth and deepening distrust, often to the advantage of those seeking to evade accountability, which Bobby Chesney and I call the Liar's Dividend. In court, one would have to debunk a deep fake with circumstantial evidence when (and I say when deliberately) we get to the point that we cannot, as a technical matter, tell the difference between fake and real. Hany Farid, my favorite technologist, says we are nearing that point. We can debunk the fakery, but it will be expensive and time consuming. I have a feeling that you are really going to enjoy my coauthored work with Bobby Chesney on deep fakes.

DanielleCitron (97 karma)

Great question. We need both lawmakers and social media companies on the case. Social media companies should ban harmful manipulated or fabricated audio and video (deep fakes or shallow ones) showing people doing or saying things they never did or said. Companies should exempt parody and satire from their TOS bans. This will require human content moderators, an expensive proposition, but one worth the candle. Bobby Chesney and I have more to say on this front in our California Law Review article on deep fakes. Now for lawmakers. Mary Anne Franks and I have been working with House and Senate staff on prohibiting digital forgeries that cause cognizable harm like defamation, privacy invasions, etc. Law needs to be carefully and narrowly drafted. It likely will not come in time to meet the 2020 moment, so we also need to be much more careful consumers and spreaders of information.

DanielleCitron (46 karma)

Great question. This is why we must be very careful in defining digital impersonations or forgeries: definitions must be narrow enough to exclude speech of legitimate concern to the public, including parody and satire. As Mill and Milton worried, government can be counted on to disfavor speech that would challenge its authority. Hence, any regulatory action must be narrowly tailored and circumscribed to harmful digital forgeries amounting to defamation, fraud, invasions of sexual privacy, and the like, and must exclude matters of public importance, including parody and satire.

DanielleCitron (42 karma)

Great and difficult question. Sometimes, there is nothing someone could have done to prevent the sexual abuse. People doctor photos and create deep fake sex videos, so there is often literally nothing the victim could have done differently. I also don't want people to stop expressing themselves sexually. This generation shares nude photos, and there is nothing wrong with that. The key is trust and confidentiality. We need to stress the importance of confidentiality, and law needs to protect against invasions of sexual privacy.