Congress has never made a law saying, "Corporations should get to decide who gets to publish truthful information about defects in their products" -- and the First Amendment wouldn't allow such a law -- but that hasn't stopped corporations from conjuring one out of thin air, and then defending it as though it were a natural right they'd had all along.

But in 1998, Bill Clinton and his Congress enacted the Digital Millennium Copyright Act (DMCA), a giant, gnarly hairball of digital copyright law that included Section 1201, which bans bypassing any "technological measure" that "effectively controls access" to copyrighted works, or "traffic[ing]" in devices or services that bypass digital locks.

Notice that this does not ban disclosure of defects, including security disclosures! But in the two decades since, corporate lawyers and federal prosecutors have constructed a body of legal precedents that twists this overbroad law into a rule that effectively gives corporations the power to decide who gets to tell the truth about flaws and bugs in their products.

Likewise, businesses and prosecutors have used Section 1201 of the DMCA to attack researchers who exposed defects in software and hardware. Here's how that argument goes: "We designed our products with a lock that you have to get around to discover the defects in our software. Since our software is copyrighted, that lock is an 'access control for a copyrighted work' and that means that your research is prohibited, and any publication you make explaining how to replicate your findings is illegal speech, because helping other people get around our locks is 'trafficking.'"

EFF has sued the US government to overturn DMCA 1201, and we just asked the US Copyright Office to reassure security researchers that DMCA 1201 does not prevent them from telling the truth.

We are:

Cory Doctorow [u/doctorow]: Special Advisor to Electronic Frontier Foundation

Mitch Stoltz [/u/effmitch]: Senior Staff Attorney for the Electronic Frontier Foundation

Kyle Wiens [u/kwiens]: Founder of iFixit

Note! Though one of us is a lawyer and EFF is a law firm, we're (almost certainly) not your lawyer or law firm, and this isn't legal advice. If you have a legal problem you want to talk with EFF about, get in touch at [email protected]

Comments: 73 • Responses: 18

jonl (22 karma)

What, in your opinions, are the best ways for net.activists to pursue legislative activism to prevent laws like the DMCA?

doctorow (20 karma)

Nice to see you, Jon! Thank you for your years of hard work with EFF-Austin.

I think that federal legislation to fix this is a long way off -- though who can tell, given the current legislative mess? -- but at the state level, a string of Right to Repair bills (18 in the last year!) held out enormous potential to make DRM a thing of the past, by imposing a duty on DRM users to unlock their DRM for legit purposes like repair and interoperability.

The R to R bills were killed by heavy lobbying from the corporate sector, but they'll be back. Statehouses are local, and in some ways these fights are a lot easier to win there.

KeithJairusM (13 karma)

We're seeing a huge push in the Auto industry surrounding Telematics, aka data being collected by your car that is only available to the dealer and not the public. Can you chat about the information security concerns this poses?

doctorow (25 karma)

We've seen a bunch of high profile attacks on cars (hellooooo Jeeps!) and it's increasingly obvious that a car is a 110MPH casemod that you put your body into, so getting security right is REALLY important.

Auto manufacturers have, for a variety of reasons, decided to treat the owners of cars as adversaries, designing engines and components that encrypt their data with keys that owners are not provided with, effectively locking the owners out of gaining insight into (and control over) their cars.

Here are a few of the reasons this is happening:

  1. It lets car manufacturers monopolize service: if diagnostic information can't be read without manufacturer authorization, the manufacturers can institute a licensing regime for who may fix their products. (Manufacturers have always been able to give their preferred mechanics "official authorized" status, but this goes a step further, making all unofficial mechanics into criminals because they bypass the car's DRM in the process of fixing it.) Less obnoxiously, this also allows manufacturers to charge thousands of dollars for commodity diagnostic tools that cost a few dollars to make;
  2. It lets manufacturers monopolize the parts market; smart engine components increasingly go through cryptographic handshaking before they are recognized by the car (this is often billed as an "anti-counterfeiting" procedure). The cryptographic secrets needed to complete the handshake aren't available to third-party manufacturers who make compatible parts, so perfectly functional parts don't work when installed in cars that do this (naturally, the official parts get much more expensive once the unofficial parts market is shut down);
  3. It lets manufacturers sell products to third parties that only work if your car treats you as an untrusted party -- for example, a manufacturer can promise an insurer that a car will produce faithful driving telemetry that the owner can't change.
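The "handshake" in item 2 is typically a simple challenge-response protocol keyed on a secret the manufacturer withholds. Here's a minimal, hypothetical sketch -- the names, secret, and mechanism are illustrative, not any specific automaker's scheme:

```python
import hmac, hashlib, os

# Hypothetical shared secret burned into "official" parts at the factory.
# Independent parts makers never receive it, so their parts fail the check.
FACTORY_SECRET = b"not-distributed-to-independent-parts-makers"

def car_challenge() -> bytes:
    """The car's ECU sends a random nonce to a newly installed part."""
    return os.urandom(16)

def part_response(secret: bytes, challenge: bytes) -> bytes:
    """The part proves it knows the factory secret without revealing it."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def car_accepts(challenge: bytes, response: bytes) -> bool:
    """The ECU recomputes the expected response and compares."""
    expected = hmac.new(FACTORY_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# An official part completes the handshake...
nonce = car_challenge()
assert car_accepts(nonce, part_response(FACTORY_SECRET, nonce))

# ...while a perfectly functional third-party part is rejected,
# regardless of its mechanical or electrical quality.
assert not car_accepts(nonce, part_response(b"third-party-guess", nonce))
```

Note that nothing in the check measures whether the part actually works -- it only measures whether the maker was given the secret.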

This all has profound security implications:

  1. It incentivizes auto companies to add digital components to everything, vastly increasing the attack surface;
  2. Worse, because all these measures are brittle (they only work if the driver doesn't know a secret hidden in their own car), manufacturers rely on DMCA 1201 to scare off security researchers who might reveal defects in their cars.

So we have cars that are increasingly vulnerable to software attacks, increasingly networked, and increasingly off-limits to independent scrutiny. You want Dieselgate? Because that's how you get Dieselgate.

KeithJairusM (5 karma)

Thanks for going so in depth! I totally agree! Another #Dieselgate could become a very real possibility right before our eyes. With so many privacy concerns nowadays, I fear, however, that consumers are starting to become numb to their information being exposed. With DMCA 1201 scaring off researchers, do you see any other way for the public to become truly informed about their exposed information?

doctorow (5 karma)

I think that there isn't a good way, alas. The thing about defects is that it's impossible to prove that you've gotten rid of all of them. Independent researchers are always discovering defects in code that was previously thought to be solid and bug-free (remember the OpenSSL kerfuffle?). We need pluralistic scrutiny of the systems we depend on -- people from different backgrounds with different insights and angles of approach -- in order to continuously improve our systems.

coryrenton (7 karma)

Has anyone successfully used the DMCA itself against corporations that typically use the DMCA as a bullying cudgel?

doctorow (6 karma)

Corporations do threaten each other with DMCA 1201 (that's the part that bans circumvention) all the time. The threat of being sued over bypassing DRM -- to make new inkjet cartridges, spare parts for phones, or tools that jailbreak devices so they can use alternative app stores -- is never far from the thoughts of businesspeople who have noticed that the dominant players are making titanic margins on commodity goods, services, consumables and parts.

cipher_wire (3 karma)

What are the realistic consequences for a researcher who does bypass these locks, as it were?

In the title it says that this threatens this year's security talks. I recognize the harm that could come to researchers and speakers, but what is it about this year in particular that makes this an issue?

Just want to say a quick thanks to the EFF. Without you the world would be far more of a burning crap carton.

doctorow (2 karma)

Thanks! You're right, it's not just this year's talks - every year at every tech conference, there are talks where researchers weigh the threat of reprisals from the people whose mistakes they're revealing against the right of users to know whether the systems they're trusting are fit for purpose.

wizoomer95 (3 karma)

Hi, EFF! Big supporter of yours. My question is twofold as it relates to your lawsuit to declare DMCA section 1201 unconstitutional (Green v. U.S. Department of Justice).

  1. First one is simple: any updates with regards to that? The last I heard was in 2016 -- granted, that was an election year with a shift from one Presidential administration to another -- but I would have thought that something would have come of it by now.
  2. In addition to the free speech arguments that the lawsuit makes (which I agree with as 100% valid), one thing I was thinking about was with regards to DMCA section 1201 being unconstitutional has to do with the copyright clause of the US Constitution. That clause states that Congress has the power to "promote the progress of science and the useful arts" by enacting laws like patent and copyright laws. However, given the actual text of the law in question and the results of the enactment of the law, one could argue that DMCA section 1201 has prevented progress instead of promoting it. How can a law that enables copyright holders to "control access" through technological measures to an otherwise publicly available copyrighted work promote progress? How does a range of otherwise legal activities, from ripping DVDs/Blu-Rays to create new fair uses of the copyrighted works, to Jailbreaking iPhones, and even security research, that is put in legal jeopardy by copyright law (even when the activity in question has nothing to do with copyright infringement) promote progress? Couldn't you argue that on its face, DMCA section 1201 is unconstitutional because it's a copyright law that doesn't live up to copyright's constitutional purpose of promoting progress, or is that just wishful thinking or a misunderstanding on my part?

Thank you in advance!!

doctorow (6 karma)

(Thank you for your support!)

IANAL, so I really can't answer 2; my lay opinion is that we had hopes for the "progress clause," but there was a lot of disappointment about that line of attack after Eldred.

As to 1: we're still waiting for the judge to rule on the government's initial motions so we can get down to business. It's been a very long time! Another judge in that circuit recently ruled on a similar motion in a similar case, so we're hoping that means we'll hear from our judge soon.

coder_ent (3 karma)

Is there anything that EU/UK Citizens should be prepared for or more aware of ?

doctorow (6 karma)

Article 6 of 2001's EUCD is very similar to DMCA 1201 and has created plenty of mischief in the EU.

The UK, of course, has a very fraught relationship with EU directives, so it's hard to say what will happen there, though in my wildest dreams, I like to think that we can at least salvage an equitable copyright regime out of Brexit (when life gives you SARS, you make sarsaparilla).

The EUCD is scheduled for its first overhaul since 2001. The proposed revision is likewise a big, gnarly hairball of mostly technical, not overly objectionable changes to the EU's copyright rules.

BUT! On GDPR day, an MEP named Axel Voss reinstated two discredited proposals that have the power to destroy the internet as we know it, known as "Article 13" and "Article 11."

Article 13 requires that anyone who provides a platform that can be used to publicly display any copyrighted work (that's Reddit, of course, but also little Minecraft servers that let you make your own skins, as well as Github and everything else, to a first approximation) must allow anyone to submit millions of fingerprints of copyrighted works, and anything the public tries to post must be matched to these fingerprints and discarded if they are near-matches or perfect matches. There are no penalties for falsely claiming copyright on works owned by someone else, or works in the public domain, making this an excellent tool for continent-wide censorship (a griefer could upload the works of Shakespeare to Wordpress and no one could quote the Bard on their WP site; or a political leader could claim copyright in an embarrassing video and prevent its spread in the runup to an election).
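To see why this filtering mandate doubles as a censorship tool, reduce it to pseudocode. This is a hypothetical sketch of an Article 13-style filter, not any real platform's implementation; for simplicity it uses exact hashes where real proposals demand fuzzy "near matching," which over-blocks even more:

```python
import hashlib

# Fingerprints of works that someone -- anyone -- has claimed copyright in.
claimed_fingerprints = set()

def claim_copyright(work: bytes) -> None:
    """Anyone may submit a fingerprint: the mandate includes no check
    that the claimant actually owns the work, and no penalty for lying."""
    claimed_fingerprints.add(hashlib.sha256(work).hexdigest())

def can_post(upload: bytes) -> bool:
    """Uploads matching any claimed fingerprint are discarded."""
    return hashlib.sha256(upload).hexdigest() not in claimed_fingerprints

sonnet = b"Shall I compare thee to a summer's day?"  # public domain
claim_copyright(sonnet)           # a griefer "claims" Shakespeare
assert can_post(sonnet) is False  # now no one may quote the Bard
```

The structural problem is visible in `claim_copyright`: the filter's blocklist is populated on the honor system, while the blocking itself is automatic and continent-wide.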

Article 11 bans posting links to news sites without a paid license (no, really). It also doesn't define "link" or "news site" -- devolving those definitions to each of the 28 EU members, and then requiring anyone who operates a service that has links to comply with all 28 rules.

After these were reintroduced into the EUCD, we helped gather 1,000,000 signatures on petition platforms, and that was enough to force the EU Parliament to schedule a debate on these clauses (otherwise they would likely have passed with the EUCD itself). That debate is coming up on Sept 10 or 11. You should go sign RIGHT NOW, and then you should tell FIVE OF YOUR FRIENDS about this. It is an utter catastrophe in the offing and time is running out.

disfit (3 karma)

Copyright is awarded to 'professional' creators. Yet even a like (upvote, retweet, whatever) by a regular user might be considered the result of a creative process. But even if a singular click might not be considered something that falls under copyright protection, at the very least an original post, tweet, photo, meme or any other expression should be. Yet every piece of legislation I have seen only recognizes the professional creation as falling under copyright. Most if not all social media, and even bog-standard old media, thrives on the creation and proliferation of non-professional content. In my mind the regular non-professional user is therefore supplying most of the original content and/or is responsible for the proliferation of the non-professional content of others, while not being rewarded for this. At the same time the regular user is paying for the right to create and distribute original (personal) content by supplying (social) media with their data and their behavior, which is used to generate advertising income at huge but hidden (privacy) costs.

I have wondered if recognition of this non-professional / personal content as copyrighted content would create a situation that creates a better equilibrium between the user and the platforms (and the professional content creators). Experiments (for lack of a better word) like the Brave browser and its Basic Attention Token seem to me to be a possible commercial or market-based solution to this uneven playing field.

On the other hand I would be very much in favor of a more lax copyright legislation, or maybe better a return to its original goal: a limited time of recovery, monopolistic profitability and/or personal recompense after which the content (or work) would fall into public domain.

This is different from most software that is now protected under copyright or patent. This material is being treated wrongly (in my eyes) as either an original copyrighted work or an original and unique (and therefore patented) solution.

I am not the first, but maybe the second, to state that the writing of source code can be seen as an art and the result of a creative process -- one that I can be in awe of, admire, and see as a thing of beauty or aesthetically pleasing. But those cases are rare: most programming and most programmers do not excel into that realm, nor is their work a unique and new (business) solution to a (new or old) problem. So I have long doubted the special distinction software gets.

With the rise of Machine Learning (Deep Learning, Neural Networks, or any other term of the day), there is more and more unique coding being performed and taken into production every day. And even though this might lead to code and a solution that is unique, we do not give a copyright or patent to the ML system that created it, but to the owner of the originating algorithms and the user data that they can freely use without any recompense.

I am fully supporting EFF's efforts regarding the DMCA and the call to protest the past and upcoming EU copyright legislation. But I wonder if this will be enough to counter the decades long misuse and strengthening of bad legislation and jurisprudence by the now seemingly unbeatable oligarchies, duopolies and/or monopolies.

It feels to me that more people are more easily a consumer of something new and better (or alas perceived as better), than a protester of something old and bad.

What would be more effective in your eyes: a change of regulations / laws or the re-interpretation of the existing regulations and laws via jurisprudence? Or would a commercial or market alternative be more successful?

Alternatively: what would you suggest people that want to make a difference put their energy towards? Change of business or change of politics?

Many thanks and all the best :-)

doctorow (5 karma)

Thanks for the thought-provoking question!

I think you've got the wrong end of the stick, a little. While you're right that copyright legislation is universally made in light of the needs of (primarily) giant entertainment companies and (distantly, secondarily) professional artists, the laws are nevertheless applicable to all creative works in fixed media (recordings, data streams, etc.).

This is a real problem. Copyright is made to respond to the needs and desires of a big, complex industry (entertainment). Industries need legal frameworks, so the idea of something like copyright is something I support (my dayjob is writing novels, and my EFF work is part-time), though I have some ideas about how the current system could be fixed to make it fairer to artists and less tilted towards the corporations we sell our works to.

But every industrial regulation needs to decide who it applies to. I am also in favor of banking regulations, but I don't want to have to consult them when I give my 10-year-old her allowance every week. So we use tests to decide who these industrial regs apply to, and in the case of copyright, that rule is:

  • If you make or handle a copy of a creative work, you are part of the entertainment industry and must abide by its rules

This rule was reasonable when every book had a printing press in its history and every film had a film lab in its history, but now we handle and duplicate creative works like we breathe -- the ctrl-r keystroke you entered to reload this page and read this reply triggered hundreds, if not thousands, of actions that qualify as part of the entertainment industry.

Yet we have stubbornly clung to this outdated test as the right way to figure out who needs to abide by copyright law. Virtually everything we do on the internet is thus within copyright's scope and virtually everything we do is on the internet -- which makes copyright the de facto legal framework for politics and education, romance and play, automotive repair and agribusiness.

As a guy who pays his mortgage with entertainment industry checks, this really alarms me. A copyright rule that is simple enough for my 10 year old to understand and stay on the right side of while reading an ebook will never be subtle and technical enough for me to use while I negotiate the sale of my next ebook to a multinational publishing conglomerate: either the rules have to be simplified to uselessness or made so complex that everyone is breaking them, all the time (recall that it is virtually impossible to violate copyright law while reading a printed book, and virtually impossible not to while reading an ebook).

I think that so much of this gets solved by just changing copyright to apply to the entertainment industry, as it always has. If we need rules to govern security disclosures, let's make security rules -- not distort entertainment law to govern who can warn you about wireless attacks on your devices.

questionedauthority (2 karma)

Is it time for a complete overhaul of copyright law - not just the DMCA, but the whole system?

doctorow (2 karma)

I think there's an element where DMCA 1201 is both a cause and an effect of bad corporate law. The law has allowed corporations to monopolize so much of their ecosystems, freezing out competing consumables, parts, service, apps, etc. -- and they get really rich from that and lobby mightily to expand these powers. They also get to use the surplus capital from this kind of monopolization to buy potential competitors, or price them out of existence with predatory pricing.

AsIfProductions (2 karma)

Taking a shot in the dark, what might be your most realistic (ontologically conservative) prediction of how copyright law will change in the coming decades?

kwiens (6 karma)

It is very possible that nothing will change. Bob Goodlatte is the chair of the House Judiciary Committee, which would lead the charge on changes to copyright law. He held an extensive series of hearings a few years ago, and it really looked like he wanted to open things up and make extensive changes.

Corynne McSherry from EFF testified that 1201 should be abolished, and I spent a lot of time in DC talking to staffers about the possibility of a major fix. I think we were making good headway.

Then the political winds shifted, and talk of major changes has died down.

The biggest modification to this part of copyright law is the cell phone unlocking bill that we passed, which was the first time that Congress overrode the Copyright Office. Turns out that taking away Americans' ability to unlock their own devices wasn't very popular.

Even during that battle, as we were fighting for a narrow commonsense fix, we faced heated opposition from the cell carriers and the entertainment industry.

The Copyright Office has suggested a permanent exemption to 1201 would make sense for the purposes of repair. We'd really like to get something like that passed—but there needs to be a loud public outcry to make it happen.

This is a good template for what we'd like to accomplish.

The only thing necessary for the triumph of evil is for good men to do nothing.

prof_oblivion (4 karma)

A question related to the need for a loud public outcry: do you have thoughts on how to appeal to the masses concerning harmful laws, and more broadly, privacy in general?

How do we get the average person concerned about privacy? It’s been eroding for years, and it’s getting worse because companies get away with surreptitiously tracking users and collecting and forwarding that data for profit. An alarming number of people don’t seem to care. Were the average user to covet their privacy, taking steps to prevent (or hinder) themselves from being tracked, I would think it would cripple large scale privacy violations as a profitable option. How do we convince our fellow end user to value privacy as a right that must be protected, as something more valuable than convenience?

doctorow (9 karma)

Funnily enough, I just turned in final edits on an essay about this about an hour ago!

Here's how I think we need to think about these issues.

The problem with DRM, privacy invasion, and the like is that they produce immediate, concentrated gains for their proponents and diffuse, far-off losses for everyone else. A company that uses DRM starts making bank on parts and service right away, while you only experience the harms gradually, when the DRM-locked devices you buy break down, a long time from now (probably!).

So at first, activists have a hard time convincing you that there's anything wrong. You just got a new console or car or phone or whatever, and it works well, and the ways it can fail badly aren't apparent and won't be for some time.

This means that we start to accumulate technology debt: every one of us ends up with pockets and drawers and desks and garages full of DRM-poisoned gadgets, and we don't realize we made a bad trade-off until they start breaking down.

Because it's so hard to get people to care about bad stuff until it happens, this debt mounts and mounts and finally starts to come due. When it does, people begin to convince themselves that there's a problem, as they are hacked, or ripped off, or end-of-lifed, or any of the other DRM horribles are visited upon them.

I call this moment "peak indifference" -- the moment at which the number of people who agree there's a problem starts to grow on its own, without any necessary action from activists.

After peak indifference, the activist's job changes from convincing people that there's a problem to convincing them that it's not too late to do something about it -- that is, we're in a race between "peak indifference" and nihilism, and it's a very different kind of race.

We're at that moment now, the race between nihilism and no-return, in a bunch of domains: climate, for example. But also privacy and DRM. The good news is that in some ways these are all aspects of the same fight - the fight to put evidence-based, fair policies ahead of corporate profits. So as we win victories and recruit allies in one domain, they help us in the rest.

doctorow (5 karma)

As a science fiction writer, I know how ridiculous it is to claim to be a fortuneteller!

But as a full time, working artist, here's what I hope copyright becomes: a technical set of regulations for the entertainment industry, ideally one that deals fairly with labor (me) and capital (my publishers).

That would mean retreating from the ridiculous position that since software is copyrighted, and everything has software in it, everything should be governed by entertainment regulations (see also: all online conversation is fixed, all fixed communications are copyrighted, therefore entertainment regulations should regulate all conversations)

BalkanLabs (1 karma)

Any place to donate some crypto for this cause?

doctorow (1 karma)

Here you go!

Thanks for your support.

trai_dep (1 karma)

Thanks so much, everyone!

Before, copyrights were important, but they weren't life-threatening or a factor in whether our economic infrastructure would function. But copyrights will soon(ish) shield the autos and trucks being driven on public streets. They'll protect the IP running Internet of Things devices controlling our physical environments, further threatening the internet: their poorly implemented defenses let them be conscripted into botnets capable of wreaking havoc on the entire internet.

DMCA seems even more important now than before. Literally life- and livelihood-threatening. Yet if the DMCA prevents third-party audits or review, it's a vastly larger threat. Is there a recognition by regulators and policy-makers of this shift?

I'm skeptical, since many Congress members don't seem very technically competent. And some (many?) seem resistant to admitting that, let alone becoming more competent.

doctorow (3 karma)

There have been years of on-again/off-again efforts to legislatively reform DMCA 1201. A simple fix is to change the law so that it's only illegal to break DRM if you're infringing copyright. If you're breaking DRM to do something that doesn't violate copyright itself, you're golden.

The state-level Right to Repair bills that got off to a great start last year couldn't undo the federal copyright statute, but they did limit the ability of manufacturers to use DRM in shady ways, like blocking third-party parts and service. The states can't make it legal to break DRM, but they can make it illegal to market a product whose DRM is used in abusive ways. A law on those lines in California would have huge ripple-effects across the nation, because the state is so populous.

Ultimately, I think we'll get a series of steps that lead up to real change: maybe a well-publicized DRM scandal will convince people to pressure lawmakers in a big state for Right to Repair, and that will lead to businesses that thrive in a DRM-free world, and they'll lobby other states for more and broader Right to Repair laws, which will create more popular sentiment against DRM, and more businesses that lobby for better laws, and more open products that don't even try to use DRM because there are so many states that limit its use, and so on and so on.

yes_its_him (-1 karma)

Don't you think it's probably worth thinking about why this is taking place? There's probably a need for some sort of control here, and the wrong mechanism is being used because it's all there is.

There's quite a bit of information that people are prohibited from disclosing in order to protect society as a whole. Not saying that there are no abuses of this, but clearly there is a need to limit some types of disclosures because the disclosure itself raises risks.

It's also the case that society puts a value on privacy, and makes it illegal to attempt to gather and to divulge certain types of information simply because doing so is not in the interest of the one whose privacy is being violated.

There probably needs to be some sort of specific policy about this type of information, which is directly related to the efforts of the organizations that own the products.

doctorow (5 karma)

I don't think there is "quite a bit of information that people are prohibited from disclosing." While your disclosure of my private information is regulated (albeit not as strongly as I'd like!) and while there are narrow domains of classified and secret government info (along with the odd trade secret), I don't think there are any instances of you being prohibited from disclosing true facts about things you use or own. You can tell the world about any defect you find in any product or service you use -- unless the system is digital, and the manufacturer has designed it so that you have to bypass a copyright lock to discover the defects in it.

I think there's an interesting debate to be had about whether someone should be in charge of deciding when it's OK to make truthful, factual disclosures about defective products and services (though I'm going to take the "no" side of that debate!), but I think you'd have to search far and wide to find a disinterested party who thinks that corporations should get to make that call about their own products. They have an obvious, gigantic conflict of interest there.

Remember that the bad guys here -- criminals, surveillance software makers who sell to autocratic governments, griefers, etc -- are not affected by this at ALL. You only need to worry about liability for security research if you disclose your findings. If all you do with your knowledge of a defect is make a weapon out of it and use it to attack everyone else, the manufacturer has no way to know whom to threaten.

Remember also that security researchers make disclosures because they have learned important, urgent facts about defects in systems whose users need to know about those facts. Experience tells us that banning researchers from making disclosures without permission from the corporation that stands to lose from them just drives researchers into anonymously dumping their work (e.g. on pastebin) -- it doesn't drive them to make coordinated disclosures that give the companies they don't trust time to plan a fix before the news goes out.

Corporations that want to coordinate disclosures with security researchers should be confined to using enticements ("Show us before you go public and we promise we'll fix the bugs you've found, and quickly!") not threats ("If you don't let us decide whether other people get to know the facts you've learned, we'll sue you into oblivion!").

yes_its_him (1 karma)

While your disclosure of my private information is regulated (albeit not as strongly as I'd like!)

At the risk of making a point via hyperbole:

What if disclosing your private character flaws is in the public interest? How else are you going to be motivated to correct them, if researchers are not free to publicize their findings?

doctorow (5 karma)

My character flaws (and there are many of them) belong to me. Your car (computer, phone, thermostat, pacemaker, tuned-mass seismic damper) belongs to you. The fact that I helped you install them or sold them to you or whatnot does not give me the right to determine how you use and talk about them.

The better analogy here is to Yelp reviews of poor-quality tradespeople: should plumbers get to decide whether you publicize the fact that they charged you a fortune and didn't fix your toilet?

yes_its_him (1 karma)

I think this is too simplistic, though. A web site doesn't "belong to you" in any meaningful sense just because you use it. You might take advantage of services it offers, in the same way that an individual might service clients. If reverse-engineering a website to find its weaknesses is in the public interest, then someone vetting your school transcripts, medical records and banking transactions might be not so dissimilar.

And while the negative review question is an interesting one, if only because of the inherent limitations on the reliability of crowdsourced information, I don't think it's a valid analogy for what security researchers are doing. Even if we limit the scope to product manufacturers, their product may be completely serviceable for the intended purpose and available at a very attractive price, yet there may be small flaws completely unrelated to normal use of the product that are very difficult to find that can be deliberately exploited. You can say the bad guys already know this, but that assumes bad guys are monolithic and a few bad guys knowing something is the same as every bad guy knowing something, when that's clearly not the case.

doctorow (3 karma)

I don't think I understand the objection. Are you saying that the maker of a "serviceable product" that has flaws should get to decide whether its customers can discuss those flaws? When I buy a product, I don't care about its "intended purposes." I care about my purposes. How do I know that the product is "completely serviceable" for my purposes unless I can find out about its defects?

yes_its_him (1 karma)

I am saying that I think that is a useful area of policy to discuss, and I would not take at face value the notion that people who provide a service through technology have inherently fewer privacy interests than people who provide a service through manpower.

The arguments for making defects known are similar to the arguments in favor of a Chinese-style social reputation score. (Or, reportedly, the same sort of thing as implemented via facebook.) Why not know whom you are dealing with, warts and all?

doctorow (3 karma)

Because centuries of consumer protection law and policy have protected the rights of the public to discuss and debate the flaws of systems and products; while human rights and privacy laws have limited the ability of corporations and states to gather and disclose private information about members of the public.

A discoverable fact like "If you increment the URL for your account data on this service, you get someone else's account data" is not private information. It's public and visible to anyone who looks for it.
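The incrementing-the-URL bug described here is the classic insecure direct object reference (IDOR). A toy sketch of the flaw -- hypothetical service, illustrative names and IDs:

```python
# Account records keyed by guessable, sequential IDs.
accounts = {1041: "alice's data", 1042: "bob's data"}

def get_account(requesting_user_id: int, account_id: int) -> str:
    # BUG: the service never checks requesting_user_id == account_id,
    # so "incrementing the URL" (account_id + 1) returns a stranger's
    # record. The fix is a one-line ownership check before the lookup.
    return accounts[account_id]

# Alice (user 1041) fetches her own record, then simply adds one to the ID:
assert get_account(1041, 1041) == "alice's data"
assert get_account(1041, 1042) == "bob's data"  # Bob's data leaks
```

The point of the analogy stands: the defect isn't hidden private information; it's in plain view of anyone who tries the next number.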

yes_its_him (1 karma)

I don't think you are interested in what I am saying, which could simply mean I am not saying anything interesting, but I'm not 100% sure that's the only reason.

Even here, you are simply saying that since it's always been this way, including dissimilar handling of otherwise similar concepts based on whether we are talking about a "system" vs. a "person", then it has to always be this way, and that may not necessarily reflect how needs change over time.

I think if I argued that your DNA left on a cup was a discoverable fact that entitled me to use any information I learned from it, you wouldn't necessarily think that was a great idea. That's visible to anyone who looks for it, too.

doctorow (3 karma)

If my DNA was part of a service I sold to you, I think you'd have a legitimate interest in studying it and disclosing what you found.