
Interesting. Do you have similar concerns about cryptography?

Edit: to make 'similar' more precise: would you also say of cryptography that it is "cynical academic malpractice"?



I fully expect privacy disasters based on imperfect implementations of Differential Privacy. Do you not? Do the researchers not?
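To be concrete about "imperfect implementations": the textbook Laplace mechanism is nearly a one-liner, and the obvious way to write it is subtly broken. Mironov showed (CCS 2012) that sampling the noise with standard double-precision floats leaks the true value through gaps in the floating-point representation. A minimal sketch of the naive version, assuming NumPy; the function name and parameter values are mine, not from any particular library:

    import numpy as np

    def laplace_mechanism(true_count, sensitivity, epsilon, rng=None):
        """Textbook epsilon-DP release of a counting query."""
        if rng is None:
            rng = np.random.default_rng()
        # Laplace noise with scale = sensitivity / epsilon gives
        # epsilon-differential privacy for this query -- in exact
        # real arithmetic. Sampling with doubles, as done here, is
        # the flaw Mironov exploited; snapping the output is the
        # usual fix.
        return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    # A counting query changes by at most 1 when one person's record
    # is added or removed, so sensitivity = 1. epsilon = 0.1 is an
    # arbitrary illustrative privacy budget.
    print(laplace_mechanism(true_count=1042, sensitivity=1.0, epsilon=0.1))

The math is fine; the proof assumes real arithmetic the hardware doesn't provide. That gap between theorem and implementation is exactly where the disasters will come from.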

The superior alternative is to avoid sharing sensitive datasets and avoid keeping data whenever possible. No such alternative exists for many applications of cryptography.

But we live in an era where organizations find our private data impossibly tempting and are content to sacrifice the rights of individuals so long as those individuals can't fight back. This research gives such entities the excuse to build tools that should not be built and publish data that should not be published. By saying "OK, now it's safe (if you did everything right)" rather than "don't do that", it is the enabler of future privacy fiascos.

The answer, if there is one, is probably legislative: hold entities criminally liable for data breaches. Should such legislation pass, I wonder how much interest in this research would wane.


If you want to rip into Apple or Google or Uber for claiming they deserve a pass because they use privacy tech, feel free. Understand that this is distinct from most research on differential privacy.

The US Census Bureau collects demographic data from as much of the population as it can manage, and releases summary data in large part to support enforcement of the Civil Rights Act. It has a privacy mandate, but also an obligation to provide information in support of the rights of subpopulations (e.g. Equal Protection). So what's your answer here? A large fraction of the population gets disenfranchised if you go with "avoid sharing the datasets".

You end up with similar issues in preventative medicine, epidemiology, public health, where there is a real social benefit to analyzing data, and where withholding data has a cost that hasn't shown up yet in your analysis. Understanding the trade-off is important, and one can come to different conclusions when the subjects are civil rights versus cell phone statistics. But you are wrong to be upset that math allows the trade-off to exist.


"Privacy tech" is a perverse description, since this tech's existence results in a net loss of privacy -- without it, the data-sharing applications it powers would be more obviously irresponsible and more conservative decisions would be forced. A less Orwellian name would be "Anonymization tech".

If it were possible to wish away this tech, I absolutely would -- just like I would wish away advanced weapons technology if I could. In our networked era, the private data of individuals is being captured and abused at an unprecedented, accelerating rate, and whatever good this tech does cannot begin to make up for its role in facilitating and excusing that abuse.


There needs to be a word to describe the practice of arrogantly explaining a researcher's own results back to them on HN. Maybe HNsplaining?


Not the person above, but I do, yes.

Heartbleed, side channel attacks, etc.

Implementation in the real world matters.



