Opinion | Do You Know What You’ve Given Up?

We’ve all been making some big choices, consciously or not, as advancing technology has transformed the real and virtual worlds. That phone in your pocket, the surveillance camera on the corner: You’ve traded away a bit of anonymity, of autonomy, for the usefulness of one, the protection of the other.

Many of these trade-offs were clearly worthwhile. But now the stakes are rising and the choices are growing more fraught. Is it O.K., for example, for an insurance company to ask you to wear a tracker to monitor whether you’re getting enough exercise, and set your rates accordingly? Would it concern you if police detectives felt free to collect your DNA from a discarded coffee cup, and to share your genetic code? What if your employer demanded access to all your digital activity, so that it could run that data through an algorithm to judge whether you’re trustworthy?

These sorts of things are already happening in the United States. Polling suggests that public anxiety about privacy is growing, as data breaches at companies like Facebook and Equifax have revealed how much information we’ve already traded away — and how vulnerable we can find ourselves when it’s exposed. Following the example of the European Union, which toughened its privacy regulations last year, officials in city halls, state capitals and Washington are considering new rules to protect privacy. Industry leaders are scrambling to influence that debate, and to rewrite their own rules.

It seems like a good moment to pause and consider the choices we’ve already made, and the ones that lie ahead. That’s why Times Opinion is launching The Privacy Project, a monthslong initiative to explore the technology, to envision where it’s taking us, and to convene debate about how we should control it to best realize, rather than stunt or distort, human potential.

We mean for this exploration to be thorough and the debate to range widely, from arguments for radical openness to questions about whether, through their access to detailed information about so many of us, corporations and politicians have already gained dangerous power to manipulate how we perceive the world. This project will inevitably consider the work of The New York Times, along with that of other media companies, since, as our publisher writes, this newspaper’s own commerce depends to a degree on the gathering and sharing of people’s data. (So, of course, does its product — journalism.)

We intend to challenge our own assumptions and biases about technology and the nature of privacy, along with those of our readers. At first blush it seems deeply creepy to me, for example, that Shenzhen, China, is considering automatically texting fines to jaywalkers identified through face-recognition technology. But when you think about it, is that necessarily such a bad idea? After all, the law is the law, and if facial recognition could nab all violators, without the racial bias that can warp enforcement by human officers, wouldn’t uses like this result in a fairer world?

The reality is that right now the algorithms that do the recognizing tend to enforce, rather than overcome, racial bias. Even in private hands, and even when the gizmo is as seemingly cool and innocuous as a doorbell video camera, this new technology is already being used to profile people in ways that seem certain to particularly hurt people of color. In the hands of the state, such tools radically increase its power.

We’re looking to you to participate in this project. We’d like to publish your stories about how the sharing of your data hurt or helped you. When it comes to debating the right path forward, we hope to hear from technologists and policymakers, whistle-blowers and tech executives, advocates and academics, and anyone else who has an original and important solution or idea to contribute. Eventually, months from now, having considered this debate, the Times editorial board will weigh in with its own proposed solutions.

There is no explicit right to privacy in the Constitution. The word “privacy” doesn’t even appear. As Tim Wu writes, the notion of privacy for everyone — rather than just for the rich — is of relatively recent vintage. Maybe its time is passing. People may tell pollsters they’re worried about their privacy, but millions are also happily paying some of the world’s most all-seeing corporations to install listening devices in their homes. It may be that, as some of our contributors argue, a world in which people more freely share intimate details will be a world that is more honest, healthy and fair.

On the other hand, privacy came to be recognized by the United Nations as a universal human right for good reason. Privacy sustains space for free thought and expression, for the growth that comes from mistakes without public shame. It’s a bulwark against the power of the state and the society, the workplace and the marketplace.

In recent years, as we’ve been blurring the boundaries between what’s public and what’s private, we’ve been doing so largely by accident, or by leaving the decisions to the vagaries of innovation and the pull of market incentives. As consumers and as citizens, we need to understand the benefits and the costs, and make deliberate choices. Rather than hurriedly consenting to someone else’s privacy policy, it’s time for us to write our own.

James Bennet, the editorial page editor since 2016, oversees the editorial board and the Letters and Op-Ed sections. He was previously the editor in chief of The Atlantic and, before that, worked as a correspondent for The Times for 15 years.
