White House moves WhiteHouse.gov to HTTPS by default, tying privacy to security


A .gov website that uses HTTPS encryption by default for its visitors is a superb example of “privacy by design.” On March 6th, the Federal Trade Commission enabled encryption for FTC.gov. When I visited whitehouse.gov tonight, I found that the White House digital team had flipped what is likely the most prominent government website in the world over to HTTPS. The White House Web team confirmed the change just after midnight.

According to Leigh Heyman, director of new media technologies at the White House, over the next few days the team will be migrating other domains, like the bare domain name, whitehouse.gov, and m.whitehouse.gov, over to HTTPS as well, joining www.whitehouse.gov.

“Americans care about their privacy, and that’s what the White House’s move to HTTPS by default is about,” said Eric Mill, an open government software engineer at 18F. “The White House’s use of HTTPS protects visitors’ personal information and browsing activity when they connect to whitehouse.gov across the vast, unpredictable network of computers that is the internet.”

If you’re unfamiliar with HTTPS, it’s a way of encrypting your connection to a Web server. Specifically, HTTPS refers to layering the Hypertext Transfer Protocol (HTTP) on top of the Secure Sockets Layer (SSL) or Transport Layer Security (TLS). What that means in practice is that your requests to the Web server and the pages it returns are encrypted in transit and decrypted only at either end. Why does that matter? Consider, for instance, someone looking up sensitive health information online who visits a government website without HTTPS that also collects data: anyone positioned on the network between them could observe exactly what they were reading.
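In concrete terms, “layering HTTP on top of TLS” means the client wraps its connection in a TLS context before any HTTP request is sent. A minimal sketch using Python’s standard `ssl` module shows the verification defaults a careful, browser-like client starts from (no network connection is made here):

```python
import ssl

# An HTTPS client wraps its HTTP socket in a TLS context.
# ssl.create_default_context() gives the settings a careful client
# should start from: verify the server's certificate chain against
# trusted roots, and check that the hostname matches the certificate.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate chain must verify
print(ctx.check_hostname)                    # hostname must match the cert
```

Both checks print `True` by default, which is exactly the authentication property that lets a visitor trust they are really talking to whitehouse.gov.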

“Use of https is generally considered to be good practice, however, as opposed to unencrypted, regular http, although it adds a small amount of extra processing and delay to do the encryption,” commented Eugene Spafford, a Purdue University computer science professor and founder and executive director of the Center for Education and Research in Information Assurance and Security.

“HTTPS primarily provides three things: greater authentication, stream privacy, and message integrity. A quick look at the site doesn’t reveal (to me) anything that would likely require privacy or heightened message integrity. The most immediate consequence is that parties connecting to the website can have increased confidence of the site’s authenticity because a signed certificate will be employed. Of course, most people don’t actually verify certificates and their roots (cf. Superfish), so this isn’t an ironclad identification.”
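Of the three properties Spafford names, message integrity is the easiest to see in isolation. Here is a hedged sketch using Python’s standard `hmac` module, a toy stand-in for the integrity checks TLS performs on every record (the key and page contents below are invented for illustration):

```python
import hashlib
import hmac

# Shared secret between client and server. In TLS this is negotiated
# during the handshake; here it is hard-coded for the sketch.
key = b"session-secret"

def tag(message: bytes) -> bytes:
    """Compute an integrity tag (HMAC-SHA256) for a message."""
    return hmac.new(key, message, hashlib.sha256).digest()

page = b"<html>official policy text</html>"
t = tag(page)

# The receiver recomputes the tag; a tampered page won't match.
print(hmac.compare_digest(t, tag(page)))                     # unmodified: True
print(hmac.compare_digest(t, tag(b"<html>altered</html>")))  # tampered: False
```

This is the property that lets someone reading a policy on whitehouse.gov be confident the text was not modified in transit.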

Why does this matter?

“This immediately creates a strong baseline of privacy and security for anyone in the world, American or otherwise, who visits the White House website — whether to read their blog, learn more about the President, download official policies, or anything else inside whitehouse.gov,” said Mill.

“At a basic level, what a person sees and does on whitehouse.gov should be between them and the White House. When someone reads official policies published on whitehouse.gov, they should be confident that policy is real and authentic. The White House’s use of HTTPS by default means those promises just got a lot stronger.”

Ashkan Soltani, the FTC’s chief technologist, explained on the Tech@FTC blog why that federal agency made the shift:

As a quick primer, HTTPS encryption secures your communications while in transit with websites so that only you and the website are able to view the content. The lock icon now appearing in your browser represents that the communication is encrypted and eavesdroppers are unable to look in. At this time, secure browsing is generally not a requirement for federal websites, but it is considered an industry best practice. Transit encryption is an important safeguard against eavesdroppers and has been the subject of previous investigations where we alleged companies failed to live up to their security promises when collecting personal information. It’s an important step when websites or apps collect personal information, and is a great best practice even if they don’t.

What broader trends does this tap into?

The White House moving to HTTPS is part of a larger effort to lead by example in promoting privacy and security best practices, Soltani said over email.

“I believe we’ll see a slow shift over the next few years of websites and services moving to HTTPS by default,” he said, “something a number of standards bodies including ISOC, IETF, and IAB have also called for.”

Along with FTC.gov, Mill highlighted the move to HTTPS by the Privacy and Civil Liberties Oversight Board (PCLOB), the independent agency charged with balancing the rights of American citizens against the security steps taken in the wake of the terrorist attacks of 9/11.

They’re far from alone: “Last month, 18F worked with 19 other .gov domains to go the distance to ensure browsers would always connect to them over HTTPS,” said Mill.

“It’s important to understand that what’s happening now in the federal government is what the broader internet has been working on for a while: making privacy the default.

“The standards bodies that guide the internet’s development are recommending that the internet be encrypted by default, instructing their working groups to prioritize encryption in new protocol development, and declaring a more secure future for the web. The fastest versions of HTTP today already require encryption in major browsers, and it’s becoming easier to imagine a future where web browsers proactively warn users about unencrypted websites.

“This is also why every .gov that 18F builds with its partner agencies uses HTTPS, full stop. We work hard to demonstrate that HTTPS can be fast, inexpensive, and easy. It’s a better future, and a practical one.”

“The kind of privacy and security the White House is offering its visitors is what we should come to expect from the entire web, not just websites someone thinks are ‘sensitive.’ All Web browsing is sensitive, and the White House’s leadership here reinforces that.”

It looks like Chris Soghoian, the principal technologist at the Speech, Privacy and Technology Project at the American Civil Liberties Union, is going to have a good day tomorrow.

While the Obama administration has taken its lumps on digital privacy after revelations of bulk surveillance of the Internet backbone by the National Security Agency, this is undeniably an important step towards securing the traffic of millions of people who visit whitehouse.gov every month.

Now that the White House is leading by example, hopefully other federal, state and local government entities will also adopt the standard.

“Everyone should want a simple feeling of privacy as they use the web, and confidence that they’re at the real and exact website they meant to visit,” said Mill. “While not everyone is highly attuned to watching for that padlock in their browser, the more websites that add it — especially high profile ones like the White House — the more that people can depend on that promise being met.”

PCLOB issues report on U.S. government surveillance under Section 702 of FISA [UPDATED]


The pre-release version of the Privacy and Civil Liberties Oversight Board’s Report on the Surveillance Program Operated Pursuant to Section 702 of the Foreign Intelligence Surveillance Act (FISA) is now available online. [PDF]

Short version: The board found little legally awry with surveillance conducted under Section 702 of FISA, which permits the federal government to compel United States companies to assist it in conducting surveillance targeting foreign people and entities, noting that it was a strong, effective tool for counterterrorism. The extensive report explores the legal rationales for such surveillance and lists ten recommendations. The scope of digital surveillance was detailed in The Washington Post on Monday, which reported that only four countries in the world (the United Kingdom, Canada, New Zealand and Australia) are not subject to the surveillance enabled by this legal authority to intercept communications.

Context from Gregory McNeal in Forbes:

“Section 702 of FISA has not received the same level of attention as the 215 metadata collection program, largely because the program is not directly targeted at U.S. persons. However, under Section 702, the government can collect the contents of communications (for example examining email and other communications), rather than mere metadata, which it collects under Section 215.”

“702 is also a more powerful program because under it the government can collect the content of U.S. persons communications, if those persons are communicating with a foreign target. This means that U.S. persons communications can be incidentally collected by the agency, such as when two non-U.S. persons discuss a U.S. person. Communications of or concerning U.S. persons that are acquired in these ways may be retained and used by the government, subject to applicable rules and requirements. The communications of U.S. persons may also be collected by mistake, as when a U.S. person is erroneously targeted or in the event of a technological malfunction, resulting in “inadvertent” collection. In such cases, however, the applicable rules generally require the communications to be destroyed. Another circumstance where 702 collection has raised concerns is the collection of so-called “about” communication. An “about” communication is one in which the selector of a targeted person (such as that person’s email address) is contained within the communication but the targeted person is not necessarily a participant in the communication.” The PCLOB addresses each of these issues in its report.

The PCLOB did find that “certain aspects of the program’s implementation raise privacy concerns,” specifically the “scope of the incidental collection of U.S. persons’ communications” when intelligence analysts targeted other individuals or entities.

As Josh Gerstein reported in Politico, the PCLOB “divided over key reforms to government collection of large volumes of email and other data from popular web businesses and from the backbone of the Internet. A preliminary report released Tuesday night shows that some of the proposals for changes to the Section 702 program caused a previously unseen split on the five-member Privacy and Civil Liberties Oversight Board: Two liberal members of the commission urged more aggressive safeguards, but a well-known privacy activist on the panel joined with two conservatives to withhold official endorsement of some of those changes.”

As Gerstein pointed out in a tweet, that means that reforms proposed in the House of Representatives go further than those recommended by the independent, bipartisan agency within the executive branch vested with the authority “to review and analyze actions the executive branch takes to protect the Nation from terrorism, ensuring the need for such actions is balanced with the need to protect privacy and civil liberties” and “ensure that liberty concerns are appropriately considered in the development and implementation of laws, regulations, and policies related to efforts to protect the Nation against terrorism.”

Perhaps even more problematically, the PCLOB wrote in the report that “the government is presently unable to assess the scope of the incidental collection of U.S. person information under the program.”

As Matt Sledge observed in the Huffington Post, the report’s authors “express frustration that the NSA and other government agencies have been unable to furnish estimates of the incidental collection of Americans’ communications, which ‘hampers attempts to gauge whether the program appropriately balances national security interests with the privacy of U.S. persons.’”

“But without signs of abuse, the board concludes privacy intrusions are justified in protecting against threats to the U.S. Nevertheless, the board suggests that the government take on the ‘backdoor searches’ that have alarmed [Sen. Ron] Wyden. In those searches, the government searches through the content of communications collected while targeting foreigners for search terms associated with U.S. citizens and residents. The House voted in June to end such searches. The searches ‘push the program close to the line of constitutional reasonableness,’ the privacy board report says, but it doesn’t recommend ending them.”

Privacy and civil liberties advocates issued swift expressions of dismay about the constitutionality of the surveillance and questioned the strength of the recommendations.

“The Board’s report is a tremendous disappointment,” said Nuala O’Connor, the president of the Center for Democracy and Technology, in a statement. “Even in the few instances where it recognizes the privacy implications of these programs, it provides little reassurance to all who care about digital civil liberties. The weak recommendations in the report offer no serious reform of government intrusions on the lives of individuals. It also offers scant support to the U.S. tech industry in its efforts to alleviate customer concerns about NSA surveillance, which continue to harm the industry in the global marketplace,” she added.

“If there is a silver lining, it is that the Board recognized that surveillance of people abroad implicates their human rights, as well as the constitutional rights of people in the U.S.,” said Greg Nojeim, director of the Center’s Project on Freedom, Security and Technology.  “However, the Board defers until a future date its consideration of human rights and leaves it to Congress to address the important constitutional issues.”

“If the Board’s last report on the bulk collection of phone records was a bombshell, this one is a dud,” said Kevin Bankston, policy director of New America’s Open Technology Institute (OTI).

“If the Board’s last report on the bulk collection of phone records was a bombshell, this one is a dud.  The surveillance authority the Board examined in this report, Section 702 of 2008’s FISA Amendments Act, is in many ways much more worrisome than the bulk collection program.  As the Board itself explains, that law has been used to authorize the NSA’s wiretapping of the entire Internet backbone, so that the NSA can scan untold numbers of our emails and other online messages for information about tens of thousands of targets that the NSA chooses without individualized court approval.  Yet the reforms the Board recommends today regarding this awesome surveillance power are much weaker than those in their last report, and essentially boil down to suggesting that the government should do more and better paperwork and develop stricter internal protocols as a check against abuse.

“As Chief Justice Roberts said just last week, “the Founders did not fight a revolution to gain the right to government agency protocols,” they fought to require search warrants that are based on probable cause and specifically identify who or what can be searched.  Yet as we know from documents released earlier this week, government agents are searching through the data they’ve acquired through this surveillance authority–an authority that was sold to Congress as being targeted at people outside the US–tens of thousands of times a year without having to get a warrant first.

“The fact that the Board has endorsed such warrantless rummaging through our communications, just weeks after the House of Representatives voted almost three to one to defund the NSA’s “backdoor” searches of Americans’ data, is a striking disappointment.  The Board is supposed to be an independent watchdog that aggressively seeks to protect our privacy against government overreach, rather than undermining privacy by proposing reforms that are even weaker than those that a broad bipartisan majority of the House has already endorsed.

“We are grateful to the Board for its last report and are grateful to them now for laying out, in the clearest and most comprehensive way we’ve seen so far, exactly how the NSA is using its surveillance authority.  But Congress shouldn’t wait for the NSA to take the Board’s weak set of recommendations and get its own house in order.  Congress should instead move forward with strong reforms that protect our privacy and that tell the NSA, as the Supreme Court told the government last week: if you want our data you need to come back with a warrant.”

The Electronic Frontier Foundation was even stronger, with Cindy Cohn calling the PCLOB report “legally flawed and factually incomplete.”

Hiding behind the “complexity” of the technology, it gives short shrift to the very serious privacy concerns that the surveillance has rightly raised for millions of Americans. The board also deferred considering whether the surveillance infringed the privacy of many millions more foreigners abroad.

The board skips over the essential privacy problem with the 702 “upstream” program: that the government has access to or is acquiring nearly all communications that travel over the Internet. The board focuses only on the government’s methods for searching and filtering out unwanted information. This ignores the fact that the government is collecting and searching through the content of millions of emails, social networking posts, and other Internet communications, steps that occur before the PCLOB analysis starts.  This content collection is the centerpiece of EFF’s Jewel v. NSA case, a lawsuit battling government spying filed back in 2008.

Trevor Timm, writing in the Guardian, said the PCLOB “chickened out of making any real reform proposals” and questioned why one member of the panel didn’t support more aggressive recommendations:

“More bizarrely, one of the holdouts on the panel for calling for real reform is supposed to be a civil liberties advocate. The Center for Democracy and Technology’s vice president, James Dempsey, had the chance to side with two other, more liberal members on the four-person panel to recommend the FBI get court approval before rummaging through the NSA’s vast databases, but shamefully he didn’t.

Now, as the Senate takes up a weakened House bill along with the House’s strengthened backdoor-proof amendment, it’s time to put focus back on sweeping reform. And while the PCLOB may not have said much in the way of recommendations, now Congress will have to. To help, a coalition of groups (including my current employer, Freedom of the Press Foundation) have graded each and every representative in Washington on the NSA issue. The debate certainly isn’t going away – it’s just a question of whether the public will put enough pressure on Congress to change.”

Editor’s note: This post has been substantially rewritten. More statements were added, and the headline has been amended.

PCAST report on big data and privacy emphasizes value of encryption, need for policy

April 4, 2014 meeting of PCAST at National Academy of Sciences

This week, the President’s Council of Advisors on Science and Technology (PCAST) met to discuss and vote on a new report on big data and privacy.

UPDATE: The White House published the findings of its review on big data today, including the PCAST review of technologies underpinning big data (PDF), discussed below.

As White House special advisor John Podesta noted in January, the PCAST has been conducting a study “to explore in-depth the technological dimensions of the intersection of big data and privacy.” Earlier this week, the Associated Press interviewed Podesta about the results of the review, reporting that the White House had learned of the potential for discrimination through the use of data aggregation and analysis. These are precisely the privacy concerns that stem from data collection that I wrote about earlier this spring. Here’s the PCAST’s list of “things happening today or very soon” that provide examples of technologies that can have benefits but pose privacy risks:

- Pioneered more than a decade ago, devices mounted on utility poles are able to sense the radio stations being listened to by passing drivers, with the results sold to advertisers.
- In 2011, automatic license-plate readers were in use by three quarters of local police departments surveyed. Within 5 years, 25% of departments expect to have them installed on all patrol cars, alerting police when a vehicle associated with an outstanding warrant is in view. Meanwhile, civilian uses of license-plate readers are emerging, leveraging cloud platforms and promising multiple ways of using the information collected.
- Experts at the Massachusetts Institute of Technology and the Cambridge Police Department have used a machine-learning algorithm to identify which burglaries likely were committed by the same offender, thus aiding police investigators.
- Differential pricing (offering different prices to different customers for essentially the same goods) has become familiar in domains such as airline tickets and college costs. Big data may increase the power and prevalence of this practice and may also decrease even further its transparency.
- reSpace offers machine-learning algorithms to the gaming industry that may detect early signs of gambling addiction or other aberrant behavior among online players.
- Retailers like CVS and AutoZone analyze their customers’ shopping patterns to improve the layout of their stores and stock the products their customers want in a particular location. By tracking cell phones, RetailNext offers bricks-and-mortar retailers the chance to recognize returning customers, just as cookies allow them to be recognized by on-line merchants. Similar WiFi tracking technology could detect how many people are in a closed room (and in some cases their identities).
- The retailer Target inferred that a teenage customer was pregnant and, by mailing her coupons intended to be useful, unintentionally disclosed this fact to her father.
- The author of an anonymous book, magazine article, or web posting is frequently “outed” by informal crowd sourcing, fueled by the natural curiosity of many unrelated individuals.
- Social media and public sources of records make it easy for anyone to infer the network of friends and associates of most people who are active on the web, and many who are not.
- Marist College in Poughkeepsie, New York, uses predictive modeling to identify college students who are at risk of dropping out, allowing it to target additional support to those in need.
- The Durkheim Project, funded by the U.S. Department of Defense, analyzes social-media behavior to detect early signs of suicidal thoughts among veterans.
- LendUp, a California-based startup, sought to use nontraditional data sources such as social media to provide credit to underserved individuals. Because of the challenges in ensuring accuracy and fairness, however, they have been unable to proceed.

The PCAST meeting was open to the public through a teleconference line. I called in and took rough notes on the discussion of the forthcoming report as it progressed. My notes on the comments of professors Susan Graham and Bill Press offer sufficient insight into the forthcoming report, however, that I thought publishing them today was warranted, given the ongoing national debate regarding data collection, analysis, privacy and surveillance. The following should not be considered verbatim or an official transcript; the emphases below are mine, as are the words in [brackets]. For an official record, look for the PCAST to make a recording and transcript available online in the future at its archive of past meetings.

Susan Graham: Our charge was to look at the confluence of big data and privacy, to summarize current technology and the way technology is moving in the foreseeable future, including its influence on the way we think about privacy.

The first thing that’s very very obvious is that personal data in electronic form is pervasive. Traditional data that was in health and financial [paper] records is now electronic and online. Users provide info about themselves in exchange for various services. They use Web browsers and share their interests. They provide information via social media, Facebook, LinkedIn, Twitter. There is [also] data collected that is invisible, from public cameras, microphones, and sensors.

What is unusual about this environment and big data is the ability to do analysis over huge corpuses of that data. We can learn things from the data that allow us to provide a lot of societal benefits. There is an enormous amount of patient data, data about disease, and data about genetics. By putting it together, we can learn about treatment. With enough data, we can look at rare diseases and learn what has been effective. We could not have done this otherwise.

We can analyze more online information about education and learning, not only MOOCs but lots of learning environments. [Analysis] can tell teachers how to present material effectively, to do comparisons about whether one presentation of information works better than another, or analyze how well assessments work with learning styles. Certain visual information is comprehensible; certain verbal information is hard to understand. Understanding different learning styles [can enable us to] develop customized teaching.

The reason this all works is the profound nature of analysis. This is the idea of data fusion, where you take multiple sources of information and combine them, which provides a much richer picture of some phenomenon. If you look at patterns of human movement on public transport, or pollution measures, or weather, maybe we can predict dynamics caused by human context.
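Graham’s data-fusion point can be sketched in a few lines: two datasets that are individually innocuous, once joined on a shared identifier, yield a much richer (and potentially more sensitive) picture of one person. All records and field names below are invented for illustration:

```python
# Hypothetical data fusion: neither dataset alone says much, but
# joining them on a shared quasi-identifier (here, a transit-card ID)
# links a person's movements to their purchases.
trips = [
    {"card": "A17", "stop": "Kendall", "time": "08:05"},
    {"card": "B42", "stop": "Central", "time": "09:10"},
]
purchases = [
    {"card": "A17", "store": "Pharmacy", "time": "08:20"},
]

# Index one source by the shared key, then merge matching records.
by_card = {p["card"]: p for p in purchases}
fused = [
    {**trip, "store": by_card[trip["card"]]["store"]}
    for trip in trips
    if trip["card"] in by_card
]
print(fused)  # one record now ties a location, a time, and a purchase together
```

The same join is what powers the beneficial analyses Graham describes, and what makes the resulting picture of an individual so much more revealing than either source alone.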

We can use statistics to do pattern recognition on large amounts of data. One of the things that we understand about this statistics-based approach is that it might not be 100% accurate if mapped down to the individual providing data in these patterns. We have to be very careful not to make mistakes about individuals because we make [an inference] about a population.

How do we think about privacy? We looked at it from the point of view of harms. There are a variety of ways in which results of big data can create harm, including inappropriate disclosures [of personal information], potential discrimination against groups, classes, or individuals, and embarrassment to individuals or groups.

We turned to what technology has to offer in helping to reduce harms. We looked at a number of technologies in use now, and at a bunch coming down the pike. Several of the techniques in use have become less effective because of the pervasiveness [of data] and the depth of analytics.

We traditionally have controlled [data] collection. We have seen some data collection from cameras and sensors that people don’t know about. If you don’t know, it’s hard to control.

Tech creates many concerns. We have looked at methods coming down the pike. Some are more robust and responsive. We have a number of draft recommendations that we are still working out.

Part of privacy is protecting the data using security methods. That needs to continue, and it needs to be used routinely. Security is not the same as privacy, though security helps to protect privacy. There are a number of approaches that are now used by hand that, with sufficient research, could be automated and used more reliably, so they scale.

There needs to be more research and education about privacy. Professionals need to understand how to treat privacy concerns anytime they deal with personal data. We need to create a large group of professionals who understand privacy, and privacy concerns, in tech.

Technology alone cannot reduce privacy risks. There has to be a policy as well. It was not our role to say what that policy should be. We need to lead by example by using good privacy protecting practices in what the government does and increasingly what the private sector does.

Bill Press: We tried throughout to think of scenarios and examples. There’s a whole chapter [in the report] devoted explicitly to that.

They range from things being done today with present technology, even though they are not all known to people, to our extrapolations to the outer limits of what might well happen in the next ten years. We tried to balance the examples by showing both benefits (they’re great) and challenges (they raise the possibility of new privacy issues).

In another aspect, in Chapter 3, we tried to survey technologies from both sides, with both tech going to bring benefits, those that will protect [people], and also those that will raise concerns.

In our technology survey, we were very much helped by the team at the National Science Foundation. They provided a very clear, detailed outline of where they thought that technology was going.

This was part of our outreach to a large number of experts and members of the public. That doesn’t mean that they agree with our conclusions.

Eric Lander: Can you take everybody through analysis of encryption? Are people using much more? What are the limits?

Graham: The idea behind classical encryption is that when data is stored, when it’s sitting around in a database, let’s say, encryption entangles the representation of the data so that it can’t be read without using a mathematical algorithm and a key to convert a seemingly meaningless set of bits into something reasonable.

The same technology, where you convert readable data into meaningless bits and back, is used when you send data from one place to another. So if someone is scanning traffic on the internet, they can’t read it. Over the years, we’ve developed pretty robust ways of doing encryption.

The weak link is that to use data, you have to read it, and so it becomes unencrypted. Security technologists worry about it being read during that short window.

Encryption technology is vulnerable: the key that unlocks the data is itself vulnerable to theft, or to being used by the wrong party to decrypt.

Both problems are active topics of research: how to use data without being able to read it, and increasingly robust encryption, so that if a key is disclosed you haven’t lost everything and you can protect some of the data, or future encryption of new data. This reduces risk a great deal and is important to use. Encryption alone doesn’t protect, though.
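Graham’s description of classical encryption, an algorithm plus a key turning readable data into meaningless bits and back, can be illustrated with a deliberately toy cipher. This is NOT a secure algorithm (real systems use vetted ciphers such as AES); it only shows the role of the key, and why data must be decrypted, and hence exposed, in order to be used:

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy cipher: XOR each byte with a repeating key. XOR is its own
    # inverse, so the same function both encrypts and decrypts.
    # Insecure by design; for illustration only.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"secret-key"
record = b"patient record: sensitive details"

stored = xor_cipher(record, key)   # what sits in the database
print(stored != record)            # True: unreadable without the key

# To *use* the data, it must be decrypted, which is the weak link
# Graham describes: during that window, it is readable.
print(xor_cipher(stored, key) == record)  # True: the key recovers it
```

The structure is the same in real systems: the stored bytes are meaningless without the key, so protecting the key (and the brief decrypted window) becomes the whole game.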

Unknown Speaker: People read of breaches derived from security. I see a different set of issues of privacy from big data vs those in security. Can you distinguish them?

Bill Press: Privacy and security are different issues. Security is necessary to have good privacy in the technological sense: if communications are insecure, they clearly can’t be private. But privacy goes beyond that, to what the parties that are authorized, in a security sense, to see the information may do with it. Privacy is much closer to values; security is much closer to protocols.

The interesting thing is that this is less about purely technical elements — everyone can agree on the right protocol, eventually. These are things that go beyond that and have to do with values.

On data journalism, accountability and society in the Second Machine Age

On Monday, I delivered a short talk on data journalism, networked transparency, algorithmic transparency and the public interest at the Data & Society Research Institute’s workshop on the social, cultural & ethical dimensions of “big data.” The forum was convened by the Data & Society Research Institute and hosted at New York University’s Information Law Institute, in connection with the White House Office of Science and Technology Policy’s ongoing review of big data and privacy ordered by President Barack Obama.

Video of the talk is below, along with the slides I used. You can view all of the videos from the workshop, along with the public plenary on Monday evening, on YouTube or at the workshop page.

Here’s the presentation, with embedded hyperlinks to the organizations, projects and examples discussed:

For more on the “Second Machine Age” referenced in the title, read the new book by Erik Brynjolfsson and Andrew McAfee.

Privacy and Civil Liberties Report Finds NSA bulk phone records program illegal and ineffective

Earlier this afternoon, I emailed info@pclob.gov in search of the report that the New York Times and Washington Post had obtained and reported upon this morning. Two hours later, I received a response: www.pclob.gov. There, visitors can now find, download and read a "Report on the Telephone Records Program Conducted under Section 215 of the USA PATRIOT Act and on the Operations of the Foreign Intelligence Surveillance Court," along with separate statements by Elisebeth Collins Cook and Rachel Brand. As Charlie Savage and Ellen Nakashima reported, Cook and Brand dissented from the report's recommendation to end the collection of phone records under the Section 215 program of the USA PATRIOT Act.

The privacy and civil liberties board’s report is strongly critical of the impact that mass surveillance has upon the privacy and civil liberties of American citizens, along with billions of other people around the world.

“The Section 215 bulk telephone records program lacks a viable legal foundation under Section 215, implicates constitutional concerns under the First and Fourth Amendments, raises serious threats to privacy and civil liberties as a policy matter, and has shown only limited value. As a result, the Board recommends that the government end the program.”

PCLOB Board Members meet with President Obama on June 21, 2013​. Photo by Pete Souza.


While President Obama met with the board and heard its recommendations prior to his speech last week, his administration is disputing the report's legal analysis.

“We disagree with the board’s analysis on the legality,” said Caitlin Hayden, spokeswoman for the White House National Security Council, in an e-mail to Bloomberg News. “The administration believes that the program is lawful.”

House Intelligence Committee Chairman Mike Rogers (R-MI) was also critical of the report’s findings. “I am disappointed that three members of the Board decided to step well beyond their policy and oversight role and conducted a legal review of a program that has been thoroughly reviewed,” he said in a statement.

The Electronic Frontier Foundation hailed the report as a vindication of its position on the constitutionality of the programs.

“The board’s other recommendations—increasing transparency and changing the FISA court in important ways—similarly reflect a nearly universal consensus that significant reform is needed,” wrote Mark Rumold, a staff attorney. “In the coming weeks, PCLOB is set to release a second report addressing the NSA’s collection under Section 702 of the FISA Amendments Act. We hope that the board will apply similar principles and recognize the threat of mass surveillance to the privacy rights of all people, not just American citizens.”

Should Congress criminalize online “revenge pornography”?

Should "revenge porn" be made a crime? In California, revenge porn could soon be illegal.

This weekend, in an op-ed for CNN.com, University of Maryland law professor Danielle Citron argues that other states in the union, and Congress, should also move to criminalize sharing nude pictures of a person without that person's consent.

“New Jersey is the only state to make it a felony to disclose a person’s nude or partially nude images without that person’s consent,” she writes. “The New Jersey statute is a helpful model for states like California that are considering proposals to criminalize revenge porn. Congress should amend the federal cyberstalking law, 18 U.S.C. § 2261A, to cover the use of any interactive computer service to produce or disclose a sexually graphic visual depiction of an individual without that individual’s consent.”

Citron argues that, given the profound effects upon someone's personal and professional life in the schools, workplaces and communities they inhabit "offline," criminalizing this online behavior is a necessary curb on the damage it can do. She makes a strong case that the U.S. Code should catch up to the pace of technological change.

We’re several years past the time the world crossed a Rubicon with respect to the ability to share embarrassing images of one another. The global adoption of cheap camera phones, smartphones, social networks, search engines and wireless Internet access has created a tidal wave of disruptions across industries, governments and nations. Taking pictures has been made trivially easy by those technologies, a capability that can capture both our best and worst moments.

When combined with the capacity to share those images with the rest of humanity in an instant, billions of people now wield great power in their back pockets. Whether they uphold the responsibility that comes with it is in question, given what history shows us of humans acting badly to those who have less power in society. The power to publicize and shame others is not equally distributed, given the expense of devices, data, and unequal access between the sexes.

In her op-ed, Citron anticipates the First Amendment concerns of organizations like the ACLU, arguing that it’s possible to craft sufficient limits into legislation — again, using New Jersey’s law as a model — that will enable the United States to preserve constitutional protections for free speech online.

“First Amendment protections are less rigorous for purely private matters because the threat of liability would not risk chilling the meaningful exchange of ideas,” writes Citron.

“Listeners and speakers have no legitimate interest in nude photos or sex tapes published without the subjects’ permission. That online users can claim a prurient interest in viewing sexual images does not transform them into a matter of legitimate public concern. Nonconsensual pornography lacks First Amendment value as a historical matter, and could be understood as categorically unprotected as obscenity. Although the Court’s obscenity doctrine has developed along different lines with distinct justifications, nonconsensual pornography can be seen as part of obscenity’s long tradition of proscription.”

The American Civil Liberties Union opposes the California legislation and the Electronic Frontier Foundation has expressed concerns with how broadly it has been drafted.

Legal precision in how legislatures make revenge porn a criminal offense really will matter here, given both existing statutes and the number of entities that are involved in the act, from the person who took the image to the site that hosts it to the people who spread it.

Making anyone but the original person who broke the trust of another by uploading the picture culpable would run up against Section 230 of the United States Communications Decency Act, which provides intermediary liability protections, shielding online platforms from being held liable for user-generated content shared on them.

As more people gain the ability to take, store and share digital images, however, improving systems that govern non-consensual surveillance and distribution looks precisely like the kind of thorny problem that our elected representatives should grapple with in the 21st century.

Societies around the world will need to find answers that reconcile online civil rights with longstanding constitutional protections. The way the United States handles the issue could be a model for other states to consider — or not, if a dysfunctional Congress proves unable to enact effective legislation, a scenario that unfortunately seems all too likely over the next year.