Google reaches agreement with FTC on Buzz privacy concerns

Google has agreed to an independent review of its privacy procedures once every two years and to ask its users to give “affirmative consent” before it changes how it shares their personal information. The agreement raises the bar for how companies handle user privacy in the digital age.

Alma Whitten, Google’s director of privacy, product and engineering, announced that Google had reached the agreement with the United States Federal Trade Commission in an update on Buzz posted to Google’s official blog this morning.

“The terms of this agreement are strong medicine for Google and will have a far-reaching effect on how industry develops and implements new technologies and services that make personal information public,” said Leslie Harris, president of the Center for Democracy and Technology. “We expect industry to quickly adopt the new requirement for opt-in consent before launching any new service that will publicly disclose personal information,” Harris said.

In a statement posted to FTC.gov, the FTC charged that Google used deceptive privacy practices in the rollout of its Buzz social network:

The agency alleges the practices violate the FTC Act. The proposed settlement bars the company from future privacy misrepresentations, requires it to implement a comprehensive privacy program, and calls for regular, independent privacy audits for the next 20 years. This is the first time an FTC settlement order has required a company to implement a comprehensive privacy program to protect the privacy of consumers’ information. In addition, this is the first time the FTC has alleged violations of the substantive privacy requirements of the U.S.-EU Safe Harbor Framework, which provides a method for U.S. companies to transfer personal data lawfully from the European Union to the United States.

“When companies make privacy pledges, they need to honor them,” said Jon Leibowitz, Chairman of the FTC. “This is a tough settlement that ensures that Google will honor its commitments to consumers and build strong privacy protections into all of its operations.”

The FTC turned to Twitter for a live Q&A with the Web. Here’s a recap of the conversation:

http://storify.com/digiphile/ftc-hosted-privacy-chat-around-google-buzz-settlem

In her post, Whitten highlighted the efforts Google has made at the intersection of technology, government and privacy:

For example, Google Dashboard lets you view the data that’s stored in your Google Account and manage your privacy settings for different services. With our Ads Preferences Manager, you can see and edit the data Google uses to tailor ads on our partner websites—or opt out of them entirely. And the Data Liberation Front makes it easy to move your data in and out of Google products. We also recently improved our internal privacy and security procedures.

2011 NASA Open Source Summit convenes innovators and technologists

Today NASA is hosting its first Open Source Summit at Ames Research Center in Mountain View, California. Engineers and policy makers from across NASA are meeting with members of the open source community to discuss the challenges of open source policy. You can watch the livestream here, and here’s the agenda. The liveblog is below.

Virtual attendees connected to the morning conversations by phone through MaestroConference and collaboratively took notes online at the Ideation Forum.

In the afternoon, the NASA Open Source Summit turned to breakout groups, with discussions driven by the online conversation. Photo by NASA’s Chris Gerty (@Gerty).

Presentations are also going up over at SlideShare, with some great examples already posted.

John Wonderlich on the aspirations and limitations of open data initiatives

Technology is giving citizens and open government watchdogs new abilities to apply sunlight to government in powerful ways. This past weekend, John Wonderlich, policy director for the Sunlight Foundation, spoke about open data and transparency at the slashroots ./roots/DEV conference in Jamaica. His keynote is embedded below.

./roots/Dev Conference 2011 – Keynote Day 2 Pt 1/2 from slashroots on Vimeo.

Google Public Data Explorer adds Census Bureau data, state government statistics

Last month, the Google Public Data Explorer went public. Today, Google added U.S. Census Bureau and state government finance statistics to the database, allowing anyone to explore and visualize government statistics for new insight into our present.

The numbers may be beautifully displayed, but they tell a grim tale when it comes to state budgets. The budget crisis in states across the country will be a primary driver for the adoption of new approaches to governance and service delivery in 2011. If Gov 2.0 goes local, citizensourcing smarter government couldn’t come at a better time.
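
For developers who would rather work with the raw numbers than the charts, the underlying Census statistics can also be pulled programmatically. Here is a minimal sketch in Python, assuming the Census Bureau’s public API at api.census.gov; the dataset vintage, variable names and API key are illustrative, not a prescription:

    import json
    import urllib.request

    # Illustrative: fetch the total population of every state from the
    # Census Bureau's population estimates endpoint.
    API_KEY = "YOUR_CENSUS_API_KEY"  # assumption: a free key from the Census Bureau
    url = (
        "https://api.census.gov/data/2019/pep/population"
        "?get=NAME,POP&for=state:*&key=" + API_KEY
    )

    with urllib.request.urlopen(url) as response:
        rows = json.load(response)

    # The API answers with a list of lists; the first row is the header
    # (NAME, POP, state FIPS code) and the rest are data rows.
    for name, population, fips in rows[1:]:
        print(name + ": " + population)

Getting a table like this into the Public Data Explorer itself goes through Google’s Dataset Publishing Language (DSPL), an XML metadata format bundled with CSV files.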

Disaster 2.0: UN OCHA releases report on future of information sharing in crisis

The emergence of CrisisCamps and the subsequent maturation of CrisisCommons into a platform for civic engagement were important developments in 2010. Hearing digital cries for help has never been more important. A year after the devastating earthquake in Haiti, a new report by a team at the Harvard Humanitarian Initiative analyzes how the humanitarian, emerging volunteer and technical communities collaborated in the aftermath of the quake. The report recommends ways to improve coordination between these groups in future emergencies. There are five specific recommendations to address the considerable challenges inherent in coordinating crisis response:

  1. A neutral forum to surface areas of conflict or agreement between the volunteer/technical community and established humanitarian institutions
  2. A space for innovation where new tools and approaches can be experimented with before a crisis hits
  3. A deployable field team with the mandate to use the best practices and tools established by the community
  4. A research and development group to evaluate the effectiveness of tools and practices
  5. An operational interface that identifies procedures for collaboration before and during crises, including data standards for communication; one such standard is sketched below
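
On the data standards point, one existing open standard worth knowing is the OASIS Common Alerting Protocol (CAP), an XML format for emergency messages. Here is a minimal sketch in Python of composing a CAP 1.2 alert; every value in it is hypothetical:

    import xml.etree.ElementTree as ET

    # Hypothetical example of a Common Alerting Protocol (CAP) 1.2 message,
    # an existing open standard for structured emergency communication.
    CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
    ET.register_namespace("", CAP_NS)

    alert = ET.Element("{%s}alert" % CAP_NS)
    for tag, text in [
        ("identifier", "example-2011-0001"),    # hypothetical ID
        ("sender", "alerts@example.org"),       # hypothetical sender
        ("sent", "2011-03-28T12:00:00+00:00"),
        ("status", "Exercise"),                 # not a real alert
        ("msgType", "Alert"),
        ("scope", "Public"),
    ]:
        ET.SubElement(alert, "{%s}%s" % (CAP_NS, tag)).text = text

    info = ET.SubElement(alert, "{%s}info" % CAP_NS)
    for tag, text in [
        ("category", "Geo"),
        ("event", "Earthquake"),
        ("urgency", "Immediate"),
        ("severity", "Severe"),
        ("certainty", "Observed"),
        ("headline", "Illustrative earthquake alert"),
    ]:
        ET.SubElement(info, "{%s}%s" % (CAP_NS, tag)).text = text

    print(ET.tostring(alert, encoding="unicode"))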

“Disaster Relief 2.0: The Future of Information Sharing in Humanitarian Emergencies” was commissioned by the United Nations Foundation and Vodafone Foundation Technology Partnership in collaboration with the UN Office for the Coordination of Humanitarian Affairs (OCHA). You can find more discussion of the report in a series of posts on disaster relief 2.0 at UNDispatch.com, like this observation from Jen Ziemke:

…a substantial majority of members on the Crisis Mappers Network have held positions in formal disaster response, some for several decades. Volunteers in groups like the Standby Task Force include seasoned practitioners with the UNDP or UN Global Pulse. But what is really needed is a fundamental rethinking of who constitutes the “we” of disaster response, as well as dispensing with current conceptions of: “volunteers”, “crowds,” and “experts.” While distinctions can be endlessly debated, as humans, we are far more the same than we are different.

Whether it’s leveraging social media in a time of need or geospatial mapping, technology empowers us to help one another more than ever. This report offers needed insight into how to do it better.

A movement to spur innovation and participation in government

This past weekend, Syracuse MPA grad student Pat Fiorenza spoke about Gov 2.0 at the We Live NY Conference in upstate New York. In a wrap-up posted after the conference, Fiorenza touched on what people think about when they hear “Gov 2.0.”

Fiorenza’s recap of his Gov 2.0 presentation also describes both why the idea is important to him and why it’s important to people who aren’t developers.

“Gov 2.0 extends beyond a great programmer – I’ve noticed that when I talk to some people about Gov 2.0 they immediately associate me as a geeky-computer programming-MPA student (only 2 of the 3!). I’ve developed a passion for Gov 2.0 because it holds so much potential for government. It’s about getting access to data and information immediately, improving constituent services, crowd sourcing information, and empowering citizens. Gov 2.0 requires someone to identify an existing problem and conceptualize a solution – then someone to run with the idea and develop the program, with a lot of collaboration in between.”

Fiorenza also pointed the way to Remy DeCausemaker (@remy_d), a “resident hacktivist and storyteller” at the Rochester Institute of Technology’s Lab for Technological Literacy, who also presented on Gov 2.0 at the conference.

DeCausemaker works on FOSS at RIT and CIVX, an open source public information system for raw data. His presentation (PDF) on open government and open data will be of interest to many people in the Gov 2.0 community.

Building a revolution in relevance in an age of information abundance

“We’ve had a decade’s worth of news in less than two months,” wrote Mike Allen, chief White House correspondent for Politico. In the Saturday edition of Politico’s Playbook, Allen looked back at the Arab Spring and Japan’s ongoing challenges:

It was Feb. 11 – seven weeks ago — that Mubarak fled the Arab spring, a rolling reordering of Middle East power that could wind up affecting global security as profoundly as 9/11.

It was March 11 – 15 days ago – that we woke to the news of the Japanese earthquake and tsunami, which will have ripple effects on the fragile global economy for months to come.

And, oh, we’re in three hot conflicts at once, for the first time since World War II.

Related, in the NEW YORK TIMES: “Inundated With News, Many Find It Difficult to Keep Up on Libya”

“People interviewed across four states said that at a time when the world seems to stagger from one breathtaking news event to another — rolling turmoil across the Middle East, economic troubles at home, disaster upon disaster in Japan — the airstrikes on military targets in Libya can feel like one crisis too many.”

Through it all, I’ve been following Andy Carvin (@acarvin), whose Twitter feed has been a groundbreaking curation of the virtual community and conversation about the Middle East, including images, video, breaking news and unverified reports.

To wax metaphorical, his account has become a stream of crisis data drawn from the data exhaust created by the fog of war across the Middle East, dutifully curated by a veteran digital journalist for up to 17 hours a day.

Carvin has linked to reports, video and images from the front lines that are among the most graphic depictions of war I have ever seen. While such imagery is horrific to view, it can help bear witness to what is happening on the ground in countries where state media would never broadcast its like.

Most Americans, however, are not tracking what’s happening on the ground in the region so closely. NEW YORK TIMES:

“A survey by the Pew Research Center — conducted partly before and partly after the bombing raids on Libya began on March 19 — found that only 5 percent of respondents were following the events ‘very closely.’ Fifty-seven percent said they were closely following the news about Japan.”

Understanding the immensity of the challenges that face Japan, Egypt and Libya is pushing everyone’s capacity to stay informed of the day-to-day updates, much less the larger implications of these events for citizens, industry or government. In the context of the raw information available to the news consumer in 2011, that reality is both exciting and alarming. The tools for newsgathering and dissemination are more powerful and democratized than ever before. The open question now is how technologists and journalists will work together to improve them and provide the context that everyone needs.

Finally, an editor’s note: my deepest thanks to all of the brave and committed journalists working long hours, traveling far from their families and risking their lives under hostile regimes for the reporting that helps the rest of us make sense of it all.

Beth Noveck on connecting the academy to open government R&D

Earlier this week, the White House convened an open government research and development summit at the National Archives. Columbia statistics professor Victoria Stodden captures some key themes from it at her blog, including smart disclosure of government data and open government at the VA. Stodden also documented the framing questions that federal CTO Aneesh Chopra asked the academic community to help answer:

1. Big data: how do we strengthen our capacity to understand massive data?
2. New products: what constitutes high-value data?
3. Open platforms: what are the policy implications of enabling third-party apps?
4. International collaboration: what models translate to strengthen democracy internationally?
5. Digital norms: what works and what doesn’t in public engagement?

In the video below, former White House deputy CTO for open government Beth Noveck reflected on the outcomes and results of the open government R&D summit at the end of its second day. If you’re interested in a report from one of the organizers, you’d be hard pressed to do any better.

The end of the beginning for open government?

The open government R&D summit has since come under criticism from one of its attendees, Expert Labs’ director of engagement Clay Johnson, for being formulaic, “self congratulatory” and not tackling the hard problems that face the country. He challenged the community to do better:

These events need to solicit public feedback from communities and organizations and we need to start telling the stories of Citizen X asked for Y to happen, we thought about it, produced it and the outcome was Z. This isn’t to say that these events aren’t helpful. It’s good to get the open government crowd together in the same room every once and awhile. But knowing the talents and brilliant minds in the room, and the energy that’s been put behind the Open Government Directive, I know we’re not tackling the problems that we could.

Noveck responded to his critique in a comment where she observed that “Hackathons don’t substitute for inviting researchers — who have never been addressed — to start studying what’s working and what’s not in order to free up people like you (and I hope me, too) to innovate and try great new experiments and to inform our work. But it’s not enough to have just the academics without the practitioners and vice versa.”

Justin Grimes, a Ph.D. student who has been engaged in research in this space, was reflective after reading Johnson’s critique. “In the past few years, I’ve seen far more open gov events geared towards citizens, [developers], & industry than toward academics,” he tweeted. “Open gov is a new topic in academia; few people even know it’s out there; lot of potential there but we need more outreach. [The] purpose was to get more academics involved in conversation. Basically, government saying ‘Hey, look at our problems. Do research. Help us.'”

Johnson spoke with me earlier this year about what else he sees as the key trends of Gov 2.0 and open government, including transparency as infrastructure, smarter citizenship and better platforms. Given the focus he has put on doing, versus researching or, say, “blogging about it,” it will be interesting to see what comes out of Johnson and Expert Labs next.

Todd Park on unleashing the power of open data to improve health

What if open health data were harnessed to spur better healthcare decisions and to catalyze the extension or creation of new businesses? That future is already here. Todd Park, chief technology officer of the Department of Health and Human Services, has been working to unlock innovation through open health data for over a year now. On many levels, the effort is the best story in federal open data. Park tells it himself in the video below, recorded yesterday at the Mütter Museum in Philadelphia.

Over at e-patients.net, Pew Internet researcher Susannah Fox asked how community organizations can tap into the health data and development trend that Park has been working hard to ignite. She shared several resources (including a few from this correspondent) and highlighted the teams who competed in a health developer challenge tour that culminated at the recent Health 2.0 conference.
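
For groups wondering where to begin, one approach is to search the federal data catalog programmatically. Here is a minimal sketch in Python, assuming the standard CKAN v3 action API that today sits behind catalog.Data.gov; the query term is just an example:

    import json
    import urllib.parse
    import urllib.request

    # Illustrative: search the federal data catalog for health datasets.
    # Assumption: catalog.data.gov exposes the standard CKAN v3 action API.
    params = urllib.parse.urlencode({"q": "community health", "rows": 5})
    url = "https://catalog.data.gov/api/3/action/package_search?" + params

    with urllib.request.urlopen(url) as response:
        result = json.load(response)

    # Each matching dataset carries a list of downloadable resources.
    for dataset in result["result"]["results"]:
        print(dataset["title"])
        for resource in dataset.get("resources", []):
            print("   ", resource.get("format"), resource.get("url"))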

Check out this article about HealthData.gov, including footage of Park talking about the “health data ecosystem” at the code-a-thon. (The video also features local health hacker Alan Viars at the right.)

Here are three blog posts about last year’s event, including mine:

Making Health Data Sing (Even If It’s A Familiar Song)

Community Health Data Initiative: vast amounts of health data, freed for innovators to mash up!

Making community health information as useful as weather data: Open health data from Health and Human Services is driving more than 20 new apps.

The next big event in this space is on June 9 at the NIH. If you’re interested in what’s next for open health data, track it closely.

Micah Sifry on “Wikileaks and the Age of Transparency”

The emergence of Wikileaks as a global player in technology-fueled transparency was one of the biggest stories of 2010. Micah Sifry, co-founder of the Personal Democracy Forum and editor of techPresident, used Wikileaks as a peg to explore the new information ecosystem in his excellent new book, “Wikileaks and the Age of Transparency.” Last night in Washington, Sifry spoke about why he wrote the book and offered some cogent reflections about how transparency has gone global. Video of his talk is embedded below.

Few people have as rich an understanding of the intersection of technology and politics as Sifry. I’m immensely looking forward to reading my new copy of the book and, of course, to following his chronicling of the age of transparency in realtime at @mlsif.