John Wonderlich on the aspirations and limitations of open data initiatives

Technology gives citizens and open government watchdogs new abilities, which means sunlight can be applied to government in powerful ways. This past weekend, John Wonderlich, policy director for the Sunlight Foundation, spoke about open data and transparency at the slashroots ./roots/DEV conference in Jamaica. His keynote is embedded below.

./roots/Dev Conference 2011 – Keynote Day 2 Pt 1/2 from slashroots on Vimeo.

Google Public Data Explorer adds Census Bureau data, state government statistics

Last month, the Google Public Data Explorer went public. Today, Google added U.S. Census Bureau and state government finance statistics to the database, allowing everyone to gain new insight into our present.

The numbers may be beautifully displayed but they tell a grim tale when it comes to state budgets. The crisis in state budgets across the country will be the primary driver for the adoption of new approaches to governance and service delivery in 2011. If Gov 2.0 goes local, citizensourcing smarter government couldn’t come at a more timely moment.

Disaster 2.0: UN OCHA releases report on future of information sharing in crisis

The emergence of CrisisCamps and the subsequent maturation of CrisisCommons into a platform for civic engagement were important developments in 2010. Hearing digital cries for help has never been more important. A year after the devastating earthquake in Haiti, a new report by a team at the Harvard Humanitarian Initiative analyzes how the humanitarian, emerging volunteer and technical communities collaborated in the aftermath of the quake. The report recommends ways to improve coordination between these groups in future emergencies. It makes five specific recommendations to address the considerable challenges inherent in coordinating crisis response:

  1. A neutral forum to surface areas of conflict or agreement between the volunteer/technical community and established humanitarian institutions
  2. A space for innovation where new tools and approaches can be experimented with before a crisis hits
  3. A deployable field team with the mandate to use the best practices and tools established by the community
  4. A research and development group to evaluate the effectiveness of tools and practices
  5. An operational interface that identifies procedures for collaboration before and during crises, including data standards for communication

“Disaster Relief 2.0: The Future of Information Sharing in Humanitarian Emergencies” was commissioned by the United Nations Foundation and Vodafone Foundation Technology Partnership in collaboration with the UN Office for the Coordination of Humanitarian Affairs (OCHA). You can find more discussion of the report in a series of posts on disaster relief 2.0 at UNDispatch.com, like this observation from Jen Ziemke:

…a substantial majority of members on the Crisis Mappers Network have held positions in formal disaster response, some for several decades. Volunteers in groups like the Standby Task Force include seasoned practitioners with the UNDP or UN Global Pulse. But what is really needed is a fundamental rethinking of who constitutes the “we” of disaster response, as well as dispensing with current conceptions of: “volunteers”, “crowds,” and “experts.” While distinctions can be endlessly debated, as humans, we are far more the same than we are different.

Whether it’s leveraging social media in a time of need or geospatial mapping, technology empowers us to help one another more than ever. This report offers needed insight about how to do it better.

A movement to spur innovation and participation in government

This past weekend, Syracuse MPA grad student Pat Fiorenza spoke about Gov 2.0 at the We Live NY Conference in upstate New York. In a wrap-up posted after the conference, Fiorenza touched on what people think of when they hear “Gov 2.0.”

Fiorenza’s recap of his Gov 2.0 presentation also describes both why the idea is important to him and why it’s important to people who aren’t developers.

“Gov 2.0 extends beyond a great programmer – I’ve noticed that when I talk to some people about Gov 2.0 they immediately associate me as a geeky-computer programming-MPA student (only 2 of the 3!). I’ve developed a passion for Gov 2.0 because it holds so much potential for government. It’s about getting access to data and information immediately, improving constituent services, crowd sourcing information, and empowering citizens. Gov 2.0 requires someone to identify an existing problem and conceptualize a solution – then someone to run with the idea and develop the program, with a lot of collaboration in between.”

Fiorenza also pointed the way to Remy DeCausemaker (@remy_d), a “resident hacktivist and storyteller” at the Rochester Institute of Technology’s Lab for Technological Literacy, who also presented on Gov 2.0 at the conference.

DeCausemaker works on FOSS at RIT and CIVX, an open source public information system for raw data. His presentation (PDF) on open government and open data will be of interest to many people in the Gov 2.0 community.

Building a revolution in relevance in an age of information abundance

“We’ve had a decade’s worth of news in less than two months,” said Mike Allen, chief White House correspondent for Politico. In the Saturday edition of Politico’s Playbook, Allen looked back at the Arab Spring and Japan’s ongoing challenges:

“It was Feb. 11 – seven weeks ago — that Mubarak fled the Arab spring, a rolling reordering of Middle East power that could wind up affecting global security as profoundly as 9/11.

It was March 11 – 15 days ago – that we woke to the news of the Japanese earthquake and tsunami, which will have ripple effects on the fragile global economy for months to come.

And, oh, we’re in three hot conflicts at once, for the first time since World War II.”

Related, in the NEW YORK TIMES: “Inundated With News, Many Find It Difficult to Keep Up on Libya”

“People interviewed across four states said that at a time when the world seems to stagger from one breathtaking news event to another — rolling turmoil across the Middle East, economic troubles at home, disaster upon disaster in Japan — the airstrikes on military targets in Libya can feel like one crisis too many.”

Through it all, I’ve been following Andy Carvin (@acarvin), whose Twitter feed has been a groundbreaking curation of the virtual community and conversation about the Middle East, including images, video, breaking news and unverified reports.

To wax metaphorical, his account has become a stream of crisis data drawn from the data exhaust created by the fog of war across the Middle East, dutifully curated by a veteran digital journalist for up to 17 hours a day.

Carvin has linked to reports, video and images from the front lines that are amongst the most graphic images of war I have ever seen. While such imagery is categorically horrific to view, it can help bear witness to what is happening on the ground in countries where state media would never broadcast its like.

The vast majority of the United States, however, is not tracking what’s happening on the ground in the region so closely. NEW YORK TIMES:  

“A survey by the Pew Research Center — conducted partly before and partly after the bombing raids on Libya began on March 19 — found that only 5 percent of respondents were following the events ‘very closely.’ Fifty-seven percent said they were closely following the news about Japan.”

Understanding the immensity of the challenges that face Japan, Egypt and Libya is straining everyone’s capacity to stay informed of the day-to-day updates, much less to grapple with the larger implications of these events for citizens, industry or government. In the context of the raw information available to the news consumer in 2011, that reality is both exciting and alarming. The tools for newsgathering and dissemination are more powerful and democratized than ever before. The open question now is how technologists and journalists will work together to improve them to provide the context that everyone needs.

Finally, an editor’s note: My deepest thanks to all of the brave and committed journalists working long hours, traveling far from their families and risking their lives under hostile regimes for the reporting that helps us make sense of it all.

Beth Noveck on connecting the academy to open government R&D

Earlier this week, the White House convened an open government research and development summit at the National Archives. Columbia statistics professor Victoria Stodden captures some key themes from it at her blog, including smart disclosure of government data and open government at the VA. Stodden also documented the framing questions that federal CTO Aneesh Chopra asked the academic community to help answer:

1. big data: how to strengthen capacity to understand massive data?
2. new products: what constitutes high value data?
3. open platforms: what are the policy implications of enabling 3rd party apps?
4. international collaboration: what models translate to strengthen democracy internationally?
5. digital norms: what works and what doesn’t work in public engagement?

In the video below, Beth Noveck, former White House deputy CTO for open government, reflects on the outcomes and results of the open government R&D summit at the end of its second day. If you’re interested in a report from one of the organizers, you’d be hard pressed to do any better.

The end of the beginning for open government?

The open government R&D summit has since come under criticism from one of its attendees, Expert Labs’ director of engagement Clay Johnson, for being formulaic, “self congratulatory” and not tackling the hard problems that face the country. He challenged the community to do better:

These events need to solicit public feedback from communities and organizations and we need to start telling the stories of Citizen X asked for Y to happen, we thought about it, produced it and the outcome was Z. This isn’t to say that these events aren’t helpful. It’s good to get the open government crowd together in the same room every once and awhile. But knowing the talents and brilliant minds in the room, and the energy that’s been put behind the Open Government Directive, I know we’re not tackling the problems that we could.

Noveck responded to his critique in a comment where she observed that “Hackathons don’t substitute for inviting researchers — who have never been addressed — to start studying what’s working and what’s not in order to free up people like you (and I hope me, too) to innovate and try great new experiments and to inform our work. But it’s not enough to have just the academics without the practitioners and vice versa.”

Justin Grimes, a Ph.D student who has been engaged in research in this space, was reflective after reading Johnson’s critique. “In the past few years, I’ve seen far more open gov events geared towards citizens, [developers], & industry than toward academics,” he tweeted. “Open gov is a new topic in academia; few people even know it’s out there; lot of potential there but we need more outreach. [The] purpose was to get more academics involved in conversation. Basically, government saying ‘Hey, look at our problems. Do research. Help us.'”

Johnson spoke with me earlier this year about what else he sees as the key trends of Gov 2.0 and open government, including transparency as infrastructure, smarter citizenship and better platforms. Given the focus he has put on doing, vs researching or, say, “blogging about it,” it will be interesting to see what comes out of Johnson and Expert Labs next.

Todd Park on unleashing the power of open data to improve health

What if open health data were harnessed to spur better healthcare decisions and catalyze the extension or creation of new businesses? That potential future exists now, in the present. Todd Park, chief technology officer of the Department of Health and Human Services, has been working to unlock innovation through open health data for over a year now. On many levels, the effort is the best story in federal open data. Park tells it himself in the video below, recorded yesterday at the Mutter Museum in Philadelphia.

Over at e-patients.net, Pew Internet researcher Susannah Fox asked how community organizations can tap into the health data and development trend that Park has been working hard to ignite. She shared several resources (including a few from this correspondent) and highlighted the teams who competed in a health developer challenge tour that culminated at the recent Health 2.0 conference.

Check out this article about HealthData.gov, including footage of Park talking about the “health data eco-system” at the code-a-thon. (The video also features local health hacker Alan Viars at the right.)

Here are three blog posts about last year’s event, including mine:

Making Health Data Sing (Even If It’s A Familiar Song)

Community Health Data Initiative: vast amounts of health data, freed for innovators to mash up!

Making community health information as useful as weather data: Open health data from Health and Human Services is driving more than 20 new apps.

The next big event in this space is on June 9 at the NIH. If you’re interested in what’s next for open health data, track this event closely.

The US CIO goes to the white board to describe good government

Earlier this week, United States CIO Vivek Kundra turned to the White House whiteboard to talk about sunshine, savings and service. If you’re unfamiliar with Kundra, he’s the man who proposed and now is entrusted with implementing sweeping federal IT reform. One of the tools he’s been applying to the task is the so-called IT dashboard, which helps the White House Office of Management and Budget, where he serves, track IT spending. He claims to have reduced federal IT spending by some $3 billion over the past two years through increased tracking and scrutiny. The federal CIO explains more about the results of that work, below.

http://www.whitehouse.gov/sites/all/modules/swftools/shared/flash_media_player/player5x2.swf

UPDATE: As open data consultant Dan Morgan pointed out, however, the Government Accountability Office reported that while OMB has made improvements to its dashboard, “further work is needed by agencies and OMB to ensure data accuracy.”

…inaccuracies can be attributed to weaknesses in how agencies report data to the Dashboard, such as providing erroneous data submissions, as well as limitations in how OMB calculates the ratings. Until the selected agencies and OMB resolve these issues, ratings will continue to often be inaccurate and may not reflect current program performance. GAO is recommending that selected agencies take steps to improve the accuracy and reliability of Dashboard information and OMB improve how it rates investments relative to current performance and schedule variance. Agencies generally concurred with the recommendations; OMB did not concur with the first recommendation but concurred with the second. GAO maintains that until OMB implements both, performance may continue to be inaccurately represented on the Dashboard.

One question left unanswered: Is /good the new /open? Decide for yourself at the new “Good Government” section at WhiteHouse.gov.

Improving open government oversight through FOIA reform

The Freedom of Information Act is one of the primary levers by which journalists, government watchdogs and other organizations can hold the United States government accountable. Today in Washington, the U.S. House of Representatives Committee on Oversight and Government Reform held a hearing on “The Freedom of Information Act: Crowd-Sourcing Government Oversight.”

The testimony of the witnesses made it clear that major issues persist with the cost of, mechanisms for, and compliance with FOIA requests made to government agencies.

Public information should be online in real time, said Representative Darrell Issa (R-CA) (@DarrellIssa), chairman of the committee.

His prepared statement provided context for the focus of the hearing:

The Freedom of Information Act (“FOIA”) is one of the most important tools for government transparency and accountability. It permits the private-sector, the media, watchdog groups, and the general public to scrutinize the activities of federal agencies – from the telephone logs and email correspondence of federal employees to internal memoranda, transcripts, and meeting minutes.

Minus a few specific exemptions designed to protect narrowly-defined privacy concerns, national security and law enforcement matters, claims of executive privilege and trade secrets, information about the government’s work is required by law to be publicly accessible. Indeed, every federal agency, commission, department and corporation – as well as the White House itself – falls under FOIA’s expansive authority.

Representative Elijah Cummings (D-MD) defended the record of the Obama administration on open government and quoted President James Madison in his opening statement:

“A popular government without popular information or the means of acquiring it, is but a prologue to a farce, or a tragedy, or perhaps both. Knowledge will forever govern ignorance, and a people who mean to be their own governors, must arm themselves with the power knowledge gives.”

Rep. Cummings introduced a new bill today, entitled the “Transparency and Openness in Government Act.” (As of the time this post went live, it was not in Thomas.gov yet.) According to Rep. Cummings, the bill would make federal commissions more transparent, increase access to records, ensure government email records are preserved and improve GAO access to government records. The legislation includes five bills that passed the House during the 111th Congress:

  • The Federal Advisory Committee Act requires agencies to disclose more information about advisory committees and closes existing loopholes;
  • The Presidential Records Act increases public access to White House records by establishing statutory procedures prior to FOIA releases;
  • The Presidential Libraries Donation Reform Act mandates greater public disclosure of library donor information;
  • The Electronic Message Preservation Act modernizes the Federal Records Act and the Presidential Records Act to ensure that White House and agency e-mail records are preserved;
  • The GAO Improvement Act strengthens the authority of the Government Accountability Office to access agency records.

Transparency shouldn’t be a partisan issue, emphasized Cummings.

Miriam Nisbet of OGIS

The committee heard from a distinguished panel of witnesses, including Miriam Nisbet, the director of the Office of Government Information Services (OGIS) at the National Archives and Records Administration. OGIS opened in September 2009 and acts as an ombudsman for FOIA requests. “OGIS encourages a more collaborative, accessible FOIA for everyone,” said Nisbet.

While both witnesses and congressmen recognized that the Department of Justice launched FOIA.gov at the outset of Sunshine Week, “there is the awkward fact the Justice Department’s own FOIA backlog has not been reduced in the past year,” observed Daniel Metcalfe, executive director of the Collaboration on Government Secrecy.

The costs of FOIA are part of that story. “In 2010, agencies reported spending nearly $400 million to process FOIA requests,” testified Rick Blum of SunshineInGovernment.org.

There’s also the issue of agencies and officials claiming exemptions to requests. Blum noted that for Sunshine Week, ProPublica created a searchable database of FOIA exemptions.

These claimed exemptions extend to the White House. Tom Fitton of Judicial Watch challenged the Secret Service’s contention that visitor logs are not subject to FOIA.

While the Project on Government Oversight’s Angela Canterbury gave the administration credit for proactive information release at USASpending.gov, Data.gov, Recovery.gov and FOIA.gov, she acknowledged that “if FOIA is the yardstick for openness, then we haven’t gotten very far yet.”

The issue lies in the default toward secrecy versus openness. “Too often, overt secrecy has not only impaired the promise of FOIA but also has put the American people at risk,” said Canterbury.

That said, Daniel Metcalfe did offer recognition of President Obama’s elevation of open government in his administration, including a speech at the United Nations where openness was highlighted in an “unprecedented” way.

The written testimony of the witnesses is linked below. Video of the hearing will be available through the tireless efforts of citizen archivist Carl Malamud at House.Resource.org later in the week.

Making open government data visualizations that matter

Every month, more open government data is available online. Local governments are becoming data suppliers. Open healthcare data is spurring better decisions. There’s a tremendous amount of activity in open data – but there’s a long road ahead for open government. At the SXSW Interactive Festival in Austin, Texas, Jeremiah Akin and Michael Castellon made a case for “why visualizing government data makes taxpayers happy.”

The expectation of transparency is creating demand for government agencies to develop new ways to communicate complex data and trends to the public in easy-to-access and easy-to-understand formats.

Some agencies are turning to Google Maps and KML data to visualize raw information online and on mobile devices. Delivering data in more easily understandable formats not only boosts trust and confidence between government agencies and their publics, but also streamlines workloads among Data, Web, Editorial, and Customer Service teams.
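The KML approach described above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration of turning tabular open data into a KML overlay that Google Maps or Google Earth can render; the sample record and field layout are invented for the example, not drawn from the Texas Comptroller's actual feeds.

```python
# Hypothetical sketch: converting tabular open data into a minimal
# KML document for a Google Maps overlay. Sample data is invented.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"


def rows_to_kml(rows):
    """Build a KML string with one Placemark per (name, lat, lon) row."""
    ET.register_namespace("", KML_NS)  # emit KML's default namespace
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    for name, lat, lon in rows:
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
        point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        # Note: KML coordinates are in longitude,latitude order.
        ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")


sample = [("Example field office", 30.2672, -97.7431)]
print(rows_to_kml(sample))
```

Because KML is just XML, the same generator can feed a desktop globe, a web map, or a mobile map view without per-device code.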

The two men talked about how the Texas Comptroller is using public-facing maps to communicate with the public, including to the rapidly increasing numbers of citizens accessing government websites from mobile devices.

The utility of open government data can be quite concrete, as when live tsunami data is used to help save lives. It can also help people understand more about the virtual lines in their towns and cities. In Texas, ClaimItTexas.org shows unclaimed property in the Lone Star State.

Mobile transparency

The Texas Transparency Map is also available for touchscreen mobile devices and for tablets. That’s no accident: Akin said that mobile traffic to the site is up four-fold since last year.

“We’re seeing a lot more mobile access,” he said. “If we want to make it available on multiple devices, we need to create it in a way that can be displayed.” That insight is a crucial one, and reverberates far beyond the government sphere. By choosing to develop non-native Web applications written in HTML5, JavaScript and JSON, this cohort of Texas government avoided “Shiny App Syndrome.” Next steps include support for street-level detail, Google Fusion tables, and geolocation.
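The "open web formats" choice above means the server's only job is to publish clean JSON that any HTML5 front end can consume, whether on a desktop, phone or tablet. Here is a hedged sketch of that idea: a hypothetical CSV export (the field names are invented) flattened into JSON records for a JavaScript client.

```python
# Hypothetical sketch: flattening a CSV export into JSON so one
# HTML5/JavaScript front end can render it on any device.
# The dataset and field names below are invented for illustration.
import csv
import io
import json

CSV_SAMPLE = """county,spending_millions
Travis,412.5
Harris,1890.0
"""


def csv_to_json(text):
    """Parse a CSV export into a JSON array of typed records."""
    rows = csv.DictReader(io.StringIO(text))
    return json.dumps(
        [
            {
                "county": r["county"],
                # convert the string field to a number for charting clients
                "spending_millions": float(r["spending_millions"]),
            }
            for r in rows
        ]
    )


print(csv_to_json(CSV_SAMPLE))
```

Keeping the data layer this plain is what makes the "write once, display anywhere" strategy work: native apps, mobile browsers and desktop dashboards all read the same feed.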

Putting open government data to work

“Open government data has been used for a long time,” said Akin, citing the use of census data in newspapers. A new class of data-driven journalism is putting that data to use in innovative ways, pointed out Castellon. “The Texas Tribune is one of the leaders in data visualization,” he said, work that helps citizens make sense of government data.

The key here, emphasized Akin, is that it is not enough to simply dump data. You need ways to visualize it and make it meaningful as information. “There’s a lot of resistance – people have been there, and that’s not how they’ve done things,” he said. “If you make a visualization that makes someone’s job easier, pretty soon they start coming back to you.”

With better data visualizations and more information, Castellon posited that more problem solving can take place. “When you release data, especially with science, education or research, there are stories embedded in that data,” he said.

In this narrative, it’s up to governments to release better, clean data in consumable formats and the evolving art of data journalism to make stories from it that give citizens, businesses and elected officials insight into the complexities of modern life.