MacRae on open sourcing Data.gov as an open government platform

Andrew MacRae, the program manager for strategy and innovation for Data.gov, spoke
at Data.gov Developer Day in Washington about how the General Services Administration (GSA) and the Indian government plan to collaborate on open sourcing the United States federal government’s open data platform.

Chief software architect of Data.gov outlines next steps for the open government platform

Chris Musialek, speaking at the Data.gov Developer Day, talks about next steps for the United States federal government’s open data platform.

A future of cities fueled by citizens, open data and collaborative consumption

The future of cities was a hot topic this year at the SXSW Interactive Festival in Austin, Texas, with two different panels devoted to thinking about what’s next. I moderated one of them, on “shaping cities with mobile data.” Megan Schumann, a consultant at Deloitte, was present at both sessions and storified them. Her curation should give you a sense of the zeitgeist of ideas shared.

View the story “#SXSW Future Cities Fueled by Citizens and Collaborative Consumption” on Storify: http://storify.com/Schumenu/sxsw-future-cities-fueled-by-citizens-and-collabo

A Conversation About Social Media, Open Government and eDemocracy [VIDEO]

If the town square now includes public discourse online, democratic governments in the 21st century are finding that part of civic life now includes listening there. Given what we’ve seen in this young century, how governments deal with social media is now part of how they deal with civil liberties, press freedom, privacy and freedom of expression in general.

At the end of Social Media Week 2012, I moderated a discussion with Matt Lira, Lorelei Kelly and Clay Johnson at the U.S. National Archives. This conversation explored more than how social media is changing politics in Washington: we looked at its potential to help elected officials and other public servants make better policy decisions in the 21st century.

I hope you find it of interest; all three of the panelists gave thoughtful answers to the questions that I and the audience posed.

On the ambiguity of open government and open data

A new paper on “The New Ambiguity of ‘Open Government’” by Princeton scholars David Robinson and Harlan Yu is essential reading on the state of open government and open data in 2012. As the Cato Institute’s Jim Harper noted in a post about the new paper and open government data this morning, “paying close attention to language can reveal what’s going on in the world around you.”

https://twitter.com/harlanyu/status/174861718621655040

From the abstract:

“Open technologies involve sharing data over the Internet, and all kinds of governments can use them, for all kinds of reasons. Recent public policies have stretched the label “open government” to reach any public sector use of these technologies. Thus, “open government data” might refer to data that makes the government as a whole more open (that is, more transparent), but might equally well refer to politically neutral public sector disclosures that are easy to reuse, but that may have nothing to do with public accountability. Today a regime can call itself “open” if it builds the right kind of web site—even if it does not become more accountable or transparent. This shift in vocabulary makes it harder for policymakers and activists to articulate clear priorities and make cogent demands.

This essay proposes a more useful way for participants on all sides to frame the debate: We separate the politics of open government from the technologies of open data. Technology can make public information more adaptable, empowering third parties to contribute in exciting new ways across many aspects of civic life. But technological enhancements will not resolve debates about the best priorities for civic life, and enhancements to government services are no substitute for public accountability.”

Yu succinctly explained his thinking in two more tweets:

https://twitter.com/harlanyu/status/174862012445245440

https://twitter.com/harlanyu/status/174862230305779712

While it remains to be seen whether the Open Knowledge Foundation will be “open” to changing the “Open Data Handbook” to the “Adaptable Data Handbook,” Yu and Robinson are after something important here.

There’s good reason to be careful about celebrating the progress cities, states and counties are making in standing up open government data platforms. Here’s an excerpt from a post on open government data on Radar last year:

Open government analysts like Nathaniel Heller have raised concerns about the role of open data in the Open Government Partnership, specifically that:

“… open data provides an easy way out for some governments to avoid the much harder, and likely more transformative, open government reforms that should probably be higher up on their lists. Instead of fetishizing open data portals for the sake of having open data portals, I’d rather see governments incorporating open data as a way to address more fundamental structural challenges around extractives (through maps and budget data), the political process (through real-time disclosure of campaign contributions), or budget priorities (through online publication of budget line-items).”

Similarly, Greg Michener has made a case for getting the legal and regulatory “plumbing” for open government right in Brazil, not “boutique Gov 2.0” projects that graft technology onto flawed governance systems. Michener warned that emulating the government 2.0 initiatives of advanced countries, including open data initiatives:

“… may be a premature strategy for emerging democracies. While advanced democracies are mostly tweaking and improving upon value-systems and infrastructure already in place, most countries within the OGP have only begun the adoption process.”

Michener and Heller both raise bedrock issues for open government in Brazil and beyond that no technology solution in and of itself will address. They’re both right: Simply opening up data is not a replacement for a Constitution that enforces a rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society.

Heller and Michener speak for an important part of the open government community and surely articulate concerns that exist for many people, particularly for a “good government” constituency whose long term, quiet work on government transparency and accountability may not be receiving the same attention as shinier technology initiatives.

Harper teased out something important on that count: “There’s nothing wrong with open government data, but the heart of the government transparency effort is getting information about the functioning of government. I think in terms of a subject-matter trio—deliberations, management, and results—data about which makes for a more open, more transparent government. Everything else, while entirely welcome, is just open government data.”

This new paper will go a long way toward clarifying and teasing out those issues.

The expanding world of open data journalism

From healthcare to finance to emergency response, data holds immense potential to help citizens and government. Putting data to work for the public good, however, will require data journalists to apply the powerful emerging tools in the newsroom stack to the explosion of information from government, business and their fellow citizens. The promise of data journalism has been a strong theme throughout the National Institute for Computer-Assisted Reporting’s (NICAR) 2012 conference.

It was in that context that I presented on “Open Data Journalism” this morning, which, to paraphrase Jonathan Stray, I’d define as obtaining, reporting upon, curating and publishing open data in the public interest. My slides, which broadly describe what I’m seeing in the world of open government today, are embedded below.

San Francisco pitches lean government as a platform for innovation [PRESENTATION]

Over at TechCrunch, Eric Eldon reports that “San Francisco Launches The 2012 Innovation Portfolio, From Open Taxi Data To Beta Tests In City Hall,” sourcing the post on a presentation from the city’s innovation staff, which I’ve embedded below. Eldon summarizes it in his post, but here’s the gist of it:

Mayor Ed Lee, who came to power last year with heavy support from the local tech scene, is announcing a new initiative today at the TechFellows awards ceremony, that has some intriguing ideas for making the city itself more relevant to the booming industry within it.

Broadly, the so-called 2012 Innovation Portfolio is trying to do everything from helping founders making it easier to complete the paperwork for creating a company, to giving developers new access to city data, to introducing new ways for citizens to share their opinions with the city, to actually testing out tech products at City Hall itself.

As Sara Lai Stirland reported last month, however, while San Francisco’s plans for open government, open data, open doors to new business and better services are focused on worthy goals, achieving them won’t be a walk in Golden Gate Park. Then again, it’s rare that anything worth doing is easy.

Honestly, in reading this over, I’m not sure how much of this innovation initiative is truly new, although there is one news nugget: “As part of this effort, the City is moving to a cloud-based data sharing service for launch in March.”

While that appears to have perplexed Eldon, many Govfresh readers will be able to decipher it: San Francisco looks likely to be adopting Socrata next month. If so, that means that, in theory, civic developers will have more (better?) APIs for SF open data soon.
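If that adoption happens, querying SF open data would look something like the sketch below, which builds a Socrata Open Data API (SODA) query URL. This is a hedged illustration, not a confirmed rollout: the dataset ID “abcd-1234” is a placeholder, and whether data.sfgov.org ends up hosting any particular dataset is an assumption.

```python
# Minimal sketch of building a SODA (Socrata Open Data API) query URL.
# The dataset ID "abcd-1234" is a placeholder, not a real SF dataset.
from urllib.parse import urlencode

def soda_query_url(domain, dataset_id, **params):
    """Build a SODA query URL for a dataset hosted on a Socrata domain.

    SODA query parameters such as $limit are passed as keyword arguments
    and URL-encoded (the "$" becomes "%24" on the wire).
    """
    base = f"https://{domain}/resource/{dataset_id}.json"
    query = urlencode(params)
    return f"{base}?{query}" if query else base

# Example: ask a hypothetical SF dataset for its first 10 rows.
print(soda_query_url("data.sfgov.org", "abcd-1234", **{"$limit": 10}))
```

Any HTTP client can then fetch that URL and parse the JSON rows it returns.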

I have a feature in the works on what San Francisco is up to in open government and will report back when I have more to share.

Update: Govfresh founder Luke Fretwell noticed that San Francisco’s new innovation site is running on WordPress. In doing so, the city government would be adopting two of the planks from Luke’s manifesto to reboot government innovation in San Francisco. It’s a promising start.

Regulations.gov relaunches with APIs, integrates social media, hopes for public participation

President Barack Obama signs H.R. 2751, the “FDA Food Safety Modernization Act,” in the Oval Office, Jan. 4, 2011. (Official White House Photo by Pete Souza)

On January 18, 2011, President Obama issued an executive order directing that regulations shall be adopted through a process that involves participation. 13 months later, the nation’s primary online regulatory website received an overdue redesign and, significantly, a commitment from the administrator of the White House Office of Information and Regulatory Affairs (OIRA) to make regulatory data available to the public.

Today, the White House announced the relaunch of Regulations.gov in a post on remaking public participation by Cass Sunstein, the administrator of the OIRA:

…the President issued Executive Order 13563, in which he directed regulatory agencies to base regulations on an “open exchange of information and perspectives” and to promote public participation in Federal rulemaking. The President identified Regulations.gov as the centralized portal for timely public access to regulatory content online.

In response to the President’s direction, Regulations.gov has launched a major redesign, including innovative new search tools, social media connections, and better access to regulatory data.  The result is a significantly improved website that will help members of the public to engage with agencies and ultimately to improve the content of rules.

The redesign of Regulations.gov also fulfills the President’s commitment in The Open Government Partnership National Action Plan to “improve public services,” including to “expand public participation in the development of regulations.” This step is just one of many, consistent with the National Action Plan, designed to make our Federal Government more transparent, participatory, and collaborative.

I’ve embedded the video that Regulations.gov released about the launch below:

The relaunch includes the following changes:

  • A new Regulations.gov look and Web design
  • A new “Browse” tab that groups regulations into 10 categories, sorted by industry
  • A new “Learn” tab that describes the regulatory process
  • Improved search
  • Integrated social media tools (Twitter, Facebook, YouTube and Regulations.gov Exchange)
  • New Application Programming Interfaces (APIs) and standard, Federal Register-specific URLs

That last detail will be of particular interest to the open government and open data community. Sunstein explained the thinking behind the role of APIs at the WhiteHouse.gov blog:

Application Programming Interfaces (APIs) are technical interfaces/tools that allow people to pull regulatory content from Regulations.gov. For most of us, the addition of “APIs” on Regulations.gov doesn’t mean much, but for web managers and experts in the applications community, providing APIs will fundamentally change the way people will be able to interact with public federal regulatory data and content.

The initial APIs will enable developers to pull data out of Regulations.gov, and in future releases, the site will include APIs for receiving comment submissions from other sites. With the addition of APIs, other web sites – ranging from other Government sites to industry associations to public interest groups – will now be able to repurpose publicly-available regulatory information on Regulations.gov, and format this information in unique ways such as mobile apps, analytical tools, “widgets” and “mashups.” We don’t know exactly where this will lead us – technological advances are full of surprises – but we are likely to see major improvements in public understanding and participation in rulemaking.

While the APIs will need to be explored and the data behind them assessed for quality, releasing regulatory data through APIs could in theory underpin a wide variety of new consumer-facing services. If you’re interested in the APIs, click on “Developers – Beta” at Regulations.gov to download a PDF that contains API directions, URLs and information about an API key.
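As a rough illustration of what consuming those APIs might look like, here is a hedged Python sketch. The endpoint path, parameter names and key handling below are assumptions for illustration only; the authoritative details live in that “Developers – Beta” PDF.

```python
# Hedged sketch of a Regulations.gov API call. The "/api/documents" path
# and the "api_key"/"s" parameter names are placeholders, not the
# documented interface -- consult the developer PDF for the real ones.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def document_search_url(keyword, api_key):
    """Build a hypothetical document-search URL with an API key."""
    params = urlencode({"api_key": api_key, "s": keyword})
    return f"http://regulations.gov/api/documents?{params}"

def fetch_documents(keyword, api_key):
    """Fetch and decode the JSON response for a keyword search."""
    with urlopen(document_search_url(keyword, api_key)) as resp:
        return json.load(resp)

print(document_search_url("food safety", "YOUR_API_KEY"))
```

The point of the sketch is the shape of the workflow, not the specifics: request a key, build a parameterized URL, pull JSON, and repurpose it in an app, widget or mashup.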

A time for e-rulemaking

This move comes as part of a larger effort towards e-rulemaking by this White House that will almost certainly be carried over into future administrations, regardless of the political persuasion of the incumbent of the Oval Office. In the 21st century, the country desperately needs a smarter approach to regulations.

As the Wall Street Journal reported last year, the ongoing regulatory review by OIRA is a nod to serious, long-standing concerns in the business community about excessive regulation hampering investment and job creation as citizens struggle to recover from the effects of the Great Recession.

As the cover story of this month’s issue of The Economist highlights, concerns about an over-regulated America are cresting in this election year, with headlines from that same magazine decrying “excessive environmental regulation” and calling for more accurate measurement of the cost of regulations. Deleting regulations is far from easy to do but there does appear to be a political tailwind behind doing so.

We’ll see if an upgraded online portal that is being touted as a means to include the public in participating in rulemaking makes any difference in regulatory outcomes. Rulemaking and regulatory review are, virtually by their nature, wonky and involve esoteric processes that rely upon knowledge of existing laws and regulations.

While the Internet could involve many more people in the process, improved outcomes will depend upon a digitally literate populace that’s willing to spend some of its civic surplus on public participation.

To put it another way, getting to “Regulations 2.0” will require “Citizen 2.0” — and we’ll need the combined efforts of all our schools, universities, libraries, non-profits and open government advocates to have a hope of successfully making that upgrade.

Will ESRI allow public GIS data to be fully open government data?

As has been true for years, there’s a robust debate in the municipal information technology world over the use of proprietary versus open source software. An important element of that conversation centers on open data, specifically whether the formats used by companies are interoperable and “open,” in the sense of being usable by more than one kind of software. When the license required to use a given software application is expensive, that requirement can put budget-strapped cities and towns in a difficult position. Last week, former New York State Senate CIO Andrew Hoppin weighed in on the debate, writing about proprietary software lions and bears in the Civic Commons marketplace, a new online directory of civic software.

I believe the Civic Commons Marketplace will ultimately save US taxpayers billions of dollars in government IT spending, while accelerating the propagation of technology-driven civic innovation in the bargain. I’ve believed this for a while. Thus, it’s a debate worth having; the Marketplace deserves attention, and critique.

In order to realize its potential, from my perspective as a recovering government CIO, I believe that the Civic Commons Marketplace must give equal billing to all software used in government, regardless of the software license associated with it.

Nick Grossman, the executive director of Civic Commons, chronicled the debate that Hoppin described in a Storify:

View the story “Proprietary Lions and Bears in the Civic Commons Marketplace” on Storify: http://storify.com/nickgrossman/proprietary-lions-and-bears-in-the-civic-commons-m

I talked with ESRI founder Jack Dangermond in September 2010 about how he was opening up ESRI and the role he saw for mapping in open government. My sense then, as now, is that this is an issue that’s deeply important to him.

There are clearly strong feelings in the civic development community about the company’s willingness to open up its data, along with what that means for how public data is coded and released. If you’re a GIS developer and have an opinion on this issue, please let us know in the comments.

Expert Labs data: How does the @WhiteHouse drive engagement on Twitter? [INFOGRAPHIC]

Over at Expert Labs, Andy Baio created a snazzy infographic of engagement around the White House’s Twitter account using data collected through the ThinkUp app.

There are lots of views into engagement on Twitter, but we have the data to give a unique view into what it looks like from the @whitehouse perspective.

We’ve tracked their activity for the last couple years using ThinkUp to analyze and publicly release large datasets. We decided it might be nice to show how the White House engaged their audience last year — without resorting to cheap gimmicks like linkbait infographics.

As Baio points out, if you want to work some mojo on this data set, you can download the .CSV file and have some fun. Kudos to the Expert Labs team for making both the open data and visualization available to all.
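For readers who do grab the CSV, a minimal sketch of that kind of fun might look like this. The column names (“text”, “retweet_count”) and the sample rows are assumptions for illustration, so check the actual header in the downloaded file before adapting it.

```python
# Hedged sketch: ranking tweets by engagement from a CSV export.
# Column names and sample data are assumed, not taken from the real
# Expert Labs dataset -- inspect the downloaded file's header first.
import csv
from io import StringIO

SAMPLE = """text,retweet_count
"President Obama on the economy",512
"Photo of the day",87
"Live chat at 2pm ET",203
"""

def top_tweets(csv_text, n=2):
    """Return the n rows with the highest retweet counts."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    rows.sort(key=lambda r: int(r["retweet_count"]), reverse=True)
    return [(r["text"], int(r["retweet_count"])) for r in rows[:n]]

print(top_tweets(SAMPLE))
# [('President Obama on the economy', 512), ('Live chat at 2pm ET', 203)]
```

Swap `StringIO(csv_text)` for an `open(...)` call on the downloaded file and the same few lines give you a quick leaderboard of the account’s most-amplified posts.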