Yesterday, the United States Freedom of Information Act Advisory Committee met at the National Archives in Washington and approved a series of recommendations that would, if implemented, dramatically improve public access to public information. And in May, it will consider … Continue reading
Less than a year after I called for tech companies to publish a public political ad file as open data online, Facebook has committed to doing so this August, through an API. Working with Congress to draft a law … Continue reading
Yesterday, I wrote up 15 key insights from the Pew Research Center’s new research on the American public’s attitudes toward open data and open government. If you missed it, what people think about government data and the potential impact of releasing it is heavily influenced by prevailing low trust in government and by their politics.
Media coverage of the survey reflected the skepticism of the reporters (“Most Americans don’t think government transparency matters a damn“) or of the public (“Who cares about open data” and “Americans not impressed by open government initiatives“). This photo by Pete Souza below might be an apt image for this feeling:
Other stories pulled out individual elements of the research (“Open data on criminals and teachers is a-okay, say most US citizens”) or mixed results (“People Like U.S. Open Data Initiatives, But Think Government Could Do More” and “Sorry, open data: Americans just aren’t that into you“) or general doubts about an unfamiliar topic (“Many Americans Doubt Government Open Data Efforts“). At least one editor’s headline suggested that the results were an indictment of everything government does online (“Americans view government’s online services and public data sharing as a resounding ‘meh’”). Meh, indeed.
As usual, keep a salt shaker handy as you browse the headlines and read the original source. The research itself is more nuanced than those headlines suggest, as my interview with the lead researcher on the survey, John Horrigan, hopefully made clear.
Over at TechPresident, editor-in-chief Micah Sifry saw a glass half full:
- Digging deeper into the Pew report, it’s interesting to find that beyond the “ardent optimists” (17% of adults) who embrace the benefit of open government data and use it often, and the “committed cynics” (20%) who use online government resources but think they aren’t improving government performance much, there’s a big group of “buoyant bystanders” (27%) who like the idea that open data can improve government’s performance but themselves aren’t using the internet much to engage with government. (Heads up Kate Krontiris, who’s been studying the “interested bystander.”)
- It’s not clear how much of the bystander problem is also an access problem. According to a different new analysis done by the Pew Research Center, about five million American households with school-age children–nearly one in five–do not have high-speed internet access at home. This “broadband gap” is worst among households with incomes under $50,000 a year.
Reaction from foundations that have advocated, funded or otherwise supported open government data efforts went deeper. Writing for the Sunlight Foundation, communications director Gabriela Schneider saw the results from the survey in a rosy (sun)light, seeing public optimism about open government and open data.
People are optimistic that open data initiatives can make government more accountable. But many surveyed by Pew are less sure open data will improve government performance. Relatedly, Americans have not engaged very deeply with government data to monitor performance, so it remains to be seen if changes in engagement will affect public attitudes.
That’s something we at Sunlight hope to positively affect, particularly as we make new inroads in setting new standards for how the federal government discloses its work online. And as Americans shift their attention away from Congress and more toward their own backyards, we know our newly expanded work as part of the What Works Cities initiative will better engage the public, make government more effective and improve people’s lives.
Jonathan Sotsky, director of strategy and assessment for the Knight Foundation, saw a trust conundrum for government in the results:
Undoubtedly, a greater focus is needed on explaining to the public how increasing the accessibility and utility of government data can drive accountability, improve government service delivery and even provide the grist for new startup businesses. The short-term conundrum government data initiatives face is that while they ultimately seek to increase government trustworthiness, they may struggle to gain traction because the present lack of trust in government undermines their perceived impact.
Steven Clift, the founder of e-democracy.org, views this survey as a wakeup call for open data advocates.
One reason I love services like CityGram, GovDelivery, etc. is that they deliver government information (often in a timely way) to the public based on their preferences/subscriptions. As someone who worked in “e-government” for the State of Minnesota, I think most people just want the “information” that matters to them and the public has no particular attachment to the idea of “open data” allowing third parties to innovate or make this data available. I view this survey as a huge wake up call to #opengov advocates on the #opendata side that the field needs to provide far more useful stuff to the general public and care a lot more about outreach and marketing to reach people with the good stuff already available.
Mark Headd, former chief data officer for the City of Philadelphia and current developer evangelist for Accela software, saw the results as a huge opportunity to win hearts and minds:
The modern open data and civic hacking movements were largely born out of the experience of cities. Washington DC, New York City and Chicago were among the first governments to actively recruit outside software developers to build solutions on top of their open data. And the first governments to partner with Code for America – and the majority over the life of the organization’s history – have been cities.
How do school closings impact individual neighborhoods? How do construction permit approvals change the character of communities? How is green space distributed across neighborhoods in a city? Where are vacant properties in a neighborhood – who owns them and are there opportunities for reuse?
These are all the kinds of questions we need people living and working in neighborhoods to help us answer. And we need more open data from local governments to do this.
If you see other blog posts or media coverage that’s not linked above, please let me know. I storified some reactions on Twitter but I’m certain that I missed conversations or opinions.
There are two additional insights from Pew that I didn’t write about yesterday that are worth keeping in mind with respect to how Americans are thinking about the release of public data back to the public. First, it’s unclear whether the public realizes they’re using apps and services built upon government data, despite sizable majorities doing so.
— Alex Howard (@digiphile) April 21, 2015
Second, John Horrigan told me that survey respondents across the board are not simply asking for governments to make the data easier to understand so that they can explore it on their own: what people really want is intermediaries to help them make sense of the data.
“We saw a fair number of people pleading in comments for better apps to make the data make sense,” said Horrigan. “When they went online, they couldn’t get budget data to work. When they found traffic data, they couldn’t make it work. There were comments on both sides of the ledger. Those that think government did an OK job wish they did this. Those that think government is doing a horrible job also wish they did this.”
This is the opportunity that Headd referred to, and the reason that democratic governments which genuinely want to see returns on accountability and transparency must ensure that data journalism can flourish in civil society.
If a Republican is elected as the next President of the United States, we’ll see if public views shift on other fronts.
Today, Open Knowledge released its global 2014 Open Data Index, refreshing its annual measure of the accessibility and availability of government releases of data online. When compared year over year, these indices have shown not only the relative openness of data between countries but also the slow growth in the number of open datasets. Overall, however, the nonprofit found that the percentage of open datasets across all 97 surveyed countries (up from 63 in 2013) remained low, at only 11%.
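For readers curious where a headline number like that 11% comes from, here is a minimal, hypothetical sketch of the aggregation: an index like this scores each country on a fixed set of datasets, counts a dataset as fully open only when it meets every criterion, and reports open datasets as a share of all country-dataset pairs surveyed. The countries, datasets, and values below are invented for illustration, not taken from the Index itself.

```python
# Hypothetical sketch of aggregating an open data index score.
# A dataset counts as "open" only if it satisfies every criterion
# (openly licensed, machine-readable, freely available, etc.);
# here that judgment is collapsed to a single boolean per dataset.
surveys = {
    "Country A": {"budget": True, "spending": False, "companies": False},
    "Country B": {"budget": True, "spending": True, "companies": False},
}

total = sum(len(datasets) for datasets in surveys.values())
open_count = sum(
    1 for datasets in surveys.values() for is_open in datasets.values() if is_open
)
share = 100 * open_count / total
print(f"{open_count}/{total} datasets fully open ({share:.0f}%)")  # 3/6 (50%)
```

With 97 countries each scored on a shared list of key datasets, the same ratio yields the Index's overall percentage.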
“Opening up government data drives democracy, accountability and innovation,” said Rufus Pollock, the founder and president of Open Knowledge, in a statement. “It enables citizens to know and exercise their rights, and it brings benefits across society: from transport, to education and health. There has been a welcome increase in support for open data from governments in the last few years, but this year’s Index shows that real progress on the ground is too often lagging behind the rhetoric.”
The map below can be explored in interactive form at the Open Knowledge website.
Open Knowledge also published a refreshed ranking of countries. The United Kingdom remains atop the list, followed by Denmark and France, which moved up from number 12 in 2013. India moved into the top 10, from #27, after the relaunch of its open data platform.
Despite the rhetoric emanating from Washington, the United States is ranked at number 8, primarily due to deficiencies in open data on government spending and the lack of an open register of companies. Implementation of the DATA Act may help, as would the adoption of an open corporate identifier by the U.S. Treasury.
Below, in an interview from 2012, Pollock talks more about the relationship between open data and open government.
This morning, I gave a short talk on data journalism and the changing landscape for policy making in the age of networked transparency at the Woodrow Wilson Center in DC, hosted by the Commons Lab.
— Alex Howard (@digiphile) July 30, 2014
Video from the event is online at the Wilson Center website. Unfortunately, I found that I didn’t edit my presentation down enough for my allotted time. I made it to slide 84 of 98 in 20 minutes and had to skip the 14 predictions and recommendations section. While many of the themes I describe in those 14 slides came out during the roundtable question and answer period, they’re worth resharing here, in the presentation I’ve embedded below:
On May 30, I gave a keynote talk on my research on the art and science of data journalism at the first Tow Center research conference at Columbia Journalism School in New York City. I’ve embedded the video below:
My presentation is embedded below, if you want to follow along or visit the sites and services I described.
Here’s an observation drawn from an extensive section on open government that should be of interest to readers of this blog:
“Proactive, selective open data initiatives by government focused on services that are not balanced by support for press freedoms and improved access can fairly be criticized as ‘openwashing’ or ‘fauxpen government.’
Data journalists who are frequently faced with heavily redacted document releases or reams of blurry PDFs are particularly well placed to make those critiques.”
My contribution was only one part of the proceedings for “Quantifying Journalism: Metrics, Data and Computation,” which you can catch up on through the Tow Center’s live blog or TechPresident’s coverage of measuring the impact of journalism.
— Michael Keller (@mhkeller) May 30, 2014
This morning, Adam Liptak reported at the New York Times that the Supreme Court has been quietly editing its legal decisions without notice or indication. According to Richard J. Lazarus, a law professor at Harvard whom Liptak interviewed about a new study examining the issue, these revisions include “truly substantive changes in factual statements and legal reasoning.”
The court does warn readers that early versions of its decisions, available at the courthouse and on the court’s website, are works in progress. A small-print notice says that “this opinion is subject to formal revision before publication,” and it asks readers to notify the court of “any typographical or other formal errors.”
But aside from announcing the abstract proposition that revisions are possible, the court almost never notes when a change has been made, much less specifies what it was. And many changes do not seem merely typographical or formal.
Four legal publishers are granted access to “change pages” that show all revisions. Those documents are not made public, and the court refused to provide copies to The New York Times.
The Supreme Court secretly editing the legal record seems like a big deal to me. (Lawyers, professors, court reporters, tell me I’m wrong!)
To me, this story highlights the need for, and eventually the use of, data and software to track the changes in a public, online record of Supreme Court decisions.
Static PDFs that are edited without notice, data or indication of changes don’t seem good enough for the judicial branch of a constitutional republic in the 21st century.
Just as the U.S. Code and state and local codes are constantly being updated and consulted by lawyers, courts and the people, the Supreme Court’s decisions could be published and maintained online as a body of living law at SupremeCourt.gov, so that they may be read and consulted by all.
Embedded and integrated into those decisions and codes would be a record of the changes to them, the metadata of the actions of the judicial branch of the republic.
What you’ll find now at SupremeCourt.gov is a significant improvement over past years. Future versions, however, might be even better.
Citizensourcing and open innovation can work in the public sector, just as crowdsourcing can in the private sector. Around the world, the use of prizes to spur innovation has been booming for years. The United States of America has been significantly scaling up its use of prizes and challenges to solve grand national challenges since January 2011, when President Obama signed an updated version of the America COMPETES Act into law.
According to the third congressionally mandated report released by the Obama administration today (PDF/Text), the number of prizes and challenges conducted under the America COMPETES Act has increased by 50% since 2012 and nearly six-fold overall since 2011. 25 different federal agencies offered prizes under COMPETES in fiscal year 2013, with 87 prize competitions in total. The size of the prize purses has also grown, with 11 challenges over $100,000 in 2013. Nearly half of the prizes conducted in FY 2013 were focused on software, including applications, data visualization tools, and predictive algorithms. Challenge.gov, the award-winning online platform for crowdsourcing national challenges, now has tens of thousands of users who have participated in more than 300 public-sector prize competitions. Beyond the growth in prize numbers and amounts, the Obama administration highlighted four trends in public-sector prize competitions:
- New models for public engagement and community building during competitions
- Growth in software and information technology challenges, with nearly 50% of the total prizes in this category
- More emphasis on sustainability and “creating a post-competition path to success”
- Increased focus on identifying novel approaches to solving problems
The growth of open innovation in and by the public sector was directly enabled by Congress and the White House, working together for the common good. Congress reauthorized COMPETES in 2010 with an amendment to Section 105 of the act that added a Section 24 on “Prize Competitions,” providing all agencies with the authority to conduct prizes and challenges that only NASA and DARPA had previously enjoyed. The White House Office of Science and Technology Policy (OSTP) has since guided the law’s implementation, providing guidance on the use of challenges and prizes to promote open government.
“This progress is due to important steps that the Obama Administration has taken to make prizes a standard tool in every agency’s toolbox,” wrote Cristin Dorgelo, assistant director for grand challenges in OSTP, in a WhiteHouse.gov blog post on engaging citizen solvers with prizes:
In his September 2009 Strategy for American Innovation, President Obama called on all Federal agencies to increase their use of prizes to address some of our Nation’s most pressing challenges. Those efforts have expanded since the signing of the America COMPETES Reauthorization Act of 2010, which provided all agencies with expanded authority to pursue ambitious prizes with robust incentives.
To support these ongoing efforts, OSTP and the General Services Administration have trained over 1,200 agency staff through workshops, online resources, and an active community of practice. And NASA’s Center of Excellence for Collaborative Innovation (COECI) provides a full suite of prize implementation services, allowing agencies to experiment with these new methods before standing up their own capabilities.
Sun Microsystems co-founder Bill Joy famously once said that “No matter who you are, most of the smartest people work for someone else.” This rings true, in and outside of government. The idea of governments using prizes like this to inspire technological innovation, however, is not reliant on Web services and social media, born from the fertile mind of a Silicon Valley entrepreneur. As the introduction to the third White House prize report notes:
“One of the most famous scientific achievements in nautical history was spurred by a grand challenge issued in the 18th Century. The issue of safe, long distance sea travel in the Age of Sail was of such great importance that the British government offered a cash award of £20,000 to anyone who could invent a way of precisely determining a ship’s longitude. The Longitude Prize, enacted by the British Parliament in 1714, would be worth some £30 million today, but even by that measure the value of the marine chronometer invented by British clockmaker John Harrison might be a deal.”
Centuries later, the Internet, World Wide Web, mobile devices and social media offer the best platforms in history for this kind of approach to solving grand challenges and catalyzing civic innovation, helping public officials and businesses find new ways to solve old problems. When a new idea, technology or methodology challenges and improves upon existing processes and systems, it can improve the lives of citizens or the function of the society that they live within.
“Open innovation or crowdsourcing or whatever you want to call it is real, and is (slowly) making inroads into mainstream (i.e. non high-tech) corporate America,” said MIT principal research scientist Andrew McAfee in an interview in 2012. “P&G is real. Innocentive is real. Kickstarter is real. Idea solicitations like the ones from Starbucks are real, and lead-user innovation is really real.”
Prizes and competitions all rely upon the same simple idea behind the efforts like the X-Prize: tapping into the distributed intelligence of humans using a structured methodology. This might include distributing work, in terms of completing a given task or project, or soliciting information about how to design a process, product or policy.
Over the past decade, experiments with this kind of civic innovation around the world have been driven by tight budgets and increased demands for services, and enabled by the increased availability of inexpensive, lightweight tools for collaborating with connected populations. The report claimed that crowdsourcing can save federal agencies significant taxpayer dollars, citing an example of a challenge where the outcome cost a sixth of the estimated total of a traditional approach.
One example of a cost-effective prize program is the Medicaid Provider Screening Challenge, offered by the Centers for Medicare & Medicaid Services (CMS) as part of a pilot designed in partnership with states and other stakeholders. This prize program was a series of software development challenges designed to improve capabilities for streamlining operations and screening Medicaid providers to reduce fraud and abuse. With a total prize purse of $500,000, the challenge series is leading to the development of an open source, multi-state, multi-program provider screening shared-service software program capable of risk scoring, credential validation, identity authentication, and sanction checks, while lowering the burden on providers and reducing administrative and infrastructure expenses for states and Federal programs.

CMS partnered with the NASA Center of Excellence for Collaborative Innovation (COECI), NASA’s contractor Harvard Business School, Harvard’s subcontractor TopCoder, and the State of Minnesota. The State of Minnesota is working on full deployment of the software, and CMS is initiating a campaign to encourage other states to leverage the software. COECI estimates that the cost of designing and building the portal through crowdsourcing was one-sixth of what the effort would have cost using traditional software development methods.

Through the success of this and subsequent challenges, CMS is attempting to establish a new paradigm for crowdsourcing state and Federal information technology (IT) systems in a low-cost, agile manner by opening challenges to new players, small companies, and talented individual developers to build solutions which can “plug and play” with existing legacy systems or can operate in a shared, cloud-based environment.
As is always the nature of experiments, many early attempts failed. A few have worked and subsequently grown into sustainable applications, services, data sources, startups, processes and knowledge that can be massively scaled. Years ago, Micah Sifry predicted that the “gains from enabling a culture of open challenges, outsider innovation and public participation” in government were going to be huge. He was right.
Linked below are the administration’s official letters to the House and Senate, reporting the results of last year’s prizes.
On Monday, I delivered a short talk on data journalism, networked transparency, algorithmic transparency and the public interest at the Data & Society Research Institute’s workshop on the social, cultural & ethical dimensions of “big data.” The workshop was convened by the Data & Society Research Institute and hosted at New York University’s Information Law Institute, in cooperation with the White House Office of Science and Technology Policy, as part of an ongoing review of big data and privacy ordered by President Barack Obama.
Video of the talk is below, along with the slides I used. You can view all of the videos from the workshop, along with the public plenary on Monday evening, on YouTube or at the workshop page.
Here’s the presentation, with embedded hyperlinks to the organizations, projects and examples discussed:
Today, the Center for Effective Government released a scorecard for access to information from the 15 United States federal government agencies that received the most Freedom of Information Act (FOIA) requests, focusing upon an analysis of their performance in 2013.
The results of the report (PDF) weren’t pretty: if you computed a grade point average from this open government report card (and I did), the federal government would receive a D for its performance. Seven agencies failed outright, with the State Department receiving the worst grade (37%).
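As a sketch of the arithmetic behind that D: convert each agency's percentage score to a letter grade on a standard scale, map letters to grade points, and average. The agency names and scores below are hypothetical stand-ins, not the scorecard's actual data (though a 37% would indeed map to an F on this scale).

```python
# Sketch of computing a GPA from percentage grades, as described in the post.
# Agency names and scores here are hypothetical.
def letter(score):
    """Map a percentage score to a letter grade on a standard 90/80/70/60 scale."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

scores = {"Agency X": 81, "Agency Y": 62, "Agency Z": 37}
gpa = sum(GRADE_POINTS[letter(s)] for s in scores.values()) / len(scores)
print(f"GPA: {gpa:.2f}")  # (3.0 + 1.0 + 0.0) / 3 -> GPA: 1.33
```

Averaged over the 15 agencies in the actual scorecard, the same computation lands in D territory.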
The grades were based upon:
- How well agencies processed FOIA requests, including the rate of disclosure, fullness of information provided, and timeliness of the response
- How well the agencies established rules of information access, including the effectiveness of agency policies on withholding information and communications with requestors
- Creating user-friendly websites, including features that facilitate the flow of information to citizens, associated online services, and up-to-date reading rooms
The report is released at an interesting historic moment for the United States, with Sunshine Week just around the corner. The United States House of Representatives just unanimously passed a FOIA Reform Act that is substantially modeled upon the Obama administration’s proposals for FOIA reforms, advanced as part of the second National Open Government Action Plan. If the Senate takes up that bill and passes it, it would be one of the most important, substantive achievements in institutionalizing open government beyond this administration.
Citizens for Responsibility and Ethics in Washington (CREW) has disputed the accuracy of this scorecard, based upon the high rating for the Department of Justice. CREW counsel Anne Weismann wrote:
It is appropriate and fair to recognize agencies that are fulfilling their obligations under the FOIA. But CEG’s latest report does a huge disservice to all requesters by falsely inflating DOJ’s performance, and ignoring the myriad ways in which that agency — a supposed leader on the FOIA front — ignores, if not flouts, its obligations under the statute.
Last Friday, I spoke with Sean Moulton, the director of open government policy at the Center for Effective Government, about the contents of the report and the state of FOIA in the federal government, from the status quo to what needs to be done. Our interview, lightly edited for content and clarity, follows.
What was the methodology behind the report?
Moulton: Our goal was to keep this very quantifiable, very exact, and to try and lay out some specifics. We thought about what components were necessary for a successful FOIA program. The processing numbers that come out each year are a very rich area for data. They’re extremely important: if you’re not processing quickly and releasing information, you can’t be successful, regardless of other components.
We did think that two other areas are important. First, online services. Let’s face it, the majority of us live online in a big way. It’s a requirement now for agencies to be living there as well. Then, the rules: they explain to the agencies and the public how things will be handled when a request comes in. A lot of the agencies have outdated rules. Their current practices may be different, and they may be doing things that the rules don’t say they have to, but without the rules, they may stop. Consistent rules are essential for consistent long-term performance.
A few months back, we released a report that laid out what we felt were best practices for FOIA regulations. We went through a review of dozens of agencies’ FOIA regulations and identified key issues, such as communicating with the requester, how you manage confidential business information, how you handle appeals, and how you handle timelines. Then we found, inside existing regulations, the best ways these were being handled. It really helped us here, when we got to the rules. We used that as our roadmap; we knew agencies were already doing these things and making that commitment. The main things we measured under the rules were the items from that best practices report that were already common. If things were universal, we didn’t want to call them a best practice, but a normal practice.
Is FOIA compliance better under the Obama administration, more than 4 years after the Open Government Directive?
Moulton: In general, I think FOIA is improving in this administration. Certainly, the administration itself is investing a great deal of energy and resources in trying to make greater improvements in FOIA, but it’s challenging. None of this has penetrated into national security issues.
I think it’s more of a challenge than the administration thought it would be. It’s different from other things, like open data or better websites. The FOIA process has become entrenched. The biggest open government wins were in areas where they were breaking new ground. There wasn’t a culture or way of doing this or problems that were inherited. They were building from the beginning. With FOIA, there was a long history. Some agencies may see FOIA as some sort of burden, and not part of their mission. They may think of it as a distraction from their mission, in fact. When the Department of Transportation puts out information, it usually gets used in the service of their mission. Many agencies haven’t internalized that.
There’s also the issue of backlogs, bureaucracy, lack of technology or technology that doesn’t work that well — but they’re locked into it.
What about redaction issues? Can you be FOIA compliant without actually honoring the intent of the request?
Moulton: We’re very aware of this as well. The data is just not there to evaluate that. We wish it were. The most you get right now is “fully granted” or “partly granted.” That’s incredibly vague. You can redact 99% or 1% and claim it’s partially redacted, either way. We have no indicator and no data on how much is being released. It’s frustrating, because something like that would help us get a better sense of whether agencies would benefit from new policies.
We do know that the percentage of full grants has dropped every year, for 12 years, from the Clinton administration all the way through the Bush administration to today. It’s such a gray area. It’s hard to say whether it’s a terrible thing or a modest change.
Has the Obama administration’s focus on open government made any difference?
Moulton: I think it has. There were a couple of agencies that got together on FOIA reform. The EPA led the team, with the U.S. National Archives and the Commerce Department, to build a new FOIA tool. The outward-facing part of the tool enables a user to go to a single spot to submit a request and track it. Other people can come and search FOIA’ed documents. Behind the scenes, federal workers can use the tool to forward requests back and forth. This fits into what the administration has been trying to do: using technology better in government.
Another example, again at the EPA, is a proactive disclosure website they put together. They got a lot of requests for things like inquiries about properties and environmental history, such as leaks and spills, so they set up a site where you could look up real estate. They did this because they went through their FOIA requests to see what people wanted. That has cut down their requests by a certain percentage.
Has there been increasing FOIA demand in recent years, affecting compliance?
Moulton: I do think FOIA requests have been increasing. We’ll see what this next year of data shows. We have seen a pretty significant increase, after a significant decrease in the Bush administration. That may be because this administration keeps speaking about open government, which leads to more hopeful requestors. We fully expect that in 2013, there will be more requests than the prior year.
DHS gets the biggest number of all, but that’s not surprising when we look at the size of it. It’s the second biggest agency, after Defense, and the biggest domestic-facing agency. When you start talking about things like immigration and FEMA, which go deep into communities and people’s lives in ways that have a lot of impact, that makes sense.
What about the Department of Justice’s record?
Moulton: Well, DoJ got the second highest rating, but we know they have a mixed record. There are things you can’t measure and quantify, in terms of culture and attitude. I do know there were concerns about the online portal, in terms of the turf war between agencies. There were concerns about whether the tech was flexible, in terms of meeting all agency needs. If you want to build a government-wide tool, it needs to have real flexibility. The portal changed the dialogue entirely.
Is FOIA performance a sufficient metric to analyze any administration’s performance on open government?
Moulton: We should step back further and look at the broader picture, if we’re going to talk about open government. This administration has done things, outside of FOIA, to try to open up records and data. They’ve built better online tools for people to get information. You have to consider all of those things.
Does that include efforts like the Intelligence Community Tumblr?
Moulton: That’s a good example. One thing this administration did early on is to identify social media outlets. We should be going there. We can’t make citizens come to us. We should go to where people are. The administration pushed early on that agencies should be able to use Tumblr and Twitter and Facebook and Flickr and so on.
Is this social media use “propaganda,” as some members of the media have suggested?
Moulton: That’s really hard to decide. I think it can result in that. It has the potential to be misused to sidestep the media, and not have good interaction with the media, which is another important outlet. People get a lot of their information from the media. Government needs to have a good relationship with it.
I don’t think that’s the intention, though, just as under Clinton, when they started setting up websites for the first time. That’s what the Internet is for: sharing information. That’s what social media can be used for, so let’s use what’s there.
— For Effective Gov (@ForEffectiveGov) March 10, 2014