USASpending.gov addresses some data issues, adds GitHub issues tracker for feedback


On April 1st, some reporters, open government advocates and people in industry may have hoped that a new redesign of USASpending.gov, the flagship financial transparency website of the United States government, was just a poorly conceived April Fools' joke. Unfortunately, an official statement about the USASpending.gov redesign on the U.S. Treasury's blog confirmed that the redesign was real. Analysts, media and businesses that rely on the site's contracting data loudly decried its decreased functionality.

A week later, there's still no evidence of deliberate intent on the part of Treasury to stop publishing accurate spending data or to break the tool, despite headlines about rolling back transparency. Rather, it looks more likely that a number of mistakes, or even unavoidable errors, were made in transitioning the site and data away from a bankrupt federal contractor. There was certainly poor communication with the business community and the advocates who use the site, a failure that Luke Fretwell, writing at Govfresh, helpfully suggested other government agencies work to avoid next time.

Today, as Fretwell first reported, the federal government launched a new repository for tracking issues with USASpending.gov on GitHub, the social coding site that has become an increasingly important platform for 18F, which committed to developing free and open source software by default last year.

In an email to the White House’s open government Google Group, Corinna Zarek, the senior advisor for open government in the Obama administration, followed up on earlier concerns about the redesign:

The USAspending team has been working to improve the usability of the site and has made some great strides to make it easier for average citizens to navigate information. But at the same time, we all understand that some of our expert users (like a lot of you) seek more technical information and the team is striving to meet your needs as well.

This is definitely a work in progress so please keep working with the team as it iterates on the best ways to improve function of the site while maintaining the content you seek. Your initial comments have been really helpful and the USAspending team is already working to address some of them.

Zarek also said that several of the data problems people reported have been addressed, including the capacity to download larger data sets and to define specific dates in searches, and asked for more feedback.

Specifically, this week the team addressed data export issues to allow users to specify date ranges when downloading data, added the bulk file format API, and modified the download capability so larger datasets can be downloaded. Additionally, data archives are being added continually. This week, they loaded the 2014 and 2015 delta files that show the new transactions in the last month. You can keep track of the ongoing improvements on the “What’s new” page.

Please keep sharing your feedback and continue working with the USAspending team as it makes improvements to the site. You can do this through the site’s contact page or on the new GitHub page, where you can report issues and track them in the open.

If you find bugs, let the feds know about them on GitHub so that everyone can see the issues and how they’re addressed. As Mollie Walker reported for FierceGovernmentIT, some functionality has yet to be restored.
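The date-range export capability described above can be approximated locally once a bulk file has been downloaded. The sketch below is only an illustration, not the actual USASpending export API: the record layout and the field names (`action_date`, `amount`) are assumptions invented for the example.

```python
from datetime import date

def filter_by_date_range(records, start, end):
    """Keep only records whose action_date falls within [start, end].

    `records` is a list of dicts with an ISO-format 'action_date' key,
    a hypothetical layout rather than the real USASpending schema.
    """
    return [r for r in records
            if start <= date.fromisoformat(r["action_date"]) <= end]

# Invented sample transactions for illustration only.
transactions = [
    {"action_date": "2015-01-15", "amount": 120000},
    {"action_date": "2015-02-20", "amount": 45000},
    {"action_date": "2014-12-01", "amount": 98000},
]

january = filter_by_date_range(transactions,
                               date(2015, 1, 1), date(2015, 1, 31))
print(len(january))  # 1 record falls in January 2015
```

The same filter logic, applied server-side, is what lets the site return only the slice of data a user asks for instead of the full archive.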

[Image Credit: Govfresh, via USASpending.gov]

U.S. government launches online traffic analytics dashboard for federal websites

There are roughly 1,361 .gov domains operated by the executive branch of the United States federal government, 700-800 of which are live and in active use. Today, for the first time, the public can see how many people are visiting 300 executive branch government domains in real time, including every cabinet department, by visiting analytics.usa.gov.

According to a post on the White House blog, the United States Digital Service “will use the data from the Digital Analytics Program to focus our digital service teams on the services that matter most to the American people, and analyze how much progress we are making. The Dashboard will help government agencies understand how people find, access, and use government services online to better serve the public – all while protecting privacy.  The program does not track individuals. It anonymizes the IP addresses of all visitors and then uses the resulting information in the aggregate.”
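The anonymize-then-aggregate approach the White House describes can be illustrated with a short sketch. The exact method the Digital Analytics Program uses is not spelled out in the post; zeroing the final octet of an IPv4 address, as below, is one common masking technique and is offered purely as an illustration, with invented sample addresses.

```python
from collections import Counter

def anonymize_ipv4(ip):
    """Zero the last octet so no individual visitor can be singled out."""
    octets = ip.split(".")
    octets[-1] = "0"
    return ".".join(octets)

# Hypothetical raw visit log; only the anonymized form is ever aggregated.
visits = ["203.0.113.7", "203.0.113.42", "198.51.100.9"]
traffic = Counter(anonymize_ipv4(ip) for ip in visits)

print(traffic)  # Counter({'203.0.113.0': 2, '198.51.100.0': 1})
```

Once the addresses are masked, only the aggregate counts survive, which is what makes a public real-time dashboard compatible with the program's no-individual-tracking promise.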

On Thursday morning, March 19th, tax-related services, weather, and immigration status are all popular. Notably, there’s an e-petition on the White House WeThePeople platform listed as well, adding data-driven transparency to what’s popular there right now.

Former United States deputy chief technology officer Nick Sinai is excited about seeing the Web analytics data opened up online. Writing for the Harvard Shorenstein Center, where he is currently a fellow, Sinai adds some context for the new feature:

“Making government web performance open follows the digital services playbook from the new U.S. Digital Services,” he wrote. “Using data to drive decisions and defaulting to open are important strategies for building simple and useful citizen-facing digital services. Real-time and historical government web performance is another example of how open government data holds the promise of improving government accountability and rebuilding trust in government.”

Here’s what the U.S. digital services team says they’ve already learned from analyzing this data:


  • Our services must work well on all devices. Over the past 90 days, 33% of all traffic to our sites came from people using phones and tablets. Over the same period last year, the number was 24%. Most of this growth came from an increase in mobile traffic. Every year, building digital services that work well on small screens becomes more important.
  • Seasonal services and unexpected events can cause surges in traffic. As you might expect, tax season is a busy time for the IRS. This is reflected in visits to pages on IRS.gov, which have more than tripled in the past 90 days compared with the previous quarter. Other jumps in traffic are less easy to predict. For example, a recently announced settlement between AT&T and the Federal Trade Commission generated a large increase in visits to the FTC’s website. Shortly after the settlement was announced, FTC.gov had four times more visitors than during the same period in the previous year. These fluctuations underscore the importance of flexibility in the way we deploy our services so that we can scale our web hosting to support surges in traffic as well as save money when our sites are less busy.
  • Most people access our sites using newer web browsers. How do we improve digital services for everyone when not all web browsers work the same way? The data tells us that the percentage of people accessing our sites using outdated browsers is declining steadily. As users adopt newer web browsers, we can build services that use modern features and spend less time and money building services that work on outdated browsers. This change will also allow us to take advantage of features found in modern browsers that make it easier to build services that work well for Americans with disabilities, who access digital services using specialized devices such as screen readers.

If you have ideas, feedback or questions, the team behind the dashboard is working in the open on GitHub.

Over the coming months, we will encourage more sites to join the Digital Analytics Program, and we’ll include more information and insights about traffic to government sites with the same open source development process we used to create the Dashboard. If you have ideas for the project, or want to help improve it, let us know by contributing to the project on GitHub or emailing digitalgov@gsa.gov.

That last bit is notable; as is true of all the projects that 18F works on, this analytics dashboard is open source software.

There are some interesting additional details in 18F’s blog post on how the analytics dashboard was built, including the estimate that it was built “over the course of 2-3 weeks” with usability testing at a “local civic hacking meetup.”

First, that big number is made with HTML and D3, a JavaScript library, which downloads and renders the data. Using open standards means it renders well across browsers and mobile devices.

Second, 18F made an open source tool to manage the data reporting process called “analytics-reporter” that downloads Google Analytics reports and transforms that data into JSON.
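The transformation analytics-reporter performs can be sketched roughly as follows. The real tool is a Node.js project and the Google Analytics response format is considerably richer; the flat input shape below (parallel dimension and metric headers plus rows) and the sample page names are simplifying assumptions for illustration.

```python
import json

def report_to_json(report):
    """Flatten a Google-Analytics-style report into plain JSON records.

    The input shape here is a simplified stand-in for the real API
    response: dimension and metric names line up with row columns.
    """
    headers = report["dimensions"] + report["metrics"]
    return [dict(zip(headers, row)) for row in report["rows"]]

# Invented sample report for illustration.
report = {
    "dimensions": ["page"],
    "metrics": ["visits"],
    "rows": [["/petition/123", 5400], ["/forms/w-4", 3100]],
}

print(json.dumps(report_to_json(report), indent=2))
```

The appeal of the design is that once the reports are plain JSON files, the dashboard's front end only needs to fetch and render static data, with no analytics credentials anywhere near the public site.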

Hopefully, in the years ahead, the American people will see more than traffic to .gov websites: they’ll see concrete performance metrics for digital services, like those the United Kingdom’s Government Digital Service publishes at gov.uk/performance, including uptime, completion rate and satisfaction rate.

In the future, if the public can see the performance of Healthcare.gov, including glitches, or other government digital services, perhaps the people building and operating them will have more accountability for uptime and quality of service.

National Security Archive finds 40% E-FOIA compliance rate in federal government agencies


For Sunshine Week 2015, the National Security Archive conducted an audit of how well 165 federal government agencies in the United States of America comply with the E-FOIA Act of 1996. They found that only 67 of them had online libraries that were regularly updated with a significant number of documents released under the Freedom of Information Act. The criteria for the 165 agencies were that they had to have a chief Freedom of Information Officer and components that handled more than 500 FOIA requests annually.

Almost two decades after the E-FOIA Act, that’s about a 40% compliance rate. I wonder if the next U.S. Attorney General or the next presidential administration will make improving on this poor performance a priority. It’s important for the United States Department of Justice to not only lead by example but push agencies into the 21st century when it comes to the Freedom of Information Act.

It would certainly help if Congress passed FOIA reform.

On that count, the Archive highlights a relevant issue in the current House and Senate FOIA reform bills in Congress: the FOIA statute states that documents that are “likely to become the subject of subsequent requests” should be published in electronic reading rooms:

“The Department of Justice’s Office of Information Policy defines these records as “frequently requested records… or those which have been released three or more times to FOIA requesters.” Of course, it is time-consuming for agencies to develop a system that keeps track of how often a record has been released, which is in part why agencies rarely do so and are often in breach of the law. Troublingly, both the current House and Senate FOIA bills include language that codifies the instructions from the Department of Justice.

The National Security Archive believes the addition of this “three or more times” language actually harms the intent of the Freedom of Information Act as it will give agencies an easy excuse (“not requested three times yet!”) not to proactively post documents that agency FOIA offices have already spent time, money, and energy processing. We have formally suggested alternate language requiring that agencies generally post “all records, regardless of form or format that have been released in response to a FOIA request.”

This is a point that Members of Congress should think through carefully as they take another swing at reform. As I’ve highlighted elsewhere, the FOIA requests that industry makes are an important demand signal showing where data with economic value lies. (It’s also where the public interest tends to lie, with respect to FOIA requests from the media.)

While it’s true that it would take time and resources to build and maintain a system that tracks such requests by industry, there should already be a money trail from the fees paid to the agency. If FOIA reform leads to modernizing how it’s implemented, perhaps tying FOIA.gov to Data.gov might finally take place. The datasets that are the subject of the most FOIA requests are the ones that should be prioritized for proactive disclosure online.

Adding a component that identifies which data sets are frequently requested, particularly those requested periodically, should be a priority across the board for any administration that seeks to “manage information as an asset.” Adding the volume and periodicity of requests to the expanding national enterprise data inventory might naturally follow. It’s worth noting, too, that reform of the FOIA statute may not be necessary to achieve this end, if the 18F team working on modernizing FOIA software worked on it.
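The “three or more times” rule discussed above is trivial to express in code, which underscores the Archive's point: the hard part is the sustained record-keeping, not the logic. A minimal sketch, with an invented release log and document IDs:

```python
from collections import Counter

RELEASE_THRESHOLD = 3  # DOJ's "released three or more times" rule

def frequently_requested(release_log):
    """Given a log of document IDs released under FOIA, return the set
    that crosses the threshold and so should be posted proactively."""
    counts = Counter(release_log)
    return {doc for doc, n in counts.items() if n >= RELEASE_THRESHOLD}

# Hypothetical log of FOIA releases, newest last.
log = ["contract-2014-07", "budget-q3", "contract-2014-07",
       "budget-q3", "contract-2014-07"]
print(frequently_requested(log))  # {'contract-2014-07'}
```

The Archive's proposed alternative, posting all released records regardless of count, would eliminate even this bookkeeping; the threshold version exists only because the DOJ guidance requires counting.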

In a step towards sunlight, United States begins to publish a national data inventory

Last year, a successful Freedom of Information request by the Sunlight Foundation for the United States enterprise data inventory was a big win for open government, nudging Uncle Sam towards a better information policy through some creative legal arguments. Today, the federal government started releasing its enterprise data inventories at data.gov. You can browse the data for individual agencies, like the feed for the Office of Personnel Management, using a JSON viewer like this one.
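Each agency feed follows the Project Open Data metadata schema: a single data.json document whose "dataset" array holds one entry per data set. A minimal sketch of reading such a catalog follows; the catalog is inlined here rather than fetched over the network, and the dataset titles are invented for illustration.

```python
import json

# A tiny, invented catalog in the Project Open Data data.json shape.
catalog_text = """
{
  "conformsTo": "https://project-open-data.cio.gov/v1.1/schema",
  "dataset": [
    {"title": "Federal Employment Statistics", "accessLevel": "public"},
    {"title": "Personnel Records Index", "accessLevel": "non-public"}
  ]
}
"""

catalog = json.loads(catalog_text)

# The inventory is useful precisely because it lists non-public holdings
# too; here we pull out just the datasets marked public.
public = [d["title"] for d in catalog["dataset"]
          if d["accessLevel"] == "public"]
print(public)  # ['Federal Employment Statistics']
```

Pointing the same few lines at a live agency feed instead of the inline string is all a journalist or civic technologist needs to start comparing what agencies hold against what they publish.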

“Access to this data will empower journalists, government officials, civic technologists, innovators and the public to better hold government accountable,” said Sunlight Foundation president Chris Gates, in a statement. “Previously, it was next to impossible to know what and how much data the government has, and this is an unprecedented window into its internal workings. Transparency is a bedrock principle for democracy, and the federal government’s response to Sunlight’s Freedom of Information request shows a strong commitment to open data. We expect to see each of these agencies continue to proactively release their data inventories.”

Understanding what data an organization holds is a critical first step in deciding how it should be stored, analyzed or published, shifting towards thinking about data as an asset. That’s why President Barack Obama’s executive order requiring federal agencies to catalog the data they have was a big deal. When that organization is a democratic government and the data in question was created using taxpayer funds, releasing the inventory of the data sets that it holds is a basic expression of open and accountable government.

2014 Open Data Index shows global growth of open data, but low overall openness

Today, Open Knowledge released its global 2014 Open Data Index, refreshing its annual measure of the accessibility and availability of government releases of data online. Compared year over year, these indices have shown not only the relative openness of data between countries but also the slow growth in the number of open data sets. Overall, however, the nonprofit found that the percentage of open datasets across all 97 surveyed countries (up from 63 in 2013) remained low, at only 11%.

“Opening up government data drives democracy, accountability and innovation,” said Rufus Pollock, the founder and president of Open Knowledge, in a statement. “It enables citizens to know and exercise their rights, and it brings benefits across society: from transport, to education and health. There has been a welcome increase in support for open data from governments in the last few years, but this year’s Index shows that real progress on the ground is too often lagging behind the rhetoric.”

The map below can be explored in interactive form at the Open Knowledge website.


Open Knowledge also published a refreshed ranking of countries. The United Kingdom remains atop the list, followed by Denmark and France, which moved up from number 12 in 2013. India moved into the top 10, from #27, after the relaunch of its open data platform.


Despite the rhetoric emanating from Washington, the United States is ranked at number 8, primarily due to deficiencies in open data on government spending and the lack of an open register of companies. Implementation of the DATA Act may help, as would the adoption of an open corporate identifier by the U.S. Treasury.

Below, in an interview from 2012, Pollock talks more about the relationship between open data and open government.

More details and discussion are available at the Open Knowledge blog.

Thoughts on the future of the US CIO, from capabilities to goals


This weekend, ZDNet columnist Mike Krigsman asked me what I thought of the tenure of United States chief information officer Steven VanRoekel and, more broadly, what I thought of the role and meaning of the position in general. Here’s VanRoekel’s statement to the press via Federal News Radio:

“When taking the job of U.S. chief information officer, my goal was to help move federal IT forward into the 21st Century and to bring technology and innovation to bear to improve IT effectiveness and efficiency. I am proud of the work and the legacy we will leave behind, from launching PortfolioStat to drive a new approach to IT management, the government’s landmark open data policy to drive economic value, the work we did to shape the mobile ecosystem and cloud computing, and the culmination of our work in the launch of the new Digital Service, we have made incredible strides that will benefit Americans today and into the future,” VanRoekel said in a statement. “So it is with that same spirit of bringing innovation and technology to bear to solve our most difficult problems, that I am excited to join USAID’s leadership to help stop the Ebola outbreak. Technology is not the solution to this extremely difficult task but it will be a part of the solution and I look forward to partnering with our federal agencies, non-profit organizations and private sector tech communities to help accelerate this effort.”

Here’s the part of what I told Krigsman that ended up being published, with added hyperlinks for context:

As US CIO, Steven VanRoekel was a champion of many initiatives that improved how technology supports the mission of the United States government. He launched an ambitious digital government strategy that moved further towards making open data the default in government, championed the launch of the U.S. Digital Service, 18F, and the successful Presidential Innovation Fellows program, and improved management of some $80 billion in annual federal technology spending through PortfolioStat.

As was true for his predecessor, he was unable to create fundamental changes in the system he inherited. Individual agencies still have accountability for how money is spent and how projects are managed. The nation continues to see too many government IT projects that are over-budget, don’t work well, and use contractors with a core competency in getting contracts rather than building what is needed.

The U.S. has been unable or unwilling to reorganize and fundamentally reform how the federal government supports its missions using technology, including its relationship to incumbent vendors who fall short of efficient delivery using cutting-edge tech. The 113th Congress has had opportunities to craft legislative vehicles to improve procurement and the power of agency CIOs but has yet to pass FITARA or RFP-IT. In addition, too many projects still look like traditional enterprise software rather than consumer-facing tools, so we have a long way to go to achieve the objectives of the digital playbook VanRoekel introduced.

There are great projects, public servants and pockets of innovation throughout the federal government, but culture, hiring, procurement, and human resources remain serious barriers that continue to result in IT failures. The next U.S. CIO must be a leader in all respects, leading by example, inspiring others, and exercising political skill. It’s a difficult job and one for which it is hard to attract world-class talent.

We need a fundamental shift in the system rather than significant tweaks, in areas such as open source and using the new Digital Service as a tool to drive change. The next US CIO must have experience managing multi-billion dollar budgets and be willing to pull the plug on wasteful or mismanaged projects that serve the needs of three years ago, not the future.

In a win for open government advocacy, DC removes flaws in its municipal open data policy

Update:

It’s a good day for open government in the District of Columbia. Today, DC’s Office of the Chief Technology Officer (OCTO) updated the Terms and Conditions of Use for DC.gov and the city’s new open data platform, addressing some of the concerns that the Sunlight Foundation and Code for DC expressed about the new open data policy introduced in July. The updated terms and conditions rolled out onto the city’s digital civic architecture this afternoon.

“Today’s changes are really focused on aligning DC.Gov’s Terms and Conditions of Use with the new open data and transparency policy released this summer,” explained Mike Rupert, the communications director for OCTO, in an interview. “The site’s T&C hadn’t been updated in many years,” according to Rupert. The new T&C will apply to DC.gov, the open data platform and other city websites.

“It is encouraging that DC is taking steps toward considering feedback and improving its Terms and Conditions, but there is still room for improvement in the broader scope of DC’s policies,” said Alisha Green, a policy associate with the Sunlight Foundation’s local policy team. “We hope those implementing DC’s new open data policy will actively seek stakeholder input to improve upon what the policy requires. The strength of the policy will be in its implementation, and we hope DC will take every opportunity to make that process as open, collaborative and impactful as possible.”

So, OCTO both heard and welcomed the feedback from open government advocates regarding the policy and agreed that the policy implications of the terms and conditions were problematic. “Certain elements of the previous Terms and Conditions of Use (Indemnity, Limitation of Liability) could have chilled the understanding of the public’s right to access and have been eliminated,” said Rupert.

The sections that prompted civic hacker Josh Tauberer to wonder whether he needed a lawyer to hack in DC are simply gone, specifically the Indemnity and Limitation of Liability sections. Other sections, however, still remain. The revised policy I obtained before the updated terms and conditions went online differs in a couple of ways from the one that just went online: the Registration section remains, as does the Conduct section, although DC eliminated the 11 specific examples from the latter. That said, it’s better, and that’s a win. While District officials remain cautious about how and where reuse might occur, they’re going to at least let the data flow without a deeply flawed policy prescription. “While we want to be mindful of and address the potential for harm to or misuse of District government information and data, the Terms and Conditions of Use should promote the new open data and transparency philosophy in a more positive manner,” said Rupert.

Sharp-eyed readers of the new policy, however, will note that DC’s open data and online information has now been released to the public under a Creative Commons license, specifically Attribution 3.0 United States. That means that anyone who uses DC’s open data is welcome to “Share — copy and redistribute the material in any medium or format” and “Adapt — remix, transform, and build upon the material” for any purpose, even commercially, as long as they provide attribution: “You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.” When asked about the CC license choice, Rupert said that “the new copyright language from Creative Commons – which as you know is becoming the international standard – better states the overriding principle of the public’s right to web content and data.”

That did not sit entirely well with open government advocates who hold that making open data license-free is a best practice. Asked for comment, Tauberer emailed the following statement in response to the draft of the revision, welcoming the District’s responsiveness but questioning the premise of the District of Columbia having any “terms and conditions” for the public’s use of open government data at all.

The new terms drop the most egregious problems, but these terms still don’t count as “open.” Should I expect a lawsuit if I don’t tip my hat and credit the mayor every time I use the data we taxpayers paid to create? Until the attribution requirement is dropped, I will recommend that District residents get District data through Freedom of Information Act requests. It might take longer, but it will be on the people’s terms, not the mayor’s. It’s not that the District shouldn’t get credit, but the District shouldn’t demand it and hold civil and possibly criminal penalties over our heads to get it. For instance, yesterday Data.gov turned their attribution requirement into a suggestion. That’s the right way to encourage innovation. All that said, I appreciate their responsiveness to our feedback. Tim from DC GIS spent time at Code for DC to talk about it a few weeks ago, and I appreciated that. It is a step in the right direction, albeit one deaf to our repeated explanation that “open” does not mean “terms of use.”

The good news is that DC’s OCTO is listening and has committed to being responsive to future concerns about how it’s handling DC’s online presences and policies. “Several of your questions allude to the overall open data policy and we will definitely be reaching out to you and all other interested stakeholders as we begin implementing various elements of that policy,” said Rupert.

Update: On October 29th, DC updated its Terms and Conditions again, further improving them. Tauberer commented on the changes to the open data policy on his blog. In his view, the update represents a step forward and a step back:

In a new update to the terms posted today, which followed additional conversations with OCTO, there were two more great improvements. These terms were finally dropped:

  • agreeing to follow all “rules”, a very ambiguous term
  • the requirement to attribute the data to the District in all uses of the data (it’s now merely a suggestion)

The removal of these two requirements, in combination with the two removed in September, makes this a very important step forward.

One of my original concerns remains, however, and that is that the District has not granted anyone a copyright license to use District datasets. Data per se isn’t protected by copyright law, but the way a dataset is presented may be. The District has claimed copyright over its things before, and it remains risky to use District datasets without a copyright license. Both the September update and today’s update attempted to address this concern but each created more confusion than there was before.

Although today’s update mentions the CC0 public domain dedication, which would be the correct way to make the District data available, it also explicitly says that the District retains copyright:

  • The terms say, at the top, that they “apply only to . . . non-copyrightable information.” The whole point is that we need a license to use the aspects of the datasets that are copyrighted by the District.
  • Later on, the terms read: “Any copyrighted or trademarked content included on these Sites retains that copyright or trademark protection.” Again, this says that the District retains copyright.
  • And: “You must secure permission for reuse of copyrighted … content,” which, as written (but probably not intended), seems to say that to the extent the District datasets are copyrighted, data users must seek permission to use it first. (Among other problems, like side-stepping “fair use” in copyright law.)

With respect to the copyright question, the new terms document is a step backward because it may confuse data users into thinking the datasets have been dedicated to the public domain when in fact they haven’t been.

This post has been updated with comments from Tauberer and the Sunlight Foundation.

On its 3rd anniversary, opportunities and challenges for the Open Government Partnership


In 2010, President Barack Obama spoke to the United Nations General Assembly about open government. “The common thread of progress is the principle that government is accountable to its citizens,” he said, “and the diversity in this room makes clear — no one country has all the answers, but all of us must answer to our own people.”

In all parts of the world, we see the promise of innovation to make government more open and accountable.  And now, we must build on that progress.  And when we gather back here next year, we should bring specific commitments to promote transparency; to fight corruption; to energize civic engagement; to leverage new technologies so that we strengthen the foundations of freedom in our own countries, while living up to the ideals that can light the world.

Open government, said Samantha Power, now the U.S. ambassador to the United Nations, could have a global impact.

In 2011, the historic Open Government Partnership launched in New York City, hailed by the Economist as a fresh approach to parting the red tape. “The partnership is really the first time that there is a multilateral platform to address these issues,” said Maria Otero, former under secretary of state for democracy and global affairs at the United States State Department. “The partnership could have focused on having countries come in and present best practices and exchange ideas and then just go home.”

“The partnership is really focused on first having countries participate that have already demonstrated interest in this area and have already put in place a number of specific things and the material laid out, if you will, the minimum standards that are being requested. What the partnership really looks for is to provide a mechanism by which the countries can each develop their own national plans on ways to expand what they’re doing on transparency, accountability, and civic engagement, or to start new initiatives for them. That is really what is very different and important about this partnership, is that it is very action- and results-oriented.”

In 2012, the Open Government Partnership became a player on the world stage as it hosted a global gathering of national leaders and civil society at an annual meeting in Brazil, taking on the responsibilities and challenges that accompany that role, including pushing participants to submit missing action plans and progress reports, not just letters of commitment.

In January 2013, Power hailed the Open Government Partnership (OGP) as President Obama’s signature governance initiative:

It’s not about the abstraction about ‘fighting corruption’ or ‘promoting transparency’ or ‘harnessing innovation’ — it’s about ‘are the kids getting the textbooks they’re supposed to get’ or does transparency provide a window into whether resources are going where they’re supposed to go and, to the degree to which that window exists, are citizens aware and benefiting from the data and that information such that they can hold their governments accountable. And then, does the government care that citizens care that those discrepancies exist?

In May 2013, a seminal event in the evolution of OGP occurred when Russia withdrew from the Open Government Partnership:

If the dominant binary of the 21st century is between open and closed, Russia looks more interested in opting towards more controllable, technocratic options that involve discretionary data releases instead of an independent judiciary or freedom of assembly or the press. One of the challenges of the Open Government Partnership has always been the criteria that a country had to pass to join and then continue to be a member. Russia’s inclusion in OGP instantly raised eyebrows, doubts and fears last April, given rampant corruption in the public sector and Russia’s terrible record on press freedom. “Russia’s withdrawal from the OGP is an important reminder that open government isn’t easy or politically simple,” said Nathaniel Heller, executive director of Global Integrity. “While we don’t yet fully understand why Russia is leaving OGP, it’s safe to assume that the powers that be in the Kremlin decided that it was untenable to give reformers elsewhere in the Russian government the freedom to advance the open government agenda within the bureaucracy.”

In November 2013, the world may have hit ‘peak open’ at the OGP annual summit in London, despite the partnership’s members facing default states of closed.

Swirling underneath the professional glitz of an international summit were strong undercurrents of concern about its impact upon governments reluctant to cede power, reveal corruption or risk embarrassment upon disclosure of simple incompetence. The OGP summit took place at a moment when 21st century technology-fueled optimism had splashed up against the foundations of institutions created in the previous century. While the use of the Internet as a platform for collective action has grown, so too have attendant concerns about privacy and surveillance, in the wake of disclosures by NSA contractor Edward Snowden, where the same technologies that accelerated revolutions across the Middle East and North Africa are being used to capture and track the people advocating for change.

In 2014, the Open Government Partnership has matured and expanded, with France joining earlier in the year and Bosnia and Herzegovina bringing the total number of participating countries to 65 of the roughly 88 eligible countries worldwide. As OGP turns three, the partnership is celebrating the success of its expansion and looking ahead to its future, with a clearer mission and goals and an ambitious four-year strategy (PDF). The partnership is finally writing letters to countries that are not living up to their commitments, although the consequences for their continued participation if they do not comply remain to be seen.

The challenges and opportunities ahead for a partnership that provides a platform for civil society to hold government accountable are considerable, given the threats to civil society worldwide and the breathtaking changes brought about through technological innovation. Today, 10 national leaders will speak in New York City to mark OGP’s third anniversary. (I’ll be there to listen and share what I can.)

After the speeches end and the presidents and prime ministers return home, serious questions will remain regarding their willingness to put political capital behind reforms and take tough stands to ensure that their governments actually open up. Digital government is not open government, just as not all open data supports democratic reforms. As Mexico prepares to become lead co-chair of OGP, one element that didn’t make it into the challenges listed for the country is the state of press freedom in Mexico. As the Committee to Protect Journalists highlighted, open government is not sustainable without a free press. As long as the murders of journalists go unpunished in Mexico, the commitments and efforts of the Mexican national government will have to be taken in context.

Given this blog’s past stance that as press freedom goes, so too does open government, I’ve signed a petition urging the White House to explicitly support a right to report. Every other country that has committed to open government should do the same. Given OGP’s own challenges around the media and open government (PDF), I would also urge the partnership to make sure that press freedom and freedom of expression occupies a prominent place in its advocacy efforts in the years ahead.

Open government advocates: terms and conditions mean DC open data is fauxpen data

Earlier this summer, this blog covered the launch of the District of Columbia’s executive order on open government, open data policy, open data platform and online FOIA portal. Last week, the Sunlight Foundation laid out what DC should have done differently with its open data policy.

“The evolution of open data policies since 2006 provides a chance for stakeholders to learn from and build on what’s been accomplished so far,” wrote policy associate Alisha Green. “This summer, a new executive directive from Mayor Vincent Gray’s office could have taken advantage of that opportunity for growth. It fell far short, however. The scope, level of detail, and enforceability of the policy seem to reveal a lack of seriousness about making a significant improvement on DC’s 2006 memorandum.”

Green says that DC’s robust legal, technology and advocacy community’s input should have helped shape more of the policy and that “the policy should have been passed through the legislative, not executive, process.” Opportunities, missed.

Yesterday, civic hacker and Govtrack.us founder Joshua Tauberer took the critique one step further, crying foul over the terms of use in the DC data catalog.

“The specter of a lawsuit hanging over the heads of civic hackers has a chilling effect on the creation of projects to benefit the public, even though they make use of public data released for that express purpose,” he wrote. “How does this happen? Through terms of service, terms of use, and copyright law.”

The bottom line, in Tauberer’s analysis, is that the District of Columbia’s open data isn’t truly open. To put it another way, it’s fauxpen data.

“Giving up the right to take legal action and being required to follow extremely vague rules in order to use public data are not hallmarks of an open society,” writes Tauberer. “These terms are a threat that there will be a lawsuit, or even criminal prosecution, if civic hackers build apps that the District doesn’t approve of. It has been a long-standing tenet that open government data must be license-free in order to truly be open to use by the public. If there are capricious rules around the reuse of it, it’s not open government data. Period. Code for DC noted this specifically in our comments to the mayor last year. Data subject to terms of use isn’t open. The Mayor should update his order to direct that the city’s “open data” be made available a) without restriction and b) with an explicit dedication to the public domain.”

In the wake of these strong, constructive critiques, I posted an update in an online open government community asking what the chances are that DC public advocates, technologists, lawyers, wonks, librarians and citizens will log on to the DC government’s open government platform, where the order is hosted, and suggest changes to the problematic policy. So far, few have.

The issue also hasn’t registered as a serious concern for the outgoing administration of Mayor Vincent Gray, or in the mayoral campaign between Muriel Bowser and David Catania, who both sit on the DC Council.

The issues section of Bowser’s website contains a positive but short, vague commitment to “improved government”: “DC needs a government that works for the people and is open to the people,” it reads. “Muriel will open our government so that DC residents have the ability to discuss their concerns and make suggestions of what we can do better.”

By way of contrast, Catania published a 128-page platform online that includes sections on “democracy for the District” and “accountable government.” (Open data advocates, take note: the document was published on Scribd, not as plaintext or HTML.) The platform includes paragraphs on improving access to government information, presenting information in user-friendly formats, eradicating corruption and rooting out wasteful spending.

Those are all worthy goals, but I wonder whether Catania knows that the city’s current policy and the executive order undermine the ability and incentives for journalists, NGOs, entrepreneurs and the District’s residents to apply the information he advocates disclosing for the purposes intended.

Last week, I asked Bowser and Catania how their administrations would approach open data in the District.

To date, I’ve heard no reply. I’ve also reached out to DC’s Office of Open Government. If I hear from any party, I’ll update this post.

Update: In answer to a question I posed, the Twitter account for DC.gov, which manages DC’s online presence and the open data platform in question as part of the Office of the Chief Technology Officer, indicated that “new terms and conditions [were] coming shortly.” No further details were offered.

Chris Gates will be the new president of the Sunlight Foundation

As reported by Politico, Chris Gates will be the next president of the Sunlight Foundation, the Washington, DC-based nonprofit that advocates for open government and creates tools that empower people to improve government transparency and accountability around the world.

“I couldn’t be more excited to join the team at Sunlight to help advance their work to bring more accountability and transparency to our politics and our government,” said Gates, in a statement. “For those of us who care deeply about the health of our democracy, these are perilous times. Our political system is swimming in anonymous money and influence, and our federal government is paralyzed and unable to respond to the challenges of our times. Our hope is that the new tools, data and information generated by Sunlight will help break through this impasse. We look forward to working with others in the reform field to fix a system that clearly isn’t working.”

Gates is currently the executive director of Philanthropy for Active Civic Engagement, part of the Council on Foundations. Previously, he served as president of America’s oldest good government organization, the National Civic League.

“Chris, who is a thought leader in the fields of democratic theory and practice and political and civic engagement, has, for the past decade, been a leading voice for strengthening democratic processes and structures and developing new approaches to both engagement and decision-making,” wrote Ellen Miller, co-founder and current president of the Sunlight Foundation, at the organization’s blog. “He and I have been colleagues for the past several years — we’ve worked together on numerous occasions — and I am truly thrilled that he will become Sunlight’s president.”

Miller went on to say that Gates will “bring a breadth of experience and style of leadership that will take us to new levels.” This fall, the Sunlight Foundation will undergo the biggest transition of leadership since its founding: last week, Sunlight Labs director Tom Lee announced that he was leaving to work at DC-based Mapbox, with James Turk stepping up to assume responsibility for the nonprofit’s powerful technology resources and development team. I look forward to seeing how both men build out the civic infrastructure, reporting group, and advocacy shop that Sunlight has established since the organization opened its doors in 2006.

[Image Credit: Sunlight Foundation]