As David Moore, founder of PPF, put it, “AskThem is like a version of the White House’s ‘We The People’ petition platform, but for over 142,000 elected officials nationwide.”
The platform is an evolution from earlier attempts to ask questions of candidates for public office, like “10 Questions” from Personal Democracy Media, or the myriad online town halls that governors and the White House have been holding for years.
AskThem enables anyone to pose a question to any elected official or Verified Twitter account. Notably, the cleanly designed Web app uses geolocation to show users who represents them, in and of itself a valuable service.
As with e-petitions, AskThem users can then sign questions they support, voting them up and sharing them with their social networks. When a given question hits a preset threshold, the platform delivers it to the public figure and “encourages a public response.”
That last bit is key: there’s no requirement for someone to respond, for the response itself to be substantive, or for the public figure to act. There’s only the network effect of public pressure to make any of that happen.
After a year of development, Moore was excited to see the platform go live today, noting a number of precedents set in the process.
“I believe we’re the first open-source web app to support geolocation of elected officials, down to the municipal level, from street address,” he said, via email. “And I believe we’re the first to offer access to over 142,000 elected officials through our combined data sources. And I believe we’re the first to incorporate open government data for informed questions of elected officials at every level of government.”
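Moore doesn’t describe how AskThem’s geolocation works under the hood, but the core lookup problem is well understood: geocode a street address to a coordinate, then test which district boundary polygon contains it. A minimal sketch of that containment step (a standard ray-casting test), using made-up ward boundaries rather than any of AskThem’s actual data:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside polygon, a list of (lat, lon) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):  # edge straddles the point's longitude
            crossing_lat = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < crossing_lat:
                inside = not inside
    return inside

def find_district(lat, lon, districts):
    """districts: {name: boundary polygon}. Returns the first district containing the point."""
    for name, polygon in districts.items():
        if point_in_polygon(lat, lon, polygon):
            return name
    return None

# Made-up rectangular ward boundaries, for illustration only:
districts = {
    "Ward 1": [(38.90, -77.05), (38.90, -77.00), (38.95, -77.00), (38.95, -77.05)],
    "Ward 2": [(38.85, -77.05), (38.85, -77.00), (38.90, -77.00), (38.90, -77.05)],
}
print(find_district(38.92, -77.02, districts))  # Ward 1
```

Real municipal boundaries are far messier than rectangles, which is part of why covering 142,000 officials down to the municipal level is a notable first.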
AskThem goes online just in time for tomorrow’s day of action against mass surveillance, where over 5,000 websites will try to activate their users to contact their elected representatives in Washington. Whether it gets much use or not will depend on awareness of the new tool.
That could come through use by high-profile early adopters like Chris Hayes (@chrislhayes), of MSNBC’s “All In with Chris Hayes,” or OK Go, the popular band.
At launch, 66 elected officials nationwide have signed on to participate, though more may join if it catches on. In the meantime, you can use AskThem’s handy map to find local elected officials and see a listing of all of the questions to date across the USA — or pose your own.
The Obama administration announced significant adoption of the Blue Button in the private sector today. In a post at the White House Office of Science and Technology Policy blog, Nick Sinai, U.S. deputy chief technology officer, and Adam Dole, a Presidential Innovation Fellow at the U.S. Department of Health and Human Services, listed major pharmacies and retailers joining the Blue Button initiative, which enables people to download a personal health record in an open, machine-readable electronic format:
“These commitments from some of the Nation’s largest retail pharmacy chains and associations promise to provide a growing number of patients with easy and secure access to their own personal pharmacy prescription history and allow them to check their medication history for accuracy, access prescription lists from multiple doctors, and securely share this information with their healthcare providers,” they wrote.
“As companies move towards standard formats and the ability to securely transmit this information electronically, Americans will be able to use their pharmacy records with new innovative software applications and services that can improve medication adherence, reduce dosing errors, prevent adverse drug interactions, and save lives.”
While I referred to the Blue Button obliquely at ReadWrite almost two years ago and in many other stories, I can’t help but wish that I’d finished my feature for Radar a year ago and written up a full analytical report. Extending access to a downloadable personal health record to millions of Americans has been an important, steady shift that has largely gone unappreciated, despite reporting like Ina Fried’s regarding veterans getting downloadable health information. According to the Office of the National Coordinator for Health IT, “more than 5.4 million veterans have now downloaded their Blue Button data and more than 500 companies and organizations in the private-sector have pledged to support it.”
As I’ve said before, data standards are the railway gauges of the 21st century. When they’re agreed upon and built out, remarkable things can happen. This is one of those public-private initiatives that has taken years to bear fruit and stands to substantially improve the lives of so many people. This one started with something simple, when the administration gave military veterans the ability to download their own health records from MyMedicare.gov and My HealtheVet, and scaled progressively to Medicare recipients, Aetna and other players from there.
There have been bumps and bruises along the way, from issues with the standard to concerns about lost devices, but this news of adoption by places like CVS suggests the Blue Button is about to go mainstream in a big way. According to the White House, “more than 150 million Americans today are able to use Blue Button-enabled tools to access their own health information from a variety of sources including healthcare providers, health insurance companies, medical labs, and state health information networks.”
Notably, HHS has ruled that doctors and clinics that implement the new “BlueButton+” specification will be meeting the requirements of “View, Download, and Transmit (V/D/T)” in Meaningful Use Stage 2 for electronic health records under the HITECH Act, meaning they can apply for reimbursement. According to ONC, that MU program currently includes half of eligible physicians and more than 80 percent of hospitals in the United States. With that carrot, many more Americans should expect to see a Blue Button in the doctor’s office soon.
In the video below, U.S. chief technology officer Todd Park speaks with me about the Blue Button and the work of Dole and other presidential innovation fellows on the project.
Yesterday, I participated in a short teleconference with Canada’s open government advisory panel considering the next version of the country’s open government “action plan.” As readers may know, I accepted an invitation in 2012 from Canadian Minister of Parliament Tony Clement, the president of Canada’s Treasury Board, to be a member of Canada’s advisory panel on open government, joining others from Canada’s tech industry, the academy and civil society. (I shared several recommendations for open government in the first meeting, held on February 28th, 2012, and in another in 2013.)
In preparation for yesterday’s discussion, I downloaded the Open Government Partnership’s Independent Reporting Mechanism report on Canada, which highlights progress in meeting the country’s (largely self-defined) goals for open government, particularly with respect to open data, and identifies significant weaknesses in the public consultation undertaken to date:
The consultative process during the development of the action plan was weak. The consultation, which was conducted only online, including a Twitter chat session with the TBS President, took place during a public holiday, and no draft plan was circulated in advance for discussion. There was minimal awareness-raising around the consultation process, which resulted in low participation.
The IRM researcher found minimal evidence of attempts to engage civil society during implementation of the action plan, with the exception of the consultation on open data and the Open Government Licence. Consultation on commitments in these areas was seen as significantly stronger and more productive than the consultations for development of the action plan and the year one government self-assessment.
Consultation on the self-assessment report was carried out online and was not widely publicized, resulting in a limited level of participation.
Based upon this report and my own observations, I made three suggestions on yesterday’s call:
1) Adoption of an open source e-petition platform from the United Kingdom. While many people remain dubious about online petitions, the tool could be seeded with proposed open government reforms and used to solicit new ones.
2) Acknowledgement of ongoing debates about electronic surveillance. The Harper administration should launch a more proactive public discussion of what the Canadian people have a right to know about how their electronic communications are being collected, stored and used. Any broad consultation around open government in Canada will include this issue.
3) More civic engagement with the media. If improving public consultation is a priority, government officials must go onto television and radio broadcasts, along with sitting down for print interviews. Public engagement through social media and government websites is simply not enough.
The Canadian government should also engage journalists who are making information requests, specifically data journalists, as they are key players in the ecosystem around confirming data releases and quality. If the government faces significant doubts, it will have to turn to more trusted third parties to validate its programs and their efficacy.
One of the more interesting aspects of Dave Eggers’ dystopic new novel, “The Circle,” is the introduction of the “SeaChange,” a small, powerful camera that can wirelessly transmit images to a networked global audience. The SeaChange is adopted by politicians who “go transparent,” broadcasting all of their interactions to the public all day long.
Regardless of whether that degree of radical transparency is beneficial for elected representatives or not, in early 2014 we’ve now seen many early glimpses of what a more networked world full of inexpensive cameras looks like when United States politicians are online and on camera more often, from scandals to threats to slurs to charged comments that may have changed a presidential election. Most of that video has been captured by small video cameras or, increasingly, powerful smartphones. Over the next year, more people will be wearing Google Glass, Google’s head-mounted computing device. Even though Google Glass has provoked a backlash, the next wave of mobile devices will be wearable, integrated into clothing, wristbands, shoes and other gear. This vision of the future is fast approaching, which means that looking for early signals of its various aspects is crucial.
One such signal came across my desktop earlier this week, in the form of a new app for Google Glass from RedEdge, a digital advocacy consultancy based in Arlington, Virginia. Their new “augmented advocacy” application for Google Glass is a proof of concept that demonstrates how government data can be served to someone wearing Glass as she moves around the world. It’s not in Glass’s app store, but people interested in testing it can request the Glass application file (an APK) from RedEdge, its maker.
“While we don’t expect widespread deployment of this app, though that would be cool, this is a window into what’s possible with wearable computing just using federal department data,” said Ian Spencer, chief technology officer of RedEdge, in an interview. “The data we used to launch this app and populate the database was all sourced from publicly available information. We primarily used publications from the Office of Management and Budget for budget figures, as well as the president’s own budget, for monetary data. Location data on federal buildings was sourced from Google Maps.”
The app leverages Google Glass’s ability to detect the wearer’s location, feeding government data through RedEdge’s API to populate a relevant card. It pulls in open data, formatted as JSON, and provides a list of all locations.
“You can just walk around with the app running in the background,” said Spencer. “It doesn’t take up a ton of battery life. With geofencing, Glass knows when you’re near a building and triggers the app, which pops up a card that shows you a phone number and budget information. You can then tap to get more information and it loads up public contact information. Eventually the GDK [Glass Development Kit] will let you make calls and emails.”
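The geofencing Spencer describes boils down to a few lines of arithmetic: compare the wearer’s coordinates against known building locations and fire a card when one falls inside a trigger radius. The buildings, coordinates and radius below are illustrative, not RedEdge’s actual data:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def nearby_buildings(lat, lon, buildings, radius_m=150):
    """Return (distance_m, name) pairs for buildings within radius_m, nearest first."""
    hits = [(round(haversine_m(lat, lon, blat, blon)), name)
            for name, (blat, blon) in buildings.items()]
    return sorted((d, n) for d, n in hits if d <= radius_m)

# Illustrative federal building coordinates:
buildings = {
    "White House": (38.8977, -77.0365),
    "Treasury Department": (38.8983, -77.0341),
}
# Standing on Pennsylvania Avenue:
print(nearby_buildings(38.8987, -77.0362, buildings))
```

In the real app, each hit would pull the agency’s budget figures and contact data from RedEdge’s API rather than just printing a name and distance.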
Visitors to the White House with this app, for instance, could call the White House switchboard, though they would be unlikely to get President Obama on the phone.
The RedEdge app is currently limited by the amount of time and investment RedEdge has put into it, along with the technology of Glass itself. “Once we add more data points, we will need a more complicated API,” said Spencer. “User experience was our focus, not massive complete sets. Even if we were using a government API, which would be ideal at some point, we would need a caching layer so that we don’t overwhelm their servers.”
The only data the developers are feeding into it is the total federal budget for a given agency, not more granular details concerning how it relates to programs, their performance or who is in charge of them. It’s very much a “proof of concept.”
“We’re looking at it as a trial balloon,” said Spencer. “It started with our tech team. We haven’t had researchers go over tons of entries. If there is interest in it, we then may do more, like adding more federal data and state-level data.”
One potentially interesting application of augmented advocacy would be Congress, where data from the Sunlight Foundation’s Influence Explorer or OpenCongress could be integrated as the Glass wearer walked around. The technical limitations of Glass, however, mean that citizens will need to keep relying on Sunlight’s popular Congress app for smartphones.
“The problem is the precision of the GPS,” said Spencer. “If you’re wearing Glass in the Hart building, you don’t have enough accuracy. You can get building-to-building precision, but not more. There are technical problems with trying to use satellites for this, whether it’s GPS or GLONASS, the Russian version.”
That doesn’t mean such precision might not be possible in the future. As Spencer highlighted, app developers can determine “micropositioning” through wifi or Bluetooth, enabling triangulation within a room. “A classic example comes from marketing in a store: ‘I see you’re looking at X,’” he said.
That technology is already live, as Brian Fung reported in the Washington Post: stores are using cellphones to track shopping habits. In Washington, a more palatable example might be around the Mall, where geofences and tracking could trigger information about Smithsonian paintings, trees, statuary or monuments.
The limitation on facial recognition capabilities in Glass also means that the most interesting and disturbing potential application of its gaze is still far away: looking at someone in a lobby, bar, hearing or conference and learning not only who the person is but what role he or she may play in DC’s complicated ecosystem of lobbyists, journalists, Congressional staffers, politicians, media, officials, public advocates and campaign operatives. (For now, the role of the trusted aide, whispering brief identifiers into the ears of the powerful, is safe.)
When more apps like this go live on more devices, expect some fireworks to ensue around the United States and the world, as more private and semi-public spaces are recorded. Glass and its descendants will provide evidence of misbehavior by law enforcement, just as cellphones have in recent years. The cameras will be on the faces of officers, as well. While some studies suggest that police wearing cameras may improve the quality of their policing — and civil liberties advocates support their introduction — such devices aren’t popular with the New York City Police Department.
As with the dashboard cameras that supply much of the footage for “Cops” in the United States and offer some protection against corrupt police and fraud in Russia, wearable cameras look likely to end up on the helmets, glasses, lapels or shoulders of many officers in the future, from Los Angeles to London.
The aspirational view of this demo is that it shows how it’s possible to integrate more public data into the life of a citizen without requiring her to pull out a phone.
“There’s a lot of potential for this app to get people to care about an issue and take action,” said Spencer. “It’s about getting people aware. The cool thing about this is its passive nature. You start it once and it tells you when you’re near something.”
A more dystopian view is that people will see a huge budget number and call the switchboard of a given agency to angrily complain, as opposed to the constituent relations staff of their representatives in Congress.
Given the challenges that Congress already faces with the tidal wave of social media and email that has swelled up over the last decade, that would be unhelpful at best. If future digital advocates want to make the most of such tools, they’ll need to provide users with context for the data they’re being fed, from sources to more information about the issues themselves to the progress of existing campaigns.
This initial foray is, after all, just a demo. More integration may be coming in the next generation of wearables.
The Google home page currently has a link to ask President Obama a question in a Google+ Hangout. That’s some mighty popular online real estate devoted to citizen engagement.
As ever, laws and institutions lag the rapid pace of technological change. In 2014, for instance, mandating that the person designated to publish federal information must be a practical printer “versed in the art of bookbinding” is a statutory remnant of a bygone age.
Last week, Senator Amy Klobuchar [D-MN] introduced the Government Publishing Office Act of 2014, S.1947, which would rename the United States Government Printing Office the Government Publishing Office. (It would also strike the bookbinding requirement.)
The current Public Printer of the United States supported the proposal. “Publishing defines a broad range of services that includes print, digital, and future technological advancements,” said Public Printer Davita Vance-Cooks, in a statement. “The name Government Publishing Office better reflects the services that GPO currently provides and will provide in the future. I appreciate the efforts of Senators Klobuchar and Chambliss for introducing and supporting this bill. GPO will continue to meet the information needs of Congress, Federal agencies, and the public and carry out our mission of Keeping America Informed.”
“The idea of renaming GPO was discussed in a December Committee on House Administration hearing entitled ‘Mission of the Government Printing Office in a post-print world,’ which I wrote about here,” said Daniel Schuman, policy director at Citizens for Responsibility and Ethics in Washington (CREW), in a blog post on the GPO bill.
One of the most important open government data efforts in United States history came into being in 1993, when citizen archivist Carl Malamud used a small planning grant from the National Science Foundation to license data from the Securities and Exchange Commission, published the SEC data on the Internet and then operated it for two years. At the end of the grant, the SEC decided to make the EDGAR data available itself — albeit not without some significant prodding — and has continued to do so ever since. You can read the history behind putting periodic reports of public corporations online at Malamud’s website, public.resource.org.
Two decades later, Malamud is working to make the law public, reform copyright, and free up government data again, buying, processing and publishing millions of public tax filings from nonprofits to the Internal Revenue Service. He has made the bulk data from these efforts available to the public and anyone else who wants to use it.
“This is exactly analogous to the SEC and the EDGAR database,” Malamud told me in a phone interview last year. The trouble is that the data has been deliberately dumbed down, he said. “If you make the data available, you will get innovation.”
November Form 990s now ready. http://t.co/HDoMzPjpY0 We have 7,335,804 Form 990s available. *STILL* no word from the IRS.
Making millions of Form 990 returns free online is not a minor public service. Although many nonprofits file their Form 990s electronically, the IRS does not publish the data. Rather, the agency releases images of millions of returns, formatted as .TIFF files, on multiple DVDs sold to people and companies willing and able to pay thousands of dollars for them. Services like GuideStar, for instance, acquire the data, convert it to PDFs and use it to provide information about nonprofits. (Registered users can view the returns on its website.)
As Sam Roudman reported at TechPresident, Luke Rosiak, a senior watchdog reporter for the Washington Examiner, took the files Malamud published and made them more useful. Specifically, he used credits for processing that Amazon donated to participants in the 2013 National Day of Civic Hacking to make the .TIFF files text-searchable. Rosiak then set up CitizenAudit.org, a new website that makes nonprofit transparency easy.
“This is useful information to track lobbying,” Malamud told me. “A state attorney general could just search for all nonprofits that received funds from a donor.”
Malamud estimates nearly 9% of jobs in the U.S. are in this sector. “This is an issue of capital allocation and market efficiency,” he said. “Who are the most efficient players? This is more than a CEO making too much money — it’s about ensuring that investments in nonprofits get a return.
“I think inertia is behind the delay,” he told me, in our interview. “These are not the expense accounts of government employees. This is something much more fundamental about a $1.6 trillion dollar marketplace. It’s not about who gave money to a politician.”
If I order these IRS DVDs, my cost is $2910. Media and gov get them free, but none of them lifting a finger to help. http://t.co/B6m5VECV1O
When asked for comment, a spokesperson for the White House Office of Management and Budget said that the IRS “has been engaging on this topic with interested stakeholders” and that “the Administration’s Fiscal Year 2014 revenue proposals would let the IRS receive all Form 990 information electronically, allowing us to make all such data available in machine readable format.”
Today, Malamud sent a letter of complaint to Howard Shelanski, administrator of the Office of Information and Regulatory Affairs in the White House Office of Management and Budget, asking for a review of the pricing policies of the IRS after a significant increase year-over-year. Specifically, Malamud wrote that the IRS is violating the requirements of President Obama’s executive order on open data:
The current method of distribution is a clear violation of the President’s instructions to move towards more open data formats, including the requirements of the May 9, 2013 Executive Order making “open and machine readable the new default for government information.”

I believe the current pricing policies do not make any sense for a government information dissemination service in this century, hence my request for your review. There are also significant additional issues that the IRS refuses to address, including substantial privacy problems with their database and a flat-out refusal to even consider release of the Form 990 E-File data, a format that would greatly increase the transparency and effectiveness of our non-profit marketplace and is required by law.
It’s not clear at all whether the continued pressure from Malamud, the obvious utility of CitizenAudit.org or the bipartisan budget deal that President Obama signed in December will push the IRS to freely release open government data about the nonprofit sector.
The furor last summer over the IRS investigating the status of conservative groups claiming tax-exempt status, however, could carry over into political pressure to reform. If political groups were tax-exempt and nonprofit e-file data were published about them, it would be possible for auditors, journalists and Congressional investigators to detect patterns. The IRS would need to be careful about scrubbing the data of personal information: last year, the IRS mistakenly exposed thousands of Social Security numbers when it posted 527 forms online — an issue that Malamud, as it turns out, discovered in an audit.
“This data is up there with EDGAR, in terms of its potential,” said Malamud. “There are lots of databases. Few are as vital to government at large. This is not just about jobs. It’s like not releasing patent data.”
If the IRS were to modernize its audit system, inspectors general could use automated predictive data analysis to find aberrations to flag for a human to examine, enabling government watchdogs and investigative journalists to detect similar issues much earlier.
That level of data-driven transparency remains in the future. In the meantime, CitizenAudit.org is currently running on a server in Rosiak’s apartment.
Whether the IRS adopts it as the SEC did EDGAR remains to be seen.
This September, I visited the United Kingdom’s Ministry of Justice and looked at the only section of the Magna Carta that remains in effect. I was not, however, in a climate-controlled reading room, looking at parchment or sheepskin.
Rather, I was sitting in the Ministry’s sunny atrium, where John Sheridan was showing me the latest version of the seminal legal document, now living online, on his laptop screen. The section still in force is rather important to Western civilization and the rule of law as many citizens in democracies now experience it:
NO Freeman shall be taken or imprisoned, or be disseised of his Freehold, or Liberties, or free Customs, or be outlawed, or exiled, or any other wise destroyed; nor will We not pass upon him, nor [condemn him,] but by lawful judgment of his Peers, or by the Law of the Land. We will sell to no man, we will not deny or defer to any man either Justice or Right.
From due process to eminent domain to a right to a jury trial, many of the rights that American or British citizens take as a given today have their basis in the English common law that stems from this document.
Over a cup of tea, Sheridan caught me up on the progress that his team has made in digitizing documents and improving the laws of the land. legislation.gov.uk now draws 2 million unique visitors every month, with more than 500 million page views annually. People really are reading Parliament’s output, he observed, and increasingly doing so on tablets and mobile devices. The amount of content flowing into the site is considerable: according to Sheridan, the United Kingdom is passing laws at an estimated rate of 100,000 words every month, or twice as much as the complete works of Shakespeare.
Notable improvements over the years include the ability to compare the original text of legislation with the latest version (as we did with the Magna Carta) and to view a timeline of changes, using a slider to navigate to any given moment in time. Sheridan was particularly proud of the site’s rendering of legislation in HTML, including human-readable permanent uniform resource locators (URLs), and its capacity to produce on-demand PDFs of a given document. (This isn’t universally true: I found that some orders still appear only as PDFs.)
More specifically, Sheridan highlighted a “good law” project, wherein the Office of the Parliamentary Counsel (OPC) of Britain is working to help develop plain language laws that are “necessary, clear, coherent, effective and accessible.” A notable component of this good law project is an effort to apply a tool used in online publishing, software development and advertising — A/B testing — to different versions of legislation to test their usability.
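The OPC hasn’t published its testing methodology, but the mechanics of A/B testing as web publishers practice it are straightforward: deterministically assign each reader one of two drafts of a clause, then compare comprehension between the groups. A sketch with made-up drafts and scores:

```python
import hashlib

def assign_variant(reader_id, experiment="clause-42"):
    """Deterministically bucket a reader into draft 'A' or 'B' of a clause."""
    digest = hashlib.md5(f"{experiment}:{reader_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def mean_score(results, variant):
    """Average comprehension score for one variant; None if no readers saw it."""
    scores = [score for v, score in results if v == variant]
    return round(sum(scores) / len(scores), 2) if scores else None

# Simulated comprehension-test results: (variant shown, score out of 10)
results = [(assign_variant(i), score) for i, score in enumerate([7, 4, 8, 6, 5, 9, 6, 7])]
for variant in ("A", "B"):
    print(variant, mean_score(results, variant))
```

Hashing the reader ID, rather than flipping a coin per page view, guarantees a returning reader always sees the same draft, which keeps the groups clean.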
The video of a TedX talk embedded below by Richard Heaton, the permanent secretary of the United Kingdom’s Cabinet Office and first parliamentary counsel, explores the idea of “good law” at more length:
Sheridan went on to describe one of the more ambitious online collaborations between a government and its citizens I had heard of to date, a novel cross-Atlantic challenge co-sponsored by the UK and US governments, and a hairy legal technology challenge bearing down upon societies everywhere: what happens when software interprets the law?
For instance, he suggested, consider the increasing use of Oracle software around legislation. “As statutes are interpreted by software, what’s introduced by the code? What about quality testing?”
As this becomes a data problem, “you need information to contextualize it,” said Sheridan. “If you’re thinking about legislation as code, and as data, it raises huge questions for the rule of law.”
Sheridan has been one of the world’s foremost proponents of publishing legislative data through APIs, an approach that has come under criticism by open government data advocates after the government shutdown in the United States. (In 2014, forward-thinking governments publishing open data might consider providing basic visualization tools for site visitors, API access for third-party developers and internal users, and bulk data downloads.) One key difference between the approach of his team and other government entities might be that the National Archives are “dogfooding,” or consuming the same data through the same interface that they expect third parties to use, as Sheridan wrote last March:
“We developed the API and then built the legislation.gov.uk website on top of it. The API isn’t a bolt-on or additional feature, it is the beating heart of the service. Thanks to this approach it is very easy to access legislation data – just add /data.xml or /data.rdf to any web page containing legislation, or /data.feed, to any list or search results. One benefit of this approach is that the website, in a way, also documents the API for developers, helping them understand this complex data.”
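The suffix convention Sheridan describes means any legislation page can be turned into an API call with nothing more than string manipulation. A sketch of that pattern (the statute cited is an arbitrary example; per the team’s description, /data.xml and /data.rdf apply to legislation pages, while /data.feed applies to lists and search results):

```python
def data_urls(legislation_url):
    """Derive the machine-readable endpoints for a legislation.gov.uk page."""
    base = legislation_url.rstrip("/")
    # data.xml and data.rdf work on legislation pages; data.feed on lists/search results
    return {fmt: f"{base}/data.{fmt}" for fmt in ("xml", "rdf", "feed")}

urls = data_urls("http://www.legislation.gov.uk/ukpga/1998/29")
print(urls["xml"])  # http://www.legislation.gov.uk/ukpga/1998/29/data.xml
```

Because the website itself is built on these endpoints, a developer can discover the API simply by appending the suffix to whatever page she is reading.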
Perhaps because of that perspective, Sheridan was as supportive of APIs when we talked this September as he had been in 2012:
The legislation.gov.uk API has changed everything for us. It powers our website. It has enabled us to move to an open data business model, securing the editorial effort we need from the private sector for this important source of public data. It allows us to deliver information and services across channels and platforms through third party applications. We are developing other tools that use the API, using Linked Data – from recording the provenance of new legislation as it is converted from one format to another, to a suite of web based editorial tools for legislation, including a natural language processing capability that automatically identifies the legislative effects. Everything we do is underpinned by the API and Linked Data. With the foundations in place, the possibilities of what can be done with legislation data are now almost limitless.
Sheridan noted to me that the United Kingdom’s legislative open government data efforts are now acting as a platform for large commercial legal publishers and new entrants, like the mobile legislation app iLegal.
The iLegal app content is derived from the legislation.gov.uk API and offers handy features, like offline access to all items of legislation. iLegal currently costs £49.99/$74.99 annually or £149.99/$219.99 for a lifetime subscription, which might seem steep but is a fraction of the cost of Halsbury’s Statutes, currently listed at £9,360.00 from LexisNexis.
This approach to publishing the laws of the land online, in structured form under an open license, is an instantiation of the vision for Law.gov that citizen archivist Carl Malamud has been advocating in the United States. 2013 saw some progress in that vein when the U.S. House of Representatives published the U.S. Code as open government data.
What’s notable about the United Kingdom’s example, however, is that less than a decade ago, none of this would have been possible. Why? As ScraperWiki founder Francis Irving explained, the UK’s database of laws was proprietary data until December 2006. Now, however, the law of the land is released back to the people as it is updated, a living code available in digital form to any member of the public who wishes to read or reuse it.
The United Kingdom, however, has moved beyond simply publishing legislation as open data: they’re actively soliciting civic participation in its maintenance and improvement. For the last year, the National Archives has been guiding the world’s leading commercial open data curation project.
“We are using open data as a business model for fulfilling public services,” said Sheridan in our interview. “We train people to do editorial work. They are paid to improve data. The outputs are public.”
In other words, the open government data always remains free to the people through legislation.gov.uk, but any academic, nonprofit or commercial entity can add value to it and sell access to the resulting applications, analyses or interfaces.
Since the start of the UK project, they have doubled the number of people working on their open data, Sheridan told me. “The bottleneck is training,” he said. “We have almost unlimited editorial expertise available through our website. We define the process and rules, and then let anyone contribute. For example, we’re now working on revising legislation, identifying changes, researching them — when they come in, what they affect — and then working with an editor. Prior to this effort, government hasn’t been able to revise secondary legislation.”
Sheridan said that the next step is extending this model of feedback to other editorial work.
“We’re looking for more experts,” he said. “They’re generally paid for by someone. It’s very close to the open source software model. They must be able to demonstrate competence. There’s a 45-minute test, which we’ve now given to thousands of people.”
If this continues to work, distributed online collaboration is a “brilliant way to help improve the quality of law,” said Sheridan.
“It’s a way to get the work done — and the work is really hard. You have to invest time and energy, and you must protect the reputation of the Archive. This is somewhat radical for the nation’s statute book. We have redesigned the process so people can work with us. It’s not a wiki, but participation is open. It’s peer production.”
A trans-Atlantic challenge to map legislative data
The U.K. National Archives and U.S. Library of Congress have asked for help mapping elements from bills to the most recent Akoma Ntoso schema. (Akoma Ntoso is an emerging global standard for machine-readable data describing parliamentary, legislative and judiciary documents.) The best algorithm that maps U.S. bill XML or UK bill XML to Akoma Ntoso XML, including necessary data files and supporting documentation, will win $10,000.
If you have both skills and interest, get cracking: the challenge closes on December 31, 2013.
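To give a flavor of the task, here is a toy sketch of the kind of mapping the challenge asks for. The source element names and the level of detail are hypothetical and vastly simplified; real US and UK bill XML and the Akoma Ntoso schema are far richer than this:

```python
import xml.etree.ElementTree as ET

# A hypothetical, highly simplified US-bill-style section.
US_BILL = ('<section id="s1"><header>Short title</header>'
           '<text>This Act may be cited as the Example Act.</text></section>')

def to_akn(us_xml: str) -> str:
    """Map the toy source section to an Akoma Ntoso-style section
    with heading and content children (illustrative only)."""
    src = ET.fromstring(us_xml)
    akn = ET.Element("section", {"eId": src.get("id", "")})
    heading = ET.SubElement(akn, "heading")
    heading.text = src.findtext("header")
    content = ET.SubElement(akn, "content")
    para = ET.SubElement(content, "p")
    para.text = src.findtext("text")
    return ET.tostring(akn, encoding="unicode")

print(to_akn(US_BILL))
```

A winning entry would need to handle the full schemas, of course; the point here is only the shape of the problem: element-by-element structural translation between two XML vocabularies.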
As more and more governments release data around the world, the conditions under which it is published and may be used will become increasingly important. Just as open formats make data easier to put to work, open licenses make it possible for all members of the public to use it without fear.
Given that wonky but important issue, it’s important that governments that want to maximize the rewards of the work involved in cleaning and publishing open government data get the policy around its release right. Today, several open government advocates released an updated Best-Practices Language for Making Data “License-Free”, which can be found online at theunitedstates.io/licensing.
“In short what we say is ‘Use Creative Commons Zero (CC0),’ which is a public domain dedication,” said Josh Tauberer, the founder of Govtrack.us, via email. “We provide recommended language to put on government datasets and software to put the data and code into the world-wide public domain. In a way, it’s the opposite of a license.”
Tauberer; Eric Mill, a developer at the Sunlight Foundation; and Jonathan Gray, director of policy and ideas at the Open Knowledge Foundation, who have been working on the guidance since May, all blogged about it:
“Back in May, the Administration’s Memorandum on Open Data created very confusing guidance for agencies about what constitutes open data by saying open data should be ‘openly licensed’,” explained Tauberer, via email. “In response to that, we began working on guidance for federal agencies on how to make sure their data is open under the definition in the 8 Principles of Open Government Data.”
The basic issue, he said, is that the memorandum directed agencies to make data open but, in the view of these advocates, told agencies the wrong thing about what open data actually means. “We’re correcting that with precise, actionable direction,” said Tauberer.
What would the consequences of United States government entities not adopting this guidance be?
“Because M-13-13 required open licensing as the new default, I worry about agencies taking the guidance too literally and applying licensing where they might not have before, even if the work is exempt from copyright,” said Tauberer. “Or they may now consider open licensing of works produced by a contractor to be the new norm, since it is permitted by M-13-13, but for certain core information produced by government this would be a major step backward.”
“Imagine if after FOIA’ing an agency’s deliberative documents, The New York Times was legally required to provide attribution to a contractor, or, worse, to the government itself,” said Tauberer. “The federal government is relying more and more on contractors and lawyers, so it’s important that we reinforce these norms now.”
While it remains to be seen if the White House Office of Management and Budget merges this best practice into its open data policy, the advocates have already had success getting it adopted.
“Since we first published the guidance in August, it’s led to three government projects using our advice,” said Tauberer. “Partly in response to our nudging, in October OSTP’s Project Open Data re-licensed its schema for federal data catalog inventory files. (It had been licensed under CC-BY because of non-governmental contributors to the schema, but now it uses CC0.) In September and October, the CFPB followed our guidance and applied CC0 to their “qu” project and their eRegs platform.”
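To make the practice concrete, a CC0 dedication in an agency’s data.json catalog entry, following Project Open Data’s metadata schema, might look something like the fragment below. The dataset and identifier are invented for illustration; only the license field is the point:

```json
{
  "title": "Example Agency Dataset",
  "identifier": "example-agency-dataset-001",
  "accessLevel": "public",
  "license": "https://creativecommons.org/publicdomain/zero/1.0/"
}
```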
Open government advocates in the United States can expect to find public support for more accountability on a host of federal programs and policies, given an electorate deeply distrustful of the White House’s commitment to greater transparency about them.
Anyone interested in engaging the public regarding rules, regulations and proposed laws, however, should take note of the tenor of the comments on the coverage of the second United States National Action Plan on Open Government. They are a bellwether for the degree of damage to public trust in government that now persists in the United States.
I couldn’t find a single positive or even neutral comment on any of the stories. Considered in the context of the current political climate in the United States, that’s not surprising.
Gallup polling data from September 2013 indicated that Americans’ trust in government had fallen to historic lows.
After the government shutdown this fall and the messy rollout of the Affordable Care Act over the past two months, including a high-stakes Internet failure at Healthcare.gov, I suspect that a Gallup poll taken today would find that even fewer people trust the executive or legislative branches of the federal government of the United States.