What began as an idea just over two years ago is now a reality: a citizen corps of designers and programmers who committed to a year-long fellowship with Code for America. Today in San Francisco, the inaugural Code for America Summit will highlight the year past, look to the year ahead and convene a conversation around four core topics that will be familiar to people who have been following the story of Gov 2.0: citizen participation, data-driven decision making, co-creation and co-stewardship, and government as a platform. The SPUR Center is packed with civic innovators from all around the country and buzzing with energy. My liveblog is below.
This Web app takes 311 reports about vacant and abandoned buildings from the City of Chicago and visualizes them on a searchable map. “It’s specifically set up to pull data from Chicago’s data portal,” said Eder, linking to the 311 service requests dataset for vacant and abandoned buildings.
Eder shared more about how mapping Chicago’s vacant buildings in a blog post earlier this week. The results are unsurprising: there are many more vacant buildings in areas with high poverty rates.
Eder said that the app could be used by other cities, depending on how they store or format their data. The code for Chicago Buildings is on GitHub. On that front, he says that Chicago “isn’t using Open311 yet, so this site isn’t either. That being said, it wouldn’t be too hard to hook up the same interface to a different data source.” Code for America will help Chicago implement Open311 in 2012. Eder shared that he wrote a script that converts Socrata data to Google Fusion Tables, which could be modified for this purpose.
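Eder’s actual script isn’t reproduced here, but the first half of that kind of conversion is straightforward: pull JSON records from a Socrata export and flatten them into a table a service like Fusion Tables can import. A minimal sketch, assuming records arrive as a list of dicts (the field names below are hypothetical, not Chicago’s actual schema):

```python
import csv
import io

def socrata_rows_to_csv(rows, columns):
    """Flatten a list of Socrata JSON records (dicts) into CSV text,
    the sort of tabular form Fusion Tables could import."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    for row in rows:
        # Missing fields become empty cells rather than raising KeyError.
        writer.writerow({col: row.get(col, "") for col in columns})
    return buf.getvalue()

# In practice the rows would come from the portal's JSON export, e.g. a
# request to https://data.cityofchicago.org/resource/<dataset-id>.json
sample = [
    {"address": "123 S STATE ST", "date_received": "2011-09-01", "ward": "2"},
    {"address": "45 W LAKE ST", "date_received": "2011-09-03", "ward": "42"},
]
print(socrata_rows_to_csv(sample, ["address", "date_received", "ward"]))
```

The same flattening step would also feed an Open311-style endpoint later, which is why decoupling the fetch from the output format is worth the extra function.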
In the video below, Elizabeth Park, the creator of IFindit Chicago, talks about how she was inspired to build the team that created an Android app to help homeless and lower-income citizens find resources such as shelters, medical clinics, and food pantries.
Voting for the winners ends this Friday, October 14th, so check out the community round entries and weigh in.
As a reminder: If you have open government news to share, you can always find me at @digiphile on Twitter, where I share my email address, alex@oreilly.com.
Two weeks ago at the Strata Conference in NYC, I donned a headset, grabbed a tablet’s worth of questions and headed to the podium to talk with the chairman of the U.S. House Committee on Oversight and Government Reform about data and open government.
Congressman Darrell Issa (R-CA) joined me via remote webcast from chambers in Washington, D.C. Our crack video team is working on an improved version of this video in which you’ll see my side of the broadcast, along with a boost in audio. Until then, the video that the House Oversight digital team uploaded to YouTube will suffice; I don’t want to wait any longer to share this story, particularly as interest builds behind the principal subject of our conversation: a proposed bill that would standardize financial data reporting in the federal government and create a single database for federal spending.
Daniel Schuman listened in and summarized our conversation on open government data over at the Sunlight Foundation’s blog:
The Chairman focused his remarks on the DATA Act, the bipartisan legislation he introduced that would transform how government tracks federal spending and identifies waste, fraud, and abuse.
He emphasized the importance of making government data available online in real time so that innovative minds can immediately make use of the information to build their own businesses. Businesses, in turn, would help the government identify program mismanagement and data quality problems. The Chairman specifically singled out Vice President Biden as a supporter of efforts to find a common solution to make data available in a systematic way.
…Chairman Issa explained that the private sector must step up as advocates for greater openness because they will benefit from building and using the tools made possible by greater transparency. He added that when government drives down the cost of obtaining information, private individuals will derive value from the analysis of data, not its ownership.
“In a cost estimate dated Sept. 16, the CBO attributes $325 million of the estimated total to requirements in the bill regarding the collecting and reporting of financial information. The DATA Act would require federal agencies and most government contractors and grant award winners to adopt XBRL as a financial data reporting mechanism.”
Left unsaid in the CBO estimate is what the impact of this kind of transparency on the federal government’s finances might be, in terms of savings. House Oversight staff have estimated annual savings from the standards and a centralized spending database that would more than offset that outlay, including:
$41 million in funds recovered from questionable recipients
$63 million in funds withheld from questionable recipients
$5 billion in savings recommended by inspectors general
unknown savings resulting from better internal spending control and better oversight by Congressional appropriators.
The DATA Act, which would expand the role of the Recovery Accountability and Transparency Board to track all federal spending and make all of the information available to the public, has bipartisan support in the Senate from Senator Mark Warner (D-VA), who has introduced a companion bill there.
Given the White House’s embrace of the mantle of open government on the President’s first day in office, the executive branch has gathered much of the press attention, praise, scrutiny and criticism in this area.
That looks to be changing, and for the better. As Clay Johnson pointed out at the beginning of 2011, any competition between the White House and Congress on open government is likely to be a win for the American people.
Jim Harper, director of information policy studies at the Cato Institute and webmaster of WashingtonWatch.com, wrote then that the GOP can eclipse Obama on transparency. “House Republicans can quickly outshine Obama and the Democratic Senate,” he opined. “It all depends on how they implement the watch phrase of their amendment package: ‘publicly available in electronic form.’
The GOP House leadership must make sure that this translates into real-time posting of bills, amendments and steps in the legislative process, in formats the Internet can work with. It’s not about documents anymore. It’s about data. Today’s Internet needs the data in these documents.
There are no technical impediments to a fully transparent Congress. Computers can handle this. The challenges, however, are institutional and practical.”
Johnson identified the moment in history as an important inflection point, and one that, if the White House rose to the challenge, could legitimately be seen as an open government win for the American people and a smarter, more accountable government.
The White House may hold the considerable advantages of the bully pulpit and the largest Twitter followings of any federal entity or politician, for now, but those advantages have to be balanced against the edge in new media prowess that Republicans in Congress have built up over their Democratic counterparts.
While some projects or choices continue to raise questions about the rank and file’s commitment to open government principles, with the GOP bending its own new House rules, there’s progress to report. The leadership of the House of Representatives has supported the creation of open, online video archives like House.Resource.org. The House recently revamped its floor feed, adding live XML. And House leadership has recently championed the role of technology in making Congress more transparent, engaged and accountable.
Rep. Issa, in particular, appears to have taken on open government as a cause and, for the moment, its rhetoric. He even tweets using the #opengov hashtag. When it comes to the legislature, “the American people have a right to all the data from Congress. They have a right,” he said at a recent forum on Congressional transparency, as reported by Diana Lopez.
Government secrecy and transparency are, in theory, non-partisan issues. In practice, they are often used as political bludgeons against an opposing party, particularly by a partisan minority, and then discarded once power is gained. For government transparency to outlast a given White House or Congress, laws and regulatory changes have to happen.
Open government has to be “baked in” to culture, practices, regulations, technology, business practices and public expectations. Needless to say, that’s going to take a while, but it looks like both the administration and some members of Congress are willing to keep trying.
As these efforts go forward, it will be up to the media, businesses, nonprofits, watchdogs and, of course, citizens to hold them accountable for actions taken, not just rhetoric.
What’s the future of the DATA Act?
I’m writing a feature article about the bill, this conversation, context for government performance data and whether open government and transparency will have any legs in the upcoming presidential campaign.
If you have any questions that are unanswered after watching the conversation, comments about the use of XBRL or perspective on the proposed law’s future in Congress, please ring in in the comments or find me at alex[at]oreilly.com.
The school year may have just begun, but Congress has already received an early report card on the transparency of its legislative data. The verdict? A 2.47 GPA, on average, if you don’t include the 4 Incompletes. That’s a bit better than a C+, for those who’ve long since forgotten how grade point averages are computed. It also means that while Congress “passed this term,” any teacher’s note would likely include a stern warning that when it comes to legislative transparency, the student needs to show improvement before graduation.
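For readers rusty on the arithmetic: a GPA is simply the mean of the letter grades’ point values, with Incompletes excluded as the paper does. A quick sketch (the sample report card below is hypothetical, not Cato’s actual grades):

```python
# Standard 4.0-scale point values; Incompletes ("I") carry no points.
POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa(grades):
    """Average the point values of letter grades, skipping Incompletes."""
    scored = [POINTS[g] for g in grades if g in POINTS]
    return sum(scored) / len(scored)

# Hypothetical report card that averages out to C+ territory.
print(round(gpa(["B", "C", "C", "B", "D", "I", "C"]), 2))  # → 2.17
```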
If you’re interested in opening up the United States federal legislative system, you can tune into a livestream of a special DC forum this morning where Harper and other open government stakeholders rate Congress. Brandon Arnold, director of government affairs at the Cato Institute, will moderate a discussion between Harper; Rep. Darrell Issa, chairman of the House Committee on Oversight and Government Reform; and John Wonderlich, policy director at the Sunlight Foundation.
The Cato paper analyzes Congressional achievement through the lens of four basic concepts in data publication: authoritative sourcing, availability, machine-discoverability, and machine-readability. “Together, these practices will allow computers to automatically generate the myriad stories that the data Congress produces has to tell,” writes Harper in a blog post today. “Following these practices will allow many different users to put the data to hundreds of new uses in government oversight.”
The data model used to produce this analysis should be of interest to the broader open government data community as a matrix for rating any given legislature. “Data modeling is pretty arcane stuff, but in this model we reduced everything to ‘entities,’ each having various ‘properties,’” explained Harper. “The entities and their properties describe the logical relationships of things in the real world, like members of Congress, votes, bills, and so on. We also loosely defined several ‘markup types’ guiding how documents that come out of the legislative process should be structured and published. Then we compared the publication practices in the briefing paper to the ‘entities’ in the model.”
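To make the entities-and-properties idea concrete, here is a minimal sketch of legislative entities and their logical relationships. The class and field names are illustrative only, not Cato’s actual schema:

```python
from dataclasses import dataclass, field

# Each entity is a real-world thing in the legislative process; each
# property is a field the publishing body should expose as data.
@dataclass
class Member:
    bioguide_id: str
    name: str
    chamber: str          # "house" or "senate"

@dataclass
class Bill:
    number: str
    title: str
    sponsor: Member
    cosponsors: list = field(default_factory=list)

@dataclass
class Vote:
    bill: Bill
    member: Member
    position: str         # "yea", "nay", "present"

# The relationships between entities are what let computers "tell the
# stories" in the data: a vote links a member to a bill.
issa = Member("I000056", "Darrell Issa", "house")
data_act = Bill("H.R. 2146", "Digital Accountability and Transparency Act", sponsor=issa)
vote = Vote(data_act, issa, "yea")
print(vote.member.name, "voted", vote.position, "on", vote.bill.number)
```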
While the obvious takeaway is that Congress could do better, Harper gives the Senate and House due credit and time to improve. “This stuff is tough sledding,” he allowed. “The data model isn’t the last word, and there are things happening in various places on and around Capitol Hill to improve things. There are several pieces of the legislative process nobody has ever talked about publishing as data before, so we forgive the fact that this isn’t already being done. If things haven’t improved in another year, then you might start to see a little more piquant commentary.”
Today at the Social Good Summit, Dr. Raj Shah, the administrator of the United States Agency for International Development (USAID), will launch a new public engagement effort to raise awareness about the devastating famine in the Horn of Africa. USAID is calling it the “FWD campaign,” and it includes some interesting uses of open data, mapping and citizen engagement. USAID launched USAID.gov/FWD today and has a text-to-donate initiative up and running in time to be amplified by the reach of Mashable’s Social Good Summit. You can text “GIVE” to 777444 to donate $10.
FWD stands for “Famine, War, Drought,” the unfortunate combination that lies behind the crisis in the Horn of Africa. “It also stands for our call to action,” writes Haley Van Dyck, director of digital strategy at USAID, with an eye to getting people involved in raising awareness and “forwarding” the campaign on to friends, family and colleagues. Each component of the page includes options to share on Twitter or Facebook, or to “FWD” it on to people via email.
“Frankly, it’s the first foray the agency is taking into open government, open data, and citizen engagement online,” said Van Dyck. “We recognize there is a lot more to do on this front, but are happy to start moving the ball forward. This campaign is different than anything USAID has done in the past. It is based on informing, engaging, and connecting with the American people to partner with us on these dire but solvable problems. We want to change not only the way USAID communicates with the American public, but also the way we share information.”
Van Dyck was particularly excited about the interactive maps that USAID has built and embedded on the FWD site. The agency built the maps with open source mapping tools and published the data sets they used to make these maps on data.gov.
Publishing maps and the open data that drives them online at the same time is a significant step forward for any government agency, and it will set a worthy bar for future efforts to meet. USAID did it by migrating its data to an open, machine-readable format. “In the past, we released our data in inaccessible formats, mostly PDFs, that are often unable to be used effectively,” wrote Van Dyck.
“USAID is one of the premiere data collectors in the international development space,” wrote Van Dyck. “We want to start making that data open, making that data sharable, and using that data to tell stories about the crisis and the work we are doing on the ground in an interactive way.”
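The practical core of that shift from PDFs to machine-readable data is small: tabular records with coordinates become a format a web map can consume directly. A minimal sketch using GeoJSON, a common open format for map data (the sample records and field names are hypothetical, not USAID’s actual datasets):

```python
import json

def rows_to_geojson(rows):
    """Convert records with lat/lon fields into a GeoJSON FeatureCollection,
    the kind of open, machine-readable file a web map can consume directly."""
    features = []
    for row in rows:
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON coordinate order is [longitude, latitude].
                "coordinates": [row["lon"], row["lat"]],
            },
            # Everything that isn't a coordinate travels along as a property.
            "properties": {k: v for k, v in row.items() if k not in ("lat", "lon")},
        })
    return {"type": "FeatureCollection", "features": features}

# Hypothetical sample: two aid locations in the Horn of Africa region.
sample = [
    {"name": "Feeding center A", "lat": 2.05, "lon": 45.32},
    {"name": "Feeding center B", "lat": 9.03, "lon": 38.74},
]
print(json.dumps(rows_to_geojson(sample), indent=2))
```

Unlike a PDF, a file like this can be re-plotted, filtered or joined against other datasets by anyone who downloads it from data.gov.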
While neither of those stories is a good data point for the state of open government at the federal level, both are part of a much larger narrative in which some 40 countries (including the 8 founding members) have reportedly now submitted letters of intent to join this unprecedented international open government partnership.
Next Tuesday, I’ll be in New York on the same day that President Obama introduces the U.S. National Plan for open government as part of the country’s commitment to the Open Government Partnership. As John Wonderlich observed at the Sunlight Foundation on Friday, preparing the U.S. National Plan and then delivering upon whatever it contains will be a “complex, ongoing effort that takes dedicated effort and attention,” adding to the progress toward a more transparent, participatory, collaborative and innovative government made to date.
If, like me, your German isn’t so good, try reading it using Google Translate.
Key points:
Ulrich Freise, Berlin’s Secretary of State for Home Affairs, described Berlin’s open data site as a basis for administrative action that citizens could use to make decisions, find facts and involve themselves in decision-making processes.
Domscheit-Berg, a member of the Berlin Open Data Platform for Action, praised the launch as an important milestone on the way to a more transparent government in Germany.
An “Open Data Day” in Berlin this May helped introduce more government staff to the idea and resulted in an agenda that subsequently helped shape the release.
Much of the data is not machine-readable at present, nor is it released under a Creative Commons license that would free it to be used commercially or otherwise adapted for further civic use, as German civic developer Stefan Wehrmeyer pointed out to Heise.
While Open Data Berlin launches with just 18 data sets, there’s plenty of room to grow. Data.gov, in the US, went online with 49 data sets in 2009. Now there are over 400,000 listed there. If Berlin can similarly expand and open up more meaningful data in a manner that’s usable to Germany’s civic developers, there will be more Deutschland data stories to tell this year.
Washington-based DevelopmentSeed continues to tell dazzling data stories with open source mapping tools. This week, they’ve posted a map of the local impact of unemployment and recovery spending. The map visualizes unemployment rate changes at a county level and folds in total economic recovery spending by the government under the American Recovery and Reinvestment Act of 2009. In the map embedded below, red corresponds to an increased unemployment rate and green corresponds to a lower unemployment rate or job growth. Counties that received less than $10 million in recovery spending have a white pattern.
David Cole explains more in a post at DevelopmentSeed.org:
Over the last year, we see that unemployment dropped in 58% of counties by an average of 0.25 percentage points. On average the Recovery Act funded 31 projects at a total of $24,131,582.47 per county. Nationally this works out to about $282.66 in recovery spending per person.
…
Overall, it’s impossible to tell for sure how much recovery spending improved the economic situation, because we just don’t know how bad things could have been. It may be the case that without spending, this map would have a lot more red. Or maybe not. What’s interesting here is the local impact and information we are able to see from processing a few sets of open data. Check out how your county is doing compared to its surroundings. How about compared to a more or less urban county nearby?
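Cole’s national figure is simple division: total recovery spending over total population. A sketch of the per-capita calculation, using hypothetical round numbers rather than DevelopmentSeed’s actual totals:

```python
def per_capita(total_spending, population):
    """Recovery spending per person: total dollars divided by headcount."""
    return total_spending / population

# Hypothetical placeholders for illustration (not the actual totals):
total = 87_000_000_000      # dollars of recovery spending nationwide
population = 308_000_000    # roughly the 2010 U.S. census count

print(f"${per_capita(total, population):.2f} per person")
```

The same function works at county level, which is how a per-person comparison between your county and its neighbors would be computed.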
Strata Conference New York 2011, being held Sept. 22-23, covers the latest and best tools and technologies for data science — from gathering, cleaning, analyzing, and storing data to communicating data intelligence effectively. Save 20% on registration with the code STN11RAD.
“This is a huge accomplishment — a nationwide UK system for individuals to document and report problems with any kind of public transportation system,” wrote CivicCommons executive director Andrew McLaughlin this morning. “MySociety has figured out how to route every kind of report to the responsible agency (or even person) — ‘the service works everywhere in Great Britain, our database has over 300,000 stops and routes for train, tube, tram, bus, coach and ferry.’ Great design and interface. Congratulations, +Tom Steinberg and team!”
“We’ve never before launched a site that took so much work to build, or that contained so much data,” writes Steinberg at the MySociety blog, where he explained more about what it’s for. (The emphasis below is mine.)
FixMyTransport has two goals – one in your face, and the other more subtle.
The first goal, as the site’s name suggests, is to help people get common public transport problems resolved. We’re talking broken ticket machines, gates that should be open and stations without stair-free access. We’ll help by dramatically lowering the barrier to working out who’s responsible, and getting a problem report sent to them – a task that would have been impossible without the help of volunteers who gathered a huge number of operator email addresses for us. Consequently the service works everywhere in Great Britain, our database has over 300,000 stops and routes for train, tube, tram, bus, coach and ferry.
The second goal – the subtle one – is to see if it is possible to use the internet to coax non-activist, non-political people into their first taste of micro-activism. Whilst the site intentionally doesn’t contain any language about campaigning or democracy, we encourage and provide tools to facilitate the gathering of supporters, the emailing of local media, the posting of photos of problems, and the general application of pressure where it is needed. We also make problem reports and correspondence between operators and users public, which we have frequently seen create positive pressure when used on sister sites FixMyStreet and WhatDoTheyKnow.
I’m not saying it is impossible to hack brilliant things without piles of VC gold. But if you are going to hack something really, genuinely valuable in just a couple of weeks, and you want it to thrive and survive in the real Internet, you need to have an idea that is as simple as it is brilliant. Matthew Somerville’s accessible Traintimes fits into this category, as does FlyOnTime.us, E.ggtimer.com and doodle.ch. But ideas like this are super rare — they’re so simple and powerful that really polished sites can be built and sustained on volunteer-level time contributions. I salute the geniuses who gave us the four sites I just mentioned. They make me feel small and stupid.
If your civic hack idea is more complicated than this, then you should really go hunting for funding before you set about coding. Because the Internet is a savagely competitive place, and if your site isn’t pretty spanking, nobody is going to come except the robots and spammers.
To be clear — FixMyTransport is not an example of a super-simple genius idea. I wish it were. Rather it’s our response to the questions “What’s missing in the civic web?” and “What’s still too hard to get done online?”