Congress weighs deep cuts to funding for federal open government data platforms

Several core pillars of federal open government initiatives brought online by the Obama administration may be shuttered by proposed Congressional budget cuts. Data.gov, USASpending.gov, the IT Dashboard and five other websites that offer platforms for open government transparency are facing imminent closure. A comprehensive report filed by Jason Miller, executive editor of Federal News Radio, confirmed that the United States Office of Management and Budget is planning to take open government websites offline over the next four months because of a 94% reduction in federal funding in the Congressional budget. Daniel Schuman of the Sunlight Foundation first reported the cuts in the budget for data transparency. Schuman talked to Federal News Radio about the potential end of these transparency platforms this week.

Cutting these funds would also shut down the FedSpace federal social network and, notably, the FedRAMP cloud computing cybersecurity program. Unsurprisingly, open government advocates in the Sunlight Foundation and the larger community have strongly opposed these cuts.

As Nancy Scola reported for techPresident, Donny Shaw put the proposal to defund open government data in perspective at OpenCongress: “The value of data openness in government cannot be overestimated, and for the cost of just one-third of one day of missile attacks in Libya, we can keep these initiatives alive and developing for another year.”

Daniel Schuman was clear about the value of data transparency funding at the Sunlight Foundation blog:

The returns from these e-government initiatives in terms of transparency are priceless. They will help the government operate more effectively and efficiently, thereby saving taxpayer money and aiding oversight. Although we have significant issues with some of these programs’ data quality, and we are concerned that the government may be paying too much for the technology, there should be no doubt that we need the transparency they enable. For example, fully realized transparency would allow us to track every expense and truly understand how money — like that in the electronic government fund — flows to federal programs. Government spending and performance data must be available online, in real time, and in machine readable formats.

There is no question that the Obama administration has come under heavy criticism for the quality of its transparency efforts from watchdogs, political opponents and media. OMB Watch found progress on open government in a recent report but cautioned that there’s a long road ahead. It is clear that we are in open government’s beta period. The transparency that Obama promised has not been delivered, as Charles Ornstein, a senior reporter at ProPublica, and Hagit Limor, president of the Society of Professional Journalists, wrote today in the Washington Post. There are real data quality and cultural issues that need to be addressed to match the rhetoric of the past three years. “Government transparency is not the same as data that can be called via an API,” said Virginia Carlson, president of the Metro Chicago Information Center. “I think the New Tech world forgets that — open data is a political process first and foremost, and a technology problem second.”

Carlson highlighted how some approaches taken in establishing Data.gov have detracted from the success of that platform:

First, no distinction was made between making transparent operational data about how the government works (e.g., EPA clean-up sites; Medicaid records) and making statistical data more useful (data re: economy and population developed by the major Federal Statistical Agencies). So no clear priorities were set regarding whether it was an initiative meant to foster innovation (which would emphasize operational data) or whether it was an initiative meant to open data dissemination lines for agencies that had already been in the business of dissemination (Census, BLS, etc.), which would have suggested an emphasis on developing API platforms on top of current dissemination tools like American Fact Finder or DataFerrett.

Instead, a mandate came from above that each agency or program was responsible for putting X numbers of data sets on data.gov, with no distinction made as to source or usefulness. Thus you have weird things like cutting up geo files into many sub-files so that the total number of files on data.gov is higher.

The federal statistical agencies have been disseminating data for tens of decades. They felt that the data.gov initiative rolled right over them, for the most part, and there was a definite feeling that the data.gov people didn’t “get it” from the FSA perspective – who are these upstarts coming in to tell us how to release data, when they don’t understand how the FSAs function, how to deal with messy statistical data that have a provenance, etc. An open data session at the last APDU conference saw the beginnings of a conversation between data.gov folks and the APDU folks (who tend to be attached to the major statistical agencies), but there is a long way to go.

Second, individuals in bureaucracies are risk-averse. The political winds might be blowing toward openness now, but executives come and go while those in the trenches stay (or would like to). Thus the tendency was to find data that was relatively low-risk. Agencies literally culled their catalogs to find the least controversial data that could be released.

Neither technical nor cultural change will happen with the celerity that many would like, given the realities imposed by the pace of institutional change. “Lots of folks in the open government space are losing their patience for this kind of thing, having grown accustomed to startups that move at internet speed,” said Tom Lee, director of Sunlight Labs. “But USAspending.gov really can be a vehicle for making smarter decisions about federal spending.”

“Obviously the data quality isn’t there yet. But you know what? OMB is taking steps to improve it, because the public was able to identify the problems. We’re never going to realize the incredible potential of these sites if we shutter them now. A House staffer, or journalist, or citizen ought to be able to figure out the shape of spending around an issue by going to these sites. This is an achievable goal! Right now they still turn to ad-hoc analyses by GAO or CRS — which, incidentally, pull from the same flawed data. But we really can automate that process and put the power of those analyses into everyone’s hands.”

Potential rollbacks to government transparency, seen in that context, are detrimental to all American citizens, not just those who support one party or the other — or, for that matter, none at all. As Rebecca Sweger writes at the National Priorities Project, “although $32 million may sound like a vast sum of money, it is actually .0009% of the proposed Federal FY11 budget. A percentage that small does not represent a true cost-saving initiative–it represents an effort to use the budget and the economic crisis to promote policy change.”
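Sweger's percentage is easy to sanity-check with back-of-the-envelope arithmetic. The roughly $3.6 trillion figure for FY2011 federal spending below is an outside approximation, not a number from this article:

```python
# Back-of-the-envelope check of the e-government fund's share of the budget.
# The ~$3.6 trillion FY2011 figure is an approximation, not from the article.
e_gov_fund = 32e6           # $32 million
federal_budget = 3.6e12     # roughly $3.6 trillion

share = e_gov_fund / federal_budget * 100
print(f"{share:.4f}% of the federal budget")  # 0.0009% of the federal budget
```

Any reasonable choice of budget baseline lands in the same ballpark: a fraction of a thousandth of one percent.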

Lee also pointed to the importance of TechStat to open government. TechStat was part of the White House’s move to make the IT Dashboard open source yesterday. “TechStat is one of the most concrete arguments for why cutting the e-government fund would be a huge mistake,” he said. “The TechStat process is credited with billions of dollars of savings. Clearly, Vivek [Kundra, the federal CIO] considers the IT Dashboard to be a key part of that process. For that reason alone cutting the e-gov fund seems to me to be incredibly foolish. You might also consider the fact pointed out by NPP: that the entire e-gov budget is a mere 7.7% of the government’s FOIA costs.”

In other words, “it costs far more to release the information by the current means. This is the heart of the case for data.gov and data transparency in general: to get useful information into the hands of more people, at a lower cost than the alternatives,” said Lee. Writing on the Sunlight Labs blog, Lee emphasized today that “cutting the e-gov funding would be a disaster.”

The E-Government Act of 2002 that supports modern open government platforms was originally passed with strong bipartisan support, long before the current president was elected. Across the Atlantic, the British parallel to Data.gov, Data.gov.uk, continues under a Conservative prime minister. Open government data can be used not just to create greater accountability, but also economic value. That point was made emphatically last week, when former White House deputy chief technology officer Beth Noveck made her position clear on the matter: cutting e-government funding threatens American jobs:

These are the tools that make openness real in practice. Without them, transparency becomes merely a toothless slogan. There is a reason why fourteen other countries whose governments are left- and right-wing are copying data.gov. Beyond the democratic benefits of facilitating public scrutiny and improving lives, open data of the kind enabled by USASpending and Data.gov save money, create jobs and promote effective and efficient government.

Noveck also referred to the Economist’s support for open government data: “Public access to government figures is certain to release economic value and encourage entrepreneurship. That has already happened with weather data and with America’s GPS satellite-navigation system that was opened for full commercial use a decade ago. And many firms make a good living out of searching for or repackaging patent filings.”

The open data story in healthcare continues to be particularly compelling, from new mobile apps that spur better health decisions to data spurring changes in care at the Veterans Administration. Proposed cuts to weather data collection could, however, subtract from that success.

As Clive Thompson reported at Wired this week, public sector data can help fuel jobs: “shoving more public data into the commons could kick-start billions in economic activity.” Thompson focuses on the story of BrightScope, where government data drives the innovation economy. “That’s because all that information becomes incredibly valuable in the hands of clever entrepreneurs,” wrote Thompson. “Pick any area of public life and you can imagine dozens of startups fueled by public data. I bet millions of parents would shell out a few bucks for an app that cleverly parsed school ratings, teacher news, test results, and the like.”

Lee doesn’t entirely embrace this view but makes a strong case for the real value that does persist in open data. “Profits are driven toward zero in a perfectly competitive market,” he said.

Government data is available to all, which makes it a poor foundation for building competitive advantage. It’s not a natural breeding ground for lucrative businesses (though it can certainly offer a cheap way for businesses to improve the value of their services). Besides, the most valuable datasets were sniffed out by business years before data.gov had ever been imagined. But that doesn’t mean that there isn’t huge value that can be realized in terms of consumer surplus (cheaper maps! free weather forecasts! information about which drug in a class is the most effective for the money!) or through the enactment of better policy as previously difficult-to-access data becomes a natural part of policymakers’ and researchers’ lives.

To be clear, open data and the open government movement will not go away for lack of funding. Government data sets online will persist if Data.gov goes offline. As Samantha Power wrote at the White House last month, transparency has gone global. Open government may improve through FOIA reform. The technology that will make government work better will persist in other budgets, even if the e-government budget is cut to the bone.

There are a growing number of strong advocates who are coming forward to support the release of open government data through funding e-government. My publisher, Tim O’Reilly, offered additional perspective today as well. “Killing open data sites rather than fixing them is like Microsoft killing Windows 1.0 and giving up on GUIs rather than keeping at it,” said O’Reilly. “Open data is the future. The private sector is all about building APIs. Government will be left behind if they don’t understand that this is how computer systems work now.”

As Schuman highlighted at SunlightFoundation.com, the creator of the World Wide Web, Sir Tim Berners-Lee, has been encouraging his followers on Twitter to sign the Sunlight Foundation’s open letter to Congress asking elected officials to save the data.

What happens next is in the hands of Congress. A congressional source who spoke on condition of anonymity said that they are aware of the issues raised with cuts to e-government funding and are working on preserving core elements of these programs. Concerned citizens can contact the office of the House Majority Leader, Representative Eric Cantor (R-VA) (@GOPLeader), at 202.225.4000.

UPDATE: The Sunlight Foundation’s Daniel Schuman, who is continuing to track this closely, wrote yesterday that, under the latest continuing resolution under consideration, funding for the E-Government Fund would be back up in the tens of millions range. Hat tip to Nancy Scola.

UPDATE II: Final funding under the FY 2011 budget will be $8 million. Next step: figuring out the way forward for open government data.

Open government scrutinized before the House Oversight Committee

This morning, the Oversight Committee in the United States House of Representatives held a hearing on the Obama administration’s open government efforts. The “Transparency Through Technology: Evaluating Federal Open-Government Initiatives” hearing was streamed live online at oversight.house.gov.

House Oversight Chairman Darrell Issa (R-CA) asked his Twitter followers a simple question before the hearing: “Have you tried to get facts on how gov’t spends your $ on USASpending.gov?” He received no answers.

The oversight committee did, however, hear extensive testimony from government IT executives and open government watchdogs. As Representative Issa probes how agencies balance their books, such insight will be crucial, particularly with respect to improving accountability mechanisms and data. Poor data has been a recurring theme in these assessments over the years. Whether the federal government can effectively and pervasively apply open data principles appears itself to be an open question.

The first half of the hearing featured testimony from Dr. Danny Harris, chief information officer for the Department of Education; Chris Smith, chief information officer for the Department of Agriculture; Jerry Brito, senior research fellow at the Mercatus Center at George Mason University; and Ellen Miller, co-founder and executive director of the Sunlight Foundation.

Alice Lipowicz of Federal Computer Week tweeted out a few data points from the hearing.

  • A Sunlight Foundation audit found that the USDA spent $12.7B on school lunches but only reported $250,000 on USASpending.gov
  • According to Brito, “half of 3,000 datasets on Data.gov are on EPA toxic releases, with only 200 to 300 datasets on fed gov activity.” Lipowicz also tweeted that Brito testified that federal agencies need outside auditors and “ought to report ‘earnings’ similar to private sector.”
  • USDA CIO Chris Smith said that the agency did not report school lunch payments below $25,000 to USASpending.gov; will report in FY2012

In her testimony before the House committee on Clearspending, Miller reiterated the position of the Sunlight Foundation that the administration’s efforts to make government spending data open, accurate and available have been insufficient, particularly when the data is wrong.

The Sunlight Foundation has been excited about the new promises of data transparency, but sometimes the results are nowhere near the accuracy and completeness necessary for the data to be useful for the public.

Sunlight’s Clearspending analysis found that nearly $1.3 trillion of federal spending as reported on USASpending.gov was inaccurate. While there have been some improvements, little to no progress has been made to address the fundamental flaws in the data quality. Correcting the very complicated system of federal reporting for government spending is an enormous task. It has to be done because without it there is no hope for accountability.

Miller made several recommendations to the committee to improve the situation, including:

  • unique identifiers for government contracts and grants
  • publicly available hierarchical identifiers for recipients to follow interconnected entities
  • timely bulk access to all data.
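Miller's second recommendation, hierarchical recipient identifiers, is worth unpacking: if each subsidiary's identifier extends its parent's, spending can be rolled up across interconnected entities with a simple prefix match. The sketch below is hypothetical; the identifier scheme and award amounts are invented, not USASpending.gov's actual format:

```python
# A hypothetical sketch of hierarchical recipient identifiers: spending to
# subsidiaries rolls up to their parent entities via a prefix match.
# The identifiers and dollar amounts below are invented for illustration.

awards = {
    "ACME":         1_000_000,  # parent company
    "ACME.AERO":      250_000,  # subsidiary
    "ACME.AERO.UK":    50_000,  # sub-subsidiary
    "GLOBEX":         750_000,  # unrelated recipient
}

def total_for(recipient_id, awards):
    """Sum awards to an entity and everything beneath it in the hierarchy."""
    return sum(
        amount for rid, amount in awards.items()
        if rid == recipient_id or rid.startswith(recipient_id + ".")
    )

print(total_for("ACME", awards))       # 1300000: the whole ACME family
print(total_for("ACME.AERO", awards))  # 300000: just the subsidiary tree
```

Without such identifiers, the $1.3 million flowing to the ACME family would appear as three unrelated recipients, which is exactly the kind of opacity Miller's testimony targets.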

Her remarks ultimately reflect the assessment that she made at last year’s Gov 2.0 Summit, where she made it clear that open government remains in beta. Our interview is below:

Tracking the progress of the Open Government Directive requires better data, more auditors and improved performance metrics. That said, this looks like the year when many of the projects at agencies will move forward towards implementation.

Last month, the U.S. moved forward into the pilot phase of an open source model for health data systems as the fruits of the Direct Project came to Minnesota and Rhode Island. The Direct Project allows for the secure transmission of health care data over a network. Some observers have dubbed it the Health Internet, and the technology has the potential to save government hundreds of millions of dollars, along with supporting the growth of new electronic health records systems.

Open source and open government have also come together to create OpenStack, an open cloud computing platform that’s a collaboration between NASA, Rackspace, Cisco and a growing group of partners.

It’s too early to judge the overall open government effort as ultimately a success or failure. That said, the administration clearly needs to do more. In 2011, the open question is whether “We the people” will use these new participatory platforms to help government work better.

Video of the hearing will be posted here when available. Testimony from today’s hearing is linked to PDFs below.

Dr. Danny Harris

Chris Smith

Jerry Brito

Ellen Miller

The Honorable Danny Werfel

Note: Video of the hearing was provided through the efforts of citizen archivist Carl Malamud at house.resource.org, the open government video website that he set up in collaboration with Speaker Boehner and Congressman Issa. While the open government efforts of the federal government have a long way to go, in this particular regard, a public-private collaboration is making the proceedings of the House Oversight committee available to the world online.

Clinton: There is no silver bullet in the struggle against Internet repression. There’s no “app” for that

Today in Washington, Secretary of State Clinton reiterated the State Department’s commitment to an Internet freedom policy in a speech at George Washington University. Rebecca MacKinnon, journalist, free speech activist, and expert on Chinese Internet censorship, provided some on-the-spot analysis immediately following Clinton’s words. MacKinnon made an interesting, and timely, point: there are limits to directly funding certain groups. “I think one of the reasons that the Egyptian and Tunisian revolutions were successful was that they were really home grown, grass roots. At the end of the day, the people in the countries concerned need to really want change and drive that change.”

MacKinnon parsed the considerable complexity of advocating for Internet freedom in the context of Wikileaks and electronic surveillance in other areas of the federal government. For those interested, she elaborated on the issues inherent in this nexus of government and technology in her Senate testimony last year. At some point this winter, there will be a hearing on “CALEA 2” in the United States Congress that’s going to be worth paying close attention to for anyone tracking Internet freedom closer to home, so to speak.

Should the U.S. support Internet freedom through technology, whether it’s an “app” or other means? To date, the State Department has allocated only $20 million of the total funding it has received from Congress, according to a report on Internet censorship from the Senate Foreign Relations Committee obtained by the AFP. (Hat tip to Nick Kristof on that one).

Clinton defended the slow rollout of funding today in her speech (emphasis added):

“The United States continues to help people in oppressive Internet environments get around filters, stay one step ahead of the censors, the hackers, and the thugs who beat them up or imprison them for what they say online. While the rights we seek to protect are clear, the various ways that these rights are violated are increasingly complex. Some have criticized us for not pouring funding into a single technology—but there is no silver bullet in the struggle against Internet repression. There’s no “app” for that. And accordingly, we are taking a comprehensive and innovative approach—one that matches our diplomacy with technology, secure distribution networks for tools, and direct support for those on the front lines.”

The caution in spending may well also be driven by the issues that the State Department encountered with Haystack, a much-celebrated Internet freedom tool that turned out to be closer to a fraud than a phenomenon.

There may be no silver bullet to deliver Internet freedom to the disconnected or filtered masses, per se, but there are more options beyond the Tor Project that people in repressive regimes can leverage. Today, MIT’s Technology Review reported on an app for dissidents that encrypts phone and text communications:

Two new applications for Android devices, called RedPhone and TextSecure, were released last week by Whisper Systems, a startup created by security researchers Moxie Marlinspike and Stuart Anderson. The apps are offered free of charge to users in Egypt, where protesters opposing ex-president Hosni Mubarak have clashed with police for weeks. The apps use end-to-end encryption and a private proxy server to obfuscate who is communicating with whom, and to secure the contents of messages or phone conversations. “We literally have been working night and day for the last two weeks to get an international server infrastructure set up,” says Anderson.
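The trust boundary these apps describe, where the relay server sees only ciphertext, can be illustrated with a toy one-time-pad in Python. To be clear, this is not the RedPhone/TextSecure protocol, which uses vetted ciphers and key agreement; the sketch only shows that a proxy relaying encrypted traffic learns nothing about message content:

```python
import secrets

# Toy illustration only: a one-time-pad XOR cipher. Real apps like RedPhone
# and TextSecure rely on vetted ciphers and key agreement, not this. The
# point is the trust boundary: the proxy relays ciphertext it cannot read.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the key; applying it twice recovers the original."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the square at noon"
key = secrets.token_bytes(len(message))  # known only to the two endpoints

ciphertext = xor_cipher(message, key)    # this is all the proxy ever sees
recovered = xor_cipher(ciphertext, key)  # the other endpoint decrypts

assert recovered == message
```

The hard engineering in the real apps is not this step but key exchange, metadata protection and the "international server infrastructure" Anderson describes.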

No word on whether they’ve received funding from State yet. For more on today’s speech, read the full report on the State Department’s Internet freedom policy at the Huffington Post, along with analysis from Ethan Zuckerman and the ever-sharp Nancy Scola on #NetFreedom, which does, in fact, now look like a “big deal.”

Malamud: add bulk open government data access to Thomas.gov

An image of (insert name here), taken at about 2:30 this afternoon. (Photo by Abby Brack/Library of Congress)

Open government advocate Carl Malamud made a succinct recommendation for improving the United States House of Representatives on January 24th: “Open it up. Bulk access, developer day, an API, long-term open source model. People’s house.” Malamud linked to a letter at House.Resource.org to Representative Eric Cantor (R-VA), the House Majority Leader, in which he made the case for making bulk data access to bills and corollary data available to the public online through Thomas.gov:

Access to bulk data, both for the core Thomas system and for corollary databases, would have a huge and immediate effect. Hosting a developer day and making sure stakeholders are part of the long-term development will help keep the next-generation system in tune with the needs of the Congress and of the public.
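Part of what bulk access buys is the ability to answer legislative questions locally, without screen-scraping Thomas.gov. A sketch of what that might look like, using an invented XML schema; Thomas offered no such bulk format at the time, which is precisely Malamud's point:

```python
import xml.etree.ElementTree as ET

# What bulk access would buy: once legislative data is downloadable as
# files, anyone can answer questions locally instead of screen-scraping.
# This XML schema is invented for illustration, not a real Thomas format.

BULK_XML = """
<bills>
  <bill number="H.R. 1" status="passed-house"/>
  <bill number="H.R. 2" status="introduced"/>
  <bill number="S. 365" status="enacted"/>
</bills>
"""

root = ET.fromstring(BULK_XML)
by_status = {}
for bill in root.iter("bill"):
    by_status.setdefault(bill.get("status"), []).append(bill.get("number"))

print(by_status)
# {'passed-house': ['H.R. 1'], 'introduced': ['H.R. 2'], 'enacted': ['S. 365']}
```

With the raw files in hand, the grouping above is a dozen lines of code; without them, it is a scraping project that breaks every time the site's HTML changes.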

As Malamud pointed out, long-term plans to improve public access to the law are evolving, including the announcement that the Cornell Law Library would redesign Thomas.gov’s legislative metadata models:

It’s finally official: The Library of Congress has selected us to work on a redesign of their legislative-metadata models. This sounds like really geeky stuff (and it is), but the effects for government and for citizens should be pretty big. What’s really being talked about here is (we hope) a great improvement not only in what can be retrieved from systems like THOMAS and LIS (the less-well-known internal system used by Congress itself), but also in what can be linked to and referenced. We’ll begin with a careful compilation of use cases, build functional requirements for what the data models should do, and go from there to think about prototype systems and datasets. The idea is to bring Semantic Web technology to bills, public laws, the US Code, Presidential documents, and a variety of other collections. Longtime LII friends and collaborators Diane Hillmann, John Joergensen and Rob Richards will be working with our regular team to create the new models and systems.
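The "Semantic Web technology" LII describes boils down to representing bills, public laws and code sections as linked subject-predicate-object triples that can be traversed. A minimal sketch in Python; the URIs and predicate names here are hypothetical, not the vocabulary LII would eventually build:

```python
# A minimal sketch of legislative metadata as linked triples. The URIs and
# predicate names are hypothetical, invented for illustration only.

triples = [
    ("bill:hr3590-111", "rdf:type",       "leg:Bill"),
    ("bill:hr3590-111", "leg:shortTitle", '"Patient Protection and Affordable Care Act"'),
    ("bill:hr3590-111", "leg:becameLaw",  "law:pl111-148"),
    ("law:pl111-148",   "leg:amendsCode", "usc:title42"),
]

def objects(subject, predicate, triples):
    """Follow one link: what does this subject point to via this predicate?"""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Traverse from a bill to the part of the US Code its public law touches.
law = objects("bill:hr3590-111", "leg:becameLaw", triples)[0]
print(objects(law, "leg:amendsCode", triples))  # ['usc:title42']
```

The payoff of this model is exactly what LII describes: documents become things that "can be linked to and referenced," so a bill, its public law and the code it amends form one navigable graph rather than three disconnected databases.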

Will the new GOP leadership take Malamud up on his proposal for an open developer day and bulk data? Stay tuned. As Nancy Scola wrote in techPresident, “Republicans in the House are making technology-enabled openness, transparency, and participation central to the public presentation of their core political values in a way that their Democratic counterparts never fully did.” Malamud has a track record that lends considerable credibility to his prospects: he helped to get the SEC online in 1993. More recently, “Washington’s IT guy” was able to work with the House leadership to start publishing hundreds of high-resolution videos from the House Oversight Committee hearings at House.Resource.org earlier this month.

If the new GOP leadership is serious about adopting the infrastructure to enable transparency and accountability in the House, perhaps adoption of open government data standards will be one of the enduring accomplishments of this 112th Congress.


House 2.0: Building out the House.gov platform with Drupal and social media

As I reported for the O’Reilly Radar yesterday, when the House chose Drupal as the preferred web content management system for House.gov, it made the “People’s House” one of the largest government institutions to move to the open source web content management platform.

The House.gov platform is moving to Drupal but House.gov itself is not on Drupal quite yet. That will probably happen in the next several months, according to Dan Weiser, communications director of the Office of the Chief Administrative Officer in the United States House of Representatives.

In the meantime, the incoming Congressmen and Congresswomen do appear to have adopted Drupal as the platform for their official websites. For instance, Congresswoman Colleen Hanabusa‘s site, below, uses one of several templates on the Drupal platform. Notably, each of the new sites includes default modules for the leaders in the respective verticals in the social media world: Flickr, YouTube, Twitter and Facebook.

Some questions remain about the cost and choices that representatives have as they choose their online Web presences. As NextGov reported today, while House websites can move to the open source platform, they don’t have to do so.

Given the context of citizens turning to the Internet for government information, data and services in increasing numbers, however, a well-designed Congressional website with clear connections to the various digital outposts has moved from a “nice to have” to a “must have” in the eyes of the digitally connected. (For citizens on the other side of the digital divide, the House switchboards are still available via phone call at (202) 224-3121 or TTY: (202) 225-1904).

If that’s a given, the question then is why Drupal is now the preferred web hosting environment for the House. On that count, “Drupal was chosen because it is open source and widely accepted, therefore allows Members to leverage a large community of programmers which gives them more choices and innovation,” wrote Weiser in an email. “It should also be noted that Members still will have the option to use other platforms.”

Weiser told NextGov that because Drupal developers are in every member’s district, “that hopefully means expanded choice and more innovation for our members.”

The current content management system limits the choice of site programmer as well as innovation, said Dan Weiser, communications director for the chief administrative officer, in an e-mail. Drupal, which uses a common framework and code that can be customized, will allow members to leverage a large community of programmers, providing more opportunities for innovation, he added.

The House expects to save some money with the transition to Drupal, since the chief administrative officer will manage the infrastructure and members pay vendors only for development time, Weiser said.

The inclusion of social media is also no longer a novelty in the beginning of 2011. “We expected there would be interest by the incoming freshmen to have social media on their sites; it just seemed natural to offer the option,” wrote Weiser.

[Disclosure: One of the vendors involved in the House’s Drupal effort is Acquia. O’Reilly AlphaTech Ventures is an investor in Acquia.]

Clay Johnson on key trends for Gov 2.0 and open government in 2011

As dozens of freshmen Representatives move into their second week of work as legislators here in the District of Columbia, they’re going to come up against a key truth that White House officials have long since discovered since the heady …

House 2.0: Livestreams of special session on Tucson Shooting on Facebook, CSPAN.org

Today, C-SPAN’s Facebook page will host streaming video coverage of Wednesday’s special U.S. House session on the Tucson shootings. The livestream will start at 10 AM ET, when the House will consider a resolution on the shootings. The session is also …

Themes to watch in 2011: E-democracy in Brazil

As Nat Torkington put it this morning at O’Reilly Radar, “people who consider tech trends without considering social trends are betting on the atom bomb without considering the Summer of Love.” Torkington was annotating a link to 2011 predictions and prognostications at venture capitalist Fred Wilson’s blog which center on the following presentation that Paul Kedrosky sent him from JWT, a marketing agency.

JWT’s thirteenth prediction will be of particular interest to readers of this blog: “Brazil as E-Leader.”

This digitally savvy, economically vibrant country will prove to be an e-leader. Social media is more popular here than in developed markets, and Brazil has the highest Twitter penetration (23 percent, as of October ComScore figures). PC penetration has reached 32 percent, and many Internet cafes further broaden access. Mobile subscriptions have 86% penetration. Already Brazil is ahead in electronic democracy (with innovations like online town halls and crowd-sourced legislative consulting), and its 2010 census was paperless, conducted electronically.


There are many other themes that will matter to the Gov 2.0 world in 2011 in there, including smart infrastructure investment, scanning everything, home energy monitors, and mHealth. Heck, seemingly mobile everything. Of course, as Mike Loukides pointed out in his own watchlist of 2011 themes to track, “you don’t get any points for predicting ‘Mobile is going to be big in 2011.'” He thinks that Hadoop, real-time data, the rise of the GPU, the return of P2P, social ubiquity and a new definition for privacy will all play important roles in 2011. Good bets.

JWT does get points for this set of trends, however, and that prediction about e-democracy in Brazil strikes me as apt. Last year at the International Open Government Data Conference, I met Cristiano Ferri Faria, project manager in e-democracy and legislative intelligence at the Brazilian House of Representatives. Faria talked about his work on e-Democracia, a major electronic lawmaking program in Brazil since 2008. As the 112th United States House of Representatives goes back to work today, there are definitely a few things its legislators, aides and staffers might learn from far south of the border. You can download his presentation as a PDF from Data.gov or view it below, with an added bonus: reflections on open government data in New Zealand and Australia.

One caution: Faria concluded that “this kind of practice is too complex” and that e-Democracia “needs a long-term approach.”

Looks like they’re still in e-government beta down there, too.
