FCC.gov 2.0 Preview: FCC launches FCC.us URL shortener

FCC Data Center

Later this week, a new version of FCC.gov will go live. It’s a complete redesign of the Federal Communications Commission’s online presence. You could even call it a reboot, in keeping with the FCC’s launch of reboot.gov last January.

There’s much more to report on when the new FCC.gov goes online. For now, here’s a preview of something nifty that’s already live: the new FCC custom URL shortener, FCC.us.

The new custom URL shortener is based upon bit.ly, like the 1.usa.gov URL shortener for civilian use. It automatically shortens any FCC.gov URL that’s shortened using bit.ly or the shorter j.mp. For instance, FCC.gov/developer becomes http://fcc.us/bkJYlG. In a new media world that is often shortened to 140 characters, that’s rather handy.
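For developers curious about how a script might use the underlying bit.ly v3 API to request a short link, here’s a minimal sketch. The `login` and `apiKey` values are placeholders for your own bit.ly credentials, and the JSON response shape assumed here is bit.ly’s standard v3 envelope; check bit.ly’s API documentation before relying on it.

```python
# Sketch: shortening an FCC.gov URL through the bit.ly v3 API that the
# fcc.us shortener builds on. "mylogin" and "MY_KEY" are placeholders.
import json
import urllib.parse
import urllib.request

API_ENDPOINT = "https://api-ssl.bitly.com/v3/shorten"

def build_shorten_request(long_url, login, api_key):
    """Construct the GET request URL for bit.ly's v3 /shorten endpoint."""
    params = urllib.parse.urlencode({
        "login": login,
        "apiKey": api_key,
        "longUrl": long_url,
        "format": "json",
    })
    return "%s?%s" % (API_ENDPOINT, params)

def shorten(long_url, login, api_key):
    """Call bit.ly and return the shortened URL from the JSON response."""
    with urllib.request.urlopen(
            build_shorten_request(long_url, login, api_key)) as resp:
        data = json.load(resp)
    # Assumed response envelope: {"status_code": 200, "data": {"url": ...}}
    return data["data"]["url"]

# Build (but don't send) a request for the developer page mentioned above.
request_url = build_shorten_request(
    "http://www.fcc.gov/developer", "mylogin", "MY_KEY")
print(request_url)
```

Because the fcc.us domain is applied automatically to FCC.gov links, the same call that would normally return a bit.ly or j.mp link yields an fcc.us one.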

More to come soon.

Multiple federal open data initiatives at risk under budget cuts

Earlier today, Virginia Carlson, president of the Metro Chicago Information Center (MCIC), commented extensively upon proposed deep Congressional cuts to funding for open government data platforms. Carlson provided more context for other federal open data initiatives that may also be cut. Her thoughts are shared below as a guest post. -Editor

Recent news that data transparency initiatives at the federal level are set to be shut down is coupled with an attack on long-standing federal data programs that produce critical economic and demographic data.

In March 2011, H.R. 931 was introduced to make participation in the American Community Survey voluntary by removing the legal penalty for not responding to the survey. Without compulsory participation, the ACS likely would not capture the broad swath of the American populace it needs to (such as citizens in towns and rural counties) and would become inaccurate and thus irrelevant. Congress relies on ACS data to guide the distribution of $485 billion annually in federal grants to states and localities. Already cash-strapped state and local governments would be hindered in their ability to efficiently target tax dollars in public investments such as roads, schools and health clinics. Private sector investments that rely on economic and demographic profiles of people in places (real estate and media industries, for example) would also suffer.

At the same time, the Census Bureau budget for Fiscal Year 2012 submitted to Congress proposes to terminate six programs for a total of $10.3 million, about 1 percent of the Census Bureau budget. Among those items on the chopping block are online and print versions of the U.S. Statistical Abstract, State and Metropolitan Area Data Book, Population Change in Central and Outlying Counties of Metropolitan Statistical Areas, and the Consolidated Federal Funds Report.

What does this apparent diminishing commitment to federal data leadership mean for our future ability to make good policy, prioritize public investments, and compete globally? One scenario is that we turn to other, perhaps less democratic and more expensive, sources: internet-generated data (social apps, web scrapes), business-gathered data (market research firms) or harnessing administrative data (from driver’s license files, Medicare records, etc.). Who then will be counted? How do we ensure privacy?

Congress weighs deep cuts to funding for federal open government data platforms

Several core pillars of the federal open government initiatives brought online by the Obama administration may be shuttered by proposed Congressional budget cuts. Data.gov, USASpending.gov, the IT Dashboard, and five other websites that offer platforms for open government transparency are facing imminent closure. A comprehensive report filed by Jason Miller, executive editor of Federal News Radio, confirmed that the United States Office of Management and Budget is planning to take open government websites offline over the next four months because of a 94% reduction in federal funding in the Congressional budget. Daniel Schuman of the Sunlight Foundation first reported the cuts to the budget for data transparency. Schuman talked to Federal News Radio about the potential end of these transparency platforms this week.

Cutting these funds would also shut down the FedSpace federal social network and, notably, the FedRAMP cloud computing cybersecurity program. Unsurprisingly, open government advocates at the Sunlight Foundation and in the larger community have strongly opposed these cuts.

As Nancy Scola reported for techPresident, Donny Shaw put the proposal to defund open government data in perspective at OpenCongress: “The value of data openness in government cannot be overestimated, and for the cost of just one-third of one day of missile attacks in Libya, we can keep these initiatives alive and developing for another year.”

Daniel Schuman was clear about the value of data transparency funding at the Sunlight Foundation blog:

The returns from these e-government initiatives in terms of transparency are priceless. They will help the government operate more effectively and efficiently, thereby saving taxpayer money and aiding oversight. Although we have significant issues with some of these programs’ data quality, and we are concerned that the government may be paying too much for the technology, there should be no doubt that we need the transparency they enable. For example, fully realized transparency would allow us to track every expense and truly understand how money — like that in the electronic government fund — flows to federal programs. Government spending and performance data must be available online, in real time, and in machine readable formats.

There is no question that the Obama administration has come under heavy criticism for the quality of its transparency efforts from watchdogs, political opponents and media. OMB Watch found progress on open government in a recent report but cautioned that there’s a long road ahead. It is clear that we are in open government’s beta period. The transparency that Obama promised has not been delivered, as Charles Ornstein, a senior reporter at ProPublica, and Hagit Limor, president of the Society of Professional Journalists, wrote today in the Washington Post. There are real data quality and cultural issues that need to be addressed to match the rhetoric of the past three years. “Government transparency is not the same as data that can be called via an API,” said Virginia Carlson, president of the Metro Chicago Information Center. “I think the New Tech world forgets that — open data is a political process first and foremost, and a technology problem second.”

Carlson highlighted how some approaches taken in establishing Data.gov have detracted from success of that platform:

First, no distinction was made between making transparent operational data about how the government works (e.g., EPA cleanup sites; Medicaid records) and making statistical data more useful (data re: economy and population developed by the major Federal Statistical Agencies). So no clear priorities were set regarding whether it was an initiative meant to foster innovation (which would emphasize operational data) or whether it was an initiative meant to open data dissemination lines for agencies that had already been in the business of dissemination (Census, BLS, etc.), which would have suggested an emphasis on developing API platforms on top of current dissemination tools like American Fact Finder or DataFerrett.

Instead, a mandate came from above that each agency or program was responsible for putting X numbers of data sets on data.gov, with no distinction made as to source or usefulness. Thus you have weird things like cutting up geo files into many sub-files so that the total number of files on data.gov is higher.

The federal statistical agencies have been disseminating data for tens of decades. They felt that the data.gov initiative rolled right over them, for the most part, and there was a definite feeling that the data.gov people didn’t “get it” from the FSA perspective – who are these upstarts coming in to tell us how to release data, when they don’t understand how the FSAs function, how to deal with messy statistical data that have a provenance, etc. An open data session at the last APDU conference saw the beginnings of a conversation between data.gov folks and the APDU folks (who tend to be attached to the major statistical agencies), but there is a long way to go.

Second, individuals in bureaucracies are risk-averse. The political winds might be blowing toward openness now, but executives come and go while those in the trenches stay (or would like to). Thus the tendency was to find data that was relatively low-risk. Agencies literally culled their catalogs to find the least controversial data that could be released.

Neither technical nor cultural change will happen with the celerity that many would like, given the realities imposed by the pace of institutional change. “Lots of folks in the open government space are losing their patience for this kind of thing, having grown accustomed to startups that move at internet speed,” said Tom Lee, director of Sunlight Labs. “But USAspending.gov really can be a vehicle for making smarter decisions about federal spending.”

“Obviously the data quality isn’t there yet. But you know what? OMB is taking steps to improve it, because the public was able to identify the problems. We’re never going to realize the incredible potential of these sites if we shutter them now. A house staffer, or journalist, or citizen ought to be able to figure out the shape of spending around an issue by going to these sites. This is an achievable goal! Right now they still turn to ad-hoc analyses by GAO or CRS — which, incidentally, pull from the same flawed data. But we really can automate that process and put the power of those analyses into everyone’s hands.”

Potential rollbacks to government transparency, seen in that context, are detrimental to all American citizens, not just those who support one party or the other, or, for that matter, none at all. As Rebecca Sweger writes at the National Priorities Project, “although $32 million may sound like a vast sum of money, it is actually .0009% of the proposed Federal FY11 budget. A percentage that small does not represent a true cost-saving initiative–it represents an effort to use the budget and the economic crisis to promote policy change.”

Lee also pointed to the importance of TechStat to open government; TechStat figured in the White House’s release of the IT Dashboard as open source yesterday. “TechStat is one of the most concrete arguments for why cutting the e-government fund would be a huge mistake,” he said. “The TechStat process is credited with billions of dollars of savings. Clearly, Vivek [Kundra, the federal CIO] considers the IT Dashboard to be a key part of that process. For that reason alone cutting the e-gov fund seems to me to be incredibly foolish. You might also consider the fact pointed out by NPP: that the entire e-gov budget is a mere 7.7% of the government’s FOIA costs.”

“In other words, it costs far more to release the information by the current means. This is the heart of the case for data.gov and data transparency in general: to get useful information into the hands of more people, at a lower cost than the alternatives,” said Lee. Writing on the Sunlight Labs blog, Lee emphasized today that “cutting the e-gov funding would be a disaster.”

The E-Government Act of 2002 that supports modern open government platforms was originally passed with strong bipartisan support, long before the current president was elected. Across the Atlantic, the British parallel to Data.gov, Data.gov.uk, continues under a Conservative prime minister. Open government data can be used not just to create greater accountability, but also economic value. That point was made emphatically last week, when former White House deputy chief technology officer Beth Noveck made her position clear on the matter: cutting e-government funding threatens American jobs:

These are the tools that make openness real in practice. Without them, transparency becomes merely a toothless slogan. There is a reason why fourteen other countries whose governments are left- and right-wing are copying data.gov. Beyond the democratic benefits of facilitating public scrutiny and improving lives, open data of the kind enabled by USASpending and Data.gov save money, create jobs and promote effective and efficient government.

Noveck also referred to the Economist‘s support for open government data: “Public access to government figures is certain to release economic value and encourage entrepreneurship. That has already happened with weather data and with America’s GPS satellite-navigation system that was opened for full commercial use a decade ago. And many firms make a good living out of searching for or repackaging patent filings.”

The open data story in healthcare continues to be particularly compelling, from new mobile apps that spur better health decisions to data spurring changes in care at the Veterans Administration. Proposed cuts to weather data collection could, however, subtract from that success.

As Clive Thompson reported at Wired this week, public sector data can help fuel jobs: “shoving more public data into the commons could kick-start billions in economic activity.” Thompson focuses on the story of Brightscope, where government data drives the innovation economy. “That’s because all that information becomes incredibly valuable in the hands of clever entrepreneurs,” wrote Thompson. “Pick any area of public life and you can imagine dozens of startups fueled by public data. I bet millions of parents would shell out a few bucks for an app that cleverly parsed school ratings, teacher news, test results, and the like.”

Lee doesn’t entirely embrace this view but makes a strong case for the real value that does persist in open data. “Profits are driven toward zero in a perfectly competitive market,” he said.

Government data is available to all, which makes it a poor foundation for building competitive advantage. It’s not a natural breeding ground for lucrative businesses (though it can certainly offer a cheap way for businesses to improve the value of their services). Besides, the most valuable datasets were sniffed out by business years before data.gov had ever been imagined. But that doesn’t mean that there isn’t huge value that can be realized in terms of consumer surplus (cheaper maps! free weather forecasts! information about which drug in a class is the most effective for the money!) or through the enactment of better policy as previously difficult-to-access data becomes a natural part of policymakers’ and researchers’ lives.

To be clear, open data and the open government movement will not go away for lack of funding. Government data sets online will persist if Data.gov goes offline. As Samantha Power wrote at the White House last month, transparency has gone global. Open government may improve through FOIA reform. The technology that will make government work better will persist in other budgets, even if the e-government budget is cut to the bone.

There are a growing number of strong advocates who are coming forward to support the release of open government data through funding e-government. My publisher, Tim O’Reilly, offered additional perspective today as well. “Killing open data sites rather than fixing them is like Microsoft killing Windows 1.0 and giving up on GUIs rather than keeping at it,” said O’Reilly. “Open data is the future. The private sector is all about building APIs. Government will be left behind if they don’t understand that this is how computer systems work now.”

As Schuman highlighted at SunlightFoundation.com, the creator of the World Wide Web, Sir Tim Berners-Lee, has been encouraging his followers on Twitter to sign the Sunlight Foundation’s open letter to Congress asking elected officials to save the data.

What happens next is in the hands of Congress. A congressional source who spoke on condition of anonymity said that they are aware of the issues raised with cuts to e-government funding and are working on preserving core elements of these programs. Concerned citizens can contact the office of the House Majority Leader, Representative Eric Cantor (R-VA) (@GOPLeader), at 202.225.4000.

UPDATE: The Sunlight Foundation’s Daniel Schuman, who is continuing to track this closely, wrote yesterday that, under the latest continuing resolution under consideration, funding for the E-Government Fund would be back up in the tens of millions range. Hat tip to Nancy Scola.

UPDATE II: Final funding under FY 2011 budget will be $8M. Next step: figuring out the way forward for open government data.

2011 NASA Open Source Summit convenes innovators and technologists

Today in California, NASA is hosting its first Open Source Summit at Ames Research Center in Mountain View. Engineers and policy makers from across NASA are meeting with members of the open source community to discuss the challenges of open source policy. You can watch the open source livestream here, and here’s the agenda. The liveblog is below.

Virtual attendees connected to the morning phone conversations via MaestroConference and collaboratively took notes online at the Ideation Forum.

In the afternoon, the NASA Open Source Summit turned to breakout groups with discussions driven by the online conversation. Photo by NASA’s Chris @Gerty:

Presentations are also going up over at Slideshare. Here are a few great examples:

Disaster 2.0: UN OCHA releases report on future of information sharing in crisis

The emergence of crisiscamps and subsequent maturation of CrisisCommons into a platform for civic engagement were important developments in 2010. Hearing digital cries for help has never been more important. A year after the devastating earthquake in Haiti, a new report by a team at the Harvard Humanitarian Initiative analyzes how the humanitarian, emerging volunteer and technical communities collaborated in the aftermath of the quake. The report recommends ways to improve coordination between these groups in future emergencies, making five specific recommendations to address the considerable challenges inherent in coordinating crisis response:

  1. A neutral forum to surface areas of conflict or agreement between the volunteer/technical community and established humanitarian institutions
  2. A space for innovation where new tools and approaches can be experimented with before a crisis hits
  3. A deployable field team with the mandate to use the best practices and tools established by the community
  4. A research and development group to evaluate the effectiveness of tools and practices
  5. An operational interface that identifies procedures for collaboration before and during crises, including data standards for communication

“Disaster Relief 2.0: The Future of Information Sharing in Humanitarian Emergencies” was commissioned by the United Nations Foundation and Vodafone Foundation Technology Partnership in collaboration with the UN Office for the Coordination of Humanitarian Affairs (OCHA). You can find more discussion of the report in a series of posts on disaster relief 2.0 at UNDispatch.com, like this observation from Jen Ziemke:

…a substantial majority of members on the Crisis Mappers Network have held positions in formal disaster response, some for several decades. Volunteers in groups like the Standby Task Force include seasoned practitioners with the UNDP or UN Global Pulse. But what is really needed is a fundamental rethinking of who constitutes the “we” of disaster response, as well as dispensing with current conceptions of: “volunteers”, “crowds,” and “experts.” While distinctions can be endlessly debated, as humans, we are far more the same than we are different.

Whether it’s leveraging social media in a time of need or geospatial mapping, technology empowers us to help one another more than ever. This report offers needed insight about how to do it better.

A movement to spur innovation and participation in government

This past weekend, Syracuse MPA grad student Pat Fiorenza spoke about Gov 2.0 at the We Live NY Conference in upstate New York. In a wrap-up posted after the conference, Fiorenza touched on what people think about when they hear “Gov 2.0.”

Fiorenza’s recap of his Gov 2.0 presentation also describes both why the idea is important to him and why it’s important to people who aren’t developers.

“Gov 2.0 extends beyond a great programmer – I’ve noticed that when I talk to some people about Gov 2.0 they immediately associate me as a geeky-computer programming-MPA student (only 2 of the 3!). I’ve developed a passion for Gov 2.0 because it holds so much potential for government. It’s about getting access to data and information immediately, improving constituent services, crowd sourcing information, and empowering citizens. Gov 2.0 requires someone to identify an existing problem and conceptualize a solution – then someone to run with the idea and develop the program, with a lot of collaboration in between.”

Fiorenza also pointed the way to Remy DeCausemaker (@remy_d), a “resident hacktivist and storyteller” at the Rochester Institute of Technology’s Lab for Technological Literacy, who also presented on Gov 2.0 at the conference.

DeCausemaker works on FOSS at RIT and CIVX, an open source public information system for raw data. His presentation (PDF) on open government and open data will be of interest to many people in the Gov 2.0 community.

Todd Park on unleashing the power of open data to improve health

What if open health data were harnessed to spur better healthcare decisions and catalyze the extension or creation of new businesses? That potential future is already taking shape. Todd Park, chief technology officer of the Department of Health and Human Services, has been working to unlock innovation through open health data for over a year now. On many levels, the effort is the best story in federal open data. Park tells it himself in the video below, recorded yesterday at the Mutter Museum in Philadelphia.

Over at e-patients.net, Pew Internet researcher Susannah Fox asked how community organizations can tap into the health data and development trend that Park has been working hard to ignite. She shared several resources (including a few from this correspondent) and highlighted the teams who competed in a health developer challenge tour that culminated at the recent Health 2.0 conference.

Check out this article about HealthData.gov, including footage of Park talking about the “health data ecosystem” at the code-a-thon (the video also features local health hacker Alan Viars, seated at the right).

Here are three blog posts about last year’s event, including mine:

Making Health Data Sing (Even If It’s A Familiar Song)

Community Health Data Initiative: vast amounts of health data, freed for innovators to mash up!

Making community health information as useful as weather data: Open health data from Health and Human Services is driving more than 20 new apps.

The next big event in this space is on June 9 at the NIH. If you’re interested in what’s next for open health data, track this event closely.

Samantha Power: Transparency has gone global

Innovations in democratic governance have been and likely always will be a global phenomenon. Samantha Power, senior director and special assistant for multilateral affairs and human rights at the White House, highlighted the ways in which platforms and initiatives for transparency in other countries are growing on the White House blog yesterday.

While “Sunshine Week” may be an American invention, the momentum for greater transparency and accountability in government is a global phenomenon. In countries around the world, governments and civil society groups are taking new and creative steps to ensure that government delivers for citizens and to strengthen democratic accountability.

From Kenya to Brazil to France to Australia, new laws and platforms are giving citizens new means to ask for, demand or simply create greater government transparency. As Power observed, open government is taking root in India, where the passage of India’s Right to Information Act and new digital platforms have the potential to change the dynamic between citizens and the immense bureaucracy.

Power listed a series of global transparency efforts, often empowered by technology, that serve as other useful examples of “innovations in democratic governance” on every continent:

  • El Salvador and Liberia recently passed progressive freedom of information laws, joining more than 80 countries with legislation in place, up from only 13 in 1990;
  • A few weeks ago in Paris, six new countries from Europe, Africa, Central Asia, and the Middle East met the high standards of the Extractive Industries Transparency Initiative (EITI), empowering citizens with unprecedented information about payments made for the extraction of natural resources;
  • Brazil and South Africa are pioneering innovative tools to promote budget transparency and foster citizen engagement in budget decision-making, along with tens of other countries that are making budget proposals and processes open to public input and scrutiny;
  • Civil society groups are developing mechanisms to enable citizens to keep track of what happens in legislatures and parliaments, including impressive web portals such as votainteligente.cl in Chile and mzalendo.com in Kenya; and
  • Experiments in citizen engagement in Tanzania, Indonesia, and the Philippines are demonstrating that citizen efforts to monitor the disbursement of government funds for education, health, and other basic services actually decrease the likelihood of corruption and drive better performance in service delivery.

There’s a long road ahead for open government here in the United States. While improving collaboration and transparency through open government will continue to be difficult nuts to crack, it looks like “Uncle Sam” could stand to learn a thing or two from the efforts and successes of other countries on transparency. Addressing FOIA reform and better mobile access to information are two places to start.

For more on how open government can have a global impact, click on over to this exclusive interview with Samantha Power on national security, transparency and open government.

National Archives hosts Open Government R&D Summit

Whether the White House can foster innovation through open government is up for debate. Last December, the President’s Council of Advisers on Science and Technology (PCAST) emphasized the importance of establishing an R&D agenda for open government in a report.

This week in Washington, D.C., the National Archives is hosting an Open Government Research and Development Summit. Collaborative innovation in open government is a notion that goes back to Thomas Jefferson. Whether open models for science can lead to better outcomes in research in the 21st Century is the question of the day. You can follow the liveblog of the event below.

Day 2 Liveblog

Day 1 Liveblog

For more details, here are the organizing notes:

The summit will set the foundation for a robust R&D agenda that ensures the benefits of open government are widely realized, with emphasis on how open government can spur economic growth and improve the lives of everyday Americans. This will be the first opportunity for researchers, scholars, and open government professionals to begin a discussion that will continue at academic centers throughout the country over the next few years.

Government innovators will talk about openness in the context of education, health, and economic policy, and international open government. Speakers include Aneesh Chopra, U.S. Chief Technology Officer, Todd Park, Chief Technology Officer of the U.S. Department of Health and Human Services (HHS), and David Ferriero, Archivist of the United States.

Panels made up of scholars, activists, and present and former policymakers will then discuss the important research questions that the field must grapple with in order to ensure lasting success in the open government space, such as how to safely release data without creating mosaic effects. Panelists include Jim Hendler (Rensselaer Polytechnic Institute), Noshir Contractor (Northwestern University), Archon Fung (Harvard University), Chris Vein (U.S. Deputy Chief Technology Officer), Beth Noveck (New York Law School), and Susan Crawford (Yeshiva University).

The National Archives and Records Administration (NARA) and Networking and Information Technology Research and Development (NITRD) are hosting this summit, with support from the MacArthur Foundation.

Workshop agenda: click here
Participant Information Packet: click here

Video of federal chief technology officer Aneesh Chopra is embedded below. [Editor’s Note: Apparently the iPhone 4’s accelerometer didn’t register the rotation, so the video never shifted to landscape mode. Apologies to viewers, who may find this one better to listen to, unless you prefer to put the laptop or screen on edge.]

Part II:

Vint Cerf talks to the CFR about Internet freedom and foreign policy

In a new video interview from the Council on Foreign Relations (CFR), Google’s Internet evangelist, Vint Cerf, talks with CFR’s Hagit Bachrach about the future of the Internet and what that could mean for international development and foreign policy. He spoke about the importance of an “Internet without borders” last year.

Earlier in the month, Cerf spoke with USAID’s Alex O. Dehgan about technology as a tool for foreign policy, discussing the ability of science and information technology to connect political leaders, diplomats and innovators around the globe.

Last year, Cerf made it clear that he believed that governments shouldn’t control the Web, at least with respect to the governance of ICANN, the organization that has responsibility for the Internet domain system. In the wake of the Internet shutdown in Egypt and ongoing online censorship around the globe, that perspective has gained more prominence.