San Francisco experiments with citizensourcing better ideas

As significant as the revisions to San Francisco’s open data policy may prove to be, city officials and civic startups alike emphasize that its people are fundamental to sustained improvements in governance and city life.

“Open data would not exist without our community,” said Jay Nath, the city’s first chief innovation officer, this Monday at the Hatchery.

San Francisco’s approach to open innovation in the public sector — what businesses might describe as crowdsourcing, you might think of as citizensourcing for cities — involves a digital mix of hackathons, public engagement and a renewed focus on the city’s dynamic tech community, including the San Francisco Citizens Initiative for Technology and Innovation, or SF.citi.

Cities have been asking their residents how government could work better for some time, of course — and residents have been telling city governments how they could work better for much longer than that. New technologies, however, have created new horizons for participatory platforms to engage citizens, including mobile apps and social media.

Open data and civic coders also represent a “new class of civic engagement focused on solving issues, not just sharing problems,” argues Nath. “We have dozens and dozens of apps in San Francisco. I think it’s such a rich community. We haven’t awarded prizes. It’s really about sustainability and creating community. We’ve held six or seven events and more than 10,000 hours of civic engagement.”

San Francisco’s dedicated citizensourcing platform is called “ImproveSF.” The initiative had its genesis as an internal effort to let employees make government better, said Jon Walton, the city’s chief information officer. The ideas that come out of both the internal and public efforts, he said, are typically about budget savings.

The explosion of social media in the past few years has created a challenge that San Francisco officials haven’t fully surmounted yet: how to take public comments digitally on Facebook or Twitter.

“We don’t try to answer and have end-to-end dialog,” said Jon Walton, San Francisco’s CIO, in an interview earlier this year. Part of that choice is driven by the city’s staffing constraints.

“What’s important is that we store, archive and make comments available to policy makers so that they can see what the public input is,” he said.

Many priorities are generated by citizen ideas submitted digitally, emphasized Walton; those ideas can then be put on a ballot, voted on by residents and become policy by public mandate.

“How do you get a more robust conversation going on with the public?” asked Walton. “In local government, what we’re trying to do is form better decisions on where we spend time and money. That means learning about other ideas and facilitating conversations.”

He pointed to the deployment of free public Wi-Fi this year as an example of how online public comments can help shape city decisions. “We had limited funds for the project,” he said. “Just $80,000. What can you do with that?”

Walton said that one of the first things they thought about doing was putting up a website to ask the public to suggest where the hotspots should be.

The city is taking that feedback into account as it plans future Wi-Fi deployments, publishing a map that marks completed sites with green dots and sites in progress with blue dots.

Walton said they’re working with the mayor’s office to make the next generation of ImproveSF more public-facing.

“How do we take the same idea and expose it to the public?” he asked. “Any new ‘town hall’ should really involve the public in asking what the business of government should be. Where should sacrifices and investments be made? There’s so much energy around the annual ballot process. People haven’t really talked about expanding that. The thing that we’re focusing on is to make decision-making more interactive.”

At least some of San Francisco’s focus has gone into mobile development.

“If you look at the new social media app, we’re answering the question of ‘how do we make public meetings available to people on handhelds and tablets’?” said Walton.

“The next generation will focus on how do they not just watch a meeting but see it live, text in questions and have a dialog with policy makers about priorities, live, instead of coming in in person.”

Jay Nath on how San Francisco is working to get its Gov 2.0 groove back

Back in January, Govfresh founder Luke Fretwell wrote about how San Francisco can “get its Gov 2.0 groove back,” offering six recommendations for the city government to use technology better.

[Image Credit: Fog City Journal]

When asked for comment, San Francisco chief innovation officer Jay Nath (@Jay_Nath) responded to Fretwell’s suggestions via email. While I’ll be sharing more from Nath and SF CIO Jon Walton over at the O’Reilly Radar civic innovation channel, in the meantime I’m publishing his specific responses to those recommendations below.

Build the best mayoral website in the world

Nath: We can always improve how we communicate with our constituents. If we were to undertake an effort to redesign the Mayor’s site, we should take a holistic approach and not just focus on the Mayor’s site. The approach NYC took to invite their design community is one that I think is very smart and something that SF should consider.

Use “Built in SF” technology

Nath: We agree and launched our City Hall iZone concept, where we pilot great local technologies and services. We frequently meet with great companies like Square, Twitter, Uber and Yammer, and invite each of them to work with the City. Specifically, we’re actively exploring Yammer, Zendesk, Get Satisfaction, Cozybit and 802.11s mesh, Google+ hangouts, and others. Additionally, we’re already using local tech like WordPress (which powers our innovation site), Twitter via the Open311 API, and Instagram.
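Nath’s mention of the Open311 API gives a sense of what “working with the City” looks like in code. As a rough sketch, here is how a client might build an Open311 GeoReport v2 query and pick out open service requests. The endpoint URL and sample records are invented for illustration; the field names follow the GeoReport v2 specification.

```python
import json
from urllib.parse import urlencode

# Hypothetical Open311 GeoReport v2 endpoint -- real deployments
# publish their own base URLs.
BASE_URL = "https://open311.example.org/v2"

def requests_url(jurisdiction_id, status=None):
    """Build a GeoReport v2 'requests' query URL."""
    params = {"jurisdiction_id": jurisdiction_id}
    if status:
        params["status"] = status
    return f"{BASE_URL}/requests.json?{urlencode(params)}"

# A sample response in the GeoReport v2 shape (abridged, invented data).
SAMPLE = json.loads("""[
  {"service_request_id": "638344", "service_name": "Sidewalk and Curb Issues",
   "status": "open", "lat": 37.76, "long": -122.42},
  {"service_request_id": "638349", "service_name": "Graffiti Removal",
   "status": "closed", "lat": 37.78, "long": -122.41}
]""")

def open_requests(service_requests):
    """Return the IDs of requests still marked open."""
    return [r["service_request_id"] for r in service_requests
            if r["status"] == "open"]

print(requests_url("sfgov.org", status="open"))
print(open_requests(SAMPLE))
```

The same read-only pattern is what lets third-party apps report and track 311 issues without any special arrangement with city hall.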

Go back to the (data) fundamentals

Nath: We have an open data roadmap to strengthen our leadership in this area. It’s in our 2012 innovation portfolio as well. Our goal is to structurally change how we share data so that our default position is one of sharing. One idea is to require that all purchased software that stores structured data have a public API. As we secure staffing for this effort, we will invite the community to help us shape the final form and execute.
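To make the “public API” requirement concrete, here is a minimal sketch of what exposing stored structured data might look like: a routing function that serves records as JSON. The facility records and routes are invented for the example; any real procurement requirement would specify its own schema.

```python
import json

# Invented structured data of the kind city software might store.
RECORDS = [
    {"id": 1, "name": "Mission Playground", "type": "park"},
    {"id": 2, "name": "Noe Valley Library", "type": "library"},
]

def api_response(path):
    """Route a request path to (status, JSON body), like a minimal public API."""
    if path == "/api/facilities":
        return 200, json.dumps(RECORDS)
    if path.startswith("/api/facilities/"):
        wanted = path.rsplit("/", 1)[-1]
        for record in RECORDS:
            if str(record["id"]) == wanted:
                return 200, json.dumps(record)
        return 404, json.dumps({"error": "not found"})
    return 404, json.dumps({"error": "unknown route"})

status, body = api_response("/api/facilities/2")
print(status, body)
```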

Leverage the civic surplus

Nath: I would argue that we’ve done a great job in this area. Last summer, we partnered with Gray Area Foundation for the Arts (GAFFTA) to produce the “Summer of Smart.” This series of hackathons drew 500 participants, produced over 20 prototypes and generated 10,000 hours of civic engagement. We’ve continued our efforts this year with the City’s first unhackathon, around taxi dispatch and real-time mass communication. Our Mayor and transit director both attended the event and thanked our community for their efforts to make SF a better city.

Additionally, we launched our citizen engagement platform, ImproveSF, in a very big way in April.

Open source the infrastructure

Nath: While we can do more to increase open source software adoption, I want to recognize our efforts to date:

  • open source policy
  • SFPark Android/iPhone app
  • Enterprise Addressing System
  • SmartPDF
  • LAMP as an option for internal customers
  • Pligg (DataSF)
  • Several Drupal applications

Additionally, the idea of moving our City from the existing CMS (Vision) to WordPress is not just about open source technology. We, as a City, made the decision to utilize Vision CMS a couple of years ago, and the switching costs to migrate to WordPress currently outweigh the benefits. I will encourage the City to strongly consider WordPress, Drupal and other options when Vision no longer meets our needs.

Give citizens a dashboard

Nath: This is more than just adopting the IT Dashboard. We have to implement the governance and project management model to ensure that the data is accurate. This is something we need to do but requires time and culture change. I agree that we need to increase access to high value datasets like expenditures. This is part of our open data roadmap and will receive renewed focus in 2012.

Kundra: Closing the IT gap is the key to making government work better for the American people

Today, the first chief information officer of the United States, Vivek Kundra, shared his reflections on public service.

Kundra, whose last day of work at the White House Office of Management and Budget was last Friday, is now at the Harvard Kennedy School and Berkman Center.

I arrived at a White House that was, as the Washington Post put it, “stuck” in the “Dark Ages of technology.” In their words, “If the Obama campaign represented a sleek, new iPhone kind of future, the first day of the Obama administration looked more like the rotary-dial past.”

As my team congratulated me on the new job, they handed me a stack of documents with $27 billion worth of technology projects that were years behind schedule and millions of dollars over budget. At the time, those documents were what passed for real-time updates on the performance of IT projects. My neighbor’s ten-year-old could look up the latest stats of his favorite baseball player on his phone on the school bus, but I couldn’t get an update on how we were spending billions of taxpayer dollars while at my desk in the White House. And at the same time, the President of the United States had to fight tooth and nail to simply get a BlackBerry.

These were symptoms of a much larger problem.

The information technology gap between the public and private sectors makes the Federal Government less productive and less effective at providing basic services to its citizens. Closing this gap is the key to making government work better for the American people – the ultimate goal.

His complete thoughts are embedded below. If you’re interested in frank insight into why changing government through information technology isn’t easy, read on.

Vivek Kundra’s Reflections on Public Service 2011

GSA’s McClure: Cloud computing and open data in federal government aren’t going away

To those in media, government or the commentariat who think that cloud computing or open data might be going away in federal government after the departure of federal CIO Vivek Kundra next month, Dave McClure offered a simple message today: these trends are “inevitable.”

Cloud computing, for instance, will “survive if we change federal CIOs,” he said. “It’s here, and it’s not going away.” McClure describes cloud computing as a worldwide development in both business and government, where the economics and efficiencies created are “compelling.” The move to the cloud, for instance, is behind US plans to close or consolidate some 800 data centers, including hundreds by the end of 2011.

Cloud computing was just one of five macro trends that McClure listed as inevitable at this year’s FOSE Conference in Washington, D.C. FOSE is one of the biggest annual government IT conferences. Here’s the breakdown:

1) Cloud computing

The GSA is the “engine behind the administration’s ‘cloud-first’ strategy,” said McClure, lining up the procurement details for government to adopt it. He said that he’s seen “maturity” in this area in the past 18-24 months. Two years ago, the National Institute of Standards and Technology (NIST) was spending time at conferences and panels defining cloud computing. Now we have cloud deployments that are robust and scalable, said McClure, including infrastructure-as-a-service and email-as-a-service.

Government cloud deployments now include public-facing websites, storage and disaster recovery, and are beginning to move into financial apps.

2) Collaboration and engagement

The cloud is teaching us that once we free data, make it accessible and make it usable, it creates opportunities for effective collaboration with citizens, said McClure, noting that this trend is in its “early stages.”

3) Open data and big data

Data.gov has “treasure troves” of data that entrepreneurs and citizens are turning into hundreds of applications and innovations, said McClure. Inside of government, he said, access to data is creating a “thirst” for data mining and business intelligence that helps public servants work more efficiently.

4) Mobile

Mobile computing will be the next wave of innovation, said McClure, delivering value to ourselves and delivering value to citizens. Government is “entrenched in thinking about creation of data on websites or desktop PCs,” he said. That perspective is, in this context, dated. Most of the audience here has a smartphone, he pointed out, with most interactions occurring on the hip device. “That’s going to be the new platform,” a transition that’s “absolutely inevitable,” he said, “despite arguments about digital divide and broadband access.”

5) Security

As McClure noted, you have to include security at a government IT conference. The need for improved security on the Web, for critical infrastructure, on email and wherever else government has an exposed attack surface is clear to all observers.

White House Office of Management and Budget hosts forum on federal IT reform

Last December, the White House proposed sweeping IT reforms. Today in Washington, the nation’s top IT executives will discuss progress on those proposals and assess the challenges that lie ahead. The White House is livestreaming the event.

This morning, President Obama issued an executive order streamlining service delivery and improving customer service.

“Government managers must learn from what is working in the private sector and apply these best practices to deliver services better, faster, and at lower cost. Such best practices include increasingly popular lower-cost, self-service options accessed by the Internet or mobile phone and improved processes that deliver services faster and more responsively, reducing the overall need for customer inquiries and complaints. The Federal Government has a responsibility to streamline and make more efficient its service delivery to better serve the public.”

The White House Office of Management and Budget’s Jeff Zients, the national chief performance officer, will talk about productivity and efficiency enabled by information technology. As the Washington Post reported, Zients will unveil a new customer service initiative this afternoon as well.

According to Politico’s Morning Tech, federal “CIO Vivek Kundra will offer an update on the administration’s IT plans; Deputy Energy Secy Daniel Poneman will focus his remarks on cloud use and adoption; Deputy USDA Secy Kathleen Merrigan will chat about data centers; VA’s Roger Baker will discuss the administration’s plans to eliminate or revise underperforming IT projects and DHS CIO Richard Spires will outline efforts to set up IT best practices.”

More to come as the event goes forward.

Editor’s Note: The White House livestream went down about 18 minutes in and then went on and off. Unfortunately, as this correspondent did not attend in person, other firsthand accounts will have to capture much of what occurred.

Todd Park on unleashing the power of open data to improve health

What if open health data were harnessed to spur better healthcare decisions and catalyze the extension or creation of new businesses? That potential future exists now, in the present. Todd Park, chief technology officer of the Department of Health and Human Services, has been working to unlock innovation through open health data for over a year now. On many levels, the effort is the best story in federal open data. Park tells it himself in the video below, recorded yesterday at the Mutter Museum in Philadelphia.

Over at e-patients.net, Pew Internet researcher Susannah Fox asked how community organizations can tap into the health data and development trend that Park has been working hard to ignite. She shared several resources (including a few from this correspondent) and highlighted the teams who competed in a health developer challenge tour that culminated at the recent Health 2.0 conference.

Check out this article about HealthData.gov, including footage of Park talking about the “health data eco-system” at the code-a-thon. (The video also features local health hacker Alan Viars, sitting at the right.)

Here are 3 blog posts about last year’s event, including mine:

Making Health Data Sing (Even If It’s A Familiar Song)

Community Health Data Initiative: vast amounts of health data, freed for innovators to mash up!

Making community health information as useful as weather data: Open health data from Health and Human Services is driving more than 20 new apps.

The next big event in this space is on June 9 at the NIH. If you’re interested in what’s next for open health data, track it closely.

The US CIO goes to the whiteboard to describe good government

Earlier this week, United States CIO Vivek Kundra turned to the White House whiteboard to talk about sunshine, savings and service. If you’re unfamiliar with Kundra, he’s the man who proposed, and is now entrusted with implementing, sweeping federal IT reform. One of the tools he’s been applying to the task is the so-called IT dashboard, which helps the White House Office of Management and Budget, where he serves, track IT spending. He claims to have reduced federal IT spending by some $3 billion over the past two years through increased tracking and scrutiny. The federal CIO explains more about the results of that work in the whiteboard video.

UPDATE: As open data consultant Dan Morgan pointed out, however, the Government Accountability Office reported that while OMB has made improvements to its dashboard, “further work is needed by agencies and OMB to ensure data accuracy.”

…inaccuracies can be attributed to weaknesses in how agencies report data to the Dashboard, such as providing erroneous data submissions, as well as limitations in how OMB calculates the ratings. Until the selected agencies and OMB resolve these issues, ratings will continue to often be inaccurate and may not reflect current program performance. GAO is recommending that selected agencies take steps to improve the accuracy and reliability of Dashboard information and OMB improve how it rates investments relative to current performance and schedule variance. Agencies generally concurred with the recommendations; OMB did not concur with the first recommendation but concurred with the second. GAO maintains that until OMB implements both, performance may continue to be inaccurately represented on the Dashboard.

One question left unanswered: Is /good the new /open? Decide for yourself at the new “Good Government” section at WhiteHouse.gov.

Open government scrutinized before the House Oversight Committee

This morning, the Oversight Committee in the United States House of Representatives held a hearing on the Obama administration’s open government efforts. The “Transparency Through Technology: Evaluating Federal Open-Government Initiatives” hearing was streamed live online at oversight.house.gov.

House Oversight Chairman Darrell Issa (R-CA) asked his Twitter followers a simple question before the hearing: “Have you tried to get facts on how gov’t spends your $ on USASpending.gov?” He received no answers.

The oversight committee did, however, hear extensive testimony from government IT executives and open government watchdogs. As Representative Issa probes how agencies balance their books, such insight will be crucial, particularly with respect to improving accountability mechanisms and data. Poor data has been a recurring theme in these assessments over the years. Whether the federal government can effectively and pervasively apply open data principles appears to be an open question.

The first half of the hearing featured testimony from Dr. Danny Harris, chief information officer for the Department of Education, Chris Smith, chief information officer for the Department of Agriculture, Jerry Brito, senior research fellow at the Mercatus Center at George Mason University and Ellen Miller, co-founder and executive director of the Sunlight Foundation.

Alice Lipowicz of Federal Computer Week tweeted out a few data points from the hearing.

  • A Sunlight Foundation audit found that the USDA spent $12.7B on school lunches but only reported $250,000 on USASpending.gov
  • According to Brito, “half of 3000 datasets on Data.gov are on EPA toxic releases, with only 200 to 300 datasets on fed gov activity.” Lipowicz also tweeted that Brito testified that federal agencies need outside auditors and “ought to report ‘earnings’ similar to private sector.”
  • USDA CIO Chris Smith said that the agency did not report school lunch payments below $25,000 to USASpending.gov, but will begin reporting them in FY2012

In her testimony before the House committee on Clearspending, Miller reiterated the position of the Sunlight Foundation that the efforts of the administration to make government spending data open, accurate and available have been insufficient, particularly when the data is wrong.

The Sunlight Foundation has been excited about the new promises of data transparency, but sometimes the results are nowhere near the accuracy and completeness necessary for the data to be useful for the public.

Sunlight’s Clearspending analysis found that nearly $1.3 trillion of federal spending as reported on USASpending.gov was inaccurate. While there have been some improvements, little to no progress has been made to address the fundamental flaws in the data quality. Correcting the very complicated system of federal reporting for government spending is an enormous task. It has to be done because without it there is no hope for accountability.

Miller made several recommendations to the committee to improve the situation, including:

  • unique identifiers for government contracts and grants
  • publicly available hierarchical identifiers for recipients to follow interconnected entities
  • timely bulk access to all data.
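Miller’s second recommendation is the most technical of the three. A toy sketch of how hierarchical recipient identifiers would let analysts roll subsidiary awards up to a parent entity (all identifiers and dollar amounts here are invented for illustration):

```python
# Each award names a recipient, and each recipient may point at a
# parent entity, so spending can be rolled up the ownership tree.

PARENTS = {            # child id -> parent id (None marks a root entity)
    "acme-west": "acme",
    "acme-east": "acme",
    "beta-labs": None,
    "acme": None,
}

AWARDS = [             # (recipient id, dollars)
    ("acme-west", 120_000),
    ("acme-east", 80_000),
    ("beta-labs", 50_000),
]

def top_level(recipient):
    """Follow parent links until we reach a root entity."""
    while PARENTS.get(recipient):
        recipient = PARENTS[recipient]
    return recipient

def rollup(awards):
    """Total spending per top-level recipient."""
    totals = {}
    for recipient, amount in awards:
        root = top_level(recipient)
        totals[root] = totals.get(root, 0) + amount
    return totals

print(rollup(AWARDS))   # {'acme': 200000, 'beta-labs': 50000}
```

Without public parent-child links of this kind, the two “acme” subsidiaries would look like unrelated recipients and the parent’s true share of federal spending would be invisible.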

Her remarks ultimately reflect the assessment that she made at last year’s Gov 2.0 Summit, where she made it clear that open government remains in beta. Our interview is below:

Tracking the progress of the Open Government Directive requires better data, more auditors and improved performance metrics. That said, this looks like the year when many of the projects at agencies will move forward towards implementation.

Last month, the U.S. moved forward into the pilot phase of an open source model for health data systems as the fruits of the Direct Project came to Minnesota and Rhode Island. The Direct Project allows for the secure transmission of health care data over a network. Some observers have dubbed it the Health Internet, and the technology has the potential to save government hundreds of millions of dollars, along with supporting the growth of new electronic health records systems.

Open source and open government have also come together to create OpenStack, an open cloud computing platform that’s a collaboration between NASA, Rackspace, Cisco and a growing group of partners.

It’s too early to judge the overall open government effort as ultimately a success or failure. That said, the administration clearly needs to do more. In 2011, the open question is whether “We the people” will use these new participatory platforms to help government work better.

Video of the hearing will be posted here when available. Testimony from today’s hearing is linked to PDFs below.

Dr. Danny Harris

Chris Smith

Jerry Brito

Ellen Miller

The Honorable Danny Werfel

Note: Video of the hearing was provided through the efforts of citizen archivist Carl Malamud at house.resource.org, the open government video website that he set up in collaboration with Speaker Boehner and Congressman Issa. While the open government efforts of the federal government have a long way to go, in this particular regard, a public-private collaboration is making the proceedings of the House Oversight committee available to the world online.

White House dCTO Chris Vein on innovation and open government

Chris Vein is the newly minted deputy United States chief technology officer for government innovation in the White House Office of Science and Technology Policy. As reported by Fast Company last month, Vein headed to the White House from San Francisco, where he served as the city’s chief information technology officer. Vein takes on the portfolio from now departed deputy CTO for open government Beth Noveck, along with the @OpenGov Twitter account. Vein was a supporter of both open source and open data while at San Francisco; it’s reasonable to expect that will continue in his new role.

As Nick Judd reported at techPresident, Vein talked to open government advocates at the recent Transportation Camp in New York City. You can watch his talk below.

Transportation Camp East: Chris Vein from Streetfilms on Vimeo.

Civic coders for America gather in DC for a Presidents’ Day datacamp

This past weekend, civic developers gathered at a Seattle data camp to code for America. This Presidents’ Day, the day before George Washington’s Birthday, dozens of government technologists, data nerds, civic hackers and citizens from around the District of Columbia, Virginia and Maryland will join Code for America fellows for a datacamp at Big Window Labs.

The attendees of the Washington datacamp can look to the Seattle Data Camp for inspiration. The civic hacktivism on display there led to engaged discussions about Seattle’s South Park neighborhood, mobile damage assessment apps, transit apps, mobile/geolocation apps, data mining and information visualization.

Perhaps even more impressive, one of those discussions led to the creation of a new smartphone application. Hear Near uses text messages to push alerts about nearby Seattle events to iPhone or Android users. Hear Near is now available for both platforms.

Joe McCarthy published a terrific post about Data Camp Seattle that offers a great deal of insight into why the event worked well. McCarthy helped the HearNear team by identifying and defining mappings between the GeoLoqi API and the iCal feed.
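The mapping work McCarthy describes can be sketched in a few lines. This toy example, with an invented event and an invented alert schema, parses a minimal iCal VEVENT and reshapes it into the kind of lat/lon payload a location-alert API expects; a real Geoloqi integration would follow that API’s documented schema.

```python
# Invented sample feed; GEO uses RFC 5545's "latitude;longitude" form.
SAMPLE_ICAL = """BEGIN:VEVENT
SUMMARY:Jazz in the Park
DTSTART:20110305T190000
GEO:47.6097;-122.3331
END:VEVENT"""

def parse_events(ical_text):
    """Extract SUMMARY/DTSTART/GEO fields from each VEVENT block."""
    events, current = [], None
    for line in ical_text.splitlines():
        if line == "BEGIN:VEVENT":
            current = {}
        elif line == "END:VEVENT":
            events.append(current)
            current = None
        elif current is not None and ":" in line:
            key, value = line.split(":", 1)
            current[key] = value
    return events

def to_alert(event):
    """Map one iCal event onto a hypothetical location-alert payload."""
    lat, lon = event["GEO"].split(";")
    return {"text": event["SUMMARY"],
            "starts": event["DTSTART"],
            "latitude": float(lat),
            "longitude": float(lon)}

alerts = [to_alert(e) for e in parse_events(SAMPLE_ICAL)]
print(alerts[0]["text"], alerts[0]["latitude"])
```

Defining the field mapping is usually the whole job in this kind of weekend integration: once events carry coordinates, the alerting service does the rest.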

McCarthy describes how a creative discussion amongst talented, civic-minded people enabled them to donate their skills to putting the open data from Seattle’s data repository to work for its citizens. He also explored what inspires him about Code for America:

I wasn’t sure what to expect going into the event, but was greatly impressed with the interactions, overall experience and outcomes at Data Camp Seattle. I’ve admired the Code for America project since first learning about it, and have been a proponent of open data and platform thinking (and doing) on my blog. It was inspiring and empowering to have an opportunity to do more than simply blog about these topics … though I recognize the potential irony of writing that statement in a new blog post about these topics.

I suspect that one of the most durable outcomes of the Code for America project will be this kind of projection or radiation of civic empowerment through – and beyond – the efforts of the CfA fellows and their collaboration partners. In The Wealth of Networks, Yochai Benkler writes about how “[t]he practice of producing culture makes us all more sophisticated readers, viewers, and listeners, as well as more engaged makers”. In Program or Be Programmed, Doug Rushkoff warns against “relinquishing our nascent collective agency” to computers and the people who program them by engaging in “a renaissance of human capacity” by becoming programmers ourselves.

While many – or even most – of the specific applications we designed and developed during the Data Camp Seattle civic hackathon may not gain widespread traction and use, if the experience helps more of us shift our thinking – and doing – toward becoming co-creators of civic applications – and civic engagement – then the Code for America project will have succeeded in achieving some grand goals indeed.

This example of directed action at an unconference has fast become the next step in the evolution of camps, where a diverse set of volunteers come together to donate more than money or blood: they exchange information and then apply their skills to creating solutions to the needs defined by a given set of societal challenges.

This model of directed civic involvement has become a global phenomenon in the wake of the crisiscamps that sprang up after the earthquake in Haiti last year. The cultural DNA of these camps has evolved into CrisisCommons, which has acted as a platform for volunteers to donate their skills to help in natural disasters and other crises.

As the role of the Internet as a platform for collective action grows, those volunteers are gaining more ability to make a difference using powerful lightweight collaboration technology and open source data tools.

From the towns of the United States to cities in Denmark, Brazil, Kenya, Illinois and India, people interested in local Gov 2.0 have been gathering to create applications that use open public data. In December, the International Open Data Hackathon convened participants in over 56 cities in 26 countries on five continents around the world.

As Seattle CIO Bill Schrier put it this past weekend, they’re turning data into information. Federal CTO Aneesh Chopra has praised these kinds of efforts as “hacking for humanity.” An event like Random Hacks of Kindness “brings together the sustainable development, disaster risk management, and software developer communities to solve real-world problems with technology.”

On Presidents’ Day, another datacamp will try to put that vision into action.
