Civic coders for America gather in DC for a Presidents’ Day datacamp

This past weekend, civic developers gathered at a Seattle data camp to code for America. This Presidents’ Day, the day before George Washington’s Birthday, dozens of government technologists, data nerds, civic hackers and citizens from around the District of Columbia, Virginia and Maryland will join Code for America fellows for a datacamp at Big Window Labs.

The attendees of the Washington datacamp can look to the Seattle Data Camp for inspiration. The civic hacktivism on display there led to engaged discussions about Seattle’s South Park neighborhood, mobile damage assessment apps, transit apps, mobile/geolocation apps, data mining and information visualization.

Perhaps even more impressive, one of those discussions led to the creation of a new smartphone application. Hear Near pushes text message alerts about nearby Seattle events to iPhone and Android users. Hear Near is now available for both platforms.

Joe McCarthy published a terrific post about Data Camp Seattle that offers a great deal of insight into why the event worked well. McCarthy helped the Hear Near team by identifying and defining mappings between the Geoloqi API and the iCal feed.
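That mapping work can be sketched in miniature. The snippet below parses a single iCal VEVENT and reshapes it into a proximity-trigger record; the trigger field names are illustrative assumptions, not Geoloqi’s actual schema.

```python
# Rough sketch of the kind of mapping the Hear Near team needed:
# turn a VEVENT from an iCal feed into a place-based trigger record.
# The trigger fields here are illustrative, not Geoloqi's actual schema.

def parse_vevent(text):
    """Parse the KEY:VALUE lines of a single iCal VEVENT into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            fields[key.strip()] = value.strip()
    return fields

def vevent_to_trigger(fields, radius_m=500):
    """Map iCal fields onto a hypothetical proximity-trigger payload."""
    lat, lon = (float(x) for x in fields["GEO"].split(";"))
    return {
        "text": fields.get("SUMMARY", ""),
        "latitude": lat,
        "longitude": lon,
        "radius": radius_m,          # alert anyone within this many meters
        "starts": fields.get("DTSTART", ""),
    }

sample = """BEGIN:VEVENT
SUMMARY:South Park neighborhood cleanup
DTSTART:20110222T100000
GEO:47.5290;-122.3140
END:VEVENT"""

trigger = vevent_to_trigger(parse_vevent(sample))
print(trigger["text"], trigger["latitude"])
```

The interesting part of the real task was deciding which calendar fields carried enough location signal to become a geofence at all; everything else is plumbing like the above.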

McCarthy describes how a creative discussion amongst talented, civic-minded people enabled them to donate their skills to putting the open data from Seattle’s data repository to work for its citizens. He also explored what inspires him about Code for America:

I wasn’t sure what to expect going into the event, but was greatly impressed with the interactions, overall experience and outcomes at Data Camp Seattle. I’ve admired the Code for America project since first learning about it, and have been a proponent of open data and platform thinking (and doing) on my blog. It was inspiring and empowering to have an opportunity to do more than simply blog about these topics … though I recognize the potential irony of writing that statement in a new blog post about these topics.

I suspect that one of the most durable outcomes of the Code for America project will be this kind of projection or radiation of civic empowerment through – and beyond – the efforts of the CfA fellows and their collaboration partners. In The Wealth of Networks, Yochai Benkler writes about how “[t]he practice of producing culture makes us all more sophisticated readers, viewers, and listeners, as well as more engaged makers”. In Program or Be Programmed, Doug Rushkoff warns against “relinquishing our nascent collective agency” to computers and the people who program them by engaging in “a renaissance of human capacity” by becoming programmers ourselves.

While many – or even most – of the specific applications we designed and developed during the Data Camp Seattle civic hackathon may not gain widespread traction and use, if the experience helps more of us shift our thinking – and doing – toward becoming co-creators of civic applications – and civic engagement – then the Code for America project will have succeeded in achieving some grand goals indeed.

This example of directed action at an unconference has fast become the next step in the evolution of camps, where a diverse set of volunteers come together to donate more than money or blood: they exchange information and then apply their skills to creating solutions to the needs defined by a given set of societal challenges.

This model of directed civic involvement has become a global phenomenon in the wake of the crisiscamps that sprang up after the earthquake in Haiti last year. The cultural DNA of those camps has evolved into CrisisCommons, which has acted as a platform for volunteers to donate their skills to help in natural disasters and other crises.

As the role of the Internet as a platform for collective action grows, those volunteers are gaining more ability to make a difference using powerful, lightweight collaboration technology and open source data tools.

From the towns of the United States to cities in Denmark, Brazil, Kenya and India, people interested in local Gov 2.0 have been gathering to create applications that use open public data. In December, the International Open Data Hackathon convened participants in more than 56 cities in 26 countries on five continents.

As Seattle CIO Bill Schrier put it this past weekend, they’re turning data into information. Federal CTO Aneesh Chopra has praised these kinds of efforts as “hacking for humanity.” An event like Random Hacks of Kindness “brings together the sustainable development, disaster risk management, and software developer communities to solve real-world problems with technology.”

On Presidents’ Day, another datacamp will try to put that vision into action.


Civic developers gather to code for America at data camps

Today in Seattle, over 50 civic developers have gathered at Socrata to code applications using the city’s open data repository at data.seattle.gov. Today’s Seattle datacamp, organized by Code for America, is just one of several data camps that the new civic service organization is convening in host cities around the United States. Chacha Sikes, a 2011 Code for America fellow, explains what’s behind these data camps:

City governments have a lot of information which is useful to all of us. This ranges from maps of local parks to building footprints to real-time 911 calls. We all have an interest in our budget information, legislative documents and other resources that we use in collective decision-making and deliberation. Not all of this information is currently available for all cities, even though much of it is public record. The “Open Data” movement is a way to work on getting information into machine-readable formats, allowing for easy publishing, sharing, and reuse.

We’re hosting DataCamps in CfA’s cities this year to build communities around making city data more open and accessible to allow citizens to help cities work better.

DataCamp is an event focused on skill-building and collaborative work on city data. It is an opportunity for interested parties in a city to work together and build a network of people with shared interests in improving civic communications and information management.
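The “machine-readable formats” Sikes describes are what make reuse cheap. Seattle’s repository runs on Socrata, which serves datasets as JSON rows; the sketch below works from an invented sample of such rows rather than a live fetch.

```python
import json

# Socrata-style repositories such as data.seattle.gov expose each dataset
# as JSON rows. The records below are invented for illustration; a real
# client would fetch them over HTTP from the dataset's resource URL.
raw = """[
  {"type": "Aid Response", "district": "South Park"},
  {"type": "Auto Fire Alarm", "district": "Capitol Hill"},
  {"type": "Aid Response", "district": "South Park"}
]"""

rows = json.loads(raw)

# Once the data is machine-readable, "easy publishing, sharing, and
# reuse" is a few lines of code: count incidents per neighborhood.
counts = {}
for row in rows:
    counts[row["district"]] = counts.get(row["district"], 0) + 1

print(counts)
```

The same few lines work whether the feed holds 911 calls, park maps or building footprints, which is exactly the point of standardizing on machine-readable publishing.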

Sanjay B. Hyatt, a writer at the Seattle Times, is at the data camp in the Pacific Northwest. He reported that city CTO Bill Schrier said Seattle has 100 data sets, and challenged attendees to “turn that data into information.”

DataCamp Seattle is using a Drupal site and a DataCamp Seattle Google group to coordinate and share notes. In the tradition of unconferences and barcamps, they’re also using a more analog method to sort out ideas and projects: sticky notes. Virtual observers can see the various projects going up, including calendars, an impact survey, an apps workshop and a “pimp my blog” to help stimulate the creation of hyperlocal blogs.

More data camps are coming soon to Seattle and Washington, D.C.

UPDATE: A day after the datacamp, a new app is available to Seattle residents. Hear Near pushes text message alerts about nearby Seattle events to mobile phone users.

Hear Near is available for iPhone and Android.
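The heart of a nearby-events app like Hear Near is a proximity test: is this event close enough to this phone to justify an alert? Here is a minimal sketch of that check using the standard haversine formula; the coordinates and threshold are illustrative.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def should_alert(user, event, threshold_km=2.0):
    """Alert only when the event is within walking distance of the user."""
    return haversine_km(*user, *event) <= threshold_km

# A user near Pioneer Square and an event at Seattle Center,
# roughly 2.4 km apart.
user = (47.6015, -122.3343)
event = (47.6205, -122.3493)
print(should_alert(user, event), should_alert(user, event, threshold_km=5.0))
```

A production service would evaluate this check against a geofence index rather than looping over every event, but the decision being made is the same.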

Hear Near was created by a team that included Amber Case (whose geolocation startup, Geoloqi, powers it), Aaron Parecki, Joe McCarthy, Jesse Kocher, Gene Homicki, Naoya Makino, Steve Ripley, Rebecca Gutterman and Jenny Frank.

Frank, a self-identified “non-techie” who attended the camp, came away with the feeling that “nothing is impossible.”

Panoramic image credits to Chris Metcalf.

Deb Bryant on open source at the Tech@State unconference

Deb Bryant, public sector communities manager at the Oregon State University Open Source Lab, kicked off the Tech@State unconference on open source at the National Democratic Institute today.

The short video below, capturing some of her thoughts on the evolution of open source in government, is worth considering, particularly with respect to its use internationally. As Bryant pointed out, for instance, Brazil has been doing open source for a decade. “They’re really the Simon Bolivar of software down there.”

The schedule for the Tech@State unconference is evolving at Open4m.org/NDI. It bids to be an interesting day.

3 words from the Tech@State unconference

On Friday, the fifth Tech@State conference at the United States Department of State focused on the role of open source in government, industry and society. Today, there is a Tech@State unconference on open source at the National Democratic Institute. In keeping with the tradition of an unconference – sometimes called a barcamp – the day kicked off with an introduction in which every attendee shared his or her name, affiliation and three words that describe who they are, what they care about, what they’ve come to learn or what they do. Or all three. Below is a Wordle that shows the frequency of the words used.

The schedule for the Tech@State unconference is evolving at Open4m.org/NDI. It bids to be an interesting day.

Why does government social media use matter to citizens?

An important role of technology journalists in the 21st century is to explain how broader trends that are changing technology, government and civic society relate to average citizens. Some have called the broader trend towards smarter, more agile government that leverages technology “Gov 2.0.” (Readers of this blog are no doubt familiar with the term.) When you dig into the topic, you can get stuck in a lot of buzzwords and jargon quickly. Most people don’t care about how a satellite gets into orbit, the release of community health data or the standards of an API for product recalls. They care quite a bit, however, about whether their GPS receiver enables them to get to a job interview, if a search engine can show them ER waiting room times and quality statistics, or if a cradle for their baby is safe. Those wonky policies can lead to better outcomes for citizens.

If you follow Mashable, you might have read about the ways that social media promotes good health or how government works better with social media.

The following stories have little to do with technology buzzwords and everything to do with impact. Here are four stories about government 2.0 that matter to citizens, with issues that literally come home to everyone.

1) The Consumer Product Safety Commission has launched a public complaints database at SaferProducts.gov. You could think of it as a Yelp for government, or simply as a place where consumers can go to see what is safe. Add that to the mobile recalls application that people can already use to see whether a product has been recalled.

2) The new Consumer Financial Protection Bureau will use technology to listen to citizens online to detect fraud. If you haven’t heard, DC has a new startup agency. That hasn’t happened in a long time. You could think of it as Mint.gov mashed up with HealthCare.gov. The CFPB plans to use technology in a number of unprecedented ways for fraud detection, including crowdsourcing consumer complaints and trend analysis. Given how much financial fraud has affected citizens in recent years, and how much of the public’s anger over the bank bailouts remains, whether this agency leverages technology well will matter to many citizens.

3) Social data and geospatial mapping join the crisis response toolset. Historic floods in Australia caused serious damage and deaths. Government workers used next-generation technology that pulled in social media in Australia and mapped the instances using geospatial tools so that first responders could help citizens faster, more efficiently and more effectively. It’s an excellent example of how an enterprise software provider (ESRI) partnered with an open source platform (Ushahidi) to help government workers use social media to help people.

4) New geolocation app connects first responders to heart attack victims. The average citizen will never need to know what Web 2.0 or Gov 2.0 means. Tens of thousands of people, however, will have heart attacks every year. With a new geolocation mobile application that connects citizen first responders to heart attack victims, connected citizens trained in CPR now have a new tool to help them save lives.

Better access to information about food safety, product recalls and financial fraud will help citizens around the country. Improvements to the ability of government workers to direct help in a disastrous flood, or for citizens to receive immediate help from a trained first responder in an emergency, are important developments. As 2011 takes shape, the need for government to use social media well has become more important than ever. That’s why the perspective of government officials like FEMA administrator Craig Fugate matters.

“We work for the people, so why can’t they be part of the solution?” said Fugate, speaking to delegates from the distributed chapters of CrisisCommons assembled at FEMA headquarters. “The public is a resource, not a liability.”

For example, Fugate said that FEMA used reporters’ tweets during Hurricane Ike for situational awareness. “We’ve seen mashups providing better info than the government.” Listening and acting upon those digital cries for help on social media during a crisis could literally be a matter of life and death.

Whether government can adapt to a disrupted media landscape and the new realities of information consumption is of substantial interest to many observers, both inside and outside of government. Whether government can be smarter, more agile and more effective is of great interest to all.

How GIS technology and social media helped crisis response in Australia

As a new article at the O’Reilly Radar showed today, social data and geospatial mapping have joined the crisis response toolset. A new online application from geospatial mapping giant ESRI applies trend analysis to help responders to Australia’s recent floods create relevance and context from social media reporting. The Australian flood trends map shows how crowdsourced social intelligence provided by Ushahidi enables emergency social data to be integrated into crisis response in a meaningful way.
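Under the hood, a trends map like this is an aggregation problem: geotagged reports are snapped into grid cells, and the per-cell counts drive the heat layer. The toy sketch below uses invented report coordinates to illustrate the idea.

```python
from collections import Counter

# Each crowdsourced report carries a lat/lon; the invented samples below
# stand in for Ushahidi-style flood reports from Queensland.
reports = [
    (-27.47, 153.02), (-27.48, 153.03), (-27.47, 153.01),  # Brisbane cluster
    (-28.53, 151.95),                                       # outlying report
]

def grid_cell(lat, lon, size=0.1):
    """Snap a coordinate to the lower-left corner of a size-degree cell."""
    return (round(lat // size * size, 4), round(lon // size * size, 4))

# The per-cell counts are what a trends map renders as a heat layer.
density = Counter(grid_cell(lat, lon) for lat, lon in reports)
hotspot, count = density.most_common(1)[0]
print(hotspot, count)
```

Real systems add a time dimension (is this cell heating up or cooling down?) and verification weighting, but the core of “creating relevance and context” from raw reports is this kind of spatial rollup.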

The combination of Ushahidi and ESRI in Australia shows that “formal and innovative approaches to information collection and analysis during disasters is possible,” said Patrick Meier, “and that there is an interface that can be crafted between official and non-official responses.” Meier, a research fellow at the Harvard Humanitarian Initiative and director of crisis mapping at Ushahidi, was reached via email.

Russ Johnson, ESRI’s global director for emergency response, recently spoke with this correspondent at the ESRI federal user conference in Washington, D.C. Johnson spent 32 years as a federal employee in southern California, predominantly working in the U.S. Forest Service. He was one of the pioneers who built up the FEMA incident response system, and he commanded one of the 18 teams around the nation that deploy assets in the wake of floods, fires and other disasters. At ESRI, Johnson helps the company understand the workflow and relevance of GIS for first-response operations. Our full interview is embedded in the video below.

The world of crisis response has changed dramatically in the past several years, said Johnson. The beauty of the present historic moment, he said, is that “everybody can be a sensor. Everybody is potentially part of the network. The struggle that operators have is taking all of that free-form data and trying to put it into some sort of framework that makes it accurate.”

Emergency and crisis responders are faced with significant cultural barriers that have nothing to do with logging on to a website or configuring a new account, explained Johnson. “Public safety organizations are really, really resistant to change,” he said. “Technology frightened a lot of people even before social media emerged as a new data source. It’s a new challenge that’s threatening to a lot of people. The question I pose is simple. Let’s use the first responder scenario, where you have 4-6 minutes from the time you get the call. The expectation is you’ll be on scene. Think about the possibility that before you arrive, thousands of people will have video on YouTube. They may have more situational awareness. When you arrive, you’ll be videoed, watched and critiqued. Shouldn’t you consider that data if it can help you deploy more safely or effectively?”

Johnson said that he really likes FEMA administrator Fugate’s philosophy and operational mentality in that context. Fugate has emphasized that he believes the public can be a resource in crises, instead of a hindrance. The current FEMA chief is tapping social media’s potential for aiding disaster response. “There are times when agencies can’t get good intelligence,” said Johnson. “I cannot tell you how many times we had televisions and the best information we were getting was from CNN or helicopters. There are times when it may be wrong, but I’d rather have it be part of our mashup of data to help validate and inform responders.”

The technology itself has also evolved recently, said Johnson. “We used to have to have a specific person to support the mission, which meant we had to drag a person trained in GIS everywhere. As the technology has evolved, and data has evolved, the tools have reached the operator and first responder level. We can now match persona, mission and task to GIS tech so that it fits them. You can get complex answers generated by an operator, not a GIS geek.”

How did Haiti change the conversation?

“Everyone thought Haiti would be completely dark,” said Johnson, with all information provided by boots on the ground. In fact, social media played an important role, he said, highlighted by the efforts of CrisisCommons and others who heard those digital cries for help. Social media “brought the light on,” said Johnson, providing not just something to act on but perhaps the only thing to act on, at least initially. In subsequent crises, responders have found that crisis data, particularly when added to maps for context, can provide valuable insight long before official reports emerge.

This trend is a key issue for communities as more citizen engagement platforms emerge. “When you have a large emergency, who are the first responders? Who can get to you the most quickly? Your neighbors,” says Johnson. “If you can have a universal way to communicate with the people who can help you, that may be the only help you have. Conventionally, you think of the guys in uniforms and helmets.”

In 2011, citizens have the opportunity to shoulder more of that shared responsibility than ever.

New iPhone app connects trained citizens to others in cardiac distress

In 2011, there needs to be a better way to empower citizens trained in CPR to receive alerts about nearby cardiac arrest victims, with geolocated maps showing the location of electronic defibrillators to help them.

Now, there’s an app for that too: firedepartment.mobi. The new geolocation app connects citizen first responders to heart attack victims in San Ramon. FireDepartment can be downloaded directly from iTunes.

Today the San Ramon Valley Fire Protection District (SRVFPD) in California launched an iPhone app that will dispatch trained citizens to help others in cardiac emergencies. This new application is the latest evolution of the role of citizens as sensors, where resources and information are connected to those who need them most in the moment. The FireDepartment app is also an important example of Gov 2.0, where a forward-looking organization created a platform for citizens to help one another in crises and planned to make the underlying code available for civic developers to improve on. Given context and information, trained citizens in San Ramon will now be able to do more than alert authorities and share information: they can act to save lives. Here’s a demo of the app:

iPhone Demo from Fire Department on Vimeo.

Adriel Hampton called FireDepartment the perfect blend of technology, government and volunteers. Can an everyday citizen become a hero? As Joe Hackman observed, “Today it just got a lot easier.”


Malamud: add bulk open government data access to Thomas.gov

An image of (insert name here), taken at about 2:30 this afternoon. (Photo by Abby Brack/Library of Congress)


Open government advocate Carl Malamud made a succinct recommendation for improving the United States House of Representatives on January 24th: “Open it up. Bulk access, developer day, an API, long-term open source model. People’s house.” Malamud linked to a letter at House.Resource.org to Representative Eric Cantor (R-VA), the House Majority Leader, in which he made the case for making bulk access to bills and corollary data available to the public online through Thomas.gov:

Access to bulk data, both for the core Thomas system and for corollary databases, would have a huge and immediate effect. Hosting a developer day and making sure stakeholders are part of the long-term development will help keep the next-generation system in tune with the needs of the Congress and of the public.

As Malamud pointed out, long-term plans to improve public access to the law are evolving, including the announcement that Cornell’s Legal Information Institute would redesign Thomas.gov’s legislative metadata models:

It’s finally official: The Library of Congress has selected us to work on a redesign of their legislative-metadata models. This sounds like really geeky stuff (and it is), but the effects for government and for citizens should be pretty big. What’s really being talked about here is (we hope) a great improvement not only in what can be retrieved from systems like THOMAS and LIS (the less-well-known internal system used by Congress itself), but also in what can be linked to and referenced. We’ll begin with a careful compilation of use cases, build functional requirements for what the data models should do, and go from there to think about prototype systems and datasets. The idea is to bring Semantic Web technology to bills, public laws, the US Code, Presidential documents, and a variety of other collections. Longtime LII friends and collaborators Diane Hillmann, John Joergensen and Rob Richards will be working with our regular team to create the new models and systems.
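The “Semantic Web technology” the LII team mentions boils down to expressing each fact about a bill as a subject-predicate-object triple, so that bills, public laws and code sections can be linked and referenced from anywhere. The sketch below uses an invented predicate vocabulary (the `ex:` names) for illustration; a real model would adopt a published one.

```python
# A Semantic Web data model reduces each fact about a bill to a
# (subject, predicate, object) triple. The ex: predicate names below are
# invented for illustration; dc:title comes from Dublin Core.
bill = {
    "id": "hr1-112",
    "title": "Full-Year Continuing Appropriations Act, 2011",
    "congress": "112",
    "cites": ["usc/2/622"],
}

def to_triples(bill):
    subject = "bill:" + bill["id"]
    triples = [
        (subject, "dc:title", bill["title"]),
        (subject, "ex:congress", bill["congress"]),
    ]
    # Links are the point: each citation becomes a reference that other
    # systems (THOMAS, LIS, court opinions) can resolve and follow.
    for target in bill["cites"]:
        triples.append((subject, "ex:cites", "law:" + target))
    return triples

for t in to_triples(bill):
    print(t)
```

Once legislative data is in this shape, “what can be linked to and referenced” stops being a parsing problem and becomes a simple graph query.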

Will the new GOP leadership take Malamud up on his proposal for an open developer day and bulk data? Stay tuned. As Nancy Scola wrote in techPresident, “Republicans in the House are making technology-enabled openness, transparency, and participation central to the public presentation of their core political values in a way that their Democratic counterparts never fully did.” Malamud has a track record that lends considerable credibility to his prospects: he helped to get the SEC online in 1993. More recently, “Washington’s IT guy” worked with the House leadership to start publishing hundreds of high-resolution videos from House Oversight Committee hearings at House.Resource.org earlier this month.

If the new GOP leadership is serious about adopting the infrastructure to enable transparency and accountability in the House, perhaps adoption of open government data standards will be one of the enduring accomplishments of this 112th Congress.


Exploring Gov 2.0 in Madison, Wisconsin

Erik Paulson published an excellent new series on Gov 2.0 in Madison, Wisconsin today:

The citizens of Madison are a fairly tech-savvy bunch, but when it comes to technology in the civic space, we’re not as far out in the lead as we should be. I’d like us to change that, and join the list of cities developing applications as part of a Gov 2.0 movement. This is a brief introduction, and what follows below is a three-part set of posts.

Part I focuses on some of what Gov 2.0 is, and uses Madison Metro as an example. Part II looks at how Madison is doing with Gov 2.0, and what we can be doing better. Part III looks at some specific Gov 2.0 systems that we could be building.

All three articles are excellent, and include several kind nods towards this blog and to Code for America and Civic Commons, two of the civic innovation organizations to watch in 2011. You’ll find thoughts on citizens as sensors, urban data, civic development, government as a platform, a “neighborhood API,” improving libraries, adding fiber, legislation tracking and more. Highly recommended.

Paulson also suggests excellent further reading in The Economist’s Special Report on Smart Cities and Time Magazine’s article “Want to Improve Your City? There’s an App for That” for more background on Gov 2.0 in cities.

Tim O’Reilly talks to Code for America about the power of platforms

Today, Tim O’Reilly spoke about the power of platforms to the inaugural class of Code for America fellows.

What’s happening today is an “open data” movement, said O’Reilly. “That’s what’s going to build the next platform.” As he’s said before, he thinks we’re now at an interesting platform stage where “the Internet is the operating system.” For early adopters of the new Google Chrome netbooks, a material metaphor for that notion is now online.

You can listen to the audio of Tim O’Reilly (my publisher) or download the MP3. Video may be available later. Editor’s Note: O’Reilly Media is a supporter of Code for America, and its founder, Tim O’Reilly, sits on its board.

The notion of “government as a platform,” which Tim has been speaking about for years now, is founded in his understanding of how technology companies have historically grown and flourished. Many of the anecdotes and historical underpinnings of Gov 2.0 are in the webcast, “What is Gov 2.0?”, in the sidebar of this blog.

Here are a few key lessons from today:

Lesson 1: Platforms spread when they are ubiquitous and barriers to entry are low.
Lesson 2: Create an architecture of participation, like Unix.
Lesson 3: Favor small pieces, loosely joined, the approach that drove the growth of the World Wide Web.
Lesson 4: Don’t (just) build websites; build Web services.
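Lesson 4 is the most concrete of the four. The difference, in miniature: instead of rendering pages, expose the data itself so other applications can build on it. Below is a minimal, hypothetical WSGI service (with an invented dataset) that returns JSON rather than HTML.

```python
import json

# Lesson 4 in miniature: a web *service* returns data, not a web page,
# so other applications can build on it. The dataset contents are invented.
DATASETS = {"parks": [{"name": "Cal Anderson", "acres": 7.4}]}

def app(environ, start_response):
    """A minimal WSGI application serving datasets as JSON."""
    name = environ.get("PATH_INFO", "/").strip("/")
    if name in DATASETS:
        body = json.dumps(DATASETS[name]).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
    else:
        body = b'{"error": "no such dataset"}'
        start_response("404 Not Found", [("Content-Type", "application/json")])
    return [body]

# Served with the standard library alone:
#   from wsgiref.simple_server import make_server
#   make_server("", 8000, app).serve_forever()
```

A website built on top of this service is then just one client among many; mashups, mobile apps and other governments can consume the same endpoint.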

There’s a lot more in there that builds upon Tim’s platform paradigm for government. Give it a listen and, if you find other insights that particularly strike you or apply to how you think about how government can leverage the power of platforms, please share them in the comments.

If the notion that data and simplicity can build the government platform sounds familiar, it should: Tim talked with the first United States chief technology officer, Aneesh Chopra, about how these ideas apply to government last year:

Building on that, if you have a moment, head on over to the White House “ExpertNet” wiki and share your thoughts on how the federal government should be designing democracy, specifically with respect to creating an open government platform for citizen consultation.