Podcast: IT Security, Internet Freedom and Open Government at Threatpost

This morning, I was privileged to join Dennis Fisher on the Digital Underground podcast to talk about IT security, open government, Internet freedom and open data movements, including how they’re affecting IT security.

Listen: IT Security, Internet Freedom and Open Government [MP3]

Fisher, a founding editor of the Threatpost blog and a former colleague of mine from TechTarget, is one of the best information security journalists in the industry.

Over the course of the podcast, we discussed the different ways in which Internet freedom and privacy play into the current climate online. (We also talked a bit about Twitter and journalism.) As 2011 matures, legitimate concerns about national security will continue to be balanced with the spirit of open government expressed by the Obama administration.

The tensions between Wikileaks and open government policies are substantial. Open data may be used for accountability, citizen utility and economic opportunity. But as federal CIO Vivek Kundra said to Harvard Business School students studying Data.gov last year, the transparency facet of the Obama administration’s open government initiative has multiple layers of complexity.

Fisher and I explore these issues, along with a number of the complexities involved in improving information sharing between the public and private sectors when it comes to vulnerabilities and threats. Currently, over 80% of the nation’s critical infrastructure is in the private sector.

Related stories:

Civic coders for America gather in DC for a Presidents’ Day datacamp

This past weekend, civic developers gathered at a Seattle data camp to code for America. This Presidents’ Day, the day before George Washington’s Birthday, dozens of government technologists, data nerds, civic hackers and citizens from around the District of Columbia, Virginia and Maryland will join Code for America fellows for a datacamp at Big Window Labs.

The attendees of the Washington datacamp can look to the Seattle Data Camp for inspiration. The civic hacktivism on display there led to engaged discussions about Seattle’s South Park neighborhood, mobile damage assessment apps, transit apps, mobile/geolocation apps, data mining and information visualization.

Perhaps even more impressive, one of those discussions led to the creation of a new smartphone application. Hear Near pushes alerts about nearby Seattle events to iPhone or Android device users via text messages. Hear Near is now available for iPhone and Android.

Joe McCarthy published a terrific post about Data Camp Seattle that offers a great deal of insight into why the event worked well. McCarthy helped the HearNear team by identifying and defining mappings between the GeoLoqi API and the iCal feed.
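To make the kind of mapping McCarthy describes more concrete, here is a hedged sketch of translating events from an iCal feed into a location-trigger payload. The payload field names are illustrative placeholders, not the actual Geoloqi API, and the sample event is invented:

```python
from datetime import datetime

# Hypothetical sketch: parse a minimal iCal VEVENT and map it onto the
# kind of place-trigger payload a geolocation alert service might accept.
# Payload field names below are illustrative, not the real Geoloqi API.

SAMPLE_VEVENT = """BEGIN:VEVENT
SUMMARY:Jazz in the Park
DTSTART:20110221T190000
GEO:47.6097;-122.3331
END:VEVENT"""

def parse_vevent(text):
    """Parse a minimal VEVENT block into a dict of its properties."""
    props = {}
    for line in text.strip().splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            props[key] = value
    return props

def to_trigger(props, radius_m=500):
    """Map iCal properties onto an illustrative place-trigger payload."""
    lat, lon = props["GEO"].split(";")
    return {
        "text": props["SUMMARY"],
        "latitude": float(lat),
        "longitude": float(lon),
        "radius": radius_m,  # alert when a subscriber comes within this range
        "start": datetime.strptime(props["DTSTART"], "%Y%m%dT%H%M%S").isoformat(),
    }

trigger = to_trigger(parse_vevent(SAMPLE_VEVENT))
print(trigger["text"], trigger["latitude"], trigger["longitude"])
```

The design point is simply that a calendar feed and a geofencing service share enough structure (a name, a location, a time) that a weekend hackathon team can wire one to the other.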

McCarthy describes how a creative discussion amongst talented, civic-minded people enabled them to donate their skills to putting the open data from Seattle’s data repository to work for its citizens. He also explored what inspires him about Code for America:

I wasn’t sure what to expect going into the event, but was greatly impressed with the interactions, overall experience and outcomes at Data Camp Seattle. I’ve admired the Code for America project since first learning about it, and have been a proponent of open data and platform thinking (and doing) on my blog. It was inspiring and empowering to have an opportunity to do more than simply blog about these topics … though I recognize the potential irony of writing that statement in a new blog post about these topics.

I suspect that one of the most durable outcomes of the Code for America project will be this kind of projection or radiation of civic empowerment through – and beyond – the efforts of the CfA fellows and their collaboration partners. In The Wealth of Networks, Yochai Benkler writes about how “[t]he practice of producing culture makes us all more sophisticated readers, viewers, and listeners, as well as more engaged makers”. In Program or Be Programmed, Doug Rushkoff warns against “relinquishing our nascent collective agency” to computers and the people who program them by engaging in “a renaissance of human capacity” by becoming programmers ourselves.

While many – or even most – of the specific applications we designed and developed during the Data Camp Seattle civic hackathon may not gain widespread traction and use, if the experience helps more of us shift our thinking – and doing – toward becoming co-creators of civic applications – and civic engagement – then the Code for America project will have succeeded in achieving some grand goals indeed.

This example of directed action at an unconference has fast become the next step in the evolution of camps, where a diverse set of volunteers come together to donate more than money or blood: they exchange information and then apply their skills to creating solutions to the needs defined by a given set of societal challenges.

This model of directed civic involvement has become a global phenomenon in the wake of the crisiscamps that sprang up after the earthquake in Haiti last year. The cultural DNA of these camps has evolved into CrisisCommons, which has acted as a platform for volunteers to donate their skills during natural disasters and other crises.

As the role of the Internet as a platform for collective action grows, those volunteers are gaining more ability to make a difference using powerful lightweight collaboration technology and open source data tools.

From towns across the United States, including Illinois, to cities in Denmark, Brazil, Kenya and India, people interested in local Gov 2.0 have been gathering to create applications that use open public data. In December, the International Open Data Hackathon convened participants in more than 56 cities across 26 countries on 5 continents.

As Seattle CIO Bill Schrier put it this past weekend, they’re turning data into information. Federal CTO Aneesh Chopra has praised these kinds of efforts as “hacking for humanity.” An event like Random Hacks of Kindness “brings together the sustainable development, disaster risk management, and software developer communities to solve real-world problems with technology.”

On Presidents’ Day, another datacamp will try to put that vision into action.


New York City launches 311 online service request map

If you read Steven B. Johnson, you know that 100 million 311 calls reveal a lot about New York. Now citizens can see those 311 requests for themselves: today, New York City launched a 311 online service request map.

“The launch of the 311 Service Request Map is another milestone in the City’s efforts to improve the way we report 311 data to the public,” said Deputy Mayor Stephen Goldsmith in a prepared statement. “The release of this information will better enable the public and elected officials to hold the City accountable for the services we provide. Putting better information into the hands of community leaders across the five boroughs increases transparency and allows us to collaboratively address the problems that neighborhoods face.”

It appears that Inwood and Washington Heights are making the most 311 service requests, according to the latest version of the map.

This 311 online service request map is a good start, with layers, custom searches and a clean design. Querying for those layers is a bit slow but returns relevant results for bike parking or continuing education, although many of the other layers appear to be grayed out and “coming soon.” Querying for specific request types was even slower, so for the moment I can’t find out where complaints about poison ivy, illegal animals or noisy church services are concentrated.

Early reviews have generally been positive but guarded, with room to grow. It’s a “huge step forward, long way to go,” tweeted Philip Ashlock. They “could get something better for free (eg mobile) by doing #Open311 API instead,” referring to the idea of government as a platform.

I “would much rather have the data raw via an API in an open [format] than the map UI (which isn’t all bad) in the way,” tweeted Mark Headd.
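The Open311 GeoReport v2 specification that Ashlock and Headd allude to defines a simple REST interface for exactly this kind of raw access. As a hedged sketch, assuming a hypothetical endpoint base URL and service code, a query for open service requests would be built along these lines (the path and query parameters follow the GeoReport v2 spec; the base URL and code are placeholders):

```python
from urllib.parse import urlencode

# Sketch: pull raw 311 data through an Open311 GeoReport v2 endpoint
# rather than a map UI. The base URL is a placeholder; the path
# (/requests.json) and parameter names follow the GeoReport v2 spec.

BASE = "https://example.gov/open311/v2"  # hypothetical endpoint

def requests_url(service_code=None, start_date=None, end_date=None, status=None):
    """Build a GeoReport v2 query for existing service requests."""
    params = {
        "service_code": service_code,  # e.g. a jurisdiction's pothole code
        "start_date": start_date,      # ISO 8601 timestamps per the spec
        "end_date": end_date,
        "status": status,              # "open" or "closed"
    }
    # Drop unset parameters so the query stays minimal.
    query = urlencode({k: v for k, v in params.items() if v is not None})
    return f"{BASE}/requests.json?{query}"

url = requests_url(service_code="001", status="open")
print(url)
```

An open endpoint like this is what would let a developer build the missing mobile client, legend or time-series view that the map itself lacks.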

The “NYC 311 map is impressive technically, but lacks context (time period?), a legend (Maps 101), & metadata. I wonder if they talked to users before implementing it,” tweeted Steven Romalewski. “Also, the city obviously has the address-level 311 data. It’d be nice if they published the raw data so others could analyze it (residents & NYC Council reps have been asking for this for years). That would indicate a real commitment to transparency.”

UPDATE: Nick Judd published an excellent post on the New York City 311 map at techPresident, where he reports that, according to a city spokesman, raw complaint data from 311 will come to the city’s data repository, the NYC DataMine, later this year. Judd also fills in a couple of key details, like:

  • The application was built using the city’s public city-wide geospatial information system, CityMap
  • Requests for literature are not included in the NYC 311 map
  • Deputy Mayor Stephen P. Goldsmith acknowledged in a press conference today, reported on by the New York Times, that “this addition to the city’s open data efforts was a nod to transparency advocates.”

“Some of this will not be entirely exciting for those of us whose job it is to make sure that the holes in the street are filled and the trash is picked up because it’ll provide visibility to what we are or not doing,” Mr. Goldsmith said. “And some of you will enjoy that visibility.”

As Judd notes, the New York Times Cityroom blog has good coverage of this step toward getting a visual on New Yorkers’ 311 calls.

The importance of being earnest about big data

We are deluged with big data. We have become more adept, however, at collecting it than at making sense of it. The companies, individuals and governments that become the most skilled at data analysis are doing more than finding the signal in the noise: they are creating a strategic capability. Why?

“After Eisenhower, you couldn’t win an election without radio.

After JFK, you couldn’t win an election without television.

After Obama, you couldn’t win an election without social networking.

I predict that in 2012, you won’t be able to win an election without big data.”

Alistair Croll, founder of bitcurrent.

In November 2012, we’ll know if his prediction came true.

All this week, I’ll be reporting from Santa Clara at the so-called “data Woodstock” that is the Strata Conference. Croll is its co-chair. You can tune in to the O’Reilly Media livestream for the conference keynotes.

For some perspective on big data and analytics in government, watch IBM’s Dave McQueeney at last year’s Gov 2.0 Summit:

Or watch how Hans Rosling makes big data dance in this TED Talk:
