Esri’s new ArcGIS feature is live. Will terabytes of new open data follow?


Back in February, I reported that Esri would enable governments to open their data to the public. Today, the geographic information systems (GIS) software giant pushed ArcGIS Open Data live, instantly enabling thousands of its local, state and federal government users to open up the public data in their systems in just a few minutes.


“Starting today any ArcGIS Online organization can enable open data, specify open data groups and create and publicize their open data through a simple, hosted and best practices web application,” wrote Andrew Turner, chief technology officer of Esri’s Research and Development Center in D.C., in a blog post about the public beta of ArcGIS Open Data. “Originally previewed at FedGIS, ArcGIS Open Data is now in public beta where we will be working with the community on feedback, ideas, improvements and integrations to ensure that it exemplifies the opportunity of true open sharing of data.”

Turner highlighted what this would mean for both sides of the open data equation: supply and demand.

Data providers can create open data groups within their organizations, designating data to be open for download and re-use, with the data hosted on the ArcGIS site. They can also create public microsites for the public to explore. (Example below.) Turner also highlighted the code for Esri’s open-source GeoPortal Server on GitHub as a means to add metadata to data sets.

Data users, from media to developers to nonprofits to schools to businesses to other government entities, will be able to download data in common open formats, including KML, Spreadsheet (CSV), Shapefile, GeoJSON and GeoServices.
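To give a sense of what those “common open formats” buy a developer, here is a minimal sketch, using only Python’s standard library, of parsing a GeoJSON FeatureCollection of the kind such a portal might export. The feature, its fields and values are hypothetical, standing in for a downloaded dataset:

```python
import json

# A tiny GeoJSON FeatureCollection, standing in for a dataset
# downloaded from an open data portal. Fields are hypothetical.
geojson_text = """
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": {"type": "Point", "coordinates": [-83.0458, 42.3314]},
      "properties": {"name": "Detroit", "population": 713777}
    }
  ]
}
"""

data = json.loads(geojson_text)

# Each GeoJSON feature pairs a geometry with free-form properties,
# so generic tools can consume any dataset published this way.
for feature in data["features"]:
    lon, lat = feature["geometry"]["coordinates"]
    props = feature["properties"]
    print(f"{props['name']}: ({lat}, {lon})")
```

Because GeoJSON is plain JSON, no GIS software is required on the consuming end; that interoperability is the point of exporting in open formats by default.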

“As the US Open Data Institute recently noted, [imagine] the impact to opening government data if software had ‘Export as JSON’ by default,” wrote Turner.

“That’s what you now have. Users can also subscribe to the RSS feed of updates and comments about any dataset in order to keep up with new releases or relevant supporting information. As many of you are likely aware, the reality of these two perspectives are not far apart. It is often easiest for organizations to collaborate with one another by sharing data to the public. In government, making data openly available means departments within the organization can also easily find and access this data just as much as public users can.”


Turner highlighted what an open data site would look like in the wild:

Data Driven Detroit is a great example of organizations sharing data. They were able to leverage their existing data to quickly publish open data such as census, education or housing data. As someone who lived near Detroit, I can attest to the particular local love and passion the people have for their city and state – and how open data empowers citizens and businesses to be part of the solution to local issues.

In sum, this feature could, as I noted in February, mean a lot more data is suddenly available for re-use. When considered in concert with Esri’s involvement in the White House’s Climate Data initiative, 2014 looks set to be a historic year for the mapping giant.

It also could be a banner year for open data in general, if governments follow through on their promises to release more of it in reusable forms. By making it easy to upload data, hosting it for free and publishing it in the open formats developers commonly use in 2014, Esri is removing three major roadblocks governments face after a mandate to “open up” comes from a legislature, city council, or executive order from the governor or mayor’s office.

“The processes in use to publish open data are unreasonably complicated,” said Waldo Jacquith, director of the U.S. Open Data Institute, in an email. 

“As technologist Dave Guarino recently wrote, basically inherent to the process of opening data is ETL: ‘extract-transform-load’ operations. This means creating a lot of fragile, custom code, and the prospect of doing that for every dataset housed by every federal agency, 50 states, and 90,000 local governments is wildly impractical.

“Esri is blazing the trail to the sustainable way to open data, which is to open it up where it’s already housed as closed data. When opening data is as simple as toggling an ‘open/closed’ selector, there’s going to be a lot more of it. (To be fair, there are many types of data that contain personally identifiable information, sensitive information, etc. The mere flipping of a switch doesn’t address those problems.)

“Esri is a gold mine of geodata, and the prospect of even a small percentage of that being released as open data is very exciting.”
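The “extract-transform-load” pattern Guarino and Jacquith describe can be sketched in a few lines; the fragility they point to comes from the middle step, which has to be rewritten for each dataset. A minimal, hypothetical example in Python, with an in-memory CSV standing in for an agency export:

```python
import csv
import io
import json

# Extract: read raw records from a source system. The CSV and its
# columns are hypothetical stand-ins for an agency's export.
raw_csv = "parcel_id,assessed_value\nA-101,125000\nA-102,98000\n"
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: per-dataset cleanup -- the fragile, custom part that
# would need rewriting for every one of those 90,000 governments.
records = [
    {"parcel_id": r["parcel_id"], "assessed_value": int(r["assessed_value"])}
    for r in rows
]

# Load: publish in an open, machine-readable format.
published = json.dumps(records)
print(published)
```

Jacquith’s argument is that a platform which already houses the data can skip this pipeline entirely: the extract and load steps collapse into a publishing toggle, leaving only genuine policy questions such as personally identifiable information.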

Will ESRI allow public GIS data to be fully open government data?

As has been true for years, there’s a robust debate in the municipal information technology world around the use of proprietary software or open source. An important element of that conversation centers on open data, specifically whether the formats used by companies are interoperable and “open,” in the sense of being usable by more than one kind of software. When the license required to use a given software application is expensive, that requirement can put budget-strapped cities and towns in a difficult position. Last week, former New York State Senate CIO Andrew Hoppin weighed in on the debate, writing about proprietary software lions and bears in the Civic Commons marketplace, a new online directory of civic software.

I believe the Civic Commons Marketplace will ultimately save US taxpayers billions of dollars in government IT spending, while accelerating the propagation of technology-driven civic innovation in the bargain. I’ve believed this for a while. Thus, it’s a debate worth having; the Marketplace deserves attention, and critique.

In order to realize its potential, from my perspective as a recovering government CIO, I believe that the Civic Commons Marketplace must give equal billing to all software used in government, regardless of the software license associated with it.

Nick Grossman, the executive director of Civic Commons, chronicled the debate that Hoppin described in a Storify:

[View the story “Proprietary Lions and Bears in the Civic Commons Marketplace” on Storify]

I talked with ESRI founder Jack Dangermond in September 2010 about how he was opening up ESRI and the role he saw for mapping in open government. My sense then, as now, is that this is an issue that’s deeply important to him.

There are clearly strong feelings in the civic development community about the company’s willingness to open up its data, along with what that means for how public data is coded and released. If you’re a GIS developer and have an opinion on this issue, please let us know in the comments.

Jack Dangermond on mapping, government transparency and accountability

Writing over at the ESRI blog today, founder and president Jack Dangermond shared his thoughts on how maps and GIS information can contribute to improving government transparency and accountability:

Born out of the Gov 2.0 movement, the terms transparency and accountability have become part of the daily vernacular of governments and the citizens they serve. One might even suggest these words have become a new expectation of governing. Transparency and accountability began with a simple concept of openly communicating public policy to the taxpayer. Today, these concepts are thriving within a growing emphasis on developing an interactive dialog between governments and the people.

Maps can be a very valuable part of transparency in government. Maps give people a greater understanding of the world around them. They can help tell stories and many times be more valuable than the data itself. They provide a context for taxpayers to better understand how spending or decisions are being made in a circumstance of where they work and live. Maps help us describe conditions and situations, and help tell stories, often related to one’s own understanding of content.

I spoke with Dangermond about precisely this subject last year at the Gov 2.0 Summit in Washington. I believe the interview holds up and remains relevant to the conversation around open government today.

Tim O’Reilly on the power of platforms – from Web 2.0 to Gov 2.0 [VIDEO]

Earlier this spring, Tim O’Reilly gave a talk at an ESRI conference about how Web 2.0 relates to Gov 2.0. He explored the idea of the Internet as an operating system and the role of data in the future of society.

http://video.esri.com/embed/236/000000/width/600

O’Reilly ended with an encouragement to the mapping professionals and developers in the room and at large: “We really need to focus on what matters.”

How GIS technology and social media helped crisis response in Australia

As a new article at the O’Reilly Radar showed today, social data and geospatial mapping have joined the crisis response toolset. A new online application from geospatial mapping giant ESRI applies trend analysis to help responders to Australia’s recent floods create relevance and context from social media reporting. The Australian flood trends map shows how crowdsourced social intelligence provided by Ushahidi enables emergency social data to be integrated into crisis response in a meaningful way.

The combination of Ushahidi and ESRI in Australia shows that “formal and innovative approaches to information collection and analysis during disasters is possible,” said Patrick Meier, “and that there is an interface that can be crafted between official and non-official responses.” Meier is a research fellow at the Harvard Humanitarian Initiative and director of crisis mapping at Ushahidi and was reached via email.

Russ Johnson, ESRI’s global director for emergency response, recently spoke with this correspondent at the ESRI federal user conference in Washington, D.C. Johnson spent 32 years as a federal employee in southern California, predominantly working in the U.S. Forest Service. He was one of the pioneers who built up the FEMA incident response system, and he commanded one of the 18 teams around the nation that deploy assets in the wake of floods, fires and other disasters. At ESRI, Johnson helps the company understand the workflow and relevance of GIS for first-response operations. Our full interview is embedded in the video below.

The world of crisis response has changed dramatically in the past several years, said Johnson. The beauty of the present historic moment is that “everybody can be a sensor,” said Johnson. “Everybody is potentially part of the network. The struggle that operators have is taking all of that free form data and trying to put into some sort of framework that makes it accurate.”

Emergency and crisis responders are faced with significant cultural barriers that have nothing to do with logging on to a website or configuring a new account, explained Johnson. “Public safety organizations are really, really resistant to change,” he said. “Technology has frightened a lot of people before social media was a new data source. It’s a new challenge that’s threatening to a lot of people. The question I pose is simple. Let’s use the first responder scenario, where you have 4-6 minutes from the time you get the call. The expectation is you’ll be on scene. Think about the possibility that before you arrive, thousands of people will have video on YouTube. They may have more situation awareness. When you arrive, you’ll be videoed, watched, and critiqued. Shouldn’t you consider that data if it can help you deploy more safely or effectively?”

Johnson said that he really likes FEMA administrator Craig Fugate’s philosophy and operational mentality in that context. Fugate has emphasized that he believes the public can be a resource in crises, instead of a hindrance. The current FEMA chief is tapping social media’s potential for aiding disaster response. “There are times when agencies can’t get good intelligence,” said Johnson. “I cannot tell you how many times where we had televisions and the best information we were getting was from CNN or helicopters. There are times when it may be wrong but I’d rather have it be part of our mashup of data to help validate and inform responders.”

The technology itself has also evolved recently, said Johnson. “We used to have to have a specific person to support mission, which meant we had to drag a person trained in GIS everywhere. As the technology has evolved, and data has evolved, the tools have reached the operator and first responder level. We can now match persona, mission and task to GIS tech so that it fits them. You can get complex answers that can be generated by an operator, not a GIS geek.”

How did Haiti change the conversation?

“Everyone thought Haiti would be completely dark,” said Johnson, with all information provided by boots on the ground. In fact, social media played an important role, he said, highlighted by the efforts of Crisis Commons and others who heard those digital cries for help. Social media “brought the light on,” said Johnson, providing not just something to act on but perhaps the only thing to act on, at least initially. In subsequent crises, responders have found that crisis data, particularly when added to maps for context, can provide valuable insight long before official reports emerge.

This trend is a key issue for communities as more citizen engagement platforms emerge. “When you have a large emergency, who are the first responders? Who can get to you the most quickly? Your neighbors,” said Johnson. “If you can have a universal way to communicate to the people who can help you, that may be the only help you have. Conventionally, you think of the guys in uniforms and helmets.”

In 2011, citizens have the opportunity to shoulder more of that shared responsibility than ever.