
Earlier today, however, a mechanical engineer named Claudio Ibarra commented on a Google+ thread that he thought that the animated GIF was a “waste.” Continue reading
You could spend a long day listing all of the organizations and individuals putting government data online, from Carl Malamud to open government activists in Brazil, Africa or Canada. As many conversations in the public domain over the past few years have demonstrated, there are many different perspectives on what purposes “open data” should serve, often informed by what advocates intend or by an organization or institution’s goals. For those interested, I highly recommend the open data seminar and the associated comments.
When and if such data includes ratings or malpractice information about hospitals or doctors, or fees for insurance companies, transparency and accountability are important byproducts, which in turn have political implications. (Watch the reaction of unions or doctors’ groups to performance or claims data going online to see those conflicts play out.)
There are people who want to see legislatures open their data, to provide more insight into those processes, and others who want to see transit data or health data become more open, in the service of more civic utility or patient empowerment.
Other people may support publishing more information about the business or performance of government because evidence of fraud, mismanagement or incompetence will support their arguments for shrinking the size of the state. A big tent for open government can mean that libertarians could end up supporting the same bills liberals do.
In the U.S., Govtrack.us has been making government legislative data open by “scraping” Thomas.gov, despite the lack of bulk access. Many people, like the Sunlight Foundation, wish to see campaign finance data opened to show where influence and power lie in the political system. Many members of civil society, media organizations and startups are collecting, sharing or using open data, from OpenCorporates and OpenCongress to Brightscope and ProPublica.
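For readers unfamiliar with the technique, “scraping” just means programmatically fetching pages built for humans and extracting the structured data underneath. Here is a minimal sketch in Python, with a hypothetical URL and page structure (this is not GovTrack’s actual code), assuming the requests and BeautifulSoup libraries:

```python
# A minimal sketch of screen-scraping legislative data when no bulk
# download is offered. The URL and markup here are hypothetical; a real
# scraper must be matched to the target site's actual HTML structure.
import requests
from bs4 import BeautifulSoup

def scrape_bill_titles(index_url):
    """Fetch a (hypothetical) bill index page and pull out bill titles."""
    response = requests.get(index_url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Assume each bill is listed as <li class="bill"><a>...</a></li>.
    return [link.get_text(strip=True) for link in soup.select("li.bill a")]

if __name__ == "__main__":
    for title in scrape_bill_titles("https://legislature.example.gov/bills"):
        print(title)
```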
Whether anyone chooses to describe those activities as a movement is up to them — but it is indisputable that three years ago, a neutral observer would have been hard-pressed to find an open government data platform. Now there are dozens at the national level. What matters more than their existence, however, is what goes onto them, and there people have to be extremely careful about giving governments credit for just putting a “portal” online.
While the raw number of open government data platforms around the globe looks set to continue to increase in 2013 at every level of government, advocates should be wary of governments claiming “open government” victories as a result.
First, govts claim to defend “Net freedom” – and Net becomes less free. Then, govts embrace “open data” – and govts become less open. Bingo!
— Evgeny Morozov (@evgenymorozov) May 1, 2012
Since Morozov sent out that tweet, he’s published a book with a chapter that extends that critique, along with a series of New York Times op-eds, reviews, Slate debates, and a 16,000-word essay in The Baffler that explores the career and thinking of Tim O’Reilly (my publisher). Morozov’s essay prompted Annalee Newitz to paraphrase and link to it in a post at io9, where Tim responded in a comment.
While his style can distract and detract from his work — and his behavior on Twitter can be fairly characterized as contemptuous at times — the issues Morozov raises around technology and philosophy are important and deserve to be directly engaged by open government advocates, as John Wilbanks suggests.
@ramez If you remove the personal attack, it’s a very valuable essay. He makes terrific points about the abuse of rhetoric.
— Annalee Newitz (@Annaleen) April 3, 2013
@anildash @carlmalamud @digiphile I was summing up his argument. “Openwashing” is the main issue. Also we do need to interrogate our memes.
— Annalee Newitz (@Annaleen) April 3, 2013
That’s happening, slowly. Sunlight Foundation policy director John Wonderlich has also responded, quoting Morozov’s recommendations to reflect on how he might view specific uses of technology that support open government. Wilbanks himself has written one of the most effective (short) responses to date:
One of the reasons I do “open” work is that I think, in the sciences, it’s a philosophical approach that is more likely to lead to that epistemic transformation. If we have more data available about a scientific problem like climate change, or cancer, then the odds of the algorithms figuring something out that is “true” but incomprehensible to us humans go up. Sam Arbesman has written about this nicely both in his book The Half-Life of Facts and in another recent Slate article.
I work for “open” not because “open” solves a specific scientific problem, but because it increases the overall probability of success in sensorism-driven science. Even if the odds of success themselves don’t change, increasing the sample size of attempts will increase the net number of successes. I have philosophical reasons for liking open as well, and those clearly cause me cognitive bias on the topic, but I deeply believe that the greatest value in open science is precisely the increased sample size of those looking.
I also tend to think there’s a truly, deeply political element to enabling access to knowledge and science. I don’t think it’s openwashing (and you should read this paper recommended by Morozov on the topic) to say that letting individuals read science can have a real political impact.
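The statistical intuition in Wilbanks’ argument is simple expected value: if each independent attempt has the same odds p of success, then n attempts yield n·p expected successes, so raising n raises the net number of successes even when p never changes. A toy simulation, with an assumed, purely illustrative p:

```python
# A toy illustration of the "increased sample size" point: more attempts
# mean more expected successes, even though per-attempt odds stay fixed.
import random

random.seed(42)
P_SUCCESS = 0.01  # assumed per-attempt odds (illustrative only)

for attempts in (100, 1_000, 10_000):
    wins = sum(random.random() < P_SUCCESS for _ in range(attempts))
    print(f"{attempts:>6} attempts -> {wins} successes "
          f"(expected {attempts * P_SUCCESS:.0f})")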
Morozov’s critique of “openwashing” isn’t specious, though it’s fair to question his depiction of the history of open source and free software, and the lack of balance in his consideration of various open government efforts. Civil society and media must be extremely careful about giving governments credit for just putting a “portal” online.
On that count, Wonderlich wrote about the “missing open data policy” that every government that has stood up or will stand up an open data platform could benefit from reading:
Most newly implemented open data policies, much like the Open Government Directive, are announced alongside a package of newly released datasets, and often new data portals, like Data.gov. In a sense, these pieces have become the standard parts of the government data transparency structure. There’s a policy that says data should generally be open and usefully released, a central site for accessing it, some set of new data, and perhaps a few apps that demonstrate the data’s value.
Unfortunately, this is not the anatomy of an open government. Instead, this is the anatomy of the popular open government data initiatives that are currently in favor. Governments have learned to say that data will be open, provide a place to find it, release some selected datasets, and point to its reuse.
This goes to the concerns of traditional advocates working for good government, as explored in an excellent research paper by Yu and Robinson on the ambiguity of open government and open data, along with the broader discussion you’ll find in civil society in the lead-up to the Open Government Partnership, where this dynamic was the subject of much concern — and not just in the Canadian or United Kingdom context. The work exploring this dynamic by Nathaniel Heller at Global Integrity is instructive.
As I’ve written before (unrepentant self-plagiarism alert), standing up open data platforms and publishing data sets regarding services is not a replacement for a Constitution that enforces a rule of law, free and fair elections, an effective judiciary, decent schools, basic regulatory bodies or civil society, particularly if the data does not relate to meaningful aspects of society.
Socrata, a venture-capital-backed startup whose technology powers the open data platforms of city, state and national governments, including Kenya and the United States, is also part of this ecosystem and indisputably has “skin in the game.”
That said, the insights that Kevin Merritt, the founder of Socrata, shared in a post on reinventing government are worth considering:
An open Government strategy needs to include Open Data as a component of enabling transparency and engaging citizens. However, Open Government is also about a commitment to open public meetings; releasing public information in all its forms, if not proactively at least in a timely fashion; engaging the public in decision making; and it is also a general mindset, backed up by clear policy, that citizens need to be empowered with information and a voice so they can hold their government accountable.
At the same time, a good Open Data strategy should support Open Government goals, by making structured data that relates to accountability and ethics, like spending data, contracts, staff salaries, elections, political contributions, program effectiveness, etc., available in machine- and human-readable formats.
The open data strategy advanced by the White House and 10 Downing Street has not embraced releasing all of those data types, although the Obama administration did follow through on the President’s promise to launch Ethics.gov.
The Obama administration has come under heavy criticism for the quality of its transparency efforts from watchdogs, political opponents and media. It’s fair to say that this White House has advanced an unprecedented effort to open up government information while it has a much more mixed record on transparency and accountability, particularly with respect to national security and a culture of secrecy around the surveillance state.
Open government advocates assert that the transparency that President Obama promised has not been delivered, as Charles Ornstein, a senior reporter at ProPublica, and Hagit Limor, president of the Society of Professional Journalists, wrote in the Washington Post. In fact, the current administration’s open data initiatives are one of the bright spots in its transparency record — and that’s in the context of real data quality and cultural issues that need to be addressed to match the rhetoric of the past four years.
“Government transparency is not the same as data that can be called via an API,” said Virginia Carlson, former president of the Metro Chicago Information Center. “I think the ‘New Tech’ world forgets that — open data is a political process first and foremost, and a technology problem second.”
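Carlson’s distinction is worth making concrete: on a Socrata-backed portal, “data that can be called via an API” amounts to a few lines of code, which says nothing about the political process that decides what gets published. A minimal sketch using the SODA interface such portals expose; the domain is Chicago’s data portal, the dataset ID is a placeholder, and the requests library is assumed:

```python
# A minimal sketch of "data called via an API": fetching rows from a
# Socrata-backed open data portal over its SODA interface. The dataset
# ID below is a placeholder; real IDs are listed on each portal.
import requests

def fetch_rows(domain, dataset_id, limit=5):
    """Return the first few rows of a dataset as a list of dicts."""
    url = f"https://{domain}/resource/{dataset_id}.json"
    response = requests.get(url, params={"$limit": limit}, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    for row in fetch_rows("data.cityofchicago.org", "xxxx-xxxx"):
        print(row)
```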
If we look at what’s happening with open government in Chicago, a similar dynamic seems to have emerged, as the city methodically works to release high quality open data related to services, performance or lobbying but is more resistant to media organizations pushing for more access to data about the Mayor’s negotiations or electronic communications, the traditional targets of open government advocacy. This tension was explored quite well in an article by WBEZ on the people behind Chicago’s government 2.0 efforts.
In the United States, there is a sizable group of people that believe that data created using public funds should in turn be made available to the public — and that the Internet is a highly effective place to make such data available. Such thinking extends to open access to research or public sector code, too.
As those policy decisions are implemented, asking hard questions about data quality, use, licenses, outcomes and cost is both important and useful, particularly given that motivations and context will differ from country to country and from industry to civil society.
Who benefits and how? What existing entities are affected? Should all public data be subject to FOIA? If so, under what timelines and conditions? Should commercial entities that create or derive economic value from data pay for bulk access? What about licensing? If government goes digital, how can the poor, disabled or technically illiterate be given access and voice as well? (Answers to some of these questions are in the Sunlight Foundation’s principles of open government data, which were based on the recommendations of an earlier working group.)
In the United Kingdom, there are also concerns that the current administration’s “open data agenda” obscures a push toward the privatization of public services, a dynamic that deserves more prominence in public debates and one that Morozov recently explored in the opinion pages of the New York Times. My colleague Nat Torkington highlighted the need for a discussion about which services should be provided by government at Radar back in 2010:
Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.
Whether one agrees with the side of the argument that supports investment or the other that is looking for cost-savings — or both — is something that people of democratic societies will need to debate and decide for themselves, along with the size and role of government. The politics can’t be abstracted away.
I don’t think that many open government advocates are blind to the ideologies involved, including the goals of libertarians, nor do I think that the “open dystopia” Newitz described at io9 is a particularly likely outcome.
That said, given the stakes, these policies deserve to be the subject of debate in every nation whose leaders are putting them forward. We’ve never had better tools for debate, discussion and collective action. Let’s use them.
In an age where setting up a livestream to the Web and the rest of the networked world is as easy as holding up a smartphone and making a few taps, the United States Supreme Court appears more uniformly opposed to adding cameras in the courtroom than ever.
Continue reading
The 2012-2013 influenza season has been a bad one, with flu reaching epidemic levels in the United States. Continue reading
The Open Government Partnership (OGP) has released statistics on its first 16 months since its historic launch in New York City, collected together in the infographic embedded below. This week, open government leaders are meeting in Chile to discuss the formal addition of Argentina to the partnership and the national plans that Latin American countries have pledged to implement. [Livestream] Álvaro Ramirez Alujas, founder of the Group of Investigation in Government, Administration and Public Policy (GIGAPP), assisted OGP with an analysis of these OGP action plans.
The second panel of #OGPChile continues: a comparison of the action plans twitter.com/ciudadanoi/sta…
— CiudadanoInteligente (@ciudadanoi) January 10, 2013
The infographic is also available in Spanish:
As I noted in my assessment of 2012 trends for Radar, last year the Economist’s assessment was that open government grew globally in scope and clout.
As we head into 2013, it’s worth reiterating a point I made last summer in a post on oversight of the Open Government Partnership:
There will be inevitable diplomatic challenges for OGP, from South Africa’s proposed secrecy law to Russia’s membership. Given that context, all of the stakeholders in the Open Government Partnership — from the government co-chairs in Brazil and the United Kingdom to the leaders of participating countries to the members of civil society that have been given a seat at the table — will need to keep pressure on other stakeholders if significant progress is going to be made on all of these fronts.
…
If OGP is to be judged as more than a PR opportunity for politicians and diplomats to make bold framing statements, government and civil society leaders will need to do more to hold countries accountable to the commitments required for participation: they must submit Action Plans after a bona fide public consultation. Moreover, they’ll need to define the metrics by which progress should be judged and be clear with citizens about the timelines for change.
The post-industrial future of journalism is already here. It’s just not evenly distributed yet. The same trends changing journalism and society have the potential to create significant social change throughout the African continent, as states move from conditions of information scarcity to abundance.
That reality was clear on my recent trip to Africa, where I had the opportunity to interview Justin Arenstein at length during my visit to Zanzibar. Arenstein is building the capacity of African media to practice data-driven journalism, a task that has taken on new importance amid the digital disruption that has permanently altered how we discover, read, share and participate in news.
One of the primary ways he’s been able to build that capacity is through the African News Innovation Challenge (ANIC), an African analogue of the Knight News Challenge in the United States.
The 2011 Knight News Challenge winners illustrated data’s ascendance in media and government, with platforms for data journalism and civic connections dominating the field.
As I wrote last September, the projects that the Knight Foundation has chosen to fund over the last two years are notable examples of working on stuff that matters: they represent collective investments in digital civic infrastructure.
The first winners of the African News Innovation Challenge, which concluded this winter, look set to extend that investment throughout the continent of Africa.
“Africa’s media face some serious challenges, and each of our winners tries to solve a real-world problem that journalists are grappling with. This includes the public’s growing concern about the manipulation and accuracy of online content, plus concerns around the security of communications and of whistleblowers or journalistic sources,” wrote Arenstein on the News Challenge blog.
While the twenty 2012 winners include investigative journalism tools and whistleblower security, there’s also a focus on citizen engagement, digitization and making public data actionable. To put it another way, the “news innovation” that’s being funded on both continents isn’t just gathering and disseminating information: it’s now generating data and putting it to work in the service of the needs of residents or the benefit of society.
“The other major theme evident in many of the 500 entries to ANIC is the realisation that the media needs better ways to engage with audiences,” wrote Arenstein. “Many of our winners try tackle this, with projects ranging from mobile apps to mobilise citizens against corruption, to improved infographics to better explain complex issues, to completely new platforms for beaming content into buses and taxis, or even using drone aircraft to get cameras to isolated communities.”
In the first half of our interview, published last year at Radar, Arenstein talked about Hacks/Hackers, and expanding the capacity of data journalism. In the second half, below, we talk about his work at African Media Initiative (AMI), the role of open source in civic media, and how an unconference model for convening people is relevant to innovation.
Justin Arenstein: The AMI has been going on for just over three years. It’s a fairly young organization, and I’ve been embedded now for about 18 months. The major deliverables and the major successes so far have been:
The idea is that we test ideas that are allowed to fail. We fund them in newsrooms and they’re driven by newsrooms. We match them up with technologists. We try and lower the barrier for companies to start experimenting and try and minimize risk as much as possible for them. We’ve launched a couple of slightly larger funds for helping to scale some of these ideas. We’ve just started work on a social venture or a VC fund as well.
Justin Arenstein: Africa hasn’t had the five-year kind of evolutionary growth that the Knight News Challenge has had in the U.S. What the News Challenge has done in the U.S. is effectively grown an ecosystem where newsrooms started to grapple with and accepted the reality that they have to innovate. They have to experiment. Digital is core to the way that they’re not only pushing news out but to the way that they produce it and the way that they process it.
We haven’t had any of that evolution yet in Africa. When you talk about digital news with African media, they think you’re speaking about social media or a website. We’re almost right back at where the News Challenge started originally. At the moment, what we’re trying to do is raise sensitivity to the fact that there are far more efficient ways of gathering, ingesting, processing and then publishing digital content — and building tools that are specifically suited for the African environment.
There are bandwidth issues. There are issues around literacy, language use and also, in some cases, very different traditions of producing news. The output of what would be considered news in Africa might not be considered news product in some Western markets. We’re trying to develop products to deal with those gaps in the ecosystem.
Justin Arenstein: Some of the projects that we thought were particularly strong or apt amongst the African News Challenge finalists included more efficient or more integrated ways to manage workflow. If you look at many of the workflow software suites in the north, they’re, by African standards, completely unaffordable. As a result, there hasn’t been any systemic way that media down here produced news, which means that there’s virtually no way that they are storing and managing content for repackaging and for multi-platform publishing.
We’re looking at ways of not reinventing a CMS [content management system], but actually managing and streamlining workflow from ingesting reporting all the way to publishing.
Justin Arenstein: I think I may have misspoken by saying “content management systems.” I’m referring to managing, gathering and storing old news, the production and the writing of new content, a three- or four-phase editing process, and then publishing across multiple platforms. Ingesting creative design, layout, and making packages into podcasting or radio formats, and then publishing into things like Drupal or WordPress.
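That pipeline translates naturally into a small state machine: a story advances through fixed phases from ingestion to multi-platform publishing. A minimal sketch of the idea; the phase names are illustrative assumptions, not AMI’s actual tooling:

```python
# A minimal sketch of a multi-phase editorial workflow: a story moves
# from ingestion through successive edits to publication, and cannot
# skip steps. Phase names are illustrative assumptions.
PHASES = ["ingested", "drafted", "subedited", "legal_review",
          "final_edit", "published"]

class Story:
    def __init__(self, slug):
        self.slug = slug
        self.phase = PHASES[0]

    def advance(self):
        """Move the story to the next phase, refusing to skip steps."""
        i = PHASES.index(self.phase)
        if i == len(PHASES) - 1:
            raise ValueError(f"{self.slug} is already published")
        self.phase = PHASES[i + 1]

story = Story("county-budget")
while story.phase != "published":
    story.advance()
    print(story.slug, "->", story.phase)
```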
There have been attempts to take existing CMS systems like Drupal and turn them into broader, more ambitious workflow management tools. We haven’t seen very many successful ones. A lot of the kinds of media that we work with are effectively offline media, so these have been very lightweight applications.
The one thing that we have focused on is trying to “future-proof” it, to some extent, by building a lot of meta tagging and data management tools into these new products. That’s because we’re also trying to position a lot of the media partners we’re working with to be able to think about their businesses as data or content-driven businesses, as opposed to producing newspapers or manufacturing businesses. This seems to be working well in some early pilots we’ve been doing in Kenya.
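The “future-proofing” Arenstein describes is at bottom a data-modeling exercise: every story is stored with enough structured metadata that an archive can later be queried, repackaged and treated as a content business. A minimal sketch of such a record; the field names are illustrative assumptions, not AMI’s actual schema:

```python
# A minimal sketch of a metadata-rich story record that can be queried
# and repackaged later. Field names are illustrative assumptions.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class StoryRecord:
    story_id: str
    headline: str
    body: str
    language: str                                  # e.g. "en", "sw"
    topics: list = field(default_factory=list)     # tagged subjects
    places: list = field(default_factory=list)     # geographic tags
    platforms: list = field(default_factory=list)  # "print", "web", "sms"

record = StoryRecord("2013-0042", "County budget passes", "…", "en",
                     topics=["budget"], places=["Nairobi"])
print(json.dumps(asdict(record), ensure_ascii=False, indent=2))
```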
Justin Arenstein: A big goal that we think we’ve achieved was to try and build a community of use. We put people together. We deliberately took them to an exotic location, far away from any town, where they’re effectively held hostage in a hotel. We built in as much free time as possible, with many opportunities to socialize, so that they start creating bonds. Right from the beginning, we did a “speed dating” kind of thing. There were very few presentations — in fact, there was only one PowerPoint in five days. The rest of the time, it’s actually the participants teaching each other.
We brought in some additional technology experts or facilitators, but they were handpicked largely from previous challenges to share the experience of going through a similar process and to point people to existing resources that they might not be aware of. That seems to have worked very well.
On the sidelines of the Tech Camp, we’ve seen additional collaborations happen for which people are not asking for funding. It just makes logical sense. We’ve already seen some of the initial fruits of that: three of the applicants actually partnered and merged their applications. We’ve seen a workflow editorial CMS project partner up with an ad booking and production management system, to create a more holistic suite. They’re still building as two separate teams, but they’re now sharing standards and they’re building them as modular products that could be sold as a broader product suite.
Justin Arenstein: We’ve tried to tap into quite a few of them. Some of the more recent tools are transferable. I think there was a grand realization that people weren’t able to deliver on their promises — and where they did deliver on tools, there wasn’t documentation. The code was quite messy. They weren’t really robust. Often, applications were written for specific local markets or data requirements that didn’t transfer. You effectively had to rebuild them. We have been able to re-purpose DocumentCloud and some other tools.
I think we’ve learned from that process. What we’re trying to do with our News Challenge is to workshop finalists quite aggressively before they put in their final proposals.
Firstly, make sure that they’re being realistic, that they’re not unnecessarily building components, or wasting money and energy on building components for their project that are not unique, not revolutionary or innovative. They should try and almost “plug and play” with what already exists in the ecosystem, and then concentrate on building the new extensions, the real kind of innovations. We’re trying to improve on the Knight model.
Secondly, once the grantees actually get money, it comes in a tranche format so they agree to an implementation plan. They get cash, in fairly small grants by Knight standards. The maximum is $100,000. In addition, they get engineering or programming support from external developers that are on our payroll, working out of our labs. We’ve got a civic lab running out of Kenya and partners, such as Google.
Thirdly, they get business mentorship support from some leading commercial business consultants. These aren’t nonprofit types. These are people who are already advising some of the largest media companies in the world.
The idea is that, through that process, we’re hopefully going to arrive at a more realistic set of projects that have either sustainable revenue models and scaling plans, from the beginning, or built-in mechanisms for assessments, reporting back and learning, if they’re designed purely as experiments.
We’re not certain if it’s going to work. It’s an experiment. On the basis of the Tech Camp that we’ve gone through, it seems to have worked very well. We’ve seen people abandon what were, we thought, overly ambitious technology plans and rather matched up or partnered with existing technologists. They will still achieve their goals but do so in a more streamlined, agile manner by re-purposing existing tech.
Editor’s Note: This interview is part of an ongoing series at the O’Reilly Radar on the people, tools and techniques driving data journalism.
Pollwatch, a mobile application that enabled crowdsourced poll monitoring, has launched a final version at pollwatch.us, just in time for Election Day 2012. The initial iteration of the app was conceived, developed and demonstrated at the hackathon at the 2012 Personal Democracy Forum in New York City. Continue reading
As is the case with every major event in the U.S., social media was part of the fabric of communications during Hurricane Sandy. Twitter was a window into what was happening in real time. Facebook gave families and friends a way to stay in touch about safety or power. And government officials and employees, from first responders and mayors to governors to the President of the United States, put critical information into the hands of citizens who needed it.
While Hurricane Sandy cemented the utility of these networks, neither they nor their role is new. With all due respect to Gartner analyst Andrea Di Maio, his notion that people aren’t conveying “useful information” there every day — that it’s just “chatting about sport results, or favorite actors, or how to bake” — is like some weird flashback to a 2007 blog post or an ignorant cable news anchor.
For years now, public sector officials, first responders and emergency managers have recognized the utility of social media reports as a means for situational awareness before, during and after natural or man-made disasters, and they have integrated these tools into crisis response.
Officials at the local, state and federal levels have confirmed to me again and again that it’s critical to build trusted networks *before* disaster strikes, so that when crises occur, the quality of intelligence is improved and existing relationships with influencers can amplify their messages.
Media and civil society serve as infomediaries and critical filters (aka, B.S. detectors) for vetting information, something that has proved crucial with fake reports and pictures popping up. Official government accounts play a critical role for putting trusted information into the networks to share, something we saw in real-time up and down the East Coast this week.
To be frank, Di Maio’s advice that authorities shouldn’t incorporate social media into their normal course of business is precisely the opposite of the experience on the ground of organizations like the Los Angeles Fire Department, Red Cross or FEMA. Here’s Brian Humphrey, public information officer of the LAFD, on best practices for social media:
If public safety officials come across Di Maio’s advice, I hope they’ll choose instead to listen to citizens every day and look to scale the best practices of their peers for using technology for emergency response, not start during a crisis.
In general, connecting more citizens with their legislators and creating more resources for Congress to understand where constituents and the tech community stand on proposed legislation is a good thing. Last year’s Congressional hearings on the Stop Online Piracy Act … Continue reading
Today, I hosted a Twitter chat with the Voting Information Project. They partner with states to provide official election data that developers can use to create free, open source tools for voters.
I’ve embedded a Storify of our conversation below, along with a video explaining more about what they do. Of special note: VIP is partnering with Mobile Commons to let registered voters know where to vote. Just text “where” or “donde” to 877-877.
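For developers who want the same lookup programmatically, VIP’s data is among the sources behind Google’s Civic Information API. A hedged sketch of a polling-place lookup follows; the endpoint and parameters follow Google’s published v2 API (which postdates the tools mentioned in this post), YOUR_API_KEY is a placeholder for a real key, and the requests library is assumed:

```python
# A hedged sketch of looking up polling places programmatically via
# Google's Civic Information API, one public interface to VIP data.
import requests

def polling_places(address, api_key):
    """Return polling locations for a street address, per the v2 API."""
    url = "https://www.googleapis.com/civicinfo/v2/voterinfo"
    params = {"address": address, "key": api_key}
    response = requests.get(url, params=params, timeout=30)
    response.raise_for_status()
    return response.json().get("pollingLocations", [])

if __name__ == "__main__":
    for place in polling_places("1600 Pennsylvania Ave NW, Washington, DC",
                                "YOUR_API_KEY"):
        print(place["address"])
```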
[View the story “A chat with @VotingInfo on voting, elections and tech” on Storify]