City of Los Angeles launches open data pilot and survey

Upon his election, I wondered whether Mayor-Elect Eric Garcetti would reboot Los Angeles city government for the 21st century. Four months in, there are several promising green shoots to report.

First, Mayor Garcetti vowed to modernize city government in the City of Angels, posting agency performance statistics online and reviewing all departments. Now, the City of Los Angeles has launched its own open data pilot project using ESRI’s ArcGIS platform.


For veterans of such efforts, a portal to mapping data may not be particularly exciting or useful, but it’s a start. Notably, the city has put up an online survey where people can request other kinds of city data and suggest changes or improvements to the pilot website.

Here are a few suggestions:

1) Study how the cities of Chicago and New York cleaned, published and used data, including how they assessed market demand for it.

2) Talk to the data desk at the Los Angeles Times. If you want your city’s performance data to be used by media for accountability and transparency, address their concerns.

3) Make a list of every single request for data made by journalists in Los Angeles under the California Public Records Act. Release that data and proactively publish the same types of data going forward.

4) If your goal is economic outcomes from open data, review all requests for city data from businesses and prioritize release of those data sets. Engage startups and venture capitalists who are consuming open data and ask about quality, formats and frequency of release.

5) Check out New York City’s gorgeous new open data site and the ongoing release of more data sets. Set those free, too.

6) Check out the new NYC.gov, Utah.gov and gov.uk in the United Kingdom for ideas, principles and models. Of note: the use of open standards, citizen-centricity, adaptive Web design, powerful search, focus on modern design aesthetic.

Good luck, LA!

Does privatizing government services require FOIA reform to sustain open government?

I read an editorial on “open government” in the United Kingdom by Nick Cohen today, in which he argues that Prime Minister David Cameron is taking “Britain from daylight into darkness.” Cohen connects privatization to the rise of corporate secrecy …

White House asks for feedback on second National Action Plan for Open Government

As the annual Open Government Partnership conference draws near, the White House would like the public to weigh in on building a more open government. The request for feedback parallels the one made two years ago, when the White House engaged civil society organizations regarding its open government efforts, and follows up on a July 3 post on open government on the White House blog.


Here are the questions that they’d like help answering:

  1. How can we better encourage and enable the public to participate in government and increase public integrity? For example, in the first National Action Plan, we required Federal enforcement agencies to make publicly available compliance information easily accessible, downloadable and searchable online – helping the public to hold the government and regulated entities accountable.
  • What other kinds of government information should be made more available to help inform decisions in your communities or in your lives?
  • How would you like to be able to interact with Federal agencies making decisions which impact where you live?
  • How can the Federal government better ensure broad feedback and public participation when considering a new policy?
  2. The American people must be able to trust that their Government is doing everything in its power to stop wasteful practices and earn a high return on every tax dollar that is spent. How can the government better manage public resources?
  • What suggestions do you have to help the government achieve savings while also improving the way that government operates?
  • What suggestions do you have to improve transparency in government spending?
  3. The American people deserve a Government that is responsive to their needs, makes information readily accessible, and leverages Federal resources to help foster innovation in both the public and private sectors. How can the government more effectively work in collaboration with the public to improve services?
  • What are your suggestions for ways the government can better serve you when you are seeking information or help in trying to receive benefits?
  • In the past few years, the government has promoted the use of “grand challenges,” ambitious yet achievable goals to solve problems of national priority, and incentive prizes, where the government identifies challenging problems and provides prizes and awards to the best solutions submitted by the public.  Are there areas of public services that you think could be especially benefited by a grand challenge or incentive prize?
  • What information or data could the government make more accessible to help you start or improve your business?

The White House is asking that feedback be sent to opengov@ostp.gov by September 23 and states that it will post a summary of submissions online in the future.

If you’re in the mood to weigh in, there just might be a few other pressing issues that deserve to be addressed in the plan, from compliance with the Freedom of Information Act to press freedom to surveillance and national security.

A note on email, public engagement and transparency

In a post regarding the White House’s call for input, Nextgov reporter Joseph Marks is skeptical about using email to solicit feedback, suggesting instead that the administration return to the approach of 2009, when the transition team asked the public at large to weigh in on open government.

“When seeking advice on open government, it seems natural to make that advice itself open and transparent,” writes Marks. “This could be done using a plain old comments section. Even better, the White House could have engaged the public with a crowdsourcing platform such as IdeaScale, which allows users to vote ideas up and down. That way the public could participate not just in offering ideas but in choosing which ones merit further consideration.”

People who have been following the thread around the drafting of the U.S. “national action plans” for open government know, however, that a similar call for feedback went out two years ago, when the White House asked for comments on the first version of the plan. At the time, I was similarly skeptical of using email as a mechanism for feedback.

Writing on Google+, however, open government researcher Tiago Peixoto posited some reasons to look at email in a different light:

My first reaction was similar to that of some other observers: e-mail consultations, in most cases, are not transparent (at least immediately) and do not foster any kind of collaboration/deliberation.

But this comes rather as a surprise. Even though Sunstein might have some reservations about deliberative models, he is a major scholar in the field of decision-making and – to put it in fashionable terms – solutions to tap the crowd’s expertise. In fact, judging from this, one might even expect that Sunstein would take the opportunity offered by the OGP to create some sort of “prediction market,” one of his favorite mechanisms to leverage the knowledge dispersed across the public. In this case, why would they solicit online feedback via e-mail?

Thinking of email as a practical, last-minute choice is a possible explanation. But in the spirit of open interpretation (nowadays everything needs to be preceded by the word “open”), I am thinking of an alternative scenario that may have led to the choice of e-mail as the channel to gather input from the public online:

A possible hypothesis is that Sunstein might have been confronted by something that is no news to federal government employees: they have a very limited number of tools that they are actually allowed to use in order to engage with the public online. Having a limited number of options is not a bad thing per se, provided the options available are good enough. In this sense, the problem is that most of the tools available (e.g. ranking, ideation) do not meet reasonable standards of good “choice architecture”, to use Sunstein’s terms. One might imagine that as Sunstein went through the different options available, he foresaw all the effects that could be generated by the tools and their design: reputational cascades, polarization, herding… In the end, the only remaining alternative, although unexciting, was e-mail. In this case at least, preferences are independently aggregated, and the risks of informational and social influence are mitigated.

Maybe the option of using e-mail to solicit inputs from the public was just a practical solution. But thinking twice, given the options out there, I guess I would have opted for e-mail myself.

From where I sit today, the White House might be better off trying a both/and strategy: solicit feedback via email, but also post the draft action plan to Github, just like the open data policy, and invite the public to comment on proposals and add new ones.

The lack of public engagement around the plan on the primary White House Twitter, Facebook and Google+ accounts, however, along with the rest of the administration’s social media channels, suggests that feedback on this plan may not be a top priority at the moment. To date, federal agencies are not using social media to ask for feedback either, including the Justice Department, which plays an important role in Freedom of Information Act policy and requests.

At least they’re using the @OpenGov and @WhiteHouseOSTP accounts.

 

U.S. House of Representatives publishes U.S. Code as open government data


Three years on, Republicans in Congress continue to follow through on promises to embrace innovation and transparency in the legislative process. Today, the United States House of Representatives made the United States Code available in bulk Extensible Markup Language (XML).

“Providing free and open access to the U.S. Code in XML is another win for open government,” said Speaker John Boehner and Majority Leader Eric Cantor, in a statement posted to Speaker.gov. “And we want to thank the Office of Law Revision Counsel for all of their work to make this project a reality. Whether it’s our ‘read the bill’ reforms, streaming debates and committee hearings live online, or providing unprecedented access to legislative data, we’re keeping our pledge to make Congress more transparent and accountable to the people we serve.”

House Democratic leaders praised the House of Representatives Office of the Law Revision Counsel (OLRC) for the release of the U.S. Code in XML, demonstrating strong bipartisan support for such measures.

“OLRC has taken an important step towards making our federal laws more open and transparent,” said Whip Steny H. Hoyer, in a statement.

“Congress has a duty to publish our collective body of enacted federal laws in the most modern and accessible way possible, which today means publishing our laws online in a structured, digital format. I congratulate the OLRC for completing this significant accomplishment. This is another accomplishment of the Legislative Branch Bulk Data Task Force. The Task Force was created in a bipartisan effort during last year’s budget process. I want to thank Reps. Mike Honda and Mike Quigley for their leadership in this area, and Speaker Boehner and Leader Cantor for making this task force bipartisan. I also want to commend the dedicated civil servants who are leading the effort from the non-partisan legislative branch agencies, like OLRC, who work diligently behind the scenes – too often without recognition – to keep Congress working and moving forward.”

The reaction from open government advocates was strongly positive.

“Today’s announcement is another milestone in the House of Representatives’ efforts to modernize how legislative information is made available to the American people,” said Daniel Schuman, policy director at Citizens for Responsibility and Ethics in Washington (CREW). “The release of the US Code in Bulk XML is the culmination of several years of work, and complements the House’s efforts to publish House floor and committee data online, in real time, and in machine readable formats. Still awaiting resolution – and the focus of the transparency community’s continuing efforts — is the bulk release of legislative status information.” (More from Schuman at the CREW blog.)

“I think they did an outstanding job,” commented Eric Mill, a developer at the Sunlight Foundation. “Historically, the U.S. Code has been extremely difficult to reliably and accurately use as data. These new XML files are sensibly designed, thoroughly documented, and easy to use.”

The data has already been ingested into open government websites.

“Just this morning, Josh Tauberer updated our public domain U.S. Code parser to make use of the new XML version of the US Code,” said Mill. “The XML version’s consistent design meant we could fix bugs and inaccuracies that will contribute directly to improving the quality of GovTrack’s and Sunlight’s work, and enables more new features going forward that weren’t possible before. The public will definitely benefit from the vastly more reliable understanding of our nation’s laws that today’s XML release enables.” (More from Tom Lee at the Sunlight Labs blog.)
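To make concrete what “easy to use as data” means here, below is a minimal, illustrative sketch of pulling section numbers and headings out of a USLM-style XML fragment with Python’s standard library. The element names, attributes, and namespace URI are my assumptions about the House’s USLM schema, not details taken from the release itself, and the fragment is invented for the example.

```python
# Illustrative sketch only: the USLM element names and namespace below
# are assumptions about the House's schema, not verified against it.
import xml.etree.ElementTree as ET

USLM_NS = {"uslm": "http://xml.house.gov/schemas/uslm/1.0"}

# A tiny invented fragment mimicking the structure of a USLM title file.
SAMPLE = """\
<title xmlns="http://xml.house.gov/schemas/uslm/1.0">
  <section>
    <num value="101">§ 101.</num>
    <heading>Definitions</heading>
  </section>
  <section>
    <num value="102">§ 102.</num>
    <heading>Scope</heading>
  </section>
</title>
"""

def list_sections(xml_text):
    """Return (section number, heading) pairs for each <section> element."""
    root = ET.fromstring(xml_text)
    pairs = []
    for sec in root.findall(".//uslm:section", USLM_NS):
        num = sec.find("uslm:num", USLM_NS)
        heading = sec.find("uslm:heading", USLM_NS)
        pairs.append((num.get("value"), heading.text))
    return pairs

print(list_sections(SAMPLE))
```

The point of the exercise is the one Mill makes above: once the Code is consistently structured, a few lines of standard tooling can walk it reliably, which is what makes parsers like GovTrack’s possible.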

Jim Harper, Director of Information Policy Studies at the Cato Institute, similarly applauded the release.

“This is great progress toward better public oversight of government,” he said. “Having the U.S. Code in XML can allow websites, apps, and information services to weave together richer stories about how the law applies and how Congress is thinking about changing it.”

Harper also contrasted the open government efforts of the Obama administration, which has focused more upon the release of open government data relevant to services, with that of the House of Representatives. While the executive and legislative branches are by definition apples and oranges, the comparison has value.

“Last year, we reported that House Republicans had the transparency edge on Senate Democrats and the Obama administration,” he said. “(House Democrats support the Republican leadership’s efforts.) The release of the U.S. Code in XML joins projects like docs.house.gov and beta.congress.gov in producing actual forward motion on transparency in Congress’s deliberations, management, and results.

“For over a year, I’ve been pointing out that there is no machine-readable federal government organization chart. Having one is elemental transparency, and there’s some chance that the Obama administration will materialize with the Federal Program Inventory. But we don’t know yet if agency and program identifiers will be published. The Obama administration could catch up or overtake House Republicans with a little effort in this area. Here’s hoping they do.”

This article has been updated with additional statements over time.

White House names “champions of change” for local open government and civic hacking

On June 27, the White House asked for nominations of civic hackers who have built tools that help meet the needs of the public. This morning, the Obama administration honored 15 Americans for their work on open government and “civic hacking.”

The event was part of the ongoing “Champions of Change” program, which honors at the White House the work Americans are doing in their communities. The stories serve as a reminder that for local governments to deliver smarter government, people must step up to lead, organize, collaborate, create and code those better outcomes. Technology itself is not enough. For more examples, look to last year’s champions of change for local government innovation.

One notable detail emerged from today’s event: Intel announced that it will fund some open government projects.

Politico’s Morning Tech reported that Intel will fund some of the projects created at the National Day of Civic Hacking to the tune of some $20,000 apiece. “It’s unclear how many they’ll support, but the chipmaker will pick projects that envision a more data-oriented society and use datasets from a diverse array of industries,” wrote Alex Byers. “They’ll announce the recipients over the next few weeks.” (If any readers hear of such grants, please let me know.)

Following is a small sample of tweets from or about the event, followed by a list of the men and women who were honored today, with biographies provided by the White House press office and links to their organizations and/or work. I’ll post video when it’s available.

Steve Spiker, Director of Research & Technology at the Urban Strategies Council, Moraga, CA

Steve Spiker (Spike) is the Director of Research & Technology at the Urban Strategies Council, a social change nonprofit supporting innovation and collaboration based in Oakland for almost 25 years. He leads the Council’s research, spatial analysis, civic innovation, open data, and technology efforts. Spike has research experience in community development, housing, criminology, spatial epidemiology and reentry issues. He loves data, visualization, GIS and strategic technology implementation, especially open source tech. Spike is the co-founder of OpenOakland, a Code for America Brigade and is helping guide government technology decisions and civic engagement in the East Bay. In 2012 Spike was chosen as one of the Next American City Vanguard class. He is an outspoken supporter of open data and open government and speaks across the USA about data driven decision making. He also campaigns to end human trafficking and runs Stealing Beauty Photography.

Travis Laurendine, Founder and CEO of LA Labs, New Orleans, LA

Travis Laurendine doesn’t fit in the typical bio box any more than his hair fits into the typical hat. As a serial entrepreneur, he has been on the cutting edge of both the web startup and entertainment industries for nearly 10 years. He launched his first web startup while an Economics major at Vanderbilt University, where he was also the first Vanderbilt student with a film selected for the Nashville Film Festival. When Hurricane Katrina struck his hometown of New Orleans, he stayed in the city and found himself wearing the hats of startup CEO, concert promoter, restaurant general manager, standup comic, film/video producer and director, MTV News journalist, band manager/agent, investor, hackathon organizer, Entrepreneur-in-Residence, and cultural ambassador. Recently, he founded Louisiana’s first hackathon organization, CODEMKRS, which is currently being transformed into Louisiana’s only modern code school. This summer he has organized hackathons for the giant music festivals JazzFest and Bonnaroo and is currently planning the first hackathon at San Francisco’s Outside Lands. His official job is founder and CEO of LA Labs, a startup laboratory focused on the marriage of entertainment and technology that uses New Orleans as the ultimate creative incubator. He is thankful for his loving family and friends and the daily inspiration he gets from the great city of New Orleans.

Scott Phillips, Co-Founder and CEO of Isocentric Networks, Tulsa, OK

Scott Phillips is the co-founder and CEO of Isocentric Networks, an advanced data center services company based in Tulsa, OK. He was previously the founder and CEO of a sensor technology company whose work included a project for NASA for use on a manned mission to Mars. Scott is also a founding board member of Fab Lab Tulsa, a 21st Century non-profit community center for innovation, entrepreneurship, and STEM education through a partnership with the Massachusetts Institute of Technology. Scott’s current passion lives at the nexus of entrepreneurship, the maker movement, and civic hacking, three transformative movements that he believes are democratizing how we live, work and play. According to Scott, it is easy to understand the impact of civic hacking on government when you view it in three steps; give citizens transparency, give citizens a voice, then give citizens ownership.

George Luc, Co-Founder and CEO of GivePulse, Austin, TX

George Luc is Co-Founder and CEO of GivePulse, a social network that matches people to causes and enables nonprofits, companies and affinities to manage volunteers, list events and track service hours in one central community. GivePulse launched earlier this year in 2013 and has since tracked over 100K service hours and mobilized over 5K volunteers in Austin alone. George has a BS and MS in Computer Science from Virginia Tech with an emphasis in Human Computer Interaction. He spent much of his early career developing technology for people with disabilities and has worked with companies like Daylert, IBM, ESO and HomeAway. He serves as a board member of City of Austin Volunteer & Service, Austin Convention Center and Visitor’s Bureau, KLRU, Open Door Preschool, and was a City Commissioner for Austin Mayor’s Committee for People with Disabilities.

Craig Michael Lie Njie, CEO of Kismet World Wide Consulting, Mountain View, CA

Mr. Lie Njie is CEO of Kismet World Wide Consulting, which he founded in 2002. Lie has over 20 years of professional experience and currently consults worldwide on a variety of topics, including privacy, security, technology design and development, education, entrepreneurship, management, sales and marketing, and mobile application development. Lie was given his name in honor of his three years of service (2005-2008) as a Peace Corps Volunteer in The Gambia, West Africa, where he designed, deployed, and taught the first two years of The Gambia’s first Bachelor’s in Computer Science program at the University of The Gambia (UTG). Today the program is still successful and sustainable. After returning from the Peace Corps, Lie recruited and managed a volunteer team to build and release the free WasteNot iOS app to help people worldwide share their good ideas for reducing environmental impact. He has also helped the United Nations as a technology consultant and researched and documented the privacy risks of health and fitness mobile apps.

Christopher Whitaker, Project Management Consultant at the Smart Chicago Collaborative, Chicago, IL

Christopher Whitaker is a project management consultant at the Smart Chicago Collaborative, utilizing his experience in government and community organizing to advance civic innovation in Chicago. Whitaker also serves as the Chicago Brigade Captain for Code for America, supporting civic hacking events and teaching a weekly Civic Hacking 101 class. He is a graduate of DePaul University (MPA) and Sam Houston State University (BA, Political Science). Previously, Whitaker served with the US Army in Iraq as a mechanized infantryman.

Jessica Klein, Co-Founder of Rockaway Help, Brooklyn, NY

Together with a group of journalists and residents, civic hacker and designer Jessica Klein co-founded “Rockaway Help” in the wake of Hurricane Sandy. Rockaway Help is committed to empowering the community to find solutions for emergency response, preparedness and rebuilding through hyperlocal open news and the development of innovative community-designed technologies. As part of the National Day of Civic Hacking, Jessica led workshops and hackathons for designers, engineers and Rockaway Beach, New York residents to identify problems and prototype design or technology solutions in the devastated coastal community. Jessica is currently the Creative Lead of the Mozilla Open Badges project, where she promotes openness and creativity in formal and informal learning environments and develops ways for learners to design their own unique narrative around their credentials. Jessica created the Hackasaurus project, the Web X-Ray Goggles and Thimble tools to help teens learn how to code through hacking. Over the last decade, she has worked at a variety of institutions dedicated to learning, including the Museum of Arts & Design, The Rubin Museum of Art, The Institute of Play, Startl, The Hive and Sesame Workshop. She also founded OceanLab NYC, a project which asked parents, teachers and kids in the NYC community to investigate their urban coastal environment through casual interaction and play.

Caitria O’Neill, Co-Founder of Recovers, San Francisco, CA

Caitria O’Neill is a co-founder of Recovers, a disaster preparedness and recovery technology company in San Francisco. After a tornado struck her hometown of Monson, MA, in 2011, Caitria and her sister Morgan worked within their community to connect survivors with local skills and donations. This kind of seat-of-the-pants organizing happens in every neighborhood, after every storm. The Recovers team has turned the best practices of many such efforts into a user-friendly tech toolkit for risk mitigation and community response. In less than two years they have helped hundreds of thousands of people find information, aid, and ways to pitch in. Caitria holds a BA in Government from Harvard University and FEMA NIMS/ICS certifications, and was named an Up-and-Coming CEO by Forbes Magazine. Her work has been featured by CNN Opinion, TED.com, and Bloomberg Businessweek.

Steven Clift, Founder of E-Democracy, Minneapolis, MN

Steven Clift is @democracy on Twitter. In 1994 he launched E-Democracy.org, the world’s first election information website. He built his “government by day, citizen by night” insights as leader of the State of Minnesota’s first e-government initiative, and for over a decade he spoke in 30 countries, from Estonia to Libya to Mongolia, on open government and civic participation, supporting non-partisan, volunteer-powered efforts for inclusive online local democracy. An Ashoka Fellow, he is today E-Democracy’s Executive Director, leading a dedicated team on the BeNeighbors.org effort to connect all neighbors online (and off) in public life across race and ethnicity, generations, immigrant and native-born, and more. He lives with his lovely wife and two children in Minneapolis, Minnesota.

Gerrie Schipske, Councilwoman on the Long Beach City Council, Long Beach, CA

Councilwoman Gerrie Schipske is currently serving her second term on the Long Beach City Council. She has championed open, transparent and accountable local government since she took office in 2006, when she became the first elected official in Long Beach to disclose her calendar and to communicate daily via blog, email, Facebook and Twitter. In January 2012, she took public education and transparency efforts one step further with her “Open Up Long Beach” initiative and website, which give residents increased access to the city’s everyday affairs and documents and include opportunities for residents to “go behind the scenes” of city operations. These efforts were lauded in California Forward’s report, The State of Transparency in California: 2013. Gerrie also brought transparency to the Medical Board of California, on which she serves, by initiating the requirement that members disclose at each meeting any contacts they have had with interested parties. Gerrie earned her JD from Pacific Coast University School of Law, her MA from George Washington University, her BA from the University of California, Irvine, and her RNP from the Harbor-UCLA Women’s Health Care Nurse Practitioner Program. She is the author of three books on the history of Long Beach, California.

Brad Lander, New York City Council Member, Brooklyn, NY

Brad Lander is a New York City Council Member representing Brooklyn’s 39th District, and a leader on issues of affordable housing, livable communities, the environment, and public education. Named one of “Today’s Social Justice Heroes” by The Nation magazine, Lander is co-chair of the Council’s Progressive Caucus and was one of the first councilmembers to bring “participatory budgeting” to his district, giving residents the power to decide which projects to support with their tax dollars. Prior to serving in the City Council, Brad directed the Pratt Center for Community Development and the Fifth Avenue Committee, a nationally-recognized community development organization.

Robert Davis, Co-Founder of RadSocial, Cooper City, FL

Robert Davis is a recent marketing graduate from Nova Southeastern University in Davie, FL. His day job consists of managing a social media consultancy for small to medium sized businesses, and at night one can find him at the local maker and hacker spaces around Fort Lauderdale and Miami. Robert is a Code for America intern alumni (’12) and an avid supporter of creating civic tools with open data for the public good. Along with fellow Floridian Cristina Solana, the two created the Florida Bill Tracker, forked from the MinnPost and redeployed to easily track controversial Florida legislation. Robert is also an avid traveler and surfer, and hopes to inspire others to change their world regardless of age or expertise.

Alderman Joe Moore, City of Chicago 49th Ward, Chicago, IL

Known as a pioneer for political reform, governmental transparency and democratic governance, Joe Moore represents Chicago’s 49th Ward, one of the nation’s most economically and racially diverse communities. Moore became the first elected official to bring “participatory budgeting” to the United States. Each year, Moore turns over $1 million of his discretionary capital budget to a process of democratic deliberation and decision-making in which his constituents decide through direct vote how to allocate his budget. Moore’s participatory budgeting model has since been adopted by four of his Chicago City Council colleagues, as well as city council members in New York City, San Francisco, and Vallejo, California.

Anita Brown-Graham, Director of the Institute for Emerging Issues at NC State University, Raleigh, NC

Anita Brown-Graham is Director of the Institute for Emerging Issues (IEI) at NC State University, a think-and-do tank focused on tackling big issues that affect North Carolina’s future growth and prosperity. From energy, to fiscal modernization, to improving our systems of higher education, IEI takes the lead in convening state leaders in business, higher education and government to address these issues in a comprehensive, long-term way to prepare the state for future challenges and opportunities. In her role at IEI, Anita led the development of the Emerging Issues Commons, a first-of-its-kind civic engagement tool – both a physical space and an online hub – that stands to transform how citizens across the state connect with each other, access information, and take action in the decades to come. Prior to her leadership at IEI, Anita spent 13 years on the faculty of the School of Government at UNC Chapel Hill, training distressed rural communities in strategic planning for revitalization. Her work inspired both rural and urban communities to work together for a better future for the state. Anita is a William C. Friday Fellow, American Marshall Fellow, and Eisenhower Fellow.

Deborah Parker, Tulalip Tribes Vice Chair, Tulalip, WA

Deborah Parker Tsi-Cy-Altsa (Tulalip/Yaqui) was elected to the Tulalip Tribes Board of Directors in 2012. As Vice-Chairwoman, Deborah brings to Tulalip leadership nearly two decades of experience as a policy analyst, program developer, communications specialist, and committed cultural advocate and volunteer in the tribal and surrounding communities. Serving as a Legislative Policy Analyst in the Office of Governmental Affairs for the Tulalip Tribes from 2005 to 2012, Deborah engaged in the legislative process on behalf of the Tulalip Tribes by providing quality analysis of the issues most pertinent to the exercise of sovereignty and tribal governance, with particular emphasis on education, finance, taxation, and healthcare. Before joining legislative affairs, Deborah developed two unique outreach and education programs for the Tulalip Tribes: Young Mothers, a culturally relevant program for teen mothers, and the Tribal Tobacco Program, which sought to encourage responsible tobacco use among tribal members while acknowledging tobacco’s sacred place in Indigenous cultures. Prior to her work for the Tulalip Tribes, Deborah served as Director of the Residential Healing School of the Tseil-Waututh Nation in Canada and in the Treaty Taskforce Office of the Lummi Nation, where she was mentored by American Indian leaders such as Joe Delacruz, Billy Frank, Henry Cagey and Jewell James. A passionate advocate for improved education for tribal members who believes in the inherent right of all Native Americans to expect and receive a quality education free from racial or cultural bias, Deborah is focused on educational reform, including developing curriculum that truly reflects Indigenous ethics and knowledge systems. Deborah remains committed to education by volunteering her time in the local schools where her children are enrolled.
Deborah graduated from the University of Washington with a Bachelor of Arts degree in American Ethnic Studies and Sociology where she distinguished herself as a scholar and a young Indigenous leader. Deborah lives in Tulalip with her husband Myron Dewey (Paiute/Shoshone) and their five children.

21st Century eDemocracy: eGovernment of, for, by and with the People

In the 1990s, the Internet changed communication and commerce forever. A decade later, a new social layer for the World Wide Web democratized the tools for online publishing. Citizens without specialized technical skills can now easily upload pictures, video, and text to a more interactive Web, where they can then use powerful new platforms to share, mix, and comment upon it all. In the years since the first social networks went online, the disruption presented by this dynamic online environment, fed by faster Internet connections and a global explosion of mobile users, has created shifts in the power structure as powerful as those brought about by the introduction of the printing press centuries ago.

With the Internet being hailed as the public arena for our time, governments around the world are waking up to a changed information environment in this new 21st century. Social-media platforms present new risks and rewards for government, but the fact is these platforms are hosting public discourse between hundreds of millions of citizens. In the context of these changes, public servants have begun using social media to share information and engage with citizens. Below, four digital pioneers share their insights, experiences, and hopes about the new opportunities social media offers for people to participate in their government.

These essays were originally published in the Association for Computing Machinery’s “Interactions” Magazine. They are republished here with permission.

Serving Citizens via Social Media

By Steve Ressler (@govloop), founder of GovLoop.com, former IT program manager and auditor at the U.S. Department of Homeland Security.

In 2012, social media is mainstream. Facebook is preparing a $100 billion IPO. President Obama is hosting a series of [social media] Town Halls. Even my grandmother is on Facebook. So what’s the role of social media in government? A few years ago, social media in government was brand new. It was exciting when a city launched a new Facebook page, a councilperson posted meetings on YouTube, or a state department launched a mobile app.

We’ve moved past the honeymoon phase, and now social media is being asked to deliver core mission value. For state and local governments, there are three foundational ways in which social media helps deliver value:

Reach more people. One of the core foundational roles of state and local government is to provide information for citizens. This is why for years government agencies have sent information via postal mail, printed agency newsletters, held in-person town hall meetings, and built telephone call centers. With more than 750 million users on Facebook, 200 million on Twitter, and the whole world tuning in to YouTube, social media is simply the largest channel that most people use these days to get information.

Get feedback. Another core role of government is to get feedback from citizens. Classic town halls simply do not work as well in today’s modern society, where everyone is busy and few people have the time to drive downtown at 5 p.m. on a Wednesday for a meeting. Social media is an interactive, two-way medium that acts as a great vehicle for real-time feedback.

Lower costs and increase revenue. In today’s tough budgetary times, cities and states simply cannot ignore opportunities to lower costs and increase revenue. Mobile applications like SeeClickFix let citizens photograph and report potholes and other city problems, instead of the city having to send out a truck to investigate every call-in complaint. Instead of spending tens of thousands of dollars on printing and mailing property-tax statements or city guides, city governments can save lots of money by sending the same information via email and social media. And that’s just the beginning.

I’ll be the first to admit that social media is not perfect. It is not a magic cure. Just because you add new social media channels does not mean you can remove other channels, like phone lines. Further, implementing social media well is a skill, and it takes time to see its impact. It matters, however, because the world has already changed. If government wants to remain relevant to citizens, it must evolve to meet the demands of the 21st century. The modern citizen is using social media; that is why Facebook has [845] million users and why iPhones and iPads have made Apple the second most valuable company in the world. Government must meet citizens where they are now or risk losing the opportunity to be more relevant to their lives.

Selective Use of Social Media in Government Projects

By Jeffrey Levy (@levy413), Director of Web Communications at the U.S. Environmental Protection Agency.

The use of social media runs the gamut, from agencies that are still considering it, to those who are using it mostly as a broadcast mechanism, to those like EPA that offer a mix of broadcast and community participation, to those who rely on social media for full-blown collaboration. Social media gives us good tools to enhance transparency, participation, and collaboration. But the trick is figuring out the most effective projects in which to use these tools.

Five years ago, there wasn’t even a single U.S. government blog. Today there is at least one U.S. agency using every type of social media I can think of. EPA itself is engaged in most channels, at least in broadcast mode and often in two-way discussion and the solicitation of community-created content (photos, videos, comments, etc.).

Social media works very well in conjunction with email and websites. At EPA, we use all channels to promote other channels, both by cross-linking and by embedding content from social media into Web pages. Some things we’re starting to think about are how to use two-dimensional barcodes (QR codes) well and how to develop mobile applications in general. One nice thing is that many social media sites already have mobile versions, so it is simple and useful to link to them from our mobile site.

We are active where the people are on the most popular social media platforms, so we have the chance to talk to, and respond to, people who may never come to our main website. We also have a much broader ability to share our information. In many cases, we hear ideas from people who otherwise would not contact us. For example, during the recent nuclear crisis in Fukushima, Japan, we were able to answer questions through Facebook to help alleviate concerns and provide solid information to new groups of people.

Our mission is best served when we work collaboratively with the public to protect their health and the environment. Photo and video projects engage people. For example, the “It’s My Environment” video project involved hundreds of people making short video clips, in which they took ownership of protecting our environment. By using social media channels to promote “Pick 5 for the Environment,” we challenge people to take other kinds of actions.

Social media can also help us catch environmental criminals simply by helping us advertise our fugitives list. The health warnings we issue can reach hundreds of thousands of people through Facebook, Twitter, and email. The recipients are people who asked to be kept in the loop, so they are a much more interested audience than the general public. Another key aspect of our mission is our use of online discussion forums, where we invite anyone to share their thoughts and opinions on policy issues.

My social media mantra is mission, tools, metrics, teach. It depends on the channel, but generally, we need better stats. For example, we have 42,000 followers on Twitter. But what’s the number of people who actually see a particular tweet? Facebook provides impressions, which is a more useful statistic. YouTube provides some good metrics too.

We also need inexpensive tools to help us monitor multiple channels. Each social media company is doing its own thing, and most are not focused on helping us cross channels. But multichannel management will become increasingly important as we grow more active in more channels.

Changing the Conversation Through Social Media

By Nick Schaper (@nickschaper), Executive Director of Digital Strategic Communications at the U.S. Chamber of Commerce, former Director of Digital Media for Speaker of the U.S. House of Representatives John Boehner.

Much has been made of American politicians’ sometimes transformative, sometimes awkward, and occasionally career-ending entrances into social media. Suffice it to say that many are on board and they’re not likely to exit social media. Your member of Congress wants you to like him or her, both at the ballot box and on Facebook. While the number of elected representatives integrating social media into their communications efforts has soared, this is still very much a new frontier in governance. Americans are getting a very rare opportunity to shape the direction of their government.

In the heady frontier days of the government’s adoption of social media (five to seven years ago), members of Congress moved from the stodgy “traditional media” strategy of drafting and sending out a press release to the cutting-edge “new media” strategy of drafting and sending out a press release and then posting a link to it via Twitter and Facebook. It was hardly splitting the atom, but it was moving in the right direction.

As the government social media ecosystem continues to evolve, we’re seeing more aggressively innovative efforts aimed at increasing participation, transparency, and accountability. Officials and their staff are identifying the unique abilities of popular platforms, such as Facebook and Twitter, and they’re adjusting their communications accordingly. In the past year alone, we’ve seen Republicans in the U.S. House of Representatives enlist Americans’ digital support in voting on which government programs to cut, resulting in their directly shaping the governing agenda of what would become the House’s new majority. Further down Pennsylvania Avenue, the Obama Administration’s digital team has led the nation’s first Twitter and Facebook town halls, among numerous other experiments in participatory and open government.

These efforts have helped to create a vast new virtual town square. Unfortunately, that square is still a noisy, unruly place. Like much of the Web, .gov is plagued by signal-to-noise issues, many of which are exacerbated by the unique rules and traditions of each branch. Members of Congress, for example, would prefer to communicate primarily (if not exclusively) with constituents who live in their districts. Users don’t generally list their home address in their Twitter bio, so should members be @replying to tweets when they can’t trace the origin?

Identity and bandwidth challenges will not be solved anytime soon, and certainly not in this space, but suffice it to say that your representatives are eagerly looking for new ways to communicate and legislate. Congressional staffs scour online communities for mentions of their bosses. Bloggers and other digital influentials have been given unprecedented access to politicians. When the president recently took questions live via Twitter, he found himself on the hot seat in his own White House when he faced questions on the lack of jobs and a flagging economy. All of this is testament to the fact that the tweets and status updates of citizens are echoing in the marble halls of our nation’s government.

The marriage of social media and government has made it through the honeymoon stage. To what degree that results in a more perfect union is yet to be seen. The potential for transformative change is there, and I’m confident it will be realized by this and many generations of social media patriots to come.

Reaching and Revealing New Heights Through Social Media

By Stephanie L. Schierholz (@schierholz), former Social Media Manager, National Aeronautics and Space Administration.

To understand how NASA uses social media to accomplish its mission, you must first understand the agency’s vision. Simply put, the space agency’s goal is to “reach for new heights and reveal the unknown so what we do and learn will benefit all humankind.” What NASA accomplishes and learns cannot benefit all humankind if people do not know about what we’re discovering. This is why the 1958 act that established the National Aeronautics and Space Administration also called for the agency to “provide for the widest practicable and appropriate dissemination of information concerning its activities and the results thereof.”

Making NASA accessible to the American people—and, really, to citizens around the world—has been ingrained in the agency’s operations since the early days. If you are old enough, you know this is true because you saw astronaut Neil Armstrong set foot on the moon via television signals from a NASA broadcast. Today you can watch NASA TV streaming online via your computer or mobile device.

The mandate to share what the agency is doing as widely as possible (and a restriction against advertising) keeps us on the lookout for new ways we can spread the word and be more accessible. Social media tools have enabled NASA to engage the public efficiently and effectively. Social media sites provide us an easy way to keep the public updated with news delivered straight into their personal newsfeed or homepage, which they probably visit more often than traditional news sites or the NASA website.

The agency has come quite a distance since the pioneers at NASA’s Jet Propulsion Laboratory started a Twitter account for the Phoenix Mars Lander program (@MarsPhoenix) in May 2008. NASA’s primary Twitter account (@NASA) has more than 1 million followers. We have more than 200 social media accounts agency-wide, including more than 20 astronauts on Twitter. You can find them all at www.nasa.gov/connect. Because of its interest in identifying new ways to connect, NASA was the first government agency to form partnerships with Gowalla, Facebook, and SlideShare. Why? Because each allows the agency to share our content with audiences who might never visit the main NASA website.

The real value of NASA’s use of social media tools can be seen in the level of engagement they attract and the communities that form around them. It is called social media because our fans and followers have a reasonable expectation their questions will be answered and their comments heard. By responding and interacting with them, NASA has the opportunity to educate, inform, and inspire. Fans and followers who are passionate about what we do have platforms to express this passion and share it with others.

NASA “tweetups” take it to the next level, bringing the online engagement to in-person gatherings where participants have an opportunity to talk to NASA leaders, scientists, engineers, and astronauts and the chance to see how and where we work. Participants have arrived at NASA tweetups as casual fans or followers and walked away as enthusiastic advocates of the work we are doing. A strong sense of community develops at these events, exemplifying how social media can bring together people who have common interests.

What’s next for NASA and social media? We’ll continue to keep our eyes open for platforms we can use to engage people and get the word out about what we’re doing. Meanwhile, the agency is working on improving our internal support for social media, focusing on processes, guidelines, and coordination. You can expect to see improvements to our Facebook page, a mobile check-in spot for our “Search for the Moon Rocks” partnership with Gowalla, a Foursquare mayor of the International Space Station, more of our presentations, videos, and documents on SlideShare, and more out-of-this-world content in the places you go to be social online.

New Horizons for eDemocracy

The insights and experiences shared above represent only a small sample of the variety of ways in which social media is transforming governments. While the examples are U.S.-centric, they do represent trends that are evolving in other countries. What we’ve left for a future discussion is how citizens around the world are using social media to disrupt traditional ways of governing. For instance, social media is credited with helping to accelerate social change in Tunisia, Egypt, and other parts of the Middle East. It has also been used in collaborative partnerships between government and citizens to respond to man-made crises or natural disasters.

The examples above, however, should provide a useful overview of some of the ways in which today’s participatory platforms are playing increasingly important roles in the evolution of government of, by, for, and now with the people.

What is the return on investment (ROI) of open government?

Putting a dollar value on clean water, stable markets, the quality of schooling or access to the judiciary is no easy task. Each of these elements of society, however, is to some extent related to and enabled by open government.

If we think about open government purely as the abstraction of transparency, the modern extension of fundamental democratic principles established centuries ago, its “business value” isn’t always immediately clear, at least with respect to investment or outcomes.

Transparency and accountability are core to how we think about government of, by and for the people, where a polity elects representative government. When budgets are constrained, however, city managers, mayors, controllers and chief information officers question the value of every single spending decision. (Or at least they should.)

It’s that context, of course, that’s driving good, hard questions about the business case for open government. Tim Berners-Lee, the inventor of the World Wide Web, said in 2011, at the launch of the Open Government Partnership in New York City, that increased transparency into a state’s finances and services directly relates to the willingness of businesses and other nations to invest in a country.

That’s the kind of thinking that has driven the World Bank to open up its data, to give people access to more information about where spending is happening and what those funds are spent upon. While transparency into government budgets varies immensely around the world, from frequently updated portals to paper records filed in local county offices, technology has given states new opportunities to be more accountable — or to be held accountable, by civic media and the emerging practice of data journalism.

The challenges of releasing spending data, however, are manifold, from quality assurance to the (limited) costs of publishing to making the data comprehensible to taxpayers through visualizations and calculators.

People in and outside of government are working to mitigate these issues, from using better visualization tools to adopting Web-based publishing platforms. The process of cleaning and preparing data for publication itself has returns for people inside government who need access to it. According to the McKinsey Global Institute, on average, government workers spend 19% of their days simply looking for information.

In other words, opening government information to citizens can also make it more available to government itself.

Organizing and establishing governance practices for data, even if some of it will never be published online, also has significant returns. Chicago chief data officer Brett Goldstein established probability curves for violent crime, explained John Tolva, the chief technology officer of the city of Chicago, when we talked in 2011. Since then, “we’re trying to do that elsewhere, uncovering cost savings, intervention points, and efficiencies,” he said.

“We have multiple phases for how we roll out data internally, starting with working with the business owner,” said Goldstein, in an interview. “We figure out how we’ll get it out of the transactional database. After that, we determine if it’s clean, if it’s verified, and if we can sign off on it technically.”

Tolva makes the business case for open data by identifying four areas that support investment, including an economic rationale.

  1. Trust
  2. Accountability of the work force
  3. Business building
  4. Urban analytics

After New York City moved to consolidate and clean its regulatory data, city officials were able to apply predictive data analytics to save lives and money. According to Mike Flowers, the chief analytics officer of NYC, the city achieved:

  • A five-fold return on the time of building inspectors looking for illegal apartments
  • An increase in the rate of detection for dangerous buildings that are highly likely to result in firefighter injury or death
  • The discovery of more than twice as many stores selling bootlegged cigarettes
  • A five-fold increase in the detection of business licenses being flipped

California’s recent budget woes coincided with unprecedented demand for government to be more efficient and effective online. The state connected citizens to e-services with social media. Both California’s unemployment office and its Department of Motor Vehicles were able to deliver better services online without additional cost.

“You can tweet @CA_EDD and get answers like how long until you get a check, where to go on the website or job fairs,” said Carolyn Lawson, the former deputy director for the technology services governance division in the eServices Office of California, in an interview. “I don’t think the creators of Twitter thought it would be a helpdesk for EDD.”

These kinds of efforts are far from the only places where there are clear returns for investments. A world away from big cities and states in the United States and urban data analytics, the World Bank found the ROI in open government through civic participation and mobile phones. Mobile participatory budgeting helped raise tax revenues in Congo, combining technology, public engagement and systems thinking to give citizens a voice in government.

“Beyond creating a more inclusive environment, the beauty of the project in South Kivu is that citizen participation translates into demonstrated and measurable results on mobilizing more public funds for services for the poor,” said Boris Weber, team leader for ICT4Gov at the World Bank Institute for Open Government, in an interview in Washington. “This makes a strong case when we ask ourselves where the return of investment of open government approaches is.”

This post originally appeared on LaserFiche.

Open data in beta: From Data.gov to alpha.data.gov to next.data.gov

Writing at the White House blog, deputy US CTO Nick Sinai and Presidential Innovation Fellow Ryan Panchadsaram explain what’s new behind the next iteration of the federal open government data platform.


The first incarnation of Data.gov and subsequent iterations haven’t excited the imagination of the nation. The next version, which employs open source technology like WordPress and CKAN, uses adaptive Web design and features improved search.

It also, critically, highlights how open data is fueling a new economy. If you read Slate, you already knew about how the future is shaping up, but this will provide more people with a reference. Great “next” step.

If you have opinions, questions or suggestions regarding the newest iteration, the Data.gov team is looking for feedback and Project Open Data is encouraging people to collaborate in the design process by creating pull requests on Github or commenting on Quora or Twitter.
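Since the new catalog runs on CKAN, its datasets are also discoverable programmatically through CKAN’s standard Action API. A minimal sketch in Python, assuming the public catalog lives at catalog.data.gov (the endpoint path follows CKAN’s documented conventions; the host and query values here are assumptions for illustration):

```python
# Sketch: building a dataset-search request against a CKAN catalog's
# Action API, the open source stack behind the next Data.gov.
from urllib.parse import urlencode

CKAN_HOST = "https://catalog.data.gov"  # assumed public catalog host

def package_search_url(query: str, rows: int = 5) -> str:
    """Build a CKAN Action API URL that searches datasets matching `query`."""
    params = urlencode({"q": query, "rows": rows})
    return f"{CKAN_HOST}/api/3/action/package_search?{params}"

print(package_search_url("health"))
```

Fetching that URL (with `urllib.request`, for example) returns a JSON envelope whose `result` object holds a `count` and a `results` list of dataset records, which is how third-party applications can build on the catalog without scraping it.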

Open by design: Why the way the new Healthcare.gov was built matters [UPDATED]

UPDATE: The refresh of Healthcare.gov in June went well. On October 1st, when the marketplace for health insurance went live at the site, millions of users flocked to the website and clicked “apply now.” For days, however, virtually none of them were able to create accounts, much less complete the rest of the process and enroll for insurance. By the end of the week, however, it was clear that the problems at Healthcare.gov were not just a function of high traffic but the result of the failure of software written by private contractors, with deeper issues that may extend beyond account creation into other areas of the site. On October 9th, as prospective enrollees continued to be frustrated by error-plagued websites around the country, I joined Washington Post TV to give a preliminary post-mortem on why the HealthCare.gov relaunch went so poorly.

The article that follows, which was extended and published at The Atlantic, describes the team and process that collaborated on the launch of the new site in June, not the officials or contractors that created the botched enterprise software application that went live on October 1st. In the Atlantic, I cautioned that “…the site is just one component of the insurance exchanges. Others may not be ready by the October deadline.”  The part of the site I lauded continues to work well, although the Github repository for it was taken offline. The rest has …not. I’ve taken some heat in the articles’ comments and elsewhere online for being so positive, in light of recent events, but the reporting holds up: using Jekyll is working. Both versions of the story, however, should have included a clearer caveat that the software behind the website had yet to go live — and that reports that the government was behind on testing Healthcare.gov security suggested other issues might be present at launch. If readers were misled by either article, I apologize. –Alex


Healthcare.gov already occupies an unusual place in history, as the first website to be demonstrated by a sitting President of the United States. In October, it will take on an even more important historic role, guiding millions of Americans through the process of choosing health insurance.

How a website is built or designed may seem mundane to many people in 2013, but when the site in question will guide millions of Americans through choosing health insurance, it matters. Yesterday, the United States Department of Health and Human Services (HHS) relaunched Healthcare.gov with a new look, feel and a cutting-edge underlying architecture that is beyond rare in federal government. The new site has been built in public for months, iteratively created by a team of designers and engineers using modern open source technologies. This site is the rarest of birds: a next-generation website that happens to be a .gov.


“It’s fast, built in static HTML, completely scalable and secure,” said Bryan Sivak, chief technology officer of HHS, in an interview. “It’s basically setting up a Web server. That’s the beauty of it.”

The people building the new Healthcare.gov are unusual: instead of an obscure subcontractor in a nameless office park in northern Virginia, it was built by a multidisciplinary team at HHS working with Development Seed, a scrappy startup in a garage in the District of Columbia that made its mark in the DC tech scene deploying Drupal, an open source content management system that has become popular in the federal government over the past several years.

“This is our ultimate dogfooding experience,” said Eric Gundersen, the co-founder of Development Seed. “We’re going to build it and then buy insurance through it.”

“The work that they’re doing is amazing,” said Sivak, “like how they organize their sprints and code. It’s incredible what can happen when you give a team of talented developers and managers room to work and let them go.”

What makes this ambitious experiment in social coding unusual is that the larger political and health care policy context that they’re working within is more fraught with tension and scrutiny than any other arena in the federal government. The implementation and outcomes of the Patient Protection and Affordable Care Act — AKA “Obamacare” — will affect millions of people, from the premiums they pay to the incentives for the health care they receive.

“The goal is [to] get people enrolled,” said Sivak. “A step to that goal is to build a health insurance marketplace. It is so much better to build it in a way that’s open, transparent and enables updates. This is better than a big block of proprietary code locked up in CMS.”


The new Healthcare.gov will fill a yawning gap in the technology infrastructure deployed to support the mammoth law, providing a federal choice engine for the more than thirty different states that did not develop their own health insurance exchanges. The new website, however modern, is just one component of the healthcare insurance exchanges. Others may not be ready by the October deadline. According to a recent report from the Government Accountability Office, the Department of Health and Human Services’ Centers for Medicare & Medicaid Services (CMS) is behind in implementing key aspects of the law, from training workers to help people navigate the process to certifying plans that will be sold on the exchanges to determining the eligibility of consumers for federal subsidies. HHS has expressed confidence to the GAO that exchanges will be open and functioning in every state on October 1.

On that day, Healthcare.gov will be the primary interface for Americans to learn about and shop for health insurance, as Dave Cole, a developer at Development Seed, wrote in a blog post this March. Cole, who served as a senior advisor to the United States chief information officer and deputy director of new media at the White House, was a key part of the team that moved WhiteHouse.gov to Drupal. As he explained, the code will be open in two important ways:

First, Bryan pledged, “everything we do will be published on GitHub,” meaning the entire code-base will be available for reuse. This is incredibly valuable because some states will set up their own state-based health insurance marketplaces. They can easily check out and build upon the work being done at the federal level. GitHub is the new standard for sharing and collaborating on all sorts of projects, from city geographic data and laws to home renovation projects and even wedding planning, as well as traditional software projects.

Moreover, all content will be available through a JSON API, for even simpler reusability. Other government or private sector websites will be able to use the API to embed content from healthcare.gov. As official content gets updated on healthcare.gov, the updates will reflect through the API on all other websites. The White House has taken the lead in defining clear best practices for web APIs.
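To illustrate what that reusability could look like, here is a hedged sketch: the payload below is an invented example of the kind of JSON document a content API might serve (the field names are illustrative, not healthcare.gov’s actual schema), and the function shows how another government or private sector site could embed it:

```python
import json

# Hypothetical content-API payload: a page mirrored as JSON so that
# other sites can embed it and pick up official updates automatically.
sample_payload = json.loads("""
{
  "title": "What is the Health Insurance Marketplace?",
  "url": "/what-is-the-health-insurance-marketplace/",
  "content": "<p>The Marketplace is a new way to find coverage...</p>"
}
""")

def embed_snippet(page: dict) -> str:
    """Render an embeddable HTML fragment from a content-API payload."""
    return f'<article><h2>{page["title"]}</h2>{page["content"]}</article>'

print(embed_snippet(sample_payload))
```

Because the embedding site re-renders the fragment on each request, changes to the official content propagate everywhere the API is consumed, which is the point of serving content as data rather than as fixed pages.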

Thinking differently about a .gov

According to Sivak, his team didn’t get directly involved in the new Healthcare.gov until November 2012. After that “we facilitated the right conversations around what to build and how to build it, emphasizing the consumer-facing aspects of it,” he said. “The other part was to figure out what the right infrastructure was going to be to build this thing.”

That decision is where this story gets interesting, if you’re interested in how government uses technology to deliver information to the people it serves. Government websites have not, historically, been sterling examples of design or usability. Unfortunately, in many cases, they’ve also been built at great expense, given the dependence of government agencies on contractors and systems integrators, and use technologies that are years behind the rest of the Web. Healthcare.gov could have gone in the same direction, but for the influence of its young chief technology officer, an “entrepreneur-in-residence” who had successfully navigated the bureaucracies of the District of Columbia and state of Maryland.

“Our first plan was to leverage Percussion, a commercial CMS that we’d been using for a long time,” said Sivak. “The problem I had with that plan was that it wasn’t going to be easy to update the code. The process was complicated. Simple changes to navigation were going to take a month.”

At that point, Sivak did what most people do in this new millennium when making a technology choice: he reached out to his social networks and went online.

“We started talking to people about a better way, including people who had just come off the Obama campaign,” he said. “I learned about the ground they had broken in the political space, from A/B testing to lightweight infrastructure, and started reading about where all that came from. We started thinking about Jekyll as a platform and using Prose.io.”

After Sivak and his team read about Development Seed’s work with Jekyll online, they contacted the startup directly. After a little convincing, Development Seed agreed to do one more big .gov project.

“A Presidential Innovation Fellow used the same tech we’re using for several of their projects,” said Cole. “Bryan heard about it and talked to us. He asked where we would go. We wanted to be on GitHub. We knew there were performance and reliability benefits from building the stack on HTML.”

Jekyll, for those who are unfamiliar with Web development trends, is a way for developers to build a static website from dynamic components. Instead of running a traditional website with a relational database and server-side code, using Jekyll enables programmers to create content like they create code. The end result of this approach is a site that loads faster for users, a crucial performance issue, particularly on mobile devices.
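The build model described above can be sketched in a few lines. Jekyll itself is written in Ruby; the Python below only illustrates the idea that a content file with front matter is rendered once, at build time, into a static HTML file that a web server can send as-is. The file format and function names are illustrative.

```python
# A content file in the Jekyll style: YAML-like front matter,
# then the page body, all in plain text under version control.
PAGE = "---\ntitle: Get Coverage\n---\nOpen enrollment starts October 1."

def build(source):
    # Split the front matter block from the body.
    _, front, body = source.split("---\n", 2)
    meta = dict(line.split(": ", 1) for line in front.splitlines())
    # Rendered once at build time; the result is a static file,
    # served with no database or application server per request.
    return "<html><h1>{}</h1><p>{}</p></html>".format(meta["title"], body.strip())

print(build(PAGE))
```

Because the expensive work happens at build time rather than on each request, the running site is just files on disk, which is where the performance and hosting savings described below come from.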

“Instead of farms of application servers to handle a massive load, you’re basically slimming down to two,” said Sivak. “You’re just using HTML5, CSS, and JavaScript, all being done in responsive design. The way it’s being built matters. You could, in theory, do the same with application servers and a CMS, but it would be much more complex. What we’re doing here is giving anyone with basic skills the ability to do basic changes on the fly. You don’t need expensive consultants.”

That adds up to cost savings. Sites that are heavily trafficked — as Healthcare.gov can reasonably be expected to be – normally have to use a caching layer to serve static content and add more server capacity as demand increases.

“When we worked with the World Bank, they chose a plan from Rackspace for 16 servers,” said Gundersen. “That added tens of thousands of dollars, with a huge hosting bill every month.”

HHS had similar strategic plans for the new site, at least at first.

“They were planning 32 servers, between staging, production and disaster recovery, with application servers for different environments,” said Cole. “You’re just talking about content. There just needs to be one server. We’re going to have two, with one for backup. That’s a reduction of 30 servers.”

While Jekyll eliminates the need for a full-blown content management system on the backend of Healthcare.gov (and with it, related costs), the people managing the site still need to be able to update it. That’s where Prose.io comes in. Prose.io is an open source content editor created by Development Seed that gives non-programmers a clean user interface to update pages.

“If you create content and run Jekyll, it requires content editors to know code,” said Cole. “Prose is the next piece. You can run it on your own servers or use a hosted version. It gives access to content in a CMS-like interface, basically adding a WYSIWYG skin, giving you a text editor in the browser.”

In addition to that standard “what you see is what you get” interface, familiar from WordPress or Microsoft Word, Prose.io offers a couple of bells and whistles, like mobile editing.

“You can basically preview live,” said Cole. “You usually don’t get a full in-browser preview. The difference is that you have that with no backend CMS. It’s just a directory and text files, with a Web interface that exposes it. There are no servers, no infrastructure, and no monthly costs. All you need is a free Web app and Github. If you don’t want to use that, use Git and Github Enterprise.” Update: Cole wrote more about launching Healthcare.gov on the DevelopmentSeed blog on Tuesday.

Putting open source to work

Performance and content management aside, there’s a deeper importance to how Healthcare.gov is being built that will remain relevant for years to come, perhaps even setting a new standard for federal government as a whole: updates to the code repository on Github can be adopted for every health insurance exchange using the infrastructure. (The only difference between different state sites is a skin with the state logo.)

“We have been working in the .gov space for a while,” said Gundersen. “Government people want to make the right decisions. What’s nice about what Bryan is doing is that he’s trying to make sure that everyone can learn from what HHS is doing, in real-time. From a process standpoint, what Bryan is doing is going to change how tech is built. FCC is watching the repository on Github. When agencies can collaborate around code, what will happen? The amount of money we have the opportunity to save agencies is huge.”

Collaboration and cascading updates aren’t an extra, in this context: they’re mission-critical. Sivak said that he expects the new site to be improved iteratively over time, in response to how people are actually using it. He’s a fan of the agile development methodology that has become core to startup development everywhere, including using analytics tools to track usage and design accordingly.

“We’re going to be collecting all kinds of data,” said Sivak. “We will be using tools like Optimizely to do A/B and multivariate testing, seeing what works on the fly and adapting from there. We’re trying to treat this like a consumer website. The goal of this is to get people enrolled in health care coverage and get insurance. It’s not simple. It’s a relatively complex process. We need to provide a lot of information to help people make decisions. The more this site can act in a consumer-friendly fashion, surfacing information, helping people in simple ways, tracking how people are using it and where they’re getting stuck, the more we can improve.”
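The A/B testing Sivak describes rests on one simple mechanic: each visitor is deterministically assigned to a variant, so the experience stays consistent across visits while the traffic splits evenly. The hash-based sketch below is a common way to do this; it is an illustration of the general technique, not Optimizely's actual implementation, and the names are assumptions.

```python
import hashlib

def ab_variant(user_id, experiment, buckets=("A", "B")):
    """Deterministically bucket a visitor for an A/B test.

    Hashing the experiment name with the visitor ID means the same
    visitor always sees the same variant, while IDs spread roughly
    evenly across buckets.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# The same visitor lands in the same bucket on every visit.
print(ab_variant("visitor-123", "enroll-button-copy"))
```

Conversion rates (here, completed enrollments) are then compared per bucket to decide which variant ships to everyone.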

Using Jekyll and Prose.io to build the new Healthcare.gov is only the latest chapter in government IT’s quiet open source evolution. Across the federal government, judicious adoption of open source is slowly but surely leading to leaner, more interoperable systems.

“The thing that Git is all about is social coding,” said Sivak, “leveraging the community to help build projects in a better way. It’s the embodiment of the open source movement, in many ways: it allows for truly democratic coding, sharing, modifications and updates in a nice interface that a lot of people use.”

Open by design

Sivak has high aspirations, hoping that publishing the code for Healthcare.gov will lead to a different kind of citizen engagement.

“I have this idea that when we release this code, there may be people out there who will help us to make improvements, maybe fork the repository, and suggest changes we can choose to add,” he said. “Instead of just internal consultants who help build this, we will suddenly have legions of developers.”

Not everything is innovative in the new Healthcare.gov, as Nick Judd reported at TechPresident in March: the procurement process behind the new site is complicated and the policy and administrative processes that undergird it aren’t finished yet, by any account.

The end result, however, is a small startup in a garage rebuilding one of the most important federal websites of the 21st century in a decidedly 21st century way: cheaper, faster and scalable, using open source tools and open standards.

“Open by design, open by default,” said Sivak. “That’s what we’re doing. It just makes a lot of sense. If you think about what should happen after this year, all of the states that didn’t implement their systems, would it make sense for them to have code to use as their own? Or add to it? Think about the amount of money and effort that would save.”

That’s a huge win for the American people. While the vast majority of visitors to Healthcare.gov this fall will never know or perhaps care about how the site was built or served, the delivery of better service at lowered cost to taxpayers is an outcome that matters to all.

12 lessons about social media, politics and networked journalism

In 2011, I was a visiting faculty member at the Poynter Institute, where I talked with a workshop full of journalists about working within a networked environment for news. As I put together my talk, I distilled the lessons I’d learned from covering tech and the open government initiative, lessons that affect the success of any audience relationship, and posted them on Google+. Following is an adapted and updated version of those insights. The Prezi from the presentation is online here.

1) We have to change our idea of “audience.”

People are no longer relegated to being the passive recipients of journalists’ work. They are often creators of content and have become important nodes for information themselves, sometimes becoming even more influential within their topical or regional communities than journalists are. That means we have to treat them differently. Yes, people are reading, watching or listening to the work of journalists, but they’re much more than an “audience.”

In the 21st century, the intersection of government, politics and media is increasingly a participatory, reciprocal and hypersocial experience due to the explosion in adoption of connected smartphones that turn citizens into publishers, broadcasters and human sensors – or censors, depending upon the context. More than half of American adults have a smartphone in 2013. The role of editors online now includes identifying and debunking misinformation, sifting truth from fiction, frequently in real-time. The best “social media editors” are creating works of journalism from a raw draft of history contributed by the reports of the many.

2) Good conversations involve talking and listening.

Communicating effectively in networked environments increasingly involves asking good questions that elicit quality responses — the more specific the question, the better the chance of a quality response. The Obama administration’s initial use of the Internet for open government in 2009, at Change.gov, did not ask highly structured questions, which led to a less effective public consultation.

3) The success of any conversation depends upon how well we listen.

Organizations that invite comments and then don’t respond to audience comments or questions send a clear message — “we’re not listening.” There are now many ways to listen and a proliferation of channels, going far beyond calls and email.

Comments have become distributed across the Internet and social Web. People are not just responding to those made on a given article or post: they’re on Twitter, Facebook and potentially other outposts. Find where people are talking about your beat, organization or region: that’s your community. Some organizations are using metrics to determine not only how often sentiments are expressed but the strength of that conviction and the expertise behind it.

4) No matter how good the conversation, its hosts must close the loop.

When the host of a conversation, be it someone from government, school, business or media, asks someone’s opinion, but doesn’t acknowledge it, much less act upon it, the audience loses trust.

If we seek audience expertise but don’t subsequently let it inform our work, the audience loses trust. Increasingly, to gain and hold that trust you must demonstrate the evidence behind your assertions by citation, with research tied via footnotes or hyperlinks, source code or supporting data.

It’s better not to ask than to ask and not act upon the answer. It’s similarly better not to engage in social media at all than to perpetuate the same old one-way communication streams with legacy broadcast behaviors. There are also new risks posed by the combination of ubiquitous connected mobile devices and the global reach of social media networks. To paraphrase Mark Twain, it is better to be thought a fool than to tweet and prove it.

5) You must know who your audience is and where, why, when and how they’re searching for information to engage them effectively.

TechTarget, one of my former employers, successfully segmented its traditional IT audience into niches that cared passionately about specific technologies and/or issues. The company then developed integrated media products around highly specific topical areas, a successful business model, albeit one with specialized applicability to the news business. Politico’s approach, which now includes live online video, paid subscriber content for “pros,” policy segmentation, email newsletters and events, is the most apt comparison in the political space, although there are many other trade publications that cater to niche audiences.

Here’s the key for both specific audiences: IT buyers have decision-making ability over thousands, if not millions, of dollars in budgets. Policy makers in DC have similar authority over appropriations, legislation or regulation.

Most general readers have neither budget authority nor policy clout and therefore will not sustain an effective business model. If you can create content that is of interest to people with buying power, then sponsors and advertisers will bite. The model, in other words, is not a panacea.

6) Your audience should be able to find and hear from YOU.

It matters whether the person whose name is on a social media account actually engages on it. For instance, President Obama doesn’t directly use social media, with a few notable exceptions; his White House and campaign staff do, at @WhiteHouse and @BarackObama. Some GOP candidates and incumbents actually maintain their own accounts, and setting the president aside, the GOP is ahead in both houses of Congress and has attracted huge followings.

Why does a personal account to complement the masthead matter? It stays with the reporter or editor from job to job. While many networks or papers have adopted naming conventions that immediately identify a journalist’s affiliation (@NameCBS or @NYT_Name) that practice does create a gray area in terms of who “owns” the account. @OctaviaNasrCNN was able to drop the CNN and keep her account. @CAmanpour was able to transfer from CNN to ABC. Even within networks, there is a lack of standardization: Compare @DavidGregory or @JakeTapper to @BetsyMTP.

7) People respond differently to personal accounts than mastheads.

Andy Carvin taught me about this dynamic years ago, and I’ve since seen it borne out in practice. He compared the results he’d get from asking questions on his personal account (@acarvin) with those from a primary NPR account (@NPRNews) and found that people responded to him more; they followed and viewed the news account more as a feed for information. Beth Noveck, then the White House deputy CTO behind @OpenGov, explored the same dynamic by creating a personal account, @BethNoveck, and found similar results. Incidentally, she was then able to keep that account after she left public service.

8) Better engagement with the audience requires the media to change established traditions and behaviors.

How many reporters still do not RT their competition’s stories, whether they beat them to a story or not? The best bloggers tend to be immense linkers and sharers. This is much like the decades-old question of whether a given newsroom’s website links to stories done by competitors or not. This behavior now has increasing consequences for algorithmic authority in both search engines (SEO) and social networks (SMO). If we aspire to hosting the conversation around an issue, do we now have a responsibility to point our audience at all the perspectives, data, sources and analysis that would contribute to an understanding of that issue? What happens if competitors or new media enterprises, like the Huffington Post, create an expectation for that behavior?

A good aspirational goal is to be a hub for a given beat, whether that beat is a campaign, statehouse, policy area or geography, which means linking, RT’ing or sharing relevant information in a source-agnostic manner.

9) Data-driven campaigns create more of a need for data-driven journalism.

Social media is important. In Election 2012, social, location, mobile and campaign data — and how we used it — proved to be an equally important factor. Nate Silver pulled immense audiences to his 538 blog at the New York Times. Online spreadsheets, visualizations, predictive models, sentiment analysis, and mobile and/or Web apps are all part of the new ‘data journalism’ lexicon, as well as an emerging ‘newsroom stack.’

Why? President Obama’s reelection campaign invested heavily in data collection, science and analysis for 2012. Others will follow in the years ahead. Republicans are investing in data but appear to be behind in terms of their capacity for data science. This may change in future cycles.

Government social media use continues to grow. More than 75% of Congress is using social media now. Freshman members of the House start their terms in office with a standard palette of platforms: Drupal for their websites, and Twitter, Facebook, Flickr and YouTube for constituent communications. By mid-2010, 22 of 24 federal agencies were on Facebook. This trend will only continue at the state and local level.

10) What are governments learning from their attempts?

They’re behind but learning. From applying broadcast models to adopting new platforms, tools for listening, archiving, campaigning vs governing, personal use versus staffers, linking or sharing behaviors, targeted consultations, constituent identity, privacy and security policies, states and cities are moving forward into the 21st century. Slowly.

11) Know your platforms, their utility, demographics and conventions.

Facebook is gigantic. You cannot ignore it if you’re looking for the places people congregate online. That said, if you’re covering politics and breaking news, Twitter remains the new wire for news. It’s still the backchannel for events. It’s not an ideal place to host conversations because of issues with threaded conversations, although third party tools and conventions have evolved that make regular discussions around #hashtags possible. Google+ is much better for hosting hangouts and discussions, as are modern blog comment platforms like Disqus. Facebook fits somewhere in between the two for conversation: you can’t upvote comments and it requires readers to have a Facebook account – but the audience is obviously immense.

12) Keep an eye out for what’s next and who’s there.

Journalists should be thinking about Google+ in terms of both their own ‘findability’ and that of their stories in search results. The same is true for Facebook and Bing integration. Watch stats from LinkedIn as a source or forum for social news. Reddit has evolved into a powerful platform for media and public figures to host conversations. StumbleUpon can send a lot of traffic to you.

The odds are good that there are influential blogs with many readers who are covering your beat. Know the most important ones and their writers, link to them, RT their work and comment upon them. More services will evolve, like communities around open data, regional hubs for communities themselves, games and hybrids of location-based networks. Have fun exploring them!