Open by design: Why the way the new Healthcare.gov was built matters [UPDATED]

UPDATE: The refresh of Healthcare.gov in June went well. On October 1st, when the marketplace for health insurance went live at the site, millions of users flocked to the website and clicked “apply now.” For days, however, virtually none of them were able to create accounts, much less complete the rest of the process and enroll for insurance. By the end of the week, it was clear that the problems at Healthcare.gov were not just a function of high traffic but the result of the failure of software written by private contractors, with deeper issues that may extend beyond account creation into other areas of the site. On October 9th, as prospective enrollees continued to be frustrated by error-plagued websites around the country, I joined Washington Post TV to give a preliminary post-mortem on why the HealthCare.gov relaunch went so poorly.

The article that follows, which was extended and published at The Atlantic, describes the team and process that collaborated on the launch of the new site in June, not the officials or contractors that created the botched enterprise software application that went live on October 1st. In The Atlantic, I cautioned that “…the site is just one component of the insurance exchanges. Others may not be ready by the October deadline.” The part of the site I lauded continues to work well, although the GitHub repository for it was taken offline. The rest has …not. I’ve taken some heat in the articles’ comments and elsewhere online for being so positive, in light of recent events, but the reporting holds up: using Jekyll is working. Both versions of the story, however, should have included a clearer caveat that the software behind the website had yet to go live — and that reports that the government was behind on testing Healthcare.gov security suggested other issues might be present at launch. If readers were misled by either article, I apologize. –Alex


Healthcare.gov already occupies an unusual place in history, as the first website to be demonstrated by a sitting President of the United States. In October, it will take on an even more important historic role, guiding millions of Americans through the process of choosing health insurance.

How a website is built or designed may seem mundane to many people in 2013, but when the site in question is charged with a function of such consequence, it matters. Yesterday, the United States Department of Health and Human Services (HHS) relaunched Healthcare.gov with a new look and feel and an underlying architecture that is beyond rare in the federal government. The new site has been built in public for months, iteratively created by a team of designers and engineers using cutting-edge open source technologies. This site is the rarest of birds: a next-generation website that happens to be a .gov.

[Image: the new Healthcare.gov homepage]

“It’s fast, built in static HTML, completely scalable and secure,” said Bryan Sivak, chief technology officer of HHS, in an interview. “It’s basically setting up a Web server. That’s the beauty of it.”

The people building the new Healthcare.gov are unusual: instead of an obscure subcontractor in a nameless office park in northern Virginia, the site was created by a multidisciplinary team at HHS working with Development Seed, a scrappy startup in a garage in the District of Columbia that made its mark in the DC tech scene deploying Drupal, an open source content management system that has become popular in the federal government over the past several years.

“This is our ultimate dogfooding experience,” said Eric Gundersen, the co-founder of Development Seed. “We’re going to build it and then buy insurance through it.”

“The work that they’re doing is amazing,” said Sivak, “like how they organize their sprints and code. It’s incredible what can happen when you give a team of talented developers and managers room to work and let them go.”

What makes this ambitious experiment in social coding unusual is that the larger political and health care policy context that they’re working within is more fraught with tension and scrutiny than any other arena in the federal government. The implementation and outcomes of the Patient Protection and Affordable Care Act — AKA “Obamacare” — will affect millions of people, from the premiums they pay to the incentives for the health care they receive.

“The goal is to get people enrolled,” said Sivak. “A step to that goal is to build a health insurance marketplace. It is so much better to build it in a way that’s open, transparent and enables updates. This is better than a big block of proprietary code locked up in CMS.”

[Image: Healthcare.gov health insurance marketplace graphic]

The new Healthcare.gov will fill a yawning gap in the technology infrastructure deployed to support the mammoth law, providing a federal choice engine for the more than thirty states that did not develop their own health insurance exchanges. The new website, however modern, is just one component of the health insurance exchanges. Others may not be ready by the October deadline. According to a recent report from the Government Accountability Office, the Department of Health and Human Services’ Centers for Medicare & Medicaid Services (CMS) is behind in implementing key aspects of the law, from training workers to help people navigate the process to certifying plans that will be sold on the exchanges to determining the eligibility of consumers for federal subsidies. HHS has expressed confidence to the GAO that exchanges will be open and functioning in every state on October 1.

On that day, Healthcare.gov will be the primary interface for Americans to learn about and shop for health insurance, as Dave Cole, a developer at Development Seed, wrote in a blog post this March. Cole, who served as a senior advisor to the United States chief information officer and deputy director of new media at the White House, was a key part of the team that moved WhiteHouse.gov to Drupal. As he explained, the code will be open in two important ways:

First, Bryan pledged, “everything we do will be published on GitHub,” meaning the entire code-base will be available for reuse. This is incredibly valuable because some states will set up their own state-based health insurance marketplaces. They can easily check out and build upon the work being done at the federal level. GitHub is the new standard for sharing and collaborating on all sorts of projects, from city geographic data and laws to home renovation projects and even wedding planning, as well as traditional software projects.

Moreover, all content will be available through a JSON API, for even simpler reusability. Other government or private sector websites will be able to use the API to embed content from healthcare.gov. As official content gets updated on healthcare.gov, the updates will reflect through the API on all other websites. The White House has taken the lead in defining clear best practices for web APIs.
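To make that reuse concrete, here is a minimal sketch of how a state exchange or another site might pull a piece of official content from such a JSON API and embed it in its own page. The endpoint path and the response fields below are hypothetical, for illustration only; they are not documented healthcare.gov endpoints.

```javascript
// Hypothetical example of embedding healthcare.gov content via its JSON API.
// The URL and the "title"/"content" fields are illustrative, not documented.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://www.healthcare.gov/api/marketplace-basics.json', true);
xhr.onload = function () {
  if (xhr.status === 200) {
    var article = JSON.parse(xhr.responseText);
    // Drop the official copy into the host page; when healthcare.gov updates
    // the content, the next page load picks up the change automatically.
    document.getElementById('official-content').innerHTML =
      '<h2>' + article.title + '</h2>' + article.content;
  }
};
xhr.send();
```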

Thinking differently about a .gov

According to Sivak, his team didn’t get directly involved in the new Healthcare.gov until November 2012. After that “we facilitated the right conversations around what to build and how to build it, emphasizing the consumer-facing aspects of it,” he said. “The other part was to figure out what the right infrastructure was going to be to build this thing.”

That decision is where this story gets interesting, if you’re interested in how government uses technology to deliver information to the people it serves. Government websites have not, historically, been sterling examples of design or usability. Unfortunately, in many cases, they’ve also been built at great expense, given the dependence of government agencies on contractors and systems integrators, and use technologies that are years behind the rest of the Web. Healthcare.gov could have gone in the same direction, but for the influence of its young chief technology officer, an “entrepreneur-in-residence” who had successfully navigated the bureaucracies of the District of Columbia and state of Maryland.

“Our first plan was to leverage Percussion, a commercial CMS that we’d been using for a long time,” said Sivak. “The problem I had with that plan was that it wasn’t going to be easy to update the code. The process was complicated. Simple changes to navigation were going to take a month.”

At that point, Sivak did what most people do in this new millennium when making a technology choice: he reached out to his social networks and went online.

“We started talking to people about a better way, including people who had just come off the Obama campaign,” he said. “I learned about the ground they had broken in the political space, from A/B testing to lightweight infrastructure, and started reading about where all that came from. We started thinking about Jekyll as a platform and using Prose.io.”

After Sivak and his team read about Development Seed’s work with Jekyll online, they contacted the startup directly. After a little convincing, Development Seed agreed to do one more big .gov project.

“A Presidential Innovation Fellow used the same tech we’re using for several of their projects,” said Cole. “Bryan heard about it and talked to us. He asked where we would go. We wanted to be on GitHub. We knew there were performance and reliability benefits from building the stack on HTML.”

Jekyll, for those who are unfamiliar with Web development trends, is a way for developers to build a static website from dynamic components. Instead of running a traditional website with a relational database and server-side code, using Jekyll enables programmers to create content like they create code. The end result of this approach is a site that loads faster for users, a crucial performance issue, particularly on mobile devices.
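For a sense of what that looks like in practice, a Jekyll site is just a folder of text files: each page is Markdown with a small block of YAML “front matter” at the top, and the generator compiles everything into plain HTML. The page below is a made-up example, not a file from the Healthcare.gov repository.

```markdown
---
layout: page
title: "What is the Health Insurance Marketplace?"
---

The Marketplace is a new way to find coverage that fits your budget
and meets your needs. Open enrollment starts October 1, 2013.
```

Running `jekyll build` turns a directory of files like this into a static `_site` folder that any web server, or a content delivery network, can serve as-is.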

“Instead of farms of application servers to handle a massive load, you’re basically slimming down to two,” said Sivak. “You’re just using HTML5, CSS, and Javascript, all being done in responsive design. The way it’s being built matters. You could, in theory, do the same with application servers and a CMS, but it would be much more complex. What we’re doing here is giving anyone with basic skills the ability to do basic changes on the fly. You don’t need expensive consultants.”

That adds up to cost savings. Sites that are heavily trafficked — as Healthcare.gov can reasonably be expected to be – normally have to use a caching layer to serve static content and add more server capacity as demand increases.

“When we worked with the World Bank, they chose a plan from Rackspace for 16 servers,” said Gundersen. “That added tens of thousands of dollars, with a huge hosting bill every month.”

HHS had similar strategic plans for the new site, at least at first.

“They were planning 32 servers, between staging, production and disaster recovery, with application servers for different environments,” said Cole. “You’re just talking about content. There just needs to be one server. We’re going to have two, with one for backup. That’s a reduction of 30 servers.”

While Jekyll eliminates the need for a full-blown content management system on the backend of Healthcare.gov (and with it, related costs), the people managing the site still need to be able to update it. That’s where Prose.io comes in. Prose.io is an open source content editor created by Development Seed that gives non-programmers a clean user interface to update pages.

“If you create content and run Jekyll, it requires content editors to know code,” said Cole. “Prose is the next piece. You can run it on your own servers or use a hosted version. It gives access to content in a CMS-like interface, basically adding a WYSIWYG skin, giving you a text editor in the browser.”

In addition to that standard “what you see is what you get” interface, familiar from WordPress or Microsoft Word, Prose.io offers a couple of bells and whistles, like mobile editing.

“You can basically preview live,” said Cole. “You usually don’t get a full in-browser preview. The difference is that you have that with no backend CMS. It’s just a directory and text files, with a Web interface that exposes it. There are no servers, no infrastructure, and no monthly costs. All you need is a free Web app and Github. If you don’t want to use that, use Git and Github Enterprise.” Update: Cole wrote more about launching Healthcare.gov on the DevelopmentSeed blog on Tuesday.

Putting open source to work

Performance and content management aside, there’s a deeper importance to how Healthcare.gov is being built that will remain relevant for years to come, perhaps even setting a new standard for federal government as a whole: updates to the code repository on Github can be adopted for every health insurance exchange using the infrastructure. (The only difference between different state sites is a skin with the state logo.)
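In Git terms, that kind of reuse is a fork-and-pull workflow. The commands below are a rough sketch of how a state team might track the federal repository and pull in its updates; the repository URLs are made up for illustration, not the actual ones.

```bash
# Hypothetical state-exchange workflow; repository URLs are illustrative only.
git clone https://github.com/example-state/exchange-site.git
cd exchange-site
git remote add upstream https://github.com/example-hhs/healthcare-gov.git
git fetch upstream
git merge upstream/master   # pick up upstream fixes and content changes
# ...reapply the state's skin and logo, rebuild the static site, redeploy
```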

“We have been working in the .gov space for a while,” said Gundersen. “Government people want to make the right decisions. What’s nice about what Bryan is doing is that he’s trying to make sure that everyone can learn from what HHS is doing, in real-time. From a process standpoint, what Bryan is doing is going to change how tech is built. FCC is watching the repository on Github. When agencies can collaborate around code, what will happen? The amount of money we have the opportunity to save agencies is huge.”

Collaboration and cascading updates aren’t an extra, in this context: they’re mission-critical. Sivak said that he expects the new site to be improved iteratively over time, in response to how people are actually using it. He’s a fan of the agile development methodology that has become core to startup development everywhere, including using analytics tools to track usage and design accordingly.

“We’re going to be collecting all kinds of data,” said Sivak. “We will be using tools like Optimizely to do A/B and multivariate testing, seeing what works on the fly and adapting from there. We’re trying to treat this like a consumer website. The goal of this is to get people enrolled in health care coverage and get insurance. It’s not simple. It’s a relatively complex process. We need to provide a lot of information to help people make decisions. The more this site can act in a consumer-friendly fashion, surfacing information, helping people in simple ways, tracking how people are using it and where they’re getting stuck, the more we can improve.”

Using Jekyll and Prose.io to build the new Healthcare.gov is only the latest chapter in government IT’s quiet open source evolution. Across the federal government, judicious adoption of open source is slowly but surely leading to leaner, more interoperable systems.

“The thing that Git is all about is social coding,” said Sivak, “leveraging the community to help build projects in a better way. It’s the embodiment of the open source movement, in many ways: it allows for truly democratic coding, sharing, modifications and updates in a nice interface that a lot of people use.”

Open by design

Sivak has high aspirations, hoping that publishing the code for Healthcare.gov will lead to a different kind of citizen engagement.

“I have this idea that when we release this code, there may be people out there who will help us to make improvements, maybe fork the repository, and suggest changes we can choose to add,” he said. “Instead of just internal consultants who help build this, we will suddenly have legions of developers.”

Not everything is innovative in the new Healthcare.gov, as Nick Judd reported at TechPresident in March: the procurement process behind the new site is complicated and the policy and administrative processes that undergird it aren’t finished yet, by any account.

The end result, however, is a small startup in a garage rebuilding one of the most important federal websites of the 21st century in a decidedly 21st century way: cheaper, faster and scalable, using open source tools and open standards.

“Open by design, open by default,” said Sivak. “That’s what we’re doing. It just makes a lot of sense. If you think about what should happen after this year, all of the states that didn’t implement their systems, would it make sense for them to have code to use as their own? Or add to it? Think about the amount of money and effort that would save.”

That’s a huge win for the American people. While the vast majority of visitors to Healthcare.gov this fall will never know or perhaps care about how the site was built or served, the delivery of better service at lowered cost to taxpayers is an outcome that matters to all.

R U up? Haz $ 4 Uncle Sam? [USGAO to host online chat about taxes & Bitcoin]

More signs that it’s 2013 and we’re all into the 21st century: Tomorrow at 2 PM ET, the United States Government Accountability Office will answer online questions about a decentralized electronic currency during a livestreamed event. 

Yes, the USGAO is talking to the Internet about Bitcoin.

And yes, the agency tweeted about it.

Using social media to convene and amplify a discussion about a difficult, timely topic is a terrific use of the medium and the historic moment. Here’s hoping that USGAO officials get good questions and give frank, clear answers.

You can read more about Bitcoin and taxes at National Journal and follow along at #AskGAOLive on Twitter tomorrow.

9 suggested follows for @HillaryClinton on Twitter

Hillary Clinton has joined Twitter. The former First Lady of the United States, U.S. Senator and Secretary of State joined the conversation with aplomb and humor, thanking the authors of a tumblr blog, “Txts from Hillary,” for inspiration and adopting the now iconic image of her aboard a military transport plane as her avatar. Her first — and to this point, only — tweet had been retweeted more than 6,200 times in five hours.

So far, Clinton is only following all-things-Clinton: former President Bill Clinton, their daughter, Chelsea Clinton, and the Clinton Foundation.

The Washington Post suggested 15 accounts for Clinton to follow, ranging from serious (the @VP’s office) to satiric (@AnthonyWeiner & @JustinBeiber). With the exception of the @StateDept, the list is heavily focused upon U.S. domestic politics and the 2016 presidential election, a prospect that many DC media outlets seemingly began speculating about a few seconds after Mitt Romney walked away from his concession speech in Boston early on the morning of November 7th.

While the list is light-hearted, it’s also unnecessarily constrained in scope and perspective. Clinton spent four years traveling the earth, speaking to world leaders. Why not continue to keep that global reach on a platform that has, well, global reach?

While she could adopt the social graphs of Beltway pundits and media, primarily following other DC media and politicians, this new account is an opportunity to do, well, a bit better. Former staffers Jared Cohen, Alec J. Ross, Ronan Farrow, Katie Dowd and Katie Stanton may be of assistance (and useful follows for her). Here, in no particular order, are 9 other accounts that would vastly improve future #TweetsFromHillary.

1) The White House

People interested in governance and Twitter tend to follow the @WhiteHouse. (Those who wish to be elected to it might benefit as well.) Under Macon Phillips, the White House director of digital, the White House account has taken some risks to become a platform for the President’s policies — and, often, to amplify the voices of the Americans who support them. A safe following strategy would be to choose from the accounts the White House follows.

2) Anne-Marie Slaughter

A former State Department official who returned to Princeton, Slaughter is already well-known to Clinton from her tenure at State. Her focus on foreign policy, women’s issues and international affairs is a valuable addition to any feed.

3) Emily Bell

The director of the Tow Center for Digital Journalism at Columbia Journalism School is one of the sharpest observers of, and commentators on, how technology is changing media.

4) Nick Kristof

The New York Times columnist, who calls himself a “print dinosaur, trying to evolve into a new media maven,” has adapted to social media better than any of the other writers on The Grey Lady’s opinion page, from Facebook to Google+ to Twitter. Kristof cuts through the noise, sharing news that matters, and listens to his global networks of connections far better than most.

5) danah boyd

People new to Twitter may find following at least one “social media expert” useful, for tips, nuance and criticism. There’s no one more deserving of that description than digital ethnographer danah boyd, though she’d never claim the title. (Be mindful that she may take a Twitter vacation this summer.)

6) Mark Knoller

If you don’t follow CBS News White House correspondent Mark Knoller, you’re missing a real-time history of the presidency.

7) Bill Gates

One of the world’s smartest men will (help) make you smarter if you follow him.

8) Blake Hounshell

The (former) managing editor at Foreign Policy has one of the best pulses on global events and what they mean on Twitter. He puts world news and events in context, or at least as much as one can in 140 characters. (While you’re at it, Secretary Clinton, set up a list to follow Andy Carvin (@acarvin) too. He tweets a lot but you’ll likely find that many of your former staffers follow him for good reason.)

9) Maria Popova

Everyone has a “desert island follow” or two. For many people, that might be Popova, who has a remarkable talent for finding and sharing interesting literature, art, science and more.

These are, naturally, just a starting point. In 2013, there are literally thousands of government officials, policy wonks, journalists and politicians whom Clinton might find valuable to follow. (Who knows? Maybe she’ll even follow and learn from P.J. Crowley.)

There are 66 verified world leaders on Twitter. While most don’t tweet themselves, Estonian president Toomas Hendrik Ilves does personally, sometimes with an edge.

The easiest method may be for her to follow Twitter’s list.

If Clinton wants to make the most of the platform, she’ll do well to personally unfollow some feeds, find new voices, listen to her @replies and act like a human.

12 lessons about social media, politics and networked journalism

In 2011, I was a visiting faculty member at the Poynter Institute, where I talked with a workshop full of journalists about working within a networked environment for news. As I put together my talk, I distilled the lessons I’d learned from covering technology and the open government initiative that bear on the success of any audience relationship, and posted them to Google+. Following is an adapted and updated version of those insights. The Prezi from the presentation is online here.

1) We have to change our idea of “audience.”

People are no longer relegated to being the passive recipients of journalists’ work. They are often creators of content and have become important nodes for information themselves, sometimes becoming even more influential within their topical or regional communities than journalists are. That means we have to treat them differently. Yes, people are reading, watching or listening to the work of journalists, but they’re much more than an “audience.”

In the 21st century, the intersection of government, politics and media is increasingly a participatory, reciprocal and hypersocial experience due to the explosion in adoption of connected smartphones that turn citizens into publishers, broadcasters and human sensors – or censors, depending upon the context. More than half of American adults have a smartphone in 2013. The role of editors online now includes identifying and debunking misinformation, sifting truth from fiction, frequently in real-time. The best “social media editors” are creating works of journalism from a raw draft of history contributed by the reports of the many.

2) Good conversations involve talking and listening.

Communicating effectively in networked environments increasingly involves asking good questions that elicit quality responses — the more specific the question, the better the chance for a quality response. The Obama administration’s initial use of the Internet for its open government initiative in 2009, at Change.gov, did not ask highly structured questions, which led to a less effective public consultation.

3) The success of any conversation depends upon how well we listen.

Organizations that invite comments and then don’t respond to audience comments or questions send a clear message — “we’re not listening.” There are now many ways to listen and a proliferation of channels, going far beyond phone calls and email.

Comments have become distributed across the Internet and the social Web. People are not just leaving them on a given article or post: they’re on Twitter, Facebook and potentially other outposts. Find where people are talking about your beat, organization or region: that’s your community. Some organizations are using metrics to determine not only how often sentiments are expressed but the strength of that conviction and the expertise behind it.

4) No matter how good the conversation, its hosts must close the loop.

When the host of a conversation, be it someone from government, school, business or media, asks someone’s opinion, but doesn’t acknowledge it, much less act upon it, the audience loses trust.

If we seek audience expertise but don’t subsequently let it inform our work, the audience loses trust. Increasingly, to gain and hold that trust you must demonstrate the evidence behind your assertions by citation, with research tied via footnotes or hyperlinks, source code or supporting data.

It’s better not to ask than to ask and not act upon the answer. It’s similarly better not to engage in social media at all than to perpetuate the same old one-way communication streams with legacy broadcast behaviors. There are also new risks posed by the combination of ubiquitous connected mobile devices and the global reach of social media networks. To paraphrase Mark Twain, it is better to be thought a fool than to tweet and prove it.

5) You must know who your audience is and where, why, when and how they’re searching for information to engage them effectively.

TechTarget, one of my former employers, successfully segmented its traditional IT audience into niches that cared passionately about specific technologies and/or issues. The company then developed integrated media products around highly specific topical areas, a successful business model, albeit one that has specialized applicability to the news business. Politico’s approach, which now includes live online video, paid subscriber content for “pros,” policy segmentation, email newsletters and events, is the most apt comparison in the political space, although there are many other trade publications that cater to niche audiences.

Here’s the key for both specific audiences: IT buyers have decision-making ability over thousands, if not millions, of dollars in budgets. Policy makers in DC have similar authority over appropriations, legislation or regulation.

Most general readers have neither budget authority nor policy clout and therefore will not sustain an effective business model. If you can create content that is of interest to people with buying power, then sponsors/advertisers will bite. The model, in other words, is not a panacea.

6) Your audience should be able to find and hear from YOU.

It matters whether the person whose name is on a social media account actually engages in it. For instance, President Obama doesn’t directly use social media, with a few notable exceptions. His White House and campaign staff do, at @WhiteHouse and @BarackObama. Some GOP candidates and incumbents actually maintain their own accounts; setting aside the president, the GOP is ahead on social media in both houses of Congress, and its members have attracted huge followings.

Why does a personal account to complement the masthead matter? It stays with the reporter or editor from job to job. While many networks or papers have adopted naming conventions that immediately identify a journalist’s affiliation (@NameCBS or @NYT_Name) that practice does create a gray area in terms of who “owns” the account. @OctaviaNasrCNN was able to drop the CNN and keep her account. @CAmanpour was able to transfer from CNN to ABC. Even within networks, there is a lack of standardization: Compare @DavidGregory or @JakeTapper to @BetsyMTP.

7) People respond differently to personal accounts than mastheads.

Andy Carvin taught me about this dynamic years ago, and I’ve since seen it borne out in practice. He compared the results he’d get from asking questions on his personal account (@acarvin) to a primary NPR account (@NPRNews) and found that people responded to him more; they followed and viewed the news account more as a feed for information. Beth Noveck, the White House deputy CTO behind the @OpenGov account, explored the same dynamic by creating her own account, @BethNoveck, and found similar results. Incidentally, she was then able to keep that account after she left public service.

8) Better engagement with the audience requires the media to change established traditions and behaviors.

How many reporters still do not RT their competition’s stories, whether they beat them to a story or not? The best bloggers tend to be immense linkers and sharers. This is much like the decades-old question of whether a given newsroom’s website links to stories done by competitors or not. This behavior now has increasing consequences for algorithmic authority in both search engines (SEO) and social networks (SMO). If we aspire to hosting the conversation around an issue, do we now have a responsibility to point our audience at all the perspectives, data, sources and analysis that would contribute to an understanding of that issue? What happens if competitors or new media enterprises, like the Huffington Post, create an expectation for that behavior?

A good aspirational goal is to be a hub for a given beat, whether that beat is a campaign, a statehouse, a policy area or a geography, which means linking, RT’ing or sharing relevant information in a source-agnostic manner.

9) Data-driven campaigns create more of a need for data-driven journalism.

Social media is important. In Election 2012, social, location, mobile and campaign data — and how we use it — proved to be an equally important factor. Nate Silver pulled immense audiences to his 538 blog at the New York Times. Online spreadsheets, visualizations, predictive models, sentiment analysis, and mobile and/or Web apps are all part of the new ‘data journalism’ lexicon, as well as an emerging ‘newsroom stack.’

Why? President Obama’s reelection campaign invested heavily in data collection, science and analysis for 2012. Others will follow in the years ahead. Republicans are investing in data but appear to be behind in terms of their capacity for data science. This may change in future cycles.

Government social media use continues to grow. More than 75% of Congress is using social media now. Freshman members of the House start their terms in office with a standard palette of platforms: Drupal for their websites, and Twitter, Facebook, Flickr and YouTube for constituent communications. By mid-2010, 22 of 24 federal agencies were on Facebook. This trend will only continue at the state and local level.

10) What are governments learning from their attempts?

They’re behind but learning. From applying broadcast models to adopting new platforms, tools for listening, archiving, campaigning vs governing, personal use versus staffers, linking or sharing behaviors, targeted consultations, constituent identity, privacy and security policies, states and cities are moving forward into the 21st century. Slowly.

11) Know your platforms, their utility, demographics and conventions.

Facebook is gigantic. You cannot ignore it if you’re looking for the places people congregate online. That said, if you’re covering politics and breaking news, Twitter remains the new wire for news. It’s still the backchannel for events. It’s not an ideal place to host conversations because of issues with threaded conversations, although third party tools and conventions have evolved that make regular discussions around #hashtags possible. Google+ is much better for hosting hangouts and discussions, as are modern blog comment platforms like Disqus. Facebook fits somewhere in between the two for conversation: you can’t upvote comments and it requires readers to have a Facebook account – but the audience is obviously immense.

12) Keep an eye out for what’s next and who’s there.

Journalists should be thinking about Google+ in terms of both their own ‘findability’ and that of their stories in search results. The same is true for Facebook and Bing integration. Watch stats from LinkedIn as a source or forum for social news. Reddit has evolved into a powerful platform for media and public figures to host conversations. StumbleUpon can send a lot of traffic your way.

The odds are good that there are influential blogs with many readers who are covering your beat. Know the most important ones and their writers, link to them, RT their work and comment upon them. More services will evolve, like communities around open data, regional hubs for communities themselves, games and hybrids of location-based networks. Have fun exploring them!

United Kingdom looks to put 50 million health records online and increase patient data rights

This Monday, Jeremy Hunt, Member of Parliament and the United Kingdom’s Secretary of State for Health, delivered a keynote address at the fourth annual Health Datapalooza in Washington, DC. In a rhetorical turn that would be anathema for any national conservative politician on this side of the Atlantic, Hunt commended the United States for taking steps towards providing universal health insurance to its people.

Hunt outlined three major elements in a strategy to improve health care in the UK: 1) applying data more effectively, 2) improving transactional capabilities and 3) putting patients in the “driver’s seat” of their own health care. He pointed to several initiatives that support that strategy, from extending electronic health records to 50 million people to sequencing the genomes of 100,000 people and developing telemedicine capabilities for 3 million patients. Given the focus of the datapalooza, however, perhaps his most interesting statements came with respect to personal data ownership.

After the keynote, I interviewed Secretary Hunt and Tim Kelsey, the first national director for patients and information in the National Health Service. Our discussion, lightly edited, follows.

What substantive steps has the UK taken to actually put health data in the hands of patients?

Hunt: Basically, I have given an instruction that everyone should be able to access their own health record online before the next general election, which means that I will be accountable for delivering that promise. There’s no wiggle room for me. That’s a big change, and it’s also a big change for the system because, basically, it means that every hospital and every general practitioner has to get used to the idea that the data they write about patients will be able to be accessed by patients. It’s a small but very significant first step.

There’s sometimes a disconnect between what politicians direct and what systems actually do. What’s happening with the UK’s long-delayed EHR system?

Hunt: I’ve given a pretty accountable timeframe for this: May 2015. I’ll be facing a general election campaign then. If we don’t deliver, then my head’s going to be on the block. I think it is a valid question, because of course once you set these objectives, then you start to look underneath it. One of the questions that we have to ask ourselves is how many have actually used this. We want everyone to be able to use this, but in practice, if the way they use it is they’re going to have to go into their GP, they’ve got to sign a consent form, there’s some complex procedure, then actually it’s not going to change people’s lives. The next question is about take-up, and that’s what we’re exploring at the moment.

Are there any aspects of the U.S. healthcare system that you think might be worth adopting and bringing back to the U.K.? Or vice versa?

Well, it’s quite interesting. We just had a really good meeting with [US CTO] Todd Park. I don’t think the differences are so great. I mean, on one level, yes, hospitals here are private or charitable, so they can’t be mandated by the government to do anything. And yet, they’ve succeeded in getting 80% of them to adopt EHRs through setting a standard and a certain amount of financial incentive. We can tell our hospitals to do things, but actually, as you said earlier, that’s not the same as them actually doing it.

I think in the end, in both countries, what you have to do is make it so that it’s in the hospitals’ own interest. In our case, the way that we’re doing that is trying to demonstrate that sensibly embracing the technology agenda has a massive effect on reducing mortality rates and improving clinical outcomes. By publishing all of the data about those outcomes, we’re creating competition between hospitals. That, I hope, will drive this agenda.

At the same time, we need to change public awareness. This is the big challenge – this sense that you can actually be in charge of your own health is just, surprisingly, absent in large numbers of people. There’s a very strong sense that lots of people have that “health is something that’s done to me” by NHS.

The U.S. has released data on the disparities in pricing for hospital procedures and comparisons of hospital quality — but patients here still need to go to places that take their health insurance. In the NHS, is that as much of an issue?

Hunt: That’s a really good question to ask because, in the U.K, for virtually any procedure, you have the right to have it done in any hospital in the country — and yet, very few people avail themselves of that right. So, by publishing surgical survival rates, we’re hoping to create pressure, where people actually say “I’m going to have this heart operation, and I’m not going to go to my local hospital, I’m going to go to this one a bit farther away that has higher success rates.” At the moment, people don’t actually do that; they tend to go where they’re recommended to. That’s where this information revolution can take hold.

What is the most unexpected thing that has happened since the U.K. began releasing more open data about health?

Tim Kelsey: I don’t know if this is unexpected or not, but the most startling thing is that we’ve moved from having one of the worst heart surgery survival rates in Europe to being the best. Heart surgery is the only speciality where we’ve published comparative data by heart surgeons across the whole country.

Do you think that’s an accident?

Tim Kelsey: No, I don’t think it’s an accident at all. Within that data, if you look at what has actually happened, the assumption of the geniuses who actually pioneered the program was that the gap between the best surgeons and the worst surgeons would narrow, because the weaker surgeons would raise their game. That didn’t happen. What happened was that the best surgeons got even better, and the underperforming surgeons also raised their game. The truth is that they want to be the winner, and open data has had a massive impact in driving outcomes and standards.

What are the most important principles or substantive steps that you’re applying at the NHS to mitigate risks or harms from privacy breaches?

Hunt: We have to carry the public with us. We have a very strong free press, as you do, and we’re very proud of that. If they believe that people’s data is going to be used to infringe their privacy, then public confidence in the huge revolution that the datapalooza is all about will be shaken, and that will have a massive impact. I think that there’s a very simple way that you maintain public confidence, which is by making it absolutely clear that you own that data. If you don’t want that data to be used, even in an anonymized form, you can say, “I’m not going to share my data.” I think once you do that, you create a discipline in the system to make sure that the anonymization of data is credible, because people can withdraw their consent if they don’t believe it.

Also, you put people in the driver’s seat, because I think people’s motives are different. You and I, as young and hopefully healthful individuals, we’re thinking about privacy. If somebody’s got terrible cancer, he’s actually thinking, ‘well, I would really like my data to be used for the benefit of humanity.’ They’re actually very, very happy to have their data shared. They have a different set of concerns.

I don’t think you’ll have any trouble, for example, getting 100,000 people to consent to have their genome sequenced. These will be people who have cancer, and once you have cancer, you think, ‘what can I do to help future generations conquer cancer?’ The mentality changes. We have to maintain people’s confidence.

I think the best analogy, though, is banking. Perhaps the second thing people care about most after their health is their money, and the banks have been able to maintain people’s confidence. They’re actually doing banking online, so that you can access your bank account from any PC, anywhere in the world. It’s something you can do with confidence. They’ve done that because they’ve thought through the procedures.

In the U.S., you’re entitled to access a free copy of your credit report once a year. Consumers, however, still don’t have access to their own data across much of the private sector. Will the British government support “rights to data for its citizens?”

Hunt: We are hoping to preempt the worry about that by instructing the NHS that everyone has a right of veto over the use of their own data. You own your own medical record. If you don’t want that shared, then that’s your decision, and you’re able to do that. If we didn’t do that, I think the courts might make us do that.

Kelsey: Just to clarify that point: The Data Protection Act, which is effectively a European piece of legislation, says that people have the right to object to data being shared, in any context, private sector or health or otherwise, or to opt out. We’ve gone further: because of the priority we’re giving to patients’ rights, the patient is treated as the de facto owner of the data, which is different from the American situation so far.

We’re setting a global standard here, which will be an interesting experiment for the rest of the world to watch, that people will have the right to say “I don’t want my data shared” — and people will respect that. Now, at the moment that is not a legal right; it is a de facto right that will be expected. It may well be that we’ll need to simply write down a law that this is an individual’s data and that rights flow from that. At the moment, there’s no law that gives an individual patient the right to their own data nor to opt out of its sharing.

White House moves to bash patent trolls, though Congress still must enact trollbane

This morning, President Obama moved to curb suits from “patent trolls,” entities that many observers of the technology industry have warned are increasingly harming innovation across the United States. As it turned out, those concerned parties have been right to decry the trend: according to a report (PDF), the number of lawsuits brought by patent trolls has nearly tripled in the past two years and now accounts for 62% of all patent lawsuits in America. As Edward Wyatt pointed out in the New York Times, this surge in patent lawsuits is directly related to the passage of a 2011 law that was designed to address the trouble.

The White House announced several executive actions today to take on patent trolls, including a series of workshops, scholarship opportunities, a consumer-facing website and a review of exclusion orders. The administration will also begin a rulemaking process at the U.S. Patent Office that would “require patent applicants and owners to regularly update ownership information when they are involved in proceedings before the PTO, specifically designating the ‘ultimate parent entity’ in control of the patent or application.”

One interesting additional outcome of the day’s news is that White House Google+ Hangouts matter. Entrepreneur Limor Fried’s unexpected question to President Obama on patent trolls during a White House Hangout in February 2013 led to a frank answer and contributed to the White House’s action today, a connection made directly by the @WhiteHouse Twitter account. Here’s what the president said, back in February:

A couple of years ago we began the process of patent reform. We actually passed some legislation that made progress on some of these issues, but it hasn’t captured all the problems. And the folks that you’re talking about are a classic example. They don’t actually produce anything themselves, they’re just trying to essentially leverage and hijack somebody else’s idea and see if they can extort some money out of them. And, you know, sometimes these things are challenging, because we also want to make sure that the patents are long enough that, you know, people’s intellectual property is protected. We’ve got to balance that with making sure that they’re not so long that innovation is reduced. And, but I do think that our efforts at patent reform only went about halfway to where we need to go. And what we need to do is pull together, you know, additional stakeholders, and see if we can build some additional consensus on some smarter patent laws. This is true, by the way, across the board when it comes to high tech issues. The technology’s changing so fast. We want to protect privacy, we want to protect people’s civil liberties, we want to make sure the Internet stays open. And I’m an ardent believer that what’s powerful about the Internet is its openness and the capacity for people to get out there and just introduce a new idea with low barriers to entry.

I hope President Obama does more Google+ Hangouts and is asked more tough questions regarding drones, patents and other issues on the minds of the People, far outside of the DC media bubble.

Hangouts aside, as Greg Ferenstein pointed out at TechCrunch, the administration is going to need Congress to effectively curb these abuses: the president can’t simply declare an end to this mess.

Five relevant bills have been introduced recently, as Michelle Quinn noted at Politico and Joe Mullin emphasized at Ars Technica, and while the legislative reforms suggested by the White House could make a real difference in curbing the worst of patent troll abuses, it’s not at all clear what this Congress is capable of passing through both chambers at this point.

Timothy Lee, newly ensconced at Wonkblog at the Washington Post, isn’t convinced that such legislation, even if passed, will effectively smash patent trolls. Lee would like to see the federal government fix a broken patent system. Unfortunately for that aspiration, Washington recently passed the America Invents Act and is now moving forward on implementation. It’s not at all clear how soon substantial reform will end up on a president’s desk again.

[Animated GIF credit: White House Tumblr. Oh yes, there will be GIFs. ]

Proposed fracking rule from Interior Department needs more liquid data

A proposed rule on hydraulic fracturing (“fracking”) from the United States Department of the Interior’s Bureau of Land Management is now online. As of May 24, the comment period has begun, although the American Petroleum Institute is pressuring Interior to slow down the fracking rule.

You can read the proposed rule and comment at Regulations.gov, which was relaunched last year with an eye on public participation in rulemaking.

While the closely watched regulation has drawn qualified praise from the oil and gas industry, it includes a notable flaw with respect to how technology is used to oversee fracking. The Center for Effective Government is arguing that the BLM fracking rule violates the recent White House executive order on open data:

…instead of establishing a modern example of government information collection and sharing, BLM’s proposed rule would allow drilling companies to report the chemicals used in fracking to a third-party, industry-funded website, called FracFocus.org, which does not provide data in machine-readable formats. FracFocus.org only allows users to download PDF files of reports on fracked wells. Because PDF files are not machine-readable, the site makes it very difficult for the public to use and analyze data on wells and chemicals that the government requires companies to collect and make available.

Although FracFocus.org has recently improved some of its search features, the oil and gas industry opposes making chemical data easier to download or evaluate for fear that the public “might misinterpret it or use it for political purposes.” (subscription required) Citizens need to have adequate, accurate information about the chemicals they may be exposed to in order to evaluate the potential risks and rewards of allowing fracking in their communities.

“It is particularly disappointing that the first new information proposal since the open data executive order completely ignores the new requirements,” said Sean Moulton, Director of Open Government Policy at the Center for Effective Government. “This proposal doesn’t just fail to comply with the new open data policy, it represents a step in the wrong direction since it abdicates control of and access to the data to an industry website.”

Data formats aside, every person in a state where fracking is taking place should care about how it will be regulated, including how information about which chemicals are used in the process is disclosed.

This is an opportunity to play the role of an informed, engaged citizen that goes beyond a periodic visit to the ballot box every two years.

If you don’t and dislike the outcome, you may be left asking “why wasn’t I consulted?”

If you feel strongly, one way or the other, about fracking or federal oversight of industry, should it be approved and come to your state, there is literally no time better than now to weigh in.

UPDATE: Kyle Smith, writing for the Sunlight Foundation, reports that the comment period on the fracking rule has been extended until the end of the summer:

Under pressure from the oil industry, Interior Department Secretary Sally Jewell has extended the comment period on a controversial final “fracking” regulation by 60 days, promising two more months of maneuvering over a rule that, in its earlier incarnations, drew more than 177,000 public comments. The bulk of those appeared to be the product of letter-writing campaigns by environmental groups, according to analysis of comments on Sunlight’s Docket Wrench and conversations with agency officials.