On May 10, the Select Committee on the Modernization of Congress in the United States House of Representatives held a hearing on “opening up the process,” at which four experts talked with Congress about making legislative information more transparent, from ongoing efforts to proposed reforms to the effect of sunshine laws passed decades ago.
If you’re not up to speed on this committee, it was established on January 4, 2019, when the House voted in favor of creating it by an overwhelming margin (418-12), adopting Title II of H.Res.6, the Rules of the House of Representatives for the One Hundred Sixteenth Congress. The committee has the sole authority to investigate, study, hold public hearings, make findings, and develop recommendations to modernize Congress – but no legislative jurisdiction nor ability to take legislative action. It has been fairly described by Issue One as “the best opportunity in decades” for Congress to improve itself by looking inward.
The thread of tweets above, however, is not meant to be comprehensive, nor could it be fully contextualized in the moment. For that, watch the hearing on YouTube, in the video embedded below:
He and Tauberer recommended appointing a legislative branch Chief Data Officer, releasing structured data that would feed into the clerk’s tool to show how proposed amendments would change bills, and enacting a mandate and open standards for a unique identifier for lobbyists across the U.S. government. Whether those ideas make it into the committee’s recommendations remains to be seen, but they’re worth weighing – along with further study of the value or risk of increasing or decreasing public access to various aspects of the deliberative processes that constitute legislative and oversight activities.
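One of those ideas is easy to make concrete: once bill text is released as structured data, showing how an amendment would change a bill largely reduces to generating a redline between two versions. Here is a minimal sketch in Python using the standard library’s difflib; the bill text is invented for illustration, and a real tool would work from the Clerk’s structured bill data rather than plain strings.

```python
import difflib

# Hypothetical before/after text of a bill section; in practice this
# would come from structured bill data published by the Clerk.
current = """The Secretary shall submit a report
to Congress every 2 years.""".splitlines()

amended = """The Secretary shall submit a report
to Congress every year and publish it online.""".splitlines()

# unified_diff yields redline-style output: lines removed from the
# current text are prefixed with '-', lines added by the amendment with '+'.
for line in difflib.unified_diff(current, amended,
                                 fromfile="bill-as-introduced",
                                 tofile="bill-as-amended",
                                 lineterm=""):
    print(line)
```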
On a meta note, the process on display at this forum was notable for comity between witnesses and members, openness to the public and press, and engagement online.
While paying attention to the digital component has become downright mundane in 2019, the Committee demonstrated admirable competence, streaming the hearing online at YouTube, publicizing it on social media prior to the event, engaging with the public during the hearing, and publishing testimony on its website afterwards. (Unfortunately, there’s no text transcription of the hearing on the hearing page. Given the committee’s acknowledgement of the importance of accessibility, it should make sure to get transcripts online.)
As at the most recent “Congressional hackathons,” Members of Congress were able to demonstrate good government, ethics and transparency, far away from partisan rancor over white-hot political issues or Congressional attempts to conduct constitutional oversight of a corrupt administration.
If you’re interested in following the activities of the committee or providing feedback, visit Modernizecongress.House.gov and scrub in: the People’s House will only be as open, accessible, accountable, and effective as we make it – or demand it to be. There’s no one else coming to help.
United States chief technology officer Todd Park will be moving to California at the end of August, just in time to take his kids to the first day of school. He’ll be shifting from his current position in the Office of Science and Technology Policy to a new role in the White House, recruiting technologists to join public service. The move was first reported by Fortune Magazine and then Reuters, among other outlets. Update: On August 28th, the White House confirmed that Park would continue serving in the administration in a new role in a blog post on WhiteHouse.gov.
“From launching the Presidential Innovation Fellows program, to opening up troves of government data to the public, to helping spearhead the successful turnaround of HealthCare.gov, Todd has been, and will continue to be, a key member of my Administration,” said President Barack Obama, in a statement. “I thank Todd for his service as my Chief Technology Officer, and look forward to his continuing to help us deploy the best people and ideas from the tech community in service of the American people.”
“I’m deeply grateful for Todd’s tireless efforts as U.S. Chief Technology Officer to improve the way government works and to generate better outcomes for the American people,” added White House Office of Science and Technology Policy Director and Assistant to the President John Holdren. “We will miss him at the Office of Science and Technology Policy, but we’re fortunate Todd will continue to apply his considerable talents to the Obama Administration’s ongoing efforts to bring the country’s best technologists into the Federal Government.”
It will be interesting to see how Park approaches recruiting the nation’s technologists to serve in the new U.S. Digital Service and federal agencies in the coming months.
“It continues to be the greatest honor of my life to serve the President and the country that I love so very much,” stated Park, in the blog post. “I look forward to doing everything I can in my new role to help bring more and more of the best talent and best ideas from Silicon Valley and across the nation into government.”
Park wants to move government IT into the open source, cloud-based, rapid-iteration environment that is second nature to the crowd considering his pitch tonight. The president has given reformers like him leave, he told them, “to blow everything the fuck up and make it radically better.” This means taking on big-pocketed federal contractors, risk-averse bureaucrats, and politicians who may rail at overruns but thrive on contributions from those benefiting from the waste. It also will require streamlined regulations from both the executive and legislative branches. But instead of picking fights, Park wants to win by showing potential foes the undeniable superiority of a modern approach. He needs these coders to make it happen, to form what he calls a Star Wars-style Rebel Alliance, a network of digital special forces teams. He can’t lure them with stock options, but he does offer a compelling opportunity: a chance to serve their country and improve the lives of millions of their fellow citizens.
“We’re looking for the best people on the planet,” he said. “We have a window of opportunity—right the fuck now—within this government, under this president, to make a huge difference.
“Drop everything,” he told them, “and help the United States of America!”
Who will be the new CTO?
The next US CTO will have big shoes to fill: Park has played key roles advising the president on policy, opening up government data and guiding the Presidential Innovation Fellows program and, when the president asked, rescuing Healthcare.gov, the federal online marketplace for health insurance. While it’s not clear who will replace Park yet, sources have confirmed to me that there will be another U.S. CTO in this administration. What isn’t clear is what role he (or she) might play, a question that Nancy Scola explored at The Switch for the Washington Post this week:
There’s a growing shift away from the idea, implicit in Obama’s pledge to create the U.S. CTO post back in 2007, that one person could alone do much of the work of fixing how the United States government thinks about IT. Call it the “great man” or “great woman” theory of civic innovation, perhaps, and it’s on the way out. The new U.S. Digital Service, the pod of technologists called 18F housed at the General Services Administration, the White House’s Presidential Innovation Fellows, even Park’s new outreach role in Silicon Valley — all are premised on the idea that the U.S. needs to recruit, identify, organize, and deploy simply more smart people who get technology.
The choice of a third US CTO will also be an example of the Obama administration’s commitment to a more diverse approach to recruiting White House tech staffers in the second term. The two men who have held the office were both sons of immigrants: Aneesh Chopra is of Indian descent, and Park of Korean. As Colby Hochmuth reported for Federal Computer Week, the White House Office of Science and Technology Policy achieved near-gender parity under Park.
If, as reported by Bloomberg News, Google X VP Megan Smith were to be chosen as the new US CTO, her inclusion as an openly gay woman, the first to hold the post, and the application of her considerable technological acumen to working on the nation’s toughest challenges would be an important part of Park’s legacy.
Update: On September 4th, the White House confirmed that Smith would be the next US CTO and former Twitter general counsel Alex Macgillivray would be a deputy US CTO.
[PHOTO CREDIT: Pete Souza]
This post has been updated with additional links, statements and analysis.
Has the Internet shown up to comment on the Federal Communications Commission’s rulemaking around net neutrality, as I wondered when the Open Internet proceeding began? Well, yes and no. According to FCC press secretary Kim Hart, the FCC has received some 677,000 total public comments on net neutrality submitted before tomorrow’s deadline.
@digiphile @pd_w @FCC Update: FCC has received approximately 677,000 comments on #NetNeutrality so far. Includes comments to docket + email
As Wall Street Journal reporter Gautham Nagesh tweeted, the FCC’s action on media deregulation a decade ago received the most public comments of any of the agency’s rulemakings to date, with two million or so comments.
What this total number means in practice, however, is that network neutrality advocates have failed to stimulate public interest or engagement with this issue, despite “warnings about the FCC’s fast lane” in the New York Times. While that is in part because net neutrality is to many people a “topic that generally begets narcolepsy,” to use David Carr’s phrase, it may also be because cable, broadcast and radio news haven’t covered the issue, much less shown the email address or offered a short URL for people to officially comment. The big jump in the graphic below after June 1st can reasonably be attributed to John Oliver’s segment on this issue on his HBO show, not other media.
That doesn’t mean that the comments haven’t flowed fast and furious at times, taking down the FCC’s ECFS system after Oliver’s show. (Shenanigans may have been at fault with the outage, too, as Sam Gustin reported at Vice.)
“During the past 60 days, the Commission has received a large number of comments from a wide range of constituents,” wrote FCC chief information officer David Bray on the FCC blog, where he reported the rate and total number of email comments on the Open Internet proceeding as open data and shared two graphics, including the one below.
Chairman Tom Wheeler and I both enthusiastically support open government and open data, so with this post I wanted to share the hourly rate of comments submitted into the FCC’s Electronic Comment Filing System (ECFS) since the start of public comments on the FCC’s Open Internet Proceeding (Proceeding 14-28). Here’s a link to a Comma Separated Values (CSV) text file providing those hourly rates for all comments submitted to ECFS and those specific to the Open Internet Proceeding; below is a graphical presentation of that same data.
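For the curious, here is a minimal sketch of how one might summarize that hourly-rate CSV once downloaded; the file name and column names below are assumptions, since they depend on the layout of the FCC’s actual export.

```python
import csv
from collections import Counter

# Assumed layout: one row per hour, with a timestamp and a comment count.
# Adjust the column names to match the header of the FCC's actual CSV.
totals = Counter()
with open("ecfs-hourly-rates.csv", newline="") as f:
    for row in csv.DictReader(f):
        day = row["hour"][:10]          # e.g. "2014-06-02T14:00" -> "2014-06-02"
        totals[day] += int(row["comments"])

# Print the five busiest days, e.g. to spot the post-John Oliver spike.
for day, count in totals.most_common(5):
    print(day, count)
```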
I’m hoping we see the content of those public comments, too. I’ve asked.
Bray also wrote that the FCC’s inbox and (aged) public comment system will remain open and that the agency continues to “invite engagement from all interested parties.” He also indicated that the FCC will be considering ways to make it easier for third parties to scrape the comment data from the system.
The FCC IT team will also look into implementing an easier way for electronic “web scraping” of comments available in ECFS for comment downloads greater than 100,000 comments at once as we work to modernize the FCC enterprise.
The number of people submitting comments is impressive, underscoring the importance of this issue and the critical role public engagement plays in the Commission’s policy-making process. When the ECFS system was created in 1996, the Commission presumably didn’t imagine it would receive more than 100,000 electronic comments on a single telecommunications issue. Open government and open data is important to our rapidly changing times both in terms of the pace of technology advances and the tightening of budgets in government. I hope you find this information useful.
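Until that modernization lands, bulk access means paging through whatever interface exists. As a purely illustrative sketch of what scripted bulk download looks like: ECFS offered no documented bulk API at the time, so the URL, parameters and response shape below are invented for illustration only.

```python
import time
import requests

# HYPOTHETICAL endpoint and parameters, for illustration only; ECFS had
# no documented bulk API when this was written.
BASE = "https://ecfs.example.gov/comments"

def fetch_all(proceeding="14-28", page_size=500):
    """Yield comments one page at a time from the hypothetical endpoint."""
    page = 0
    while True:
        resp = requests.get(BASE, params={
            "proceeding": proceeding,
            "limit": page_size,
            "offset": page * page_size,
        })
        resp.raise_for_status()
        batch = resp.json().get("comments", [])
        if not batch:
            return
        yield from batch
        page += 1
        time.sleep(1)  # be polite to a system that has already buckled once

for comment in fetch_all():
    print(comment.get("id"))
```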
In the meantime, you have until tomorrow to participate.
UPDATE: On the afternoon of July 15th, the FCC extended the Open Internet comment period until Friday, July 18 at midnight. It appears that online interest was a large part of the decision. FCC press secretary Kim Hart:
“The deadline for filing submissions as part of the first round of public comments in the FCC’s Open Internet proceeding arrived today. Not surprisingly, we have seen an overwhelming surge in traffic on our website that is making it difficult for many people to file comments through our Electronic Comment Filing System (ECFS). Please be assured that the Commission is aware of these issues and is committed to making sure that everyone trying to submit comments will have their views entered into the record. Accordingly, we are extending the comment deadline until midnight Friday, July 18.”
One additional clarification from Hart, regarding the total number of comments and public access to their contents: emails are being entered into the official docket in ECFS but are not being filed individually in the docket. “A large number of them are put into a big PDF and then that single PDF is filed into ECFS, rather than filing them one by one,” she said, via email. “So they will all be in the docket, but in a couple dozen large files rather than individually. Some are already entered, but there’s a bit of a lag.”
Update: Per Hart, as of Thursday morning, the FCC has received a cumulative total of 968,762 comments: 369,653 to ECFS and 599,109 emails to the Open Internet inbox.
“This is the most comments the FCC has received in a rulemaking proceeding,” said Hart.
Update: As of Friday at 4 pm, 1,062,000 comments had been filed in the FCC’s Open Internet proceeding.
Statement from FCC Chairman Tom Wheeler regarding this outpouring of comments:
“When the Commission launched its effort to restore Open Internet protections that were struck down in January, I said that where we end up depends on what we learn during this process. We asked the public a fundamental question: “What is the right public policy to ensure that the Internet remains open?” We are grateful so many Americans have answered our call. Our work is just beginning as we review the more than one million comments we have received. There are currently no rules on the books to protect an Open Internet and prevent ISPs from blocking or degrading the public’s access to content. There is no question the Internet must remain open as a platform for innovation, economic growth and free expression. Today’s deadline is a checkpoint, not the finish line for public comment. We want to continue to hear from you.”
Statement from FCC spokesman Mark Wigfield regarding the process for reviewing these comments:
“We appreciate the high level of public engagement on the Open Internet proceeding and value the feedback we have received. The FCC has a great deal of experience handling complicated issues that draw extensive public comment. Managing this flood of information requires a combination of good technology, good organization and good people. We are currently examining a number of approaches. The FCC will deploy staff from across many bureaus and offices who have the training, organizational expertise, and track record of success sorting through large volumes of information to ensure that we account for all views in the record.”
Update: At the close of the initial comment period of the Open Internet proceeding, the FCC had received 1,067,779 comments: 446,843 were filed through the Electronic Comment Filing System, and 620,936 through the Open Internet inbox. Now, the “reply” period begins, and will run through September 10. Update: the FCC extended the reply period until September 15th to allow more time for the public to comment.
“The comment and reply deadlines serve to get public input to the FCC in a timely and organized way to provide more time for analysis. However, comments are permitted in this proceeding any time up until a week before a vote is scheduled at an Open Meeting (the “Sunshine” period under the Sunshine in Government Act).”
This post has been updated with more numbers, links and commentary, including the headline.
Last November, I speculated about the potential of a “kernel of a United States Digital Services team built around the DNA of the CFPB: digital by default, open by nature,” incorporating the skills of Presidential Innovation Fellows.
As I wrote last week, after a successful fix to Healthcare.gov by a “trauma team” got the troubled marketplace for health insurance working, the Obama administration has been moving forward on information technology reforms, including a new development unit within the U.S. General Services Administration.
This week, that new unit became a real entity online, at 18F.
You know the UK’s awesome Government Digital Service? There’s a startup within the US gov’t to do the same work. https://t.co/yD1aZkIlOU
— Waldo Jaquith (@waldojaquith) March 12, 2014
As with the United Kingdom’s Government Digital Service, 18F is focused on delivery, an area that the UK’s executive director of digital, Mike Bracken, has been relentless in pushing. Here’s how 18F introduced itself:
18F builds effective, user-centric digital services focused on the interaction between government and the people and businesses it serves. We help agencies deliver on their mission through the development of digital and web services. Our newly formed organization, within the General Services Administration, encompasses the Presidential Innovation Fellows program and an in-house digital delivery team.
18F is a startup within GSA — the agency responsible for government procurement — giving us the power to make small changes with big effect. We’re doers, recruited from industry and the most innovative corners of public service, who are passionate about “hacking” bureaucracy to drive efficiency, transparency, and savings for government agencies and the American people. We make easy things easy, and hard things possible.
“Commencing countdown, engines on!” Meet 18F — a new way of delivering government services. http://t.co/UQD6ViA1FB
The 18F team, amongst other things, has some intriguing, geeky, and even funny titles for government workers, all focused around “agents.” API Agent. Counter Agent. Free Agent. Service Agent. Change Agent. User Agent. Agent Schmagent. Reagent. Agent onGover(). It’s fair to say that their branding, at minimum, sets this “startup in government” apart.
So does their initial foray into social media, now a basic building block of digital engagement for government: 18F is on Twitter, Tumblr and GitHub at launch.
Looks like their office suite is pretty sweet, too.
This effort won’t be a panacea for federal IT ills, nor will a U.S. Government Digital Office or the role of a U.S. chief technology officer be institutionalized until Congress acts. That said, 18F looks like a bona fide effort to take the approaches to buying, building and maintaining digital and Web services that worked in the Presidential Innovation Fellows program and the Consumer Financial Protection Bureau and to scale them across the federal government. The team explained more at their Tumblr blog about how they’ll approach their sizable remit:
Partner with agencies to deliver high quality in-house digital services using agile methodologies pioneered by top technology startups.
Rapidly deploy working prototypes using Lean Startup principles to get a desired product into a customer’s hands faster.
Offer digital tools and services that result in governmentwide reuse and savings, allowing agencies to reinvest in their core missions.
We’re transparent about our work, develop in the open, and commit to continuous improvement.
More than five years ago, Anil Dash wrote that the most interesting startup of 2009 was the United States government. Maybe, just maybe, that’s become true again, given the potential impact that the intelligent application of modern development practices could have on the digital government services that hundreds of millions of Americans increasingly expect and depend upon. What I’ve seen so far is promising, from the website itself to an initial pilot project, FBopen, that provides a simple, clean, mobile-friendly interface for small businesses to “search for opportunities to work with the U.S. government.”
Clay Johnson, a member of the inaugural class of Presidential Innovation Fellows and founder of a startup focused on improving government IT procurement, offered measured praise for the launch of 18F:
Is it a complete solution to government’s IT woes? No. But, like RFP-IT and FITARA, it’s a component to a larger solution. Much of these problems stem from a faulty way of mitigating risk. The assumption is that by erecting barriers to entry – making it so that the only bets to be made are safe ones – then you can never fail. But evidence shows us something different: by increasing the barriers to competition, you not only increase risk, you also get mediocre results.
The best way for government to mitigate risk is to increase competition, and ensure that companies doing work for the citizen are transparently evaluated based on the merits of their work. Hopefully, 18F can position itself not only as a group of talented people who can deliver, but also an organization that connects agencies to great talent outside of its own walls. To change the mindset of the IT implementation, and convince people inside of government that not only can small teams like 18F do the job, but there are dozens of other small teams that are here to help.
Given the current nationwide malaise about the U.S. government’s ability to execute on technology projects, the only thing that will win 18F accolades after the launch of these modern websites will be the unit’s ability to deliver more of them, along with services to support others. Good luck, team.
When 18F starts hiring, you’re going to want to drop everything to work there. https://t.co/yD1aZkIlOU
One of the most important open government data efforts in United States history came into being in 1993, when citizen archivist Carl Malamud used a small planning grant from the National Science Foundation to license data from the Securities and Exchange Commission, published the SEC data on the Internet and then operated it for two years. At the end of the grant, the SEC decided to make the EDGAR data available itself — albeit not without some significant prodding — and has continued to do so ever since. You can read the history behind putting periodic reports of public corporations online at Malamud’s website, public.resource.org.
Two decades later, Malamud is working to make the law public, reform copyright, and free up government data again, buying, processing and publishing millions of public tax filings from nonprofits to the Internal Revenue Service. He has made the bulk data from these efforts available to the public and anyone else who wants to use it.
“This is exactly analogous to the SEC and the EDGAR database,” Malamud told me, in a phone interview last year. The trouble is that the data has been deliberately dumbed down, he said. “If you make the data available, you will get innovation.”
November Form 990s now ready. http://t.co/HDoMzPjpY0 We have 7,335,804 Form 990s available. *STILL* no word from the IRS.
Making millions of Form 990 returns freely available online is not a minor public service. Although many nonprofits file their Form 990s electronically, the IRS does not publish the data. Rather, the agency releases images of millions of returns formatted as .TIFF files onto multiple DVDs, available to people and companies willing and able to pay thousands of dollars for them. Services like GuideStar, for instance, acquire the data, convert it to PDFs and use it to provide information about nonprofits. (Registered users can view the returns on its website.)
As Sam Roudman reported at TechPresident, Luke Rosiak, a senior watchdog reporter for the Washington Examiner, took the files Malamud published and made them more useful. Specifically, he used credits for processing that Amazon donated to participants in the 2013 National Day of Civic Hacking to make the .TIFF files text-searchable. Rosiak then set up CitizenAudit.org, a new website that makes nonprofit transparency easy.
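At its core, that conversion is optical character recognition at scale. Here is a minimal sketch of the per-page step in Python, using the Tesseract engine via the pytesseract library; the file names are placeholders, and Rosiak’s actual pipeline is not documented here.

```python
from PIL import Image
import pytesseract

# Placeholder input: one page of a scanned Form 990, as released by the IRS.
tiff_path = "form990_page.tiff"

# Extract a text layer so the return becomes searchable.
text = pytesseract.image_to_string(Image.open(tiff_path))

with open("form990_page.txt", "w") as out:
    out.write(text)

# A real pipeline would fan this step out across millions of pages
# and index the resulting text for full-text search.
```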
“This is useful information to track lobbying,” Malamud told me. “A state attorney general could just search for all nonprofits that received funds from a donor.”
Malamud estimates nearly 9% of jobs in the U.S. are in this sector. “This is an issue of capital allocation and market efficiency,” he said. “Who are the most efficient players? This is more than a CEO making too much money — it’s about ensuring that investments in nonprofits get a return.
“I think inertia is behind the delay,” he told me, in our interview. “These are not the expense accounts of government employees. This is something much more fundamental about a $1.6 trillion dollar marketplace. It’s not about who gave money to a politician.”
If I order these IRS DVDs, my cost is $2910. Media and gov get them free, but none of them lifting a finger to help. http://t.co/B6m5VECV1O
When asked for comment, a spokesperson for the White House Office of Management and Budget said that the IRS “has been engaging on this topic with interested stakeholders” and that “the Administration’s Fiscal Year 2014 revenue proposals would let the IRS receive all Form 990 information electronically, allowing us to make all such data available in machine readable format.”
Today, Malamud sent a letter of complaint to Howard Shelanski, administrator of the Office of Information and Regulatory Affairs in the White House Office of Management and Budget, asking for a review of the pricing policies of the IRS after a significant increase year-over-year. Specifically, Malamud wrote that the IRS is violating the requirements of President Obama’s executive order on open data:
The current method of distribution is a clear violation of the President’s instructions to move towards more open data formats, including the requirements of the May 9, 2013 Executive Order making “open and machine readable the new default for government information.”

I believe the current pricing policies do not make any sense for a government information dissemination service in this century, hence my request for your review. There are also significant additional issues that the IRS refuses to address, including substantial privacy problems with their database and a flat-out refusal to even consider release of the Form 990 E-File data, a format that would greatly increase the transparency and effectiveness of our non-profit marketplace and is required by law.
It’s not clear at all whether the continued pressure from Malamud, the obvious utility of CitizenAudit.org or the bipartisan budget deal that President Obama signed in December will push the IRS to freely release open government data about the nonprofit sector.
The furor last summer over the IRS investigating the status of conservative groups that claimed tax-exempt status, however, could carry over into political pressure to reform. If political groups were tax-exempt and nonprofit e-file data were published about them, it would be possible for auditors, journalists and Congressional investigators to detect patterns. The IRS would need to be careful about scrubbing the data of personal information: last year, the IRS mistakenly exposed thousands of Social Security numbers when it posted 527 forms online — an issue that Malamud, as it turns out, discovered in an audit.
“This data is up there with EDGAR, in terms of its potential,” said Malamud. “There are lots of databases. Few are as vital to government at large. This is not just about jobs. It’s like not releasing patent data.”
If the IRS were to modernize its audit system, inspectors general could use automated predictive data analysis to find aberrations to flag for a human to examine, enabling government watchdogs and investigative journalists to detect similar issues much earlier.
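As a minimal sketch of what such flagging might look like, assuming the e-file data were released as structured records (the field names and figures below are hypothetical), a simple baseline-and-outlier pass could queue returns for human review:

```python
import pandas as pd

# Hypothetical structured Form 990 data; a real e-file release would
# define its own schema.
df = pd.DataFrame({
    "ein":          ["0000001", "0000002", "0000003", "0000004"],
    "revenue":      [1_200_000, 850_000, 2_400_000, 900_000],
    "officer_comp": [110_000, 95_000, 1_900_000, 88_000],
})

# Flag filers whose officer compensation is an extreme share of revenue,
# here defined as more than three times the median ratio.
df["comp_ratio"] = df["officer_comp"] / df["revenue"]
flagged = df[df["comp_ratio"] > 3 * df["comp_ratio"].median()]

# Aberrations go to a human examiner, not to an automated judgment.
print(flagged[["ein", "comp_ratio"]])
```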
That level of data-driven transparency remains in the future. In the meantime, CitizenAudit.org is currently running on a server in Rosiak’s apartment.
Whether the IRS adopts it as the SEC did EDGAR remains to be seen.
Fox News Radio: Technical issues at Healthcare.gov (archived here)
Last week, the Obama administration announced a plan to fix the issues with the software behind HealthCare.gov, including putting QSSI in charge as the “general contractor” and prioritizing fixing errors in 834 file data first, with the goal of having the system functioning end-to-end by November 30.
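For context, an “834” is the X12 EDI transaction set that insurers receive to enroll a member, and the errors in question were malformed or missing enrollment records. Here is a rough sketch of the kind of structural sanity check involved, in Python; real 834 validation is far more involved, and the sample segment content below is illustrative only.

```python
# Minimal structural check of an X12 834 enrollment file. Segments are
# terminated by '~' and elements separated by '*'; this only verifies
# gross structure, not the business rules real validation requires.
def check_834(raw):
    problems = []
    segments = [s.strip() for s in raw.split("~") if s.strip()]
    ids = [seg.split("*")[0] for seg in segments]

    if not ids or ids[0] != "ISA":
        problems.append("missing ISA interchange header")
    if "ST" in ids:
        st = segments[ids.index("ST")].split("*")
        if len(st) < 2 or st[1] != "834":
            problems.append("transaction set is not an 834")
    else:
        problems.append("missing ST transaction header")
    if ids.count("INS") == 0:
        problems.append("no INS member-level records found")
    return problems

# Illustrative fragment, not a real enrollment file.
sample = "ISA*00*...~GS*BE*...~ST*834*0001~INS*Y*18*021~SE*3*0001~GE*1*1~IEA*1*1~"
print(check_834(sample) or "structure looks plausible")
```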
The teams of Presidential Innovation Fellows and “A List” contractors in the “tech surge” to fix the software have a tough challenge ahead of them. According to reporting from The New York Times and The Washington Post, Healthcare.gov wasn’t tested as a complete system until the last week of September, when it crashed with only a few hundred users.
Despite the issues revealed by this limited testing, government officials signed off on it launching anyway, and thus was born a historic government IT debacle whose epic proportions may still expand further.
Should the White House have delayed?
“When faced with go-live pressures, I tell my staff the following: ‘If you go live months late when you’re ready, no one will ever remember. If you go live on time, when you’re not ready, no one will ever forget.’” – Dr. John Halamka, CIO, Beth Israel Deaconess Hospital
In retrospect, the administration might have been better served by not launching on October 1st, something that was within HHS Secretary Kathleen Sebelius’ legal purview. After all, would the federal government launch a battleship that had a broken engine, faulty wiring or non-functional weapons systems into an ongoing fight? This software wasn’t simply “buggy” at launch — it was broken. These weren’t “glitches” caused by traffic, although the surge of traffic did quickly expose where the system didn’t work. Now that a reported 90% of users are able to register, other issues on the backend are just beginning to become clear, from subsidy calculation to enrollment data to insurers reporting issues with what they are receiving to serious concerns about system security.
Based upon what we know about the troubles at Healthcare.gov, it appears that people from the industry who were brought in to test the system a month ago urged CMS not to go live. It also appears that people inside the agency who saw what was going on warned leadership months in advance that the system hadn’t been tested end-to-end. Anyone building enterprise software should have had the code locked down in the final month and stopped introducing new features three to six months prior. Instead, it appears new requirements kept coming in and development continued to the end. The result is online now. (Or offline, as the case may be.)
On September 30, President Obama could have gone before the American people and said that the software was clearly not ready, explained why and told Americans that his administration wouldn’t push it live until they knew the system would work. HHS could have published a downloadable PDF of an application that could be mailed in and a phone number on the front page, added more capacity to the call centers and paper processing. It’s notable that three weeks later, that’s pretty much what President Obama said they have done.
The failed launch isn’t just about “optics,” politics or the policy of the Affordable Care Act itself, which represents a far greater shift in how people in the United States browse, buy, compare and consume health insurance and services. A working system embodies the faith and trust of the American people in the ability of government. This is something Jennifer Pahlka has said that resonates: how government builds websites and software matters, given the expectations that people now have for technology. The administration has handed the opponents of the law an enormous club to bash them with now — and they’ll deserve every bit of hard criticism they get, given this failure to execute on a signature governance initiative.
Articles worth reading on Healthcare.gov and potential reforms
Tough reporting on failures in e-government is critical to improving those services for all, but particularly for the poor.
A post by Development Seed founder Eric Gunderson on the open source front-end for Healthcare.gov: “It’s called Jekyll, and it works.”
Rusty Foster on Healthcare.gov: it could have been worse. This failure to (re)launch just happened under vastly more political scrutiny and deadlines set by Congress. The FBI’s Sentinel program, by contrast, had massive issues — but you didn’t see the Speaker of the House tweeting out bug reports or cable news pundits opining about issues. The same is true of many other huge software projects.
A must-read op-ed by former Obama campaign CTO Harper Reed and Blue State Digital co-founder and former Presidential Innovation Fellow Clay Johnson on what ails government IT, adding much-needed context to what ailed Healthcare.gov.
If not those folks, then how should the administration fix Healthcare.gov? In the larger sense, either the federal government will reform how it buys, builds and maintains software, through a combination of reforming procurement with modular contracting, bringing more technologists into government, and adopting open source and agile development processes, or this will just keep happening. The problems go much deeper than a “website.”
Ezra Klein pulled all of these pieces together in a feature on the “broken promise of better government through technology” at the end of the month. (He may have been heard in the Oval Office, given that the president has said he reads him.) Speaking at an “Organizing for America” event on November 4th, President Obama acknowledged the problem. “…I, personally, have been frustrated with the problems around the website on health care,” he said, “And it’s inexcusable, and there are a whole range of things that we’re going to need to do once we get this fixed – to talk about federal procurement when it comes to IT and how that’s organized…”
The issues behind Healthcare.gov cannot be ascribed only to procurement or human resources, as Amy Goldstein and Juliet Eilperin reported in The Washington Post: insularity and political sensitivity were central factors behind the failed launch.
Based on interviews with more than two dozen current and former administration officials and outsiders who worked alongside them, the project was hampered by the White House’s political sensitivity to Republican hatred of the law — sensitivity so intense that the president’s aides ordered that some work be slowed down or remain secret for fear of feeding the opposition. Inside the Department of Health and Human Services’ Centers for Medicare and Medicaid, the main agency responsible for the exchanges, there was no single administrator whose full-time job was to manage the project. Republicans also made clear they would block funding, while some outside IT companies that were hired to build the Web site, HealthCare.gov, performed poorly.
What could be done next? Congress might look across the Atlantic Ocean for an example. After one massive IT failure too many at the National Health Service, the United Kingdom created and empowered the Government Digital Service. UK Executive Director of Digital Mike Bracken has urged the U.S. to adopt a “digital core.”
In the video below, Clay Johnson goes deep on what went wrong with Healthcare.gov and suggests ways to fix it.
Can the White House and Congress take on the powerful entrenched providers in Washington and do the same? I’m not optimistic, unfortunately, given the campaign contributions and lobbying prowess of those entities, but it’s not an impossible prospect. I’ll write more about it in the future.
Writing at the White House blog, deputy US CTO Nick Sinai and Presidential Innovation Fellow Ryan Panchadsaram explain what’s new behind the next iteration of the federal open government data platform.
The first incarnation of Data.gov and subsequent iterations haven’t excited the imagination of the nation. The next version, which employs open source technology like WordPress and CKAN, uses adaptive Web design and features improved search.
It also, critically, highlights how open data is fueling a new economy. If you read Slate, you already knew about how the future is shaping up, but this will provide more people with a reference. Great “next” step.
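Since the new catalog runs on CKAN, its metadata is searchable through CKAN’s standard action API. Here is a minimal sketch in Python; the query term is arbitrary, though the package_search endpoint itself is part of CKAN’s documented API.

```python
import requests

# CKAN exposes dataset search via the package_search action.
resp = requests.get(
    "https://catalog.data.gov/api/3/action/package_search",
    params={"q": "health insurance", "rows": 5},
)
resp.raise_for_status()
result = resp.json()["result"]

print(result["count"], "matching datasets")
for dataset in result["results"]:
    print("-", dataset["title"])
```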
What will government cloud computing look like coming from “Big Blue”? Today, IBM announced a community cloud for federal government customers and a municipal cloud for state and local government agencies. With the move, IBM joins a marketplace for providing government cloud computing services that has quickly grown to include Google, Amazon, Salesforce.com and Microsoft.
“We’re building our federal cloud offering out of intellectual bricks and mortar developed over decades,” said Dave McQueeney, IBM’s CTO of US Federal, in an interview. The value proposition for government cloud computing that IBM offers, he said, is founded in its integrated offering, long history of government work and experience with handling some of the largest transactional websites in the world.
The technology giant whose early success was predicated upon a government contract (providing Social Security record-keeping systems in the 1930s) will be relying on that history to secure business. As McQueeney pointed out, IBM has been handling hosting for federal agencies for years and, unlike other cloud computing players, has already secured FISMA High certification for that work. IBM will still have to secure certification for its cloud computing offerings, which McQueeney said is underway. “Our understanding is that you have to follow the FedRAMP process,” he said, referring to the Federal Risk and Authorization Management Program (FedRAMP), an initiative aimed at making such authorization easier for cloud providers. “We have made requests for an audit,” he said.
As the drive for governments to move to the cloud gathers steam, IBM appears to have made a move to remain relevant as a technology provider. There’s still plenty of room in the marketplace, after all, and a federal CIO in Vivek Kundra who has been emphasizing the potential of government cloud computing since he joined the Office of Management and Budget. Adopting government cloud computing services is not, however, an easy transition for federal or state CIOs, given complex security, privacy and other compliance issues. That’s one reason that IBM is pitching an integrated model that allows government entities to consume cloud services to the degree to which CIOs are comfortable.
Or, to put it another way, software quality and assurance testing is the gateway drug to the cloud. That’s because putting certain kinds of workloads and public data in the cloud doesn’t pose the same headaches as others. That’s why the White House moved Recovery.gov to Amazon’s cloud, which CIO Kundra estimated will save some $750,000 to the operational budget to run the government spending tracking website. “We don’t have data that’s sensitive in nature or vital to national security here,” said Kundra in May.
“Cloud isn’t so much a thing as a place you are on a journey,” said McQueeney. “To begin, it’s about making basic information provisioning as easy and as flexible as possible. Then you start adding virtualization of storage, processing, networks, auto provisioning or self service for users. Those things tend to be the nexus of what’s available by subscription in a SaaS [Software-as-a-Service] model.”
The path most enterprises and government agencies are following is to start with private clouds, said McQueeney. In a phrase that might gain some traction in government cloud computing, he noted that “there’s an appliance for that,” a “cloud in a box” from IBM that they’re calling CloudBurst. From that perspective, enterprises have long since moved to a private cloud where poorly utilized machines are virtualized, realizing huge efficiencies for data center administrators.
“We think most government agencies will continue to start with private cloud,” said McQueeney, which means CIOs “won’t have to answer hard questions about data flowing out of the enterprise.”
Agencies that need on demand resources for spikes in computing demands also stand to benefit from government cloud computing services: just ask NASA, which has already begun sending certain processing needs to Amazon’s cloud. IBM is making a play for that business, though it’s unclear yet how well it will compete. The federal community cloud that IBM is offering includes multiple levels of the software stacks including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), depending upon agency interest. At the state and local level, IBM is making a play to offer SaaS to those customers based upon its experience in the space.
“We know from dealing with municipal governments that processes are very similar between cities and states,” said McQueeney. “There’s probably great leverage to be gained economically for them to do municipal tasks using SaaS that don’t differ from one another.” For those watching the development of such municipal software, the Civic Commons code-sharing initiative is also bidding to reduce government IT costs by avoiding redundancies between open source applications.
The interesting question, as McQueeney posed it, is what government cloud computing clients are really going to find when they start using cloud services. “Is the provider ready? Do they have capacity? Is reliability really there?” he asked. Offering a premium services model seems to be where IBM is placing its bet, given its history of government contracts. Whether that value proposition makes dollars (and sense) in the context of the other players remains to be seen, along with the potential growth of OpenStack, the open source cloud computing offering from Rackspace and other players.
Regardless, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.
Whether state and city governments move to open source applications or cloud computing – like Los Angeles, Minnesota or now New York City – will be one of the most important government IT stories to watch in the next year. Today, IBM has added itself to that conversation.
UPDATE: CNET posted additional coverage of IBM’s government cloud initiative, including the video from IBM Labs below: