Thoughts on the future of the US CIO, from capabilities to goals


This weekend, ZDNet columnist Mike Krigsman asked me what I thought of the tenure of United States chief information officer Steven VanRoekel and, more broadly, what I thought of the role and meaning of the position in general. Here’s VanRoekel’s statement to the press via Federal News Radio:

“When taking the job of U.S. chief information officer, my goal was to help move federal IT forward into the 21st Century and to bring technology and innovation to bear to improve IT effectiveness and efficiency. I am proud of the work and the legacy we will leave behind, from launching PortfolioStat to drive a new approach to IT management, the government’s landmark open data policy to drive economic value, the work we did to shape the mobile ecosystem and cloud computing, and the culmination of our work in the launch of the new Digital Service, we have made incredible strides that will benefit Americans today and into the future,” VanRoekel said in a statement. “So it is with that same spirit of bringing innovation and technology to bear to solve our most difficult problems, that I am excited to join USAID’s leadership to help stop the Ebola outbreak. Technology is not the solution to this extremely difficult task but it will be a part of the solution and I look forward to partnering with our federal agencies, non-profit organizations and private sector tech communities to help accelerate this effort.”

Here’s the part of what I told Krigsman that ended up being published, with added hyperlinks for context:

As US CIO, Steven VanRoekel was a champion of many initiatives that improved how technology supports the mission of the United States government. He launched an ambitious digital government strategy that moved the government further toward making open data the default, supported the launch of the U.S. Digital Service, 18F, and the successful Presidential Innovation Fellows program, and improved management of some $80 billion in annual federal technology spending through PortfolioStat.

As was true for his predecessor, he was unable to create fundamental changes in the system he inherited. Individual agencies still have accountability for how money is spent and how projects are managed. The nation continues to see too many government IT projects that are over-budget, don’t work well, and use contractors with a core competency in getting contracts rather than building what is needed.

The U.S. has been unable or unwilling to reorganize and fundamentally reform how the federal government supports its missions using technology, including its relationship to incumbent vendors who fall short of efficient delivery using cutting-edge tech. The 113th Congress has had opportunities to craft legislative vehicles to improve procurement and the power of agency CIOs but has yet to pass FITARA or RFP-IT. In addition, too many projects still look like traditional enterprise software rather than consumer-facing tools, so we have a long way to go to achieve the objectives of the digital playbook VanRoekel introduced.

There are great projects, public servants and pockets of innovation throughout the federal government, but culture, hiring, procurement, and human resources remain serious barriers that continue to result in IT failures. The next U.S. CIO must be a leader in all respects, leading by example, inspiring others, and bringing political skill to bear. It’s a difficult job and one for which it is hard to attract world-class talent.

We need a fundamental shift in the system rather than significant tweaks, in areas such as open source and using the new Digital Service as a tool to drive change. The next US CIO must have experience managing multi-billion dollar budgets and be willing to pull the plug on wasteful or mismanaged projects that serve the needs of three years ago, not the future.

U.S. CIO Steven VanRoekel on the risks and potential of open data and digital government

Last year, I conducted an in-depth interview with United States chief information officer Steven VanRoekel in his office in the Eisenhower Executive Office Building, overlooking the White House. I was there to talk about the historic open data executive order that President Obama had signed in May 2013. On this visit, I couldn’t help but notice that VanRoekel has a Star Wars clock in his office. The Force is strong here. The US CIO also had a lot of other consumer technology around his workspace: a MacBook, a Windows laptop and dock, dual monitors, an iPad, a teleconferencing system integrated with a desktop PC, and an iPhone, which recently became permissible on the White House IT system in a “bring your own device” pilot. The interview that follows is slightly dated in certain respects, but it still offers significant insight into how the nation’s top IT executive is thinking about digital government, open data and more. It has also been lightly edited, primarily to remove the long-winded questions of the interviewer.

We’re at the one year mark of the Digital Government Strategy. Where do we stand with hitting the metrics in the strategy? Why did it take until now to get this out?

VanRoekel: The strategy calls for the launch of the policy itself. Throughout the year, the policy was a framework for a 12-month set of deliverables across different aspects: from the work we’re doing in mobile, from ‘bring your own device’ to security baselines and mobile device management platforms; streamlining procurement and app development in government; managing those devices securely; to thinking about the way we do customer service and the way we think about the power of data and how it plays into all of this. It’s been part of that process for about the year we’ve been working on it. Of course, we thought through these principles and have been working on data-related aspects for longer. The digital strategy policy was the framework for us to catalyze and accelerate that, and over the course of the year, the stuff that’s been going on behind the scenes has largely been working with agencies on building some of this capability around open data. You’re going to see some things happening very soon on the release of some of this capability. Second, standing up the Presidential Innovation Fellows program and then putting specific ‘PIFs’ into certain targeted agencies to fast track their opening of data — that’s going to extend into Wave Two. You’re going to see that continuing to happen, where we just take these principles and just kind of ‘rinse and repeat’ in government. Third, we’re working with a small set of the community to build tools to make it easy for agencies to implement these guidelines. So if there’s an agency that doesn’t know how to create a JSON file, that tool is on GitHub. You can see that on Project Open Data.
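For readers curious what such a JSON file looks like in practice, here is a minimal sketch in Python. The field names loosely follow the Project Open Data common-core metadata schema, but the dataset, agency and URLs below are hypothetical placeholders rather than an official example.

```python
import json

# Illustrative only: a single agency data catalog entry. Field names loosely
# follow the Project Open Data common-core metadata schema; the values and
# URLs below are hypothetical.
catalog = [
    {
        "title": "Broadband Availability Survey",
        "description": "Survey of broadband availability by census block.",
        "keyword": ["broadband", "telecommunications"],
        "modified": "2013-05-01",
        "publisher": "Example Agency",
        "contactPoint": "Open Data Coordinator",
        "mbox": "opendata@example.gov",
        "identifier": "example-agency-broadband-2013",
        "accessLevel": "public",
        "format": "text/csv",
        "accessURL": "https://example.gov/data/broadband.csv",
    }
]

# Write the catalog out as the data.json file an agency would publish.
with open("data.json", "w") as f:
    json.dump(catalog, f, indent=2)
```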

How involved has the president been in this executive order? It’s his name, his words are in there — how much have you and U.S. chief technology officer Todd Park talked with the president about this?

VanRoekel: Ever since about last summer, we’ve been talking to the president about open data, specifically. I think there’s lots of examples where we’ve had conversations on the periphery, and he’s met with a lot of tech leaders and others around the country that in many, many cases have either built their business or are relying upon some government service or data stream. We’re seeing that culminating into the mindset of what we do as a factor of economic growth. His thoughts are ‘how do we unlock this national resource?’ We’re sitting on this treasure trove – how do we unleash it into the developer community, so that these app developers can build these different solutions?’ He’s definitely inspired – he wrote that cover memo to the digital strategy last May – and then we’ve had all of these different meetings, across the course of the year, and now it culminates into this executive order, where we’re working to catalyze these agencies and get them to pay attention and follow up.

We’ve been down this road before, in some respects, with the Open Government Directive in 2009, with former US CIO Vivek Kundra putting forward claims of positive outcomes from releasing data. Yet, what have we learned over the past four years? What makes this different? Where’s the “how,” in terms of implementing this?

VanRoekel: The original launch of Data.gov was, I think, a way of really shocking the system, and getting people to pay attention and notice that there was an important resource we’re sitting on called data. Prior to Data.gov, and prior to the work of this administration, the day-to-day approach to data was very ad hoc. It wasn’t treated as data; it was just an output or a piece of a broader mix. That’s why you get so much disparity in the approach to the way we manage data. You get the paper-driven processes that are still very prevalent, where someone will send a paper document, and someone will sign it, and scan it, feed it into a system, and then eventually print it and mail it. It’s crazy what you end up seeing and experiencing inside of government in terms of how these things work. Data.gov was an important first step. The difference now is really around taking this approach to everything that we do. The work that we did with the Open Government Directive back in 2009 was really about taking some high value data sets and putting them up on Data.gov. What you ended up seeing was kind of a ‘bulk upload, bulk download’ kind of access to the data. Machine-readability and programmability weren’t taken into account, nor were searchability and findability.

Did entrepreneurs or advocates validate these data sets as “high value”? Entrepreneurs have kept buying data from government over the past four years, making Freedom of Information Act requests for data, or scraping it. They’re not getting that from Data.gov.

VanRoekel: I have no official way of measuring the ‘value’ of the data, other than anecdotal conversations. I do think that the motion of getting people to wake up and think about how they are treating data internally within an organization – well, there was a convenience factor to that, which basically was that ‘I got to pick what data I release,’ which probably dates back to ‘what data do I have that’s releasable?’ The different tiers to this executive order and this policy are a huge part of why it’s different. It sets the new default. It basically says, if you are modernizing a system or creating a new system, you can do that in a way that adopts these principles. If you [undertake] the collection, use and dissemination of data, you’ll make those machine-readable and interoperable by default. That doesn’t always mean public, because there are cases where privacy and national security mean we shouldn’t make the data public, but those principles still hold, in terms of the way I believe the things we build should evolve on this foundation. For the community that’s getting value outside of the government, this really sets a predictable, consistent course for the government to open up data. Any business decisions are risk-based decisions. You have to assume some level of risk with anything you do.

If there’s too much risk, entrepreneurs won’t do it.

VanRoekel: True. To that end, what we’ve done in this policy that’s different than before is that the way we’re collecting information about the data is being standardized. We’re creating a metadata infrastructure. The data itself doesn’t have to all be described in the same way. We’re not coming up with “one schema to rule them all” across government. The complexity of that would be insurmountable. I don’t think that’s a 21st century approach. That’s probably last-century thinking, to say that if we get one schema, we’re going to get it all done. The metadata approach is to say let’s collect a standard, template way of describing – but flexible for future extension – the data that is contained in government. In that description, and in that metadata, tags like “who owns this data” and “how often is the data updated,” along with information about how to get hold of people to find out more about what’s in the data, will be part of that description in a way that gives you some level of assurance about how the data is managed. For much of the data we have out there, there are existing laws on the books requiring its collection, not just a business process. One of the great conversations we’re having with the agencies is that they find greater efficiency in the way they collect data and build solutions based upon these open data principles.
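As a rough illustration of the assurance those descriptive tags can provide, here is a small Python sketch that checks whether a catalog entry carries them. The field names are assumptions modeled loosely on that guidance, not an official schema.

```python
# Illustrative only: check that a catalog entry carries the descriptive tags
# VanRoekel describes (who publishes the data, when it was last updated, whom
# to contact, whether it is public). Field names are assumptions, not an
# official schema.
REQUIRED_FIELDS = ["title", "publisher", "modified", "contactPoint", "accessLevel"]

def missing_metadata(entry: dict) -> list:
    """Return the required descriptive fields absent from a catalog entry."""
    return [field for field in REQUIRED_FIELDS if not entry.get(field)]

entry = {"title": "Broadband Availability Survey", "publisher": "Example Agency"}
print(missing_metadata(entry))  # -> ['modified', 'contactPoint', 'accessLevel']
```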

I received a question from David Robinson, regarding open licensing in this policy. Isn’t U.S. government data exempt from copyright?

VanRoekel: Not all government data is exempt from copyright, but those are generally edge cases. The Smithsonian takes pictures of things that are still under copyright, for instance. That’s government data. I sent a note about this announcement to the Secretary of the Smithsonian this morning. I’ve been talking to him about opening up data for some time. The nuance there, about open licenses, is really around the types of systems that create the data, and putting a preference on a non-proprietary format. You can imagine a world in which I give you an XML file, and I give you a Microsoft Excel file. Those are both pieces of data. To some extent, the Excel format is machine-readable. You can open it up and look at it internally just the way it is, but do you have to go buy a special piece of software to read the file or not? That kind of denotes the openness and accessibility of the data. In the case of the policy, we declare a strong preference for these non-proprietary formats, so that not only do you get machine-readability but you also get the broadest access to the data. It’s less about whether the content in there is copyrighted or not — I think most data in government, outside of the realm of confidential or private data, is not copyrighted, so to speak, from the standpoint of the license. It’s more about the format, and whether there’s a proprietary standard wrapped around the stuff. We have an obligation as a government to pick formats, pick solutions, et cetera that not only have the broadest applicability and accessibility for the public but also create the most opportunity in the broadest sense.
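To make the format point concrete, here is a minimal sketch that exports a dataset from a proprietary spreadsheet format to plain CSV. It assumes the pandas library (with an Excel engine such as openpyxl) is installed, and the file names are hypothetical.

```python
import pandas as pd  # assumes pandas and an Excel engine such as openpyxl are installed

# Illustrative only: read a dataset stored in a proprietary spreadsheet format
# and re-publish it as non-proprietary, machine-readable CSV. File names are
# hypothetical.
df = pd.read_excel("agency_dataset.xlsx")
df.to_csv("agency_dataset.csv", index=False)
```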

Open data doesn’t come without costs. Is this open data policy an unfunded mandate on all of the agencies, instructing them to put all of the data online they can, to digitize content?

VanRoekel: In the broadest sense, the phrase ‘the new default’ is an important one. It basically says, for enhancements to existing systems or new systems, follow this guideline. If people are making changes, this is in the list of requirements. From a costing perspective, it’s pre-baked into the cost of any enhancement or release. That’s the broad statement. The narrow statement is that there are many agencies out there, more every day, that are embracing these retroactive open data approaches, saying that there is value to the outside world, there is lower cost, there is greater interoperability, there are solutions that can be derived from taking these open data approaches inside of my own organization. That’s what we saw in PIF [Presidential Innovation Fellows] round one, where these agencies adopted the innovation fellows to unlock their data. That’s increasing and expanding in round two, and continuing in the agencies which we thought were high administration priorities, along with others. I think we’re going to continue to see this as a catalyzing element of that phenomenon, where people are going to go back and spend the resources on doing this. Just invite any of these leaders to the last twenty minutes of a hackathon, where folks are standing up and showing the solutions that they developed in one day, based on the principles of open data and APIs. They are just overwhelmed by the potential within their own organizations, and they run back and want to do this as fast as they can.

Are you using anything that has ever been developed at a hackathon, personally or professionally?

VanRoekel: We are incorporating code from the “We The People” hackathon, the most recent one. I know Macon Phillips and team are looking at incorporating feature sets they got out of that. An important part of the hackathon, like most conferences you go to, is the time between the sessions. They’re the most important – the relationship building aspect, figuring out how we shape the next set of capabilities or APIs or other things you want to build.

How does this relate to the way that the federal government uses open data internally?

VanRoekel: There are so many examples of government agencies that, when faced with a technical problem, will go hire a single monolithic vendor to build a single, monolithic solution – and spend most of the budget on the planning cycle – and you end up with these multi-million dollar, 3-ring binders that ultimately fail, because five or ten years after they started these projects, technology has moved on or people have left or laws have changed. One of the key components of this is laying foundational stones down to say how we are going to build upon that, to create the apps and solutions of the future. You know, I can swoop in and say “here’s how to do modular contracting in the context of government acquisition” – but unless you also say you’ve got to adopt open data and these principles of API-first, of doing things a different way – smaller, reusable, interoperable pieces – you can’t really build the phenomenon. These are all elements of that, and the cost savings aspects of it are extraordinary. The risk profile is going to be a lot smaller. I’m as excited about this inside government as outside.

Do you think the federal government will ever be able to move from big data centers and complicated enterprise software to a lightweight, distributed model for mobile services built on APIs?

VanRoekel: I think there is massive potential for things like that across the whole of government. I mean, we’re a big organization. We’re the largest buyer of technology in the world. We have unending opportunities to do things in a more efficient way. I’ve been running this process that I launched last year called PortfolioStat. It’s all about taking a left-to-right look, sitting down with agencies. What I’ve always been missing from those is some of these groundbreaking policies that start to paint the picture for what the ideal is, and how to get your job done in a way that’s different than the way you’ve done it before, like the notion of continuous improvement. We’ve needed things like the EO to give us those conversation starters to say, here’s the way to do it, see what they are doing over at HHS. “How are you going to bring that kind of discipline into your organization?” I’m sitting down with every deputy secretary and all the C-level executives to have those tough conversations. Tough, but good conversations about how we are going to change the way we deliver solutions inside of government. The ideal state that they’ll all hear about is the service-oriented model with centralized, commodity computing that’s mostly cloud-based. Then, how do you provide services out to the periphery of your organization?

You told me in our last interview that you had statutory authority to make things happen. What happens if a federal CIO drags his or her feet and, a year from now, you’re still here and they’re not moving on these policies, from cloud to open data?

VanRoekel: The answer I gave to you last time still holds: it’s about inspire and push. Inspire comes in many forms. One is me coming in and showing them the art of the possible, saying there’s a better way of doing this, getting their customers to show up at the door to say that we want better capabilities and get them inspired to do things, getting their leadership to show up and say we want better things. Push is about budget – how you manage their budget. There are aspects of both inspire and push in the way we’ve managed the budget this year. I have the authority to do that.

What’s your best case for adopting an open data strategy and enterprise data inventory, if you’re trying to inspire?

VanRoekel: The bottom line is meet your mission faster and at a much lower cost. Our job is not about technology as an end state – it’s about our mission. We’ve got to get the mission of government done. You’re fostering immigration, you’re protecting public safety, you’re providing better energy guidance, you’re shaping an industry for the country. Open data is a fundamental building block of providing flexibility and reusability in the workplace. It’s what you do to get to the end state of your mission. I hearken back a lot to the examples we used at the FCC, which was moving from something like fourteen websites to one and how we managed that. How do we take workload off a place so that the effort pays for itself in six months and starts yielding benefits beyond that? The benefits are long-term. When you build that next enhancement, or that new thing on top of it, you can realize the benefits at lower cost. It’s amazing. I do these TechStat processes, where I sit down with the agencies. They have some project that’s going off the rails. They need help, focus, and some executive oversight. I sit down, usually in a big room of people, and it’s almost gotten to the point where you don’t need to look at the briefing documents ahead of time. You sit down and say, I bet you’re doing it this way – and it’s monolithic, proprietary, probably taking a lot of packaged software and writing a lot of glue code to hold it all together – and you then propose to them the principles of open data and open approaches to doing the solution, and tell them I want to see, in the next sixty days, some customer-facing value that’s built on this model. They go off and do that, and they get right back on the tracks and they succeed. Time after time when we do TechStat, that’s the formula, and it’s yielded these incredible results. That culture is starting to permeate how we get stuff done, because they see how it might accomplish their mission if they just turn 45 degrees and try a different approach. If that makes them successful, they will go there every time.

Critiques of open data raise concerns about “discretionary disclosure,” where a government entity releases what it wants, claims credit for embracing open government, and obfuscates the rest of the data. Does this policy change any of the decisions that are being made to delay, redact or not release requested data?

VanRoekel: I think today marks an inflection point that will set a course for the future. It’s not that tomorrow or next month or next year all government data will just be transformed into open, machine-readable form. It will happen over time. The key here is that we’ve created mechanisms to protect the privacy and security of data but built in a culture where that which is intended to be public should be made public. Part of what is described in the executive order is the formation of this cross-agency executive group that will define a cross-agency priority goal: we need to get inventories in from agencies regarding what they hold that could be made public. We want to know the stuff that’s not public today, what could be out there. We’re going to take that in and look at how we can set goals for this year, the next year and the year after that to continue to open up data at a faster pace than we’ve been doing in the past. The modernization act and some of the work around setting goals in government is much more compatible with, and looks a lot like, the private sector. We’re embracing, in government, these notions around goal-setting methodologies that I’ve really grown to love and respect over the course of my private sector career. Stay tuned on the CAP goal and what that looks like.

Are you all going to work with the House and Senate on the DATA Act or are statutory issues on oversight still a stumbling block?

VanRoekel: The spirit of the DATA Act, transparency and openness, reflects the things we’re doing, and I think it is embraced. Some of the tactical aspects of the act were a little off the mark, in terms of getting to the end state that we want to get to. If you look at the FY-14 budget and the work we’ve done on transferring USASpending.gov to Treasury to get it closer to the source of the data, plus a view into how those systems get modernized, how we bring these principles into that mix, that will all be a part of the end state, which is how we track the spending.

Do you ever anticipate the data going into FOIA.gov also going into Data.gov?

VanRoekel: I don’t know. I can’t speculate on that. I’m not close enough to it.

Well, FOIA requests show demand. Do you have any sense of what people are paying for now, in terms of government data?

VanRoekel: I don’t.

Has anybody ever asked, to try to figure that out?

VanRoekel: I think that would be a great thing for you to do.

I appreciate that, but this strikes me as an interesting assessment that you could be doing, in terms of measuring outflows for business intelligence. If someone buys data, it shows that there is value in it. What would it mean if releases reflected that signal?

VanRoekel: You mean preference data that is being purchased?

Right.

VanRoekel: Well, part of this will be building and looking at Data.gov. Some of the stuff coming there is really about building community around the data. The number one question Todd Park and I had coming out of the PIF program, at the end of May [2013], was: what if I think there’s data, but I don’t know — who do I contact? An important part of the delivery of this wave and the product coming out as part of this policy is going to be this enhanced Data.gov, where our intention is to build a much richer community around government data. We want to hear from people. If there are data sources that do hold promise and value, let’s hear about those and see if there are things we can do to get a PIF on structuring it, and get agencies to modernize systems to get it released and open. I know some of the costs are things like administrative fees for printing or finding the data, or something related to third parties collecting it and then reselling it. We want to make sure that we’re thoughtful in how we approach that.

How has the experience that you’ve seen everyone have with the first iteration of Data.gov informed the nation’s open data strategy today? What specifically had not been done before that you will be doing now?

VanRoekel: The first Data.gov set us on a cultural path. What it didn’t do was connect you to the data at its source. What is this data? How often is it updated? Findability and searchability of broad government data wasn’t there. Programmability of the data wasn’t necessarily there. My intention is that Data.gov, in the future, instead of being a repository for data, a place to upload the data, will become a metadata catalog. It will be the place you go, the one-stop shop, to find government data, across multiple aspects. The way we’re doing this is through the policy itself, which says that agencies have to go and set up this new page, similar to what is now standard in open government: /open, /developer. The most important part of that page is a JSON file. That’s what Data.gov can go out and crawl, or any developer outside can go out and crawl, to find out when data has been updated, what data is available, and in what format. All of the standard metadata that I’ve described earlier will be represented through that JSON file. Data.gov will then become a metadata catalog of all the open data out in government at its source. As a developer, you’d come in, and if you wanted to do a map, for instance, to see what broadband capabilities exist near low-income Americans and then overlay locations of educational institutions, if you wanted to look for a correlation between income and broadband deployment and education, you’d hypothetically be looking for three different data sources, from three different agencies. You’d be able to find the open data streams, the APIs, to go get that data in one place, and then you’d have a connection back to the mothership to be able to grab it and find out who owns it. We want to still have a center of gravity for data, but make the data itself follow these principles, in terms of discoverability and use. The thing that probably got me most pointed in this direction is the President’s Council of Advisors on Science and Technology (PCAST), which did a report on health IT. Buried on page 60 or something, it had this description of metadata as the linchpin of discoverability of diverse data sources. That’s the approach we’ve taken, much like Google.
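To illustrate what that crawling model might look like from a developer’s seat, here is a rough Python sketch. The agency catalog URLs and the handling of a top-level "dataset" key are assumptions for illustration, not a guaranteed layout of any agency’s actual /data.json file.

```python
import json
from urllib.request import urlopen

# A rough sketch of what a developer (or Data.gov itself) might do: crawl each
# agency's /data.json metadata file and look for datasets on a topic, such as
# broadband. The agency URLs here are illustrative, not a complete list.
AGENCY_CATALOGS = [
    "https://www.fcc.gov/data.json",
    "https://www.ed.gov/data.json",
]

def find_datasets(keyword: str):
    """Yield (catalog URL, dataset title, access URL) for entries matching a keyword."""
    for catalog_url in AGENCY_CATALOGS:
        with urlopen(catalog_url) as response:
            catalog = json.load(response)
        # Some catalogs wrap entries in a top-level "dataset" key; handle both shapes.
        entries = catalog.get("dataset", []) if isinstance(catalog, dict) else catalog
        for entry in entries:
            text = " ".join([entry.get("title", ""), entry.get("description", "")]).lower()
            if keyword.lower() in text:
                yield catalog_url, entry.get("title"), entry.get("accessURL")

for catalog_url, title, access_url in find_datasets("broadband"):
    print(catalog_url, "|", title, "|", access_url)
```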

5 years from now, what will have changed because of this effort?

VanRoekel: The way we build solutions inside of government is going to change, and the number of apps and solutions outside of government is going to fundamentally change. You and I now, sitting in our cars, take for granted the GPS signal going to the device on the dash. I think about government. Government is right there with me, every single day, as I’m driving my car, or when I do a Foursquare check-in on my phone. We’ll be bringing government data to citizens where they are, versus making people come to government. It’s been a long time since the mid-80s, when we opened up GPS, but look at where we are today. I think we’ll look back in 10 or 15 years and think about all of the potential we unlocked today.

What data could be like GPS, in terms of its impact on our lives?

VanRoekel: I think health and energy are probably two big ones.

POSTSCRIPT

Since we talked, the Obama administration has followed through on some of the commitments the U.S. CIO described, including relaunching Data.gov and releasing more data. Other goals, like every agency releasing an enterprise data inventory or publishing a /data and /developer page online, have seen mixed compliance, as an audit by the Sunlight Foundation showed in December. The federal government shutdown last fall also scuttled open data access, with certain data types deemed essential to maintain and others not. The shutdown also suggested that an “API-first” strategy for open data might be problematic.

OMB, where VanRoekel works, has also quietly called for major changes in the DATA Act, which passed the House of Representatives with overwhelming support at the end of last year. A marked-up version of the DATA Act obtained by Federal News Radio removes funding for the legislation and language that would require standardized data elements for reporting federal government spending.

The news was not received well on Capitol Hill. Sen. Mark Warner, D-Va., the lead sponsor of the DATA Act in the Senate, reaffirmed his commitment to the current version of the bill in a statement: “The Obama administration talks a lot about transparency, but these comments reflect a clear attempt to gut the DATA Act. DATA reflects years of bipartisan, bicameral work, and to propose substantial, unproductive changes this late in the game is unacceptable. We look forward to passing the DATA Act, which had near universal support in its House passage and passed unanimously out of its Senate committee. I will not back down from a bill that holds the government accountable and provides taxpayers the transparency they deserve.”

The leaked markup has led observers to wonder whether the White House wants to scuttle the DATA Act, and led others to potentially withdraw support. “OMB’s version of the DATA Act is not a bill that the Sunlight Foundation can support,” wrote Matt Rumsey, a policy analyst at the Sunlight Foundation. “If OMB’s suggestions are ultimately added to the legislation, we will join our friends at the Data Transparency Coalition and withdraw our support of the DATA Act.”

In response to repeated questions about the leaked draft, the OMB press office has sent the same statement to multiple media outlets: “The Administration believes data transparency is a critical element to good government, and we share the goal of advancing transparency and accountability of Federal spending. We will continue to work with Congress and other stakeholders to identify the most effective & efficient use of taxpayer dollars to accomplish this goal.”

I have asked the Office of Management and Budget (OMB) about all of these issues and will publish any reply I receive separately, with a link from this post.

Intelligence executive David Bray to become new FCC CIO

David Bray, a seasoned national intelligence executive (CV), will be the next chief information officer of the Federal Communications Commission. He’s expected to finish his work in the intelligence community at the Office of the Director of National Intelligence and commence work at the FCC in August.

“As the next FCC CIO, I look forward [to] aiding the FCC’s strong workforce in pioneering new IT solutions for spectrum auctions, next-gen cybersecurity, mobile workforce options, real-time enterprise analytics, enhanced open data, and several other vital public-private initiatives,” wrote Bray, in an email sent to staff and partners Monday night.

Bray holds a PhD in information systems, an MSPH in public health informatics, and a BSCI in computer science and biology from Emory University, alongside a visiting associateship from the University of Oxford’s Oxford Internet Institute and two post-doctoral associateships with MIT’s Center for Collective Intelligence and the Harvard Kennedy School. He has also served as a visiting associate with the National Defense University. Bray’s career also includes deployments to Afghanistan, projects at the Department of Energy and work at the Centers for Disease Control and Prevention.

Bray will inherit many IT challenges from former FCC CIO Robert Naylor, who announced that he’d be stepping down in December 2012. His background in the intelligence community will serve him well with respect to network security issues, but he’ll need to continue transitioning an agency that has traditionally outsourced much of its technology toward 21st-century computing standards and approaches to building infrastructure and meeting increasing demand for services.

Bray’s past work in collective intelligence, informatics, public health and data science suggests that he’ll have no shortage of vision to bring to the role. His challenge, as is true for every federal CIO these days, will be to work within limited budgets and under intense scrutiny to deliver on that promise.

To get a sense of Bray, watch his talk on “21st century social institutions” at a brunch for Emory University scholars in 2013:

San Francisco experiments with citizensourcing better ideas

As significant as the revisions to San Francisco’s open data policy may prove to be, city officials and civic startups alike emphasize that it’s people who are fundamental to sustained improvements in governance and city life.

“Open data would not exist without our community,” said Jay Nath, the city’s first chief innovation officer, this Monday at the Hatchery.

San Francisco’s approach to open innovation in the public sector — what businesses might describe as crowdsourcing, you might think of as citizensourcing for cities — involves a digital mix of hackathons, public engagement and a renewed focus on the city’s dynamic tech community, including the San Francisco Citizens Initiative for Technology and Innovation, or SF.citi.

Cities have been asking their residents how government could work better for some time, of course — and residents have been telling city governments how they could work better for much longer than that. New technologies, however, have created new horizons for participatory platforms to engage citizens, including mobile apps and social media.

Open data and civic coders also represent a “new class of civic engagement focused on solving issues, not just sharing problems,” argues Nath. “We have dozens and dozens of apps in San Francisco. I think it’s such a rich community. We haven’t awarded prizes. It’s really about sustainability and creating community. We’ve [had] six or seven events and more than 10,000 hours of civic engagement.”

San Francisco’s dedicated citizensourcing platform is called “ImproveSF.” The initiative had its genesis as an internal effort to allow employees to make government better, said Jon Walton, the city’s chief information officer. The ideas that come out of both, he said, are typically about budget savings.

The explosion of social media in the past few years has created new challenges for San Francisco in taking public comments digitally on Facebook or Twitter, challenges officials haven’t fully surmounted yet.

“We don’t try to answer and have end-to-end dialog,” said Walton in an interview earlier this year. Part of that choice is driven by the city’s staffing constraints.

“What’s important is that we store, archive and make comments available to policy makers so that they can see what the public input is,” he said.

Many priorities are generated by citizen ideas submitted digitally, emphasized Walton, which can then be put on a ballot, voted on by residents, and become policy by public mandate.

“How do you get a more robust conversation going on with the public?” asked Walton. “In local government, what we’re trying to do is form better decisions on where we spend time and money. That means learning about other ideas and facilitating conversations.”

He pointed to the deployment of free public Wi-Fi this year as an example of how online public comments can help shape city decisions. “We had limited funds for the project,” he said. “Just $80,000. What can you do with that?”

Walton said that one of the first things they thought about doing was putting up a website to ask the public to suggest where the hotspots should be.

The city is taking that feedback into account as it plans future wifi deployments:

[Map: public Wi-Fi deployment sites, with green dots marking completed sites and blue dots marking sites in progress]

Walton said they’re working with the mayor’s office to make the next generation of ImproveSF more public-facing.

“How do we take the same idea and expose it to the public?” he asked. “Any new ‘town hall’ should really involve the public in asking what the business of government should be. Where should sacrifices and investments be made? There’s so much energy around the annual ballot process. People haven’t really talked about expanding that. The thing that we’re focusing on is to make decision-making more interactive.”

At least some of San Francisco’s focus has gone into mobile development.

“If you look at the new social media app, we’re answering the question of ‘how do we make public meetings available to people on handhelds and tablets’?” said Walton.

“The next generation will focus on how do they not just watch a meeting but see it live, text in questions and have a dialog with policy makers about priorities, live, instead of coming in in person.”

Kundra: Closing the IT gap is the key to making government work better for the American people

Today, the first chief information officer of the United States, Vivek Kundra, shared his reflections on public service.

Kundra, whose last day of work at the White House Office of Management and Budget was last Friday, is now at the Harvard Kennedy School and Berkman Center.

I arrived at a White House that was, as the Washington Post put it, “stuck” in the “Dark Ages of technology.” In their words, “If the Obama campaign represented a sleek, new iPhone kind of future, the first day of the Obama administration looked more like the rotary-dial past.”

As my team congratulated me on the new job, they handed me a stack of documents with $27 billion worth of technology projects that were years behind schedule and millions of dollars over budget. At the time, those documents were what passed for real-time updates on the performance of IT projects. My neighbor’s ten year old could look up the latest stats of his favorite baseball player on his phone on the school bus, but I couldn’t get an update on how we were spending billions of taxpayer dollars while at my desk in the White House. And at the same time, the President of the United States had to fight tooth and nail to simply get a blackberry.

These were symptoms of a much larger problem.

The information technology gap between the public and private sectors makes the Federal Government less productive and less effective at providing basic services to its citizens. Closing this gap is the key to making government work better for the American people – the ultimate goal.

His complete thoughts are embedded below. If you’re interested in frank insight into why changing government through information technology isn’t easy, read on.

Vivek Kundra’s Reflections on Public Service 2011 [document embedded via Scribd]

The US CIO goes to the white board to describe good government

Earlier this week, United States CIO Vivek Kundra turned to the White House whiteboard to talk about sunshine, savings and service. If you’re unfamiliar with Kundra, he’s the man who proposed and now is entrusted with implementing sweeping federal IT reform. One of the tools he’s been applying to the task is the so-called IT dashboard, which helps the White House Office of Management and Budget, where he serves, track IT spending. He claims to have reduced federal IT spending by some $3 billion over the past two years with increased tracking and scrutiny. The federal CIO explains more about the results from that work below.

[Video: Vivek Kundra at the White House whiteboard, via WhiteHouse.gov]

UPDATE: As open data consultant Dan Morgan pointed out, however, the Government Accountability Office reported that while OMB has made improvements to its dashboard, “further work is needed by agencies and OMB to ensure data accuracy.”

…inaccuracies can be attributed to weaknesses in how agencies report data to the Dashboard, such as providing erroneous data submissions, as well as limitations in how OMB calculates the ratings. Until the selected agencies and OMB resolve these issues, ratings will continue to often be inaccurate and may not reflect current program performance. GAO is recommending that selected agencies take steps to improve the accuracy and reliability of Dashboard information and OMB improve how it rates investments relative to current performance and schedule variance. Agencies generally concurred with the recommendations; OMB did not concur with the first recommendation but concurred with the second. GAO maintains that until OMB implements both, performance may continue to be inaccurately represented on the Dashboard.

One question left unanswered: Is /good the new /open? Decide for yourself at the new “Good Government” section at WhiteHouse.gov.

IBM initiative adds Big Blue to government cloud computing market

What will government cloud computing look like coming from “Big Blue”? Today, IBM announced a community cloud for federal government customers and a municipal cloud for state and local government agencies. With the move, IBM joins a marketplace for government cloud computing services that has quickly grown to include Google, Amazon, Salesforce.com and Microsoft.

[Image Credit: Envis-Precisely.com]

“We’re building our federal cloud offering out of intellectual bricks and mortar developed over decades,” said Dave McQueeney, IBM’s CTO of US Federal, in an interview. The value proposition for government cloud computing that IBM offers, he said, is founded in its integrated offering, long history of government work and experience with handling some of the largest transactional websites in the world.

The technology giant, whose early success was predicated upon a government contract (providing Social Security record-keeping systems in the 1930s), will be relying on that history to secure business. As McQueeney pointed out, IBM has been handling hosting for federal agencies for years and, unlike the other cloud computing players, has already secured FISMA High certification for that work. IBM will still have to secure FISMA certification for its cloud computing offering, which McQueeney said is underway. “Our understanding is that you have to follow the FedRAMP process,” he said, referring to the Federal Risk and Authorization Management Program (FedRAMP), an initiative aimed at making such authorization easier for cloud providers. “We have made requests for an audit,” he said.

As the drive for governments to move to the cloud gathers steam, IBM appears to have made a move to remain relevant as a technology provider. There’s still plenty of room in the marketplace, after all, and a federal CIO in Vivek Kundra who has been emphasizing the potential of government cloud computing since he joined the Office of Management and Budget. Adopting government cloud computing services is not, however, an easy transition for federal or state CIOs, given complex security, privacy and other compliance issues. That’s one reason that IBM is pitching an integrated model that allows government entities to consume cloud services to the degree to which CIOs are comfortable.

Or, to put it another way, software quality and assurance testing is the gateway drug to the cloud. That’s because putting certain kinds of workloads and public data in the cloud doesn’t pose the same headaches as others. That’s why the White House moved Recovery.gov to Amazon’s cloud, which CIO Kundra estimated will save some $750,000 from the operational budget to run the government spending tracking website. “We don’t have data that’s sensitive in nature or vital to national security here,” said Kundra in May.

“Cloud isn’t so much a thing as a place you are on a journey,” said McQueeney. “To begin, it’s about making basic information provisioning as easy and as flexible as possible. Then you start adding virtualization of storage, processing, networks, auto provisioning or self service for users. Those things tend to be the nexus of what’s available by subscription in a SaaS [Software-as-a-Service] model.”

The path most enterprises and government agencies are following is to start with private clouds, said McQueeney. In a phrase that might gain some traction in government cloud computing, he noted that “there’s an appliance for that,” a “cloud in a box” from IBM that they’re calling CloudBurst. From that perspective, enterprises have long since moved to a private cloud where poorly utilized machines are virtualized, realizing huge efficiencies for data center administrators.

“We think most government agencies will continue to start with private cloud,” said McQueeney, which means CIOs “won’t have to answer hard questions about data flowing out of the enterprise.”

Agencies that need on-demand resources for spikes in computing demand also stand to benefit from government cloud computing services: just ask NASA, which has already begun sending certain processing needs to Amazon’s cloud. IBM is making a play for that business, though it’s unclear yet how well it will compete. The federal community cloud that IBM is offering includes multiple levels of the software stack, including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), depending upon agency interest. At the state and local level, IBM is making a play to offer SaaS to those customers based upon its experience in the space.

“We know from dealing with municipal governments that processes are very similar between cities and states,” said McQueeney. “There’s probably great leverage to be gained economically for them to do municipal tasks that don’t differ from one another using SaaS.” For those watching the development of such municipal software, the Civic Commons code-sharing initiative is also bidding to reduce government IT costs by avoiding redundancies between open source applications.

The interesting question, as McQueeney posed it, is what government cloud computing clients are really going to find when they start using cloud services. “Is the provider ready? Do they have capacity? Is reliability really there?” he asked. Offering a premium services model seems to be where IBM is placing its bet, given its history of government contracts. Whether that value proposition makes dollars (and sense) in the context of the other players remains to be seen, along with the potential growth of OpenStack, the open source cloud computing offering from Rackspace and other players.

Regardless, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.

Whether state and city governments move to open source applications or cloud computing – like Los Angeles, Minnesota or now New York City – will be one of the most important government IT stories to watch in the next year. Today, IBM has added itself to that conversation.

UPDATE: CNET posted additional coverage of IBM’s government cloud initiative, including the video from IBM Labs below:

What does Gov 2.0 have to do with cloud computing?

Last week, Gartner analyst Andrea DiMaio rendered his opinion of what Gov 2.0 has to do with cloud computing. In his post, he writes that “ironically, the terms ‘cloud’ and ‘open’ do not even fit very well with each other,” with respect to auditability and compliance issues.

I’m not convinced. Specifically, consider open source cloud computing at NASA Nebula and the OpenStack collaboration with Rackspace and other industry players, or Eucalyptus. For more, read my former colleague Carl Brooks at SearchCloudComputing, who has reported extensively in those areas. Or watch NASA CTO for IT Chris Kemp below:

Aside from the work that CloudAudit.org is doing to address cloud computing, after reading DiMaio’s post, I was a bit curious about how familiar he is with certain aspects of what the U.S. federal government is doing in this area. After all, Nebula is one of the pillars of NASA’s open government plan.

Beyond that relationship, the assertion that responsibility for cloud computing deployment investment resides in the Office for Citizen Engagement might come as a surprise to the CIO of GSA. Dave McClure, who leads GSA’s Office of Citizen Services, certainly is more than conversant with the technology and its implications — but I have a feeling Casey Coleman holds the purse strings and accountability for implementation. Watch the GSA’s RFP for email in the cloud for the outcome there.

To Adriel Hampton’s point on DiMaio’s post about cloud and Gov 2.0 having “nothing to do with one another,” I’d posit that that’s overly reductive. He’s right that cloud in and of itself doesn’t equal Gov 2.0. It’s a tool that enables it.

Moving Recovery.gov to Amazon’s cloud, for instance, is estimated to save the federal government some $750,000 over time and gives people the means to be “citizen inspector generals.” (Whether they use them is another matter.) Like other tools borne of the Web 2.0 revolution, cloud has the potential to enable more agile, lean government that delivers better outcomes for citizens, particularly with respect to cost savings, assuming those compliance concerns can be met.

The latter point is why Google Apps receiving FISMA certification was significant, and why Microsoft has been steadily working towards it for its Azure platform. As many observers know, Salesforce.com has long since signed many federal customers, including the U.S. Census.

DiMaio’s cynicism regarding last week’s Summit is interesting, although it’s not something I can spend a great deal of time addressing. Would you tell the Gov 2.0 community to stop coming together at camps, forums, hearings, seminars, expos, summits, conferences or local government convocations because an analyst told you to? That’s not a position I’m coming around to any time soon, not least as I look forward to heading to Manor, Texas next week.

In the end, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.

[*Note Gartner’s reply in the comments regarding the resolution of the magic quadrant suit. -Ed.]

State CIOs rank cloud computing, green IT and social media as top emerging tech

According to a March 2010 survey of state chief information officers by NASCIO, Grant Thornton and Tech America, public IT executives in the United States are looking seriously at investing in the cloud and green IT. 50% of the 40 CIOs, IT resource management officials and OMB representatives surveyed planned to invest in cloud computing. Additionally, some two-thirds of those surveyed are using social media. The report is embedded below.

2010 Tech America Federal CIO Survey Final Report [document embedded via SlideShare]

[Hat Tip: Governing People]