“The single best thing we could do in open government is to get the American people engaged in the question of what high value data is,” said Aneesh Chopra, the first United States chief technology officer, speaking at this morning’s Politico “What’s Next in Tech” forum in Union Station. Video is below:
In an interview with Politico’s technology editor, Kim Hart, Chopra looked back at the lessons learned from his first two years on the job and ahead, appropriately, to what to expect in tech policy from the Obama administration. They covered a lot of ground, from open government successes to what’s next in Congress (hint: watch the push to open up spectrum for first responders) to supporting entrepreneurial growth.
What were Chopra’s lessons learned? He offered up three examples.
First, with support from the President, Chopra said that they’ve been able to open up discussion and build trusted relationships across the federal government, which has been “critical” to improving the way technology could be used and the long term policy posture.
Second, with that support, he has been surprised by how quickly the pace of response has picked up. There's a lesson in balancing long-term policy against getting results in 90 days, he said, referring to the turnaround on projects like HealthCare.gov.
Third, Chopra emphasized the role of “government as a convener,” where the administration can use its influence to bring people together to accomplish goals with technology without new regulations or legislation.
Working tech policy levers
What are the levers that the first US CTO has worked to try to galvanize action on the administration’s priorities?
First, a commitment to openness. From Manor, Texas, to inner cities, “people have found ways to tap into info in ways that helps them do something different,” said Chopra, speaking to the phenomenon of Gov 2.0 going local. “85 to 90% of that activity is happening in places we wouldn’t have imagined,” not gathering in Washington.
Second, Chopra cited the White House’s work towards “voluntary, consensus-driven standards,” noting that he was “very proud of the work on NHIN Direct.”
Finally, Chopra noted that there’s some $150 billion spent on research and development every year, which offers a number of ways to push forward with innovation in priorities like healthcare IT, energy, smart grid or communications.
Making meaningful use modular
Given the new Congress coming in to Washington, Chopra’s description of the bipartisan agreement on tech policy from his time in Virginia under Republican leadership has to be more than a little strategic. He talked about “getting to the right answer,” referring back to a former manager, David Bradley, and his management strategy of “True North.”
That approach will be tested in the next Congress, in rulemaking, and in moving forward with tech policy decisions. Outside of the healthcare bill that President Obama signed into law, which continues to meet with significant opposition in Congress, Chopra noted that healthcare is a signature part of the President’s agenda, specifically advanced by more than $20 billion in Recovery Act spending on healthcare IT.
Chopra looked back at two decisions related to approaching technology policy a bit differently. “Rather than walking into Best Buy and buying software, we created more flexible standards for meaningful use,” he said. As a result, “entrepreneurs that never thought of themselves as EMR companies are entering the market.”
The decision to make meaningful use more modular was also significant, asserted Chopra. “We opened up the regulatory regime so you could certify each and every regulatory module.”
In aggregate, Chopra credited that R&D investment, along with work to convene conversations, open up data and create more flexible regulatory regimes, with better outcomes: venture capital investment in health IT rose 39%, according to a statistic from the National Venture Capital Association.
Addressing the critics
Kim Hart brought up industry criticism of what the “first tech president” has delivered, versus President Obama’s campaign promises. Halfway through his term, the San Jose Mercury News reported this morning that on tech issues, Obama falls short of high expectations.
How did Chopra respond? He asked for more criticism, responding that you “must listen to people who are frustrated” and consider that much of the tech platform is in the space “where the plane is yet to land.” Going through the campaign promises and looking at the executive branch’s ability to move the needle in different areas, Chopra asserted that the biggest part of that – open government – has gone ahead. “It’s not ‘mom and apple pie perfect’,” he said, but they’re proud of delivering on 90-day deliverables like standards or websites.
Part of the challenge of delivering on campaign promises is that budgetary or legislative action requires different stakeholders, observed Chopra, a reality that will become even more sharply defined in the next Congress. “The Recovery Act is a unique moment in time,” which, he argued, is “overwhelmingly the vehicle for campaign promises” in health IT and clean tech.
What’s next in United States technology policy?
Chopra also met with Representative Darrell Issa (R-CA), who is very supportive of increased government transparency through technology. Issa, a successful technology entrepreneur, is one of the most knowledgeable members of Congress when it comes to technology. Whatever comes out of his legislative staff, or the new House Oversight committee, which he will chair, could represent a step forward for open government after the 2010 election.
Chopra also emphasized “modest but significant actions” that could improve the conditions for tech entrepreneurs in the United States, from open government data to regulatory action to smart grid or support for new learning technologies. On that count, Chopra offered up a “scoop” to Kim Hart: the next area where he will focus on driving innovation will be learning technologies, with more news coming at a Brookings Institution event in December.
“One policy lever is the role of public-private partnerships,” observed Chopra, highlighting the growth in STEM education, with over half a billion dollars in investment. “It’s not the money, it’s the platforms,” he said.
Chopra fielded a question from Congressman Wu (D-OR), the current chairman of the House technology and innovation subcommittee. After a digression into what went wrong for the Democratic Party in the midterms, Wu asked what the next priority will be for Congress and Chopra to work on together. His answer was simple: spectrum policy, emphasizing voluntary processes for formulating solutions. The priority, he said, is to get a broadband network for public safety that’s interoperable for first responders.
Finally, Chopra talked about the story of the Alfred brothers, who founded Brightscope in California in 2008. The story of Brightscope is important: data driving the innovation economy. They knew about key data on 401(k) plan fees at the Department of Labor, worked hard to liberate it and now have a successful, growing startup as a result.
Look for video of the event on Politico’s multimedia section later today or tomorrow. For more on Chopra, open government and participatory platforms, read Radar or watch the interview below.
This week in Washington, D.C., hundreds of experts have come together at the International Open Government Data Conference (IOGDC) to explore how data can also help citizens to make better decisions and underpin new economic growth. The IOGDC agenda is online, along with the presenters.
“Since the United Kingdom and United States movement started, lots of other countries have followed,” said Tim Berners-Lee, the inventor of the World Wide Web. Canada, New Zealand, Australia, France and Finland are all working on open data initiatives.
As he noted with a smile, the “beautiful race” between the U.S. and U.K. on the Data.gov and Data.gov.uk websites was healthy for both countries, as open data practitioners were able to learn from one another and share ideas. That race was kicked off when former British Prime Minister Gordon Brown asked Tim Berners-Lee how the United Kingdom could make the best use of the Internet. When Berners-Lee responded, “put government data on the Web,” Brown assented, and Data.gov.uk was born.
Berners-Lee explored the principles of open linked data that underpin data.gov.uk and open government. Specifically, he emphasized his support for open standards and formats over proprietary versions of either, inviting everyone present to join the W3C open government data working group.
Berners-Lee also reiterated his “five star system” for open government data:
- One star for putting data on the Web at all, with an open license. PDFs get one star.
- Two stars if it’s machine-readable. Excel qualifies, though Berners-Lee prefers XML or CSV.
- Three stars for machine-readable, non-proprietary formats
- Four stars if the data is converted into open linked data standards like RDF, which can be queried with SPARQL
- Five stars when people have gone through the trouble of linking it
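The jump from three stars to four can be sketched in a few lines: take rows from a non-proprietary CSV and emit RDF N-Triples, giving each row a URI that other datasets can link against. This is an illustrative sketch only; the agency data, URIs and vocabulary below are invented, not drawn from Data.gov.

```python
import csv
import io

# A three-star dataset: machine-readable CSV in a non-proprietary format.
# (Hypothetical sample data, not an actual Data.gov extract.)
CSV_DATA = """agency,datasets
NASA,412
EPA,380
"""

def csv_to_ntriples(text, base="http://example.org/agency/"):
    """Lift tabular rows toward four stars by emitting RDF N-Triples,
    minting a URI per row so other datasets can link to it."""
    triples = []
    for row in csv.DictReader(io.StringIO(text)):
        subject = f"<{base}{row['agency']}>"
        triples.append(
            f'{subject} <http://example.org/vocab/datasetCount> "{row["datasets"]}" .'
        )
    return triples

for triple in csv_to_ntriples(CSV_DATA):
    print(triple)
```

The fifth star is then earned by pointing those subject URIs at identifiers in other people's datasets, rather than keeping them self-contained.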
“The more transparency there is, the more likely there is to be external investment,” said Berners-Lee, highlighting the potential for open government data to make countries more attractive to the global electronic herd.
Will open data spread to more cities, states and countries, as HTML did in the 1990s? If the open standards and technologies that Berners-Lee advocates for are adopted, perhaps. “The Web spread quickly because it was distributed,” said Berners-Lee. “The fact that people could put up Web servers themselves without asking meant it spread more quickly without a centralized mandate.”
Putting open government data to work
Following Berners-Lee, federal CIO Vivek Kundra highlighted how far the open government data movement has come in the short time since President Obama issued his open government memorandum in January 2009.
Kundra remarked that he’s “seeing more and more companies come online” as seven countries have embarked on open government movements that involve democratizing data. He also reeled off a list of statistics to highlight the growth of the Data.gov platform.
- Within the boundaries of the United States, Kundra observed that 16 states and 9 cities have stood up open data platforms
- 256 applications have now been developed on top of the Data.gov platform
- There are now 305,692 data sets available on Data.gov
- Since Data.gov was launched in 2009, it has received 139 million hits.
The rapid growth of open government data initiatives globally suggests that there’s still more to come. “When I look at Data.gov platform and where we are as a global community, we’re still in the very early days of what’s possible,” said Kundra.
He emphasized that releasing open data is not just a means of holding government accountable, focusing three lenses on its release:
- Accountability, both inside of government and to citizens
- Utility to citizens, where, as Kundra said, “data is used in the lives of everyday people to improve the decisions they make or services they receive on a daily basis”
- Economic opportunities created as a result of open data.
Kundra pointed to a product recalls iPhone app created by a developer as an example of the second lens. The emerging ecosystem of healthcare apps is an example of both of the latter two facets, where open health data spurs better decisions and business growth.
“The simple act of opening up data has had a profound impact on the lives of ordinary people,” said Kundra, who pointed to the impact of the Veterans Administration’s Blue Button. Over 100,000 veterans have now downloaded their personal health records, which Kundra said has stimulated innovation in Blue Button readers that connect to systems from Google or Microsoft.
“I predict that we’ll have an industry around data curation and lightweight applications,” said Kundra. “The intersection of multiple data sets are where true value lies.” The question he posed to the audience was how the government will move toward an API-centric architecture that allows services to access data sets on a real-time basis.
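Kundra's point about the intersection of data sets comes down to a join on a shared key: value appears where two independently published datasets overlap. The sketch below is purely illustrative; the county FIPS codes, field names and figures are made up, not drawn from any real dataset.

```python
# Two hypothetical open data extracts, keyed by county FIPS code.
# All figures are invented for illustration.

health = {  # county FIPS -> uninsured rate (%)
    "48453": 21.5,  # Travis County, TX
    "06075": 12.1,  # San Francisco County, CA
}
broadband = {  # county FIPS -> household broadband adoption (%)
    "48453": 78.0,
    "06075": 88.0,
    "30031": 61.0,  # present in only one dataset
}

def intersect(a, b):
    """Return records only for keys present in both datasets --
    the overlap is where combined analysis becomes possible."""
    return {k: (a[k], b[k]) for k in a.keys() & b.keys()}

joined = intersect(health, broadband)
```

An API-centric architecture would serve each of these datasets live behind an endpoint, so the join could happen at request time rather than against stale downloads.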
When asked about that API strategy and the opportunity costs of pursuing it by open government advocate Harlan Yu, Kundra said that he follows an “80/20” rule when it comes to the government building apps vs third parties. “Do we want to be a grocery store or a restaurant when it comes to the Data.gov platform and movement?” he asked.
As a means of answering that question, Jeanne Holm, the former chief knowledge architect at the NASA Jet Propulsion Laboratory and current Data.gov evangelist, announced a new open government data community at Data.gov that will host conversations about the future of the platform.
Kundra also made three announcements on Monday:
- A new Harvard Business School case study on Data.gov, available for free to government employees
- A United States-United Kingdom partnership on open government, which will include an open government data camp later this week
- The release of a concept of operations for Data.gov, embedded below, which includes strategic goals for the site, an operational overview and a site architecture.
What do the two technology leaders see as a vision for success for open government data?
For Berners-Lee, it was being able to access data directly from a dashboard on his laptop, rather than through the indexes and catalogs on Data.gov and Data.gov.uk. He talked about open government data that wasn’t just machine-readable or linked to other sets but directly accessible from his local machine, called through powerful Python scripts.
In contrast, Kundra talked about being able to go to a store like Brookstone and, “in the same way you can buy alarm clocks with data from the Weather Channel,” find everyday objects provisioned with data from federal agencies.
To be fair, there’s a long way to go yet before that vision becomes reality. As Andrew Odewahn pointed out at Radar, earthquakes are HUGE on Data.gov, consistently bringing in the most downloads, even ahead of those product recall data sets. While provisioning a recurring visualization in the Popular Mechanics iPad app might be useful to the publisher, it’s also a reminder that the full vision for delivering utility to citizens through open data that Kundra hopes for hasn’t come to fruition as a result of Data.gov – yet.
The first chief technology officer of the Department of Health and Human Services (HHS), Todd Park, has been working hard to make community health data as useful as weather data. If that vision for open government at HHS matures, the innovation released in the private sector could meet or exceed the billions of dollars unlocked by GPS or NOAA data. To see the first steps in that direction, look no further than the healthcare apps that have already gone online, like the integration of community health data into Bing search results.
Park shared the next step in opening up health data last month out in California, when he announced HealthData.gov at the San Francisco Healthcamp. When interviewed yesterday at the mHealth Summit in Washington, Park shared more details about HealthData.gov, which he says will launch in December. He also shared a new goal for text4baby yesterday, which has now grown to be the biggest mobile health platform in the United States: 1 million moms by 2012.
HealthData.gov will be a new part of Data.gov with a health data catalog, including a roster of new public and private applications using the data, said Park. The site will launch with a new tool, a “Health Indicator Warehouse,” with over 2,000 metrics for national, state and county health. HealthData.gov will also host an online community dedicated to health data, which should allow practitioners, technologists and entrepreneurs to learn from one another. The site is the next step in the framework HHS has created for government to act as a platform through the Community Health Data Initiative. The next question will be whether these applications lead to better outcomes for citizens and businesses that expand, bringing on new workers. Measuring that meaningful outcome will require more time.
What will a government cloud computing look like coming from “Big Blue?” Today, IBM announced a community cloud for federal government customers and a municipal cloud for state and local government agencies. With the move, IBM joins a marketplace for providing government cloud computing services that has quickly grown to include Google, Amazon, Salesforce.com and Microsoft.
[Image Credit: Envis-Precisely.com]
“We’re building our federal cloud offering out of intellectual bricks and mortar developed over decades,” said Dave McQueeney, IBM’s CTO of US Federal, in an interview. The value proposition for government cloud computing that IBM offers, he said, is founded in its integrated offering, long history of government work and experience with handling some of the largest transactional websites in the world.
The technology giant whose early success was predicated upon a government contract (providing Social Security record-keeping systems in the 1930s) will be relying on that history to secure business. As McQueeney pointed out, IBM has been handling hosting for federal agencies for years and, unlike any of the other cloud computing players, has already secured FISMA High certification for that work. IBM will still have to secure FISMA certification for its cloud computing, which McQueeney said is underway. “Our understanding is that you have to follow the FedRAMP process,” he said, referring to the Federal Risk and Authorization Management Program (FedRAMP), an initiative aimed at making such authorization easier for cloud providers. “We have made requests for an audit,” he said.
As the drive for governments to move to the cloud gathers steam, IBM appears to have made a move to remain relevant as a technology provider. There’s still plenty of room in the marketplace, after all, and a federal CIO, Vivek Kundra, who has been emphasizing the potential of government cloud computing since he joined the Office of Management and Budget. Adopting government cloud computing services is not, however, an easy transition for federal or state CIOs, given complex security, privacy and other compliance issues. That’s one reason that IBM is pitching an integrated model that allows government entities to consume cloud services to the degree to which CIOs are comfortable.
Or, to put it another way, software quality and assurance testing is the gateway drug to the cloud. That’s because putting certain kinds of workloads and public data in the cloud doesn’t pose the same headaches as others. That’s why the White House moved Recovery.gov to Amazon’s cloud, which CIO Kundra estimated will save some $750,000 from the operational budget of the government spending tracking website. “We don’t have data that’s sensitive in nature or vital to national security here,” said Kundra in May.
“Cloud isn’t so much a thing as a place you are on a journey,” said McQueeney. “To begin, it’s about making basic information provisioning as easy and as flexible as possible. Then you start adding virtualization of storage, processing, networks, auto provisioning or self service for users. Those things tend to be the nexus of what’s available by subscription in a SaaS [Software-as-a-Service] model.”
The path most enterprises and government agencies are following is to start with private clouds, said McQueeney. In a phrase that might gain some traction in government cloud computing, he noted that “there’s an appliance for that,” a “cloud in a box” from IBM that they’re calling CloudBurst. From that perspective, enterprises have long since moved to a private cloud where poorly utilized machines are virtualized, realizing huge efficiencies for data center administrators.
“We think most government agencies will continue to start with private cloud,” said McQueeney, which means CIOs “won’t have to answer hard questions about data flowing out of the enterprise.”
Agencies that need on demand resources for spikes in computing demands also stand to benefit from government cloud computing services: just ask NASA, which has already begun sending certain processing needs to Amazon’s cloud. IBM is making a play for that business, though it’s unclear yet how well it will compete. The federal community cloud that IBM is offering includes multiple levels of the software stacks including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), depending upon agency interest. At the state and local level, IBM is making a play to offer SaaS to those customers based upon its experience in the space.
“We know from dealing with municipal governments that processes are very similar between cities and states,” said McQueeney. “There’s probably great leverage to be gained economically for them to do municipal tasks using SaaS that don’t differ from one another.” For those watching the development of such municipal software, the Civic Commons code-sharing initiative is also bidding to reduce government IT costs by avoiding redundancies between open source applications.
The interesting question, as McQueeney posed it, is what government cloud computing clients are really going to find when they start using cloud services. “Is the provider ready? Do they have capacity? Is reliability really there?” he asked. Offering a premium services model seems to be where IBM is placing its bet, given its history of government contracts. Whether that value proposition makes dollars (and sense) in the context of the other players remains to be seen, along with the potential growth of OpenStack, the open source cloud computing offering from Rackspace and other players.
Regardless, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.
Whether state and city governments move to open source applications or cloud computing – like Los Angeles, Minnesota or now New York City – will be one of the most important government IT stories to watch in the next year. Today, IBM has added itself to that conversation.
UPDATE: CNET posted additional coverage of IBM’s government cloud initiative, including the video from IBM Labs below:
If you’re looking for the faces of government 2.0, look no further. The video above, released today by Manor New Tech High‘s “Digital Dojo,” features more than a dozen voices (including this correspondent) talking about what Manor.Govfresh meant to them and what open government means to the country.
“I am very excited to be at Manor Govfresh because it’s the first time I’ve ever been to a conference that doesn’t just talk about change but actually does it,” said White House deputy CTO for open government Beth Noveck. “What’s exciting about Manor Govfresh is that it’s brought together so many people who are interested in municipal innovation and using technology to actually make a difference in local communities here in Manor, Texas, in Deleon, Texas, and across America, to actually make government work better.”
When you watch the video, of course, you’ll hear many more voices than Noveck’s, which is of course the point. The movement towards open government at the local level puts the growth of government 2.0 in context. As Stacy Viselli said this morning in a comment on Radar, “Communities and neighborhoods have been moving their organizations online for a while now and are looking for ways to do more. It creates an optimum environment for collaborative projects that include local governments, business, civic associations, nonprofits, and community foundations. Sometimes it’s not about the data so much as it is about providing a platform that empowers communities do what they are already doing–better.”
For more on how local governments are using technology to deliver smarter government, read about how Gov 2.0 is growing locally. And for more on Manor Govfresh, read about harnessing the civic surplus for open government.
Have you met Todd Park? He’s the first CTO of the United States Department of Health and Human Services. Earlier this week, he announced the upcoming launch of HealthData.gov, a new website that will publish open government health data. If you’re unfamiliar with Park, I interviewed him at this year’s Gov 2.0 Expo:
Park and I talked about his open government work at the Department of Health and Human Services, where he’s been trying to make community health information as useful as weather data. We also spoke about the Health 2.0 Developer Challenge, a series of code-a-thons and team competitions to build apps based upon community health data. “Games are a non-trivial information dissemination approach” that can drive actionable behavior, said Park at HealthCamp, referring to many of the entries that use game mechanics to socialize the data. The developer challenge culminated this week during the fourth annual Health 2.0 Conference in San Francisco.
The nation now can see more about what the tech community has come up with since this spring, when the question of whether there’s a healthcare app for that was answered for the first time. “Social value and economic value can go hand in hand,” he said at a health IT summit in San Francisco. Below, Park talks about the Veterans Administration’s new “Blue Button,” which provides access to downloadable personal health data.
In the video, Park outlines the agency’s plan to offer military veterans and Medicare recipients the ability to download their own health records using a digital “blue button” on MyMedicare.gov and MyHealthyVet. Fried reported on veterans getting downloadable health info at CNET.com. Park, VA CTO Peter Levin and federal CTO Aneesh Chopra blogged about the Blue Button at WhiteHouse.gov:
Veterans who log onto My HealtheVet at http://www.myhealth.va.gov and click the Blue Button can save or print information from their own health records. Using a similar Blue Button, Medicare beneficiaries who are registered users of http://www.mymedicare.gov can log onto a secure site where they can save or print their Medicare claims and self-entered personal information. Data from each site can be used to create portable medical histories that will facilitate dialog with Veterans’ and beneficiaries’ health care providers, caregivers, and other trusted individuals or entities.
This new option will help Veterans and Medicare beneficiaries save their information on individual computers and portable storage devices or print that information in hard copy. Having ready access to personal health information from Medicare claims can help beneficiaries understand their medical history and partner more effectively with providers. With the advent of the Blue Button feature, Medicare beneficiaries will be able to view their claims and self-entered information—and be able to export that data onto their own computer. The information is downloaded as an “ASCII text file,” the easiest and simplest electronic text format. This file is also easy to read by the individual; it looks like an organized report.
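Because the Blue Button download is a plain ASCII text report, even simple scripts can work with it. The parser below is only a sketch: the actual file layout isn't specified here, so it assumes a hypothetical "LABEL: value" line format, with made-up sample data, purely for illustration.

```python
# Hypothetical sample of a Blue Button-style ASCII report.
# The real file layout may differ; this format is assumed for illustration.
SAMPLE = """\
NAME: DOE, JOHN
DATE OF BIRTH: 01/01/1950
MEDICATION: LISINOPRIL 10MG
MEDICATION: METFORMIN 500MG
"""

def parse_record(text):
    """Group 'LABEL: value' lines into a dict of lists,
    since labels such as MEDICATION can repeat."""
    record = {}
    for line in text.splitlines():
        if ":" not in line:
            continue  # skip blank lines and section dividers
        label, value = line.split(":", 1)
        record.setdefault(label.strip(), []).append(value.strip())
    return record

record = parse_record(SAMPLE)
```

That a downloaded record stays human-readable while remaining this easy to process mechanically is exactly the appeal of the simple text format the post describes.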
More than 60,000 people have already downloaded their personal health records. As those technically savvy writers emphasize, however, this will create thousands of opportunities for that sensitive data to leak. They stressed the importance of using encryption and password protection to protect the records. For those watching the development of health IT, the future that the three CTOs hint at near the end of the post will be of particular interest:
Soon, Blue Button users may be able to augment the downloaded information that is housed on their computers—or that they transferred to a commercial personal health record or other health application—through automated connections to, and downloads from, major pharmacies including Walgreens and CVS; lab systems such as Quest and LabCorp; and an increasing number of inpatient and outpatient electronic medical records systems.
Keep an eye out for how that develops.
Below, Park kicks off the Healthcamp SF Bay event.
Here are his slides from the event:
Below, he summarizes his Healthcamp session.
This morning, the state of Minnesota announced that it would use Microsoft’s private cloud computing technology as a platform for its collaboration software. Microsoft’s blog post reasonably describes Minnesota’s move to the cloud as an “historic first.” Given that the state’s press release, embedded below, describes it the same way, that’s not unfair. Details have yet to emerge on the security or privacy requirements that the Redmond-based software giant signed to win the customer but, as the release notes, “the move makes Minnesota the first U.S. state to move to a large collaboration and communication suite in a private cloud environment.”
While federal, state and local government entities have used Amazon, Google Apps or Salesforce.com, today’s news at least adds Microsoft’s offerings into the conversation. The implementation will likely deploy the Windows Azure platform to deliver Microsoft’s Business Productivity Online Suite (BPOS).
“As states battle growing deficits, they are continually being asked to do more with less,” said Gopal Khanna, Minnesota’s State Chief Information Officer in a prepared statement. “Rethinking the way we manage our digital infrastructure centrally, to save locally across all units of government, is a crucial part of the solution. The private sector has utilized technological advancements like cloud computing to realize operational efficiencies for some time now. Government must follow suit.”
Not all reactions are quite as optimistic, however, particularly with respect to reduced costs. “I foresee short term gain,” tweeted researcher Simon Wardley, “large future exit costs, increased consumption, no long term reduction in IT expenditure.”
Why no long term reductions in state IT expenditures by going to Microsoft’s private cloud?
“See Jevons’ paradox,” Wardley replied. “Causes are co-evolution, long tail of demand, componentisation and increased innovation. In other words, you’ll just end up doing more. Countries & States are in competition with each other … not just firms. It’s not MSFT specific, it’s general to all clouds. The ‘cloud will save you money’ argument forgets consumption effects. You might as well argue that Moore’s law should have reduced IT expenditure. [Cloud will] reduce your costs if your workload stays the same but alas it won’t, it’ll increase for the reasons previously listed.”
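Wardley's consumption-effects argument can be made concrete with a toy model: if falling unit costs induce enough additional demand, total spend rises even as each unit gets cheaper. The elasticity figure and all numbers below are invented for illustration, not drawn from any study.

```python
# Toy model of the Jevons-paradox argument against "cloud will save you money".
# Demand responds linearly to the fractional drop in unit cost; an
# elasticity factor above a break-even point means consumption grows
# faster than the savings. All parameters are illustrative only.

def total_spend(unit_cost, baseline_units, elasticity):
    """Total IT spend when demand expands as unit cost falls
    from a $1-per-unit baseline."""
    cost_drop = 1.0 - unit_cost          # fraction saved per unit
    units = baseline_units * (1 + elasticity * cost_drop)
    return unit_cost * units

# Before the cloud move: 100 units at $1 each.
before = total_spend(unit_cost=1.0, baseline_units=100, elasticity=3.0)
# After: unit cost halves, but induced demand more than doubles usage.
after = total_spend(unit_cost=0.5, baseline_units=100, elasticity=3.0)
```

With these (made-up) parameters, halving the unit cost raises total expenditure from 100 to 125, which is the shape of the outcome Wardley predicts.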
Last week, Gartner analyst Andrea DiMaio rendered his opinion of what Gov 2.0 has to do with cloud computing. In his post, he writes that “ironically, the terms “cloud” and “open” do not even fit very well with each other,” with respect to auditability and compliance issues.
I’m not convinced. Specifically, consider open source cloud computing at NASA Nebula and the OpenStack collaboration with Rackspace and other industry players, or Eucalyptus. For more, read my former colleague Carl Brooks at SearchCloudComputing for extensive reporting in those areas. Or watch NASA CTO for IT Chris Kemp below:
Aside from the work that CloudAudit.org is doing to address cloud computing, after reading DiMaio’s post, I was a bit curious about how familiar he is with certain aspects of what the U.S. federal government is doing in this area. After all, Nebula is one of the pillars of NASA’s open government plan.
Beyond that relationship, the assertion that responsibility for cloud computing deployment investment resides in the Office for Citizen Engagement might come as a surprise to the CIO of GSA. McClure certainly is more than conversant with the technology and its implications — but I have a feeling Casey Coleman holds the purse strings and accountability for implementation. Watch the GSA’s RFP for email in the cloud for the outcome there.
To Adriel Hampton’s point on DiMaio’s post about cloud and Gov 2.0 having “nothing to do with one another,” I’d posit that that’s overly reductive. He’s right that cloud in and of itself doesn’t equal Gov 2.0. It’s a tool that enables it.
Moving Recovery.gov to Amazon’s cloud, for instance, is estimated to save the federal government some $750,000 over time and gives people the means to be “citizen inspector generals.” (Whether they use them is another matter.) Like other tools borne of the Web 2.0 revolution, cloud has the potential to enable more agile, lean government that delivers better outcomes for citizens, particularly with respect to cost savings, assuming those compliance concerns can be met.
The latter point is why Google Apps receiving FISMA certification was significant, and why Microsoft has been steadily working towards it for its Azure platform. As many observers know, Salesforce.com has long since signed many federal customers, including the U.S. Census.
DiMaio’s cynicism regarding last week’s Summit is interesting, although it’s not something I can spend a great deal of time in addressing. Would you tell the Gov 2.0 community to stop coming together at camps, forums, hearings, seminars, expos, summits, conferences or local government convocations because an analyst told you to? That’s not a position I’m coming around to any time soon, not least as I look forward to heading to Manor, Texas next week.
In the end, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.
[*Note Gartner’s reply in the comments regarding the resolution of the magic quadrant suit. -Ed.]
According to a March 2010 survey of state chief information officers by NASCIO, Grant Thornton and TechAmerica, public IT executives in the United States are looking seriously at investing in the cloud and green IT. 50% of the 40 CIOs, IT resource management officials and OMB representatives surveyed planned to invest in cloud computing. Additionally, some two thirds of those surveyed are using social media. The report is embedded below.
2010 TechAmerica Federal CIO Survey Final Report
[Hat Tip: Governing People]