As anyone watching the impact of social media on the events in Egypt knows, Facebook, Twitter and YouTube played a role. Today, Microsoft’s director of public sector engagement, Mark Drapeau, sent word that the Redmond-based software company’s open source ideation platform, Town Hall, has been deployed at nebnymasr.org to collect ideas.
This Town Hall instance and others show how citizensourcing platforms can be tailored to channel feedback around specific topics, as opposed to less structured platforms. As governments and citizens try to catalyze civic engagement using the Internet, creating better architectures for citizen participation will be critical. Clay Shirky’s talk about the Internet, citizenship and lessons for government agencies at the Personal Democracy Forum offered some insight on that count. Using taxonomies to aggregate ideas instead of a single list was a key takeaway.
Each year, the U.S. Government spends almost $80 billion buying information technology (IT): the software, computer equipment and network devices that help the Government run efficiently. It is important that those purchases be fair, neutral and based on an objective assessment of relevant criteria. To ensure that the agencies and the public are aware of our policy, today U.S. Chief Information Officer Vivek Kundra, Administrator for Federal Procurement Policy Dan Gordon and I issued a statement to Senior Procurement Executives and Chief Information Officers reminding them to select IT based on appropriate criteria while analyzing available alternatives including proprietary, open source and mixed source technologies.
Aliya Sternstein, over at NextGov, extracted an interesting headline from the guidance: “Kundra encourages open source.” Getting to that conclusion from the memo in question, embedded below, might be a stretch, though it is notable that a document signed by the United States chief information officer specifically said that agencies should “analyze alternatives” that include open source.
One key phrase in the memo gives a bit more insight into the acquisition process: agencies should be “selecting suitable IT on a case-by-case basis to meet the particular operational needs of the agency by considering factors such as performance, cost, security, interoperability, ability to share or re-use, and availability of quality support.”
Open source software has both competitive advantages and disadvantages in those areas.
(So why the memo, and why today? It’s not entirely clear yet, but a smart source points out a related news item in the space: yesterday Google won a preliminary injunction in a case where it had argued that the U.S. Department of the Interior had inappropriately geared a nearly $60 million contract for cloud-based email and collaboration software tools to fit only Microsoft’s proprietary products. Again, though, we’re indulging in a bit of speculation here, and it’s worth pointing out that Google’s relevant products aren’t themselves open-source.)
By the way, if you’d like to stay instantly up on such developments, you might try following Kundra’s new Twitter feed. He’s only tweeted three times thus far, but one was indeed a pointer to this memo. “Open source vs proprietary?” he posted. Follow @VivekKundra here.
The Federal Trade Commission released an online privacy report today that will reshape how companies and consumers interact on the Internet. The agency will take questions from reporters at 1 PM EST and from the public on Twitter in its first Twitter chat at 3 PM EST. The recommendation that “companies should adopt a ‘privacy by design’ approach by building privacy protections into their everyday business practices” is a key direction to every startup or Global 1000 corporation that comes under the FTC’s purview as the nation’s top consumer protection regulator.
The new FTC privacy report proposes a framework that would “balance the privacy interests of consumers with innovation that relies on consumer information to develop beneficial new products and services,” according to the agency’s statement, and recommends the implementation of a “Do Not Track” mechanism, which the agency describes as “a persistent setting on consumers’ browsers – so consumers can choose whether to allow the collection of data regarding their online searching and browsing activities.”
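As a rough sketch of what such a “persistent setting” amounts to in practice: the proposal was later realized as a simple HTTP request header, `DNT: 1`, that a browser attaches to every request. (The header name reflects how the idea was eventually implemented, not language from the report itself.)

```python
import urllib.request

# A "Do Not Track" preference can be expressed as a one-line HTTP request
# header (DNT: 1). This sketch builds such a request with the header set;
# whether any server honors the signal is up to the site's own policy.
req = urllib.request.Request("https://example.com/", headers={"DNT": "1"})

# urllib normalizes header names to capitalized form, so "DNT" is stored
# and retrieved as "Dnt".
print(req.get_header("Dnt"))  # prints "1"
```

The notable design property is that the preference travels with every request, rather than depending on per-site opt-out cookies, which is what the report means by a “persistent setting.”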
“Technological and business ingenuity have spawned a whole new online culture and vocabulary – email, IMs, apps and blogs – that consumers have come to expect and enjoy. The FTC wants to help ensure that the growing, changing, thriving information marketplace is built on a framework that promotes privacy, transparency, business innovation and consumer choice. We believe that’s what most Americans want as well,” said FTC Chairman Jon Leibowitz.
The report states that industry efforts to address privacy through self-regulation “have been too slow, and up to now have failed to provide adequate and meaningful protection.” The framework outlined in the report is designed to reduce the burdens on consumers and businesses.
“This proposal is intended to inform policymakers, including Congress, as they develop solutions, policies, and potential laws governing privacy, and guide and motivate industry as it develops more robust and effective best practices and self-regulatory guidelines,” according to the report, which is titled, “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers.”
“Self-regulation has not kept pace with technology,” said David Vladeck, director of the FTC’s Consumer Protection Bureau, speaking this morning about the proposed online privacy rules. “We have to simplify consumer choice and ‘do not track’ will achieve that goal,” he said. “I don’t think that under the FTC authority we could unilaterally mandate ‘do not track.'”
One of the nation’s top technology policy advocates approved. “The FTC report hits all the right notes. It sets out a modern and forward looking framework for privacy protection that moves beyond a narrow focus on notice and choice toward a full set of fair information practices and accountability measures,” said Center for Democracy and Technology president Leslie Harris. “The FTC has provided the blueprint. Now it is time for Congress and industry to follow suit.”
“We are very pleased to see the FTC exerting strong leadership on privacy,” said CDT Privacy Project Director Justin Brookman. “This report should bolster efforts to enact a privacy bill next Congress. Its recommendations are consistent with what is being discussed on the Hill.”
In a novel move, the FTC tweeted out “key points” from the report, embedded below, using @FTCGov.
Below are the prepared remarks of the FTC chairman, followed by a liveblog of the press call. Audio of the FTC online privacy press call is available as an MP3.
The FTC has issued a privacy advisory for tomorrow, stating that FTC chairman Jon Leibowitz, Jessica Rich, deputy director of the Bureau of Consumer Protection, and Edward W. Felten, the FTC’s new chief technologist, will answer reporters’ questions “about a new FTC report on privacy that outlines a framework for consumers, businesses and policymakers.”
This FTC online privacy report will be one of the most important government assessments this year. Look for widespread reaction to its contents across industry and technology media. Particular attention will likely be paid to two events here in Washington:
Will online privacy look different by the end of the day? As Jamie Court, author and president of Consumer Watchdog, wrote in the Huffington Post:
There are few issues 9 out of 10 Americans agree on. A Consumer Watchdog poll shows that 90% of Americans agree it is important to protect their privacy online. 86% want a “make me anonymous” button and 80% want the creation of a “do not track me” list online that would be administered by the Federal Trade Commission.
The release of the FTC online privacy report also comes with a new media twist: According to @FTCGov, the agency’s Twitter account, the nation’s top regulator will also host its first Twitter chat at 3 PM. It remains to be seen how civil citizens are in the famously snarky medium. The agency has suggested the #FTCpriv hashtag to aggregate tweets. UPDATE: Although the White House OpenGov account and the FTC tweeted on Wednesday that the chat would move to a different hashtag, the chat ended up taking place at the original #FTCpriv hashtag.
Will a rebooted FCC.gov become a platform for an ecosystem of applications driven by open government data? If that vision is going to come to fruition, the nation’s pre-eminent communication regulator will have to do more than just publish open data sets in machine readable formats online: it will have to develop a community of software developers that sees the benefit in creating such applications.
Monday’s FCC developer day is a first step towards that future. Whether it’s a successful one will be in part predicated upon whether the applications created by the “civic hackers” present help citizens, businesses or other organizations do something new or ease a given process.
UPDATE: One key member of the open government community is the founder of Development Seed, Eric Gundersen. Gundersen has been involved in some of the most innovative mapping projects in open government over the past few years, along with the development of the platform for the new data.worldbank.org. If you’re looking for an unvarnished assessment of the meaning of the FCC’s effort and developer outreach, look no further:
Does it make sense to experiment? “In an online world, the best ideas can and do come from anyone, anywhere,” said FCC chairman Genachowski in a prepared statement. “Tapping into the innovation happening at the edge and in the cloud is a no brainer. The FCC’s first-ever Open Developer Day imports a best practice from the tech industry to help improve accessibility. It is part of our ongoing effort to harness technology to transform the FCC into a model of excellence in government.”
In order to meet that goal, the FCC is taking an open government tack, asking the civic development community to contribute to that effort. The agency’s first data officer, Greg Elin, explained more about the constituents for today’s dev day on the Blogband blog:
Programmers from the Yahoo! Developer Network will be on hand to demo their tools and provide guidance. They will give an overview of YQL, their query language which allows developers to “access and shape data across the Internet through one simple language, eliminating the need to learn how to call different APIs.” We will also see a demonstration of their YUI Library, a set of “utilities and controls … for building richly interactive web applications.”
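To make the YQL description concrete, here is a minimal offline sketch of how a developer of the era would have formed a YQL call: the query reads like SQL over web data, and is shipped to Yahoo’s public YQL endpoint as a URL parameter. (The specific table name and WOEID below are illustrative; no request is actually sent.)

```python
from urllib.parse import urlencode

# A YQL statement looks like SQL applied to web data sources ("tables").
# This example queries the weather.forecast table for New York City
# (WOEID 2459115 is illustrative).
query = "select * from weather.forecast where woeid = 2459115"

# The statement is URL-encoded into the "q" parameter of Yahoo's public
# YQL endpoint; "format" selects JSON or XML output.
url = "https://query.yahooapis.com/v1/public/yql?" + urlencode(
    {"q": query, "format": "json"}
)
print(url)
```

The point Elin’s post highlights is the uniformity: one query syntax and one endpoint in place of learning a different API for every data source.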
An undertone pervading a significant strand of the discussion will be the 21st Century Communications and Video Accessibility Act. In signing the act last month, President Obama said the act “will make it easier for people who are deaf, blind or live with a visual impairment to do what many of us take for granted… It sets new standards so that Americans with disabilities can take advantage of the technology our economy depends on.”
The full day event will start at 9:00am and take place in Washington, DC at FCC headquarters. All developers are welcome free of charge. Bring a laptop and RSVP soon. If you’re not in the DC area and are unable to make it down here, we will be live streaming portions of the day. You can also join the discussion on Twitter using the hashtag #fccdevday. To email questions write to livequestions [at] fcc [dot] gov. You can participate by visiting Accessible Event, and entering the event code 00520237.
For more context on what the FCC is trying to accomplish with FCC.gov/developer, watch a speech from the chairman and managing director Steven VanRoekel at the 2010 Gov 2.0 Summit, below:
“An exploration of cyberpunk fiction, technology, where we’re headed, the challenges we face, and the solutions we need”-Ignite NYC. I gave a (very) similar talk called Pattern Recognition and Spimewatch at Ignite D.C. later that week. For whatever reason, this version seems to have come off much better. Chalk it up to the first time on a big stage; there were close to a thousand people present in NYC.
What will a government cloud computing look like coming from “Big Blue?” Today, IBM announced a community cloud for federal government customers and a municipal cloud for state and local government agencies. With the move, IBM joins a marketplace for providing government cloud computing services that has quickly grown to include Google, Amazon, Salesforce.com and Microsoft.
“We’re building our federal cloud offering out of intellectual bricks and mortar developed over decades,” said Dave McQueeney, IBM’s CTO of US Federal, in an interview. The value proposition for government cloud computing that IBM offers, he said, is founded in its integrated offering, long history of government work and experience with handling some of the largest transactional websites in the world.
The technology giant whose early success was predicated upon a government contract (providing Social Security record-keeping systems in the 1930s) will be relying on that history to secure business. As McQueeney pointed out, IBM has been handling hosting for federal agencies for years and, unlike the other cloud computing players, has already secured FISMA High certification for that work. IBM will still have to secure FISMA certification for its cloud computing offerings, which McQueeney said is underway. “Our understanding is that you have to follow the FedRAMP process,” he said, referring to the Federal Risk and Authorization Management Program (FedRAMP), an initiative aimed at making such authorization easier for cloud providers. “We have made requests for an audit,” he said.
As the drive for governments to move to the cloud gathers steam, IBM appears to have made a move to remain relevant as a technology provider. There’s still plenty of room in the marketplace, after all, and a federal CIO, Vivek Kundra, who has been emphasizing the potential of government cloud computing since he joined the Office of Management and Budget. Adopting government cloud computing services is not, however, an easy transition for federal or state CIOs, given complex security, privacy and other compliance issues. That’s one reason that IBM is pitching an integrated model that allows government entities to consume cloud services to the degree to which CIOs are comfortable.
Or, to put it another way, software quality and assurance testing is the gateway drug to the cloud. That’s because putting certain kinds of workloads and public data in the cloud doesn’t pose the same headaches as others. That’s why the White House moved Recovery.gov to Amazon’s cloud, a move CIO Kundra estimated will save some $750,000 in the operational budget for running the government spending tracking website. “We don’t have data that’s sensitive in nature or vital to national security here,” said Kundra in May.
“Cloud isn’t so much a thing as a place you are on a journey,” said McQueeney. “To begin, it’s about making basic information provisioning as easy and as flexible as possible. Then you start adding virtualization of storage, processing, networks, auto provisioning or self service for users. Those things tend to be the nexus of what’s available by subscription in a SaaS [Software-as-a-Service] model.”
The path most enterprises and government agencies are following is to start with private clouds, said McQueeney. In a phrase that might gain some traction in government cloud computing, he noted that “there’s an appliance for that,” a “cloud in a box” from IBM that they’re calling CloudBurst. From that perspective, enterprises have long since moved to a private cloud where poorly utilized machines are virtualized, realizing huge efficiencies for data center administrators.
“We think most government agencies will continue to start with private cloud,” said McQueeney, which means CIOs “won’t have to answer hard questions about data flowing out of the enterprise.”
Agencies that need on demand resources for spikes in computing demands also stand to benefit from government cloud computing services: just ask NASA, which has already begun sending certain processing needs to Amazon’s cloud. IBM is making a play for that business, though it’s unclear yet how well it will compete. The federal community cloud that IBM is offering includes multiple levels of the software stack, including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), depending upon agency interest. At the state and local level, IBM is making a play to offer SaaS to those customers based upon its experience in the space.
“We know from dealing with municipal governments that processes are very similar between cities and states,” said McQueeney. “There’s probably a great leverage to be gained economically for them to do municipal tasks using SaaS that don’t differ from one another.” For those watching the development of such municipal software, the Civic Commons code-sharing initiative is also bidding to reduce government IT costs by avoiding redundancies between open source applications.
The interesting question, as McQueeney posed it, is what government cloud computing clients are really going to find when they start using cloud services. “Is the provider ready? Do they have capacity? Is reliability really there?” he asked. Offering a premium services model seems to be where IBM is placing its bet, given its history of government contracts. Whether that value proposition makes dollars (and sense) in the context of the other players remains to be seen, along with the potential growth of OpenStack, the open source cloud computing offering from Rackspace and other players.
Regardless, cloud computing will be one more tool that enables government to deliver e-services to citizens in a way that was simply not possible before. If you measure Gov 2.0 by how technology is used to arrive at better outcomes, the cloud is part of the conversation.
Whether state and city governments move to open source applications or cloud computing – like Los Angeles, Minnesota or now New York City – will be one of the most important government IT stories to watch in the next year. Today, IBM has added itself to that conversation.
UPDATE: CNET posted additional coverage of IBM’s government cloud initiative, including the video from IBM Labs below:
The slideshow above is a selection of pictures from today’s Fedtalks in Washington. (Look for more high quality photography soon from the event organizers). If you can’t see the Flash slideshow, you can view my full Fedtalks 2010 set on Flickr.
What is the federal chief technology officer up to out in Silicon Valley? From afar, it looks like federal CTO Aneesh Chopra is stirring up awareness about open government and entrepreneurship in the venture capital community in California. He’s also traveling with Department of Health and Human Services (HHS) CTO Todd Park to add his compatriot’s considerable enthusiasm for innovation in healthcare information technology (HIT). Chopra’s slides follow:
During the event, I picked up some tweets coming out of a “D.C.-to-Silicon Valley” event and curated them using the Storify tool. It proved to be a bit unstable – apps in beta are fun! – but you’ll find a “living version” of the story embedded in the post below.
This morning, the state of Minnesota announced that it would use Microsoft’s private cloud computing technology as a platform for its collaboration software. Microsoft’s blog post reasonably describes Minnesota’s move to the cloud as an “historic first.” Given that the state’s press release, embedded below, describes it the same way, that’s not unfair. Details have yet to emerge on the security or privacy requirements that the Redmond-based software giant signed to gain the customer but, as the release notes, “the move makes Minnesota the first U.S. state to move to a large collaboration and communication suite in a private cloud environment.”
While federal, state and local government entities have used Amazon, Google Apps or Salesforce.com, today’s news at least adds Microsoft’s offerings into the conversation. The implementation will likely deploy the Windows Azure platform to deliver Microsoft’s Business Productivity Online Suite (BPOS).
“As states battle growing deficits, they are continually being asked to do more with less,” said Gopal Khanna, Minnesota’s State Chief Information Officer in a prepared statement. “Rethinking the way we manage our digital infrastructure centrally, to save locally across all units of government, is a crucial part of the solution. The private sector has utilized technological advancements like cloud computing to realize operational efficiencies for some time now. Government must follow suit.”
Not all reactions are quite as optimistic, however, particularly with respect to reduced costs. “I foresee short term gain,” tweeted researcher Simon Wardley, “large future exit costs, increased consumption, no long term reduction in IT expenditure.”
Why no long term reductions in state IT expenditures by going to Microsoft’s private cloud?
“See Jevons’ paradox,” Wardley replied. “Causes are co-evolution, long tail of demand, componentisation and increased innovation. In other words, you’ll just end up doing more. Countries & States are in competition with each other … not just firms. It’s not MSFT specific, it’s general to all clouds. The ‘cloud will save you money’ argument forgets consumption effects. You might as well argue that Moore’s law should have reduced IT expenditure. [Cloud will] reduce your costs if your workload stays the same but alas it won’t, it’ll increase for the reasons previously listed.”
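Wardley’s consumption-effects argument can be made concrete with a toy constant-elasticity demand model. All numbers here are hypothetical: if cloud halves the unit price of computing but demand for computing is price-elastic (elasticity greater than 1), usage more than doubles and total expenditure rises rather than falls.

```python
# Toy illustration of Jevons' paradox for IT spend (hypothetical figures).
p_before, q_before = 1.00, 100.0   # $ per unit of compute, units consumed
p_after = 0.50                     # cloud halves the unit price
elasticity = 1.5                   # constant price elasticity of demand (> 1)

# Constant-elasticity demand: quantity scales with price^(-elasticity).
q_after = q_before * (p_after / p_before) ** -elasticity   # ~282.8 units

spend_before = p_before * q_before   # $100.00
spend_after = p_after * q_after      # ~$141.42 -- spend goes UP

print(round(spend_before, 2), round(spend_after, 2))
```

With inelastic demand (elasticity below 1) the same model shows spend falling, which is exactly the “your workload stays the same” assumption Wardley says won’t hold.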