This Web app takes 311 reports about vacant and abandoned buildings from the City of Chicago and visualizes them on a searchable map. “It’s specifically set up to pull data from Chicago’s data portal,” said Eder, linking to the 311 service requests of vacant and abandoned buildings dataset.
Eder shared more about mapping Chicago’s vacant buildings in a blog post earlier this week. The results are unsurprising: there are many more vacant buildings in areas with high poverty rates.
Eder said that the app could be used by other cities, depending on how they store or format their data. The code for Chicago Buildings is on GitHub. On that front, he says that Chicago “isn’t using Open 311 yet, so this site isn’t either. That being said, it wouldn’t be too hard to hook up the same interface to a different data source.” Code for America will help Chicago to implement Open311 in 2012. Eder shared that he wrote a script that converts Socrata data to Google Fusion Tables that could be modified for this purpose.
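Eder’s conversion script isn’t reproduced here, but the first half of such a pipeline, pulling rows out of a Socrata portal, can be sketched against Socrata’s SODA API. The dataset identifier below is a placeholder, not the real one; look it up on the portal before fetching.

```python
import json
import urllib.parse
import urllib.request

def soda_url(domain, dataset_id, limit=100, **filters):
    """Build a query URL for a Socrata-hosted dataset via the SODA API."""
    params = {"$limit": str(limit), **filters}
    return (f"https://{domain}/resource/{dataset_id}.json?"
            + urllib.parse.urlencode(params))

def fetch_rows(url):
    """Fetch and decode the JSON rows behind a SODA query URL."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

# "xxxx-xxxx" is a placeholder dataset id -- find the real identifier on
# data.cityofchicago.org before calling fetch_rows(url).
url = soda_url("data.cityofchicago.org", "xxxx-xxxx", limit=5)
```

Each row comes back as a plain JSON object, which is what makes it easy to re-shape for another backend, Fusion Tables or otherwise.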
In the video below, Elizabeth Park, the creator of IFindit Chicago, talks about how she was inspired to build the team that created an Android app to help homeless and lower income citizens find resources like shelters, medical clinics, and food pantries.
Voting for the winners ends this Friday, October 14th, so check out the community round entries and weigh in.
As a reminder: If you have open government news to share, you can always find me at @digiphile on Twitter, where I share my email address, alex@oreilly.com.
Washington-based DevelopmentSeed continues to tell dazzling data stories with open source mapping tools. This week, they’ve posted a map of the local impact of unemployment and recovery spending. The map visualizes unemployment rate changes at a county level and folds in total economic recovery spending by the government under the American Recovery and Reinvestment Act of 2009. In the map embedded below, red corresponds to an increased unemployment rate and green corresponds to a lower unemployment rate or job growth. Counties that received less than $10 million in recovery spending have a white pattern.
David Cole explains more in a post at DevelopmentSeed.org:
Over the last year, we see that unemployment dropped in 58% of counties by an average of 0.25 percentage points. On average the Recovery Act funded 31 projects at a total of $24,131,582.47 per county. Nationally this works out to about $282.66 in recovery spending per person.
…
Overall, it’s impossible to tell for sure how much recovery spending improved the economic situation, because we just don’t know how bad things could have been. It may be the case that without spending, this map would have a lot more red. Or maybe not. What’s interesting here is the local impact and information we are able to see from processing a few sets of open data. Check out how your county is doing compared to its surroundings. How about compared to a more or less urban county nearby?
Strata Conference New York 2011, being held Sept. 22-23, covers the latest and best tools and technologies for data science — from gathering, cleaning, analyzing, and storing data to communicating data intelligence effectively. Save 20% on registration with the code STN11RAD
This past weekend, citizens acted as important sensors as Hurricane Irene moved up the East Coast of the United States, sharing crisis data as the storm passed through their communities and damage reports in its wake.
Baltimore has embraced the Open311 standard with a new 311 API and taken a major step forward toward a collaborative approach to reporting issues with the launch of new mobile applications for iPhone and Android devices.
“The new 311 Mobile App allows citizens to have real-time collaboration with their government,” said Mayor Rawlings-Blake in a prepared statement. “If you see a pothole, graffiti, or a broken streetlight, you can see it, shoot it, and send it to us — we have an app for that!”
As Philip Ashlock highlighted at Civic Commons in a post on Open311 in Baltimore, the city has a long history with 311:
The City of Baltimore has a long history of leading the way with 311. In 1996, they were the first city to deploy the 311 short code and unified call center, and in 1999, the city launched CitiStat, pioneering the use of statistics based performance management. Now both of these innovations can be amplified by a much more open and collaborative relationship between Baltimoreans and their government through Open311.
Ashlock highlighted another key detail about the integration of the standard by Motorola, which was crucial in DC and San Francisco, the first cities in the U.S. to embrace the Open311 standard.
The launch of Baltimore’s Open311 apps and API was aided by the fact that they were able to leverage the Open311 compliant solutions provided by Motorola CSR and Connected Bits. Baltimore CIO Rico Singleton went as far as to say that their choice of software solutions was influenced by the interoperability provided by the standard.
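For readers new to the standard, the interoperability Singleton cites comes from how small the GeoReport v2 surface is: a GET on /services.json lists the reportable issue types, and a form-encoded POST to /requests.json files a report. The base URL and service code below are placeholders for illustration, not Baltimore’s actual endpoint; consult a city’s Open311 documentation for the real values.

```python
import urllib.parse
import urllib.request

# Placeholder endpoint -- substitute the city's real Open311 base URL.
BASE = "https://example.city.gov/open311/v2"

def request_payload(api_key, service_code, lat, lon, description):
    """Form-encode the fields GeoReport v2 expects for POST /requests.json."""
    return urllib.parse.urlencode({
        "api_key": api_key,
        "service_code": service_code,  # pick a real code from GET /services.json
        "lat": lat,
        "long": lon,
        "description": description,
    })

def submit(payload):
    """POST the report to the Open311 endpoint; returns the raw response body."""
    req = urllib.request.Request(
        f"{BASE}/requests.json",
        data=payload.encode("utf-8"),
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# "POTHOLE" is a hypothetical service code for this sketch.
payload = request_payload("KEY", "POTHOLE", 39.2904, -76.6122, "Deep pothole")
```

Because every compliant city exposes the same two calls, an app written against one endpoint needs little more than a new base URL to work against another.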
There are a limited number of citizens who have the time, expertise, passion and education to go clean up public data. There are quite a few more who will report issues in the neighborhoods they live in or work near and share what they see. This kind of mobile networked accountability is going to be a big deal in Africa, Asia and South America very soon. We’ve been seeing early versions of it emerge already during disasters, man-made and otherwise.
With the launch of more mobile applications that connect citizens to existing systems for accountability, city governments are empowering citizens to act as sensors, connecting the real world to the Internet and creating positive feedback loops. That’s good news for Baltimore and beyond.
“This is a huge accomplishment — a nationwide UK system for individuals to document and report problems with any kind of public transportation system,” wrote Civic Commons executive director Andrew McLaughlin this morning. “MySociety has figured out how to route every kind of report to the responsible agency (or even person) — “the service works everywhere in Great Britain, our database has over 300,000 stops and routes for train, tube, tram, bus, coach and ferry.” Great design and interface. Congratulations, +Tom Steinberg and team!”
“We’ve never before launched a site that took so much work to build, or that contained so much data,” writes Steinberg at the MySociety blog, where he explained more about what it’s for. (The emphasis below is mine.)
FixMyTransport has two goals – one in your face, and the other more subtle.
The first goal, as the site’s name suggests, is to help people get common public transport problems resolved. We’re talking broken ticket machines, gates that should be open and stations without stair-free access. We’ll help by dramatically lowering the barrier to working out who’s responsible, and getting a problem report sent to them – a task that would have been impossible without the help of volunteers who gathered a huge number of operator email addresses for us. Consequently the service works everywhere in Great Britain, our database has over 300,000 stops and routes for train, tube, tram, bus, coach and ferry.
The second goal – the subtle one – is to see if it is possible to use the internet to coax non-activist, non-political people into their first taste of micro-activism. Whilst the site intentionally doesn’t contain any language about campaigning or democracy, we encourage and provide tools to facilitate the gathering of supporters, the emailing of local media, the posting of photos of problems, and the general application of pressure where it is needed. We also make problem reports and correspondence between operators and users public, which we have frequently seen create positive pressure when used on sister sites FixMyStreet and WhatDoTheyKnow.
I’m not saying it is impossible to hack brilliant things without piles of VC gold. But if you are going to hack something really, genuinely valuable in just a couple of weeks, and you want it to thrive and survive in the real Internet, you need to have an idea that is as simple as it is brilliant. Matthew Somerville’s accessible Traintimes fits into this category, as does FlyOnTime.us, E.ggtimer.com and doodle.ch. But ideas like this are super rare — they’re so simple and powerful that really polished sites can be built and sustained on volunteer-level time contributions. I salute the geniuses who gave us the four sites I just mentioned. They make me feel small and stupid.
If your civic hack idea is more complicated than this, then you should really go hunting for funding before you set about coding. Because the Internet is a savagely competitive place, and if your site isn’t pretty spanking, nobody is going to come except the robots and spammers.
To be clear — FixMyTransport is not an example of a super-simple genius idea. I wish it were. Rather it’s our response to the questions “What’s missing in the civic web?” and “What’s still too hard to get done online?”
The pitch for the hackathon includes a “green from the beginning” detail that may catch the eye of sustainable energy advocates:
The hack-a-thon will be located in the spacious new Graduate Research Center adjoining the School of International Service building, which is itself a certified LEED Gold marvel of green technology innovation. With a sustainable design and “cradle-to-cradle” philosophy for recycling and reusing building materials, participants will even power their devices with solar and wind offset power so their Apps for the Environment will be green from the first idea until the last line of code.
Come one, come all
The hackathon’s organizers emphasize that this event isn’t just about the District’s local civic coders: “Whether you’re a student at any school in computer science, journalism, a professional in the field, or just have an idea to share (which you can post here http://blog.epa.gov/data/ideasforappscomments/) please join us at the hack-a-thon!”
American University journalism professor David Johnson left a comment on the event page that expands that idea:
…even if you can’t code, you can have ideas. even if you don’t have ideas, you can help spread the word. even if you can’t come to DC or AU, you can join us on twitter, ustream, IRC, GitHub, and other online hangouts… we’ll be all over it. everyone can be a part of this. spread the word to campuses and dev shops. come hack with us.
If you’d like to learn more about Apps for the Environment (and hear a robust conversation about open data and apps contests!) watch the webinar and presentation embedded below.
This Thursday at 4 PM EST, the EPA is hosting a webinar for developers to hear more from the community about what the government can do to make data more usable by developers. (Heads up, government folks: Socrata’s open data study found progress but a long road ahead, with clear need for improvement: only 30 percent of developers surveyed said that government data was available, and of that, 50 percent was unusable.)
To those in media, government or commentariat who think that cloud computing or open data might be going away in federal government after the departure of federal CIO Vivek Kundra next month, Dave McClure offered a simple message today: these trends are “inevitable.”
Cloud computing, for instance, will “survive if we change federal CIOs,” he said. “It’s here, and it’s not going away.” McClure described cloud computing as a worldwide development in both business and government, where the economics and efficiencies created are “compelling.” The move to the cloud, for instance, is behind U.S. plans to close or consolidate some 800 data centers, including hundreds by the end of 2011.
Cloud computing was just one of five macro trends that McClure listed at this year’s FOSE Conference in Washington, D.C., one of the biggest annual government IT conferences. He described each of the trends as “inevitable.” Here’s the breakdown:
1) Cloud computing
The GSA is the “engine behind the administration’s ‘cloud-first’ strategy,” said McClure, lining up the procurement details for government to adopt it. He said that he’s seen “maturity” in this area in the past 18-24 months. Two years ago, the National Institute of Standards and Technology (NIST) was spending time at conferences and panels defining it. Now we have cloud deployments that are robust and scalable, said McClure, including infrastructure-as-a-service and email-as-a-service.
Government cloud deployments now include public-facing websites, storage, and disaster recovery, and are beginning to move into financial apps.
2) Collaboration and engagement
The cloud is teaching us that once we free data, make it accessible, and make it usable, it’s creating opportunities for effective collaboration with citizens, said McClure, noting that this trend is in its “early stages.”
3) Open data and big data
Data.gov has “treasure troves” of data that entrepreneurs and citizens are turning into hundreds of applications and innovations, said McClure. Inside of government, he said that access to data is creating a “thirst” for data mining and business intelligence that helps public servants work more efficiently.
4) Mobile
Mobile computing will be the next wave of innovation, said McClure, delivering value both to government and to citizens. Government is “entrenched in thinking about creation of data on websites or desktop PCs,” he said. That perspective is, in this context, dated. Most of the audience here has a smartphone, he pointed out, with more and more interactions occurring on the device at their hip. “That’s going to be the new platform,” a transition that’s “absolutely inevitable,” he said, “despite arguments about digital divide and broadband access.”
5) Security
As McClure noted, you have to include security at a government IT conference. The need for improved security on the Web, for critical infrastructure, on email and wherever else government has an exposed attack surface is clear to all observers.
The federal government is hosting a hackathon focused on unlocking the value of the newly opened click data from its URL shortener. Organizers hope the developer community can create apps that provide meaningful information from the online audience’s activity. Later this month, USA.gov will host a nationwide hack day, inviting software developers, entrepreneurs, and citizens to engage with the data produced by 1.USA.gov, its URL shortener.
The USA.gov hackathon fits into a larger open government zeitgeist. Simply put, if you enjoy building applications that improve the lives of others, there may never have been a better time to be alive. Whether it’s rethinking transportation or convening for a datacamp, every month there are new hackathons, challenges, apps contests and code-a-thons to participate in, contributing time and effort to the benefit of others. This July is no exception. Last Saturday, Google Chicago hosted a hackathon to encourage people to work on Apps for Metro Chicago. On the Saturday after OSCON, an API Hackday in Portland, Oregon, will offer “an all-day coding fest focused on building apps and mashups.” If you’re free and interested in participating in a new kind of public service, on July 29th, hack days will be hosted by USA.gov in Washington, D.C., Measured Voice in San Diego, bitly* in New York City, and SimpleGeo in San Francisco. If New Yorkers still have some fire in their bellies to collaborate with their local government, the city of New York is hosting its first-ever hackathon to re-imagine NYC.gov on July 30-31.
How URL shorteners and 1.USA.gov work
To understand why this particular set of open data from USA.gov is interesting, however, you have to know a bit more about USA.gov and how social media has changed information sharing online. A URL is the Web address, like, say, oreilly.com, that a citizen types into a Web browser to go to a site. Many URLs are long, which makes sharing them on Twitter or other mobile platforms awkward. As a result, many people share shortened versions. (O’Reilly Media links are shortened to oreil.ly, for instance.) The challenge for users is that, unless they use one of several tools to preview the actual destination behind a shortened link, they might be led astray or exposed to malicious code at the original URL. In other words, this is about being able to trust a link.
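One such preview tool is simple to build yourself: a HEAD request to the short URL returns a Location header naming the real destination, without ever following the redirect. A minimal sketch (the helper names are my own):

```python
import http.client
import urllib.parse

def hop_request(short_url):
    """Return the (host, path) pair for the HEAD request that reveals
    where a short link redirects."""
    parts = urllib.parse.urlsplit(short_url)
    return parts.netloc, parts.path or "/"

def expand(short_url):
    """Issue the HEAD request and return the Location header (one hop)."""
    host, path = hop_request(short_url)
    scheme = urllib.parse.urlsplit(short_url).scheme
    conn_cls = (http.client.HTTPSConnection if scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(host, timeout=10)
    try:
        conn.request("HEAD", path)
        # The shortener answers with a 301/302 whose Location header is
        # the long URL the short link stands for.
        return conn.getresponse().getheader("Location")
    finally:
        conn.close()
```

A shortener may chain redirects, so a thorough checker would call expand() repeatedly until the Location header stops changing.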
Last year, the United States General Services Administration (GSA) launched a Go.USA.gov URL shortener at the Gov 2.0 Expo in Washington, D.C. Whenever a government employee used Bit.ly (or any service that uses Bit.ly to shorten URLs, like Tweetdeck) to shorten a .gov or .mil URL, the link was converted to a short go.USA.gov URL. That meant that whenever a citizen saw a go.usa.gov short URL on a social network, she knew the content came from an official government source.
For more on how Go.USA.gov URLs work, watch Michele Chronister’s presentation from last year’s Gov 2.0 Expo, below. Chronister is a presidential management fellow and Web content manager for USA.gov in the Office of Citizen Services and Innovative Technologies at the GSA.
This March, the GSA added a 1.USA.gov URL shortener for civilian use. “The whole idea is to improve people’s experience when dealing with government information online,” explained Jed Sundwall, a contractor for USA.gov and GobiernoUSA.gov, via email. “We keep USA.gov in the domain for usability reasons. It’s crystal clear, worldwide, that 1.USA.gov URLs point to trustworthy government information.”
According to Sundwall, ABC senior White House correspondent Jake Tapper was the first to use it when he tweeted out a link to a PDF containing new unemployment information at the Bureau of Labor Statistics: “For those asking follow-ups on unemployment, here’s the BLS link http://1.usa.gov/XUtpL“
Months later, Tapper has been followed by thousands of other people who have used the 1.USA.gov URL shortener simply by using the tools they already knew. “The beauty is that Jake used it without knowing he was using it,” writes Sundwall. “We’re trying to make it easy for anyone to identify .gov information as it’s being shared online.”
That easy identification is quite helpful given the increasing pace of news and information sharing on the Web. “Trust is a valuable thing online, and being able to know that the information you’re receiving is reliable and accurate is difficult yet essential — especially so for government websites, where people go for critical information, like health services and public safety,” wrote Abhi Nemani, director of strategy and communications for Code for America.
Code for America is “excited to be partnering with them to help bring together passionate developers, designers, and really anyone interested to see what we can hack together with the data,” wrote Nemani. The 1.USA.gov hackathon will tap into “a huge and growing resource for new and really interesting apps,” he wrote at the Code for America blog. “See, this data gives a lens into how people are interacting with government, online; an increasingly important lens as citizen/government interaction moves from the front desk or the phone line to the web browser.”
To learn a bit(ly) more about the hackathon and its goals, I conducted an email interview with Michele Chronister and Sundwall.
What does the GSA hope to achieve with this hackathon? How can open data help the agency achieve the missions taxpayers expect their dollars to be applied towards?
Chronister: We hope to encourage software developers, entrepreneurs, and curious citizens to engage with the data produced by 1.USA.gov. 1.USA.gov data provides real-time insights into the government content people are sharing online and we know hack day participants will surprise us with creative new uses for the data. We anticipate that what’s produced will benefit the government and the public. Making this data public expands GSA’s commitment to open, participatory and transparent government.
What hacks can come of this that aren’t simply visualizing the most popular content being shared using 1.USA.gov?
Sundwall: First of all, the issue of popular content is an important one. Before this data set, no one had such a broad view of how government information is being viewed online. Getting a view of what’s popular across government in real time is a big deal, but a big list of popular URLs isn’t killer per se.
The data from 1.USA.gov includes a lot of data beyond just clicks, including clickers’ browser version (firefox v ie, mobile v desktop, etc) and IP-derived geo data. It’s also real time. This allows people to look at the data across a number of different dimensions to get actionable meaning out of it. A few ideas:
1. Geo data. The geo data included in the 1.USA.gov feed is derived from IP addresses, which makes it intentionally imprecise for privacy reasons (we don’t show the IP address of each click), but precise enough to spot location-based trends.
One of the reasons we brought SimpleGeo on as a collaborator for the hack day is because they’re really good at making location data easy to work with. Their Context product makes it easy to filter clicks through a number of geographic boundaries including legislative districts. They also make it easy to mash the data up with Census demographic data.
We want to let journalists, analysts, campaign strategists, and other researchers know that 1.USA.gov data is a powerful tool to spot trends in the areas where they work. I gave a demo of 1.USA.gov to Richard Boly at the State Dept soon after we launched 1.USA.gov and thought it could be a tool for country desk officers to spot trends in their countries. Hint: if you’re coming to the hack day, think about building something like this.
We hacked together a quick video showing click data mapped out across the US for most of June: red dots are non-mobile clicks and green are mobile. It’s a blunt visualization, but it’s fascinating to watch the clicks pulse across the country, from the east to west in the morning, and then from red to green when people leave their desks and get on their phones.
We could enhance visualizations like this to see if there are trends in how particular kinds of information are shared throughout cities and across the country. I wouldn’t be surprised if clicks on certain links from certain agencies turn out to be leading indicators—perhaps municipal leaders should pay attention to spikes in clicks on hud.gov links.
2. Browser data. We log, on average, about 56,000 clicks on 1.USA.gov links per day. It’s not a ton of data like Google, but the 1.USA.gov dataset provides a really nice sample of user behavior—particularly social media users because the short URLs are most frequently shared and clicked via Twitter and FB.
I’m hoping 1.USA.gov data can be useful to people tracking trends in browser adoption and trends in mobile usage. The data science team at bitly is already doing this kind of analysis with their much larger set of click data, but we’re really excited to give a slice of that data out to researchers for free.
3. Contextual data. Each link points to a file that is likely to include some amount of machine readable content such as an HTML page title, meta description, body content, etc. Many links, if not most, are shared via Twitter. Both the content of the link’s file and the content of the tweet that included the link when it was shared provide insight into not just what links people are sharing, but what topics people are talking about.
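The kind of slicing Sundwall describes starts with parsing the feed. The sketch below assumes a bitly-style stream of one small JSON object per click; the terse field names used here (“a” for user agent, “u” for the long URL) are assumptions for illustration, so check the live feed’s documentation for the real schema.

```python
import json
from collections import Counter

# Two fabricated sample clicks in the assumed one-JSON-object-per-line format.
SAMPLE = """\
{"a": "Mozilla/5.0 (iPhone; CPU iPhone OS 4_3)", "u": "http://www.nasa.gov/news"}
{"a": "Mozilla/5.0 (Windows NT 6.1)", "u": "http://www.bls.gov/cpi"}
"""

# Crude user-agent heuristics, good enough for a rough mobile/desktop split.
MOBILE_HINTS = ("iPhone", "iPad", "Android", "Mobile")

def summarize(lines):
    """Tally mobile vs. desktop clicks and count clicks per .gov host."""
    device, hosts = Counter(), Counter()
    for line in lines.splitlines():
        if not line.strip():
            continue
        click = json.loads(line)
        ua = click.get("a", "")
        device["mobile" if any(h in ua for h in MOBILE_HINTS) else "desktop"] += 1
        url = click.get("u", "")
        if "//" in url:
            hosts[url.split("/")[2]] += 1  # hostname portion of the long URL
    return device, hosts

device, hosts = summarize(SAMPLE)
```

The same loop, pointed at the live stream instead of a sample string, is the seed of both the browser-trend and geo-trend analyses described above.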
What are some of the early successes — and failures — that inform how the GSA is approaching its open data initiatives? And how will it all relate to citizen engagement?
Chronister: Data.gov has successfully built a community of people interested in government data, and we hope to expand on that by making USA.gov’s data more available. One part of this is releasing the 1.USA.gov click data to the public. We also provide XML for all of our frequently asked questions on answers.usa.gov and a product recall API. These resources can be found at www.usa.gov/About/developer_resources/developers.shtml.
We know that raw government data is not interesting or useful to everyone, which is why we are trying to engage specific communities with the hack day. Hopefully any tools created at the hack day will help engage a larger audience and show what’s possible when government opens its data and makes it available.
What are some useful examples of “infohacks” where someone can easily find useful information already?
Sundwall: USA.gov actually used this approach to surface useful government information from 1.USA.gov (and Go.USA.gov) by instructing people to search for USA.gov + tsunami on Twitter after the Japan earthquake in early March — this was the best way for people to find the best government information about the tsunami at the time. It allowed us to crowdsource the best government resources about the tsunami by relying on what everyone on Twitter was already finding and sharing. You won’t see this now, but at the time, the search results featured a few “top tweets” pointing to useful government information. 1.USA.gov let us know it was authoritative even though it was being shared from non-govt Twitter accounts like @BreakingNews.
This Twitter search trick is one of my favorite hacks. I subscribe to RSS feeds of USA.gov + awesome and USA.gov + cool and find great crowdsourced govt information every day. Just last week, this tweet inspired this blog post, which ended up being the most popular post on the USA.gov blog ever.
How else could this bit.ly data be made more useful to citizens – or government?
Sundwall: Researchers could use this Twitter search method to be notified of new information by subscribing to searches like USA.gov + cancer, USA.gov + human rights, USA.gov + Afghanistan, etc. I sometimes get a kick out of searching “USA.gov + wtf.” I’m a nerd.
What’s the incentive for developers to donate their time and skill to hacking on this data?
Sundwall: This is the best question. I hope some of the ideas I’ve presented above give an idea of how powerful this dataset is. This is the kind of information that organizations usually regard as proprietary because it gives them intelligence that they don’t want their competitors to have. I’m really really proud to work with the folks at USA.gov because opening up this dataset reveals a deep understanding of how open data can work.
USA.gov wants to help people by helping them find the government information they need. This data will allow other people to join them in this endeavor. As Tim says, “Create more value than you capture.” I hope that people will recognize the value in this data and create tools, apps, more efficient research methods, and perhaps even businesses based on it. I’m certain this data will prove to be valuable to many people who will discover applications of it that we haven’t imagined yet.
*Editor’s Note: bit.ly is funded by O’Reilly AlphaTech Ventures.
Civic developers at Code for America created a Web application in honor of this year’s Independence Day that reflects a number of patriotic values: creativity, technical expertise and interest in the public discourse of their fellow citizens. Flag.CodeForAmerica.org aggregates Twitter avatars from users who tweet using the hashtag #July4th and mashes them up into a mosaic representing the Stars and Stripes. The first flag of the United States of America had a star and a stripe for each state. This flag has a tile for each person’s account.
“I have had the idea to do this for a while,” writes Abhi Nemani, director of strategy and communication at Code for America, “soon after last year’s binary art campaign, but had to wait until we got the code infrastructure in place to be able to execute it at scale.”
Fortunately, Nemani said, the Mozilla Foundation worked with a development shop, Quodies, to create a similar mosaic when they launched Firefox 4 – check out their Firefox Tweet Machine.
“Given that it’s Mozilla, the code was open source of course,” said Nemani. “We rewired it some to get it working for us and added in some documentation so it should be easier for the next deployment.” Tyler Stadler was the development lead and Karla Macedo the designer.
If you want to check out the code for Twitter Collage, you can find it on GitHub. There’s no cap on #July4th responses, so tweet away.
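At its core, the mosaic is simple bookkeeping: each incoming avatar takes the next open tile in a flag-shaped grid. The dimensions and function below are illustrative only, not taken from the actual Twitter Collage code.

```python
# Illustrative grid: 13 rows for the 13 stripes, 25 tiles per row.
ROWS, COLS = 13, 25

def tile_for(index):
    """Map the nth incoming avatar to a (row, col) tile, filling each row
    left to right and wrapping around once the grid is full."""
    return divmod(index % (ROWS * COLS), COLS)

# The first few avatars land along the top edge of the flag.
positions = [tile_for(i) for i in range(3)]
```

With a mapping like this in place, rendering is just compositing each avatar image at its tile’s pixel offset, tinted red, white, or blue depending on where the tile falls in the flag design.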
The avatar mosaic is a 21st century update that captures some of the diversity and unity of that first flag by featuring some of the many voices that now can be heard on the public square of our time, the Internet. Not all of the tweets captured are positive. Some include strident political messages, divisive rhetoric or commercial promotions. It’s the public, in all of its uncensored, unvarnished, raw glory. The republic that the founding fathers fought and died for included the freedom of speech for its citizens. Over two centuries later, we’re seeing it today, coalesced around a national holiday.
“We hold these tweets to be self evident, that all humans are created equal…”
“This is another example of the new Republican majority using digital tools to better engage with and listen to the American people,” said Speaker Boehner in a prepared statement. “We’re committed to keeping our pledge to lead a House that is more open and that gives Americans a real-time voice in their government.”
Adopting the same low cost Voice over IP tools for videoconferencing that are in use all around the world makes sense on many levels, despite security concerns. Congressmen and their staff will be able to easily communicate with one another at a lower cost now. Daniel Lungren, chairman of House Administration, offered more context for the upgrade to VoIP in a “Dear Colleague” letter this week:
Improving constituent communications and increasing transparency has been a top priority for me as Chairman of House Administration and a member of the House Technology Operations Team. That’s why I am pleased to announce that the House’s Public Wi-Fi network has been enabled to allow Members and staff to conduct Skype and ooVoo video teleconference (VTC) calls.
To maintain the necessary level of IT security within the House network, the House has negotiated modified license agreements with Skype and ooVoo that will require Members, Officers, Committee Chairs, Officials and staff to accept House-specific agreements that comply with House Rules and maximize protection for Members and staff. Detailed requirements on how to comply with these agreements have been posted to HouseNet at http://housenet.house.gov/keywords/VTC. Please note that Skype users will be limited to conducting VTC sessions on the House’s public Wi-Fi to minimize security risks associated with peer-to-peer networking.
During a time when Congress must do more with less, utilizing low-cost, real-time communication tools is an effective way to inform and solicit feedback from your constituents. In addition to Skype and ooVoo, we are searching for additional means to help enhance constituent communications.
“Citizen-to-legislator” communications using VoIP will hold some challenges. Skype and ooVoo both allow conference calls among multiple parties, but neither is ideal for one-to-many communications without some tweaking. If a representative’s staff can set up a projector and sound system, however, we may well see new kinds of virtual town halls spring up, whether someone calls back from Washington or from the campaign trail.
Less clear is how constituent queueing might be handled. If hundreds of citizens, activists or lobbyists are all trying to Skype a Congressman, how will priority be assigned? How will identity be handled, in terms of determining constituents from a home district? As I wrote this post, two other questions posed to the Speaker’s office also remained unanswered: will video chats be archived and, if so, how? And will Skype’s file transfer capabilities be allowed?
On the latter count, given the difficult past relationship of the House and P2P filesharing software, learning that file sharing capabilities were disabled would be in line with expectations. UPDATE: Salley Wood from the House Administrative Committee confirms that the current configuration does include file sharing. “Today’s announcement is simply that lawmakers can now take advantage of these platforms using official resources,” she related via email.
What is clear is that one more domino in the adoption of Web 2.0 tools in government has fallen. What happens next is up for debate — except this time, the conversations will span hundreds of new Web connections. This will be, literally, fun to watch.
UPDATE: As Nick Judd blogs over at techPresident, the Hill was the first to report that the House enabled the use of Skype for members, basing its reporting on the “Dear Colleague” letter above. There’s no shortage of detail in the Hill’s piece, nor good linkage from Judd. So, you know, go read them.