Maybe it's end-of-year nostalgia, or maybe it's all the employment updates in my LinkedIn feed plus awareness of changes at my own company. Either way, I got to thinking this week about my initial explorations into the community of government transparency and digital reform known as "Gov 2.0" or "open government."
I got involved in Gov 2.0 in 2008 as a municipal government employee after wondering whether some of the digital engagement strategies of the Barack Obama presidential campaign could be applied to civic engagement in a more formal context, and found a small but welcoming community first on LinkedIn, then on GovLoop.com and Twitter. I went on to become group manager for the LinkedIn Government 2.0 community, to start the Gov 2.0 Radio podcast, and to make a (highly unsuccessful) run for U.S. Congress. During that campaign, I met Jim Gilliam, a civic technology entrepreneur, and, in 2011, I joined Jim and his co-founder Jesse Haff at NationBuilder.
NationBuilder, a community organizing system, now has 80 employees and provides a web platform for thousands of politicians and elected officials, and a growing number of cities and agencies. I recently met a new intern on our data team who has quite a reputation in the local open government community (NationBuilder is based in downtown Los Angeles), and when she added me to a couple of her community Twitter lists, it struck me that I know fewer of the names and faces of today's Gov 2.0.
For a more up-to-date view of the Gov 2.0 community, check out these Twitter lists:
There's been a big shift in how people use the web that caught up with Healthcare.gov and sister sites yesterday. You can build the most beautiful and "scalable" website for web visits, make it open source, put the code up on GitHub, talk about how innovative it is, then watch it crumble under the server strain of people trying to actually do something through your site.
Healthcare.gov's real challenge wasn't to build an alternative to a commercial CMS (content management system), it was to build an application that can handle event-oriented human behavior - for that you need the best systems engineering, not "10,000 authenticated users through GitHub" for your content delivery, as one of the Healthcare.gov contractors highlighted in this Atlantic profile of the project by Alex Howard.
Before the application process bogged down yesterday, Healthcare.gov got lots of nice gov tech insider buzz for its open source nature. But the project still had contractors on board, and based on how the service behaved on opening day of the Affordable Care Act, it could have stood a lot more testing of what people actually wanted to do with it. Kind of like Mitt Romney's Orca system on election day last November.
The Healthcare.gov site loaded fine, but trying to apply through it was kinda like buying first-time Comic-Con badges online.
Open source has changed the technology landscape for the better, underpinning many of our favorite startups. However, simply invoking it like a protection spell is no replacement for the architectural skill and planning required to pull off the systems needed for a successful Healthcare.gov launch. Health and Human Services, which managed the project, needed a little more "Puppet vs. Salt" and a little less "open" in its vernacular.
Adapting to a web where people are participants, not viewers, is the lesson we're all learning. Web infrastructure needs to support people, not publishing.
The reaction to failures of Healthcare.gov under heavy load won't work if the discussion is about how other services fail - it has to be about building infrastructure that's designed for peak interactivity and not for views.
Choice quotes from the Atlantic profile:
Bryan Sivak, CTO at Health and Human Services: "Instead of [running] farms of application servers to handle massive load, you're basically slimming down to two. ... The way it's being built matters."
Dave Cole from HHS contractor Development Seed: "You're just talking about content. There just needs to be one server. We're going to have two, with one for backup. That's a deduction of 30 servers."
Maybe there was a lot more infrastructure work going on behind the scenes, but the project leads' obsessive focus on the content framework is telling.
Healthcare.gov's scaling challenge was never about delivering content like a really popular website, it was the peak activity challenge that Twitter faces on a regular basis. Taking interaction-based scaling challenges seriously is why Twitter is stable now and wasn't in 2009 - those are the issues HHS should have been talking about.
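The distinction matters in code, not just rhetoric. Here's a minimal sketch (hypothetical names, not the actual Healthcare.gov stack) of why caching rescues content delivery but does nothing for transactional signups:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def render_page(path):
    # Expensive render, but identical for every visitor,
    # so one computation can serve millions of views.
    return f"<html>{path}</html>"

def process_application(user_id):
    # Every applicant's submission is unique state that must be
    # validated and persisted -- no cache can absorb this load.
    return {"user": user_id, "status": "submitted"}

# 1,000 views of the same page -> one actual render
for _ in range(1000):
    render_page("/plans")
print(render_page.cache_info().misses)  # 1: the cache absorbed the rest

# 1,000 applications -> 1,000 units of real back-end work
applications = [process_application(i) for i in range(1000)]
print(len(applications))  # 1000
```

A "brochure site" lives almost entirely in the first function; an application pipeline lives in the second, and that's where the systems engineering has to happen.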
A few updates after a bit of Twitter fun on these issues today:
I'm not faulting Alex's reporting in any way here - I believe that if the HHS team had really been focused on the infrastructure for supporting a signup rush at the time of the Atlantic article, that dedication would have shown up in the story. The omission of that kind of discussion is telling (read the article - the project team seems to take an almost flippant approach to back-end server architecture). I also googled around looking for commentary on that front from earlier in the life of the project.
I didn't do a detailed investigation; this is an opinion blog piece, not investigative journalism. As I said above, it's quite possible there was more going on - but the fact that the site had so much persistent trouble as an actual application (while it functioned fine as what we call in the biz a "brochure site") means whatever was done fell dangerously short.
Finally, if an important initiative like Healthcare.gov is going to get 2.8 million views in a day, I want everyone who wants to apply through that site to do so smoothly. My ding on "open government buzzwords" is that it's really easy to do "innovative" things with government technology and get headlines, without actually delivering for constituents.
Another update from Twitter conversation:
Alex speculates the devs and designers who built the content framework aren't to blame here.
Fair enough. I think it's fairly clear from the above that I blamed HHS and a culture that treats web properties as publishing applications rather than designing them for interaction. It's really time to stop talking about a "front-end" and a "back-end" for any kind of website. If it doesn't scale for interaction, it doesn't scale. Twitter's infrastructure challenge isn't displaying millions of tweets, it's keeping all of them threaded in real-time.
Open source content frameworks are nice (hey, Twitter released Bootstrap!), but HHS separated that issue from the kind of services needed to effectively scale the application process. It's like building a really shiny muscle car and then giving it a weak 2-liter engine. Fully integrated applications with content delivery and scalable interaction design are really, really hard. And that's where buzzwords fall short.
Sept. 7 update:
On Saturday, I wrote about these issues on GovFresh, "The openwashing of Healthcare.gov," and cited a Reuters article that laid responsibility for the project on CGI Inc., a giant federal contractor.
Today, the Wall Street Journal quoted an HHS spokeswoman and IT experts regarding flaws in the system. The article mentions CGI and also says Experian had a contract around identity verification. Based on the analyses I've read, it seems like there could be timeouts or critical delays between security question submittal and verification, which would indicate architecture issues again, not an Experian issue per se.
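To illustrate the failure mode I'm speculating about (this is a hypothetical sketch, not the actual Healthcare.gov or Experian architecture): if identity verification is a synchronous call to an external service, every slow response ties up a server worker, and applicants queue behind the round trip even while content pages serve instantly.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def verify_identity(user_id, latency=0.05):
    # Stand-in for the external verification round trip.
    time.sleep(latency)
    return True

# Only a handful of workers: under load, each verification blocks a
# worker for the full round trip, so throughput is capped by the
# slowest dependency, not by the web servers themselves.
start = time.time()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(verify_identity, range(40)))
elapsed = time.time() - start
print(f"40 verifications, 4 workers: {elapsed:.2f}s")  # roughly 0.5s
```

The point is that a bottleneck like this looks like "the verification vendor is slow" from the outside, when the real issue is an architecture that blocks on it.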
Shannon Spanhake recently joined the City and County of San Francisco to focus on building an OpenGov program within the Dept. of Technology. She aims to strengthen and enable partnerships between public, private, and people sectors to identify and solve civic challenges. Previously, she held a dual appointment as a Sr. Researcher at the Center for Development Finance and a Post-doc at the California Institute for Telecom and IT. She has a patent pending for a citizen-powered wireless sensor networking technology, and she has co-founded cultural spaces in Mexico and India that explore community and urban dynamics. @shannonspanhake | Shannoninsf.blogspot.com
Jake Levitas is a designer, consultant, and community activist based in San Francisco. He currently serves as Research Director at Gray Area Foundation for the Arts, a leading Bay Area nonprofit dedicated to fostering creative applications of technology, data visualization, and digital art. He has led and worked on a number of creative technology projects, and managed GAFFTA's four-month Summer of Smart initiative that brought together government and the creative class to build solutions to city issues using open data. His multidisciplinary background includes several years of experience in urban planning, mapping, information design, and sustainability consulting, as well as work in graphic design, audio production, and architecture. @civicinnovation
Sherry Willhoite joined Granicus in November of 2010 and is VP of Product Management. She brings with her over 12 years of experience in industry-leading consumer Internet companies including Yahoo!, Friendster and Spark Networks. Sherry has a track record of building highly trafficked online communities, monetizing businesses through advertising and direct subscription memberships, and optimizing user experience for key business metrics. She brings a unique combination of in-the-trenches product management, high-level product strategy and analytical business optimization.
Loren L. Hart has more than 35 years of experience as a computer scientist and systems architect. He received his education in computer science at U.C. Berkeley and has since worked for Schlumberger, Sun Microsystems’ JavaSoft, Nanobiz, Data Ace, and Verisign before joining Theranos in 2006. He has garnered extensive experience in Unix systems and kernels, having worked on them since 1980, and has also contributed code to Linux. His work on Java-based commerce and security has earned a patent for one-time password tokens, and he has designed and implemented much of the server-side security systems used in Theranos products.
Javier Muniz draws on his broad knowledge of networking and application development to provide direction on product strategy and design for Granicus’ product development team as Chief Technology Officer. Javier, Granicus' co-founder, enjoys leveraging new technologies to solve pervasive business issues for Granicus and its customers. In his role, he’s able to act as a visionary and technical expert to ensure products meet the desired goal for the company and our valued customers. Javier began his career at Sun Microsystems designing and managing remote access components of the Sun global network infrastructure. Later, he went on to WebTV Networks where he designed and developed applications used by the Network Operations Center to manage over 600 nodes that supported over 1 million active WebTV subscribers. @javicmuniz
This is the briefing document I've provided for prospective lead legislative sponsors (see questions and comments below):
Along with a number of other open government advocates, I've launched a campaign to put a definition of "open data online" into California and San Francisco law. The issue is that often when documents and data are published online, they cannot be accessed or used in a meaningful fashion because they cannot be searched, indexed by Google, or combined in a meaningful way with other documents for analysis. I want to tackle this not by mandating that certain documents and data be published online, but simply by creating a reference standard so that when new mandates pass, or new documents are published online as a matter of course under existing law or regular business, they are in accessible formats.
This has the benefit of making things easier for people who use screen readers, for developers who want to use public data to build applications, and for transparency advocates, and it is simply good policy. Publishing data in formats that can't be searched, compared to other documents, or reused in a meaningful way is as useless as keeping it tucked away in an obscure file cabinet. Publishing in accessible formats online is as simple as educating employees in how to properly save and store documents for online publication using the same software they already have on their computers. In an ironic demonstration of the current problem, San Francisco's current open data law was published by the Board of Supervisors as an unsearchable PDF.
- Javier Muniz, CTO and co-founder, Granicus (based in SoMa and one of the greatest open gov tech company success stories in the U.S.)
- Steve Ressler, founder, GovLoop
- Rep. Jason Murphey, Chairman of the House Government Modernization Committee, Oklahoma
- Scott Primeau, OpenColorado
- Luke Fretwell, founder and publisher, GovFresh
- and many more who can be viewed online - http://www.wiredtoshare.com/structured_open_data_campaign
Comments meant for official consideration should be directed to Alicia Lewis, alicia.lewis [at] sen.ca.gov
Open data in San Francisco, the state of California, and throughout much of the U.S. and the world remains hobbled by a lack of legal definition. San Francisco's own open data law, for example, is posted online by the Board of Supervisors as a non-searchable PDF. On December 10-11, at the winter CityCampSF Hackathon, Gov 2.0 advocates will publicly launch an advocacy campaign to institute an open data standard in San Francisco municipal and California state law. The primary goal of this advocacy will be to achieve a clear and reasonable definition of open data for all materials required by law to be published online.
Please join us in endorsing this advocacy campaign, and encourage your friends and legislators to sign on as well.
For another definition of open data online that we will consider, see the CityCamp model Open Government directive, which describes open data as being published online in an "open format that can be retrieved, downloaded, indexed, sorted, searched, and reused by commonly used Web search applications and commonly used software."
This legislation should also encompass the goals of increased transparency in responses to SF Sunshine Ordinance requests and California Public Records Act requests - documents released in an electronic format after implementation of this ordinance would have to follow its standards of accessibility.
Machine-readability: Data should be published in structured formats easily processed by machines/software.
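A quick sketch of what machine-readability buys you in practice (the dataset here is hypothetical): a structured format like CSV can be parsed, searched, and re-published by any common tool, where the same figures locked in a scanned PDF cannot.

```python
import csv
import io
import json

# Hypothetical open-data records in a structured, machine-readable format.
raw_csv = """department,budget_2013,budget_2014
Public Works,1200000,1350000
Parks,400000,425000
"""

rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Because the data is structured, it can be searched and filtered
# programmatically...
parks = next(r for r in rows if r["department"] == "Parks")
print(parks["budget_2014"])  # 425000

# ...or re-published as JSON for developers building applications.
print(json.dumps(rows[0]))
```

None of this is possible when the same table is published as an image inside a PDF, which is exactly the problem the reference standard is meant to solve.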
And much of the hand-wringing over official social media use is about the public - what if they say something we don't like! Many of the agencies using shiny tools like Facebook and Twitter don't even allow comments on their Web sites, even sites they call "blogs."
Fear and a failure to engage simply reinforce citizen concerns that government doesn't listen and doesn't care.
According to an April Pew study on trust in government, "By almost every conceivable measure Americans are less positive and more critical of government these days."
I, and, I hope, thousands of other Government 2.0 advocates, have not spent the last two years building a movement to have it end up as "The System 2.0."
Some may argue that government needs to be on social media channels because of the large audiences. However, I cannot state more emphatically - if you're considering a social media channel, but don't want to provide citizen (customer) service and two-way engagement on that platform, you shouldn't bother.
Using new media channels for one-way broadcasts and propaganda will only further alienate the people we serve. There are plenty of agencies using social media to engage and build trust. Join them, or don't bother.
Pew: Distrust, Discontent, Anger and Partisan Rancor - The People and Their Government
Malamud is not a lawyer, but he's met plenty - allies and adversaries - in his time as the nation's "rogue archivist." If you want open government, Malamud's your go-to guy. Intense and lightly sweating, at 9 a.m. he was decorating tables with postcards highlighting one of Law.gov's foundational elements, a state-by-state national inventory of legal materials; after the event, he broke down the space himself. Soon he'll be in Chicago and DC before returning home to the Bay Area and wrapping up a project report. He exudes a revolutionary zeal and the steady confidence of a veteran of many open government and privacy skirmishes.
Wednesday's series of panelists balanced open data dreams with hard truths about privacy in the globalized infoweb. Bob Berring, a UC Berkeley law professor, summed up the core issue: Carl is working on a 10-year-old's question: "Government has laws. We have to obey those laws. Where are they?"
Twitter in-house counsel Alexander Macgillivray talked about the difficulty for legal staffs at small companies to afford basic research because of high Westlaw and Lexis fees - fees that units of government pay as well for access to legal documents.
Malamud believes that the law is one area that the disintermediating promise of the Internet has barely touched, and he brought in friend O'Reilly for a lunchtime discussion with California Secretary of State Debra Bowen. "What are we missing as a society because we are denied access to what is essentially the open source of our democracy?" O'Reilly asked.
A recurring theme was the problem of authentication of legal materials online, and the implied authority of the two major vendors. Erika Wayne, a Stanford law librarian, asked if anyone had seen an "informational only" disclaimer - common on web legal materials - on a physical book.
Chris Hoofnagle, a privacy researcher and UC Berkeley law professor, also had a stark warning about the need to protect individual privacy as advocates seek to put more government information online. He argued that believers in "Big Brother" powers for the government - "I'm serious" - will use the language of the transparency movement to accomplish their goal of a surveillance society.
Despite the serious mission and very real challenges, the promising theme of open data, Law 2.0 mashups and lowered barriers to legal knowledge was not lost. Said Macgillivray: imagine a statute with its own Twitter account, tweeting its revisions.