Government Technology
By Bill Schrier: Making technology work for a city government.

Data Data Everywhere

March 2, 2010 By Bill Schrier

Seattle just became the latest city to start posting its government data on the Internet in an open format. Open data publishing may very well transform not just government, but democracy as well.

Data.seattle.gov has been live for a couple of months but was just officially announced this past Thursday, February 25th.

An interesting initiative, but what implication does it have for governing and government?

Making government transparent is not new - it has actually been going on since the first government websites went live in the mid-1990s. Most governments have a wide variety of data posted online. But in many cases it is hard to find or get in bulk. Constituents can search for individual building permits or maps or police reports. But only in the past 18 months have they been able to download whole datasets of such information in a usable format from online sites.

By "whole datasets" I mean, for example, perhaps almost every 911 call which occurred in San Francisco during the month of December, 2009, or every restaurant inspection in the entire City of Chicago, or all the building permits issued anywhere in the District of Columbia.

Government openness and transparency really found its legs with President Obama's declaration, on his first day in office, that he would run an open and transparent government. Many large cities now have open data websites. San Francisco's datasf.org is one of the most comprehensive and best, but Chicago, New York and Washington, D.C. have similar sites in operation. Cook County, Illinois and the State of Utah, among many others, put their "checkbooks" online.

The open data trend hasn't really reached a lot of smaller counties, cities and states just yet, but it will. For one thing, commercial services such as Socrata (www.socrata.com), which powers the City of Seattle's data.seattle.gov and many federal websites, make it relatively cheap and easy for governments to post their data. (Socrata famously hosts the White House visitor log, which has received 400,000 views.)

But is putting data in bulk, online, anything more than a fad?

I believe it is the tip of a very serious explosion: a new version of democracy. Until now, governments' use of the Internet has paralleled use in the private sector, although generally lagging two to three years. The private sector is driven by competition and is less risk averse than those of us who work with taxpayer dollars.

Perhaps the first iteration of government presence on the Internet/web was simply putting information online: for example, how to apply for a building permit, or how to report problems with streets.

The second version of online government is transactions - that is, actually doing some business online, such as paying a utility bill or a parking ticket.

The third wave of online work expands that information to include bulk downloads or easy, machine-readable querying of data, through data.seattle.gov and the similar sites listed above. This makes possible fascinating applications such as Stumble Safely, or CleanScores, which lists the health inspection results for restaurants in San Francisco. An explosion of privately developed applications is starting to occur based on this open data. In this wave of innovation, government also diverges significantly from the private sector: few private businesses will want to place large amounts of data collected at their own expense in the public domain for anyone to see and use.

A fourth wave of online interaction is now starting to appear, typified by the site SeeClickFix, where constituents can not only report issues online (using a map-based interface, in SeeClickFix's case) but also see what others have reported and even rank the importance of the issues which have been reported.

A fifth wave is bound to occur, as governments expose their internal processes to public scrutiny, in the same fashion FedEx has done for package shipments or banks have done for loan processing. In this iteration, governments will not only accept a report of a problem or a need, but will actually allow citizens to track the problem's resolution online. A citizen can report a broken streetlight, see when it is acknowledged or logged, see when it is scheduled for work, know when the crew is dispatched, see when the problem is fixed, and then provide feedback on the timeliness and quality of the work. This will really make government accountable, as we'll have to streamline our business processes and expose them to scrutiny, along with the data about how government operates.

But yet another wave of citizen-to-government interaction is occurring as well. In this iteration, data will be posted online; people will write applications and analyze it, and then use it to create and inform public policy options for elected officials to consider.

For example, a city might acquire a building, such as a school, which is no longer needed. How should the government use it? Should it be torn down and the land sold to commercial developers? Should it be torn down and used for a park (and what kind of park - a swimming pool, a grassy knoll, a children's playground)? Should it be converted into a community center, or housing, or offices for non-profit organizations?

Answering these questions requires a lot of data and analysis. How many kids live nearby, and what is the neighborhood crime rate? Are there already lots of parks and playgrounds and pools nearby? Are there a lot of seniors or immigrants or people with special needs? In the past, government employees would collect the data, crunch it, present the analyses and drive the solution. And then the government would have a public meeting to discuss and debate the options.

But eventually, community activists and the neighborhood can do a lot of that, especially if they have access to all the same data and statistics as the government.

Furthermore, they can collect a LOT more, and more varied, inputs. They can poll the neighborhood, canvass door-to-door, and collect information from the "man on the street". They can take photos of neighborhood conditions and gather unique statistics about the health and quality of life in that community. They can then combine these sorts of input with census data to produce an entirely new look at the options. And public meetings about potential uses of this school building can be much more informed, with mashups and maps and interactivity using tools like Twitter and blogs. Online polls using tools such as Ideas for Seattle or IdeaScale can allow the neighborhood to debate and rank choices, and be engaged in deeper and more meaningful ways than ever before.

Ultimately, such interactive government should result in better decisions, informed by the communities affected.

Does this mean the end of representative democracy as we know it?  Could we do away with elected officials entirely and have true governing by the people?

Hardly.  There will continue to be very hard decisions which individual neighborhoods and communities will fight tooth-and-nail, but decisions that have to be made for the good of society as a whole.  No one wants a jail or a garbage transfer station or housing for sex offenders or a nuclear waste dump in their neighborhood.  But we need all those things for society to function, and elected leaders will need to make those hard decisions.

Data.gov, Datasf.org, Data.seattle.gov.  These are only the beginning of a new and exciting era in our democracy.  Still, good leadership will never go out of fashion.



What's Google Trying to Do?

February 16, 2010 By Bill Schrier

The nation's e-mail and blogging and Twitter engines worked overtime on Wednesday, February 10th, when Google announced its intent to fund ultra-high-speed Internet access for 50,000 to 500,000 people nationwide.

This ain't your grandma's "broadband" connection. And it ain't the 100-squared broadband envisioned by FCC Chair Julius Genachowski in a speech on Tuesday, February 16th - 100-squared is 100 megabits per second to 100 million people by 2020 - a pretty bold vision in and of itself. Google wants to provide one gigabit (one billion bits, or about 125 million bytes) per second to homes via fiber optic cable.

At a gigabit per second, a very high quality movie would download in 8 seconds flat, compared to an hour or more over a fast cable modem or DSL connection. Google published an RFI and is seeking responses from cities that want Google to come and build. The City of Seattle very quickly announced its intention to apply and jump on the bandwagon. Of course, we have a visionary Mayor, Mike McGinn, who is publicly seeking, as a priority for his administration, to build a fiber network to every home and business in Seattle.
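The arithmetic behind that claim is straightforward. Here's a back-of-the-envelope sketch, assuming a 1-gigabyte movie file and a 2 megabit-per-second DSL line (both assumptions, chosen to match the round numbers above):

```python
# Back-of-the-envelope download times for an assumed 1-gigabyte movie file.
movie_bits = 8e9   # 1 gigabyte = 8 billion bits

gigabit = 1e9      # Google's proposed fiber speed, in bits per second
dsl = 2e6          # an assumed 2010-era DSL rate, in bits per second

print(f"At 1 Gbps: {movie_bits / gigabit:.0f} seconds")   # 8 seconds
print(f"At 2 Mbps: {movie_bits / dsl / 60:.0f} minutes")  # about 67 minutes
```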

So what is Google trying to do here?

Is it being an altruistic corporation, hoping to better the lives of average citizens while fulfilling its pledge to "make money without doing evil"?

Some of Google's motives are clear. They want to offer a competitive service, and these networks are clearly "experimental". This is all about the Internet, not about offering phone or cable TV service - although, at a gigabit a second, you can watch HDTV video from websites and use video conferencing and telephone service until you are blind and hoarse.

They explicitly want to "see what developers and users can do with ultra high-speeds, whether it's creating new bandwidth-intensive "killer apps" and services, or other uses we can't yet imagine". That implies to me that they want to connect high-tech businesses to other high-tech businesses and to their own employees in their homes as well as connecting other very tech-savvy users, students, and others who will push the envelope. This is probably NOT a network for serving low-income neighborhoods, bridging the digital divide, or connecting mom-and-pop businesses in neighborhoods.

Furthermore, Google would build networks to serve 50,000 to 500,000 "people" (not households or businesses). They want to serve multiple cities, so the chances any individual city would get service are pretty low (1 in 600 or maybe 1 in 6,000). And in any given city, not many households would be served. If they build networks to serve 100,000 people, that's probably about 30,000 households, and if they do this in five cities, that's about 6,000 households in any given place.
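A quick sanity check on those household numbers, assuming roughly 3.3 people per household (an assumption; the U.S. average is nearer 2.6):

```python
# Rough sizing of Google's proposal, assuming ~3.3 people per household.
people_served = 100_000
persons_per_household = 3.3  # assumption; the U.S. average is nearer 2.6

households = people_served / persons_per_household  # ~30,000 households
cities = 5                                          # if spread across 5 cities
per_city = households / cities                      # ~6,000 households each

print(f"{households:,.0f} households total, about {per_city:,.0f} per city")
```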

What other strings will be attached?

Google makes money selling targeted ads. They also like consumers to use their products - to use Buzz, for example, you need a Gmail account - and Google undoubtedly will gather information about how people use these networks as part of the "experiment".

Finally, I am certain Google is sending a message to the cable companies and telecommunications carriers here. Those companies thrive on making broadband scarce. As a scarce commodity and a duopoly service (as it is in many communities), the telecoms and cable folks can charge more and keep hiking up rates. They put limits on how much bandwidth any given consumer can use. They undoubtedly would like to charge "content providers" - companies like Microsoft and Amazon and ... yes ... Google - money to make sure the content of those companies has priority and guaranteed delivery on an allegedly scarce and constrained network. This is what the "net neutrality" debate is all about.

But Google (and lots of other people) know better. With fiber-to-the-home, bandwidth is no longer scarce, speeds are effectively unlimited, and the fat profits of the incumbents evaporate.

I'm certainly excited about the Google challenge. They are challenging the developers, the carriers and the cable companies, and they are challenging the FCC to push the limits in its national broadband plan, due out March 17th.

Are there strings attached? No doubt. But this is a revolutionary proposal. It's about the economic future of our cities, region and nation.

And it is cool.



A Peek at the National Broadband Plan

January 27, 2010 By Bill Schrier

On January 26th, Admiral Jamie Barnett of the FCC spoke about the National Broadband Plan, which is now due out on March 17th (and I understand New York City, Boston and other cities with large Irish-American populations plan to have parades in honor of the plan that day, too!)

As a CTO, I'm so immersed in technology that I'm not sure "broadband" means anything to the average American (if an "average" American exists).

Certainly most Americans are now at least aware of the Internet and use technology in their lives, even if that tech is nothing more than a cell phone or ATM. But all you have to do is watch the security lines at any airport and see all the laptops and luggables and cell phones and DVD players and other associated smart lumps of plastic dumped on the scanner lines to know that tech is ubiquitous in most people's lives.

A significant fraction of people know about broadband and what it means. In Seattle, some 84% of homes have an Internet connection, 75% have something faster than dial-up and 88% have a computer at home. Of course Seattle's got a reputation as a city of high tech folks (an image Bill Gates, Steve Ballmer and I work hard to polish). But even nationwide 79% of homes have an Internet connection and 63% are faster than dial-up. The source for these stats is here.

These numbers are hard to fathom when one considers the web didn't exist 20 years ago, when most people probably thought "Internet" had something to do with basketball, volleyball, tennis or some other "net-centric" sport.

Admiral Barnett heads the Public Safety and Homeland Security Bureau at the FCC. He's charged with making wireless spectrum available to government in general, and specifically to the law enforcement, firefighting and emergency medical agencies who keep the public safe. He spoke at the Winter Summit of the Association of Public-Safety Communications Officials on January 26th, and gave us a glimpse of what the National Broadband Plan will contain.

Admiral Barnett's remarks centered on wireless spectrum for use by first responders. About 10 megahertz is available nationwide for public safety, but the license for that spectrum is held by a single nationwide organization. Yet most police, fire and emergency medical agencies are operated by cities and counties. Given this paradoxical situation, 17 states and cities have requested waivers from the FCC to use that spectrum in their local areas to immediately build networks for their own use.

And why is the spectrum required? These new wireless networks hold promise that cops in police vehicles can see videos of crimes in progress as they race to crime scenes, or rapidly access building plans, images and video. Have a peek at a report prepared by PTI and APCO here for more uses.

According to Admiral Barnett, those waivers may be granted later this year so we can get started building the network.

The FCC is very interested in public-private partnerships to build the networks because many jurisdictions don't have funds to construct such networks for themselves. Luckily, commercial cell phone carriers like Verizon and AT&T, and companies like Motorola and Alcatel-Lucent, have signed on in support of this plan, and are developing new technologies including LTE (long term evolution) for not only their own networks but also for public safety use. This means public safety agencies could use a network built and funded by taxpayers (more resilient, better priority, less costly) for most of their work, but could roam onto the commercial carriers' networks when necessary. This is in stark contrast to today's networks, where police and fire radios are incompatible with the cell phone networks. The best of both worlds!

It looks like the FCC will encourage these partnerships in its plan.

The FCC also knows that funding will be required to construct these networks. Admiral Barnett understands funding is required not just to build the networks, but to operate them. Besides public-private partnerships, the FCC is floating the idea of an Emergency Response Interoperability Center (ERIC) to push forward on a national public safety wireless network. We'll hear more about this on February 10th.

Finally, Barnett said "next generation 911" will also be recognized in the national broadband plan. Right now, the only way to get information to a 911 center is to ... well ... telephone 911!

But many citizens' cell phones can send text messages and take photos and video. Yet 911 centers have little or no capability to accept such media, which can be critical to rapidly apprehending perpetrators and rendering aid to victims. We also need higher-speed, land-line, fiber-optic networking between 911 centers and other public safety and government facilities, and I hope that will be in the Plan.

Twenty years ago, very few people knew of the Internet or the web. Now it is an indispensable part of most people's lives and a vital component of our HomeCity security and public safety. But we need more network SPEED, both wired and wireless. The National Broadband Plan could be, with a bit of vision by the FCC (and I've given them my vision here), a roadmap to the future of the nation.



CES: The Time Machine

January 12, 2010 By Bill Schrier

We have a Time Machine.

It is one way, moving 60 minutes an hour, 24 hours a day, into The Future. The Consumer Electronics Show is a window into The Future. Technology demonstrated there this week will be available to early-adopter consumers and businesses in the next year or two, and will be available at Costco soon thereafter. And it has at least one common theme: networks will have to be fast. Not just fast, but FAST.

But what does all this speed really get you in the real world?

For one thing, much faster two-way or multi-way video telephony or videoconferencing, which means fewer commute trips in cars and less demand on other transportation, such as plane trips across the country.

That translates into less air pollution, less dependence on foreign oil (and need for foreign military expeditions) and less global warming. Then there is improved entertainment, interactive gaming, energy management, and much much more.

But it all depends on rapid deployment of LTE for wireless networks and fiber-to-the-premises for wired ones. The Time Machine is taking us inexorably into this glitzy new future. But are our wireless and wired networks ready for it? Not in Seattle, certainly.

We need a network vision to match our CES vision and here it is.

The Flux Capacitor is fluxing.  The Time Machine is ready.  Are we ready to build the network we need?

Seattle Mayor Mike McGinn is ready, and we're going to do it.



1999 - An Odd Odyssey

December 30, 2009 By Bill Schrier

It was just ten short years ago that many of us were preparing to celebrate New Year's Eve - by working all night!

Anyone over 30 probably still remembers all the information technology work that went into preparing for Year 2000.

I'm going to dredge (!?) up some of my memories in the next few paragraphs, but if you have memories or stories of that December 31, 1999, evening, I invite you to leave them as a comment to this blog entry.

For many of us in Seattle, 1999 was not a good year.

First of all, we had madly been reviewing and fixing our information technology applications and programs and systems for Y2K bugs.
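For anyone who missed that era: the classic Y2K defect was storing years as two digits to save space, then doing arithmetic on them. Here's a minimal illustrative sketch (hypothetical code, not anything the City actually ran) of the kind of bug we hunted, and the "windowing" fix commonly applied:

```python
# Illustrative only -- hypothetical code, not an actual City system.
# The classic Y2K defect: years stored as two digits to save space,
# so the year 2000 becomes "00" and arithmetic on years goes wrong.

def account_age_buggy(opened_yy, current_yy):
    # Buggy: with current_yy = 0 in the year 2000, an account
    # opened in 1998 ("98") appears to be -98 years old.
    return current_yy - opened_yy

def account_age_fixed(opened_yy, current_yy):
    # A common remediation ("windowing"): interpret 00-29 as 2000-2029
    # and 30-99 as 1930-1999, then subtract full four-digit years.
    def expand(yy):
        return 2000 + yy if yy < 30 else 1900 + yy
    return expand(current_yy) - expand(opened_yy)

print(account_age_buggy(98, 0))  # -98: interest calculations go haywire
print(account_age_fixed(98, 0))  # 2: the intended answer
```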

But no one really knew what would happen.  Would buses and trains stop dead due to bugs in their microchips?  Would the electrical grid fail?  Would 911 stop working?

The City of Seattle, like any organization using IT, had very real problems - we knew the accounting/financial database, called SFMS for Seattle Financial Management System, was not ready for Y2K, so we replaced it with an entirely new system. We also patched up the water utility's and electrical utility's billing systems, since another project to replace them was in progress. (That system, now called CCSS for the Consolidated Customer Service System, was implemented in 2001, a year late and $14 million over budget, which is a different story.)

The City's Chief Technology Officer was Lynn Jacobs, and in 1998 she had spread the alarm about Y2K, galvanizing the Mayor, City Council and most departments into action looking for their Y2K bugs. But by October 1999, Jacobs had largely checked out due to personal issues, rarely coming to work and exerting virtually no leadership. So Mayor Schell replaced her with Marty Chakoian, who was, not coincidentally, leading the City's Y2K efforts. There was plenty of consternation among the IT leadership in the City government.

But the outside world was in chaos in 1999 too. 

The Seattle Times ran a whole series of articles about the electrical grid and 911 systems and other critical functions, and how we were preparing them for Y2K. Gee, they even talked about potential Y2K issues with the water system, even though Seattle's water reservoirs are high up in the mountains and the basic rule of water and wastewater is "s___ flows downhill". (The s___ stands for "stuff", of course.)

And we had the WTO riots in Seattle in November; Seattle sure appeared to be the anarchy capital of North America, if not the world.

Then on Dec. 14, 1999, a 32-year-old Algerian named Ahmed Ressam was arrested in Port Angeles, Washington, coming across the border from Canada with 100 pounds of powerful explosives in the trunk of his car.  Was he headed to Seattle to detonate the explosives at the base of the Space Needle on New Year's Eve?  We couldn't take a chance, so Mayor Paul Schell cancelled the grand New Year's celebration planned there.

For most of us tech types, and a lot of other folks, it didn't make any difference, anyway.  We had already planned to be at work instead of celebrating on December 31st.

The City's Emergency Operations Center was open.  At that time, the EOC was in a crowded basement of Fire Station #2 in the Denny Regrade (it has since been replaced with a $30 million modern facility).  Nevertheless, senior officials from every department hunkered down to see in the millennium in that basement.

My own Department of Information Technology was all of five months old - we were created as a separate department on August 1, 1999. Our operations center was in an old stock brokerage (Foster and Marshall) building at 2nd and Columbia, which is now home to the United Way of Seattle. That building housed the telecommunications division, including the service desk; the rest of the department was in the Dexter Horton building next door. [The Dexter Horton building turned out to be much worse off in the earthquake of 2001, when virtually everyone working there was forced to leave it for a couple of weeks due to building damage, but again that's another story.]

On December 31, 1999, we had a whole team of folks who celebrated the beginning of the third millennium* together, watching a quiet, uneventful Seattle 20th-century night turn into a quiet, uneventful and sleepy 21st-century* morning.

Was it uneventful due to all our diligence and preparations, or was there never really any problem in the first place? I don't know, but I do know I'll celebrate the end of the decade of the naughts tonight with a bit more enjoyment and a lot less trepidation.

*Note: Yes, yes, I do understand the real beginning of the 3rd millennium and the 21st century is January 1, 2001. See article here. But, gee, popular culture doesn't count the years that way, so I took a few tech-journalism-geek liberties with dates in writing this article.


