Government Technology
By Carl Drescher: Technology trends and their impacts on the provision of government services.

Technology and the Next President

November 3, 2008 By Carl Drescher

With only a few hours until Election Day, I was pondering what the impact of a McCain presidency or of an Obama presidency would be on technology issues. Yes, both have stated their positions on Net Neutrality, but beyond that, what policies would either institute to address other technology issues?

On other technology issues, neither has offered anything substantive as best I can discern. Both say that we need a National Broadband Policy in place that provides ubiquitous access, but while this certainly sounds good, what would it look like, and how might it benefit each community? Every urban and rural locality has different challenges: some will require infrastructure development, while others might require more competition to improve affordability. We are also seeing government control of ISPs, such as in Australia, where the Australian Communications and Media Authority has been testing and will be implementing mandatory network filtering. Is this something that either candidate supports, and at what level? Where does each stand on the 700 MHz spectrum auction and the availability of sufficient public safety and first responder bandwidth for interoperable communications?

I am generally a very optimistic person, but I do not see much change coming in these overarching policies under either candidate. I hope that I am wrong and that significant strides can be made, but with all of the other issues the next president will need to address, I once again see technology issues getting some lip service, but not much more.

Photo by Chesi - Fotos CC. Creative Commons Attribution-Share Alike 2.0 Generic



ITIL v3

October 16, 2008 By Carl Drescher

For the past three days I have been attending an ITIL v3 Foundations in Service Management class.  What is ITIL v3?  The ITIL website offers the following overview: "The IT Infrastructure Library® (ITIL) is the most widely accepted approach to IT service management in the world. ITIL is a cohesive best practice framework, drawn from the public and private sectors internationally. It describes the organization of IT resources to deliver business value, and documents processes, functions and roles in IT Service Management (ITSM)".

Our IT department is embracing the ITIL framework and best practices.  We believe that ITIL processes and best practices will ensure that we continue to provide value to our customers, even as our budgets are cut and our open positions are frozen.  I will periodically provide updates in this blog regarding our ITIL journey: the challenges that we encounter and the impact of changes to our processes.

I am interested to know how many other agencies and jurisdictions have embraced ITIL.  What obstacles (technical, cultural, etc.) have you encountered?

For those wanting further information on ITIL, the following site is a good starting point: www.ogc.gov.uk/itil. Or post your question here and I will be happy to find the answer.

Virtually Virtual...

October 9, 2008 By Carl Drescher

For those of us who have been around long enough to remember the days when the mainframe, or "big iron," was the only real platform that serious business applications ran on, virtualization of system resources was the norm. Each application ran in its own "machine," and more system resources could be added as needed. This made efficient use of system resources at a time when hardware costs were quite large compared to today's. Application software vendors supported their applications running in these virtual machine environments without question.


Fast forward to the present. Over the last few years the concept of virtualization has taken hold in PC-based servers. Most organizations, mine included, have implemented or have a strategy to implement this technology as a natural, efficient use of computing resources, as a way to better manage these systems, and as a way to battle data center issues such as power consumption and cooling. Unfortunately, our experience with software vendors has not been one of support for their applications running as production systems in these virtual environments. I am assured that this will change over time with the introduction of Microsoft's virtualization solution and with vendors gaining experience in this virtual world. In the meantime we hear statements like: "Well, you can run your test and training in those environments, but we will not support your production there," or "Why would you want to incur the cost of training staff and the capital costs of that hardware? Individual servers are inexpensive." When I have tried to include a requirement for virtualization support in our RFP language, I am told that such a requirement is too restrictive and that I am eliminating vendors who could otherwise provide adequate solutions.
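To illustrate why consolidation is so appealing from a power and cooling standpoint, here is a minimal back-of-the-envelope sketch. Every figure in it (server count, wattage, consolidation ratio, cooling overhead, utility rate) is an assumption chosen for illustration, not a measurement from our data center:

```python
# Back-of-the-envelope estimate of power and cooling savings from server
# consolidation. Every figure below is an illustrative assumption, not a
# measured value; for simplicity a virtualization host is assumed to draw
# the same power as a standalone server.

PHYSICAL_SERVERS = 40      # standalone PC servers before virtualization
WATTS_PER_SERVER = 400     # assumed average draw per physical box
CONSOLIDATION_RATIO = 10   # assumed virtual machines per host
COOLING_OVERHEAD = 0.5     # assume cooling adds ~50% on top of the IT load
HOURS_PER_YEAR = 24 * 365
COST_PER_KWH = 0.10        # assumed utility rate, dollars per kWh

def annual_cost(servers: int) -> float:
    """Yearly power plus cooling cost for a given number of physical boxes."""
    it_load_kw = servers * WATTS_PER_SERVER / 1000
    total_kw = it_load_kw * (1 + COOLING_OVERHEAD)
    return total_kw * HOURS_PER_YEAR * COST_PER_KWH

before = annual_cost(PHYSICAL_SERVERS)
after = annual_cost(PHYSICAL_SERVERS // CONSOLIDATION_RATIO)
print(f"Before consolidation: ${before:,.0f} per year")
print(f"After consolidation:  ${after:,.0f} per year")
print(f"Estimated savings:    ${before - after:,.0f} per year")
```

Even with these crude numbers, the arithmetic shows why the business case rests on power and cooling as much as on hardware cost.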


I am curious whether anyone has been successful in getting software vendors to fully support virtualization in a PC server environment. What specifications or requirements were listed as part of the purchasing or contract process? Am I the only one seeing this?



The Lay of the Land

September 29, 2008 By Carl Drescher

Before we get too far into the trenches, I would like to put things in perspective and lay out a little of what we have done at the City of Tucson.  A lot of what will be written here comes from the experiences we have had implementing technological solutions.  We have planned well, and we have also been extremely fortunate to have city managers over the years who understood the value of technology.

For those not familiar, the City of Tucson is the second-largest city in Arizona.  It spans an area of 250 square miles and has a population of approximately 547,000, making it the 30th-largest city in the United States. The city employs approximately 6,000 people, not including uniformed police and fire personnel.  Tucson has placed in the top ten of the Digital Cities survey in each of the last seven years.

The City embarked on its broadband strategy in 1999, and over the last nine years it has created a robust communication network. This network interconnects all major city facilities, such as fire stations, police stations, libraries, and community and recreation centers.  The technologies used are a combination of wired and wireless.  A fiber optic backbone encompassing approximately 500 miles of fiber is connected redundantly as OC-48 and OC-12 SONET rings and gigabit Ethernet pipes.  An OC-3 digital microwave network provides connections to sites that are not accessible via fiber.  The city also has a wireless mesh network (for municipal use only) that covers all 250 square miles of the city. This mesh network was built as part of a project called ER-Link, which provides video from a paramedic unit, while in transit, to the local trauma center.
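For readers less familiar with SONET terminology, the OC levels mentioned above map to line rates by a simple rule: an OC-n circuit runs at n times the OC-1 base rate of 51.84 Mbps. A quick sketch:

```python
# SONET line rates: an OC-n circuit carries n times the OC-1 base rate
# of 51.84 Mbps (gross line rate, before framing overhead).

OC1_MBPS = 51.84

def oc_rate_mbps(n: int) -> float:
    """Gross line rate of an OC-n SONET circuit, in Mbps."""
    return n * OC1_MBPS

# The levels used in Tucson's network:
for n in (3, 12, 48):
    print(f"OC-{n}: {oc_rate_mbps(n):,.2f} Mbps")
# OC-3: 155.52 Mbps, OC-12: 622.08 Mbps, OC-48: 2,488.32 Mbps
```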

The city owns, manages, and maintains the network.  The network is used by other government and quasi-governmental agencies at a cost that is less than contracting these services from a provider.  The city currently has service contracts with the local community college, the county, and a local school district to provide wide area networking services across this network.

The city continues to take advantage of this infrastructure to provide technologies and applications that offer a greater level of service and reduce costs.  Some examples include VoIP, NOVA (our CRM implementation), streaming video, traffic signal management, and a number of e-services.  In the final testing and implementation phase are projects that provide GIS information to utility workers and first responders in the field, wireless applications for permitting and fire inspections, and consolidation of data centers.

Some other high-profile projects in progress are: use of open source software, outsourcing of the city payroll application, ITIL, a countywide public safety radio system, and the expansion of the city network into a regional system.

This is obviously a very general overview of technology and projects in Tucson. These are not unique initiatives; they are common to most municipalities and counties. As we explore each of them over the next few months, I welcome your experiences, both good and bad, as they relate to these technologies and initiatives.

Photo by Joe Brent. Creative Commons Attribution-Share Alike 2.0 Generic


The Data Center of the Future?

September 15, 2008 By Carl Drescher

There is an interesting article in the Times Online about a solution Google has proposed for its new data centers.

According to the article, Google is considering deploying the supercomputers necessary to operate its Internet search engines on barges anchored up to seven miles (11 km) offshore.

The floating data centers could use wave energy to power and cool their computers. And if the barges had offshore status, the company would no longer have to pay property taxes on its data center properties around the world, an additional saving.

I had heard about generating electrical power from the motion of ocean waves, but this approach would certainly address two issues plaguing current data centers: power costs and cooling.
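One common way to reason about this is Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. The sketch below compares an assumed conventional facility with an assumed seawater-cooled barge; both overhead figures are hypothetical, chosen only to show how free cooling moves the number:

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# A PUE of 2.0 means every watt of computing requires another watt of
# overhead, mostly cooling. The figures below are illustrative assumptions.

def pue(it_kw: float, overhead_kw: float) -> float:
    """Total facility power divided by the IT load."""
    return (it_kw + overhead_kw) / it_kw

IT_LOAD_KW = 1000.0  # assume 1 MW of IT equipment in both cases

conventional = pue(IT_LOAD_KW, overhead_kw=1000.0)  # assumed chiller-heavy site
barge = pue(IT_LOAD_KW, overhead_kw=200.0)          # assumed ocean cooling

print(f"Conventional data center PUE: {conventional:.2f}")
print(f"Seawater-cooled barge PUE:    {barge:.2f}")
```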

While there is a whole host of issues associated with such a scenario that must be addressed (and yes, I know that Google is looking at the legal issues as well), from a technical perspective I am interested to know your thoughts on the use of data barges as future data centers.



In the Trenches
Carl Drescher

One of the few constants of technology is that it is constantly changing. New technologies have the potential to change the way communities are governed. This blog will discuss some of these technological trends and how they are advancing government at all levels.

Please join in the conversation so we might all learn from each other.