Government Technology

Future-Proofing Applications

April 23, 2009

In the halcyon days of glass rooms and mainframes, application development was a logical, step-by-step, paced process. It involved careful specification of everything from the database to input screens. Many of us can remember the shelves full of binders containing detailed specs covering everything related to an application -- the planning alone might occur over a period of months or even years.

Application development followed a logical, sensible course. Projects started; they were planned, laid out carefully, built and deployed in a relatively neat linear progression that -- at least from the viewpoint of the end users -- took a long time.

All this orderliness encountered the ultimate apple cart upsetters: cheaper computers in the hands of pesky and impatient end users. And, if that wasn't bad enough, the Web arrived, which somehow seemed to turn anyone with access to a keyboard and the ability to type into a "developer."

Although the Web became -- and to a large extent still is -- a wild west, the fact remains that building good, solid, enterprise-worthy applications depends on many of the same principles that ruled the glass rooms.

But even when sound development principles are in use, things still need to move at Internet speed, and flexibility is far more important than it used to be. As computing power has increased and Web application development has matured, some interesting approaches have emerged to reconcile the need for well-planned applications with the flexibility required to survive in the rapidly changing world of the Web.

A popular book written in the infancy of the Web began with a description of what makes a good programmer -- a memorable item from that list was the assertion that developers are "lazy." The author went on to clarify that this doesn't mean they don't work hard, but that by doing things right and writing applications to automate routine or boring jobs, developers buy themselves downtime.

Perhaps another way of saying it is that good developers never want to do the same thing twice. They would rather spend a long week of late nights writing a program that empowers a non-technical user to easily import a set of records than spend a mindless week doing repetitive data entry themselves.

So the rule is: if you have a one-time-only job that entails some kind of repetitive work, don't give it to a developer -- chances are good they'll spend the full schedule and then some developing a program to do automatically what could have been done faster by hand. Of course, their argument is that the second time the same task needs to be done, you've recouped the investment.
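That trade-off can be sketched with a minimal, hypothetical example: a short script that imports employee records from a CSV file into a database -- the kind of tool a developer would build rather than spend a week keying the records in by hand. The table layout and column names here are illustrative assumptions, not from any real system.

```python
import csv
import sqlite3

def import_records(csv_path, db_path="hr.db"):
    """Load employee records from a CSV file into a database.

    A one-off manual entry job becomes a reusable tool: the second
    time the same import is needed, the investment is recouped.
    """
    conn = sqlite3.connect(db_path)
    # Hypothetical schema -- a real application would already have one.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS employees "
        "(id INTEGER PRIMARY KEY, name TEXT, department TEXT)"
    )
    with open(csv_path, newline="") as f:
        rows = [(r["name"], r["department"]) for r in csv.DictReader(f)]
    conn.executemany(
        "INSERT INTO employees (name, department) VALUES (?, ?)", rows
    )
    conn.commit()
    conn.close()
    return len(rows)
```

Once written, the same script handles next quarter's import in seconds -- which is exactly the developer's argument.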

The old way of building applications -- whether for the glass room or the Web -- was to talk with the end users to identify the "things" the application needed to manage, what "actions" needed to be done with those "things" and what kinds of "outputs" were required.

Based on this data, a database was designed and built to store the "things," code was written to present screens by which the end user could enter, edit or delete the "things," more code was written to perform "actions" on the "things" and, finally, reports and output routines were written to send the "things" somewhere else.

A simple example would be an application to manage employees and their HR benefits. The employees and benefits are the "things;" signing up for or canceling benefits are some "actions" and reports to management or interfaces to a payroll system would be the "outputs."
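The "things," "actions" and "outputs" of that HR example can be sketched in a few lines. The class and function names below are illustrative assumptions, not from any real HR system.

```python
from dataclasses import dataclass, field

# "Things": the entities the application manages.
@dataclass
class Employee:
    name: str
    benefits: set = field(default_factory=set)

# "Actions": operations performed on the things.
def enroll(employee, benefit):
    employee.benefits.add(benefit)

def cancel(employee, benefit):
    employee.benefits.discard(benefit)

# "Outputs": reports or interfaces that send the things somewhere else,
# e.g. to management or a payroll system.
def benefits_report(employees):
    return {e.name: sorted(e.benefits) for e in employees}
```

The three layers map directly onto the database, the entry/edit screens and the reporting routines the article describes.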

Pretty straightforward and pretty flat.

When different or new information needed to be added about employees, a developer generally needed to do the following:

1) Update the database to hold the additional data.

2) Change the enter/edit screens so end users could work with the new data.
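Step 1 -- updating the database to hold the additional data -- is typically a schema migration. A minimal sketch, assuming a SQLite employees table (real systems would use a versioned migration tool):

```python
import sqlite3

def add_column(conn, table, column, col_type="TEXT"):
    """Update the schema to hold additional data (step 1 above).

    Hypothetical helper: checks the current columns first so the
    migration is safe to run more than once.
    """
    cols = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column not in cols:  # only add the column if it is missing
        conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {col_type}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
add_column(conn, "employees", "hire_date")
```

Each new field thus touches the database, the screens and the outputs in turn, which is exactly the maintenance burden the article is describing.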


