July 30, 2009 By Hilton Collins
When you buy software, you probably trust that you're getting a secure product that runs well. This faith may come from the fact that the source code - the digital DNA that tells the program how to work and what to do - is hidden from consumers. In most cases, only the select programmers tasked with maintenance and security can see it and make changes.
Closed, or proprietary, code is the engine of legions of vendor-made products. Many of them, like Microsoft's nearly ubiquitous Windows software, are closed code to prevent piracy and duplication by competitors or users. And for some license owners, the perceived benefit of closed code is that if no one sees it, those who intend to do harm can't easily find the software's vulnerabilities or figure out how to exploit them.
The prevalence of open source code, however, could make one wonder how much secret code matters. The term "open source" generally refers to programs in which people can view or modify the programming code. Open code is developed in a collaborative environment where programmers can make changes that are visible for the community to see. People can download many of these programs free of charge and can choose to join the development process by making modifications or viewing changes as they see fit.
But does this openness make open source software less secure than its closed source brethren? Open source advocates certainly don't think so.
"You know exactly what needs to be done to secure it and what vulnerability it has. It's quantifiable; it's knowable," said Christopher Adelman, vice president of sales and marketing for Alien Vault, which created OSSIM (Open Source Security Information Management). "The problem with closed source solutions is there's a certain leap of faith associated with closed source software."
Open source code lets users judge how secure a program is, Adelman said. When you can't see the code, you can't see for yourself just how secure it is or isn't. "You know exactly what you're getting into, and for me, that's everything. Game won right there."
A popular argument of the pro-open source crowd is this: If it's open, it's essentially up for peer review, which means there are more sets of eyes to identify security holes and fix them. In a closed environment, how do you know how thoroughly your software is being reviewed if you can't see what's happening or know who's doing it?
"The things that keep me awake at night are the things I don't know about. It's the things that I have no idea are out there that the hackers know that I don't, that are going to cause us problems on our security operation front," said Jon Dolan, chief information security officer of Oregon State University.
Open source can also make patching software a bit faster. There's no need to contact the vendor about a bug - as you'd have to with proprietary code - or wait for the next release of the software that fixes it.
"If I find a bug in an open source program ... I submit a fix to the people who are responsible for the program," Dolan said. "It gets peer reviewed before it's accepted, but then it is accepted in short order, so we eliminate this whole workflow of reporting a bug to have somebody else fix it. You just fix it yourself and pass along the fix to everyone."