November 5, 2012 By Larry Karisny
Cyber security expert Joe Weiss has spearheaded the ICS Cyber Security Conference for 12 years, and when he calls in the troops, the best come to serve. Last month's conference, held at Old Dominion University's Virginia Modeling, Analysis and Simulation Center (VMASC) in Suffolk, Va., was no different. I had a chance to attend the conference and talk with Weiss about Industrial Control System (ICS) security, and this is what he had to say.
Karisny: Your conference first and foremost reinforced that industrial control system (ICS) security is different and is not just IT. Can you briefly explain?
Weiss: ICSs are purpose-built systems for performing specific tasks. They are built with a mix of commercial off-the-shelf systems (such as Windows) and proprietary real-time operating systems, use proprietary communication protocols, and have very specific operating requirements. They were built with minimal computing resources and to operate on their own networks to maximize reliability. They are built to operate for long periods of time (up to 10-20 years) with minimal downtime and are replaced only when they become obsolete or functional operating requirements change. Generally, they will not be replaced for security reasons. Their primary function is to provide safe, reliable operation, with operators and system integrators trained for reliable operation, not security. From a cyber security perspective, the most important considerations are availability of the process and authentication of the devices; confidentiality is generally not important for the data "in motion." The concern is that inappropriate use of IT technologies, policies and/or testing, such as penetration testing, could impact the performance of ICSs — and already has.
Karisny: There were validated disclosures of targeted critical infrastructure cyber incidents at the conference. Without breaching confidentiality, can you explain these incidents and their significance?
Weiss: Two recent ICS cyber incidents were discussed. These two unintentional incidents are important because they have not been seen before, they involve two different control system suppliers, and there is no guidance on what to do.
In the first case, the utility was in the final stages of a plant distributed control system (DCS) retrofit. During the installation process, the view of the process (the operator displays, etc.) was lost. Neither the utility nor the on-site vendor support was able to get the view of the process restored. It took a vendor link from about 2,000 miles away to get the view of the process back. This raises several questions:
1. What caused the loss of view?
2. Why were the on-site staff not trained about this situation?
3. What did the headquarters staff know that allowed them to get the process view restored?
4. What other facilities have suffered this problem?
5. Could this problem be intentionally caused?
The second case was a complete loss of logic in every plant DCS processor with the plant at power. The event occurred more than once and led to a complete loss of control and loss of view. (This is well beyond what I thought was the worst-case scenario.) What saved the plant were the old hardwired analog safety systems that shut down the processes. The plant has not been able to determine the cause of the loss of logic. The staff have documented the situation, contacted their vendor and provided the vendor their recommendations. The utility is still waiting to hear back. The concern is that this could happen to any industrial facility with any control system supplier. It is not clear whether this can be done maliciously.
Karisny: There is a clear need to share cyber breach information, but legal issues seem to be deterring even private disclosure. From government intelligence agencies to confidential private-sector disclosure, how can we at least begin gathering this information in some type of cyber breach clearinghouse?
Weiss: My view is that end-users will share information if they feel it will help them. That means they need a venue where they feel they can get knowledgeable feedback, so that all sides (the discloser as well as the attendees) get something from the disclosure. I also don’t believe private industry trusts the government, so a DHS or other government-sponsored vehicle will not work. The ICS Conference works because there are smart people there who can provide intelligent feedback to the presenters, and the end-users feel their information will not be disclosed.
Karisny: Will the differences in ICSs require a different way of developing ICS security? Were some promising new technologies capable of addressing these differences discussed at the conference?
Weiss: As mentioned before, ICSs are different from IT. Generally, IT security suppliers are taking their existing IT solutions and attempting to “customize” them for ICS. What should be done instead is to understand how the ICS works and what could compromise ICS reliability and/or safety, then develop solutions that address those specific concerns. I know of only one technology that seems to have taken this approach, and it is still in the R&D stage.
Karisny: A hacker can respond rapidly, without recognition of or any requirement to follow cyber security rules and regulations. This is not the case for the good guys in cyber security. With an abundance of standards, regulations, compliance and oversight in cyber security, is there a way to offer shortcuts that let the good guys get in?
Weiss: Unlike the good guys, a hacker doesn’t have an organizational chart to follow. As best as I can tell, the only time the IT and ICS communities worked together flawlessly was in the development of Stuxnet. The North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) cyber security standards are a good example of a compliance rather than a security mindset. The NERC CIPs have made the grid less reliable and less secure, as well as becoming a roadmap for hackers to compromise the grid. That is, the NERC CIPs publicly identify the size requirements that make a facility critical, which allows one to determine which power plants, substations and control centers will have cyber security requirements and which will not. Under the current set of NERC CIP standards, approximately 70 percent of power plants, 30 percent of transmission substations and all distribution systems have no cyber security requirements. Until certain government organizations stop being more afraid of the bad guys learning something than of educating the good guys, industry will be in trouble, because the bad guys want to learn and the good guys will continue to be unaware. This lack of understanding of critical vulnerabilities was demonstrated by the Aurora discussions at the conference. These first public discussions were new to almost all conference attendees.
Karisny: What is it going to take to get utility senior management buy-in on understanding the possibility and consequences of a cyber attack incident and the talent required to mitigate and prioritize resources for ICS cyber security?
Weiss: Until utility management treats ICS cyber security as a reliability issue rather than a compliance issue, there will be less than robust utility attendance at the ICS Cyber Security Conference. The question is how to reach and educate utility management about the reliability and safety issues of ICS cyber security. The ICS Cyber Security Conference is not a utility conference but a cross-industry ICS cyber security conference. We had a significant number of end-users from water, chemicals, oil/gas, manufacturing, food, pipelines and DOD. My belief is that the electric industry is not a leader in cyber security of control systems because the NERC CIPs have created a culture of compliance, not security. The leaders in cyber security are the oil/gas and petrochemical industries, with DOD starting to take this more seriously. One would hope that after all of the power issues with Hurricane Sandy, utility executives will take ICS cyber security more seriously before it is too late.
For a full summary of the conference by Joe Weiss, please click here.
Larry Karisny is the director of Project Safety.org, a smart-grid security consultant, writer and industry speaker focusing on security solutions for the smart grid and critical infrastructure.