I decided to write this article after watching an early morning news show concerning homeland security during which a congressman stated that the Federal government must urgently fund new scientific research projects at our leading universities. He referred to the wealth of American talent available at our colleges to tackle the variety of problems faced in providing adequate protection of our homeland against terrorist threats.
My initial reaction was, “Typical political solution—fund new programs!” After a second of further consideration, I accepted the validity of the congressman’s comments, but remained concerned and disappointed that he didn’t even mention the more obvious and immediate solution: engage the energy, talent, and creativity of leading technology companies, large and small.
What Really Needs to be Done
A key component of any Homeland Security system must be the consistent and automatic monitoring and control of critical infrastructure such as power and natural gas distribution. One of the technologies needed to monitor and control critical infrastructure is called supervisory control and data acquisition, or SCADA, a system of monitoring and controlling equipment linked over long distances to a central computer. SCADA systems have been used for 30 years on pipelines, electric utility transmission systems, and similar infrastructure. However, to allow homeland security personnel to fulfill their mission, these existing systems must be integrated into a single centralized system.
The recent blackout in the northeastern U.S. might have been averted or at least minimized if the data available from each of the dozens of SCADA systems currently used to control power generation and transmission had been available at a critical infrastructure-monitoring center.
How We Do It
Over the past several years Internet technologies have enabled a new breed of SCADA systems to evolve. These open Internet-based SCADA systems provide access to real-time data from existing remote equipment and SCADA systems, therefore allowing the interconnection of these systems and equipment into a single integrated critical infrastructure monitoring system. Internet-based SCADA makes this possible by using standards such as XML for data formatting, SQL databases for storage, and Web browsers for presentation, thus eliminating proprietary data formats and host software. It also eliminates or minimizes the cost and complexity of long distance communications because each piece of remote equipment is connected to a local Internet Service Provider (ISP) or private wide area network.
A critical infrastructure SCADA system must provide three key functions:
- Collect data from and transmit control commands to existing SCADA servers,
- Collect data from and transmit control commands to new and existing remote equipment such as recloser controllers and surveillance cameras, and
- Provide access to aggregated data in a manner that allows rapid decision-making.
Figure 1 shows how the three categories of existing SCADA infrastructure (i.e. existing SCADA systems, standalone field equipment, and surveillance cameras) can be connected and integrated into an Internet SCADA system that meets all three functional requirements.
Dealing with Legacy Technology
Aggregating the necessary operational data into a nationwide critical infrastructure monitoring system poses some challenges. SCADA systems currently used by each operating entity (i.e. a power generator or a transmission and distribution company) are generally proprietary systems designed and built for the specific purpose of monitoring and controlling only the connected assets, without consideration of sharing the data with other systems. These are not open systems of the type we now demand of ecommerce systems and, as such, they impose significant obstacles to any attempt at sharing data with other systems: limited connectivity options, proprietary data formats, no facilities for data export or import, etc.
In cases where proprietary SCADA host software is Windows-based, it may be possible, by working with the software vendor, to provide open, standardized data export; however, the longstanding proprietary nature of these systems makes this solution unlikely. The alternative is to install an Internet gateway that interfaces to the host software’s proprietary data export utility (supported by most SCADA software). The Internet gateway converts the proprietary data format to an Internet standard such as XML and pushes the data to the Internet SCADA servers.
An Internet gateway may also be used to enable field equipment to communicate directly with the Internet SCADA servers. Once installed, it communicates with the equipment in the equipment’s native protocol and converts the data to XML format, and then transmits the data to the Internet SCADA servers.
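As a concrete but purely illustrative sketch of this gateway pattern, the fragment below converts a hypothetical recloser reading into XML and prepares an HTTPS push to the Internet SCADA servers. The tag names, device identifier, host, and path are assumptions for illustration, not part of any published SCADA standard:

```python
# Sketch of an Internet gateway: take a reading obtained in a device's
# native protocol, wrap it in an XML tag set, and POST it over HTTPS.
import http.client
import xml.etree.ElementTree as ET

def reading_to_xml(device_id, tag, value, units):
    """Wrap one field reading in a hypothetical XML tag set."""
    root = ET.Element("scadaData")
    point = ET.SubElement(root, "point", deviceId=device_id)
    ET.SubElement(point, "tag").text = tag
    ET.SubElement(point, "value").text = str(value)
    ET.SubElement(point, "units").text = units
    return ET.tostring(root, encoding="unicode")

def push_reading(host, path, xml_payload):
    """Push the XML payload to the SCADA server over HTTPS (firewall-friendly)."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    conn.request("POST", path, body=xml_payload,
                 headers={"Content-Type": "application/xml"})
    return conn.getresponse().status

payload = reading_to_xml("recloser-17", "lineCurrent", 412.5, "A")
print(payload)
# push_reading("scada.example.net", "/ingest", payload)  # requires a live server
```

The same `reading_to_xml` step could equally run inside the equipment's own micro-controller where one exists, with only the native-protocol conversion differing per device.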
Another major issue is system security, which in this context means assurance that SCADA data is always available, is not tampered with, and is accessible to only authorized users. The open nature of the Internet requires careful consideration of data security measures when implementing Internet SCADA systems. A determined attacker must not be able to affect the availability of the system or the integrity or confidentiality of the data. Processes, procedures, and tools must be put in place to address availability, integrity, confidentiality, and protection against unauthorized users.
Availability: System up time must be maintained at the highest levels through use of redundant servers. Firewall protection must be provided in the Internet gateway and servers along with automated monitoring to detect DNS attacks.
Integrity: System must ensure data is not modified or corrupted through use of encrypted data signatures, authentication to restrict access, etc.
Confidentiality: System must ensure restricted access to data through use of encryption, and to the system by employing authentication over Secure Sockets Layer (SSL).
Protection against unauthorized users: Multi-layered password protection must be provided at all levels in the system.
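One way the integrity requirement above can be met is with keyed-hash signatures on each payload, so the servers can detect tampering in transit. A minimal sketch, assuming a pre-shared per-gateway key; the specific scheme and key handling are illustrative assumptions, not something the article prescribes:

```python
# Illustrative integrity check: sign each XML payload with an HMAC so the
# receiving server can reject any payload modified in transit.
import hmac
import hashlib

SHARED_KEY = b"per-gateway secret provisioned out of band"  # assumption

def sign(payload: bytes) -> str:
    """Produce a keyed-hash signature for a payload."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Constant-time check that the payload matches its signature."""
    return hmac.compare_digest(sign(payload), signature)

payload = b"<scadaData><value>412.5</value></scadaData>"
sig = sign(payload)
assert verify(payload, sig)                      # untouched data passes
assert not verify(payload + b" tampered", sig)   # modified data is rejected
```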
The Internet gateway must support Internet protocols and services (the only internationally recognized and supported network standards): an IP address and at least part of the TCP/IP stack, typically HTTPS, TCP, UDP, and PPP. Once connected to the Internet, the Internet gateway pushes data to the Internet SCADA servers. In cases where the equipment incorporates an electronic controller, it may be possible to simply add similar functionality to the existing micro-controller.
Using an Internet gateway (or embedded protocols and services when available) therefore permits the integration of data from disparate SCADA host software and remote equipment into an Internet SCADA server system based on open standards such as SQL databases and the XML data format. Once the data is available in the Internet SCADA system, it may be accessed from any standard web browser in any location.
The open architecture of an Internet SCADA system combined with appropriate field equipment makes it possible to develop a highly integrated nationwide centralized system. Integration of thousands of individual pieces of equipment and systems in a way that assures integrity requires standardization of data format and transmission protocol.
The preferred data format is Extensible Markup Language (XML). XML was developed to bring greater flexibility and interoperability to web applications. It is a meta-language for describing markup languages and therefore does not specify semantics or a tag set; in other words, XML provides a facility to define tags and structure. XML provides flexibility not available from HTML because the programmer has the freedom to create tag sets and semantics. The simpler alternative, HTML, has undergone continuous development to support new tags and style sheets, but these changes are limited by the requirement to remain backwards compatible and by what the browser vendors are willing to support.
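Because XML leaves the tag set to the designer, a critical infrastructure system can define tags that match its own domain. The snippet below shows one hypothetical tag set for a monitored point and how a receiving server might parse it; none of these tags come from a published schema:

```python
# A hypothetical SCADA tag set, illustrating how XML lets the system
# designer define domain-specific tags, and how a server extracts fields.
import xml.etree.ElementTree as ET

doc = """<scadaData>
  <point deviceId="substation-4/recloser-17">
    <tag>lineCurrent</tag>
    <value>412.5</value>
    <units>A</units>
    <timestamp>2003-11-01T06:15:00Z</timestamp>
  </point>
</scadaData>"""

root = ET.fromstring(doc)
point = root.find("point")
print(point.get("deviceId"), point.findtext("tag"), point.findtext("value"))
```

Unlike HTML, where the tag vocabulary is fixed by browser vendors, every element here was invented to fit the application.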
The preferred data transmission protocol is HTTPS because it is firewall friendly and allows web servers to be used to control data transmission. The alternatives, raw TCP or UDP connections, require the cooperation of third-party IT departments to open ports on servers and thereby introduce potential for cyber attack.
Scaling an Internet SCADA system from a few to hundreds of thousands of assets while maintaining near real-time performance requires a system architecture that enables data to be pushed from the remote equipment without host system polls. Internet protocols and services are ideal for a system architecture of this type. The scalability of commercially available databases and server hardware has been proven in thousands of ecommerce applications.
This approach has already been implemented in systems supporting simultaneous 20-second updates from 3000 devices.
The techniques, software, hardware and networks required to accomplish the integration described above are all widely used on the public Internet and are low in cost and high in reliability. The only SCADA-specific hardware, the Internet gateway, is available from multiple vendors at a price of $300 or less.
Once operational data has been aggregated and integrated into the Internet SCADA servers, it becomes relatively simple to develop Homeland Security applications in a way that is impossible with today’s proprietary systems. Through the use of proven data mining and analysis techniques, Internet SCADA can produce highly valuable information such as system-wide trending, failure predictions, and system condition reports.
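As a toy example of such analysis, once readings sit in a SQL store a system-wide trend reduces to a simple aggregate query. The table layout and values below are invented for illustration:

```python
# Sketch of system-wide trending over aggregated SCADA data: average line
# current per hour across all devices, computed with one SQL aggregate.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (device TEXT, hour INTEGER, amps REAL)")
db.executemany("INSERT INTO readings VALUES (?, ?, ?)", [
    ("recloser-17", 6, 410.0), ("recloser-17", 7, 460.0),
    ("recloser-22", 6, 390.0), ("recloser-22", 7, 455.0),
])

# Average current per hour, system-wide; a rising average is a trend signal.
trend = db.execute(
    "SELECT hour, AVG(amps) FROM readings GROUP BY hour ORDER BY hour"
).fetchall()
print(trend)  # → [(6, 400.0), (7, 457.5)]
```

The same query pattern extends to condition reports or failure prediction inputs once richer fields (temperature, fault counts, etc.) are aggregated.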
Internet SCADA is currently in use in oil and gas, electric utility, and government systems supporting real-time data acquisition, remote control, surveillance, and customized applications, so let’s worry less about new R&D funds for universities and put existing technology to work securing our homeland.
| Thomas Tanton
| Overall, there are some good thoughts and arguments presented. I do, however, have concerns about the approach presented given the interdependencies of various critical infrastructures (in this case there are two main ones: the telecommunications network that enables the Internet, and the electric grid). If the Internet goes down, the grid goes down; and if the grid goes down, the Internet-based control system goes down. And using an essentially open system like the Internet simply invites hackers and other troublemakers. Already, some control rooms are vulnerable to invasion from hackers with PCS cell phones. The other concern is the author's suggestion of "a critical infrastructure-monitoring center": once something as important as this is centralized, it becomes more vulnerable, not less. A better approach, I think, is to have multiple, diverse, and distributed intelligent agents (no, not intelligence agents): software algorithms and the like that can learn, anticipate, and react to protect the grid and other critical infrastructures.
| Patrick C Miller
| Without getting into the Open Source (or any *IX-based OS) vs. Microsoft debate, I believe that the use of "open" systems will invite hackers and malicious code. That doesn't mean these platforms shouldn't be used for critical infrastructures such as Energy/Gas/Water/etc. It does mean that when used, they should be appropriately secured - which can be a reality. If you are going to use "open" systems, hire a Security Staff - period - and not just one overtasked technician who wears the security hat. Be prepared to hire a few qualified people to cover the different environments and a manager to own the responsibility.
The next issue is whether or not to connect these systems to the Internet. To date, the Internet isn't the most reliable form of connectivity. It requires a significant expense to maintain highly available and redundant connections - especially if you cover large geographical areas. In many cases, it just isn't worth the money to do this. Do some extensive research and planning to determine whether or not you can afford the types of connectivity that come with the "high availability" metrics. Also, consider that when you begin down this path, there is the inevitable disconnection scenario. Be sure that you have appropriate mechanisms in place to keep databases and other systems-of-record in synch while disconnected - which can add to the already exorbitant cost of the "high availability" environment.
Lastly, just because the vendors are offering "open" systems that connect to the Internet with many new services and data input/output mechanisms doesn't mean you should rush out to buy and use these devices. This is really the first wave of this type of technology and it should be considered "beta" for the most part. No vendor has established a strong and secure track record with this technology yet. Exert strong pressure on your vendors to provide you with secure devices that have been tested and proven secure. Currently, since there isn't an ICSA or other process to "certify" these devices, your best bet is going with the INEEL or NIST lab results and buying what has been tested - and deemed secure.
As a security consultant, I see too many companies falling for "security snake oil" from vendors and the media in general. Keep it simple. Complexity breeds insecurity. Integrating these systems and providing an interface to the Internet is asking for trouble. If you don't believe me, keep asking around or hire a "Tiger Team" to show you. If you plan on doing this, also plan on hiring the appropriate staff to provide the necessary protection.
| George Kamburoff
| Excellent article and subsequent arguments are presented here.
Although this is primarily a technical topic, there are other considerations, as well.
Centralization of control is a bad idea, in the present political climate. We in California saw what happened when politics are a major consideration in centralized control. The new, politicized FERC, with two of its Commissioners picked by Ken Lay, set the stage for the Great Electricity Fraud, and refused to stop the looting of California by Enron, Reliant, Calpine, El Paso Gas, and others.
| Ravinder Singh
| Dear Mr. Donald, the Internet is merely an information tool - service manuals / AR are now available in PDF form, but it has not prevented companies from crashing or made them more profitable. Obviously it has made the exchange of information faster, but its impact is limited. Most decisions continue to be made through meetings and debates. The Internet itself is vulnerable to attacks. Autos may have computers to monitor and control many functions, but steering wheels will continue to be in human hands.
| Len Gould
| I agree with the thrust of the article. It is high time that regional grid managers are given better tools than were apparently available in Ohio in August. That being said:
I see no magic in the internet as a means of connection. The internet is simply one method of delivering packets of data from one point to another. It has definite weaknesses from a control point of view, providing neither a guaranteed time of delivery nor assurance that a particular packet will be or has been delivered. If safe dispatching of a generator in one state depends on first commanding some VARs from a generator in the next state, it is important to know that the one command has been acted on before issuing the other.
I think a better solution would be a dedicated high-bandwidth link loosely based on the internet, with a protocol that handles such issues. The new open FieldBus standard becoming accepted in the petrochemical industry might make a better base on which to build, though I'm not sure whether even it is designed to deal with the distances. If not, it could surely be uprated for it. See http://www.fieldbus.org/. From there, if/when safe and necessary, very well secured internet connections could be provided to the repository servers, along with a means of securely commanding a subset of possible actions with or without the internet.
The main benefit of the internet, which is very cheap connections of points, is not likely worth the risk for large critical assets, though it would/might make sense for e.g. residential meter interaction and possible shedding commands. Most interesting data from large assets is already being collected by some means to points where the cost of a data conversion/transmission engine would not be astronomical. Collection / interpretation / action from there up is what is required.