The other thing that's different about these systems is that they are very specific systems designed to do very specific tasks in a specific time frame. I'll give you an example. In the gasoline industry, if you want to fill a tanker truck with gasoline, you do this automatically. You don't want gasoline from the storage bin starting to be delivered to that tank truck before it arrives underneath the pipe. Conversely, you don't want the truck driving away if it's still being filled or if it hasn't been filled. So these systems have been designed for specific timing.
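The timed handoff described above is essentially interlock logic. A minimal sketch, with hypothetical sensor names (`truck_in_position`, `fill_complete`) that are not from any real loading-rack system:

```python
def fill_valve_command(truck_in_position: bool, fill_complete: bool) -> bool:
    """Open the fill valve only while the truck is docked and not yet full."""
    return truck_in_position and not fill_complete

def truck_release_command(truck_in_position: bool, fill_complete: bool,
                          valve_open: bool) -> bool:
    """Release the truck only after filling has finished and the valve is shut."""
    return truck_in_position and fill_complete and not valve_open
```

The point of the sketch is the timing dependency: neither output is a function of one input alone, so the logic cannot simply run each task to completion the way an IT batch job would.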
In a traditional IT system, you don't do that. I do my first task until it's done. I do my next task until it's done. Because you have this difference -- and the difference is timing and interrupts -- the technology that's been developed for the control systems doesn't have the ability to use the IT technology. The IT technology never accounted for the fact, "Wait a minute, I have to stop now." The other thing about control systems that's different is that these are systems that use as small a processor as you can get away with. You don't get rid of it until it's just so obsolete you can't do anything more with it.
So we have an awful lot of systems that don't have the processing capability to use what happens in the IT world, which has the latest and greatest and large computing capability.
So in other words, what you're saying is it doesn't have the ability to stick in cryptographic sign-on abilities? It doesn't have the ability to stick in security?
You can't, and here's another issue that is really important. Security technology tends to close up the system. A control system is designed to be useful, efficient, interoperable. It has lots of different systems it's working with. What's ironic is securing this system directly impacts the ability to operate efficiently. So, for the first time, we're in a quandary. Can you make a system more secure? Yes, but at a cost. That cost is not just an economic cost. It's very much performance cost. You may not be able to do what that system is designed to do if you slow it down so much.
What's a control system? Where are they, and why are they important?
Control systems is a general term. They are the systems that essentially make the products we use. They are the automated systems that take an input. For example, you measure a temperature or a pressure or a voltage. It takes that input. It has a program inside to say, "I'm going to do something," whether it's to make a car or heat up a boiler. So it controls the signals going out to a pump, to a valve, to a press, to something that's physically doing something. The control system is essentially the brains. The neurons are the sensors that are giving the brains the wherewithal to take and do the actions.
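The sensor-brain-actuator loop just described can be sketched in a few lines. This is a toy on/off boiler controller under assumed names (`ControlSystem`, `setpoint`), not any vendor's product:

```python
class ControlSystem:
    """Toy 'brains': read a sensor, compare to a setpoint, drive an actuator."""

    def __init__(self, setpoint: float, deadband: float = 1.0):
        self.setpoint = setpoint
        self.deadband = deadband     # hysteresis band to avoid rapid cycling
        self.heater_on = False       # actuator state (e.g., a heater relay)

    def step(self, temperature: float) -> bool:
        """One scan: take the sensor input, apply the program, set the output."""
        if temperature < self.setpoint - self.deadband:
            self.heater_on = True    # too cold: energize the heater
        elif temperature > self.setpoint + self.deadband:
            self.heater_on = False   # too hot: de-energize it
        return self.heater_on
```

A real controller runs `step` on a fixed scan cycle against live sensor inputs; the structure -- measure, decide, actuate -- is the same.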
Who uses them, and where are they used?
There are a number of types of control systems. There is one term that's bandied about frequently, and it's the term SCADA. SCADA is a control system, but it's only one type of control system. It stands for Supervisory Control and Data Acquisition. It acts as a data gatherer where you have lots and lots of data. It's used where you have very dispersed facilities, like electric wires that run hundreds of miles, or you have gas pipelines, again, running hundreds of miles, or you have canals, water systems. What you're doing with a SCADA system, predominantly, is gathering lots of data, and you're doing what's called some supervisory control. So SCADAs are normally used in very dispersed applications.
A distributed control system (DCS) is another type of control system. It's used in places like power plants, refineries, chemical plants, large facilities; fairly confined. But the big thing with a distributed control system is it's used where you've got to do lots of complex calculations, and you're doing both what's called regulatory control as well as supervisory control. The regulatory control is the minute-by-minute changing of the process. The supervisory control is the traffic cop on top, checking to see where all of these other things are coming from. The big thing about a DCS is it's there to do lots of very complex calculations.
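The regulatory control described above -- the minute-by-minute adjustment of the process -- is commonly implemented as a feedback loop such as a PI controller, with the supervisory layer sitting on top and changing the setpoint. A minimal sketch; the class name, gains, and setpoint values are illustrative assumptions, not any vendor's algorithm:

```python
class PIController:
    """Regulatory control: drive a measurement toward a setpoint each cycle.

    Supervisory control would sit above this loop, adjusting `setpoint`.
    """

    def __init__(self, kp: float, ki: float, setpoint: float):
        self.kp, self.ki = kp, ki
        self.setpoint = setpoint
        self.integral = 0.0

    def update(self, measurement: float, dt: float = 1.0) -> float:
        """Return the actuator output for one control interval."""
        error = self.setpoint - measurement
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral
```

A DCS runs hundreds or thousands of loops like this concurrently, which is why the raw computational load matters.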
Now, another type of control system is the programmable logic controller, PLC. The PLC started off in life to do what was called discrete rather than continuous type of things. You could look at it as a PLC was there to make cookies, batches of things. So you could program it to say, "Make chocolate chip cookies for a while." Then change it, "Now go make butterscotch cookies." So the PLC was going to be doing reasonably complex calculations, but it was going to be done in a discrete manner.
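The discrete, batch-oriented behavior of a PLC can be sketched as a recipe of ordered steps. The recipe and step names below are invented for illustration:

```python
# Hypothetical batch recipes: each is a discrete sequence of steps.
RECIPES = {
    "chocolate_chip": ["mix_dough", "add_chips", "form", "bake_12min"],
    "butterscotch":   ["mix_dough", "add_butterscotch", "form", "bake_10min"],
}

def run_batch(recipe_name: str, batches: int = 1) -> list:
    """Execute a recipe's steps in order, once per batch, and log each step."""
    log = []
    for _ in range(batches):
        for step in RECIPES[recipe_name]:
            log.append(step)  # a real PLC would drive motors/valves here
    return log
```

Switching from chocolate chip to butterscotch is just a reprogramming step -- the discrete-batch character the interviewee describes -- rather than a continuous process that never stops.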
Those are probably the three primary [types]. Now other types of control systems have evolved. What has evolved over time is, we've gone from the mainframe type computer -- which is essentially where the DCS evolved from, and the SCADA to some sense also evolved from -- and it's moving down toward essentially what a PLC is now. The PLC is moving down to what a PC is now.
So as the computing technology with the microprocessors gets to be more and more powerful, a lot of the distinctions between these systems are blurring. What isn't blurring as much is the fact that, when you have the very, very dispersed assets, you're still more concerned with data concentration.
What does a SCADA system look like?
What you just asked is really two terms. SCADA is one thing. When you talk about a SCADA system, you're now talking about all of the different sensors and pieces that make it up. The SCADA itself is potentially, for example, either a series of PCs or superminis or whatever. In fact, one of the issues that becomes -- or confusions, if you will -- is somebody could look at a SCADA screen and say, "Gee, that looks just like my business system." It's what's behind or that feeds into that box that makes a difference.
I'll just use a small utility. A small utility may have several hundred miles of electric lines and probably 10 to 15 substations. In these substations, you will have transformers, and you will have protective relays. Protective relays are essentially the breakers you have in your house.
What the SCADA system is doing is monitoring and controlling the protective relays and the substations. It's monitoring the current and the voltages on the wires. It's monitoring the frequencies. What it's trying to do is provide all of the key electric data that the control center needs to efficiently and economically maintain the grid.
So [how many] sensors? I mean, dozens, hundreds, thousands?
Again, the bigger the utility, the more it moves to thousands and tens of thousands.
[Which] used to, by the way, be controlled manually?
Many of them were controlled manually, and to start with, they were electromechanical. They were mechanical switches. In order to become more efficient, with the microprocessor, we've been able to go from the mechanical switch that needed to be nursed and cared for by people in the field constantly, to an electronic device that could be monitored from afar, much more reliable, and to be able to be programmed. Efficiency-wise, it's been wonderful. There is a flip side, which is the security of it.
How big a problem is that?
These are the systems that are designed to be open, interoperable, and to let the appropriate people use them. The problem we have is in the design and in the implementation. We haven't always put the right box around appropriate people. As it's designed, "appropriate" can be essentially anyone.
These are closed systems, right?
No, they're not. All control systems -- and I do use that term "all" -- when they were designed, they were designed to be closed systems. With the advent of the PC and the Internet and all these wonderful new technologies, we've been able to move, at least where the operator sits and types on his console, from a proprietary system to a commercial off-the-shelf system. However, back where you do your control -- at the front-end process -- that is still viewed as a closed system. That, at least, is the design premise.
The problem that creeps in, though, is that this system that sits out here can be accessed via a number of different means -- not only via the Internet. It can also be accessed from different types of communication media. These communication media themselves are not secured. They were never meant to be secured.
What does that mean?
For example, just using a microwave system -- we don't use encrypted microwave. The government may use encrypted microwave for satellites. It's not what's used in the commercial world. So these systems that are out there have access points.
Additionally, in order to be efficiently maintained by vendors, these systems have a remote dial-in capability. These systems are not closed, isolated systems anymore. Trying to go back and close them off would lead to some substantial efficiency and resource hits, because if you can't do a lot of this remotely, that means you have to man them. That's expensive.
And over the years, that's the opposite direction we were going?
That is exactly the opposite direction.
One of the things people say about electrical grids is that they all deal with them differently. There is different software, different operating systems and everything else, [so] it makes it very difficult to wreak any havoc with them. Is that the case?
First of all, you were talking operating systems. That has a unique meaning, which I don't think is what you're really asking. At least, I'm going to try to answer what I thought you asked. There are only a limited number of suppliers who supply the SCADA and energy management systems that control the grid, not just in North America, but throughout the world. A very limited number, and those vendors have a limited number of software or software revisions.
Additionally, the grid is an interconnected grid, so these systems need to be able to talk to each other. But a part of the vulnerability is the fact that there are so few suppliers. There is the commonality within the systems; there's also a commonality in the communication handshakes, or protocol between the systems.
As an industry, we have worked harder to be able to have these open communication protocols. We're to the point where we fit the proverbial, "Be careful what you ask for; you just may get it."
How does that leave the system vulnerable?
It leaves it vulnerable because, in order to have open systems, you need these systems to be able to talk to each other. But you also need to make sure that they have as much commonality as possible, and that design did not consider security. It was designed to be a functionally efficient operating system, not a secure one.
With cyber that allows you to enter multiple points at the same time, I believe that the vulnerability is there on a grander scale. If you only were trying to address one substation at a time, or even one control center at a time, you would be talking strictly in very local rather than much more regional or greater [terms]. But one of the unique -- I want to say aspects, or concerns -- about cyber is that you can attack or address multiple systems at multiple locations, either simultaneously, or very close to simultaneously.
If intrusions into systems occur, do we have the technologies to recognize them?
Our control centers have, in almost all cases, firewalls, intrusion detection, demilitarized zones -- everything you could put around them to secure them. The same cannot be said for power plants or substations. What that means is that if there are probes going on, or potential attacks occurring, and they're targeting a control center, we would at least know it. Hopefully we'd be able to defeat it, and because you have the logging capability, you'd also be able to track, trend, and hopefully do some forensics in terms of what just occurred. The same can't be said for the other systems where control systems really reside.
And they are accessible?
Yes. That is a general yes.
What's the worst case? For instance, let's talk about a specific. What's the worst case of an intrusion that's happened that you know of at this point that's done damage?
It wasn't in the electric industry. It occurred in Australia. It's very public.
This was a sewage plant. You had a person who worked for a SCADA company, who was fired and became disgruntled. While he was at the SCADA company, it provided the SCADA to the local water/wastewater company. I am told -- but I can't verify this -- that he tried to get a job with the local water/wastewater company and was turned down. So he went to the local electronics store and got a radio transmitter.
Because he helped design the system -- this is where I mentioned before, where the definition of an insider becomes very different -- because this person knew the system, he effectively was an insider. Even though he didn't work for the SCADA company or the water wastewater company anymore, he was still an insider.
He got a radio transmitter, drove to where they had a sewage discharge valve, which happened to be on the grounds of a local hotel. Using a transmitter, he opened the valve remotely, and caused a sewage discharge.
What was interesting to me, from everything I've been able to gather, they finally caught him the 46th time he did it. What I was also told was, the first 20 times, they didn't even know why the valve kept opening. From 20 to 45, they were looking for him. They caught him on the 46th.
That has some really important ramifications. The first is, it's saying that if you use cyber, it's not intuitively obvious that it was cyber that caused something. The valve opened. The first 20 times, they tried to figure out why the valve was opening. They didn't ask, "Who's attacking it?"
To me, the other important ramification was that even after they realized it was cyber, it still took them another 25 times to catch him. So I'm just saying, this knowledge -- is it as widespread as the knowledge you talk about with the Internet? No, it's much, much smaller. But this knowledge -- the people who work in this industry essentially have it.
But if you've got knowledge of how the systems work and you've got the access through the Internet and you know how to use the Internet to get into the systems, you got a problem.
Yes.
How big a problem? What's the worst-case scenario?
Don't know. And part of the reason we don't know is there have been multiple vulnerability assessments in many industries to determine the vulnerability of these systems. Now take the Australian case and say that person had instead tried to get a job with an electric utility, was turned down, and did the same thing to that electric utility -- which he could have done, I think. That would not be a trivial case. Could he have? Of course he could have. It's the same issue.
So he could have shut down the power in Australia?
He could have certainly shut down the power in that local area. Depending on what he wanted to do and his knowledge, he possibly could have done more.
But some are on the other side of this issue, saying, "It's not a concern, this example doesn't hold water. It's an insider. No bad guys could get this access, nor the knowledge to do this."
Let me counter them, and this is what I think is terribly important. We have a limited number of suppliers of distributed control systems, SCADA systems, and programmable logic controllers. These systems are sold not only in North America; they are sold all over the world. A number of these suppliers are not American companies. They can sell these same systems to countries that we probably wish they hadn't sold them to. These systems, when they sell them, are the same systems we have here. And when you sell a system, along with that system goes training -- the same training they get there that you get here.
Additionally, I'll take it a number of steps further. We have international standards bodies that work on the standards for the equipment design, the equipment use, the communications. An international standards body basically means anybody can participate, because these standards bodies essentially work by individual, not necessarily by company. So anybody can.
There are numerous SCADA listservs that are specific to SCADA, that are specific to other types of control systems. I know firsthand that you have people on these listservs asking all kinds of questions that could be serious at times. But when a question is asked, you get an awful lot of people wanting to very graciously provide answers.
So if somebody out there wanted to do harm to us, the information's available?
Yes. Well, I'll give you a parallel to what occurred after 9/11, when we looked at who was taking flight training at the flight training schools. Well, there are an awful lot of people -- not just control systems suppliers, but system integrators -- that offer courses in how to use control systems. You don't have to be an owner of a company or a utility person or a refinery person or anybody else to take these courses; you just have to pay. You could very easily just be somebody who's going to be a contract engineer to do it.
What's the worst-case scenario?
Don't know. The ability to get unauthorized access to these systems is well proven. I won't say "well documented," because this is not something you're going to pick up a magazine and say, "Here it is," but it's well proven.
With cyber, what makes the answer to your question so difficult, is what is it you want done? If I get in, what is it that I want to do? It's like in the IT world. If I get root access, if I become the administrator and could do whatever I want to that IT system, what could I do? I can't give you an answer, because it depends on what it is you want to do.
Tell me, first of all, your thoughts on Eligible Receiver. What was it, as far as you know?
I am a commercial engineer. I don't have a security clearance. I haven't seen a report. I know of parts. All I can say is, knowing the vulnerabilities of our systems, could this be done? Yes, it possibly could be done.
So therefore -- the fact that the NSA said they did it -- within the industry, how is it viewed, what was done, and is it believed by people within the industry? Is it believed by control system experts like yourself, that what they did would have had the results that they said?
Couple of answers. One is, I believe most people in our industry [were] probably not aware of it. Number two, there are very few people, truly, who really understand control systems and cyber, which is a very important piece that's missing, and it's missing in the whole national strategy. Consequently, I would say that most people in our industry would not think it's possible to do, because they don't understand what the true vulnerabilities really are. That goes back to the fact that the awareness of cyber security of control systems is still, in my opinion, abysmally low.
I've also been told that the SCADA systems -- if people don't know what's happening, you can start controlling flow one way or the other.
That is correct.
You can create a situation where you're causing a problem which grows and is a cascading effect. System after system, if they don't understand what is happening, will go down, and you could bring down a grid. Explain that to me.
The first thing is, there are numerous, numerous indicators in a SCADA or control room scenario. To make this happen, you would have to be able to effectively change an awful lot of indicators, so that somebody doesn't see things that don't make sense. That said, there have been demonstrations, be it tabletop [exercises], showing how you can change what an operator sees, so that what an operator sees is not what's really there. That can be done.
You can change the actual setting of devices and have the operator not be aware that those settings have changed. That can be done. It has been demonstrated.
So he thinks he's sending out 25 volts?
Or he thinks a breaker is open or a breaker is closed, or vice versa. Now, before this gets too contentious -- again, there are going to be lots and lots of indicators. What we've done, at a very, very cursory level, is demonstrate some of the things you can do.
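The display-spoofing idea discussed above -- a device's actual state changing while the operator's screen keeps showing the old value -- can be illustrated with a toy model. `Breaker` and its fields are invented for illustration only, not drawn from any real SCADA product:

```python
class Breaker:
    """Toy model of a device state versus what the operator's HMI displays."""

    def __init__(self):
        self.closed = True            # actual device state
        self.displayed_closed = True  # what the operator's screen shows

    def honest_trip(self):
        """Normal operation: the display tracks reality."""
        self.closed = False
        self.displayed_closed = False

    def spoofed_trip(self):
        """Attack scenario: the device trips, but the display is frozen."""
        self.closed = False
        # The displayed value is deliberately left unchanged, so the
        # operator still sees the breaker as closed.
```

The gap between `closed` and `displayed_closed` after a spoofed trip is exactly the condition the demonstrations showed: the operator acts on stale information.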
What is so important is DOE has the national SCADA test bed at the Idaho National Engineering Lab. We very much need to use that facility to really understand what can happen, what technology we need to develop, how we need to go about this, so we can really take this through and really, truly understand what can happen and maybe what the limitations are.
Are there things that hopefully may preclude some of these worst-case scenarios? Or are the systems that naive? I don't want to call them dumb, because these are very smart systems. But are they that naive, that if you do try to take them over, they're not smart enough to recognize wrong things are happening?
We need to know and understand this, and this is as much of a plea as anything. We've got to get the government to recognize and prioritize and fund this at the level necessary. Because, I will say this: Essentially all of the IT technology that's been developed -- all the tens of millions, or hundreds of millions, of dollars to develop firewalls, intrusion detection, encryption and authentication -- all of that needs to be done for control systems. That won't get done for $3 million. The money that's been allocated from Congress, the $900 million to do cyber security -- that money is not going to secure a control system.
Is there a problem that control system experts have not been brought into the debate over the security issues that the government is dealing with now?
I am ostensibly a member of the R&D [group] of the Partnership for Critical Infrastructure Security, PCIS. Key word: critical infrastructure. I don't believe -- or at least there wasn't as of a year ago -- a single supplier of control systems was a member of the Partnership for Critical Infrastructure Security. Not one.
I can tell you that, when I'm at conferences -- like when Gartner has their SECTOR 5, which is on critical infrastructure -- I was on a panel last year. For the electric side, there were maybe, I don't know, a hundred people in the room. Being a bit flip, I pointed at the lights and asked how many people thought that was their product. Of the maybe hundred people there, two were not IT. If this is critical infrastructure and IT doesn't have responsibility for any of this, why isn't somebody else there? The discussion on cyber vulnerabilities from day one has been led by IT people and others who don't know how control systems work.
Does Washington get this at all?
I don't think so. I don't think so. I just had a discussion with a couple people in the Defense Department, and it bothers me that it was news to them. There is an abysmal lack of knowledge about cyber security of control systems -- not just in our industry, but in all industries, and even much more in Washington.
In fact, I'll give you another concrete example. I was asked by Senator Bennett's staffer to review an article that was put in the Library of Congress, a Congressional Research Service report from Aug. 12 -- fairly recent -- and it was essentially -- I'm trying to go from memory -- about remote control systems that threaten the critical infrastructure.
Here was this article in a report that was prepared for Senate and congressional staffers. The report was about as technically flawed as it could possibly be. They defined a remote control system as any system where the display was remotely operated. That's a client server. Since when is a client server a control system? That's not what it was supposed to be; that was their definition.
What's more, in this report that was prepared for Congress, when they started talking about SCADA systems, they didn't even identify that the electric industry uses SCADA systems. Not only did I comment on that report, I have a vetting group that consists of people at NSA; NIST, the National Institute of Standards and Technology; a couple of DOD organizations; a number of the national labs; and a number of people in the industry. I had them review my comments, because my comments on what was in that report were so brutal. And they all concurred.
Have you ever been in a meeting or anywhere else where you lay out your views, and you're amazed about the response?
Well, the first time I ever testified, I had no idea what to expect. What I found was that, in this particular congressional meeting, my information was of some relevance, but what was more important was the discussion on confidentiality and freedom of information. So even though I was asked questions -- even about nuclear power companies -- I figured my answers would prompt some further discussion.
The bulk of the meeting revolved around Freedom of Information [Act], or release from Freedom of Information, because we need to share information. We absolutely need to. Can it be shared easily within the Freedom of Information Act? Not really. So that discussion was ongoing, and mine was just kind of, "Thanks, it was interesting. We're done." ...
Is part of the problem in Washington the fact that they feel this is private sector's problem?
I don't know. I gave two -- I don't know if you want to call them seminars, tutorials, whatever -- to the National Defense University. One of the things that came out is the government uses SCADA systems -- actually uses it for some of the DOD applications. I had one person come up to me afterwards and say, "This is what the SCADA system is and what the vulnerabilities are?"
In a sense, it's a bad analogy, but I hark back to Y2K. It took us a long, long while to get people to understand that Y2K wasn't only a COBOL problem associated with mainframes. Cyber security is still viewed very much as an eBay, Amazon, Internet, IT problem.
I was at the Gartner conference. You had all of this discussion on all of this wonderful technology to be able to log and analyze all of this intrusion detection data, and how wonderful it was. All I could do was stand up and say, "We don't even have intrusion detection. What are you talking about?" Especially considering this was a meeting on the critical infrastructure of electric power, something is missing.
Nuclear power plants. They're safe, right? They're offline, they're secure.
No. They have the same systems. When they started, they were just like any other power plant, in the sense that they were isolated. There are two aspects of the nuclear power plant: One is the nuclear safety side of the power plant; the other is the balance-of-plant portion of the power plant. You can put new networkable technology into the balance-of-plant side of a nuclear power plant, and it is happening. What's more, you can connect that information back to the corporate network. You can have vendors connecting in to these networks. Not only can you, it is happening.
What is the balance-of-plant side?
That is where you take the output coming out of the nuclear reactor. The systems that are there for the shutdown are separate. The systems that take the hot water or steam that comes from the nuclear reactor and use it to turn the turbines -- that's called the balance of plant.
So that's the generating system?
The generating system.
So you can't hack into that?
It is possible.
And if you did, what's the worst-case scenario?
Don't know.
What can you imagine?
I've been thinking about it. I'd just as soon keep it at that.
But the bottom line on it is, are our nuclear power plants safe?
Let me give you an example. Several years ago, a white-hat hacker named Mudge testified to Congress that he was able to hack into the corporate data network of every nuclear utility in the United States. That was approximately two or three years ago. While I was still at EPRI, he talked to our system computer security group about what he did. When he did that at that point in time, it didn't bother me, because that was strictly the corporate side, and I was under the impression, I think as most people are, that there wasn't a connection between the corporate side and even the balance of plant. That's not true anymore.
So does this raise the bar here to making it an issue of national security?
It can, in the sense that, number one, nuclear plants are very large generators of power. They are equivalent many times to two or three or four fossil units. So number one, if you lose a nuclear plant, that's a large chunk of generation. In some parts of the country, it's a very critical part of that generation mix. Could that have an issue? Sure. The optics of having a nuclear plant go down is not a good thing.
But could one also potentially get into a system and cause a meltdown? I don't believe that.
So it's more whether in fact one could mess with a system so it basically could cause enough damage to shut it down?
Either in the plant or in the switchyard.
So just to put it all in perspective: What's the worst-case power scenario -- power lines, power grid?
Absolute worst? I won't even say absolute, but a very worst case could be loss of power for six months or more.
Over how big an area?
Big as you want.
Is that a possibility?
Yes.
How?
I'd just as soon not go into it.
But you believe, as an expert, a man who understands these systems, that indeed that is a possibility?
It's possible.
Why isn't Washington quaking in its shoes?
I can't tell you. I don't know. I don't know.
Have you gone to Washington and raised your hand and said, "Folks, this is a problem?"
I have tried -- again, I have to be careful how I say this. When I was at EPRI, I was not supposed to have communication with those people. I did, because I thought it was important, and in the end, it didn't help my career. I have contacted any number of Senate and congressional staffers that I thought were relevant. I've tried to talk to a few. I've invited them to our conferences, to not only see what was happening, but talk to people and gain an understanding of what they knew and felt.
The other thing that I think is important, too, is to realize how big our industry is and how few people attend or are even aware of this. I was just at a conference two days ago in Houston. We had a security session for the generators. We had two senior vice presidents of some of the largest insurance reinsurers in the country, another consultant and myself, and I think there were 20 people there.
A lack of concern?
A lack of understanding. Lack of concern happens when you understand it and then decide it's not important. We're at the point where I don't believe most people understand what can happen.
You're talking about in the industry?
Yes, oh, absolutely.
So how is Washington supposed to get it?
I don't know. I jokingly refer to the fact that I'm 2,415 United [Airlines] miles from Washington, so my knowledge of Washington and how it works is limited.
How are you viewed in all this?
Depending on who you talk to, I believe I'm viewed as a technically competent person. I believe there are a number of them that view me as very much of a loose cannon -- part of the reason I'm not at EPRI anymore. I believe I still have a very substantial group of utilities, vendors, etc., that believe in me and agree with me, which is why I'm still under contract to DOE and NIST. We've had the most successful cyber security conferences of anybody, because I know these people who have relevance and are willing to participate. I can tell you that there are a number of people that really, truly, honestly wish I would go away, including on at least one of the industry groups I'm on.
Give me your impressions of Dick Clarke and the work he was doing.
I truly believe Dick felt this was very important. Dick tried very hard to get industry to believe it. In a funny sense, I think the problem was that one of the things Dick kept trying to talk about was a cyber Pearl Harbor.
The one difference I had with Dick is the cyber Pearl Harbor implied if we got hit, everybody would know it. My very, very, very strong feeling is, if and when we get hit, we will never know why we were hit. All we will know is breakers are opening, valves are closing, certain things are happening. But we won't have a clue as to why.
I'll give you an example. This is not a cyber attack, but just an example. I believe it was July 1999. There was a pipe break in Bellingham, Washington. A backhoe was digging, hit a gasoline line, broke the line, spilled a couple hundred thousand gallons of gasoline in a creek, caught fire and killed, I think, maybe about three people. I remember even seeing it on the news. As an industry, I'm not trying to belittle the industry; those things happen. OK, happens all the time. We haven't marked things well enough. They should be using better spatial technology.
It was either late November or early December 2002. The National Transportation Safety Board issued a final report on the pipe break in Bellingham, Washington. Turns out the backhoe didn't break the line, the backhoe weakened the line. There was a gas SCADA there. The gas SCADA had about 18 to 20 minutes to take action to keep that line from breaking. It didn't.
This wasn't a hack. Here was a clear case where a control system played a part in a major catastrophe. For whatever reason -- I cannot tell you why -- for two and a half years, the industry was kept oblivious of the fact that a SCADA could have played a part. You want to talk about information sharing? How can we, as an industry, do anything, when information like that is available and we're not even made aware of it? Like I say, it was not a hack. But it's obvious the control system was involved.
Did we learn lessons from the big power outages in the 1960s and 1970s -- let's say, the big New York power outage? A lot of people sort of say we had these huge power outages, and there were some cascading effects.
Absolutely, we did.
Enough? Did we learn enough?
But let me pose it a different way. This is the problem I have -- it's still a problem in the industry. What we have done is we have been absolutely focused on how to respond to natural disasters. We have not been nearly as good at focusing on what you do about manmade disasters.
There were many very, very good things in terms of guidelines, procedures, equipment, research, that came out of -- part of what came out of the New York outage was EPRI. I believe it was 1998 where we had the western power grid problem. What never came out of any of these, though, was what could happen with an intentional case or an intentional exacerbation of any of this. And taking it a step further, even to respond to all of these, it's been, how can we improve the reliability? Most people assume if you say reliability, you also take into account security. No. It was purely and only reliability. Security is simply not given a consideration. I'm talking cyber.
One of the things that was told to me is that, if you take out a big enough piece of the grid, what might happen is, through cascading effects or whatever, you go down, and we don't know if we can black start the damn thing.
Well, number one, the national labs have been doing an awful lot of work on interdependency studies. There's an awful lot of work going on regarding black start capabilities. Now, again, depending on where you are, we've made certain assumptions for black start. It could be we could defeat ourselves, because what we needed for black start was also taken out.
But I still go back to the fact that what we haven't really done is assume what happens if it was intentional or, as we're trying to go back, something keeps exacerbating the problem. That hasn't been addressed, at least on the commercial side. I can't address what's been done on the classified government side.
And through the manipulation of flow, and through trying to control these control systems, might one be able to burn out generators and equipment which would take years to replace?
A concern. There is work going on now to understand where our critical transformers are, where the replacements are, how we can get -- because these are long lead items, so if they're not direct replacements, how we can get at least short-term replacements? But the point being, if you take those out, that becomes a major long-term issue. Can we be vulnerable? Yes. ...
Viruses. We've talked about viruses with people, and these are people who say viruses are not going to do something like hurt an infrastructure, like the electrical infrastructure. Is that the case?
That's what we thought. It turns out, in a funny sense, it's in between. What we did find is that it did hit a number of electrical and water control facilities. The reason I say it's in between -- the Slammer worm itself did not shut the power off. What the Slammer worm did was essentially shut off the control system. That's why I said it's in between. If there would have been a problem with trying to control substations or other facilities, we would have had a problem. It did impact it; it impacted it indirectly.
So it didn't turn the power off?
It didn't turn the power off. What it did is it cut off communications to those systems. The ones I knew about were impacted because the Slammer worm impacted the telecom provider or hit a router, and basically used up the bandwidth on the router and the control system shared that. So in this case, the control system was the unintended consequence of the Slammer worm.
If it really didn't turn the power off, if it was fixable, and if it wasn't an unintended consequence, why is it important to understand?
It's important to understand because it's another point. This, number one, starts to show us that the control systems are not that far, in terms of isolation, from other systems. Again, we're kind of viewing it as these worms and viruses that come and go aren't going to bother control systems. Well, they can.
The other thing is, again -- this is from what I read and hear -- Slammer was simply a precursor of something bigger. Again, that's just what you read in the press and what you hear. But what you had was a number of, not just electric utilities and water utilities, but other industrial facilities also had control systems impacted for on the order of hours.
So what has to be done to secure these systems? Can anything be done?
Yes, there are several things that can be done. ... The first thing that needs to be done is we need to develop security policies and procedures to protect these systems, because the IT security policies and procedures out there were never designed for these systems. ... The euphemism is try and at least close some of the doors and windows.
A second thing is do vulnerability assessments, get a better idea of where you are, really learn what you really have, because invariably, whether it's done just by an analysis or penetration test, whatever, in almost all cases people have been surprised to find out what they really had and didn't recognize. They thought they were a lot more secure, a lot more isolated than they really were.
The longer term -- this is where I go back to the fact with the national SCADA test, that we've got to be able to develop technology. The first thing we need to do is to understand, what technology do we need? The IT mindset of always needing encryption and all the rest for a control system -- that may or may not be what we need. We really need to be able to understand what it is that we need, and then develop it. That needs to be done, number one, with the existing systems out there; then, number two, with what's being designed -- what's on the drawing table, as they say -- now.
Can the systems be taken offline completely? Just take them off completely from any potential wireless, any potential Internet access?
Are you asking, can you manually run these systems? Well, we looked at can you manually run the systems way back during Y2K, because the question was raised there. These systems are highly, highly dependent on computers. What's more, the people that really have understood these systems both at the utilities, at the vendors, etcetera, the experienced people, aren't there anymore. It would be rather difficult to try to manually run these systems for an extended period of time without some real bumps in the system. I wouldn't say impossible, but it sure wouldn't be the system that you have today.
In a time of financial constraint within the industry, is this a problem that the industry can deal with, or is willing to deal with?
There's a couple of things. One is, right after 9/11, FERC initially gave -- and I don't know what the status of this is -- dispensation for utilities to put security changes in the rate base. There's a lot of, I guess, efforts going on now as we see it.
If something is mandated by regulation, utilities have to do it. You know, call it what you will, it is a regulated industry. If regulations say you shall or you go to jail or you can't be part of the grid, you will do it. It's that simple.
But are regulations coming down the pike?
No.
Why not?
I don't know.
Are they necessary?
Yes. I just don't believe it will happen without it. With Y2K, it wasn't regulation; it was liability. Without an appropriate driver, I believe a lot won't do it. And not only that, without that driver, I'm not sure that the vendors themselves will build the systems that we need either. It's a chicken-and-egg situation.
By vendors, you mean?
The people that build the control systems.
And they operate the software that controls them?
Yes.
There is something the government is doing right now -- the Giganopr [a government term meaning a notice of proposed rulemaking] -- to nationalize the grid. Would this magnify our vulnerability?
Yes. I happen to be on the NERC committee. Within that FERC Giganopr, there's also a page dealing with cyber security, which is the part I've been working on. The more you centralize control, from an efficiency perspective, the better it is. From a security perspective, the more vulnerable it is. That, though, has not been my particular focus. My particular focus in the Giganopr has been the fact that there is going to be a minimum requirement for cyber security for the grid. ... The way that Giganopr is set up, it will only address the control centers, not power plants or substations.
Well, shouldn't FERC get this?
We've had meetings there. I've made my feelings known within the NERC critical infrastructure protection advisory committee, where I originally prepared the wording for what we needed to do for the control systems, which was removed. I can't tell you anything further.
This is something going to happen soon, right? This is a notice of proposed rule making?
This rulemaking will take place -- the term "substantial compliance" is there -- January 1, 2004. Full compliance January 1, 2005.
And what will it do?
That's a good question. It will help to some degree, it really will. On the other hand, because I am focused on the control systems -- which is part of what certain people will tell you is a problem with me -- I believe this will do more harm than good. Because what they have done is they have removed control systems. They've removed substations and power plants from that rule, which means that the substation people and power plant people can basically say, "I don't need to worry." And the control systems suppliers will say, "I don't need to supply a secured control system; there is no requirement."
Let's talk about the probing. There is sophisticated probing going on that a lot of people talk about. What's that mean, and does that scare you when you've heard about it? What are your first impressions?
It's misleading. It's misleading, because most of our substations and power plants have no firewalls or intrusion detection. We wouldn't know if anybody's probing. What you're reading about is what they're finding on the business side of the electric utility. Now, that may mean there's a lot happening. I have no clue. All I'm saying is, what you're hearing about has nothing to do with what's happening on the operational side of the utility.
Because we don't know what's happening?
Because we don't know.
But nothing's ever happened. Nothing bad has happened.
Again, I go back to the fact that we haven't had extended outages. The only thing I can say is, if there is an outage, what we do for root cause is we go and look and mechanically and electrically analyze what's happening. We have no logs to refer to, to find out if anybody's done anything from a cyber perspective.
I can also add one other thing. I was on a panel discussion when the IT security manager of a very, very large, electric utility gave three slides. Those three slides were why even if the company were hit by cyber, the industry would never know it. The first was it really didn't have the technology there to detect it. The second is they didn't have the expertise to do the forensics to really determine what it really was. Now, again, I'm not talking on the IT side. This is on the control systems side. The third is, the last thing you'd want to do to either Wall Street or to the hacker community is identify the fact that you've been hit.
So you think in fact we've been hit potentially, but there's just no way to absolutely know it?
I know from creating an informal database that there have been a number of times, both intentionally and unintentionally, that our control systems have been impacted. We have had facilities down for several days because of, in some cases, inadvertent denial of service of the control system. There is at least one case at a paper mill where, during a union situation, it got shut down because somebody went in and changed all the passwords.
Again, maybe that could suggest that certain systems could be affected. But this wild scheme of system after system being affected and warning systems being taken down so that you could really manipulate-- There's no real sort of proof that anything like that has ever been contemplated or could possibly take place?
I can't in any way, shape or form address whether it's been contemplated. The only thing I can tell you is, here is what the systems are, and where they're vulnerable. That's all I can say. ...
One last area is software. How vulnerable are these systems and the infrastructure system, in general, due to the fact of the software that they use?
The basic fact is unless you were to do almost a line-by-line review and a comprehensive test from essentially a cyber security perspective, what would you know? One other concern we have with the UCITA, the Uniform Computer Information Transactions Act, is it says that a vendor can put in a trap door or a trojan and not have to tell you. That's only a law, apparently, in Maryland and Virginia. There is a tendency, if you get software from a vendor and if you think it's a credible vendor, to accept it at face value.
What effect does this reality have on critical infrastructures, though -- that one never quite knows what's in your software?
We do very comprehensive factory testing of our systems. But I'm also going to put in a caveat. The comprehensive factory testing we do is to assure ourselves that the system performs as it was designed to perform and that it meets the purchase specifications. If there is some form of a software trojan or something else that is not part of the design checkout, you wouldn't know. Now, in nuclear, you're supposed to do a line-by-line verification of all code. But if you look at fossil or other, there isn't that requirement.
The question that is brought up often, shouldn't you be forced to protect your systems better, because this is a national security issue? And the easy thing for many people to say is, "Well, it's not us. It's the system we buy. It's the software that we buy, it's the Microsofts of the world, it's the SCADA system makers," or whatever. "They're the ones that should be regulated, and they in fact should be regulated."
I don't think so, but I'll add one other thing. With control systems, vulnerabilities can occur beyond just the software. So even if you have that part taken care of, there are other vulnerable aspects. Obviously you've got Microsoft issues, but that's something different and beyond. But if you had all of your software absolutely secured, there are still other things. Remote access is one of the biggest, biggest vulnerabilities we have, and that has nothing to do with software.
In summary, the key points that you think it's essential for the public to understand?
This debate [on vulnerabilities] needs expertise from the people who know these systems. This debate has been going on for people who know policy and people who know IT and security, and they are trying to use terms or certain things out of context. I think it's just critical that people [who] know, understand and operate these systems be part of that dialogue.
One further thing. I think it's absolutely critical that as we train this generation and the next generation of cyber sleuths and that whole thing, that control systems be a part of that study and that discipline, because it isn't right now.