
who do we blame for the system's flaws?

Robert Giovagnoni
Giovagnoni is the Executive Vice-President for Strategic Relations at iDEFENSE, a private agency specializing in information intelligence.

When something comes down, when somebody is badly hurt, who ultimately is going to be held liable for that?

I don't know. And that's why liability is such a real concern today. If I were to break into your system and use that to go downstream to another system, there's no clear-cut law saying that there's liability on your part. You only have an obligation to protect the records for your client base, for your customers, and for your corporate owners. There's no real responsibility downstream, since you have not actively done anything. But that doesn't mean that, as the bar is raised and business practice says everybody should have a certain level of security, and you don't have it implemented on your system, there won't be an issue of liability tomorrow because you didn't have it in place. . . .

Knowing what the industry is doing allows you to address those liability issues, because if you're doing what the state of the art is, or what the rest of the industry is doing, you're meeting the legal standard of a reasonably prudent person, and there shouldn't be liability. . . . While there may not be liability today, if you act openly and with wanton disregard of things that you could implement down the road, you may find a judge that says, "Well, you should have done it."

So knowing what's going on . . . and knowing what everybody else is doing is really key to whether or not there's going to be liability for you today, and maybe liability for everyone else tomorrow. And we're talking big dollars here. Because in the loss of a system, if people are doing an internet business, one attack where they exploited your system could easily cost you $10 million or $30 million, if that loss can be proven and established. . . .

I do know that the courts are not really prepared to look at the damages issue, and how you define damages is unclear. But there are big numbers being thrown around. If I remember correctly, going back a year or two, when Kevin Mitnick was sentenced, Wired magazine ran an article pointing out that two major corporations said they lost literally millions--$20-plus million each--because of the actions he took. And if someone has to pay for that, the lawyers will find a way to come up with creative reasons why someone should pay. . . .

Richard Power
Editorial Director of the Computer Security Institute (CSI), San Francisco, CA, and author of Tangled Web: Tales of Digital Crime from the Shadows of Cyberspace (Que, 2000).

Whether I'm speaking as a person with just an internet account or somebody with a business, when the cyber goblin gets me, who should I be mad at? Should I be mad at the goblin? Should I be mad at the guy who sold me the software? Should I be mad at the government for not protecting me?

You might start with yourself in terms of how badly you were gouged. If you're doing your banking online, if you're doing your stock trading online, if you're buying a house or a car online, you might want to think a little bit about how you're doing it, why you're doing it, what the consequences are, how to monitor your online identity.

Leave a paper trail for yourself, leave back-ups of your activity for yourself, check things out, check your credit rating every few months to see if there's something strange on there. There's a whole range of activities that you now have to take part in, just like a homeowner has to have insurance, locks, fire alarms, and everything else for their house. You, as a citizen of cyberspace and somebody doing business out there, have to take some responsibility for your money and for what's happening.

Beyond that, you have to look at the merchants and the financial institutions that you're doing business with, and what responsibility they take for what is going on with your online activity, and the vendors of the software that are supposedly making it secure for you. . . .

So where does the big burden lie--on me, the user, or on the company that is selling me the tool?

Well, it's only been in the last few weeks that Visa International has issued a new set of regulations that merchants using its credit cards online must adhere to. And if you look at this set of new regulations, they are the most fundamental things about internet security: have a firewall in place, have the latest version of software in place, use encryption for any files that are accessible from the internet.

It's hard to believe that this basic level of internet security is what is being required of people now. . . . We're already tens of millions, billions of dollars into e-commerce, aren't we? This is the second or third Christmas where we're going to be talking about how much is being spent online. So there's some culpability there. There's some need for a more serious look. . . .
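To make the last of those measures concrete (encrypting files that are reachable from the internet), here is a minimal, illustrative sketch of encrypting a file at rest with a symmetric key, using Python's cryptography library. The filename and the key handling are assumptions made for the example, not anything Visa's rules or the interviews specify.

    # Illustrative only: encrypt a data file at rest so that a stolen copy of
    # the raw bytes is useless without the key. Requires the third-party
    # "cryptography" package (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, store this key apart from the data
    cipher = Fernet(key)

    with open("customer-records.csv", "rb") as f:   # hypothetical filename
        plaintext = f.read()

    with open("customer-records.csv.enc", "wb") as f:
        f.write(cipher.encrypt(plaintext))

    # With the same key, the original contents can be recovered later:
    with open("customer-records.csv.enc", "rb") as f:
        assert cipher.decrypt(f.read()) == plaintext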

How should we view this new private information technology security industry?

Law enforcement's role has never been to secure your business. Law enforcement isn't expected to put in your sprinkler system or your burglar alarm, or to make sure your doors are locked at night. Their job is to respond to your call when there's been a crime committed against you or your property. It's the fiduciary responsibility of those corporations to defend themselves and their customers and their clients against cyber attack. . . .

Bruce Schneier
Author of Applied Cryptography and Secrets and Lies: Digital Security in a Networked World.

What's the difference between computer security products and real world security products?

What's interesting about computer security products is they're often sold in ways you never see the real world products sold. You never go to a hardware store and buy a lock for your front door, and the lock says, as a slogan, "This lock prevents burglaries." You never see that. But in computer security you see it all the time. "This firewall prevents unauthorized network access. This encryption product prevents eavesdropping." And that difference is real important, because it's just not true. A firewall can't prevent unauthorized access. It can make it harder. It can, like a door lock, provide a measure of protection for your house. But it can't prevent the attack.

Of course, that will lead to a whole new law of liability.

That's right. It's odd, because you never see this in the real world either, right? You can imagine a builder of skyscrapers, after skyscraper 1.0 falls down, saying, "Oh, we're sorry, but the new skyscraper, version 1.1, will stay up, we promise." Right. That'll never happen, because there are liabilities. You can't build a skyscraper and have it fall down because you made a mistake. But in computer security, the vendors have no liability. They could build a computer security product, have it be completely broken, and there's no liability. That has to change.

Why is it this way?

It's that way because that's the history of computer licensing. Originally, computers and computer software were sold without liabilities. So adding liabilities is hard.

Martha Stansell-Gamm
Chief of the Justice Department's Computer Crime and Intellectual Property Section.

How much of the blame for vulnerability lies in the manufacturing of software, in the tendency to minimize security as a factor?

Well, I'm not in the blame business. I'd rather recast the question a little bit and say, "If we have opportunities for doing it better, where are they and what do they cost?" Writing software is hard, especially the kinds of software programs that we want to buy now. There are thousands and thousands of lines of programming code--probably more--and these software applications are interacting with operating system software, and so there are levels of application. How all of these fit together is tremendously complicated.

So, first of all, it's not an easy problem to solve. Second, to the extent that our software is vetted and perfect and bug-free, somebody is going to be paying for that. It makes the software more expensive. Is the public willing to pay for more expensive software if a greater part of the emphasis goes from designing the software to ensuring that there aren't unintended security consequences?

But isn't this one of those areas where people in the public sector shake the big stick and say, "Cost is the secondary consideration. You have to make it safer, and you have to pay more for it."

That's certainly one possibility, but it's probably one of last resort. There are some other ways that we have in our culture for straightening out relative liability and risk, and a lot of that is in private litigation.

You know, companies are perfectly able to sue manufacturers if they feel that they've been sold a product that's deficient in some way. And I'm not recommending that, of course. But they certainly know how to get recourse. There's also an insurance angle. As we become more understanding of the negative possibilities in these communication systems, I think a lot of companies are beginning to look to insure risks and liabilities. . . .

It seems to me that it's probably way too early in our understanding of the problem for government to come crashing in and say, "Okay, we know how this ought to operate. We're going to write the rules and we're going to tell you what all of this needs to look like." It's a little uncomfortable, but I think we need to live this out a little bit and find our answers. . . .

Robert Steele
Founder, President and Chief Executive Officer of Open Source Solutions, Inc. (OSS).

We have to protect critical infrastructures, but in a distributed computing environment, [that] is not something that can be done by a central agency. It has to be done by the individual proprietors of individual computers. That is essentially a three-part solution.

Part one is that the government has to legislate what constitutes "due diligence." Software has to meet certain standards of safety and stability and reliability and transparency. The second part is that government has to test and certify that software, so that as a commonwealth interest, software is validated by the government as meeting those standards.

But the third and most important part is that the proprietors of the computers themselves must live up to a new standard of responsibility. You can't leave your computer connected to the world and not have firewalls. You can't send documents without encryption or other protection and expect them to remain private. So we ourselves have a responsibility. But our responsibility, although the most important, is only the third step. The first two steps have to be taken by government and by the private sector.
