Stupid Computer Security Myths, 'Dumb Ideas' Keep Enterprises at Risk

 
 
By Fahmida Y. Rashid  |  Posted 2011-11-13
 
 
 



Despite a growing awareness of security threats and basic security measures necessary to secure networks and data, many misconceptions and myths keep computer users and enterprises at risk, a security consultant said at a Kaspersky Lab event in New York City.

While organizations deploy firewalls and public-key cryptography and comply with various security and privacy regulations, many of them still cling to certain misperceptions, "falsehoods" and approaches that don't work, Charles Pfleeger, a security consultant and principal of the Pfleeger Consulting Group, said in a keynote speech on Nov. 10 at American Cup 2012, an educational event Kaspersky Lab held jointly with NYU-Polytechnic University in New York City. Some of the "dumb ideas" are myths held by nonsecurity professionals, and others are attitudes still circulating within the security community itself, Pfleeger said.

In 2005, Marcus Ranum, chief security officer at Tenable Network Security, published a list of the six "dumbest ideas in computer security," including the notions that hacking is cool and that merely patching flaws found in software products will make them secure. While many of Ranum's points remain valid, Pfleeger developed his own list of security mistakes that are made every day in organizations around the world. These mistakes are generally the result of ignorance and of the limited time available to address issues, he said.

"There are a lot of dumb ideas," Pfleeger said.

The first is the idea that organizations can retrofit security, Pfleeger said. Often, when an organization is acquiring another company or designing and developing a product and the security question comes up, the most common attitude is, "We will do security later," he said.

As systems grow increasingly complex, it becomes harder to retrofit software with effective security features, and even where it is possible, the result is never as effective as if security had been part of the product from the beginning, according to Pfleeger. He also said "penetrate and patch" doesn't work, echoing Ranum's earlier list, because it means organizations are merely plugging existing holes rather than addressing the underlying security issues.

He used a construction analogy, noting that if a contractor is asked to build a house but directed to add electricity only after the walls and roof are complete, it would still be possible, but it wouldn't be as effective and would require finished walls and ceilings to be broken open and refinished.

Microsoft had to do just that a few years ago: Bill Gates was able to say, "Stop everything. We are going to go back and put security in," Pfleeger said. Since then, Microsoft has adopted a development methodology that integrates security into every step and builds it in by design. Gates was able to go back and start afresh, but not everyone can do so, Pfleeger said.

Security, Privacy Should Never Be a Design Afterthought



Stephanie Balaouras, a principal analyst and research director at Forrester Research, put it a little differently at a recent press event, noting that no one designs an airplane without thinking about security at the start of the design process. "It sounds crazy to deploy and then think about security, but that's what is happening in many organizations," Balaouras told eWEEK.

The second erroneous perception, related to the first, is the idea that privacy can likewise be added back in afterward, Pfleeger said. Organizations are under pressure to get a service or product off the ground and get people interested in order to build buzz, he said. While he called out Facebook as one of the culprits of this kind of thinking, he said other social media sites and organizations are guilty of the same; Facebook is the poster child only because it happens to be one of the largest examples.

Many security professionals claim that encryption solves all security issues, but that expectation is "overrated," Pfleeger said. While protecting data is important, implementation problems in practice often leave data unprotected. Organizations also have difficulty managing keys effectively, storing them in insecure locations or losing track of them after essential employees leave the company.
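Pfleeger's point about key management can be illustrated with a short sketch. This is a hypothetical Python example, not anything shown at the talk: the environment-variable name and the provisioning step are illustrative assumptions. It contrasts a hard-coded key (one of the implementation flaws that leaves "encrypted" data effectively unprotected) with a key injected from outside the code.

```python
import os
import secrets

# Anti-pattern alluded to above: a key embedded in source code ships with
# every copy of the software and is effectively public.
HARDCODED_KEY = b"changeme"

def load_key(env_var: str = "APP_DATA_KEY") -> bytes:
    """Fetch the data-protection key from the environment (e.g. injected
    by a secrets manager at deploy time) instead of embedding it in code."""
    value = os.environ.get(env_var)
    if value is None:
        # Fail closed rather than silently falling back to a built-in key.
        raise RuntimeError(f"{env_var} not set; refusing to use a default key")
    return bytes.fromhex(value)

# Provisioning step (done once, outside the application): generate a fresh
# 256-bit key and store it where the deployment can retrieve it.
os.environ["APP_DATA_KEY"] = secrets.token_hex(32)

key = load_key()
assert len(key) == 32 and key != HARDCODED_KEY
```

The sketch only covers where the key comes from; a real deployment would also need key rotation and a record of which keys protect which data, which is exactly the bookkeeping Pfleeger says organizations struggle with.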

It's also very common to pick one product or technology and tout it as a cure-all, Pfleeger said. Antivirus, intrusion-prevention systems and network tools are all useful, but none of them can do it all, he said. Effective security tools tend to be highly specialized, which means no single one can be a "silver bullet" capable of handling all kinds of security threats. Organizations have different environments, risk levels and requirements, so different products will address different needs.

Many executives believe that security has to be perfect, "or it's not even worth talking about," Pfleeger said. This puts the security contractor in a quandary, because it isn't possible to counter all threats, but that isn't what the client wants to hear, he said. A related myth is the idea that security is easy and "we can do it ourselves."

Pfleeger used another building analogy, noting that he could probably do some aspects of construction, but he doesn't. He "lets people who have done it many times and know what they're doing" take care of the job.

To counter these misconceptions, Pfleeger recommended that IT and security professionals think like an attacker so they can learn about their systems and potential threats. They should recognize the limits of technology, work within those limits, and push back on the erroneous notions and myths about security whenever they come across them, he said.

 

