7.1 Background


There are various definitions of social engineering. Here are a few:

The art and science of getting people to comply to your wishes. (Bernz, http://packetstorm.decepticons.org/docs/social-engineering/socialen.txt)

An outside hacker's use of psychological tricks on legitimate users of a computer system, in order to obtain information he needs to gain access to the system. (Palumbo, http://www.sans.org/infosecFAQ/social/social.htm)

...getting needed information (for example, a password) from a person rather than breaking into a system. (Berg, http://packetstorm.decepticons.org/docs/social-engineering/soc_eng2.html)

Sarah Granger, who compiled these definitions, states: "The one thing that everyone seems to agree upon is that social engineering is generally a hacker's clever manipulation of the natural human tendency to trust" (http://online.securityfocus.com/infocus/1527). The most important term here is natural. It implies that defending against a social engineering attack is similar to going against nature: it may be possible, but it is difficult.

Although perfect machine-level security is improbable (unless the system is turned off, cemented into a box, and locked in a room with armed guards), you can nevertheless get close by making a concerted effort. Unfortunately, sometimes security is achieved by sacrificing a substantial amount of functionality. Likewise, security is sometimes passed over in favor of higher functionality. This is especially likely to happen when proper risk assessment is not performed.

Every organization makes a decision on where to stand in the spectrum: either closer to perfect functionality (less security), or closer to perfect security (less functionality). Most companies implicitly choose functionality over security, for various reasons (such as pressure to deliver or lack of budget, knowledge, or personnel), and such unconsidered decisions can lead to security breaches. Unfortunately, with social engineering, you often do not have the opportunity to make a choice. Tight system security and user education offer surprisingly little protection against insidious wetware attacks. [1]

[1] The term wetware indicates the "software" running on a human computer (the brain) and the corresponding "hardware."

Corporate user education for social engineering usually consists of nothing more than an annual memo stating "Don't give your password to anyone." Unlike technical countermeasures, protection from human-based attacks is poorly developed and not widely deployed. One novel solution is to fight fire with fire; i.e., to proactively social-engineer people into compliance and a heightened defensive posture. Most security awareness training programs offered by companies can be categorized as social engineering of sorts, or as engineering policy compliance. Only time will tell if this solution proves effective by any measure. It is also possible that it will run counter to perceived civil liberties and freedoms. After all, the noble goal of policy compliance probably does not justify the "zombification" [2] of users. The issue is how far a company is willing to go in order to stop the attacks, and whether it cares about obtaining the willing support of the users. The opposite argument is valid as well: some believe that aware and supportive employees, trained to think before making a decision (such as whether to disclose data), are in fact more effective at stopping the attacks.

[2] The term zombification refers to zombies, those mythical undead creatures who act under the complete control of an evil magician.

Little can be done by traditional security measures to protect your network resources from advanced wetware attacks. No firewall, intrusion detection system, or security patch is going to do it. Nevertheless, there are some newer methods that may help: for example, penetration testing can be very effective if it includes mock wetware attacks.

7.1.1 Less Elite, More Effective

A human controls every computer system, and that human is often the weakest link in the information security chain. Since the golden age of hackers like Kevin Mitnick, stories of social engineering have enthralled the public. The targets of such attacks have ranged from an AOL newbie (in order to harvest a username and password) to an R&D department engineer (in order to harvest microprocessor schematics). For example, one CERT advisory [3] reports that attackers used instant messages to backdoor unsuspecting users with offers of free downloads, including music, pornography, and (ironically) antivirus software. The attack qualified as social engineering because the users themselves were engineered to download and run the malicious software: no computer system flaws were exploited.

[3] "Social Engineering Attacks via IRC and Instant Messaging." (http://www.cert.org/incident_notes/IN-2002-03.html)

7.1.2 Common Misconceptions

The myth about social engineering is that few people do it well. Unfortunately (or fortunately, depending upon which side you are on), it's not true. Another misconception is that being a social engineer is "evil." While social engineering comes with a stigma, having the skills of a social engineer is like possessing a vulnerability scanner: unless you use them for a crime, such skills are perfectly legal. In fact, social engineering attacks are highly valued as part of a complete penetration test. The Open Source Security Testing Methodology Manual (OSSTMM, available from http://www.OSSTMM.org) even contains guidelines for conducting social engineering testing as part of an audit.


Security Warrior
ISBN: 0596005458
Year: 2004
Pages: 211
