The next section in the CompTIA Security+ book is about attacks. It’s a fairly long chapter, and covers multiple arenas, so I’m breaking it into multiple posts, including this one about… social engineering! Yay!
I used to think that social engineering was a bunch of bullshit. I also used to be way more reliant on technical skills over people skills. I’ll just leave it at that. : )
This is a continuation of my blog post series on the CompTIA Security+ exam, where I share my studying and connect it to real-world events.
What is Social Engineering?
Social engineering is an attack against a user, and typically involves some form of social interaction. It takes advantage of the social nature of people’s relationships. Relationships don’t just cover friends, partners, and family. They also cover coworkers, acquaintances, strangers, and so on. How do you respond to someone who clearly needs your help and has their hands full? Do you open a door for them, or let the door close and have them “badge in” on their own?
Social engineering is about “manipulating a person and their actions by manipulating their perception of a situation.” This can be done by appealing to people’s willingness to help others. Alternatively, it can be done by creating a hostile situation and appealing to people’s desire to avoid conflict.
If you’re interested in social engineering, you can catch a SEVillage at a local security conference.
Social Engineering Methods
Pronounced “fishing”… phishing occurs when an attacker tries to obtain sensitive information from users by pretending to be a trusted entity. This can take the form of an email, text message, etc. Phishing often directs users to a reputable-looking (but fake) copy of a website. The user then enters their credentials, which are stolen by the attacker who controls the fake site. Phishing is often a large-scale attack that casts a wide net.
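The “reputable-looking fake website” trick often relies on lookalike domains (think paypa1.com vs. paypal.com). Here’s a minimal sketch of flagging those near-misses in Python; the trusted-domain list, similarity threshold, and function name are my own illustrative assumptions, not anything from the book:

```python
# Sketch: flag URLs whose domain is a near-miss of a trusted domain,
# a common phishing trick. TRUSTED and the 0.8 threshold are made up
# for illustration -- a real filter would be far more sophisticated.
from difflib import SequenceMatcher
from urllib.parse import urlparse

TRUSTED = {"paypal.com", "google.com", "mybank.com"}  # hypothetical allow-list

def looks_like_phish(url: str, threshold: float = 0.8) -> bool:
    """Return True if the URL's domain closely resembles, but is not,
    a trusted domain (e.g. 'paypa1.com' mimicking 'paypal.com')."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    if domain in TRUSTED:
        return False  # exact match: nothing suspicious here
    # "close but not equal" to any trusted domain is a red flag
    return any(
        SequenceMatcher(None, domain, trusted).ratio() >= threshold
        for trusted in TRUSTED
    )

print(looks_like_phish("https://paypa1.com/login"))  # near-miss of paypal.com
print(looks_like_phish("https://paypal.com/login"))  # exact trusted domain
```

Real phishing defenses layer many more signals (homoglyphs, new domain registrations, reputation feeds), but the core idea is the same: the link rarely goes exactly where it claims to.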
Spear phishing, on the other hand, targets a specific group of people. While there are fewer potential victims, this method may be more successful because it looks less suspicious.
Lastly, there’s whaling (they’re really taking this fishing metaphor all the way). Whaling refers to a phishing attack that specifically targets “high-value” persons, like CEOs.
Phishing has everyone on edge, including the DNC, who mistook a phishing test for a real attack. FireEye recently reported that one in every 100 emails is a phishing attempt. And earlier this week, small business service company Zoho was pulled offline by its domain registrar for failing to respond to phishing attacks carried out through its service.
Since we can’t stop with the stupid made-up words: vishing! Vishing is a type of phishing that uses voice communication. People are more trusting of a stranger over the phone than over email. Unfortunately, this trust can be exploited. It doesn’t help that attackers can spoof calls using Voice over IP technology.
There’s also smishing, which is like phishing and vishing, but over SMS. Make it stop.
Tailgating is a classic example of exploiting people’s politeness. It means following closely behind a person who has used their own access card to get into a room or building. What are they gonna do, slam the door in your face? Awkward. It’s even harder to say no if the person is already talking to you, or if their hands are full.
It’s incredibly hard to search for “tailgating” online in the fall and get meaningful results (even with my Google-fu).
Impersonation is, of course, pretending to be somebody else. In this case, we’re talking specifically about pretending to be someone known to the victim, like their boss or IT. There are different categories of impersonation, several of which are discussed below.
A recent study showed an 80% increase in email impersonation attacks (although it’s not clear which countries the study covers). Another news article cites 58%, and discusses impersonation attacks where the attacker pretends to be the victim’s boss.
A social engineer uses information about a workplace, and pretends to be someone plausible. The (contrived) situation is one where the attacker appears helpful, and/or has the blessing of someone in authority, and thus shouldn’t be challenged.
Help Desk and Tech Support
Help desk folks can get you back into your laptop after you’ve lost your password. That means they can also help attackers get into your laptop without your password. Social engineers can call IT, pretend to be an employee, and gain access. Likewise, they can pretend to be IT and get confidential information from employees (which could then be used to “prove” their identity to the actual IT folks).
Contractors and Outside Parties
Companies often contract out cleaning and other services. People wearing the right uniform often aren’t challenged, and pass through a building unnoticed. If your company allows Jimmy John’s delivery people in, does anyone give them a second thought? (Didn’t think so.) Ocean’s 8 has an example of this, where Rihanna’s character walks into the boardroom. No one questions a Black woman taking out the trash, and she uses that to her advantage.
Of course, all of this can happen online too. However, it’s less likely to be effective than an in-person social engineering attempt. This is because people are aware of online scams, and in-person interactions are a whole ‘nother beast.
You’ve probably heard of dumpster diving before. As it turns out, not everyone disposes of sensitive information properly. In most places, trash is not considered private property after it’s thrown out. This makes it a popular target for law enforcement, private investigators, and social engineering attackers. What could you find when dumpster diving? Personal information, IDs, passwords, company info (that would make the attacker seem more plausible), etc. They might also find hardware or other equipment that could be reverse engineered. This is apparently how some hackers were able to learn phreaking (by finding and testing old phone networking equipment).
While not a recent article (2009), this shows how attackers found laptops, personal information and checks from a bank’s dumpster. And while this article only briefly mentions dumpster diving, it’s a great read.
Hoaxes are, of course, stupid. They’re also unfortunate for security professionals when users are tricked into sabotaging their own security or equipment. An example might be fake IT advice instructing users to delete or modify files or settings. This might result in the user becoming more vulnerable to attacks, unwittingly sharing information, or creating a denial of service… on themselves. Oop.
Shoulder surfing is when the attacker watches the victim enter credentials, a keycode, etc. This can happen while the attacker is physically present and nearby, but it can also happen through security cameras, binoculars, and so on.
Google researchers have shared an AI tool that detects shoulder surfers. Another app, IllusionPin, obfuscates the “real” digits on your phone’s PIN pad to protect your PIN from onlookers. Lastly, hardware authentication methods like YubiKey let users sidestep shoulder surfing concerns entirely.
Watering Hole Attack
This type of attack is a bit different from the others. While the other attacks involve action directed at the victim, watering hole attacks involve infecting a website the target is known to visit with malware. When victims visit the site, their computers are infected as well. The compromised website might even deliver malware only to a specific set of users. In general, watering hole attacks are difficult to pull off and require sophisticated resources.
Earlier this year, a watering hole attack involving Adobe Flash was reported. Additionally, the FBI has reported that a North Korean hacking group targeted the Polish Financial Supervision Authority in a watering hole attack.
Social engineering leverages a desire to help and/or a desire to avoid confrontation. Social engineering principles also appear in other fields, like psychology and marketing. Understanding them will help prevent social engineering attacks, and might give you insight into other areas of your life, too.
The use, or even appearance, of authority can mean that a person “feels at risk in challenging someone over an issue.” This means that bad behavior might get a pass, which is bad news for your organization. Note: even proximity to authority can be useful. If a social engineer appears to be friends with the boss, who’s going to question their actions?
Intimidation can go hand-in-hand with authority. It can range from very subtle to very direct.
Consensus means a decision reached by a group. If a majority decision starts to form, it can be difficult for people to object, even if they think (or know) it’s wrong. This, and other group dynamics, can be manipulated for the social engineer’s purposes.
Scarcity plays on another human fear, one no doubt seated in some evolutionary mindset. If something appears to be in short supply, we value it more, and we fear losing it. This allows an attacker to force someone’s hand and push them into a hasty decision.
Familiarity leads people to “do things for people they like or feel connected to.” If you’ve ever met someone from your childhood town while in a distant city, you know the feeling. People look for connection with others. As with the other principles, it can be manipulated to create (misplaced) trust in the attacker.
Trust is defined as “having an understanding of how something will act under specific conditions.” I find this to be a very engineering-focused definition, but I like it. This section of the book has a great quote about social engineering in general, which I’ll include here:
The whole objective of social engineering is not to force people to do things they would not do, but rather to give them a pathway that leads them to feel they are doing the correct thing in the moment.
Lastly, social engineers can create a false sense of urgency to get a user to take shortcuts or make hasty decisions.
The best defense against social engineering requires training and consistency. Your employees have a better chance of preventing attacks if they are trained and aware of social engineering techniques. Likewise, company-wide training emphasizes that security is everyone’s responsibility.
Next, if everyone in the company is abiding by the rules, the culture changes. It won’t be considered offensive to enforce the rules. Instead, it will seem suspicious if someone gets upset about it. This helps defend against attacks that manipulate people’s politeness. It also makes people more likely to get a manager’s help if they’re being intimidated.
Additionally, if the rules apply to everyone equally, there are fewer loopholes to exploit.
And lastly: “trust but verify”. I’d argue that verifying implies lack of trust, but whatever.