
Social Engineering: The dangers of positive thinking

CSO Online recently spoke to a person working in the security field with a rather unusual job: he's paid to break into places such as banks and research facilities, both private and government, to test their resistance to social engineering and physical attacks.

He is rarely caught, and even when he is, it doesn't matter. The reason for his success is the same in every case -- human nature.

Caught on film:

When the surveillance video starts playing, the images show a typical day at a bank somewhere in the world. Business is steady, but the lobby isn't overly packed with customers, so a single teller is working the window.

Soon, the bank supervisor walks to the left of the frame in greeting. Thirty-five seconds in, Jayson Street, the Infosec Ranger at Pwnie Express, a company that specializes in creating unique hacking tools for professionals, makes his first appearance.

Dressed in jeans, a DEF CON jacket and red ThunderCat high-tops, Street is taking a casual stroll behind the counter. Not only is he in the bank, he's in an area that's supposed to be secure and limited only to authorized personnel. Given the location of the bank, somewhere outside of the United States, Street is clearly not a local or a customer.

He's there to perform a penetration test; in this case he's testing both physical and network security, but the staff don't know this. A few seconds later, the supervisor is on screen, pointing to a computer that's currently being used by an employee.

Street nods in agreement, and moments later he's granted physical access to the system. He plugs a USB drive into the computer's front port and runs software, which requires the employee to stop working with a customer and give up his seat for a moment.

This level of access alone proves that the bank's security has been completely breached, and the fact that Street repeats his actions with several other computers compounds the problem.

He stands and starts moving down the teller line, accessing documents and items at each station, then takes a few photographs. Later in the video he is seen sitting next to a large pile of cash, spinning in his seat to draw attention to himself while the teller continues working.

Despite his actions so far, the staff at the bank seem at ease with his presence. Based on the video's timestamp, by the time he leaves Street has spent just over twenty minutes inside.

Why was he allowed such freedom? From the video's perspective, Street simply walked into the bank and went to work, as if he owned the place. This raises the question: how much of a role does confidence play in a social engineer's job?

"Ninety-nine-point-nine percent is looking like you know what you're doing. We think that there's all these techniques that social engineers are using, [but] it's not that the social engineers have all these wonderful mind powers or Jedi mind tricks," Street explained in an interview with CSO Online.

What about location? In a matter of moments, he was able to access the entire bank. In the video, the employees stand there and watch as he installs software and collects user IDs, passwords, and a smartcard (used for the teller's computer). Was his task made easier because he was a foreigner, and the staff were uncomfortable with the thought of being impolite?

"It's not a culture thing. It's not a country thing," Street said, pointing out that the human reactions are uniform for the most part no matter where he is in the world.

Positive thinking can be a problem:

He calls the process basic adorable destruction (BAD), because nothing he does on these jobs requires a high degree of technical sophistication. At most, he'll spend a few moments on Google before entering a job site.

In fact, the bank in the video wasn't the only one on this project. Street performed the same essential tasks at several banks, installing software and accessing restricted areas freely. In one case he even walked out with a working computer.

"Humans do not want to think about negative things happening to them," he said, as it goes against human nature to do so.

No one goes to work expecting something bad to happen, such as a random hacker coming in off the street and violating the company's security. Likewise, humans don't expect -- nor do they want to think about -- something bad happening to them personally. When such a situation arises, defusing it isn't as hard as one would think.

"If I can give them a reasonable explanation, besides the negative thing that sounds bad, they will believe the positive. They will go out of their way to believe the positive aspect, because otherwise they would have to think something bad was happening to them, and that's not something that humans like to acknowledge."

Describing a job in New York City, near Ground Zero, Street offered another example of how reasonable explanations allowed him to complete his tasks.

Painting a mental image of the area, Street starts his story with the buildings and physical security barriers that were already in place when he arrived. SWAT teams and K-9 units worked the concourse, in addition to eight security guards and other protective measures. His target was on the upper floors, but first Street needed to clear the lobby, where security had established a checkpoint similar to those used by the TSA.

Compared to the bank, this job should have been harder by several orders of magnitude. But it wasn't.

"I had to get through eight security guards that were in the elevator lobby, not the office lobby, but the elevator lobby," Street explained.

Equipped with a forged email, Street waited until the late afternoon to make his move, the time of day when there was a lot of foot traffic in the lobby.

Striking up a conversation with a security guard, Street eventually started talking with a person who was heading up to the target's office. The conversation gave the appearance that Street belonged; to an outsider, in this case the guards operating the checkpoint, he appeared to be accompanying the employee. That brief confusion allowed Street to obtain, from one of the guards, a security badge that included his name and picture. With that in hand, he was able to access the target's floor.

He moved through the office and installed malware on the CFO's assistant's computer. His actions drew the attention of a network administrator, who confronted him. The administrator told Street that he had noticed a spike in network traffic coming from the assistant's computer and had come to investigate.

"I gave him the forged email, that basically says that I'm supposed to be there doing a surprise inspection, because the owner's not happy -- creating all this confusion. He ended up walking me to every single other machine to install the rest of the malware," Street said, adding that the administrator assumed he was supposed to be there, due to the email and the security badge obtained in the lobby.

"The only thing worse than no security, is a false sense of security."

Avoid candy and smiling faces:

As kids, most people are taught about stranger danger, and Street thinks this idea should continue into adult life, especially where information security is concerned.

"Stranger danger isn't just for kids. We should never lose that. Stranger danger in your secured area is just as relevant if you're a child on a playground, or an employee in your workspace. If you don't know who this person is, find out who they are," Street explained.

The key thing that organizations can do to help protect both the company and its employees, he said, is to arm them with information that they can use.

For example, a phone number employees can call to report anything suspicious: an unusual email, someone walking around who seems out of place, unusual Internet activity, even something as simple as a business process being done differently than it should be. Again, the key is to empower employees and encourage them to call the number. Sure, false positives are certain, but those can be dealt with.

"It could be a thousand different spam emails that they have to respond to, but that thousand-and-one could be the HVAC email that took down Target. Those humans, those employees, are going to be the biggest intrusion detection system your company's going to have."

However, unless the employee IDS is tuned properly, it's no different than IT slapping a blinking box into a rack and walking away. While an un-tuned IDS will help check a box during a security audit, it isn't helping the company in the long run. Likewise, awareness training in general can help a company pass an audit, but unless that training is tuned and maintained, it isn't going to help either.

On its face, awareness seems like an easily obtained security metric, something simple to implement and manage, but it's not. That feeling of "stranger danger" doesn't exist for most adults. So where did it go? Why does society lose the "stranger danger" mentality in adulthood?

"We're supposed to be safe, because we have security controls. We are so busy thinking that our security controls are flawless, and it's our humans are flawed, that we let it slide that way. The problem is our security controls are flawed just as much as our humans are. You have to take in account that your security systems can be flawed, as well as your humans, and adjust for both of those," Street said.

The security industry, Street explains, has had a mentality for a very long time that "we needed to build walls."

The assumption is that if the walls are built high enough, or thick enough, that will be enough to offer solid security. But that level of thinking doesn't hold water anymore.

"We need to start understanding that our walls are never going to be high enough or thick enough. We need to start putting lookout towers on those walls. We have to start looking at people looking inside the walls for the breach. We need to start showing, not when the breach happens, but how quickly do we detect when the breach comes -- because it's going to come. So now, instead of trying to build a wall to withstand a breach, build a wall so it can easily detect when a breach occurs. And it's that response that's going to be critical, not trying to prevent a breach altogether."