How To Defend A Bank, Pt. II: Right of Bang

As much as we can imagine what it’s like to be a defender in a cyber-conflict, we don’t really know what it is unless we’re in those shoes at the time it’s happening. That’s what simulations are for.

Hosted By

Ran Levi

Born in Israel in 1975, Ran studied Electrical Engineering at the Technion Institute of Technology, and worked as an electronics engineer and programmer for several High Tech companies in Israel.
In 2007, he created the popular Israeli podcast, Making History, with over 14 million downloads as of Oct. 2019.
Author of 3 books (all in Hebrew): Perpetuum Mobile: About the history of Perpetual Motion Machines; The Little University of Science: A book about all of Science (well, the important bits, anyway) in bite-sized chunks; Battle of Minds: About the history of computer malware.

Special Guest

Sam Curry

Chief Security Officer at Cybereason

Experienced Senior Security Executive with a demonstrated history of working in the computer and network security industry: product, engineering, security experience. Extensive publications and patents, big company and entrepreneurial track record. Multiple awards from industry, public sector and academic institutions. Personal mission to fulfill the obligation of security to the world.

How To Defend A Bank, Pt. II: Right of Bang

[Note: This episode, as well as our Pt. I episode on How to Defend a Bank, are based on an article written by our Senior Producer, Nate Nelson, for the website Etekly.com.]

In Part I of our episodes on “How To Defend A Bank,” I introduced some of the reasons why it’s so difficult for us, on the outside, to make sense of just how complicated it is to protect a financial institution in today’s world.

[Sam Curry] My name is Sam Curry. I am Chief Security Officer for Cybereason.

Sam used to work in this field, which is why he’s helping us again in this second episode.

[Sam] One of my first principles in security is that as much as we can imagine what it’s like to be a defender in a cyber-conflict, we don’t know what it is, even if we’ve been through it somewhere else, unless we’re in those shoes at the time it’s happening.

BANE & OX

Connor Jones was working in the Human Resources department at Bane & Ox, a Fortune 100 financial services company, when he received an ominous call. In an article for ITPro he wrote:

“The first call came through, loudly and abruptly. Out of the six teams tasked with protecting the firm, we had to take the first call – we had to set the example. My teammate Jeremy and I looked at each other with dread – neither of us wanted to take that call. We just stared at each other, both saying nothing but our facial expressions were communicating just fine.
“You take it.”
“No, you take it,” we both said silently. The phone rang longer than a phone usually rings [. . .]”

Jones and his colleague were responsible for gathering and disseminating information to other departments and teams in the company. Even before answering the phone, they knew that danger loomed ahead. His partner picked it up. A journalist was on the other end of the line.

“[She] claimed to have just been told that data belonging to millions of our customers was sprawled across the internet. She asked us for comment, as journalists tend to do. In a visible panic, spluttering and choking, Jeremy blurted out, ‘No!,’ before he’d even verified the information was legitimate.”

After the phone call ended, an eerie calm set over the room. Jones and his colleague were packed in with two dozen other people, sitting at computer monitors while waiting for the inevitable. The room was dark: shiny black ceilings and floors accentuated by cool blue LED lights, giant flat-screen HDTVs covering seemingly every wall, their futuristic-looking displays showing data feeds and visualizations atop a world map. It had the look of a movie theater or a spaceship. All the pretty screens weren’t of any use yet, though. They just had to wait.

And then, everything broke open.

“It wasn’t just one loud phone ring that filled the room with its obnoxious noise, more came in, each more unwelcome than the last. Being part of HR, I was supposed to liaise with these teams and find out what they knew, but the flurry of activity was so distracting that I could barely collect my thoughts, let alone follow a company-wide communication strategy.

The individual teams were tasked with working together to combine what we knew into a coherent picture. We had all received phone calls from different external sources and we were supposed to analyze those at pace while communicating it to other departments. The big picture was that the initial data leak had led to a much more serious cyberattack, of which we were in the midst.”

Bane and Ox had suffered a large-scale data breach. Financial records and information on millions of customers were dumped online. Social media was alight. In the cacophony of incoming calls, tweets, new information, and shouts among team members, it was difficult to piece together a coherent story.
Specific details were scarce, but information sourced from another team revealed that hackers had managed to cripple company systems, even leading to some employees becoming trapped in a lift. Reporters were also at the front doors of the company’s headquarters, demanding answers to the rumors.

“Our IT team crumbled under pressure. The sheer number of status reports coming in, some of which provided conflicting information, meant that many teams started to ignore calls that came in, in an attempt to avoid making the situation any worse. It was simply impossible to know what information was true.”

Jones and his colleagues knew they were going to face a major data breach, but they hadn’t quite prepared for it. Containing an event of this scale was hardly something they could plan out in advance. So they tried their best, while frantically sweeping up the mess.

The chaos stopped when news broke that the press had run with the story. Bane and Ox had failed to contain their problem, and now the public had full knowledge of their spectacular failure.

That’s when the exercise directors stepped in.

SIMS & RED TEAMING

If you were to step just outside Bane and Ox’s cybersecurity war room, you’d land on pavement, or maybe grass. Actually, you wouldn’t have been inside a Bane and Ox facility in the first place. They’re a made-up bank. Jones is a journalist. His role in HR was merely assigned to him for the duration of the cybersecurity exercise that he, and other U.K. journalists, participated in during January of 2019.

The 18-wheeler truck which housed that simulation is called the Cyber Tactical Operations Center (C-TOC). The C-TOC was developed by IBM in collaboration with the U.S. Air Force, as part of their X-Force threat intelligence unit. It’s probably one of the sexiest, most high-tech commercial vehicles in the world. It’s also among the most expensive: the facility it was modeled after, based in Cambridge, Massachusetts, cost around 200 million dollars to construct. C-TOC weighs about 23 tons, and it has retractable wings on either side which expand to make room when parked. It travels around the world, teaching and scaring security professionals, students and journalists along its path.

Cyberattack simulations were originally inspired by military wargaming. They come in two forms: those carried out in simulated environments, and those carried out on real-world systems. C-TOC is an example of the former: it’s what we call a “cyber range” (like a firing range, but for digital weapons). The latter is “red teaming”. In American military wargaming, the U.S. is always the blue team, and the red team is always the bad guys–hence the name. Sam Curry.

“[Sam] So red teaming, I love the colors in my industry, right? We have white hats, black hats and gray hats and then red teams and blue teams. I’m going to introduce some more too, like purple teams. The whole purpose of red teaming is to make the blue team stronger. I’ve seen some toxic situations where management asked the question, “Is security doing its job? How do I know?” and they think the red team, if it’s successful, proves that the defense, the blue team and security program are incompetent. That is not the case.

I sometimes draw an analogy to a basketball game. If you see on ESPN news or something, you see a clip of a basketball game, it’s just one basket and you freeze frame it there. You don’t listen to the rest of the report and you say, OK, that guy just scored a basket. Now you say, “Who won the game?” You don’t have enough info. You’ve got 2 points, maybe 3 out of perhaps 100 to 200 points in the game. You don’t have enough data. Then you say, “So based on just this play, who’s the season champion?” You also don’t know what happened to the other teams or how they went on. You just don’t have enough in that – up to 24 seconds to make that play. So red teaming for me is best when it is done frequently and often and it’s not used to bash the blue team.”

To carry out a red team exercise, an organization must designate a group of capable people – either employees or, more often, a third-party group – to play hackers. The red team must come up with a way to reach their target, and the blue team must stop them. Red team exercises are executed on real-life IT systems. A red-teamer might breach the most critical center of a company’s actual IT infrastructure. Instead of crippling it, they note their results and what a malicious hacker could’ve done.

Alternatively, red-teamers might hack their target’s systems with a specially-crafted malware: one with all the spreading power of a real malicious program, but none of the damaging payload. Red team exercises are most effective when kept a secret from the security teams responsible for defending against them.
However, most corporations can’t afford to shut down their systems even temporarily. This is especially true in critical industries such as electricity and water. So, some corporations use creative constraints when devising exercises. What distinguishes cyber ranges from red team exercises is the nature of those constraints. In a cyber range exercise, a sandbox environment mimics an organization’s real-life systems. C-TOC is an entirely fictional reality that nonetheless allows participants to experience what it would be like to battle a major cyber incident in the real world.

Unfortunately, cyber ranges tend not to be fully comprehensive: to even attempt to recreate a major corporation’s IT infrastructure would take months, even years, and millions of dollars. Instead, exercise leaders have to strategically cut corners: by focusing on select areas of an IT network, or by papering over certain aspects of the experience. IBM, for example, does not build an elevator and put people inside it to properly simulate a hacked elevator system. Instead, they dictate that part of the story, as they do other components of the exercise. A majority of the exercise described in ITPro was merely stagecraft. The reporters, regulators and Bane executives who called in were actors. The social media posts were prewritten. Every detail of the exercise gave the illusion that it was real.

Wells Fargo is one of the real-life banks that operates its own cyber range, perhaps even more sophisticated than C-TOC. The company has gradually developed more elaborate simulations over recent years with help from military-industrial subcontractors. Speaking to an online magazine called American Banker, their Chief Information Security Officer Rich Baich explained, quote: “It’s almost like building blocks because you wouldn’t go and try to do the entire network at first — it might take you years. But maybe you do payment systems, maybe you do ATMs, maybe you do Swift, maybe you do routers.” End quote. As Wells Fargo builds out its cyber sandbox, it conducts fresh exercises every quarter. As Sam explains, this continuous practicing – both in red team penetration testing and in cyber ranges – is a key factor in improving an organization’s security posture.

“[Sam] So imagine if – let’s go back to my basketball analogy. Imagine if you got a basketball team to say OK guys, we’re going to practice defense. I want the people with the white pennies to go after the people with the blue pennies over here. See if you can hit the basket and they fail to make the basket.

You go, OK, practice over. We can all go home, right? No. It takes multiple attempts and actually the truth begins to emerge in the system that you care about how are they learning, how are they adapting, how are they playing the game. The game is not about any one attack. The game is about the whole program over time. So I like to look at things like rates in the first and second derivatives rather than whether one pen test failed or succeeded. You don’t want to infer too much from that and you really do it to learn and to get better, not to see if you’re ready right now.”

Training security staff, and simulating as many attack vectors as possible, is critical because the potential damage involved in a finance industry hack is so massive – exceeded only by that of a government or critical-industry hack. Overall, simulations can be highly effective in exposing holes in an organization’s thinking (especially red team exercises, being less orchestrated and limited than cyber range simulations). Financial organizations like the Securities Industry and Financial Markets Association (SIFMA), the Financial Services Information Sharing and Analysis Center (FS-ISAC), and the U.S. Treasury regularly host industry-wide simulated cyber exercises.

These cyber range exercise events draw hundreds of participants from dozens of financial institutions, and that number grows each year. All those people don’t just come for learning, though. Simulations can be fun. If C-TOC weren’t so stressful, it would have all the trappings of a pretty interesting amusement park attraction.

AN INSUFFICIENT APPROACH?

On the other hand, simulations are an insufficient approach to addressing the broad threat landscape financial institutions have to cover. There is simply no good way to reenact every possible hacking scenario a major corporation could face. Wells Fargo might patch up every hole exposed by a very well-executed exercise, but by the end of it, many other possible holes remain. The size of these institutions means hacks can come from any direction, by any means imaginable, with all kinds of unforeseen consequences.

What does that mean for organizations that are considering simulations as part of their cybersecurity training? If simulations are incapable of addressing all the possible threats a financial organization faces, is it even worth it for the organization to pour money into running them?

Well, yes, in most cases. For small banks, it won’t make sense – the kinds of threats they face just don’t warrant that level of solution.

For large financial institutions, drastic measures are necessary. It certainly is expensive to build sophisticated, diverse cyber ranges that take the time and energy of dozens of paid employees. But investing in cybersecurity will, almost always, end up cheaper than dealing with the consequences of not investing in cybersecurity.

“[Sam] So the threats to banks, they’re material, and if they exceed a certain amount, it changes whether a business is worth doing.
What we should be doing collectively is saying how do we equip you to make risk-based decisions better? How do we societally avoid bayoneting the wounded when you are victimized, but also ensure that you are learning from when it goes wrong? [. . .] It’s not OK to walk away and say, “Well, it was inevitable.” There has to be some anti-fragility, some resilience, some segmentation, things that can be done. I call it right of bang. Other people call it that as well.
After the unthinkable happens, how do you minimize the damage? How do you recover and then how do you repair? We spent a lot of time talking about prevention and detection mindsets and left of bang. We also have to pay attention to right of bang, and my sincere hope is that Capital One has learned from this and that they can articulate the lessons. I care less about bayoneting the wounded than I do about whether there are lessons and they’ve learned them now.”

KEEPING A COOL HEAD

This “right of bang” concept is important.

In our Equifax episodes, you heard about how a lack of preparedness cost the company huge amounts of money and earned it a lot of bad press. Since the release of that show, Equifax reached a settlement regarding their 2017 breach, in which they’ll have to forfeit up to $700 million in damages – with affected customers eligible to claim up to $20,000 each. Now, how many fusion centers could 700 million dollars have bought? How many red team simulations?

When you’re about to give an important speech, or take a game-winning free throw, what do people tell you? Picture everyone cheering, picture that ball going in. Visualization (so the thinking goes) is crucial to success because it allows you to calm yourself before a major event, thereby making that positive outcome more likely to occur.

In cybersecurity, the opposite is true. It is because we imagine things will always go smoothly that we fail to properly secure important, expensive systems. If you picture everything going wrong, you’re much more likely to be prepared when everything inevitably does go wrong.

That, in a sense, is what simulations are for. A simulation can’t account for every possible vulnerability an organization faces, but if C-TOC and Mastercard’s fusion center teach us anything, it’s that “right of bang” protocol isn’t really about technical issues. It’s about the flood of calls that come in from angry clients and customers, about coordinating between teams and being quicker and more efficient than your attacker. There will always be people in the room smart enough to figure out what exactly happened to the servers and how to fix it, but what’s rarer and more valuable is having a cool head in the face of disaster.

If you’ve lived through a cyber exercise, the next data breach you face won’t seem so new. You’ll have experience to build on. Your blood pressure will be slightly lower than it otherwise would be. When hundreds of millions of dollars are on the line, the most valuable asset is a cool head.