One of Advanced Capabilities Group’s primary offerings is Red Teaming. Not everyone is familiar with the practice, even those focused on security services, and there is a wide variety of perspectives on what constitutes it. Our perspective is deeply rooted in the adversarial mindset and shaped by more than 30 years of hands-on experience. We want to give you an idea of where we stand on Red Teaming and its invaluable role in building a holistic security system.
Today’s adversaries don’t play by any rules. They constantly adapt, learn from failure, and grow ever more sophisticated in their tactics and thinking. Whether nation-sponsored, criminal, or simply curious, this new breed of attacker isn’t bogged down trying to find an exploit for a web server or cloud container. They’re no longer thinking along the lines of the checklists, policies, and procedures developed to thwart them; they simply go around, under, or over those defenses to find a weak link somewhere else.
The weak link they eventually exploit is most often a human one: the people who provide you with services, your suppliers, your partners. The question, then, is not only whether you know your adversary, but whether those partners know them as well. How frequently are your partners conducting security assessments? The situation demands frequent testing that searches for weak links wherever they exist, whether in hardware, software, brick and mortar, or people.
Red Teaming is a necessary component of any security strategy that hopes to face today’s adversary. The talents our Red Team members bring make them a unique and valuable asset, both to organizations looking to test their security and to military and law enforcement units seeking a better understanding of the enemy. It’s a holistic approach that digs to uncover weak points wherever and however they manifest: not only in technology and facilities, but in people as well.
Many security professionals and penetration testers rely on the standards and checklists of certifications, and in doing so they cover many of the basics. A good Red Team goes well beyond this: to fully embody the adversary, the team has to adopt and play by the adversary’s rules, which in effect means playing with no rules at all. Done right, Red Teaming starts by actively testing those standards and checklists, but that’s only the first step. From there it must go further, identifying what works and which vulnerable areas need to be strengthened. Think of it like a house: checking the standards and checklists is like making sure your windows and doors are locked (the basics), while a good Red Team says, “You know, someone could just cut right through that wall; maybe we should reinforce that too.” That’s how today’s adversary thinks. They aren’t restricted to defeating locks, deadbolts, and alarm systems; they’ll come in however they can.
To be most effective, Red Teaming has to be a constant tool, factored into an organization’s security planning from the start. Today’s adversary isn’t taking a break from trying to thwart your security, so you can’t either. Think of a Red Team as the special forces of your security team: trained to think non-traditionally, given greater autonomy, and equipped with capabilities that regular forces can’t touch. In this context, it’s worth remembering one of the five Special Operations Forces “Truths”: “Competent Special Operations Forces cannot be created after emergencies occur.” A Red Team is no different. You can’t create one, or learn from one, once your security has already been compromised. If you become static and stop checking, moving, developing, updating, and performing the next round of security assessments, including Red Teaming, your adversaries will exploit your complacency.
The real world follows its own chaotic rules, or lack thereof. If you plan for it once, set your defenses, and call it a day, it can be incredibly unforgiving. We believe in Red Teaming that acts accordingly.