AI Generated Summary
This panel discussion from HITB Cyber Week Dubai explores the relevance of red teaming in enterprises, covering why it’s important, how to get budget approval, common pitfalls, and how to resolve conflicts between red and blue teams.
Panelists
- Emmanuel: 30 years in security industry, specializing in telecommunications, mobile networks
- Dr. Erdell: Regional CISO at Standard Bank, loves community, believes events help share voice
- Kiran: Head of Technical Security Advisory for Digital 14, part of corporate information security team, manages team of security operations, penetration testing, incident response, speaks at multiple global cybersecurity conferences
- Bryson: Former army officer, founded consultancy Grimm 8 years ago, spun out adversary emulation platform Scythe, co-founder of ICS Village (non-profit on critical infrastructure), senior fellow in cybersecurity and emergent threats, advisor to Army Cyber Institute
- Anant Shrivastava: Technical team within NotSoSecure Global Services (part of Claranet Group), deals with red teaming, pen testing, DevSecOps initiatives
Key Topics Discussed
Why Red Teaming is Important for Enterprise:
Emmanuel’s Perspective:
- Military games analogy: Matters same way military games matter - don’t get to war every day but train all the time
- Readiness: Need to sharpen tactics, plan strategies, know enemy
- No other security tactic: Helps discover advanced threat actors the way red teaming does
- Why: Understand attacker’s mindset better - contrasts with myopic approach of most normal security teams
Dr. Erdell’s Perspective:
- Must: Red teaming is a must - best way to uncover unknown unknowns
- Blind spot assist: Like the blind spot assist cars got 10-15 years ago - red teaming is security’s blind spot assist, helping you see what you would otherwise miss
- James Comey quote: “There are two kinds of companies: those that know they’ve been hacked, and those that don’t know it yet”
- Why wait: Why wait to get hacked? Hire red team, let them do testing
- Second-hand car analogy: When buy second-hand car, want to check, take to mechanic - same applies to red teaming
- Gives view: Will give view that people try to hide under carpet
Kiran’s Perspective:
- Prussian military origin: Red teaming was initially developed from 19th-century Prussian military war games
- Idea: Better anticipate the unpredictable events - called “friction” in military conflicts - how to tackle the current situation, identify the enemy’s behavior, how to react
- Bruce Schneier quote: “Security is a process not a product”
- Cyber attacks: Becoming increasingly sophisticated, APT techniques becoming more highly targeted
- Must learn: How digital adversary think to identify gaps in security program
- Red team security assessment: Identify digital, physical, and workforce elements in organization
- Could be: Financial impact, loss to market advantage, corporate reputational damage
- Threats: State-sponsored or criminal organization being threat actor vectors
- Deviating from traditional: Penetration-testing methods alone are not enough - need to simulate real-world situations and scenarios by replicating the tactics, techniques, and procedures of real adversaries
- Essential: That’s where red team is essential
- Some organizations: People doing it full-time, some outsourcing to third party
Bryson’s Perspective:
- Must be important: Or why else would be up at 6am on panel for it
- Military expression: “The more you sweat in peace, the less you bleed in war”
- Learn before: Need to learn before somebody else teaches us
- Military aphorism: “No plan survives first contact with the enemy” - attackers are not following your rules
- Not just technical: Combination of risk surface across technical and people
- Understanding chaos: How they’re working outside of normal operations
- Tabletop exercises: Big fan - how quickly start seeing assumptions laid bare in organization, realizing hadn’t thought scenario through, not documented
- Finger pointing: Several people looking across table, pointing finger “I thought you would have been responsible for that” - start to really peel those things back
- Entire purpose: Find risk of business, assuring business operates way you think it’s going to
- Where do we fit: Hard for IT, defensive, offensive security folks - where do we fit with respect to business?
- Business purpose: Entire purpose of business is not to do red teaming (unless started own red teaming company) - purpose is own operations
- IT is support: IT is support mechanism that enables modern business to scale efficiency
- Inside that: Defense to assure that IT, offense is niche within that
- Important: Ties how budgets and priorities, how business looks at these
- Problem: Strong problem in industry of talking nerd, thinking business just going to get why red teaming is important
- Translate: Have to translate into their terms, understanding relative order never going to change
- Three core elements: Attack chain fundamentally just three areas:
- Reconnaissance: Learning about you
- Access: Getting in
- Post access: Accomplishing goals as attacker once inside
- Iterative: In iterative fashion from “how do I get to here” to “what do I want to do”
- TTPs: That’s order of battle and behaviors going to follow to get crown jewels
- Red team emulating: Using that order of battle, those TTPs to accomplish goals
- What happens: With business while that’s happening - when can see it, when can understand it, when can respond, when can fix it
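Bryson’s three-phase breakdown can be sketched as a simple data structure. The phase names come from the panel; the example TTPs below are hypothetical placeholders (not a prescribed order of battle), purely to show how an emulation might walk the chain in order:

```python
# Sketch of the three-phase attack chain described on the panel.
# Phase names follow the discussion; the TTP strings are illustrative only.
ATTACK_CHAIN = [
    ("reconnaissance", "learning about the target",
     ["OSINT collection", "external service enumeration"]),
    ("access", "getting in",
     ["phishing", "exploiting an exposed service"]),
    ("post_access", "accomplishing goals once inside",
     ["lateral movement", "data staging and exfiltration"]),
]

def order_of_battle(chain):
    """Flatten the chain into the ordered (phase, TTP) steps an
    emulation would walk through, phase by phase."""
    return [(phase, ttp) for phase, _desc, ttps in chain for ttp in ttps]

for phase, ttp in order_of_battle(ATTACK_CHAIN):
    print(f"{phase}: {ttp}")
```

The iterative “how do I get here, then what do I want to do” loop is just repeated traversal of this ordering with updated knowledge at each step.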
How to Convince People Who Control the Purse:
Emmanuel’s Approach:
- In nutshell: Go to management, tell them “You can either spend a hundred now or use a million later” - usually something they understand
- Industry evolved: Lot in recent years and decades
- Beginning: Management not even aware there was such thing as information security breaches - no awareness, happened but didn’t know about it
- Basic awareness: Some sort of basic awareness where sort of knew “wow we need to do something about it” but didn’t necessarily mean did it right way
- Spend money: Would spend money but not necessarily in right areas
- Telecom industry: Where work a lot - would go out and buy every product could get hands on
- Telecom world: They like boxes - don’t understand the concept of a service like a pen test, audit, or red teaming; they just want a box they can plug into the network that serves a certain function (preventing every security threat known to man - of course no such product exists)
- Recently: Last few years, attacks become more sophisticated, awareness due to media exposure, factors that improve way management understand threats
- Know: Not only something needs to be done, but something useful needs to be done, things that actually produce results, bring return on investment
- ROI question: How do you measure return on investment? How do you measure fact that you’re not being hacked? Can you put number on that?
- More people doing: More and more people doing that, red teaming has to be part of equation
- Only way: In his view, the only way to properly uncover advanced threats that can compromise a network for years
- Telcos: Seen threat actors exploiting network for 4 years, literally seen actors been there for more than 10 years in certain networks
- Extracting: CDR records, information about subscribers, geolocation data
- Measure cost: How do you measure cost of that? How translate into something management can understand?
- Tell them: Reputation threat, privacy disclosure threat, can have license revoked because not doing enough to protect customers
- All down to: Stock price could be affected, could lose lot of money, could lose millions - that’s language they understand
- Red team: Good way to mitigate those risks that can lead to financial disasters
Dr. Erdell’s Approach:
- CISO perspective: Take from other perspective - CISO
- 10-15 years ago: Cyber security - was hard to explain why needed to invest
- Today: Everybody is aware
- Transform: We transform
- Role of CISO: Not being nerd anymore - have to understand business, understand business to bits as understand cyber security
- If understand business: Tell what weaknesses are
- Red teaming against management: Have to do red teaming against management
- As red team: Go, found mistakes, instead of hacking them, just write report, give to them, hope blue team not going to detect us
- Same thing: Go in front of management
- Personal experience: Used to work for Fortune company in top three (Microsoft)
- Extreme approach: Went to customers three times with a mocked-up New York Times front page carrying the customer’s name, put the newspaper in front of the customer: “How much are you going to pay to get rid of this? Whereas you could have had a proper red team”
- Microsoft not doing: Microsoft was not doing red teaming, was cyber security advisor recommending
- Microsoft not earning: Microsoft’s not gonna earn any cent out of it, job is to advise, highly recommend have red team
- Very extreme approach: Not everybody should take because can lose easily customers
- Did it: Honestly did it
- Went to CISO: Explained straight to their face that if they don’t do XYZ, they’re going to get fired
- Could be understood: As threatening or understanding essential adversary
- If go with facts: Like COVID-19 - if you don’t want to die, or at least want to cut the risk, all you have to do is wear a mask, use hand sanitizer, keep distance, follow basic rules - it’s up to you to stay healthy or not
- Every institution: Should have cyber security strategy
- Not just products: Should be not just based on products, should be based on training talents
- Most important: Subsequent strategy should also tell us what our weaknesses are
- Doctor analogy: Isn’t that why go to doctors? When feel pain, cough lot, go to doctor - “Hey you’re smoking too much, if continue like this you’re gonna die” - up to you to stop smoking - this is also some sort of red teaming
- Only difference: Wish could go straight to red team, things are not there yet
- If big organizations: Continue to get hacked and paid millions and millions as ransom (don’t want to mention name but think we all know)
- Honestly: If ask company to go to Emmanuel or Bryson company, pay (don’t know how much charge, but pretty sure not million, not even half)
- GPS smart watch company: How much did that big GPS smart watch company pay? They were asked for 10 million and negotiated down to 4 million - pretty sure with that 4 million dollars any professional red teamer could rebuild the company’s security (don’t want to say “zero trust”, but things can be built in a much better way)
- Back to car example: If don’t understand car, take to mechanic - if mechanic tells “It’s great car but gearbox has issues, you’re going to pay 200-300, change gearbox, this car will take you for another 100,000 kilometers”
- If ignore: One day, in the rain, you’re going to regret it when the car stops - “Oh my god, I have to push the car”
- Putting all this: Into matter, explain in business way (not as did) to management should make life bit easier
Kiran’s Approach:
- Current situation: Knowing all organizations going in difficult times
- Any line item: Even if part of budget, they always ask “What is purpose of this?”
- Justify: Need to justify in current situation - hard for any CISO or head who needs it for penetration testing or red teaming kind of exercises
- Investment: Need to justify it to make it viable, link it back to business, show how business getting benefited out of it
- Some organization: Can have dedicated red team
- Others: May have testing team - do both periodic penetration testing in addition to red teaming
- Not much difference: In approach if either keep dedicated red team or team which has penetration testing and doing red teaming
- Strongly recommend: In case don’t have dedicated red team, always can have pen testing team already have in-house, build capability within team itself with adequate tools and education
- Education: By attending conferences (Black Hat, NullCon, RSA, C0C0N kind of conferences) to know how adversary act in specific situation
- Know: What things need to help organization improve security posture - very important to benefit from here
- Talk to CFO: When talk to chief finance officer, link it back to dollar value and reputation factor with some finding impacting to business
- Question: “I wanted to do red teaming on my network infrastructure - is this anything which is required for business right now?” - people always have this question
- Always bring back: To some critical application, showing and demonstrating that
- Examples: If in company, show something like payment gateway; if bank, show ATM switches
- This is where: Performing red teaming, make executive team understand need of red team, how can be benefited
- Once get buy-in: From management, get budget approved, get to the next level, refine the red team engagement model itself
- Gradual process: Not one-time job - can do it, perform, complete it - need to take as long journey
- Show as pieces: Show what can achieve as tangible result at end of engagement, demonstrate it, make executive team know
- Tell them: Can talk to other peers in business, tell how can be benefited by considering red teaming exercises, how can prevent some cyber security attacks
- Show factual evidence: How some of the best players (even CMMI-level companies), when not doing red teaming, have come under cyber attack - how adversaries look at and act in attacks
- Once give: That kind of pitch to management, the program will improve to the next level
Bryson’s Approach:
- Different take: On “you’ve been hacked or not been hacked”
- Two kinds of companies: In cyber security - leadership cares, leadership doesn’t care
- No amount: Of tool or quality personnel can overcome what is priority from leadership’s perspective - comes down to resourcing in time
- When leadership doesn’t care: They’re compliance focused, not security focused
- Red teaming: Not compliance focused - red teaming is understanding what will happen to security in real-world way with real-world adversary emulating or simulating, not running down checklist
- Want creative: Want red teams thinking outside box
- Pitfall: One of the pitfalls of red teams - see red teams that just try to win, do whatever they think they can just to win: “I get in, I own Active Directory and the domain, I drop the mic, I walk out because I own everything, I’ve got it, I’m done”
- What did business learn: From that?
- Red teaming operation: Has to be business focused
- Talking business language: As Kiran said - trying to talk dollars
- Do not talk: CVEs, server configurations - they don’t care
- They do care: “This is your customers financial information, this is PII, this is PHI” - these are things that have tangible value when know what that value is to business
- Challenge: As Emmanuel talking about - okay let’s talk about risk
- Simplistically risk: Impact times probability
- Human psychology: For all of us is terrible at understanding and appreciating future risk - just don’t, really good at “I’m doing this already and therefore it’s already good, I don’t need to worry about that”
- Understanding: How much that is real risk is very difficult problem because fighting that psychology
- Challenge for red team: Inherently different ways to run red teams - can pick different starting points
- Black box: Have no information of organization (external red team)
- Gray or white box: Inside organization or have information to start
- Always recommend: Because going back to three phases of recon, access, post access - business value increases down those phases
- Recon: Learning something about you doesn’t harm business, just gathering information, has some value
- Access: Proverbial burglar coming, picking lock to open door, now touching computers, can get shell - got access but actually haven’t done anything yet, there but nothing happened, haven’t stolen anything
- Post access: Looking at what are all those abilities to do things - that’s where most business value
- Particular: When talk about red teaming and cost, recommend to lot of organizations based on posture, what kinds of resources want to commit
- Cheaper red teams: Ways to do that, still emulate valid parts of attack chain from adversary perspective that doesn’t cost as much
- If want start from nothing: Asking to do entire intelligence operation of all sorts of different kinds of things to learn about what then going to do to operate - has cost and not necessarily as much business value
- Curveball: This is where concept of purple teaming has come about
- Purple teaming: Not making up additional colors (polka dots are not next)
- Purple teaming: Just an evolution of red teaming - instead of “these folks go away, conduct a campaign, come back, give me a report”, which leaves everybody in an adversarial culture of being apart from each other while a large report of all the failures shows up
- Let’s do it together: Forces want to have collaborative culture of security
- Get results as go: Decide in advance together - this is what want to test, this is adversary going to pick, these are TTPs going to walk through, do them in milestone-oriented fashion together
- First thing: Want to see what happens with ability to see active recon post compromise, conduct active recon together
- Meanwhile: Blue team and defense sitting there going “Well we didn’t see that” or “How do we do that?” - dial in detections, validate that, move on to next step
- Improving as go: Everybody’s part of that
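Bryson’s simplistic definition (risk = impact × probability) can be turned into a tiny worked example. All dollar figures and probabilities here are made up for illustration; only the 4-million figure echoes the ransom anecdote mentioned earlier on the panel:

```python
# Worked example of "risk = impact x probability" as an annualized
# expected loss. All scenario figures are illustrative assumptions.
def expected_loss(impact_usd, annual_probability):
    """Annualized expected loss for a single scenario."""
    return impact_usd * annual_probability

scenarios = {
    "ransomware outbreak":        (4_000_000, 0.05),  # cf. the panel's 4M ransom anecdote
    "customer PII breach":        (1_500_000, 0.10),
    "payment gateway compromise": (2_500_000, 0.02),
}

for name, (impact, prob) in scenarios.items():
    print(f"{name}: ${expected_loss(impact, prob):,.0f}/year")
```

Framing red team findings this way (“this scenario costs roughly $X per year if unaddressed”) is one concrete form of the “talk dollars, not CVEs” advice.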
Major Pitfalls in Red Teaming:
Emmanuel’s Perspective:
- Cannot approach same way: In every organization
- Military analogy: Don’t plan land invasion, urban warfare, or submarine hunting same way
- Different approaches: Give different results
- Continuous testing: Obviously comes at cost but optimal risk discovery - keep doing it all time, might catch second time what missed first time
- Pre-approved engagement: Can have wider, deeper view of scoped area - very important
- Telcos: Where work, some key systems not often updated because mission critical, proprietary, nobody really understands security issues
- What could have done better: Sometimes don’t go deep enough in those areas
- Constraints: From vendors or constraints from operator - “Oh no please don’t go there too deep because that could affect service, service quality might be affected, subscribers might notice”
- Very nervous: Because telecommunication is mission-critical environment - cannot disrupt service, always priority number one, security always comes second
- Regret: In those areas sometimes regret couldn’t go deep enough, engage right team way on very interesting obscure proprietary platforms
- Apart from that: Sure there are always things could do better, but kind of true for every kind of job
Dr. Erdell’s Perspective:
- Cannot agree more: This is what makes us better - learning from experiences
- If had today’s mindset: When was 20, probably could be even better today
- Humanity: Got from the first human making fire and inventing the wheel to where we are today only because we can fall - it’s okay as long as we move forward
- Hunting analogy: When you go hunting, you aim to hunt deer (speaker is totally against hunting, has never touched a gun, just an example) - in the wild, if you go deer hunting, you most probably aim at deer
- But: Are you not gonna hunt if see another animal? Or more importantly if in danger, are you not gonna shoot at that wild pig trying to attack you?
- Going to say: “Oh no I’m here to hunt deer” and not pick? (Just example)
- Believe: Colors really not important - love all colors as Bryson said, purple teaming which is new name, but if want to call it orange team or blue team or red team, honestly as long as get value out of it
- Dedicated team: Looking continuously, approved or pre-operated
- Red team should be independent: In opinion, red team should be always independent (of course within group policy - when say group, not talking about Active Directory group policies, talking about company policies)
- Should be able: To do whatever they like - within ethics, of course; that doesn’t mean reading people’s emails, and for pen-testing activities you would still get approval from the CISO - but everybody else should not know, they have no business knowing
- If tell: “I’m gonna come and test your alarm tonight” - what will you do? Will go and sleep or really go and ensure?
- If auditor tells: “They’re gonna come and audit business today at 3pm” - hope get ready beforehand, don’t wait till last minute
- Engagement should happen: As wish, as team manager try to measure strength of blue team, strength of activities
- If don’t do it: Depending on the organization, there are plenty of other people waiting to find that one mistake, one vulnerability, one open door (one exposed port, you name it) to get into the organization and harm you - damaging brand, reputation, and (unfortunately for CISOs) jobs
- Coaches in sports: Like coaches in sports - might be the most famous (cricket would probably be a better example here, but take football: Mourinho, pretty sure you’ve heard the name - the Portuguese coach who became very famous, even managed Manchester United, won all the cups)
- Doesn’t matter: What name is - if don’t bring results, gonna be first out of door, cannot replace whole team
- Up to you: To protect companies, organizations, brand, reputation, confidentiality, integrity, availability
Kiran’s Perspective:
- Good part: Of the final discussion - the panel mixes people on the internal track and people on the consulting track, each seeing pitfalls at the enterprise level - what the challenges are, what lessons teams are actually learning in various engagements
- Good learning exercise: For everybody
- Scope matters: Lot - have to define activities which will be performing, what are target systems which will be looking into red teaming part
- Some cases: Vulnerability can be safely demonstrated on live environment, still not wise to launch real attack against production system which are business critical
- Cases: Which are not having any background to anybody in organization
- If red teamer: Has understanding and competent and confident enough with specific scenario, has taken pre-approval well in advance from CISO, then should be okay to execute to specific scenarios
- If targeting: Active Directory or pretty standard applications - no need for any special pre-approval
- If not: If it’s a specific business revenue-generating application, it requires special approvals
- In addition: Sometimes red team or blue team or even could be pen team are proud of what doing - sharing of information could be challenge
- Some cases: What think - red team thinks “I am strongest, I can break any kind of network” - coming from blue team as well
- Not all: Red team or blue team test scenarios can cover every environment - e.g., SCADA systems, where sophisticated real-time attacks aren’t possible because the systems are not directly connected
- As red teamer: Without knowing the context, without knowing anything, if given a SCADA system as a target it may take longer - you can probably break it, but it takes longer
- Some cases: Strongly believe in rinse and repeat testing - what doing testing, don’t think that is full and final, need to identify it, need to take to next level
- Do repetitive testing: Work with various teams internally on the findings, ensure those findings are mitigated 100 percent
- If just doing testing: Will not help in any way - just like accumulating laundry list of findings
- Need to fix: Scope application to be fair, well tested in organization, remediated as well
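Kiran’s rinse-and-repeat point implies carrying findings across engagements and retesting until everything is remediated, rather than accumulating a laundry list. A minimal sketch, with hypothetical field names and statuses:

```python
# Minimal sketch of rinse-and-repeat testing: track findings across
# engagements and retest until none remain open. All IDs, titles, and
# statuses here are hypothetical examples.
findings = [
    {"id": "F-1", "title": "Weak AD password policy",        "status": "open"},
    {"id": "F-2", "title": "Unpatched payment gateway host", "status": "open"},
]

def retest(findings, fixed_ids):
    """Mark remediated findings and return those still open to re-verify
    in the next engagement."""
    for f in findings:
        if f["id"] in fixed_ids:
            f["status"] = "remediated"
    return [f for f in findings if f["status"] == "open"]

remaining = retest(findings, {"F-1"})
# The engagement is not "full and final" until `remaining` is empty.
```

The design point is simply that a finding only leaves the list through verified remediation, never by being reported once and forgotten.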
Bryson’s Perspective:
- Reiterating: What said earlier about different kinds of leadership focus - compliance versus security
- Compliance focus: “Give me a Nessus scan”
- Security focus: “Do a real red team”
- Vulnerability scanning: Is not red teaming
- Client ego: Met clients that are just like “Yup just see if you can beat us, we think we’re best in world” - that’s not point
- Spending: Lot more money, making lot more difficult to accomplish what’s business value because let ego get in way
- Problem for red teams: Also for red teams where ego driven versus mission driven
- Final piece: Follow on from what Kiran just said
- Once achieved: New posture that is stronger based on lessons learned
- Repeatable testing: Key to being able to continue to validate that posture is maintained
- People always doing: Different things - joke “Best hackers in world are your own employees, particularly when tell them no” - yes, they will circumvent any control that gets in way, makes jobs harder
- Understanding: Continually changing calculus of people (employees in organization) as well as IT staff and those controls
- Configuration drift: Would be surprised how often things have configuration drift which think got to certain point, changes, then next thing know certain endpoints no longer logging alerts and details to SIEM - lost visibility
- Checking repeatedly: Way that find this after already established what learned
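Bryson’s SIEM example lends itself to a small repeatable check: compare the endpoints you expect to be logging against those the SIEM has actually heard from recently. Both host lists below are hypothetical stand-ins for an asset inventory and a SIEM query:

```python
# Sketch of a configuration-drift check: endpoints that should be
# shipping logs but have gone quiet mean lost visibility. The host
# names are illustrative placeholders.
expected_endpoints = {"web-01", "web-02", "db-01", "dc-01"}   # asset inventory
recently_logging   = {"web-01", "db-01", "dc-01"}             # e.g. hosts seen by the SIEM in 24h

def silent_endpoints(expected, seen):
    """Endpoints expected to log that have gone quiet."""
    return sorted(expected - seen)

for host in silent_endpoints(expected_endpoints, recently_logging):
    print(f"drift: {host} no longer shipping logs to the SIEM")
```

Run on a schedule, a check like this catches drift after the posture has been established, which is the point of repeatable testing.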
Resolving Conflicts Between Red and Blue Teams:
Emmanuel’s Perspective:
- Interesting question: Will go back
- Didn’t know: Was doing purple testing until Bryson mentioned it many times
- When do red teaming: Exercise and present results, opposite team (in-house security team) says “Oh wow how do you do that? Can we learn about it?”
- Turned them purple: Actually turned them purple - didn’t know there was such term, definitely going to use it from now on
- Purple teaming has value: Not only will it expose risks and security threats, it will also raise the level of the in-house security team
- Telcos: Always have security team with different level of competence of course
- Part of role: Always to make them better - should be able to defend themselves, shouldn’t be only relying to us doing job
- Valid point: Very valid point
- Ego problem: Really happy talk about that because lot of ego in industry both on client side but also on professional side
- Security professional risk: One security risk we don’t always take into consideration is security professional risk - guys that say “Well I have CISSP and I’m best and you can trust me” - some managers listen to that, some even believe it
- Illusion of security: Get illusion of security because some guy who doesn’t really know what doing can talk right way to management
- Definitely a risk: That’s definitely a risk
- From red team perspective: What seen sometimes is some red teams might not have skill set necessary to be at level of advanced threat actors
- Experience can be humbling: There are some threat actors that can do stuff don’t even know exists
- Telecom industry: One story quite famous - called “Athens Affair” (Greek capital because happened in Greece)
- Threat actor discovered: A bit by luck; the actor was able to live patch (in-memory patching) binaries running on mobile switching centers
- High available mission critical: Those are high available mission-critical platforms - platforms actually switching calls
- Written in PLEX: A language you’ve probably never heard of
- Able to binary patch: Live those systems
- When see that: Know can’t do that, know will never have that kind of knowledge
- Government threat actors: Obviously those were government threat actors because nobody else has resources to go into that
- As red team: Shouldn’t let ego take precedence, should always have approach that “Yeah maybe there are some threats we don’t know about”
- Important: Not always believe we are gods of information security
- Thanks: All of you for mentioning that - think it’s important
Dr. Erdell’s Perspective:
- Ego: Mean we all have some sort of ego right
- Got PhD: Wrote dozen of books, worked in military, honestly used to train as well many many many years ago
- Sometimes: When used to go classes, see students - “Oh my god how did you do that?”
- Good thing: In cyber security it’s like being in a Chinese market (don’t know if you’ve been to China) - you go and negotiate and negotiate, and at the end you believe you got the best price (being from India, you know exactly what I mean)
- Think got best deal: Then you go and meet a friend who got an even better deal
- Being cyber security professional: Same thing - you think you are the best, then you meet a 17-year-old boy who hacks Twitter and collects a hundred thousand dollars - he’s just 17
- Can give: Many other examples from this
- Ego is good: As long as know how to control it, how can move forward
- At end: We all spend a lifetime getting to a place in our careers - no little boy is going to walk in and out-challenge Emmanuel, who spent all those years, got gray hairs, built a company, brought it to this stage
- Of course: Gonna have spirit or ego in other words
- Pretty sure: His ego is not bigger than his track record - just look at his public profile
- Again step back: Knowing is good, sharing is good, but there is gonna be always better
- Think: Kiran or maybe Bryson reminded us that hackers are lazy
- Don’t always wait: Attackers don’t have to wait for a new vulnerability
- Somehow: If, as Bryson said, you go into an organization, do a search, do enumeration, sometimes you find some executables ready for you there
- Because somehow: The internal security team did some testing, downloaded some tools, maybe ran a successful penetration test, but forgot one little important step - clearing their tracks
- Forget: An executable, a DLL, whatever is left there can make a job like Bryson’s much easier to execute
- Unfortunately: Saw this happen as well
- Not at that level now: Hasn’t been this deep in the industry for a while, but looking back still remembers and smiles
- When speak: “How can you forget this?” - “Oh yeah it’s just little DLL”
- But again: Sometimes doing as said - we all do mistakes, as long as fall and move forward, as long as learn from it, that’s why in this community
- Dedication: Bryson wakes up at 6am for this, Emmanuel stepped away from his business, Kiran took time off - and the speaker is in Sydney, where it’s currently 11pm
- Believe it or not: Going to wake up in one and a half hours for an engagement with Microsoft, but wanted to be here because it’s important to share, important to learn
- More importantly: Hoped to add a different angle to the question
Kiran’s Perspective:
- Egos and conflicts: Very common to any corporate culture - no way different from any companies
- All the time: People need to remember one thing - either could be blue team or could be red team, always that we are enabling business
- Need to: Whatever findings come up as part of red team engagement, what is risk exposure?
- Technical teams: Bound to have this kind of conflicts
- Leadership responsibility: It is leadership’s responsibility (or someone on the team) to spell out what risk is exposed to the business, to systems, to people and process - and to sort it out at that level
- If solve: At that level, then you can reduce the ego and the conflicts within both teams
- Always robust interaction: Between red and blue team is necessary
- Not win or lose: It’s not about who wins or who loses the game
- Objective: Establish an environment where the blue team learns from the process, develops, sharpens skills to defend the organization’s systems when attacked directly or through alternative routes
- Not like: Framing it as a win-or-lose game - with one side insisting everybody must agree with them - is not the correct way of defining things
- If step back: Understand holistic view that how it is impacting, then probably can avoid conflict
- Also ego: Within red and blue team - that’s view
Bryson’s Perspective:
- Everyone on panel: Is leader
- Leadership: Art and science of getting people to do something they don’t want to do
- Exactly: That’s exactly what this question is
- No different: Than any other problem face in world on daily basis at home with families, with friends, certainly what see amplified in corporate culture where starting to deal with different factions and structure and momentum and inertia of organization
- Really just two aspects: Either somebody is threatened or somebody doesn’t understand something
- Jobs as leaders: Set that vision, that expectation, get them training, education that need to accomplish goals
Key Insights:
- Red teaming is like military training - need readiness, sharpen tactics, know enemy
- Best way to uncover unknown unknowns
- Must translate to business language - talk dollars, not CVEs
- Three core elements: reconnaissance, access, post access
- Purple teaming is evolution - collaborative culture, get results as go
- Scope matters - define activities and target systems
- Repeatable testing key - validate posture maintained
- Ego on both sides - red teams trying to win, blue teams defensive
- Leadership must set vision and expectation
- Business value increases down attack chain phases
- Configuration drift happens - need repeatable testing
Actionable Takeaways:
- Red teaming matters like military training - readiness, tactics, know enemy
- Best way to uncover unknown unknowns
- Translate to business language - dollars, not CVEs
- Three phases: recon, access, post access - business value increases down phases
- Purple teaming - collaborative culture, get results as go
- Scope matters - define activities and targets
- Repeatable testing key - validate posture maintained
- Resolve conflicts by focusing on business risk, not ego
- Leadership must set vision and expectation
- Don’t let ego take precedence - there are always threats you don’t know about