I became aware of the intrusion kill chain in 2009, when Mike Cloppert referenced it in one of his presentations at the SANS Incident Detection Summit (I can't find an agenda for it). In 2010 he released a formal paper on the concept. If you're not familiar with the intrusion kill chain, please pause and read it. It's worth your time. Don't TL;DR me.
I recently used the kill chain as an example in a few presentations I gave. That made me think a bit more about the kill chain concept. Specifically, I asked the question: does the defensive side have a kill chain?
Short answer? No.
Long answer? A kill chain relies on the fact that "any one deficiency will interrupt the entire process." Through an entirely inductive reasoning process I've identified five phases of defense that, if interrupted, will greatly disrupt the defensive process. Unlike a kill chain, though, disrupting one phase will not necessarily interrupt the entire defensive process or posture.
So what's my back-of-the-napkin defensive kill chain? More precisely, what would a targeted attack focus on in order to disrupt a defensive operation? First, the attacker will penetrate the defensive operation's operations security. This happens through a variety of means, including OSInt, HUMInt, etc. Next, the attacker will find weaknesses in the orientation of the defensive operation. This means taking advantage of the mindset, expectations, and beliefs of both the defenders and the overall target. It includes social engineering, and it includes understanding the defensive operation's shifts, holidays, and general abilities. Next, the attacker will leverage this combined information and subvert the IT architecture. This is exploitation, this is escalation, this is action. It is done in tandem with subverting the security architecture: avoiding detection and prevention measures and defeating any defense-in-depth control not already inherently built into the IT architecture. Finally, the attacker will defeat any responsive/reactive measures by the defensive operation. This means working faster and better than the defensive team.
The short version of the Defensive Kill Chain: Operations Security -> Orientation -> IT Architecture -> Security Architecture -> Response Activities
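To make the contrast with an attacker's kill chain concrete, here's a minimal sketch (all names are my own invention, not anything from Cloppert's paper) modeling the idea that disrupting one defensive phase degrades, but does not break, the whole posture:

```python
# Hypothetical sketch: defensive phases degrade independently, unlike a kill
# chain where one broken link stops the whole process.

DEFENSIVE_PHASES = [
    "Operations Security",
    "Orientation",
    "IT Architecture",
    "Security Architecture",
    "Response Activities",
]

def posture_strength(disrupted):
    """Fraction of defensive phases still intact.

    In a true kill chain this would drop to 0.0 the moment any phase is
    disrupted; here each disrupted phase only degrades the posture.
    """
    intact = [p for p in DEFENSIVE_PHASES if p not in disrupted]
    return len(intact) / len(DEFENSIVE_PHASES)

# Disrupting one phase weakens, but does not zero out, the defense.
print(posture_strength({"Orientation"}))  # 0.8
```

The asymmetry is the whole point: an attacker interrupted at any link must restart, while a defender who loses a phase is degraded but still operating.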
I'm wishy-washy on this as an idea, but it's a fun one that I may use and strengthen in the future.
Monday, November 7, 2011
Wednesday, September 14, 2011
Creating a tabletop exercise scenario
There are several types and ways to conduct exercises, drills, and team training. A tabletop exercise is one of the formats I’ve found generates understanding, traction, and visibility. Creating a good tabletop exercise can be a bit overwhelming. Why? It requires an attacker mindset, creative use of evidence trails, technical accuracy, and excellent presentation.
Attacker Mindset
You must become the attacker to devise an attack. Your first obstacle is to define an end state and a motive. Disruption? Theft? It should not be arbitrary. Once you have your motive you must then develop an attack that’s technically accurate and realistic. I recommend outlining each sequence of the attack to give the scenario depth (see table)- I’ve had scenarios surpassing forty sequences.
Evidence Trails
Your defensive ops team requires tidbits of evidence to allow them to think critically and make decisions. Ideally these evidence trails are slowly revealed through the course of the exercise and mirror real-world activity. The evidence trails must be customized to the defensive operation's tools and procedures- if the defensive ops team utilizes netflow data and HIPS events, then fictional flows and events may be presented to them. I recommend pairing a potential evidence trail with each sequence of attack in a table. This keeps the scenario organized and lets you decide how the scenario is ultimately presented to the participants.
Technical Accuracy
The tactics and tools used both by the fictional attacker and the participants must be grounded in accuracy. A zero day exploit in Adobe Reader is fair; a “zero day exploit” which “takes down the network” is not.
Excellent Presentation
The presentation must be done plainly and must convince and inform all levels of the audience. I recommend separating the attack sequence from the observations and responses of the participants. Once the tabletop is complete, you can walk the participants through each sequence of the attack; they then tie in their observations and reactions based on exactly what happened. That’s where the lessons are learned.
A mocked-up attack timeline. This is used to help build the basis of the exercise. It helps generate the depth and scope of the attack and the evidence trails, and lets you craft how the tabletop exercise itself will be carried out.
| Date Time | Event | Evidence / Artifacts |
| --- | --- | --- |
| 4/15/11 13:41 | Attacker A uses Google searches to locate a series of employee email addresses | Screenshots of Google hits |
| 4/16/11 08:41 | Attacker A sends a crafted phishing message to the identified email addresses | SMTP email gateway logs |
| 4/16/11 08:45 | Victim B erroneously clicks malicious link / successfully compromises PC “DougH” | HTTP gateway log; Windows prefetch entry; File: C:\windows\tasks\svchost.exe |
| 4/16/11 08:46 | PC “DougH” establishes C2 with example1.dyndns.org:443 | HTTP gateway log |
| 4/16/11 08:46 | PC “DougH” downloads p.zip from rapidshare.com/ | HTTP gateway log; Files: c:\windows\tasks\p.zip, c:\windows\tasks\p.exe |
| 4/16/11 08:46 | PC “DougH” executes p.exe (pwdump) and transfers results via FTP to example2.dyndns.org | Windows prefetch entry |
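A scenario outline like this can also be kept as data rather than a document, which makes it easy to reveal injects one sequence at a time during the exercise. Here's a rough sketch of the idea (the field names and `inject` helper are my own invention):

```python
# Hypothetical sketch of an exercise timeline: each attack sequence pairs an
# event with the evidence trail the participants would realistically see.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Sequence:
    when: str                 # date/time of the fictional event
    event: str                # what the attacker does
    evidence: List[str] = field(default_factory=list)  # artifacts to reveal

timeline = [
    Sequence("4/15/11 13:41",
             "Attacker locates employee email addresses via Google",
             ["Screenshots of Google hits"]),
    Sequence("4/16/11 08:41",
             "Attacker sends crafted phishing message",
             ["SMTP email gateway logs"]),
    Sequence("4/16/11 08:45",
             "Victim clicks malicious link; PC compromised",
             ["HTTP gateway log", "Windows prefetch entry"]),
]

def inject(step):
    """Reveal one sequence's evidence trail to the participants."""
    seq = timeline[step]
    return f"[{seq.when}] evidence: {'; '.join(seq.evidence)}"

print(inject(0))  # [4/15/11 13:41] evidence: Screenshots of Google hits
```

Keeping the event and its evidence together in one record mirrors the table above: you present only the evidence column to participants and hold the event column back for the debrief.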
Thursday, September 1, 2011
Establishing Defensive C2
Sustained defensive operations should expect an incident at any time. This has taught me that well-crafted, exercised, and useful C2 is required. This is particularly important for operations with small teams, geographically separated personnel, or no 24x7 operations center.
The techniques below may seem banal, but the community's seeming lack of recognition of how vitally useful they are is striking. Offensive operations are deliberate in both their actions and their C2. Defensive operations require those same characteristics. I loosely define a communications plan as a formalized C2 structure at both an organizational and a technical level.
A successful communications plan allows for adaptability, high tempo operations, preparedness, and leadership understanding. These aren't buzzwords, they are achievable and necessary. It is a foundation for medium to large scale incident response operations.
Individual preparedness
All personnel's mobile phones should include the contacts needed during incidents. This includes team members, leadership, external parties, and preplanned meeting locations (see below). As a contingency, keeping a subset of the most critical contacts on wallet-sized cards and carrying them on your person provides coverage when cell phones are unavailable.
Keeping these lists up to date can be a nuisance; testing them through scheduled call trees will ensure they remain current and useful.
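As a sketch of the call-tree idea (the names and structure here are invented for illustration), the tree can be kept as data and checked for coverage before it's ever needed in anger:

```python
# Hypothetical call-tree: each person notifies a short list of others, so an
# alert fans out without any single caller needing the full roster.

CALL_TREE = {
    "IR Lead":   ["Analyst A", "Analyst B"],
    "Analyst A": ["Analyst C"],
    "Analyst B": ["Legal Contact"],
}

def reachable(start):
    """Everyone an alert starting at `start` eventually reaches."""
    seen, stack = set(), [start]
    while stack:
        person = stack.pop()
        if person not in seen:
            seen.add(person)
            stack.extend(CALL_TREE.get(person, []))
    return seen

# A scheduled test of the tree: confirm the whole team is reachable.
team = {"IR Lead", "Analyst A", "Analyst B", "Analyst C", "Legal Contact"}
print(reachable("IR Lead") == team)  # True
```

The same check run against a stale tree immediately shows who would be left uncalled, which is exactly what a scheduled call-tree exercise is meant to catch.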
Secure comms
Think in terms of both standard and worst-case options. If you can no longer trust your hosts or networks, how can you properly share and collaborate with team members? How can the defensive team access workstations, servers, or security controls? How can you inform leadership? When and how can you engage law enforcement? Have preparations in place for securely sharing documents or files with in-team, in-company, or external parties. As maturity develops, different categories of communications may be used: open email may be satisfactory in some incidents, while encrypted attachments are needed in others. Out-of-band communications may also be desired in certain circumstances; quickly deploying an airgapped network for defensive operations may be needed. Defining these ahead of time prevents “just-in-time solution building”. Finally, staff will gravitate toward the communications media they typically use; treat this familiarity as a strength when building a communications plan.
Control
Defensive operations rely on access to equipment. This includes workstations, phones, servers, and other security systems. These control channels - including hands on a keyboard in a data center, coordinating collection of evidence, remote access, application access, and log retrieval - and their contingencies should be considered in the communications plan.
Preplanned Meeting Locations
You know those meeting spots everyone is trained to walk to during a fire drill? Having such preplanned locations (physical or virtual) can be essential in unknown or complex situations. Many organizations' telephone systems can create predefined conference bridges, which can act as a central preplanned meeting place for geographically dispersed teams. These bridges can serve as a staging and coordination center and can be invaluable. If you do use such systems, be mindful that they may be VoIP and not necessarily out of band.
Various communication media can be used in different ways. Technical staff can achieve a higher operational tempo through secure, instant communications via services such as Jabber, Skype, or SILC. Such virtual conference rooms can act as a copy/paste medium for spreading critical log file entries or other plaintext to team members in real time. If timestamps are enabled, the backlog also serves as a useful historical timeline of response activities. This medium is excellent because it is realtime and can be used in conjunction with other media such as physical or telephone meetings.
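If the chat service timestamps each message, the backlog doubles as a rough response timeline. A sketch of pulling that timeline back out (the log format and `FINDING:` convention here are invented, not from any particular chat service):

```python
# Hypothetical sketch: reconstruct a response timeline from a timestamped
# chat backlog, keeping only the lines analysts flagged as findings.

import re

backlog = """\
[08:46] doug: pasting gateway log line, host beaconing to example1.dyndns.org
[08:51] amy: FINDING: C:\\windows\\tasks\\svchost.exe dropped on DougH
[09:02] doug: FINDING: FTP transfer to example2.dyndns.org observed
[09:03] amy: joining the conference bridge now
"""

LINE = re.compile(r"\[(\d\d:\d\d)\] (\w+): (.*)")

def findings(text):
    """Return (time, author, note) for each line marked FINDING."""
    out = []
    for line in text.splitlines():
        m = LINE.match(line)
        if m and m.group(3).startswith("FINDING:"):
            out.append((m.group(1), m.group(2), m.group(3)))
    return out

for when, who, note in findings(backlog):
    print(when, who, note)
```

A lightweight convention like this costs nothing during the incident and gives the after-action report its skeleton for free.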
Recognize that heavy incident response operations will utilize a variety of communications at once. These channels can clash and cause delays or confusion. The response staff should reserve their own channel for coordinating activities - determining the vector, scope, and seriousness of the situation. Other channels should be used to communicate findings, reports, and recommendations to higher leadership, business partners, law enforcement, etc.
Some conclusions:
- Comm channels are key and often underappreciated during response activities.
- Comms plans are essentially C2 activities. These C2 channels need clear paths from top leadership to defenders to defender equipment- if a defender can't initiate analysis or containment then it's going to be a bad day.
- Comms plans should address realistic contingencies surrounding confidentiality, integrity, and availability.
- This isn't an exercise in paper but in preparedness.
Monday, August 29, 2011
Dragon Bytes Followup
Last year Richard posted a review of "Dragon Bytes" by Timothy L. Thomas. The book was no longer being published when Richard reviewed it, to the extent that he had to do a followup post answering questions on how to obtain a copy.
Fast forward nearly a year, and I was able to obtain my own copy through Amazon's new/used program. I liked the book. I did a few searches and found several of Thomas's papers on the Defense Technical Information Center (http://dtic.mil/). Some of them are directly related to "Dragon Bytes", while a few cover Russian theory and one covers Al Qaeda. They're worth checking out if you have not had an opportunity to get a copy of the book due to its unavailability.
The Chinese Military’s Strategic Mindset
Google Confronts China’s “Three Warfares”
Russian and Chinese Information Warfare Theory and Practice
Russian Views on Information Based Warfare
Dialectical Versus Empirical Thinking: Ten Key Elements of the Russian Understanding of Information Operations
Al Qaeda and the Internet: The Danger of “Cyberplanning”
Also, I recommend "On China" by Henry Kissinger if this subject interests you.
Friday, August 26, 2011
Sustained Operations
Lately I've been thinking of security operations in the context of the duality of defensive and offensive operations. An offensive operation may achieve little if it doesn’t account for security controls deployed by a defensive team. Alternatively, a defensive operation must take into account the tools and tactics used by various categories of offensive operations. In this view, security operations is the combination of both offensive and defensive operations. It is the competition between these two operations.
An offensive operation can exist without a defensive counterpart; however, a defensive operation cannot exist in a successful or sustained fashion without at least one effective offensive operation. The perception that offensive operations do not exist, are ineffective, or are otherwise in a nebulous or unknown state is an undertone in the continually incorrect risk calculations performed by business leaders. This was recently reflected in the 2011 Sony and RSA breaches and the apparent disregard for defensive personnel.
If security operations is the duality of defensive and offensive operations, what are defensive and offensive operations? An offensive operation is the willful and sustained intent of an actor or set of actors to control your technology or information against your will. The operation includes the actors as well as their specific strategy, tools, tactics, and procedures. For instance, the Zeus Trojan is not an offensive operation but a tool of one. Exfiltrating data through encrypted RAR files to a drop host is not an offensive operation but may be a procedure of one.
A defensive operation is the willful and sustained intent of actors to prevent such control. It may include tasks such as incident detection and response, architecture design, and vulnerability discovery and correction. More on what makes up a defensive operation will be outlined in later posts.
The defensive posture built over the last several years has strengthened to a degree which generally deters automated threats such as worms or brute force scanners. The steady and slow advancement of security over the last twenty years has yielded an unexpected result: the offensive side has moved to sustained operations.
Some conclusions:
- Defensive operations must move to a sustained model of operation in order to counter this growth of depth by nearly all offensive operation categories.
- It’s in the best interest of offensive operations to have a continuing bag of tools, tactics and procedures and use each as needed over a large period of time.
- Offensive operations are no longer reliant on a particular exploit. Unlike twenty years ago, such exploits are only a subset of tools at the disposal of the offensive operation.
- Nearly all defensive operations are exceptionally bad at acknowledging and sharing the offensive operations' tools, tactics, and procedures with each other. I suspect this lack of acknowledgment and sharing is a contributing factor to the successes of offensive operations.
- Correcting vulnerabilities as they are uncovered does negligible good for the defenders while deterring known tools, tactics and procedures has a greater impact.
If you haven't read it, my cause and effect post from last year attempts to compare defensive and offensive operations.
Wednesday, August 24, 2011
Beginning Somewhere: Incident Response/Leadership Cycle
First off: there must be a pre-existing reason to create a defensive capability. How does one prove or gain acceptance of that reason? There's no formula for 'selling' a defensive posture, and this post will not outline how to create a sustained defensive posture. Instead, the below summarizes how I think about growing an operational incident response team and its capabilities within a company.

This construct applies the theory of leadership. Leadership is, in my words, the act of doing the right thing with the long view in mind, not the easy thing with the short term in mind. This dichotomy is not exact, but it focuses on the characteristics we need.
I call this construct the Incident Response/Leadership Cycle. It is straightforward. It begins with the willful intent to detect a security incident. That willful intent will stir up a multitude of actions, such as response and recovery. It is necessary that leadership acknowledges and appreciates how the detection was made. This is the second phase - generating increased leadership visibility. If that appreciation is merely a pat on the back, this step hasn't fully been realized. Leadership must desire not solely prevention of future incidents but also mitigation of and preparation for the next instance. This leads to the final step of the cycle: increased detection capabilities. Raising the detection capability will provide the next incident to respond to, hopefully earlier in the attack than the previous incident, and implicitly improves the response capabilities of the team through new experiences and resources.
This cycle is how I've come to articulate the pragmatic growth of a team of IT or risk professionals into an operational, defensively focused team. It relies on both the professionals' and leadership's desire to have an ear to the ground and be prepared to respond.
Wednesday, July 6, 2011
2011 Reading List
Orientation is a multi-faceted array of experiences, heritage, and traditions. One of the best ways to improve one’s orientation is through constant immersion in new experiences and knowledge. One way I intentionally try to improve/change my orientation is through a constant reading regimen. My 2011 goal was to read 30 books. It’s been over 6 months, and while I’ve had some schedule setbacks, I have finished 9 books. I still expect to come near my initial goal. Below is a recap of the books I’ve read to date. I wish more folks would share their reading lists.
Additionally, you can check out my amazon wishlist or find me on goodreads.com under the alias electricfork.
The Exploit: A Theory of Networks by Alexander Galloway
Cyberdeterrence and Cyberwar By Martin Libicki
The Starfish and the Spider: The Unstoppable Power of Leaderless Organizations by Ori Brafman
The Firm, The Market, and the Law by Ronald Coase
Outliers by Malcolm Gladwell
Strategy by B.H. Liddell Hart
The Analects by Confucius
Gang Leader for a Day: A Rogue Sociologist Takes to the Streets by Sudhir Venkatesh
Book of Five Rings by Miyamoto Musashi
The Medium is the Massage by Marshall McLuhan
Friday, June 10, 2011
Digital Pearl Harbor
[Digital|Cyber] Pearl Harbors
Combined, those three words have the power to make the population of security defenders growl in contempt. There’s a good reason for this: the symbolism and obvious American political connotations instantly put fear, uncertainty, and doubt tactics into play.
A military perspective lends the expression validity. The Pearl Harbor attack was a surprise strategic air strike intended to delay the American response in the Pacific theatre. Air strikes are used to gain dominance in another domain, such as land or sea. This outcome is similar to how the US military defines cyberattacks (or CNA). A digital Pearl Harbor would have similar intentions and outcomes, and the same surprise characteristic as its namesake.
Scenario: State A invades State B; the US has interests in State B and is expected to intervene. A “digital Pearl Harbor” attack by State A could conceivably disrupt logistics and supplies, communications, or other military operations so as to adequately delay troop and asset deployments during the critical opening hours of hostilities. Such a successful attack would be strategically invaluable to State A. It also has the benefit of minimal cost, with none of the risks that entering US territory would entail. I have zero understanding of the military's vulnerability exposure to such an attack, though the threat side of the equation is presumably rather high. This assumption is based on both the desire and the ability of various nation-states to carry out such an attack.
When I see the Digital Pearl Harbor catchphrase thrown around in the media by generals and federal employees, this is my default interpretation. It allows me not to growl in contempt at the use of fear, uncertainty, and doubt, but instead to examine and evaluate the phrase in context with the speaker's talking points.
Tuesday, May 10, 2011
Applying Liddell Hart to Infosec
I believe infosec - particularly incident response - is a realization of competition through digital venues. Similarly, military thought is focused on one of the most serious levels of competition: war. There’s a series of similarities here that goes beyond simple analogy to mutual usefulness, with a lot of lessons we, as security practitioners, can learn. Earlier this year I read “Strategy” by B.H. Liddell Hart. Below are a smattering of quotes and quick observations I’d like to apply to infosec.
“... there are two forms of practical experience, direct and indirect - and that, of the two, indirect practical experience may be the more valuable because infinitely wider”
The field still lacks proper training and experience. One of the more valuable assets we have at our disposal is sharing and collaboration of information, both vertically and laterally: vertically, learning practical experience from other fields; laterally, learning from peers’ experience in the information security space. Mudge made a good observation in his 2010 ShmooCon keynote (summarized): the most common method of learning in infosec has historically been through mentor relationships. We need more and better mentors at an individual level, and increased and honest information sharing and collaboration at an organizational and industry level.
“[Belisarius] was a master of the art of converting his weakness into strength; and the opponent’s strength into a weakness. His tactics [...] had the characteristic [...] of getting the opponent off balance”
Various bad guys use a lack of permanence and infrastructure against our immobile and lethargic infrastructure. They take advantage of our lack of sharing of attack data. We’re not just slow; we’re also stupid by comparison. Their fewer resources make them more mobile; their ambiguity more powerful. (Also, this quote is a simplistic precursor to Boyd’s OODA loop construct, which I find pretty hip.)
“[...] to cut an army's lines of communication is to paralyze its physical organization […] to cut an army's lines of intercommunication is to paralyze its sensory organization”.
To clarify the context of the quote: lines of communication are external to the army - emails, newspapers, radio, TV, Internet, etc. (a nation’s C2, if you will). Lines of intercommunication are reports from the field and orders to the field (the military’s own C2).
Indeed, the theme of controlling an opponent's communication and command structure is repeated throughout not just this book but others. Information security is as much about securing information as it is about securing external and internal command and control. At both strategic and tactical levels, this requires careful consideration.
“[...] it is wise in war not to underrate your opponents. it is equally important to understand his methods, and how his mind works.”
Compare that with the platitudes which constantly deflate the attacker: “script kiddie”, “miscreant”, “kid living in his mom’s basement”, etc. We can hurt our sense of mission by underrating our opponents. The Honeynet Project got this right over 10 years ago with their “Know Your Enemy” papers.
“natural obstacles are inherently less formidable than human resistance in strong defenses” (emphasis mine).
Skilled humans are more adaptable and key to security operations than automated defenses or ‘obstacles’.
“the weaker the defending side, the more essential it becomes to adopt mobile defense” (emphasis mine)
The abstraction of technology through wireless, clouds, and mobile devices just may have a positive affect against targeted attacks. To get there, we also need mobile defenses (visibility and C2).
“deprive the enemy of his freedom of action”.
Laws are historically what civic society uses against criminals. International laws simply don’t exist or extend far enough to deprive attackers of their freedom of action. Defenders also have a habit of constraining our own freedom of action: by not understanding organizational hurdles, by reducing our mission to one of compliance, or otherwise.
“... in war every problem and every principle is a duality”.
Too much infosec thought rests on the assumption that there isn’t an intelligent attacker on the other end, but rather some sort of inorganic and stagnant thing that needs to be discovered and mitigated. The duality of attacker and defender is much closer to the truth.
And finally, Bourcet's axiom:
“every plan of campaign ought to have several branches and to have been so well thought out that one or other of the said branches cannot fail of success”.
Defense in Depth was designed to achieve this. The classic DiD tactic is to create backup defenses through several layers to filter out an attack before it reaches its target. We’re now seeing attacks where, if the target isn’t achieved, the offense will leverage what it did achieve and use that as a base to continue the campaign or mission. Perhaps, more precisely, current DiD tactics do not take into account campaign-level attacks, Liddell Hart’s thoughts on rapidity and especially indirectness, nor Clausewitz’s concept of friction. It doesn’t help that the large (aka lethargic) infrastructures we secure simply can’t support a holistic DiD solution due to over-complexity and constant change within said infrastructure.
Thursday, February 24, 2011
Patterns for Successful Incident Response
What are themes and strategies that make up successful detection and response operations? This has been a question in my head for the last several days. This post is an attempt to generate discussion and dialog on that. Here is my stab:
- A genuine desire to champion a sustainable security operation. This statement is meaningless unless juxtaposed with the stereotypical organizations that deploy security in order to meet an outcome needed for compliance, regulations, business requirements, et al. You may do the right thing, but for the wrong reason: executing to bare minimums instead of raising the bar.
- The expectation that prevention eventually fails. This is a core tenet of Bejtlich's NSM mindset. Bejtlich reasons that it's inevitable that someone exists who is smarter than he, which suggests he should prepare for that individual instead of ignoring him. I agree with that, and also extend that thought. This isn't simply preparing for the worst. This is preparation for an intelligent and unpredictable yet rational person. In this sense, security is a form of competition. If you pardon a comparison: Football players do not train because they assume they will win; they train because they must prepare themselves in order to have a chance to win.
- Technology is not the key, it's a tool. The team is the key. The team must be in harmony, adaptable, rapid, capable, and make the right decisions. Or at least make quicker and better decisions than the attacker.
- Organization. You need more than the team. You need C2 that can mobilize leadership and other departments as needed. At first glance this suggests top leadership will command and control the situation, but this is not the path you want to go down. Instead, the structure created needs to "lead while monitoring". To fully appreciate this "leading while monitoring" expression, please read Boyd's Organic Design for C2.
- Honest detection. If you can outline the story for a particular past or ongoing security incident you begin going down a path of observational security. It's akin to the snowball going downhill that steadily grows. It can be treated as a feedback loop. Your first incident can generate visibility and importance of logs and events. That will identify more issues. Those will in turn slowly generate a mature response capability.
- Externally focused. This is an understanding of external threats you face. It's also collaboration with external allies and a constant re-assessment of your operating environment.
- Feedback loops. Detection is the first feedback loop. There are more. Incidents will uncover which security controls work and which don't. These lessons need to be fed back into the environment in a measured way. Each process should be examined to discover where feedback into other processes can streamline, generate momentum, reduce defensive friction, and improve operations.
This raises my next question. What patterns make up an unsuccessful security operation? Is it simply the opposite of the above?
Wednesday, February 2, 2011
Social Disclosure
Limited disclosure. Responsible disclosure. Full disclosure. These are varying levels of loose expectations, or cultural norms, that certain circles of the security community respected.
We're so beyond that point. Security "researchers", vulnerability buyers, software vendors, universities, the press, government, and everyone in between are fragmented. Fragmented on Twitter, Facebook, blogs, forums, and (old school!) mailing lists. The discussions and announcements have moved from a small slice of the Internet (mailing lists and individual emails) to social media at large. There is no one culture or expectation anymore. There may be disclosure in the usual places, or it may be on the company's Facebook or Twitter feed. In plain view of everyone, not just security geeks. I expect this small tweak can have larger ramifications for the discourse.
There are no norms; however, social content reaches everyone at the same time, and communication is expected to be bi-directional, transparent, and generally honest. If that's true, then the vulnerability "owner" must interject itself into the disclosure and establish dialog and understanding. This is increasingly likely to be public.
Why? Because the "disclosurer" can now dictate the public discussion. One last time: social disclosure has no particular custom or cultural norm (yet?).
Just throwing it out there. Want more? Read McLuhan or Shirky.
ShmooCon 2011
I've been attending ShmooCon since 2005. I enjoy it for a few reasons. It whips me out of complacency and reminds me why I enjoy what I do. It's a chance to remove the organizational weight that you carry during the day and refocus on the true complexities. It's also a fun time, and I get to hang out with everyone.
I jotted down 2 one liner notes to myself during the con. This is a brief expansion of those.
First, my perspective on Mudge's hackerspace talk (or what I've named "The l0pht mindset infiltrates DARPA"). Have you read esr's Cathedral and Bazaar? You should. Both black markets and certain nation-states have fully embraced the bazaar concept. Hackerspaces offer a potential avenue for Mudge to leverage the same strength of the bazaar from within DARPA's Cathedral. This is the asymmetric advantage that I believe Mudge alluded to near the end of his talk.
Secondly, I noted an upward trend of talks and mentions of "defense" and "offense". Not black or white hat, and not researcher. We need more love for defenders, and this is a great trend. There's also a trend of increasingly discussing the active use of intel as part of the security program. Certainly the Mandiant guys mentioned it, but Mudge, Richard Rushing, and the INTERSECT guys also keyed in on it.
Tuesday, January 18, 2011
Blackboard Security
Ronald Coase is a Nobel Prize-winning economist. I've been reading a few of his papers. As an introduction to a 1974 essay entitled "The Lighthouse in Economics", he states:
"... They paint a picture of an ideal economic system, and then, comparing it with what they observe (or think they observe), they prescribe what is necessary to reach this ideal state without much consideration for how this could be done. The analysis is carried out with great ingenuity but it floats in the air. It is, as I have phrased it, "Blackboard economics"..."1
As a security professional, this should be given careful consideration.