Friday, December 15, 2006

Practicing NSM via DIY attitude

I read Tao of NSM when it first came out. A lot of the ideas behind the book just make sense, especially when you truly need actionable data (don't even get me started on IDS consoles or MMC snap-ins).
It also suits (or helped develop?) my belief that the data is what's important. Event data, session data, virus reports, firewall logs, vulnerability data. Good stuff! This is where my hatred for vendors comes in: they never work together, and they constantly ship shoddy interfaces. This is also where my DIY attitude kicks in.

Tools needed:
  • perl
  • cron
  • your existing toolset

Now, ignore your toolset and concentrate on the data it provides. Call the vendor and get access to their SQL database. Pull together your session data and correlate it to your IDS events. Then combine that with open source intelligence, such as the dshield blocklist or where a netblock is registered. Add your own whitelist/blacklist checks. Then pull in your antivirus infections. Set up data to pinpoint where a local machine physically resides.

I finally got around to writing a portion of this. I am receiving hourly emails on non-US sites that have sessions to my network (flow data). I check these non-US (and US) sites against dshield's blocklist. I count how many connections I've seen, whether they are on my own personal white/blacklist, and how many IDS events the connections have generated. This is all very actionable data, and it's all in just one table. More is on the way.
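For flavor, the core of that hourly check boils down to something like the sketch below. The file names and formats here are invented for illustration (flows.txt as "src dst" pairs pulled from the flow data, dshield.txt as a simplified local copy of the blocklist, my-list.txt as "ip white|black" entries), and the IDS event join is left out:

#!/usr/bin/perl
# Sketch of the hourly blocklist check described above. All file names
# and formats are hypothetical; the real script reads from a database.
use strict;
use warnings;

my %dshield = map { chomp; ($_ => 1) } read_lines('dshield.txt');
my %mylist  = map { chomp; split(/\s+/, $_, 2) } read_lines('my-list.txt');

# count connections per remote address
my %count;
foreach my $line (read_lines('flows.txt')) {
    my ($src, $dst) = split(' ', $line);
    $count{$src}++;
}

# one row per address: connection count, dshield hit, personal list status
foreach my $ip (sort { $count{$b} <=> $count{$a} } keys %count) {
    printf("%-15s %6d conns  dshield:%s  list:%s\n",
        $ip, $count{$ip},
        $dshield{$ip} ? 'YES' : 'no',
        $mylist{$ip} || 'none');
}

sub read_lines {
    my $f = shift;
    open(my $fh, '<', $f) or die "can't open $f: $!\n";
    return <$fh>;
}

Throw a line like "0 * * * * flowcheck.pl | mail -s 'hourly flows' me" into cron (again, hypothetical names) and the table lands in your inbox every hour.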

I still need to look at session data from a services/ports perspective, and add in my IDS stats, antivirus stats, and maybe some web proxy stats just for added fun. In short, if I can aggregate all these feeds into one source, a few really neat things happen. The most important is situational awareness, but I can also summarize this data into daily/weekly/monthly reports. That sounds awfully close to being able to trend your posture over time. You may even be able to draw inferences about performance, security, and ROI from such things.

Mental picture: Your data is a small shiny sphere. You are Pac-Man.

Thursday, November 16, 2006

Net::FTP::Recursive

I had some issues trying to use Net::FTP::Recursive yesterday. It turns out that it only understands UNIX-style ls listings, so when you try to rget from an FTP site with DOS-formatted listings, it chokes silently. Googling turned up lots of questions about this but no answers. The POD does mention something about ParseSub, but you have to pay attention to what it's talking about: the ParseSub argument lets you supply your own parsing routine, so the module knows how to interpret the listing format. It's all very straightforward once you read the source....

If you're trying to work out why rget isn't working, check whether your FTP site is using DOS-style listings. If it is, feel free to use this:

$ftp->rget(ParseSub => \&somesub);

sub somesub {
    my @returnedobjs = ();

    foreach my $line (@_) {
        # DOS-style listing lines look like:
        # 12-15-06  03:22PM       <DIR>          somedir
        # 12-15-06  03:25PM              1234567 somefile.txt
        my (@fields, $file);

        next unless $line =~ /^
            \d{1,2}-\d{1,2}-\d{1,2}\s+    # date
            \d{1,2}:\d{1,2}\w\w\s+        # time of day nn:nn{AM|PM}
            (<DIR>|\d*)\s+                # either <DIR> or filesize
            (\S*)                         # name of entry
            $/x;

        $fields[6] = $2;    # put the filename where the module expects it

        # leftover from my own script, which grabbed a particular file here:
        # if (isDat($fields[6])) { $datfile = $fields[6]; }

        next if $fields[6] =~ /^\.{1,2}$/;    # skip . and ..

        if ($1 eq '<DIR>') {
            $file = Net::FTP::Recursive::File->new(
                IsDirectory  => 1,
                IsPlainFile  => 0,
                IsSymlink    => 0,
                OriginalLine => $line,
                Fields       => [@fields],
            );
        }
        else {
            $file = Net::FTP::Recursive::File->new(
                IsDirectory  => 0,
                IsPlainFile  => 1,
                IsSymlink    => 0,
                OriginalLine => $line,
                Fields       => [@fields],
            );
        }

        push(@returnedobjs, $file);
    }

    return(@returnedobjs);
}

Wednesday, August 30, 2006

Firefox plugins and RSS

Now that I have a Mac mini at home, I feel naked without the RSS feeds I routinely read from work. I use rssfeeder.net as an aggregator on my work machine.

I just need something basic on my Mac, so I plan on using Firefox's built-in RSS support as opposed to something like NewsFire. I found a plugin that can import OPML files to bring in my RSS list. (some other good plugins)

Wednesday, August 23, 2006

hexview.pl

For the last week or two I've been making a conscious attempt to know my hex values better. The ASCII 'a' is 0x61, or 97 in decimal; things along those lines. Why? It seems like one of those important basic things that I never got around to. So I had some downtime tonight and realized it'd be pretty easy to write a basic hexviewer script in Perl. It also works as a rudimentary 'strings'. Admittedly not very useful. The code
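I won't paste the whole thing here, but a minimal version of the idea looks something like this (my own sketch of a hexviewer-plus-strings; the actual hexview.pl linked above may differ):

#!/usr/bin/perl
# Minimal hexviewer sketch: offset, hex bytes, then the printable
# characters (which is what makes it double as a rudimentary strings).
use strict;
use warnings;

my $file = shift or die "usage: $0 <file>\n";
open(my $fh, '<', $file) or die "can't open $file: $!\n";
binmode($fh);

my $offset = 0;
while (my $len = read($fh, my $buf, 16)) {
    my $hex = join(' ', map { sprintf '%02x', ord } split //, $buf);
    (my $ascii = $buf) =~ s/[^\x20-\x7e]/./g;   # non-printables become dots
    printf "%08x  %-47s  %s\n", $offset, $hex, $ascii;
    $offset += $len;
}
close($fh);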

Friday, June 23, 2006

Security Metrics (part 1)

I attended two of Dan Geer's presentations in the last year. The first was at ShmooCon 2006, while the other was at USENIX. Both were about finding and applying metrics to security.

I'm currently writing some neat scripts to measure changes in firewall configurations. That's a whole other ball of wax that just happens to fit into this metrics post (and will eventually be part 2).

Something simpler: passwords. I thought it'd be a simple case of pulling out JtR and then making some silly charts. That ended up having lots of variables. To keep things sane, I'm using stock (documented) hardware along with a typical install of Fedora. I then compiled john with the standard x86-linux-mmx flags. These are all things I'm tracking to verify that, over time, the test constants stay the same.
Then I dug into JtR, as I'd never used it before. Props to Solar Designer; it's really pretty sweet. I didn't find anything that lets me see how long a certain password took to crack, though. That made me sad, which made me resourceful.

Here is a small Perl script I wrote (it simply requires the Proc::Background module). It's slightly hackish, but golly, it's cool.
./jtrtimer.pl -h ntlmhashes.pwdump -t incremental -o output.csv -i 30:120:300:3600:28800

This revs up JtR to brute-force the passwords in ntlmhashes.pwdump. It then outputs the number of passwords cracked at each of the intervals separated by the colons:
$ cat output.csv
100:30
110:120
142:300
231:3600
258:28800
$

Which is pretty lame until you import it into Excel:

[chart: cracked passwords over time]

Now, run this on a routine basis (say, quarterly) and you can track whether users' passwords are getting stronger or weaker over time. This can provide data on whether your current password policy is adequate, or whether password awareness is adequate.

And it'll get looked at, because everyone likes graphs. This is a rough script, and I've only run it against test hashes. I should have a more streamlined version (one that converts seconds to minutes, etc.) in the next few weeks.
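Until then, the heart of the approach works roughly like this. This is a sketch with the option parsing stripped out and the values from the example run hardcoded; it assumes john is in your PATH and understands the hash format:

#!/usr/bin/perl
# Rough sketch of the jtrtimer.pl timing loop described above.
use strict;
use warnings;
use Proc::Background;

my $hashfile  = 'ntlmhashes.pwdump';
my $outfile   = 'output.csv';
my @intervals = (30, 120, 300, 3600, 28800);   # seconds, from the -i flag

# kick off john in incremental (brute-force) mode in the background
my $john = Proc::Background->new('john', '--incremental', $hashfile);

open(my $out, '>', $outfile) or die "can't write $outfile: $!\n";
my $elapsed = 0;
foreach my $mark (@intervals) {
    sleep($mark - $elapsed);    # wait until the next checkpoint
    $elapsed = $mark;

    # 'john --show' prints one colon-delimited line per cracked entry,
    # plus a summary line (no colon); count the cracked lines
    my @shown = grep { /:/ } `john --show $hashfile`;
    printf $out "%d:%d\n", scalar(@shown), $mark;
}
close($out);
$john->die if $john->alive;     # stop john after the last checkpoint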

Monday, June 12, 2006

Back to the basics

I was working on documentation today.  It was complete slaughter.  Not only did I have some coherent sentences, but I had simple and elegant solutions to problems.  What problems, you, the intelligent reader, may ask?

I have no idea.

Mission creep and scope growth rear their heads when you least expect them.  Let's go back to basics:

  1. Define the question
  2. Gather information and resources
  3. Form hypothesis
  4. Perform experiment and collect data
  5. Analyze data
  6. Interpret data and draw conclusions that serve as a starting point for new hypotheses
  7. Publish results

The wiki has spoken

Saturday, April 15, 2006

A Pointless Rant

The infamous question: "So, what do you do for a living?". The dreaded reply after my answer: "What does a security analyst do, exactly?". After attempting to answer this basic question on numerous occasions, it became apparent that the answer is more subtle than one would think. What do security analysts do? The easy answer to Joe Average is akin to "I keep the hackers out". This is the fun answer that creates mental images of outwitting spies and subverting corporate espionage.

Another boiled-down answer is "I keep the operational tools running to keep security in place". That's closer to the truth. The hard part with operations is inadvertently forgetting that the means serve the ends. Or, more accurately, that the end goal is to create more security, not to maintain the tools that provide it.

How does one keep the tools well maintained while keeping the end goal of security intact? Proxy, IDS, IPS, firewall, antivirus: the buzzword list is infinite. The chances of these tools integrating with each other are slim as well. Just keeping patches current in a big environment is a task in itself; it is extremely easy to go from day to day maintaining these tools and never actually utilizing them.

Each security shop is different, but those that both maintain and use their toolset must examine their priorities. It is too easy to administer the tools instead of using them to enhance security. At the end of the day, the mindset of the group is the key.

Back to the original question: "What do you do for a living?". I put forth that the answer is not "catching bad guys" or "implementing tools to catch bad guys". At the end of the day, the job is to lower the company's risk from whatever threats may exist, and that can't even be attempted until you know what the initial risks are. Tools are not the key, much to the chagrin of the security vendors. The key is data. It's the data that shows you your current defensive stance, the threat's stance, and the long-term trend of how risk has been mitigated for the company.

What are the odds that company X has antivirus deployed? What are the odds that company X generates infection reports only after a malware outbreak has occurred? Amazing trends can be found in daily infection reports, if anyone actually looks at them. The same goes for all the tools in the typical analyst's toolkit. A lot of emphasis has always been placed on IDS monitoring, and yet even that can be subpar in most cases.

The vendors don't make it easy on analysts. They may let you export data into Excel or create PDFs. They may accept feeds from other vendors' tools as well. But at the end of the day, a truly customized data solution is needed, and the lowest common denominator is what lets that customized view take form. In some cases that's flat text; in most cases it's SQL queries. The vendors do understand that the data they collect is what gives their tools power, and they typically store it in industry-standard SQL databases. Querying that data, combining it with other feeds, generating daily reports from it, and baselining it is how a true picture of the network security posture takes shape. Acting on that data is when true security is accomplished. After all, a security analyst should be analyzing data at the end of the day.
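To make "SQL as the lowest common denominator" concrete, the kind of query I mean looks like the sketch below. The DSN, credentials, and schema (an ids_events table with src_ip, sig_name, and event_time columns) are all invented for illustration, since every vendor's backend differs:

#!/usr/bin/perl
# Sketch of pulling a daily event summary straight out of a vendor's
# SQL backend. Connection details and schema are hypothetical.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=idsdb;host=dbhost',
                       'readonly', 'secret', { RaiseError => 1 });

# top talkers by signature over the last day
my $rows = $dbh->selectall_arrayref(q{
    SELECT src_ip, sig_name, COUNT(*) AS hits
      FROM ids_events
     WHERE event_time >= DATE_SUB(NOW(), INTERVAL 1 DAY)
     GROUP BY src_ip, sig_name
     ORDER BY hits DESC
});

foreach my $r (@$rows) {
    printf "%-15s %-40s %6d\n", @$r;
}
$dbh->disconnect;

From there it's a small step to merging in the antivirus and proxy feeds and baselining the daily counts.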