SANS Digital Forensics and Incident Response Blog: Category - Evidence Analysis

Artifact Timeline Creation and Analysis - part 2

In the last post I talked about the tool log2timeline, and mentioned a hypothetical case that we are working on. Let's explore in further detail how we can use the tool to assist us in our analysis.

How do we go about collecting all the data that we need for the case? We know that we were called in to investigate only hours after the alleged policy violation, so a timeline can be a very valuable source of evidence. We therefore decide to construct a timeline from artifacts found on the system, so that we can examine the evidence with respect to time. By doing so we both get a better picture of the events that occurred and may be led to other artifacts that we need to examine more closely using other tools and techniques.

To begin, you image the drive. You take an image of the C: drive (first partition) and start working

...


Artifact Timeline Creation and Analysis - Tool Release: log2timeline

Using timeline analysis during investigations can be extremely useful, yet it sometimes misses important events that are stored inside files on the suspect system (log files, OS artifacts). By depending solely on the traditional filesystem timeline, you may miss context that is necessary to get a complete picture of what really happened. So to get "the big picture," or a complete and accurate description of events, we need to dig deeper and incorporate information found inside artifacts or log files into our timeline analysis. These artifacts or log files could reside on the suspect system itself or on another device, such as a firewall or a proxy (or any other device that records information that might be relevant to the investigation).
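To make the merging idea concrete, here is a minimal sketch (not log2timeline itself; the proxy log line and its field layout are assumptions for illustration) that turns a log entry into a row of the Sleuth Kit's pipe-delimited bodyfile format, so it can be sorted into the same timeline as filesystem metadata by mactime:

```python
import time

def log_entry_to_bodyfile(timestamp, description):
    """Render one event as a Sleuth Kit 3.x bodyfile row:
    MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
    Fields that do not apply to a log event are zeroed; the event
    time is placed in the mtime column."""
    return "0|{name}|0|0|0|0|0|0|{mtime}|0|0".format(
        name=description, mtime=int(timestamp))

# Hypothetical proxy log line (format assumed for this sketch)
line = "2009-07-16 13:19:02 GET http://example.com/"
stamp = time.mktime(time.strptime(line[:19], "%Y-%m-%d %H:%M:%S"))
row = log_entry_to_bodyfile(stamp, "[proxy] " + line[20:])
print(row)
```

Each event becomes one bodyfile row; concatenating such rows with an fls-generated bodyfile and running mactime over the combined file yields a single, time-ordered view of filesystem and log activity.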

Unfortunately there are few tools out there that can parse and produce body files from the various artifacts found on different operating systems to include with the traditional filesystem analysis. A version of mactime first appeared in The

...


Sizing up the FAT

Another Tuesday, another FAT post. If you're just joining us, you can find the whole series of posts here.

Over the last month or two, we've been working with a FAT image that was modified by the suspect in a case we're working on. We have slowly been undoing the changes made by the suspect, one file at a time. We have one file left to make right, so let's see what's going on with our third and final file. Recall the output from fls from previous posts:
fls output from usbkey.img
We've already recovered the first two files and adjusted the cluster chains in


De-mystifying Defrag: Identifying When Defrag Has Been Used for Anti-Forensics (Part 1 - Windows XP)

I have seen the following Windows Prefetch entries on nearly every Windows XP / Vista machine that I have reviewed over the past several years. Their existence always reminds me of the imperfect nature of information gained via individual artifacts. Does this mean that a user ran the Microsoft Defragmenter application on July 16, 2009 at 1:19 PM? Or was the defragmenter started automatically by Windows? The defragmenter has been used very effectively as an anti-forensic tool since it was first introduced. In cases where data spoliation could be important, it is critical for the examiner to be able to identify any overt actions by a user. Complicating this is that, starting with Windows XP, the operating system conducts limited defragmentation approximately every three days. [1] This post seeks to identify forensic artifacts which can help us determine if a user initiated the defrag

...


A big FAT lie part 2

Last week we looked at the next file in our disk image (next file according to the output from fls). We saw that though the file was 15585 bytes, istat reported only a single sector for the file. Based on our cluster/sector size of 512 bytes, the file should have occupied 31 sectors (15585/512=30.439..., round up).
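The rounding step can be written out in a few lines, using the numbers taken straight from the post:

```python
import math

file_size = 15585   # bytes, as reported by fls/istat in the post
sector_size = 512   # bytes per sector (cluster size == sector size here)

# 15585 / 512 = 30.439..., so the file needs 31 full sectors
sectors_needed = math.ceil(file_size / sector_size)
print(sectors_needed)  # 31
```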

We theorized that we may have a broken or missing FAT cluster chain. Knowing the file should occupy 31 clusters, we used the blkcat command to carve out 31 clusters of data beginning at sector 834. We found only nulls.
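For intuition about what a broken chain looks like, here is a toy FAT16 chain walker. This is a simplified model, not a parser for the actual image: the FAT is represented as a plain Python list of 16-bit entries indexed by cluster number.

```python
def walk_fat16_chain(fat, start_cluster):
    """Follow a FAT16 cluster chain until an end-of-chain marker.

    `fat` maps cluster number -> next cluster; values >= 0xFFF8 mark
    end of chain, and 0x0000 means unallocated, i.e. the chain is
    broken or was never linked (as suspected in the post)."""
    chain = []
    cluster = start_cluster
    while cluster != 0x0000 and cluster < 0xFFF8:
        chain.append(cluster)
        cluster = fat[cluster]
    return chain

# Toy FAT: cluster 3 -> 4 -> 5 -> end-of-chain
fat = [0] * 8
fat[3], fat[4], fat[5] = 4, 5, 0xFFF8
print(walk_fat16_chain(fat, 3))  # [3, 4, 5]
```

If an entry mid-chain has been zeroed, the walk stops early, which is exactly the "single sector" symptom istat showed for our 15585-byte file.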

At this point, most sane investigators probably would have reached for their favorite

...