Thursday, July 22, 2010

Detective Cindy Murphy’s Cell Phone Evidence Extraction Process

This is a special edition of the blog in which I am honored  to host a full paper that has been authored by Detective Cindy Murphy of the Madison, Wisconsin Police Department.  Cindy Murphy is a lethal forensicator and if you make the decision to engage in technology related crime, I highly recommend avoiding doing so in Madison, Wisconsin. Cindy will find you and you will go to jail.  Don’t say you weren’t warned. 

In addition to being a very sharp computer forensic examiner, she does cutting edge research in the area of mobile device forensics. For example, she developed the fraternal clone method for CDMA phones, which has been published in the Small Scale Digital Device Forensics Journal.

You can find a PDF version of this paper over at Jesse Kornblum’s website.

Cellular Phone Evidence Data Extraction and Documentation

By Detective Cindy Murphy

Developing Process For The Examination
of Cellular Phone Evidence

Recently, digital forensic examiners have seen a remarkable increase in requests to examine data from cellular phones. The examination of cellular phones and the extraction of data from them present challenges for forensic examiners:

· The numbers of phones examined over time using a variety of tools and techniques may make it difficult for an examiner to recall the examination of a particular cell phone.

· There is an immense variety of cellular phones on the market, encompassing an array of proprietary operating systems and embedded file systems, applications, services, and peripherals.

· Cellular phones are designed to communicate with the phone network and other networks via Bluetooth, infrared and wireless (WiFi) networking. To best preserve the data on the phone it is necessary to isolate the phone from surrounding networks, which may not always be possible.

· Cellular phones employ many internal, removable and online data storage capabilities. In most cases, it is necessary to apply more than one tool in order to extract and document the desired data from the cellular phone and its storage media. In certain cases, the tools used to process cellular phones may report conflicting or erroneous information, thus, it is critical to verify the accuracy of data from cellular phones.

· While the amount of data stored by phones is still small when compared to the storage capacity of computers, the storage capacity of these devices continues to grow.

· The types of data cellular phones contain and the way they are being used are constantly evolving. With the popularity of smart phones, it is no longer sufficient to document only the phonebook, call history, text messages, photos, calendar entries, notes and media storage areas. The data from an ever-growing number of installed applications should be documented as these applications contain a wealth of information such as passwords, GPS locations and browsing history.

· The reasons for the extraction of data from cellular phones may be as varied as the techniques used to process them. Cellular phone data is often desired for intelligence purposes and the ability to process phones in the field is attractive. Sometimes though, only certain data is needed. In other cases full extraction of the embedded file system and the physical memory is necessary for a full forensic examination and potential recovery of deleted data.

Because of the above factors, the development of guidelines and processes for the extraction and documentation of data from cellular phones is extremely important. What follows is an overview of process considerations for the extraction and documentation of cell phone data.

Cellular Phone Evidence Extraction Process


Figure 1: Evidence Extraction Process

Evidence Intake PHASE

The evidence intake phase involves the procedure by which requests for examinations are handled. It generally entails request forms and intake paperwork that document the chain of custody, ownership information, and the type of incident the phone was involved in, and that outline general information regarding the type of data the requester is seeking to have extracted or documented from the phone.

A critical aspect of this phase of the examination is the development of specific objectives for each examination. This not only serves to document the examiner’s goals, but also assists in the triage of examinations and begins the documentation of the examination process for each individual phone examined.

Identification PHASE

For every examination, the examiner should identify the following:

· The legal authority to examine the phone

· The goals of the examination

· The make, model and identifying information for the cellular phone itself

· Removable & external data storage

· Other sources of potential evidence

Legal Authority:

Case law surrounding the search of data contained within cellular phones is in a nearly constant state of flux. It is imperative that the examiner determines and documents what legal authority exists for the search of the cellular phone, as well as what limitations are placed on the search of the phone prior to the examination of the device:

  • If the cellular phone is being searched pursuant to a warrant, the examiner should be mindful of confining the search to the limitations of the warrant.
  • If the cellular phone is being searched pursuant to consent, the examiner should note any possible limitations of the consent (such as consent to examine the call history only) and should determine whether consent is still valid prior to examining the phone.
  • In cases where the phone is being searched incident to arrest, the examiner needs to be particularly cautious, as current case law in this area is particularly problematic and in a state of constant change.

Particular questions as to the legal authority to search a cellular phone should be directed to a knowledgeable prosecutor or legal advisor in the examiner’s local area. (Mislan, Casey & Kessler, 2010)

The Goal of the Examination:

While the general process used to examine any given cellular phone should be as consistent as possible, the goal of the examination for each phone may be significantly different. It is unlikely that any given forensics lab has either the capability or the capacity to examine every cellular phone that contains data of evidentiary value in every kind of case. For this reason, it can be useful to identify what level of examination is appropriate for any given cellular phone.

The first of two main considerations is who will be responsible for the process of documenting the data. The second main consideration is how in depth the examination needs to be. Of those phones that are submitted to the lab for examination, there will be differences in the goals of each examination based upon the facts and circumstances of the particular case.

In some cases evidence from cellular phones may be documented in the field either by hand or photographically. For example, in the interest of returning a victim’s main communication lifeline while still documenting information of evidentiary value, or in the case of documenting evidence in a misdemeanor or minor offense, field documentation would be a reasonable alternative to seizing the device. In other cases, it may be sufficient to have an officer or analyst with basic training in the examination of cellular phones perform a quick dump of cellular phone data in the field, specifically for intelligence purposes using commercially available tools designed for this purpose.

A smaller subset of cellular phones may be submitted for examination with the goal of targeted extraction of data that has evidentiary value. Specifically targeted data, such as pictures, videos, call history, text messages, or other specific data may be significant to the investigation while other stored data is irrelevant. It also might be the case that only a certain subset of the data can be examined due to legal restrictions. In any event, limiting the scope of the exam will likely make extraction and documentation of the data less time consuming.

The goal of the exam may alternatively include an attempt to recover deleted data from the memory of the phone. This is only possible if a tool is available for a particular phone that can extract data at a physical level (see the levels detailed at the end of this section). If such a tool is available, the examination will involve traditional computer forensic methods such as data carving and hex decoding, and may be a time consuming and technically involved endeavor, as deeper levels of examination necessitate a more technically complex process.


Figure 2: Goal of the Exam

The goal of the exam can make a significant difference in what tools and techniques are used to examine the phone. Time and effort spent initially on identification of the goal of the exam with the lead investigator in the case can lead to increased efficiency in the examination process.


These types of realities should be addressed in training, and the triage process related to the initial submission of cellular phones for examination should be based upon the individual circumstances and severity of the case.

The Make, Model and Identifying Information for the Cellular Phone:

As part of the examination of any cellular phone, the identifying information for the phone itself should be documented. This enables the examiner not only to identify a particular phone at a later time, but also assists in the determination about what tools might work with the phone as most cellular phone forensic tools provide lists of supported phones based on the make and model of the phone. For all phones, the manufacturer, model number, carrier and the current phone number associated with the cellular phone should be identified and documented.

Depending upon the cellular phone technology involved, additional identifying information should be documented, if available, as follows:

CDMA cellular phones:

The Electronic Serial Number (ESN) is located under the battery of the cellular phone. This is a unique 32-bit number assigned to each mobile phone on the network. The ESN may be listed in decimal (11 digits) and/or hexadecimal (8 hex digits). The examiner should be aware that the hex version of the ESN is not a direct numeric conversion of the decimal value. An ESN converter can be found at http://www.elfqrin.com/esndhconv.html.
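The reason the hex form is not a direct numeric conversion is that the decimal form renders the ESN's two binary fields separately: the first 3 decimal digits are the 8-bit manufacturer code and the last 8 decimal digits are the 24-bit serial number. A minimal Python sketch of the conversion (the ESN values shown are made up for illustration):

```python
def esn_dec_to_hex(esn_dec: str) -> str:
    """Convert an 11-digit decimal ESN to its 8-digit hex form.

    The two fields are converted independently, which is why the
    hex ESN is not a numeric conversion of the whole decimal value.
    """
    mfr = int(esn_dec[:3])       # 8-bit manufacturer code
    serial = int(esn_dec[3:])    # 24-bit serial number
    return f"{mfr:02X}{serial:06X}"

def esn_hex_to_dec(esn_hex: str) -> str:
    """Convert an 8-digit hex ESN back to its 11-digit decimal form."""
    mfr = int(esn_hex[:2], 16)
    serial = int(esn_hex[2:], 16)
    return f"{mfr:03d}{serial:08d}"
```

Either direction should round-trip; an online converter such as the one linked above can be used to double-check any particular value.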

The Mobile Equipment ID (MEID), also found under the battery cover, is a 56-bit number which replaced the ESN due to the limited number of 32-bit ESNs. The MEID is listed in hex, where the first byte is a regional code, the next three bytes are a manufacturer code, and the remaining three bytes are a manufacturer-assigned serial number. CDMA phones do not generally contain a Subscriber Identity Module (SIM) card, but some newer hybrid phones contain dual CDMA and GSM technology and can be used on either CDMA or GSM networks. These dual technology phones have a slot for a SIM card, and the identifying information under the battery may list an IMEI number in addition to the ESN/MEID number.
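The byte layout just described can be illustrated with a small parser. The field boundaries follow the description above, and the sample MEID used in the example is illustrative only, not a real device:

```python
def parse_meid(meid_hex: str) -> dict:
    """Split a 14-hex-digit (56-bit) MEID into its fields:
    1-byte regional code, 3-byte manufacturer code, and
    3-byte manufacturer-assigned serial number."""
    meid_hex = meid_hex.upper()
    if len(meid_hex) != 14:
        raise ValueError("MEID should be 14 hex digits (56 bits)")
    return {
        "regional_code": meid_hex[:2],   # 1 byte
        "manufacturer": meid_hex[2:8],   # 3 bytes
        "serial": meid_hex[8:],          # 3 bytes
    }
```

Recording the fields separately in examination notes can make it easier to spot tools that report a truncated or transposed MEID.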

CDMA phones also have two other identifying numbers, namely, the Mobile Identification Number (MIN) and Mobile Directory Number (MDN). The MIN is a carrier-assigned, carrier-unique 24-bit (10-digit) telephone number. When a call is placed, the phone sends the ESN and MIN to the local tower. The MDN is the globally-unique telephone number of the phone. Prior to Wireless Number Portability, the MIN and MDN were the same but in today's environment, the customer can keep their phone number (MDN) even if they change carriers.

GSM cellular phones:

The International Mobile Equipment Identifier (IMEI) is a unique 15-digit number that identifies a GSM cellular phone handset on its network. This number is generally found under the battery of the cellular phone. The first 8 digits are a Type Allocation Code (TAC) and the next 6 digits are the Device Serial Number (DSN). The final digit is a Luhn check digit, which is often reported by the phone as 0.
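An examiner who wants to sanity-check an IMEI recorded from a handset label can recompute the check digit with the widely published Luhn algorithm. A brief Python sketch (the sample IMEI in the test is an illustrative value, not a real handset):

```python
def imei_check_digit(first14: str) -> int:
    """Compute the Luhn check digit over the first 14 IMEI digits
    (8-digit TAC + 6-digit serial)."""
    total = 0
    for i, ch in enumerate(first14):
        d = int(ch)
        if i % 2 == 1:        # double every second digit, left to right
            d *= 2
            if d > 9:
                d -= 9        # equivalent to summing the two digits
        total += d
    return (10 - total % 10) % 10

def imei_is_consistent(imei15: str) -> bool:
    """True if the 15th digit matches the computed Luhn check digit.
    A mismatch may simply mean the handset reported the digit as 0."""
    return imei_check_digit(imei15[:14]) == int(imei15[14])
```

Because many phones report the check digit as 0, a failed check is a prompt to verify the number against the label, not proof of a transcription error.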

On a GSM phone, there will be at least a Subscriber Identity Module (SIM) card slot, which is also generally located under the battery. The SIM card may be branded with the name of the network to which the SIM is registered. Also located on the SIM card is the Integrated Circuit Card Identification (ICCID), an 18 to 20 digit number (10 bytes) that uniquely identifies each SIM card. The ICCID number is tied to the International Mobile Subscriber Identity (IMSI), typically a 15-digit number stored electronically within the SIM and consisting of three parts: the Mobile Country Code (MCC; 3 digits), the Mobile Network Code (MNC; 3 digits in the U.S. and Canada, 2 digits elsewhere), and the Mobile Station Identification Number (MSIN; 9 digits in the U.S. and Canada, 10 digits elsewhere). The IMSI can be obtained either through analysis of the SIM or from the carrier.
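Splitting an IMSI into its components is straightforward once the MNC length is known; since that length varies by region, the sketch below takes it as a parameter rather than guessing. The sample IMSI in the test is fabricated for illustration:

```python
def parse_imsi(imsi: str, mnc_digits: int = 3) -> dict:
    """Split an IMSI into MCC / MNC / MSIN.

    The MNC length varies by region (3 digits in the U.S. and
    Canada, 2 digits in much of the rest of the world), so the
    caller supplies it.
    """
    return {
        "mcc": imsi[:3],
        "mnc": imsi[3:3 + mnc_digits],
        "msin": imsi[3 + mnc_digits:],
    }
```

The MCC/MNC pair can then be looked up against public carrier allocation tables to confirm the carrier reported by an extraction tool.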

The Mobile Station International Subscriber Directory Number (MSISDN) is the phone's 15-digit, globally unique number. The MSISDN follows the International Telecommunication Union (ITU) Recommendation E.164 telephone numbering plan, composed of a 1-3 digit country code, followed by a country-specific number. In North America, the first digit is a 1, followed by a 3-digit area code.

iDen cellular phones:

iDen cellular phones contain an International Mobile Equipment Identity (IMEI) that identifies an iDen cellular phone on its network. This number is generally found under the battery of the cellular phone. iDen cellular phones also contain SIM cards, with the identifying information described above, though they are based on different technology than GSM SIM cards and are not compatible with GSM cellular phones.

Unique to iDen cellular phones is the Direct Connect Number (DCN) which is also known as the MOTOTalk ID, Radio-Private ID or iDen Number. The DCN is the identifying number used to communicate directly device-to-device with other iDen cellular phones. This number consists of a series of numbers formatted as ###*###*##### where the first three digits are the Area ID (region of the home carrier’s network), the next three digits are the Network ID (specific iDen carrier) and the last five digits are a subscriber identification number which sometimes corresponds to the last five of the cellular phone number. (Punja & Mislan, 2008)
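The ###*###*##### format lends itself to a simple pattern match when documenting DCNs pulled from a handset. A minimal Python sketch (the DCN shown in the test is fictitious):

```python
import re

# ###*###*#####: Area ID * Network ID * subscriber ID
DCN_PATTERN = re.compile(r"^(\d{3})\*(\d{3})\*(\d{5})$")

def parse_dcn(dcn: str):
    """Split a Direct Connect Number into its three components,
    or return None if the string is not in ###*###*##### form."""
    match = DCN_PATTERN.match(dcn)
    if match is None:
        return None
    area_id, network_id, subscriber_id = match.groups()
    return {"area_id": area_id,
            "network_id": network_id,
            "subscriber_id": subscriber_id}
```

Rejecting malformed strings up front helps catch transcription errors before the DCN is recorded in examination notes.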

Removable /External Data Storage:

Many cellular phones currently on the market include the ability to store data to a removable storage device such as a Trans Flash Micro SD memory expansion card. In the event that such a card is installed in a cellular phone that is submitted for examination, the card should be removed by the examiner and processed using traditional digital forensics techniques. The processing of data storage cards using cellular phone forensic tools while the card remains installed within the phone may result in the alteration of date and time stamp data for files located on the data card.

Additionally, cellular phones may allow for the external storage of data within network based storage areas accessible to the phone’s user by computer. Accessing this data, which is stored on the cellular phone service provider’s network, may require further legal authority and is generally beyond the scope of the examination of a cellular phone handset. However, the potential existence of network based data storage should be taken into account by the examiner.

Many feature phones and smart phones are also designed to sync with a user’s computer to facilitate access to and transfer of data to and from the cellular phone. Full backups of the data from a phone may be found on the phone owner’s computer. The potential for the existence of these alternative storage areas should be considered by the examiner as additional sources of data originating from cellular phones.

Other Potential Evidence Collection:

Prior to beginning examination of a cellular phone, consideration should be given to whether or not other evidence collection issues exist. Cellular phones may be good sources of DNA, fingerprint or other biological evidence. Collection of DNA, biological and fingerprint evidence should be accomplished prior to the examination of the cellular phone to avoid contamination issues.

Preparation PHASE

Within the Identification phase of the process, the examiner has already engaged in significant preparation for the examination of the phone. However, the preparation phase involves specific research regarding the particular phone to be examined, the appropriate tools to be used during the examination, and preparation of the examination machine to ensure that all of the necessary equipment, cables, software and drivers are in place for the examination.

Once the make and model of the phone have been identified, the examiner can then research the specific phone to determine what available tools are capable of extracting the desired data from the phone. Resources such as phonescoop.com and mobileforensicscentral.com can be invaluable in identifying information about cellular phones and what tools will work to extract and document the data from specific phones. The SEARCH toolbar (available as a free download from www.search.org) contains additional and regularly updated resources for cellular phone examinations.

Appropriate Tools:

The tools that are appropriate for the examination of a cellular phone will be determined by factors such as the goal of the examination, the type of cellular phone to be examined and the presence of any external storage capabilities.

A matrix such as the one shown below, listing the tools available to the examiner and the general phone technologies (GSM, CDMA, iDEN, SIM card) with which each is compatible, may also be helpful.

 

Tool | CDMA | GSM | iDen | SIM | Logical Dump | Physical Dump
BitPim | X | | | | X |
Data Pilot Secure View 2 | X | X | | | |
Paraben Device Seizure | X | X | | X | X |
SIMCon | | | | X | |
iDen Media Manager | | | X | | |
Manufacturer / Other | X | X | X | X | |
Cellebrite | X | X | X | X | X | X
CellDEK | X | X | X | X | X | X
Oxygen Forensic Suite | X | X | | X | |
XRY / XACT | X | X | X | X | X | X

Figure 3: Cellular Phone tool matrix (Kessler, 2008)

Cellular Phone Tool Leveling System:

When identifying the appropriate tools for the analysis of cellular phones, a useful paradigm is the Cell Phone Tool Leveling System. (Brothers, 2009) The tool leveling system is designed to categorize mobile phone and GPS forensic analysis tools by the depth to which they are capable of accessing data on a given device. As you move up the pyramid (generally):

· Methods get more “forensically sound”

· Tools get more expensive

· Methods get more technical

· Analysis times get longer

· More training is required

· Methods are more invasive

 


Figure 4: Cellular Phone Tool Leveling Pyramid – (Brothers 2009)

1. Manual Analysis – physical analysis of the phone involving manual manipulation of the keyboard and photographic documentation of data displayed on the screen.

2. Logical Analysis - Connect a data cable to the handset and extract data using AT, BREW, etc. commands in a client/server architecture.

3. Physical Analysis (Hex Dump) - Push a boot loader into the phone, dump the memory from the phone, and analyze the resulting memory dump.

4. Physical Analysis (Chip-Off) - Remove the memory chip from the device and read it in either a second phone or an EEPROM reader.

5. Physical Analysis (Micro Read) - Use an electron microscope to view the state of memory.

In general, examinations gather more detailed information and take more time as one advances through the levels.

Other resources that should not be overlooked are cellular phone manufacturer’s and cellular phone carrier’s websites. They can contain links to electronic versions of manuals for almost every make and model of phone as well as cable and phone driver downloads. Manufacturers will also sometimes provide free software designed for users to access and synchronize the data on their phones with their computers. While these tools are not designed specifically for extraction and documentation of evidence, they may be useful when other tools do not work. Caution should be used when utilizing these tools, as they are generally designed to access and write data back to a cellular phone.

Isolation PHASE

Cellular phones are by design intended to communicate via cellular phone networks. They are also sometimes capable of connecting to each other and to other networks via Bluetooth, infrared and wireless (WiFi) network capabilities. For this reason, isolation of the phone from these communication sources is important prior to examination. Isolation of the phone prevents the addition of new data to the phone through incoming calls and text messages as well as the potential destruction of data through a kill signal or accidental overwriting of existing data as new calls and text messages come in.

Isolation also prevents overreaching of the legal authority of warrant or consent that covers data on the phone. If the phone is isolated from the network, the examiner cannot accidently access voicemail, email, Internet browsing history or other data that may be stored on the service provider’s network rather than on the phone itself.

Isolation of a cellular phone in the field can be accomplished through the use of Faraday bags or radio frequency shielding cloths which are specifically designed for this purpose. Other available items such as arson cans or several layers of tinfoil may also be used to isolate some cellular phones. One problem with these isolation methods however is that once they’re employed, it is difficult or impossible to work with the phone as you can’t see through them or work with the phone’s keypad through them. Faraday tents, rooms, and enclosures exist, but are cost prohibitive.

Additionally, some laboratories within the US federal government may use signal jamming devices to prevent radio signals from being sent or received in a given area. The use of such devices is illegal for many organizations, and they should only be employed after verifying the legality of their use for a given organization. Additional information about the FCC exceptions can be found at www.fcc.gov.

Another viable option is to wrap the cellular phone in radio frequency shielding cloth and to then place the phone into Airplane mode (Also known as Standalone, Radio Off, Standby or Phone Off Mode). Instructions for how to place a phone into Airplane mode can be found in the manufacturer’s user manual for the particular make and model of cellular phone.

Unfortunately, not all phones have an Airplane mode, and sometimes even the most seemingly foolproof isolation methods can fail. Even if a cellular phone is successfully isolated from all networks, user data can still be affected if automatic functions are set to occur, such as alarms or appointment announcements. If these situations arise the examiner should document their attempts to isolate the phone and whether any incoming calls, text messages or other data transmissions occur during the course of the examination.

Processing PHASE

Once the phone has been isolated from the cellular network and other communication networks, the actual processing of the phone may begin. The appropriate tools to achieve the goal of the examination have been identified in the steps described previously, and they can now ideally be used to extract the desired data from the phone.

Any installed data storage/memory cards should be removed from the cellular phone and processed separately using traditional computer forensics methods, because accessing data stored on these cards through use or processing of the cellular phone may alter date and time information for files stored on the card. Similarly, SIM cards should be processed separately from the cellular phone they are installed in to preserve the integrity of the data contained on the SIM card.

Consideration should be given to the order of the software and hardware tools used during the examination of the cellular phone. There are advantages to consistency in order of tools used during examination of a cellular phone. This consistency may help the examiner to remember the order of tools used in the examination at a later time. Also, depending on the circumstances, it may make sense to use more intrusive tools first or last during the course of the examination depending upon the goals of the exam. For example, if the goal is to extract deleted information from the physical memory of the phone, starting the examination with a physical dump of the memory (if tools for this function are available) would make more sense than extracting individual files or the file system of the phone.

Verification PHASE

After processing the phone, it is imperative that the examiner engages in some sort of verification of the accuracy of the data extracted from the phone. Unfortunately, it is not unusual for cellular phone tools to erroneously or incompletely report data or to have different tools report conflicting information. Verification of extracted data can be accomplished in several ways.

Comparison of Extracted Data to the Handset

Comparison of the extracted data to the handset simply means checking to be sure the data which was extracted from the cellular phone matches the data displayed by the phone itself. This is the only authoritative way to ensure that the tools are reporting the phone information correctly.

Use of More than One Tool, Compare Results

Another way to ensure the accuracy of extracted data is to use more than one tool to extract data from the cellular phone and to cross verify the results by comparing the data reported from tool to tool. If there are inconsistencies, the examiner should use other means to verify the accuracy of the data extracted from the phone. Even if two tools report information consistently, verification via manual inspection of the handset is authoritative.

Use of Hash Values

If file system extraction is supported, traditional forensic tools can be used for verification of extracted data in several ways. The forensic examiner can extract the file system of the cell phone initially, and then hash the extracted files. Any individually extracted files can then be hashed and checked against the originals to verify the integrity of individual files.

Alternatively, the examiner could extract the file system of the cell phone initially, perform the examination and then extract the file system of the phone a second time. The two file system extractions can then be hashed and the hash values compared to see what data on the phone, if any, has been altered during the examination process. Any changed files should then be examined to determine if they are system files or user files to potentially determine the reason for the changes to the files. (Murphy, 2009) It should be noted that hashing only validates the data as it exists after it has left the phone. Cases where the data extraction process actually modifies data have been documented in other papers. (Ayers, Dankar & Mislan, 2009)
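The two-extraction comparison can be automated by hashing every file in each extracted file system and diffing the results. A simple sketch, assuming each extraction has been saved to its own directory on the examination machine (directory layout and file names are hypothetical):

```python
import hashlib
from pathlib import Path

def hash_tree(root: str) -> dict:
    """Map each file path (relative to root) to its SHA-256 hash."""
    root_path = Path(root)
    return {
        str(p.relative_to(root_path)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root_path.rglob("*"))
        if p.is_file()
    }

def changed_files(before: dict, after: dict) -> list:
    """List files added, removed, or altered between two extractions."""
    paths = set(before) | set(after)
    return sorted(p for p in paths if before.get(p) != after.get(p))
```

Running `changed_files(hash_tree("extraction1"), hash_tree("extraction2"))` yields the paths that differ; each can then be inspected to determine whether it is a system file or a user file.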

In some cases, a combination of verification techniques may be employed to validate the integrity of the data extracted from the phone.

Documentation & Reporting PHASE

Documentation of the examination should occur throughout the process in the form of contemporaneous notes regarding what was done during the examination. Examination worksheets can be helpful in the examination process to ensure that basic information is recorded.

The examiner’s notes and documentation should include information such as:

· The date and time the examination was started

· The physical condition of the phone

· Pictures of the phone and individual components (e.g., SIM card and memory expansion card) and the label with identifying information

· The status of the phone when received (off or on)

· Make, model, and identifying information

· Tools used during the examination

· What data was documented during the examination

Most cellular phone tools include reporting functions, but these may not be sufficient for documentation needs. At times, the cellular phone tools may report inaccurate information such as the wrong ESN, MIN / MDN numbers, model, or erroneous date and time data, and so care must be taken to document the correct information after data verification. For law enforcement purposes, the process used to extract data from the phone, the kinds of data extracted and documented, and any pertinent findings should be accurately documented in reports. Even if the examiner is successful in extracting the desired data using available tools, additional documentation of the information through photographs may be useful, especially for court presentation purposes.

Presentation PHASE

Consideration should be given throughout the examination as to how the information extracted and documented can clearly be presented to another investigator, prosecutor and to a court. In many cases, the receiver may prefer to have the extracted data in both paper and electronic format so that call history or other data can be sorted or imported into other software for further analysis.

The investigator may also want to provide reference information regarding the source of date and time information, EXIF data extracted from images or other data formats, in order that recipients of the data are better able to understand the information.

For court purposes, pictures or video of the data as it existed on the cellular phone may be useful or compelling as exhibits. Extracted text messages may be great evidence, but pictures of the same text messages may be more familiar and visually compelling to a jury.

It is often very useful to present a series of pictures of text messages and call history logs in chronological order via a simple PowerPoint® presentation so that the progression of communications is shown clearly to the audience, whether the audience is an investigator, prosecutor, or jury. This is especially effective if there are a number of cellular phones involved in a case.

Archiving PHASE

Preservation of the data extracted and documented from the cellular phone is an important part of the overall process. It is necessary to retain the data in a useable format for the ongoing court process, future reference, and for record keeping requirements. Some cases may endure for many years before a final resolution, and most jurisdictions require that data be retained for varying lengths of time for the purposes of appeals.

Due to the proprietary nature of the various tools on the market for the extraction and documentation of cell phone data, consideration should be given to the ability to access saved data at a later date. If possible, store data in both proprietary and non-proprietary formats on standard media so that the data can be accessed later even in the event that the original software tool is no longer available. It may also be a good practice to retain a copy of the tool itself to facilitate the viewing of the data at a later date.

Conclusion

With the growing demand for examination of cellular phones and mobile devices, a need has also developed for the development of process guidelines for the examination of these devices. While the specific details of the examination of each device may differ, the adoption of consistent and well documented examination processes will assist the examiner in ensuring that the evidence extracted from each phone is well documented and that the results are repeatable and defensible in court. The information in this document is intended to be used as a guide for forensic examiners and digital investigators in the development of processes that fit the needs of their workplace.

The author wishes to thank the following individuals for their thoughtful and insightful additions as well as their assistance in reviewing and editing the content of this document:

· Richard Ayers – National Institute of Standards and Technology

· Sam Brothers – US Customs and Border Protection

· Richard Gilleland – Sacramento Police Department

· Michael Harrington –

· Eric Huber – A Fistful of Dongles

· Gary Kessler – Gary Kessler Associates

· Andrew Muir – Intern, Madison Police Department

· Tim O’Shea – US Attorney’s Office, Western District of Wisconsin

Bibliography

Ayers, R., Dankar, A., & Mislan, R. (2009). Hashing Techniques for Mobile Device Forensics. Small Scale Digital Device Forensics Journal, 1-6.

Brothers, S. (2009). How Cell Phone "Forensic" Tools Actually Work - Proposed Leveling System. Mobile Forensics World 2009. Chicago, Illinois.

Kessler, G. (2010). Cell Phone Analysis: Technology, Tools, and Processes. Mobile Forensics World. Chicago: Purdue University.

Mislan, R.P., Casey, E., & Kessler, G.C. (2010). The Growing Need for On-Scene Triage of Mobile Devices. Digital Investigation, 6(3-4), 112-124.

Murphy, C. (2009). The Fraternal Clone Method for CDMA Cell Phones. Small Scale Digital Device Forensics Journal, 4-5.

Punja, S., & Mislan, R. (2008). Mobile Device Analysis. Small Scale Digital Device Forensics Journal, 2(1), 2-4.

Friday, July 16, 2010

Stop, Children, What’s That Sound?

In a previous post, I outed myself as an unrepentant SANS cheerleader.  To expand a bit on that full disclosure, it would be appropriate to point out that I will be acting as Rob Lee’s teaching assistant for SEC408 at SANS Network Security 2010, which will be held in Las Vegas from September 20th through the 25th.  I have a passion for teaching and presenting, so I’m looking forward to this opportunity.

With that out of the way, I recently completed both SEC408 and SEC508.  I won’t bother with a review of either course because you can guess what I thought of them.  I think the SEC408/508 material is some of the best digital forensics training that I’ve ever run across.   I consider SEC408 and SEC508 to essentially be two parts of the same class.  I would strongly encourage even those who are experienced forensicators to consider taking SEC408 before taking SEC508.  SANS has put together a very nice assessment test for people to determine what courses they would best benefit from.  While it’s entirely possible that someone could already have SEC408 knowledge and not need to take the course before 508, I learned quite a bit from the SEC408 course.

SEC408 provided me with additional knowledge in areas that I already had a pretty decent grasp of such as browser forensics.  It was an excellent class that helped me sharpen my edge in forensic fundamentals.  I consider SEC508 to be a transformational experience where I was given entirely new tools that I have been using with great enthusiasm now that I have them in my arsenal. The tool that I want to blog about today is what Rob Lee accurately calls the Super Timeline.

Making Use of a Super Timeline

I won’t go over how to create a Super Timeline since Rob has already covered that at a high level on the SANS Forensic Blog. What I’ve been working on recently is how to best make use of the resulting timeline. I have also discovered some interesting artifacts that never occurred to me to consider as part of a timeline.

What I’ve learned is that creating a Super Timeline is only the beginning of timeline analysis.  Because the Super Timeline method captures so many time stamps, it is likely that a Super Timeline will contain too many entries to manually review line by line, especially if an examiner creates a timeline for an entire drive image.  The challenge is to be able to pin down what portions of that timeline are relevant to the examination at hand.

What I recommend is to use more tactical forensic tools to pull out specific dates and times that can then be viewed in greater detail by using the Super Timeline.  A classic forensic examination is one where an examiner is asked to determine whether someone removed information such as intellectual property from a computer using methods such as email or a USB device.  The Super Timeline is an invaluable tool for this sort of examination, but you have to know where to look on the timeline to get the data of interest.  Tools that can help an examiner do this include Digital Detective’s Net Analysis and HSTEX, Harlan’s Reg Ripper, and keyword searching via spreadsheet programs such as Excel.

I like the Net Analysis and HSTEX combo and I’ve been using both tools for many years.  Craig Wilson was recently awarded a well-deserved Forensic 4cast Lifetime Achievement Award.  An examiner can take the latest version of HSTEX and use it to extract web browser history from an image.  If it’s a Windows operating system that is being examined, the Internet Explorer history will be of great interest because the examiner can load the HSTEX results into Net Analysis and then filter on terms like “file” to show just file access entries or terms like “attach” to find evidence where files might be uploaded or downloaded from something such as web-based email.  The examiner can then take the date and time information for specific events of interest and refer to the Super Timeline to get a clearer picture of the events that surrounded that point in time.

Harlan has been doing some great work in the area of registry forensic research and tool development. Harlan’s Reg Ripper tool is one that every examiner should have in their tool box, and it’s Harlan’s regtime.pl tool that provides registry date and time data in the creation of a Super Timeline.  For example, using the Reg Ripper tool to determine what types of USB devices have been connected to a system allows the examiner to then search for device specific keywords on the Super Timeline.

Super Timelines are designed to be loaded up into a spreadsheet such as Microsoft Excel.  These spreadsheets can also be used to help an examiner zero in on specific events through keyword searching. Keywords such as the word “USB” can be used to help determine when a USB specific event occurred in the timeline.
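This sort of keyword triage can also be scripted outside of a spreadsheet. The sketch below filters timeline rows for a term like “USB”; the column names and sample rows are hypothetical stand-ins, since the real layout of a timeline export depends on the tool and version that produced it.

```python
import csv
import io

def filter_timeline(csv_text, keyword):
    """Return rows where any field contains the keyword (case-insensitive)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    needle = keyword.lower()
    return [row for row in reader
            if any(needle in (value or "").lower() for value in row.values())]

# Hypothetical sample in the spirit of a timeline CSV export;
# the real column layout depends on the tool used to build the timeline.
sample = """date,time,source,description
06/14/2010,09:12:44,REG,USB device SanDisk Cruzer first connected
06/14/2010,09:12:45,FILE,Last access on C:/Windows/Media/notify.wav
06/14/2010,09:30:02,WEBHIST,Visited http://webmail.example.com
"""

hits = filter_timeline(sample, "usb")
for row in hits:
    print(row["time"], row["description"])
```

The same function works for any keyword an examiner pulls from a more tactical tool, such as a device identifier recovered from the registry.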

One of the added bonuses that I’ve discovered from using Super Timelines is that it’s shown me new artifacts to be aware of during an examination.  For example, while examining a recent Super Timeline I saw the last accessed times being updated on the .wav files for the sounds that are made when a USB device is inserted or removed.  It occurs to me that this is a valuable thing to keep in mind when trying to determine what a user did on a particular computer.  When a user interacts with an operating system GUI like Windows, certain actions can result in sound files playing and that can result in the last accessed time stamps of those files being updated.

Twitter Update

I have decided to create a separate unprotected Twitter account called @AFoDBlog for the blog which will be dedicated exclusively to alerting readers to new blog posts and to also pass along digital forensic content that I think will be of interest.  It’s intended to be a low traffic volume feed that emphasizes quality over quantity. Since it’s unprotected you can see what you are getting into before following it.

I use my protected @ericjhuber account to Tweet about digital forensics. I also use it to socialize with my fellow digital forensic examiners which might not be something that readers care to read about.  Most people continue to follow that account once they start reading it, but I have noticed that some unfollow it.  I assume it’s because they aren’t necessarily interested in reading Ken Pryor and me swapping patrol stories about being bitten by cop hating dogs.  I, of course, think this is riveting stuff, but I understand others might not see it that way.

Tuesday, July 6, 2010

My Precious

SANS Digital Forensics has come up with another great idea which is the introduction of the SANS Institute Digital Forensics Lethal Forensicator Coin.  Rob explains the conditions under which the coin will be awarded over at the SANS Forensic Blog. For reasons I can’t completely explain or fully understand, I have an insatiable desire for one of these coins.  Gollum. Gollum.


The community has plenty of certifications and there is a continuing debate about the issues of licensing and certification. That’s another post or two for another day, but what we haven’t had are these sorts of awards where members of our community can be recognized for their achievements.  That’s one of the reasons why I’ve been supporting the efforts of Lee Whitfield in regards to the Forensic 4cast awards.  It’s nice to have an avenue where our best and brightest can be recognized by their peers and Lee has done the community an invaluable service with these awards.

$FILE_NAME Discussion


Harlan has a great contribution to the discussion we’ve been having here on the blog about the purpose of GUI tools and what examiners use them for in their examinations.  I think I’ve pretty much beaten the $FILE_NAME aspect into the ground, but Harlan’s post and A Thulin’s comment on my previous post made me take a step back and ask myself what do I use my primary GUI forensic tools for and what do I expect out of them?

Some rough definitions are probably in order at this point.  When I talk about primary GUI forensic tools, I’m talking about tools that are designed and marketed as a primary tool for an examination. These tools are designed to parse common file systems and in many cases are loaded up with other features such as capabilities dealing with browser forensics, email forensics, file carving and the like.  Tools in this category would be tools like EnCase, FTK, X-Ways Forensics, etc.

The tools that I think of as secondary GUI tools would be tools like Net Analysis or Cache Back which are designed to present information in a GUI format and allow an examiner to navigate around a particular bit of data. These sort of tools are designed for a limited purpose such as parsing a particular artifact like web browser logs.  Unlike the primary GUI tools, they aren’t designed to be a general forensic tool.

The tools that I think of as secondary non-GUI tools would be programs like Harlan’s Reg Ripper which aren’t designed to allow an examiner to navigate around a particular artifact (like what, for example, Registry Viewer does), but are intended to do things like parse a particular bit of data with the goal of providing the examiner a report of its findings.

What do I use my primary GUI forensic tools for? Not nearly as much as I used to use them for when I first started.   The tool that I tend to use the most is still EnCase and as I thought about Harlan’s post I asked myself why I still use it the most even though FTK 3 is arguably a better product.

I think I prefer EnCase still because it feels like a sort of glorified read-only hex “editor” (I put editor in quotes since it’s obviously not designed to edit anything) that is easy for me to use when I need to work at the hex level on a disk. It’s what I’ve “grown up” using so it’s something that I’m very comfortable with for what I do with primary GUI tools.  The rub is that I find myself using FTK 3 more and more these days since it scales much better when it comes to larger data sets and handles a lot of non-file system artifacts like compressed files much better than EnCase.  If someone were just starting out and could only buy one, I’d tell them to get FTK 3.  I’d hate to have just one, but I suppose I’d make the same decision if I were forced to make the choice.
I just don’t tend to use all of the additional features that are loaded into programs like EnCase and FTK. Browser forensics are a good example. I don’t use FTK or EnCase to parse web history artifacts like index.dat files because there are much better programs out there such as Net Analysis.  I think that’s one of the reasons why I continue to use EnCase.  For the most part, it does what I need it to do when it comes to file system parsing and I have loads of secondary forensic (GUI and otherwise) tools that I can use by just right-clicking on a file in EnCase and sending it to a tool like Net Analysis.  I’m also using the SIFT Workstation a lot these days because it’s packed with all sorts of tools that greatly enhance my ability to do an examination by doing things like memory forensics and Super Timeline analysis.

I do tend to use FTK 3 for email forensic work and I think that is one of the reasons why FTK 3 is a good value.  You get the general file system forensics along with some decent secondary tools like Registry Viewer and PRTK.  You also get a nice email forensic tool built into FTK 3.  I’ve just never been that enamored with EnCase as an email forensic program largely because I don’t think the GUI works as well for email forensics compared to FTK and because I hate having to reload the email each time I reopen a case in EnCase.  EnCase just scales poorly when it comes to email, but I know Guidance is working hard to catch up based on what I learned from them at CEIC.

So the bottom line is that as I have gained more experience, I’ve relied less on my primary GUI tools and more on a whole host of other specialized tools along with the manual parsing of artifacts.  Since EnCase does a good job parsing file system data and can easily be used as a springboard for the use of other tools, I keep using it especially since it just feels very natural for me when it comes to hex level forensic work.

Wednesday, June 30, 2010

$FILE_NAME Follow Up

After I posted on the lack of NTFS $FILE_NAME data provided by the major GUI forensic tools, there were several great comments left for that post that described a variety of tools from people like Harlan, David Kovar and Mark Menz.   While these are great tools from three  forensic gurus, I’m still a bit perplexed why the major GUI software tool makers don’t just deal with parsing this data head on.

I’ve had at least one person tell me that EnCase could do this with an EnScript.  Of course, EnCase can do a lot of things with EnScripting. The rub is that I don’t want to use an EnScript for something that should be part of the standard GUI column view along with the $STANDARD_INFORMATION time stamp values.  For example, I want to be able to quickly view the $FILE_NAME information for the files stored in a particular folder or volume for timeline purposes.  One of the primary reasons we use GUI forensic tools like EnCase and FTK is that they serve as overall file system examination tools.  We can use them to examine our evidence from a high level and then decide which of the more specialized tools we wish to employ to drill down on specific artifacts.

I don’t expect EnCase or FTK to do everything for me. That’s why we have people like Craig Wilson, Rob Lee, Harlan Carvey, Mark McKinnon, Lee Whitfield, Paul Sanderson, Kristin Gudjonsson and all of the rest of the fantastic forensic tool developers out there who make great tools for specific purposes that complement the major GUI tools. However, I do expect them to parse basic $MFT record information which includes $FILE_NAME time stamps.

Since I made my original post, I discovered that the fine people over at Technology Pathways are doing this, at least with the free version of their Pro Discover tool.  Pro Discover Basic is a very basic GUI forensic tool, but it does what every major GUI tool should do, which is to parse both the $STANDARD_INFORMATION and $FILE_NAME time stamps in glorious column form.

EnCase doesn’t do this at all.  FTK is sort of…kind of…starting to move in this direction.  If you look in the comments section of my previous post on this issue, you’ll see that a couple of Access Data engineers were nice enough to drop by and explain that FTK 3.1 parses this data….sometimes. I say “sometimes” because it doesn’t do it as part of the normal column view and it reportedly only shows the data to the examiner if the $FILE_NAME values are different from the $STANDARD_INFORMATION values.  I have no idea why Access Data is making it this complex.  I absolutely do not want this level of hand holding from my forensic tools.  I want to be able to see for myself what the time stamp values are for a given file.  Concealing basic time stamp information from me because they think it’s…I guess…not important isn’t helpful.
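For what it’s worth, the comparison FTK is reportedly doing behind the scenes is simple enough to sketch. The following is a minimal illustration with made-up timestamp values; a mismatch between the two attributes can be a sign of time stamp manipulation, but I still want to see the raw values for myself rather than just a mismatch flag.

```python
from datetime import datetime, timezone

def compare_times(si, fn):
    """Report, field by field, whether the $STANDARD_INFORMATION and
    $FILE_NAME timestamps agree; disagreement can suggest timestomping."""
    return {field: "match" if si[field] == fn[field] else "MISMATCH"
            for field in ("created", "modified", "accessed")}

# Hypothetical values: the $SI creation time has been pushed back,
# a common tell when anti-forensics tools alter $STANDARD_INFORMATION.
si = {"created": datetime(2005, 1, 1, tzinfo=timezone.utc),
      "modified": datetime(2010, 6, 1, tzinfo=timezone.utc),
      "accessed": datetime(2010, 6, 1, tzinfo=timezone.utc)}
fn = {"created": datetime(2010, 5, 30, tzinfo=timezone.utc),
      "modified": datetime(2010, 6, 1, tzinfo=timezone.utc),
      "accessed": datetime(2010, 6, 1, tzinfo=timezone.utc)}
print(compare_times(si, fn))
```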

If Guidance Software and Access Data think that having the extra $FILE_NAME columns in their standard GUI file system view would somehow confuse the examiner or clutter the interface, then they can turn the columns off by default and require the examiner to “opt in” to see them.

What am I missing here?

Forensic 4cast Awards

The Forensic 4cast awards are upon us! If you haven’t voted yet, you still have time before the awards presentation at the SANS Forensics and Incident Response Summit on July 8th.  You can also attend the award ceremony for free even if you aren’t attending the summit. Lastly, the fine people over at Disk Labs have sponsored the actual awards which are pretty amazing looking.

Saturday, June 19, 2010

Give Me $FILE_NAME or Give Me Death

I think we’re long past the point as a community where we should be pushing the vendors of our GUI forensic tools to provide us with the $FILE_NAME time values inside of an NTFS $MFT record.  Every tool parses the $STANDARD_INFORMATION time values, but that alone should no longer be considered sufficient for a GUI forensic tool.  Most tools do not provide the $FILE_NAME time values as part of their standard file system navigation experience.  The concern that has been expressed in the past was that adding this information would be confusing to the user.  While I can certainly understand that it might be confusing to an inexperienced or poorly trained examiner, that’s not a good reason for not presenting the information.  If an examiner doesn’t understand how an $MFT record works, then this confusion is a teachable moment that will hopefully prompt the examiner to learn more about the inner workings of an $MFT record.  The information is out there and it’s easily accessible on the Web, through training courses and books.

Yes, I can parse the data manually or by scripting with the various vendor tools.  However, it’s much more useful to me if I can have these time stamps parsed automatically and presented to me as part of the main user interface experience.
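For those curious what that manual parsing looks like, here is a hedged sketch in Python. The FILETIME conversion is standard; the sample attribute body below is fabricated for illustration, and a real examination would first have to walk the $MFT record and attribute headers to locate the resident $FILE_NAME attribute.

```python
import struct
from datetime import datetime, timedelta, timezone

# Windows FILETIME epoch: 1601-01-01 UTC, counted in 100-nanosecond ticks.
FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(raw8):
    """Convert an 8-byte little-endian FILETIME to a UTC datetime."""
    ticks, = struct.unpack("<Q", raw8)
    return FILETIME_EPOCH + timedelta(microseconds=ticks // 10)

def parse_filename_times(attr_body):
    """A $FILE_NAME attribute body stores creation, modification,
    MFT-change, and access times as four consecutive FILETIMEs
    starting at offset 8 (after the parent directory reference)."""
    labels = ("created", "modified", "mft_changed", "accessed")
    return {label: filetime_to_datetime(attr_body[8 + i * 8: 16 + i * 8])
            for i, label in enumerate(labels)}

# Fabricated attribute body: an 8-byte parent reference followed by
# four identical timestamps; real bytes would come from a $MFT record.
ts = struct.pack("<Q", 129000000000000000)  # a moment in late 2009
demo = b"\x00" * 8 + ts * 4
print(parse_filename_times(demo)["created"])
```

A tool vendor already has code like this internally; the whole point of the post is that the parsed values should also reach the column view.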

I’m not familiar with all of the forensic tools that are available so I’ll have to rely on other people to let me know what tools might be doing this already. I’ve been using Sleuth Kit more and more these days and it parses everything (istat) because it’s Brian Carrier’s awesome tool.  I heard a long time ago that Pro Discover might present some of this information to the user also, but I’d be curious if someone could verify that for me. Any other tools that are doing this?

What do you think? Am I missing something? Why wouldn’t we want this information presented to us up front in our GUI tools?

Forensic 4cast Awards Voting Has Opened

The nominations have closed for the upcoming Forensic 4cast awards and the voting has started.  SANS announced this week that the awards will be open to everyone, so if you are in the DC area and aren’t attending the SANS Forensic and Incident Response Summit, you can still attend the awards.

New Tools

I’ve been made aware of a couple new forensic tools that I’d like to share with everyone. 

The first one is Defraser, which is a tool by the Netherlands Forensic Institute.  I learned about this tool when I was taking SEC563 at SANSFIRE recently.  This is a carving tool that will recover full and partial video data.  I have just started using it so I can’t yet speak to how well it works, but I’m excited about the possibilities.

The second tool is called raw2vmdk.  It looks like it’s an alternative to LiveView.  I use LiveView quite a bit and I’m quite fond of it.  I haven’t tried raw2vmdk, but I would potentially give it a spin if it could do something that LiveView couldn’t do for me.

Tuesday, June 15, 2010

Bacon

This post is about SANS and last week’s SANSFIRE 2010.  It also contains a review of the SEC563 Mobile Device Forensics course that I attended at the conference.
Full Disclosure: I’m a member of the GIAC Advisory Board and an advisor to the GIAC Ethics Council.
Fuller Disclosure: I’m a SANS independent contractor who is writing test questions for the GCFE (GIAC Certified Forensic Examiner) certification. This is the certification that will be linked to the SEC408 Computer Forensics Fundamentals course. SANS is nice enough to pay people who do this work a little bit of money.  Don’t tell SANS, but I’d do it for free.
Fullest Disclosure: I’m an unrepentant SANS cheerleader.

I Heart SANS

My first SANS experience was in 2004 when I took SEC504 Hacker Techniques, Exploits and Incident Handling from Ed “Skodo Baggins” Skoudis at a smaller SANS event that was held in Phoenix.  I had no idea who Ed was before taking the class, but I certainly knew who he was after the class.  SEC504 with Ed was one of the finest training experiences that I’ve ever attended and I cherish the experience to this day.  I consider it a transformational experience because it opened my eyes to all of the possibilities in the information security world.  Ed essentially acted as Virgil to my Dante.  Not only was the course content fascinating, but I was amazed at what an incredible instructor Ed was.  Since that course, Ed has been an example to me of just how good an instructor can and should be.

When I took this course with Ed, it was in the days of the old certification model where GCIH certification candidates were required to complete a white paper before they were allowed to attempt the two tests that were necessary to pass the certification process.  Incident handling was new to me, but I somehow managed to successfully complete the paper and then was faced with the two tests.  The first test covered the incident handling process and I scored somewhere in the 80s on that test and passed.  The second test dealt with the technical aspects of the course and I think my score was 78.  I remember it was in the 70s and I was very glad to have achieved that score.  It was a long and difficult process, but completing it was more than worth it.

I recertified over a year ago and scored well on that test.  Because I scored over 90, I was invited to join the GIAC Advisory Board.  After I did so, I had more of the SANS world opened up to me because I could see the SANS staff interacting with the rest of the Advisory Board through the Board’s email list.  This was a very educational experience because it allowed me to observe Stephen Northcutt and some of the other SANS leadership in action.  Additionally, I recently had the opportunity to serve with Stephen on a project that we are both involved in.  Stephen is the face of SANS since he is the CEO of SANS, holds a position on the GIAC Board (I can’t remember if he’s the chair of the board or not. I’m sure he told me last week, but my brain can only hold so much new information while being soaked with the SANS knowledge fire hose) and is the President of the SANS Technology Institute.

The best I can tell is that Stephen is the person who the Dos Equis people modeled their most recent advertising campaign on. Stephen has got to be a contender for the information security version of the Most Interesting Man in the World(tm). When he’s not traveling around the globe leading his merry band of SANS people, he does things like write, pontificate, snorkel, sail and live in Hawaii.


SANSFIRE 2010 Review

Last week’s SANSFIRE was my first major SANS conference.  As you can tell from the tone of this post so far, I was not disappointed.  SANS does a great job putting on these conferences and there is a lot of attention to detail.  There were legions of helpful work study facilitators who made everything run smoothly.  The major SANS conferences are a great experience because not only do you get to attend training with the top SANS instructors, but there are a whole host of networking opportunities available to you. These conferences are attended by a large number of people with very diverse information security backgrounds.  There were plenty of after-hours events to attend such as the very popular SANS @Night presentations where industry experts gave talks that could be attended by anyone at the conference.  SANS also provided snacks and drinks during the morning and afternoon breaks that kept everyone going.  During one of the evenings early in the conference, they provided free food (very nice hot dogs and pretzels this time) along with a live band and several cash bars. Day 5 was ice cream day where the afternoon snack was all sorts of frozen goodies.  One of the nice touches is that they had a cash bar available during the initial registration on Sunday evening and even provided a free drink ticket with the registration packet.  That’s right.  We got a free beer on SANS after we picked up our registration information.  It was a nice touch after more than three hours of slogging through East Coast traffic to get to Baltimore.

One of the things I found was that the SANS instructors are very approachable even if you aren’t taking their class.  I was able to meet a lot of the instructors whom I had previously known only through various electronic methods, such as James Tarala, Chad Tilbury and Paul Henry.  I was also able to talk to Ed Skoudis in person after corresponding with him for many years.  I’ve recently started presenting on digital forensics in conference settings and Ed is always good for a great teaching tip or two. The SANS staff (both the instructors and the support staff) earn their pay during these conferences because they always have to be “on” in case they run across someone like me after class.


SANS SEC 563 Mobile Device Forensics Review

The class that I was at SANSFIRE to attend was SEC563 Mobile Device Forensics.  Eoghan Casey and Terry Maguire from cmdLabs taught the class. Eoghan has been the primary person behind the course since its inception.  Thus, those of us who took the class had the benefit of being taught by two very accomplished digital forensic examiners and instructors.  If I had only one word to describe what I thought of this course, I would pick the following word: Bacon.  Not turkey bacon.  That’s undead zombie pseudo-bacon. We’re talking thick cut smoked bacon.  I like bacon and I liked SEC563.

Putting together a five day mobile device class is a pretty tall order given the current fluid state of the tools and methods. There isn’t a lot of standardization in the mobile device world given all of the different phones, carriers, operating systems and third party applications.  The computer forensics world is relatively static and mature, at least to the extent that we deal only with a relatively small number of operating systems and file systems.
The course struck a very even balance between lecture content and hands on exercises for the students.  Students are introduced to a wealth of different forensic tools and many of them are used during the practical exercises.  Because there is so much hands on work, the class is limited to no more than 25 students.  

The class was an overview of the mobile device forensics world and provided students the fundamental knowledge to get started by exposing them to the wide variety of tools and methods that are available.  I took this class because I am relatively new to mobile device forensics and I found that I learned an immense amount.  I wish I had taken this class earlier in my studies because it would have made tool selection and process development much easier.  I came out of the course with a fundamental understanding of how to examine SIM cards, CDMA and GSM phones.  I can’t call myself an expert in mobile device forensics and it would have been unreasonable to think that even with instructors like Eoghan and Terry I could be brought up to their level in just a week.  However, taking this course is one of the most efficient ways to gain the fundamentals that an examiner would need to pursue mastery of the subject.

This course reinforced my initial impression that mobile device forensics is basically the wild, wild west right now.  There are some useful tools out there, but the state of the tools and methods aren’t nearly as mature as they are in computer forensics.  Eoghan and Terry stressed the need to validate results and to not put all of your faith into one tool.   Manual review of mobile devices is still very necessary in some cases and validation has to be a key concern of an examiner. 

So the bad news is that the state of mobile device forensics is very fluid and complicated.  A lot of hex level examination still needs to be done in cases where tools won’t do the parsing for you. To me, this is also the good news.  I know some examiners hate it, but I enjoy working at the hex level.  It’s not practical to do it as a primary method of examination, but there’s just something I find really fulfilling when I pull a bit of useful evidence out with a hex editor.  If you like this sort of thing, you’re going to love both mobile device forensics and this class.

Thursday, June 3, 2010

Forensic 4cast Awards

The 2nd annual Forensic 4cast Awards will be held at the SANS Forensic Summit on July 8th of this year.  You do not have to attend the Summit to make a nomination or to vote for the awards.  The nomination period is open, but it will close on the 13th of this month so get your nominations in soon!

Lee Whitfield hosted another episode of Forensic 4cast this weekend. Mark McKinnon and I were the panelists and we discussed the recently completed CEIC conference, the Guidance Software acquisition of Tableau and many other topics such as the upcoming Summit. 

I’ll be at SANSFIRE next week absorbing powerful Eoghan-Fu when I take his SEC563 Mobile Device Forensics class.  I’m looking forward to being able to finally meet Eoghan and many of the other SANS instructors in person as well as learning from my fellow students.  I’ll post a review of the course and SANSFIRE afterwards.  I’ll also provide reviews of the OnDemand versions of SEC408 Computer Forensics Essentials and SEC508 Computer Forensics and Incident Response in the near future.

Eoghan released his most recent book late last year, which Harlan reviewed on his blog.  You can read my review as well as one from Richard Bejtlich over at Amazon. It is an excellent book that is good for those of us who are already in the field and those who are considering a career in digital forensics.

Lastly, I want to thank all of you for your support of the blog and your comments here and in other venues.  I’m very grateful for the feedback and the response has been more than I ever expected.  I use Google Analytics to get some idea of whether anyone reads this blog and the numbers have been very strong.

Saturday, May 29, 2010

My Big Fat CEIC 2010 Post

I attended CEIC 2010 last week and I think I’m still processing all that I took in during the week.  This was the first time that I attended the Guidance Software run conference and they did a magnificent job with it.  It was a very well planned and executed conference which reportedly had around 1,300 people in attendance.  It was held at the Red Rock Resort which is a very modern and well run facility.  I still sort of miss the “disco elevators”.  Those of you who were there know what I’m talking about…

I don’t get out to as many conferences as I would like because of time and expense, but one of the reasons I like to travel to these conferences is that I get to meet people in person who I generally only get to communicate with via electronic methods like email, twitter and phone.  These conferences are also a great way to get a lot of information very quickly about the state of the industry which allows you to keep up on industry trends.  For example, I’ve used HTCIA conferences that I have attended in the past to ramp up on the state of mobile phone forensics.  This time I spent a lot of time learning and talking about timeline analysis and memory forensics.

I also was able to spend time speaking to various people inside of Access Data and Guidance Software.  It turns out that my previous “Don’t Panic” post circulated around Guidance Software and, fortunately, they took it in the constructive spirit in which it was intended rather than as just someone else running them down.  One of the things that they were concerned about was that when I spoke about employees from Guidance who went over to Access Data, they didn’t want people to think that they had lost their development staff in that process.  They made a point to let me know that they didn’t suffer a developmental exodus and that, with the addition of the Tableau development team, they are very excited about their prospects for future innovation.  Access Data also made a point to praise and promote their development team.  Given that team is responsible for FTK 3, they certainly deserve to take a victory lap.

Both Guidance Software and Access Data are working on some exciting innovations that they were generous enough to talk to me about.  My purpose as a blogger is to positively contribute to the discussions of issues important to our community and to distribute my own research work.  My purpose isn’t to “scoop” other bloggers or to make announcements that disrupt a vendor’s communication and marketing strategy by revealing information before the vendor is ready.  Doing so wouldn’t serve any useful purpose, and my contacts likely wouldn’t talk to me again, which means I wouldn’t have access to their industry insights. Thus, this paragraph will just have to serve as a teaser of sorts.   Talking to both camps felt like talking to two championship class NFL teams gearing up for the Super Bowl.  Both companies are hard at work innovating and creating good things for the community.

Of course, what I can talk about is the information that was made public, such as what was discussed in the “EnCase Forensic Roadmap” session that was held at CEIC.  This is where Ken Basore and Ashley Stockdale discussed what we can expect in the next year or so with EnCase Forensic.  Some of the high points were:

1. Guidance is working on a new indexing engine.  They are not considering licensing a third party engine and are sticking with their in house development team’s effort. I originally thought this wasn’t a great idea because I could never understand why they didn’t just do something like license dtSearch, but it was explained to me that when you do that, you lose a certain amount of control over your product.  What happens if your third party tool (whether it’s indexing, file viewing, email parsing, etc.) causes software instability?  People will blame you for it when it’s an issue that needs to be addressed by the third party technology maker.

Licensing is also obviously going to cut into profits compared to the financial benefits you reap when you develop your own tools. However, third party technology is appealing because you simply can’t expect development teams with finite resources to be experts in everything. Thus, companies like Access Data and Guidance Software have difficult decisions to make when considering how to use their resources. Do you develop in house? Do you license technology? Do you just purchase it outright?

So I find myself ambivalent on this decision to continue to develop an internal indexing engine.  Maybe it’s a good idea, maybe it’s not.  We’ll know soon enough and I hope that the next version of the index engine is successful.  I don’t use the current EnCase indexing engine (I use Access Data’s FTK for all of my indexing needs) because I gave up on it after they released it before it was ready.  I intend to give it a try the next time I do an examination so that I have a basis to compare it with whatever they come up with next.

2. Multi-threaded acquisition.  This innovation has already been introduced in version 6.16.  While I haven’t had a chance to test it, I did talk to at least one person who stated that the acquisition speeds rivaled the excellent Tableau TIM product.

3. Evidence Preprocessing innovations.  Guidance is working on an evidence preprocessor that will run in the background of EnCase. It will provide examiners with intuitive options and will present evidence to the examiner in stages.  Thus, you will be able to access data as it’s processed rather than having to wait until all of the processing is over.  Since it’s going to run in the background, you’ll also be able to work on your case while the processor is running.

This is a great idea, but one of the biggest complaints that I hear from people, and that I have myself, is that when you ask EnCase to do some sort of processing, you increase your risk of encountering the dreaded “White Screen of Wait”. This is where EnCase chugs away on something but uses so many resources that you can’t actually do anything with the program until the resources are freed up.  Just this week I followed a Twitter thread with some experienced forensic examiners who were lamenting this issue.  Thus, if this is going to be successful, it’s going to have to truly run in the background and not prevent the examiner from working with their case.   The hopeful news on this front is item 4, which is…

4. Work product storage innovation.  This is my terminology rather than Guidance’s.  I forgot the language that they used, but I have the phrase “transportable cache files” in my notes. To their credit, Guidance understands that we hate having to pay for the same real estate twice, so to speak.   One of the frustrations we all have with EnCase is that when you do something like parse a container file such as a Zip file, you essentially have to do the same thing all over again when you reopen the case.  What Guidance is going to do is get away from the model where all of your work product is stored in just the traditional EnCase evidence file.  There will be additional container files that will hold your work product so that you only have to do processing once and don’t have to repeat it.

This is huge, and it is clearly an attempt to keep up with Access Data’s FTK (1 and 3), where you only have to process things once and you’re done.

So the innovation battle lines are drawn when it comes to indexing and work product storage.

5. Evidence File V2.  The new version of the EnCase evidence file will be faster, smarter, better looking and will have a lovely singing voice.  Okay, maybe that’s not what they said, but that’s essentially what I heard.  They are also going to incorporate the option to encrypt evidence files.   The new format will still have the same metadata that we’re used to and will do MD5\CRC checks, but we’ll have the option to encrypt the data portion of it with a password.

Having the option to encrypt evidence files is nice because we don’t always have an encrypted container drive (you do encrypt your evidence when you ship it, right?) available for shipping images, and the person on the other end might not have the decryption technology easily available.

6. More options for report creation. I didn’t take as many notes on this because unless Guidance tells me their reporting option will make bacon directly appear on my desk, I don’t much care. I long since gave up on using EnCase to make forensic reports.   That said, they are going to give us the option to put hyperlinks in reports and to resize/rotate pictures.   Don’t feel too bad, Guidance. I don’t use Access Data’s report function either.  I certainly like it better, but my customers don’t and they are the ones who matter.

7. Decryption.  They said that they will have the ability to decrypt Windows 7 Bitlocker soon.  This is good news and one of the things I’ve really appreciated about Guidance and Access Data is their aggressiveness in working with encryption vendors to incorporate decryption technology into their products.  It makes our lives as examiners so much easier because manual decryption processes can be long and painful.

8. Email Threading.  EnCase will have the ability to follow email threads across multiple email repositories.  This is a very nice option to have, but I suspect I won’t be using it since EnCase is pretty painful to use for email investigations compared to tools like FTK.  However, this signals to me that Guidance isn’t giving up on enticing its customers to use EnCase for email investigations and that’s a good thing.

9. Neutrino\mobile phone forensics.  Digital forensics is a very broad field with all sorts of devices, operating systems and file systems.  It’s hard enough being good at traditional hard disk file system forensics.  The innovation in the mobile device market is staggering which is why I think we haven’t seen one mobile device forensic vendor establish a dominant position in the market.  Guidance seems to understand that they just don’t have the developmental cycles to keep up on everything going on in the mobile device world so they have apparently decided to concentrate on digital forensics of smart phones like Android, iPhone, etc.

This makes good sense to me.  The market for smart phones is growing quickly and, as Guidance points out, they have a lot of experience with parsing file system artifacts. Trying to be a comprehensive mobile device forensic company and keeping up with their competitors like Access Data on the traditional disk forensics front doesn’t seem like a winning proposition.

One of the executives I was able to meet at CEIC was Robert Botchek.  Based on my discussions with him and others, I’m convinced that the Tableau purchase is a good move for Guidance and the community as a whole.  I found Rob unique in that he has deep technical skills, an excellent business mind and is a very personable fellow who can communicate complexity in an understandable manner.   The Tableau name will continue to exist in some form, but Tableau will be a part of Guidance Software.  The chain of command issues have already been decided and Robert is a direct report to Guidance’s CEO Victor Limongelli.  Thus, Victor and the rest of the Guidance senior executive management will get the benefit of Rob’s business background and keen insights into the digital forensics market.  The biggest issue will be the traditional one that you have in acquisitions like these, which is integrating two different organizational cultures.  If Guidance can pull this off, this should be a good move for everyone involved.

Being able to finally meet Victor in person was also a treat.  He’s also a very smart and personable fellow and, along with all of the other Guidance executives and employees I spoke with, seems to genuinely want people to understand that Guidance doesn’t want to be the organization that we’ve all, unfortunately, grown to distrust if not actively dislike.  Essentially, they want people to understand that they are the new Guidance Software.  The Tableau purchase and the fruits that will hopefully come from it should help on that front.

I’ve been thinking about what companies like Guidance and Access Data can do to engage the community better. An obvious method would be to interact more with the community via social media (Access Data makes great use of Twitter, for example) and the various email lists that are popular with the community. As I thought on it more, it occurred to me that vendors who can afford it should take a page out of Guidance Software’s old play book and hire Directors of Customer Relations.   One of the darkest days of my digital forensics career was the day I learned that the great Bill Siebert had passed away.  For many years during the bad old days of Guidance Software, Bill was the face of the organization.  His title might have been Director of Customer Relations, but it was really Director of making-you-not-hate-Guidance-nearly-as-much-as-you-would-if-Bill-wasn’t-working-for-them.  If you had a problem with Guidance, you could go to Bill and you knew he’d tell it to you straight and do whatever it took to get the issue resolved.  He wasn’t a company line type who just told you what you wanted to hear. He’d tell you if he thought Guidance was doing something silly and then do his best to fix it for you. Once Bill left Guidance, things got pretty rocky in my relationship with them, and I think one of the biggest public relations mistakes they ever made was not filling that role.  You could never replace Bill, but they should have at least filled the position.   In my case, the relationship with Guidance was repaired through the herculean efforts of my Guidance Software sales representative.    He’s one of the few sales representatives that I’ll knowingly pick up the phone for when I think it’s him calling. I never thought I’d type that about a sales representative, but there it is.  However, I understand his role is to sell me more stuff rather than to engage the community at large.

What would that role look like today at a place like Access Data or Guidance? The person in that position would be someone with instant credibility in the community because they are an experienced practitioner rather than someone in a sales or marketing role.  In fact, you wouldn’t make that person part of the sales organization.  The best position on the organization chart for that person would be reporting to a senior executive manager in an operations or development role.   This person’s skip level manager would be the CEO, and they would have access to senior executive management so that they could establish two way communication between the company leadership and its current and potential customers.  They would be someone who directly engages the community in the places it inhabits such as forums, email lists, blogs, podcasts, conventions and social media. Because they were part of the extended senior leadership team, they would act as a conduit between the community and senior executive leadership.

That’s enough organizational development pontificating, I think.  I also wanted to comment on some of the people I met and some of the presentations since part of what I love about conventions is meeting people in person and learning new things.

The first session that I attended was Dave Shaver’s “Defeating Advanced Hiding Techniques”.   I don’t know how he did it in 90 minutes, but the course was a comprehensive review of how an experienced digital forensics examiner such as Dave approaches doing an incident response investigation and discovering what sort of evil has been buried in the shadows of a computer.   The conference was also a treat for me because I was finally able to meet Dave and his co-conspirator from Army CID Ryan Pittman.  They’re both some of the nicest guys you’d want to meet and very sharp forensic gurus.   They co-authored the excellent chapter on Windows Forensics in Eoghan Casey’s most recent book.

I finally got to meet Rob Lee in person after countless emails, tweets and phone conversations.  It’s amazing how you still don’t really know someone, even after all of that, until you just sit down and talk to them for a while.  Rob is a big friendly well of digital forensic knowledge and energy. He looks like he played football at the Air Force Academy and he put that command presence to use in his “Super Timeline” class at CEIC.  If you haven’t taken that class, you can get that content and a lot more by taking the SANS SEC508 class.  “Super Timeline Analysis” is where Rob instructs his students how to use tools like fls, Harlan Carvey’s Regtime.pl and Kristinn Gudjonsson’s log2timeline to make a timeline of activity on a system.  The resulting timeline is amazing at providing an examiner with a detailed view of what happened on a system. This is something that every digital forensic examiner needs to learn how to do.
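
The super timeline workflow is built on real tools (fls produces a pipe-delimited Sleuth Kit bodyfile, and mactime sorts it into a chronological timeline), but the core sorting idea is simple enough to sketch. The following is my own toy Python illustration of what that sorting step does, assuming the standard bodyfile field layout; the function name is mine and this is no substitute for the real tools:

```python
# Toy sketch of what mactime does with a Sleuth Kit bodyfile.
# Bodyfile fields: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
# (timestamps are Unix epoch seconds; 0 means "no timestamp recorded").
from datetime import datetime, timezone

def timeline(bodyfile_lines):
    """Return (timestamp, macb, size, name) tuples sorted chronologically."""
    events = []
    for line in bodyfile_lines:
        fields = line.strip().split("|")
        name, size = fields[1], fields[6]
        stamps = {"a": fields[7], "m": fields[8], "c": fields[9], "b": fields[10]}
        # Timestamps that share the same epoch collapse into one m/a/c/b entry,
        # just like mactime's output.
        by_epoch = {}
        for flag, value in stamps.items():
            if value and int(value) > 0:
                by_epoch.setdefault(int(value), []).append(flag)
        for epoch, flags in by_epoch.items():
            macb = "".join(f if f in flags else "." for f in "macb")
            events.append((epoch, macb, size, name))
    events.sort()
    return [(datetime.fromtimestamp(epoch, tz=timezone.utc)
             .strftime("%Y-%m-%d %H:%M:%S"), macb, size, name)
            for epoch, macb, size, name in events]
```

A single file typically generates several timeline entries (one per distinct timestamp), which is exactly why a full system produces such a rich chronological picture.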

I had the pleasure of having dinner with Larry and Lars Daniel of Guardian Digital Forensics.  They are both quality guys and excellent digital forensic examiners.  I really enjoyed talking to them about their experiences doing criminal defense work and their perspective on digital forensics in the legal system.

Adrian O’Leary of the Metropolitan Police gave a fantastic presentation on their ability to extract data from physical flash memory on mobile devices.  I don’t know how much information they want public, but I do highly recommend attending any presentations he does in the future.

I also discovered one of my new favorite conference presenters when I attended Joshua Gilliland’s “Textual Relations” presentation. Joshua is a very accomplished presenter and also sports a pretty sharp bow tie.  His presentation was an overview of legal issues involving text messages as well as illustrating how some people have scored massive legal own goals through texting things they really should not have.

Michael Webber’s memory forensics presentation was very educational, and memory forensics is an area that I am very interested in from a research perspective.  He’s a very accomplished presenter who does a great job explaining complicated information in a short amount of time to a large number of people.  Unfortunately, he only had 90 minutes, but he made good use of the time and I’d love to attend more training with him.

I know I’ll forget to mention all of the great people I finally got to meet in person, but it was a treat being able to connect with people like Eric Smith from Lockheed Martin and Greg Dominguez from Forensic Computers.

As you can tell, I had a wonderful whirlwind of a week at CEIC and I enjoyed the experience very much. I hope to make it to CEIC 2011 in Orlando next year.  Great job, Guidance!

Saturday, May 22, 2010

TIM GCFA EWR LAS CEIC LBS

TIM
As a follow-up to my previous post about the Guidance Software purchase of Tableau, I saw that Tableau’s Robert Botchek announced on one of the digital forensic email lists that Guidance is going to remove the Tableau requirement for TIM. This means that TIM will work with write blockers other than Tableau.   This is an amazing bit of news for all of us and especially those of us who have been in the industry for awhile.  Great job, Guidance!

GCFA
So I passed the GCFA exam this week with a 92.67% score.  I’m positive that I lost some key data from my past such as my old high school locker combination, past phone numbers and the like since more than a few brain cells died in the attempt.  I completed the SEC508 course that is the basis for the GCFA test via the SANS OnDemand method and I’ll post a detailed review of that course and the SEC408 OnDemand course in the near future.  In the meantime, Joe Garcia of Cyber Crime 101 has posted his audio review of his SEC408 experience with the mighty Mike Murr.  As a teaser, Joe announces some exciting news about the future of the SEC408 course.  I won’t steal his thunder here so you’ll have to give it a listen.

It’s not that the GCFA test is unreasonably difficult, but I had set a goal for myself to get a score over 90%. That means I could only miss 15 questions in a 150 question test that covers some pretty complicated material and has to be completed in 240 minutes.  Proper pre-test preparation is a must because if you don’t have a strong foundation in the material all of the books, notes and whatnot that you bring into the test facility aren’t going to save you.  You just don’t have time to teach yourself new concepts on the fly.  Thus, if you take the SEC508 material seriously when it’s presented to you through whatever format you choose from SANS, do a decent job with the practice tests and practice good test taking skills (including creating a proper index), you’ll have an excellent chance of passing the test.

SANS provides you with two practice tests as part of your GIAC attempt.  You can purchase additional tests for $99.  The impression that I get is that the practice test question bank provides enough unique questions for roughly one and a half practice tests.  Therefore, the score you get on your first practice test is going to be the best indicator of how well you can expect to do on the real test.   Subsequent practice tests will result in higher scores because of repeat questions.  My first practice test score was 88%, my second score was 96.67% and my third score was 98%.  Based on my first and second scores, I was reasonably certain my final test score would fall somewhere in between and it did.   Why did I purchase a third practice test?  Because SANS allows you to take their tests through an open book method, you can bring your SANS course material into the test with you.  The rub is that you need to be able to locate specific subject matter areas quickly if you are going to research the answer to a particular question.  The best way to do this is to create a proper index.  The methods vary, but one of the ways to ensure your best performance on a GIAC test is to make sure that you are comfortable with your index.  The reason I took the third test is that I wanted one last test where I would concentrate on training myself to use my index.

Another thing that I strongly recommend is to look up answers in cases where you are uncertain.  The mistake I made during my first practice test was to answer some questions based on the thought that “it’s probably this answer”.  Probably isn’t a good standard to use for GIAC tests because it means you’re going to guess wrong in some circumstances.  If you know the material well and you have a good index, you should have time to look up those “probably” answers and turn them into “certainly” answers. Also, don’t be afraid to skip questions.  The test engine allows you to skip five and I used all of my skips for questions that I knew would require some extra reading and pondering.

EWR LAS CEIC LBS
I’ll be heading out to CEIC tomorrow and I’m looking forward to giving my presentation on Adobe Flash Cookies and to finally meeting in person the people I normally only get to communicate with in the virtual world via email, Twitter and the like.   These conferences are a very useful way to keep up on industry trends, tools and techniques.  It’s very powerful having so much knowledge from the community under one roof for a short amount of time.

It’s also a moral imperative that I get a Bacon N’ Egg burger at LBS.  I admit it.  I think with my stomach.  If you actually read this blog and you see me at CEIC, I’d like to know your thoughts on what you think of the blog so far.

Saturday, May 15, 2010

Don’t Panic

This was a big week for digital forensic news. We learned that Guidance Software purchased Tableau and that Access Data would be releasing FTK Imager for the Mac and Linux. All of this great digital forensic news would make for great fodder if I were going to be on the next Forensic 4cast, but I won’t be because I have a prior commitment. The forensic gods can be cruel. However, Lee and his band of merry forensic practitioners will have an excellent show for you soon where they discuss these issues. Fortunately, I have a blog that is read by many twos of examiners where I can comment on these sorts of things.

The initial reactions to the Tableau purchase from my fellow digital forensic examiners ranged from concern to opposition. Not exactly a vote of confidence for the folks over at Guidance, but having been in this business for many years now, I understand their concern. We’ve all been burned by the major forensic software vendors like Guidance. How many disastrous EnCase version releases have you lived through? I’ve been through three so far where the digital forensic community essentially paid to be beta testers until Guidance fixed their product to do what they said it would do when they sold it to us. Remember how well the indexing feature worked when V6 came out?

Access Data has evolved into Guidance’s mortal enemy and they haven’t been immune to playing Lucy to the community’s Charlie Brown trying to kick the forensic football. FTK ME FTK 2 was a situation where, once again, a major forensic vendor released a product that they should have known wasn’t ready for prime time and essentially expected their customers to pay to beta test their product.

Back when I first started in forensics, EnCase was in version 3 (Good Ol’ 3.22g was the classic V3 version) and most people used it as their primary forensic tool and used FTK 1 for things like email and to test their keywords. Sure, some people used FTK as their primary GUI toolset, but they weren’t the majority. The world was Guidance’s oyster and they acted (and charged) like it. This attitude created a lot of hard feelings in their customer base which linger to this day.

Not too long ago, Access Data made its great leap forward when it obtained a cash and talent injection (lots of that talent came from Guidance), which resulted in a flurry of product innovations including the wretched FTK Vista FTK 2. You could see what they wanted to do with FTK 2 and how cool it could be, but it just didn’t work. For whatever reason, they released it before it was done baking, which might have been a tribute to Guidance because that’s what they had been doing to their customers for years. Eventually, they got it right and released FTK 3 (AKA FTK The Apology) which is a great tool. Access Data even made an offer to buy Guidance. I’m not sure if it was a serious offer or just a good PR stunt, but it illustrated how far Access Data had come from behind to get to where they are today.

Guidance is a publicly traded company and as such we can review a lot of their financial data because they have to send so much of it to the SEC. Access Data isn’t a publicly traded company so they don’t have to release much of anything. Thus, we can’t really compare financial information, but my opinion is that Access Data took the lead in the innovation competition with FTK 3. Guidance has been doing incremental innovation with their EnCase tool, but EnCase V6 doesn’t feel all that different to me from EnCase V3. Sure, the UI has evolved a bit and they’ve added incremental innovations over the years such as email support, Internet history support and great encryption support. The rub is that a lot of their innovations have been done better by other people with other tools (both paid and free). There isn’t much reason to use, for example, their email or Internet history support options. If I’m going to parse an index.dat file, it’s not going to be with either EnCase or FTK. However, for email FTK still wins hands down and EnCase has never been a great email forensic tool.

FTK 3 is a big change from FTK 1. While the UI borrows quite a bit from FTK 1, the move to Oracle allowed Access Data to do a lot more with the tool such as handle larger data sets in a more efficient manner. They have a long laundry list of innovations that they have put into FTK 3 such as fuzzy hashing, distributed processing and remote evidence mounting. You can have all of this cool technology for a pretty reasonable price. Gone are the days when FTK was a glorified email tool. You can now comfortably use FTK as your primary forensic GUI tool and not use EnCase at all if you like. This is a problem if you are Guidance Software, especially since Access Data is working very hard at closing the gap at the enterprise level.

The last thing any one of us in the digital forensic community should want is for one of these companies to “win”. We don’t want to go back to the days where one was dominant and treated its customer base accordingly. I don’t know anyone who didn’t dread the idea of Access Data purchasing Guidance Software to return us back to the pre-competitive era in digital forensic GUI tools. Robert Botchek and Tableau have been doing a lot of innovation in the area of data acquisition and have rightly earned the good will of the community because of that. The TIM tool when coupled with a Tableau product is an amazing innovation in data acquisition, for example. I suspect that this purchase was a low cost way for Guidance to help close the innovation gap that has been opened by Access Data. If Guidance essentially allows Tableau to be Tableau and continue to innovate, it should be good for Guidance and the community. I wonder if the deal that Guidance made (and this is pure speculation on my part) was essentially to tell Botchek\Tableau that GSI would provide the funding and the day to day operational support (HR, payroll, marketing, etc) while the Tableau team would be free to just concentrate on innovation.

We all know what the worst case scenarios could be based on past behavior. For example, TIM becomes an EnCase only tool and you have to pay $500 more per dongle to use it. That would be a Bad Thing(tm), but I suspect that Guidance knows it now lives in a world where it can’t act like it used to and continue to be successful.

My bottom line is that I like and use products from Access Data and Guidance Software. EnCase V6 is my primary GUI forensic tool, but I’m increasingly using FTK for tasks that I used to do in EnCase. I have no desire at all to return to the bad old days where one of them was dominant over the other. We should want both organizations to win rather than having one of them lose. If this Tableau purchase helps maintain a rough balance of power between the two, I think it’s going to be good for the community.

Saturday, May 8, 2010

Flip Video Forensics

Staying true to my compulsion to forensically examine anything I can connect to a computer, I decided to see what sort of information I could pull off of a Flip Video UltraHD device.

It turns out that these devices aren't terribly difficult to examine, which isn't surprising since they're narrowly purposed. They're very user friendly devices that allow easy creation and sharing of relatively high quality videos. They are designed to be plugged into a computer's USB port so that video can be pulled off and shared via the software included on the device itself.

As with the Kindle, write blocking can be accomplished by standard USB write blocking procedures. For this examination, I used the Windows USB write blocking software (essentially just an automated registry modification program) that came with the SANS 508 class disk. You should also be able to use traditional hardware write blocking methods such as the Tableau T8 USB write blocker.

The device has one FAT32 partition that comes in at around 7.6GB with most of it being unallocated space that is used for video storage. The actual system files don't take up much more than 120MB of data and include the software needed to run the actual device as well as the software that a user would place on a computer to manage their videos. There is software on the device for both Windows and Mac.

The videos themselves are in the DCIM\100VIDEO folder and are in MPEG-4 format. The video files are numbered in the order they are created, starting with "VID00001.MP4". There aren't any surprises when it comes to deleted videos and you can recover them like you would any other file from a FAT32 volume. Thus, deleted videos will show up as "_ID0007.MP4" as you would expect based on normal FAT file system behavior. I did a keyword search for the header information for MP4 videos and I was able to get plenty of hits in unallocated space. One system file of interest sits in the root folder: the "INFO.BIN" file, which contains useful information such as the firmware version and serial number of the device.
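
The keyword search idea above can be sketched in a few lines of Python. This is my own minimal illustration (the function name and the size sanity check are choices I made for the sketch); it relies on the fact that an MP4 file begins with a 4-byte big-endian box size followed by the "ftyp" signature, so a hit on "ftyp" four bytes into a candidate region is a decent indicator of a file start:

```python
# Minimal sketch: scan a chunk of raw (e.g. unallocated) data for the
# MP4 'ftyp' box signature. In a real MP4, bytes 4-7 of the file are
# b'ftyp', preceded by the 4-byte big-endian size of the ftyp box.

def find_mp4_headers(data: bytes):
    """Return byte offsets of candidate MP4 file starts in `data`."""
    hits = []
    pos = data.find(b"ftyp")
    while pos != -1:
        if pos >= 4:
            # The 4 bytes before 'ftyp' are the ftyp box size; sane
            # values are small (the box is typically 16-32 bytes), which
            # weeds out random 'ftyp' byte sequences.
            box_size = int.from_bytes(data[pos - 4:pos], "big")
            if 8 <= box_size <= 64:
                hits.append(pos - 4)  # candidate file start
        pos = data.find(b"ftyp", pos + 1)
    return hits
```

In practice you would carve forward from each candidate offset and validate the result by playing the file; a dedicated carver also handles fragmentation, which this sketch does not.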

Saturday, May 1, 2010

GIAC Certified Haiku Master

One of the many reasons why I like the SANS Institute is that the collective SANS community is made up of some pretty sharp and creative people. One of the creative things that SANS did in promoting their upcoming SANS Boston conference was to have a Twitter Haiku contest that was judged by Craig Duerr with the words provided by Stephen Northcutt.

I've always had an interest in Haiku so even though I won't be able to make the conference this year, I decided to take a shot at the competition. It turns out that Dan Crowley also has a thing for Haiku and he decided to participate also. The battle was joined and Dan and I entered what we affectionately began to call the SANS Haiku Thunderdome (Two poets enter, one leaves).

Dan won the first round and I somehow managed to win the next two rounds and emerge the winner. I'll be the first to admit that Dan is a better Haiku poet than I am, but sometimes a blind squirrel finds a nut and that just happened to be me this time around.

The spoils of victory were a certificate proclaiming me a GIAC Certified Haiku Master, and I was also awarded an Iron Kung Fu fan as my trophy. Very cool. We'll call this competition reason #214 why I love SANS.

Sunday, April 25, 2010

The Ballad of Grayson Lenik

Grayson Lenik is a relatively new member of our community who has made the decision to move from a systems administration focus to a digital forensics focus. You can follow his journey at his blog "An Eye on Forensics".

Grayson is clearly a sharp fellow. From what I can tell, he passed the SANS GCFA exam via the challenge process rather than taking any of the SANS course content. That's impressive considering the scope and difficulty of that exam. Grayson has been encouraged to contribute to the recently started Into The Boxes digital forensic online magazine and, to his credit, he's accepting the challenge and looking for a topic to research.

His comments on the research issue made me think of my own decision making process regarding engaging in digital forensics research. I've been doing digital forensics for a relatively long time now, but it was only last year that I decided to start contributing to the community in a meaningful manner in this area.

The reasons why I hadn't done so were largely due to intimidation. I look up to people such as Harlan, Rob, Jesse and Eoghan (otherwise known as the people who don't need last names to know who they are) and the work that they have done advancing the field with their research, training and tool development efforts. Who was I to even think that I could play on the same field as them? I also fell into a common trap that I see with IT security people which is that because I didn't know everything, I thought that I didn't know anything.

Last year I stumbled across Adobe Flash Cookies while doing an examination and started to dig into them. I began to learn that some of these cookies can provide a treasure trove of information for a digital forensic examination and started to parse them out as well as I could. I made a couple of phone calls to some very experienced examiners and asked them if they had heard of these artifacts before and was told that they had not. One of those examiners was actually able to take what I told them over the phone and put it to use in a criminal investigation they were working, so I knew I had something that would be beneficial to the community.

So I decided to just plow ahead and start writing something up with the goal of having something to present at a conference like CEIC. I started to create an early overview paper. I was lucky enough to have people like Cindy Murphy, Gary Kessler, Jimmy Weg and Mark Johnson review that paper and make suggestions on how to improve it. Cindy even managed to carve out some time from her busy schedule to do some additional research regarding a particular kind of cookie that really helped fill out my knowledge. I briefly distributed the paper through some of the email lists like IACIS and HTCC, hoping that it might get the word out and generate some additional research leads.

I sent it out to the community and heard....nothing much. I later learned this is a pretty common occurrence in our community even for Those-Who-Only-Need-A-First-Name. A digital forensics researcher will put a lot of work and effort into a project, release it out for free and ask for feedback...and will rarely get any back. I would get people thanking me for providing them the paper after I sent it to them, but then no response back to my requests for feedback on whether it was useful, whether they found any errors, how I could improve the final product, etc.

One of the notable exceptions to this was Jesse Kornblum. Some time after I had released the paper, I checked my email to see a request from Jesse for the paper. It was a classic good news/bad news situation. The good news was that Jesse Kornblum wanted to see the paper. The bad news was that Jesse Kornblum wanted to see the paper. I'll admit to a certain amount of dread when I hit the send button. The short version of the story is that Jesse liked what I had done. He offered encouragement and suggestions on how to proceed. Very cool!!!

So bolstered with my newfound confidence, I pressed forward with the research project and hit a major sticking point when I encountered some very odd metadata behavior that I absolutely could not figure out. I was saved by Eoghan Casey, who helped me determine that the odd behavior I was seeing was due to File System Tunneling (which I will explain at my CEIC presentation next month). Yet another of my forensic idols riding to the rescue!

Around January or so, however, I started to realize that I was in over my head. I was able to parse out the header information for these artifacts, but I didn't have the knowledge to parse everything completely. My hex-fu was okay, but it wasn't good enough to finish the project the way I wanted. The way I saw it, I could either crawl back into my hole and admit defeat, or publish what I had learned so far and hope that someone else could run with the research at a later date. I chose the second option, with an eye toward getting what I had completed published in some form.

Then on Feb 17th, 2010, I got lucky. Kristinn Gudjonsson posted some of his Adobe Flash Cookie research on the SANS Forensic blog. My initial reaction was that I had been too slow and too unknowledgeable, and had just wasted months of my research life, because what he had done was so fantastic that it was better than I could have ever done. I even found that I had made at least one major error in my original header research. Woe is me, right? However, when I started to look closer, I realized that we had approached the research from different standpoints. Kristinn is an amazingly sharp incident responder and forensic examiner with an engineering background. That means he spent a lot of time looking at the hex level view of these cookies and did an exceptional job parsing them out. I approached the research from a more traditional investigative digital forensics perspective, which means I concentrated on the metadata (which is why I discovered and overcame the file system tunneling issue) and a lot of the higher level aspects of the research, such as how and when Flash cookies tended to appear on a machine. I became excited about the prospect of merging the research, but would someone like Kristinn be willing to talk to little old me? (There's that self doubt again...)

As you know from my previous blog entries, yes, he was more than willing to talk and after a flurry of emails comparing our various notes on the project, we decided it made good sense to team up and create a final research project.

The moral of the story?

1. Be like Grayson Lenik, not Eric Huber. Grayson has been a member of our community literally only for a matter of months and he's already sharing what he's learning through his educational process and he's going to do a research project for ITB. It took me years before I decided to do what Grayson is doing now.

2. Research what you know, and if you get stuck, get help and continue on. There are vast research opportunities in digital forensics for all skill levels. Harlan wrote a particularly pithy bit of advice for Grayson when he said "...start writing about what you know...we'll work with you." That's essentially what I have been doing. I plow through the best I can within the range of my abilities, and if I get stuck, I go ask for help. Grayson will do great because he's a sharp fellow who has the desire to do the work, and he'll have people like Harlan and Don Weber to help him when he needs it. What I've found is that the gurus like Harlan and Don are very helpful if you approach them in the right way.

3. If you don't have time to complete a project, even partial research is helpful and someone else might take what you have done and run with it. I did that with my Kindle forensic research. I knew I wasn't going to have the time and probably the knowledge to completely parse every aspect of what one can find on a Kindle so I posted what I learned on this blog.

4. Provide feedback. If you don't have the time or desire to do digital forensic research, no worries. However, one thing that you can do to help those who are doing it is to provide feedback when you have found something useful that helped you in your job. Did you like a particular digital forensics book? A nice thing to do would be to post a review at a site like Amazon. Even negative feedback is welcome as long as it's constructive. If I made a mistake, I want to know about it. If what I wrote didn't make any sense, it doesn't help me develop as a writer or a researcher if I don't know what I'm doing wrong.

Tuesday, April 20, 2010

Forensic 4cast and Me

The most recent Forensic 4cast podcast is up with a brand new format. Lee has decided to test out a panel format where he brings together people from the digital forensic community to discuss the topics of the day.

This episode included a panel that consisted of Lee, Tom Yarrish, Joe Garcia and myself. Give it a listen and let Lee know what you think about the new format. I'm grateful to Lee for the opportunity and I hope I did a good job for him. I have to admit that I was a bit vexed when I heard the podcast after the fact because the sound quality from my phone wasn't remotely as good as the other panelists. I already have a proper Skype certified phone on order from Newegg so that I can use it with Skype next time and not sound like the panelist who is calling from the outer reaches of Absurdistan.

Lee has also released the much anticipated presentation on Volume Shadow Copies that he was due to give at the SANS EU Forensic Summit. That summit was delayed because of, as Chad Tilbury puts it, the Krakatoa eruption in Iceland. Chad made the Krakatoa reference on Twitter this week and I've been laughing about it ever since. It's yet another reason why I like socializing with my fellow digital forensic examiners on Twitter. Chad is a very sharp fellow and one of the primary SANS digital forensics instructors.

As Lee was nice enough to mention at the end of the podcast, I will be presenting on the topic of Adobe Flash Cookies at this year's CEIC conference. Kristinn Gudjonsson and I have been working on an article to submit to an academic journal, and I have crafted an overview of the research for the presentation. The presentation won't cover much of the content in the article because there just won't be enough time to do that, but it will provide examiners with enough of an understanding of these artifacts to start using them in their digital examinations. I'm looking forward to CEIC this year as there are a lot of amazing presentations, such as Rob Lee's Super Timeline Analysis Lab. I also think it's a moral imperative that I have a Bacon N' Eggs burger at LBS Burger.

I started this research project independently late last year, and it turns out Kristinn had also been working on parsing these artifacts as part of his larger log2timeline research. He posted about them on the SANS Forensic blog earlier this year, and that's when we discovered that we had been working on the same subject. We essentially had a "you got chocolate in my peanut butter" moment and decided to team up on a paper that we hope will be useful to the community. Kristinn is certainly the brains behind the operation given his very robust technical background. I never would have been able to fully parse these artifacts on my own because I don't have the deep technical knowledge that Kristinn has, so I'm lucky he posted on the SANS Forensic blog when he did and that he's generous with his time and knowledge.

One of the reasons I mentioned Chad earlier in this post is that he also did some research on Adobe Flash Cookies and posted about it on the SANS Forensic blog.