Thursday, September 9, 2010

NYC4SEC Meetup

The next NYC4SEC Meetup is next Thursday at Pace University in NYC.  The special guest for this event will be Ovie Carroll from the SANS Institute.  Ovie will be teaching FOR408 next week in New York.

The response to the “Take Vienna” blog post was very positive and I’m grateful for all of the comments that were posted on the blog and sent to me privately.  I’ll be doing a follow up blog post where I expand on the subject a bit more. I will also provide some examples of excellent digital forensic scientists who have devoted their lives to becoming digital forensics professors and teaching others to become lethal forensicators.

Saturday, September 4, 2010

Take Vienna

Napoleon once said that if you start to take Vienna – take Vienna.  This is the same advice that I give people who are interested in obtaining an academic degree in digital forensics. If you start to study digital forensics – then study digital forensics.  If you are passionate about digital forensics and you want to break into the field by obtaining a digital forensics degree then do it properly.

With the increasing popularity of digital forensics, we are seeing an explosion of academic programs that claim to prepare students for a career in the field. Some of these programs are well suited for this task and others appear to be a great waste of time and money. 

For example, I have observed quite a few programs that label themselves as computer forensics programs, but offer very little in the way of a proper computer forensic education.  Many of these programs are nothing more than classic computer science programs that offer a handful of computer forensic classes by instructors whose CVs don’t indicate a mastery of the field.

It’s not that students won’t benefit from academic programs that teach foundational information technology skills such as networking, programming and databases as they prepare for a career in digital forensics.  Some of our greatest digital forensic gurus studied disciplines like electrical engineering (Harlan Carvey), computer science (Jesse Kornblum) and mechanical engineering (Eoghan Casey).  However, we live in a time where those who are passionate about the field have many opportunities at the academic level to build a strong foundation in digital forensics early in their careers.

If you are going to get a degree in digital forensics then get a proper degree in digital forensics.  The digital forensics program at Champlain is a good example of what appears to be a solid program. I have heard very good things about this program from at least one of my trusted peers who has hired their graduates.  Champlain offers a bachelor’s level degree in Computer and Digital Forensics.  Instead of a handful of token computer forensic classes layered on top of a traditional computer science curriculum, this degree program appears to be specifically designed to prepare students for a career in digital forensics. It is also offered online and at the Champlain campus.

If you look over the curriculum, you will see that they offer nine specifically branded forensics courses including an internship.  These courses include content specifically geared towards digital forensics such as a pair of foundational computer forensics courses, but also courses in areas such as anti-forensics and network forensics.  A nice bonus is that students can get some training in areas such as white collar crime, forensic accounting, criminal law and criminal procedure.  This program also provides students with the opportunity to obtain grounding in general information technology skills such as networking.

A critical consideration when deciding which degree program to enroll in is not only the strength of the material, but who is teaching you that material.  I like to review the CVs of professors who teach computer forensic courses to get a feel for whether these are people who actually have experience in the field or if it’s just a side thing for them.  A lot of the people I see teaching digital forensic classes are people who appear to have very strong backgrounds in computer science, but look very weak when it comes to digital forensics. It’s a bad idea to get a computer guy, even a highly skilled one, to act as an expert witness in a legal case instead of an actual digital forensics expert. It strikes me as an equally bad idea to have that same computer guy teach people digital forensics.

With a program like Champlain, you get an instructor like Jonathan Rajewski teaching some of your classes.  Jonathan might not have a PhD, but he has real-life experience in the field and worked as a full-time digital forensics practitioner before he became a professor at Champlain.  In fact, according to his biography, he continues to work in the field as part of the Vermont ICAC Task Force.

The rub with a program like this is that it comes at a high price.  The online program costs $540 per credit hour.  The campus based program is going to cost you over $27,000 a year.  Student loans can be a horrible burden if you borrow more money than your degree is ultimately worth.

Another interesting looking program that I don’t have much familiarity with is the Bachelor of Science program in Technology Forensics at the University of Advancing Technology.  If you look at their online course content, it has a similarly strong focus on actual digital forensics, much like the Champlain program.

Network Security 2010

SANS Network Security 2010 in Las Vegas is mere weeks away.  Get your seat if you haven’t done so already since classes at these events can sell out before the event.  For example, Jonathan Ham’s FOR558 Network Forensics class already has a waiting list.

I’ll be acting as Rob Lee’s Teacher’s Assistant for FOR408 Computer Forensics Fundamentals. This class has been expanded to a sixth day because of all of the new forensic goodness that has been added.  I can’t wait to meet all of the students and help Rob turn them into lethal forensicators.

Saturday, August 28, 2010

Kristinn Gudjonsson’s GIAC Gold Paper Released

Kristinn has announced that his GCFA gold paper entitled “Mastering the Super Timeline With log2timeline” has been released. You can get it here.

Kristinn and I are also back at work on the Adobe Flash Cookie research and tool development project and we hope to have it wrapped up relatively soon.  The release of Flash Player 10.1 set my portion of the research back a little bit since there were some changes to how things work, but the fundamentals remain the same.

I have completed the file system tunneling research portion of the project and that will be part of the final paper since it’s critical to understanding time and date issues with these artifacts. The universal response when I have approached various forensic gurus on the issue has been unfamiliarity.  It appears that file system tunneling was esoteric enough that it hadn’t appeared on anyone’s file system research radar until Kristinn and I ran into it during the course of our research.

Sometimes you just get lucky.

Links

There have been a lot of interesting items that I have run across recently that I’d like to share with the group.

The first is an EFF article on Apple’s efforts to patent spyware and what EFF terms “traitorware”. Your spider senses should start tingling when you read the article.

The second is a fantastic Brad Garnett SANS Blog post on report writing.  Report writing is an area that is critically important for digital forensic examiners to learn and master, but it’s a very neglected topic when it comes to digital forensic training.

Lastly, Brandon Gregg has an excellent article over at CSO Online on free and cheap tools to help manage investigations.  I found the last segment on “hypothesizing your investigation” to be particularly intriguing. 

Sunday, August 22, 2010

Horse Then Cart

I usually have a couple blog posts in some sort of draft form at any given time.  This thread over at Forensic Focus allowed me to flesh out one of those drafts enough that I essentially wrote the core of what I’d like to write about today.

I’m frequently approached by people who are either putting together a forensic team or just preparing to get into forensic work on their own.  One of the first questions that I am almost invariably asked is what tools I recommend that they obtain.  That’s not a question that I can intelligently answer unless I have some idea of what purpose the team is going to serve.  In the business world, customer requirements should drive tool selection, team recruitment and process development.  One of the classic mistakes that one can fall into is allowing your processes to be driven by your tools.  Vendors are very keen to sell you their tools regardless of whether they are a good fit for your organization or customer requirements.  If you allow the vendors to drive your team development by making you dependent on their tools and processes, you can end up with a team that isn’t much more than a group of glorified tool drivers.

It’s important to understand the customer requirements first and then craft a team to meet those requirements.  A team that is intended to do a lot of eDiscovery work will not have the same tools, people and processes as a team that is primarily tasked with incident response forensics.  The eDiscovery focused team, for example, won’t have as keen a need for memory forensics and malware reverse engineering compared to the incident response forensic team.

Team selection and training will also be driven by customer requirements for the same reason.  There are quite a few subdisciplines in digital forensics and team selection will be driven in part by how mastery of selected subdisciplines will accomplish the customer requirements.  For example, if a team is going to be engaging in eDiscovery, that team will require people with backgrounds in areas such as software development and database management.  There are many fine eDiscovery tools out there, but they aren’t necessarily going to be tailor-made to meet your requirements right when you take off the shrink wrap.  I’m still amazed when I run across firms that are doing eDiscovery work who don’t have anyone on staff who can do development work.  That tells me that their processes are probably being dictated by the tools and they almost certainly lack the flexibility of other firms that have a robust developmental capability.

By the same token, it’s hard for me to consider a team to be a world class incident response team if it doesn’t have a robust malware reverse engineering capability.  Sure, you can do incident response without having malware gurus on staff, but it’s a hard sell to claim that you’re somehow on the cutting edge of incident response work if you can’t study the tools of an attacker in detail.

Lastly, it strikes me as a mistake to disqualify otherwise excellent forensicators because they aren’t familiar with a particular tool that your team uses.  While it’s certainly something to take into account, a candidate’s fundamental knowledge is more important than their ability to drive a tool. It’s much harder to pick up the fundamentals than it is to learn to drive a tool.

Sunday, August 8, 2010

Tweeting Forensicators

During a recent episode of the Inside the Core Podcast, Joe Garcia of Cybercrime 101 spoke about how he uses Twitter to tap into the collective knowledge of the community.  I held out against using social media for a very long time and it has only been within the last year that I’ve come to embrace at least some of it.

I say some of it because I briefly experimented with Facebook and decided it was wretched.  Its business model is designed around the concept of users being a commodity rather than a customer.  The users gladly input their personal data into the system and Facebook diligently works at turning that personal data into cash for Facebook.  Factor in all of the noise from the games, a mediocre user interface and annoying ads and I’m more than happy not to use it.

Twitter, on the other hand, has turned out to be a very useful communication method that I’ve embraced along with many others in the digital forensics community.  I initially created a Twitter account just to see how the system worked.  I didn’t really do much of anything with the account for quite some time.  I eventually decided to start following some of the forensic gurus who Tweet and that resulted in me becoming actively involved in the digital forensics Twitter community.

It’s a great way to keep up on developments from the community because it tends to work like a form of Digg where the users you follow will determine what sort of news stories, research results and other information appear on your Twitter timeline.  For example, I follow digital forensic and information security gurus like Rob Lee, Harlan Carvey, Richard Bejtlich, Chad Tilbury, Ed Skoudis, Mike Murr (rumor has it that Mike isn’t actually blue in real life. I refuse to believe this until I see it with my own eyes), Stephen Northcutt and Mike Cloppert. Most of these people use their Twitter accounts to distribute news and commentary on the information security issues of the day.  For example, Richard Bejtlich’s Twitter feed was a must read for those who weren’t able to attend this year’s Black Hat in Las Vegas.  Twitter was also a great source of information during the recent SANS Forensic Summit for those of us who weren’t able to attend.  Because so many people who were at the summit were actively Tweeting about the event, those of us who weren’t there could interact with the participants and experience at least a little bit of the energy of the event.

A lot of our fellow forensicators also use Twitter to socialize and interact with the community on a more personal level.  The Twitter forensic community has been a nice experience in that it has helped to build a sense of camaraderie that can be hard to establish when you have so many people who are physically separated from each other.  I have found this community to be very helpful when I need to get information to help solve a problem on short notice. For example, I recently ran into trouble with an encrypted device and I was able to get instantaneous help from a variety of forensic experts from around the globe.  A problem that several years ago might have taken me days to resolve through sources like email list servs was solved in a matter of an hour or so through Twitter.

Building up strong relationships is important for professional and technical success.  It can be hard to sell the value of developing strong relationships in an industry that can sometimes be dominated by traditional IT types who aren’t necessarily the most social people to begin with.  I’ve spent a lot of time over the years establishing relationships with other forensic people because I learned very early in my forensic career that since you can’t know everything, it’s important to have relationships with people who can help you when you get into a bind. Through Twitter, I have been able to meet and get to know some great people such as Joe Garcia, Lee Whitfield, Mark “Toolio” McKinnon and many others who I never would have had the opportunity to interact with had I not become involved with the Twitter digital forensics community.

So my advice is to give Twitter a try and become involved with the Twitter digital forensics community.  You can lurk without becoming actively involved and just soak up all of the good knowledge that is passed around the community each day or you can get more actively involved and start to build some productive relationships with your peers.

Reason #217 Why You Shouldn’t Hire A “Computer Guy” To Do A Forensic Examination

Lee posted this sanitized report that came from someone who clearly is a “computer guy” rather than a lethal forensicator.  I have seen this problem first hand and I have heard many similar stories from my fellow examiners who have dealt with this problem in the past.

It’s the same basic scenario that plays out around the globe it seems.  An otherwise sharp attorney has a client who needs an expert to deal with computer evidence during a legal proceeding.  The attorney decides that because it’s computer related evidence, they need a “computer expert” to act as their expert witness.  For whatever reason, they are lured into the trap of thinking that someone with a lot of knowledge about computers must also be qualified to do digital forensic work.  Maybe this “expert” even has a Microsoft certification and the attorney thinks that an MCSE qualifies this person to perform a forensic examination.

The report that Lee has in his blog post is the common result and it’s a disaster for the attorney and the client.  A report like this will likely lead to a very uncomfortable outcome if the other side has a competent forensicator advising the opposing counsel.  I can only imagine the miserable experience that this “expert” would have had trying to defend this report during a cross-examination.

If you read the report and find yourself having a hard time seeing what the problems are in the report, I’d like to gently suggest that you might find a lot of value in taking the SANS Computer Forensics Fundamentals course.  The good news: Rob Lee will be teaching this very course in Las Vegas next month at SANS Network Security 2010.  The bad news: If you take this course next month, you’re stuck with me being Rob’s Teacher’s Assistant.  I’m very much looking forward to helping Rob turn out another batch of lethal forensicators and I hope I get to see some of you there at Network Security 2010.

Wednesday, August 4, 2010

Newer Blog Template

I started to get eyestrain while reading my own blog and there was at least one other person who mentioned they were having the same issue with the new blog template.  I now understand why major news sites and blogs use black text on a white background. This current template will serve as a temporary one until I can develop something a bit more permanent.  The ultimate goal is to come up with a template that is clean and easy to read.

Monday, August 2, 2010

New Blog Template

When I created the blog, I picked out the most minimalist template that I could get away with because I just wanted to get some content up without agonizing over appearance.  I decided that it’s time to make the blog look a bit better now that it’s gained in popularity.

I’ve been scouring the Internet for a while looking for a template that would be appropriate, but couldn’t really come up with anything that wasn’t stereotypical.  The standard technical templates are what you’d expect…pictures of computers, mobile phones, countless Matrix themed backgrounds, Windows themes, etc.

I like the white lettering on black background that the SANS Forensic Blog uses so I decided to just use the Blogspot template creator to come up with something workable that I hope you will like better than what I had before.

I have increased the size of the font in addition to using the black and white theme so that it would be easier to read.  I also have to admit that I’m fond of the juxtaposition of the scenic yet barren desert background alongside the highly technical content of the blog itself.

I’ve also put up a live Twitter feed for the AFoDBlog Twitter feed so that readers can get a sense of what I’m putting out through that account.

Wednesday, July 28, 2010

Go After The Flank

There was a thread this week on one of the digital forensic email lists I follow where the initial email was from an examiner who was seeing signs of an anti-forensic wiping program.  The examiner was looking for assistance in determining what program might have been used.  He had performed what any of us would normally do such as looking in places like the registry and so forth.  I responded to the list on how I sometimes approach problems like these in an indirect manner by looking at web history.  An examiner from a government digital forensics lab found the response useful since he’s in the process of training some new examiners and asked if he could pass it along.  Of course, I was flattered that he thought it was useful information so I was happy to see him make use of it in training his new examiners.  I thought I’d share my thoughts with the rest of the community through this blog in case anyone else found it useful also.

Web history is still good for this sort of investigation.  It's an indirect way of going after the problem, but one of the things I've learned about digital forensic examinations is that sometimes it pays to flank the enemy, so to speak.

For example, if you come up empty with the traditional registry forensic searches, hitting an image with something like HSTEX and going over all of the browser history that is available might get you some results.  I've had cases where if I can get an image soon enough after an application of interest is installed and used, I can see a predictable timeline of events such as the user's Google searches looking for a particular application, the user accessing a specific application's website, the download link and sometimes the file access information from IE history when the user starts interacting with the program in question.
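That predictable timeline can be sketched in a few lines of Python.  Everything here is illustrative: the record layout, the URLs and the application name are invented stand-ins for this example, not the actual output format of HSTEX or any real case data.

```python
from datetime import datetime

# Invented sample of exported browser history records (timestamp, URL).
# A real export from a history extraction tool would be far larger.
HISTORY = [
    ("2010-07-01 09:12:03", "http://www.google.com/search?q=evidence+eliminator"),
    ("2010-07-01 09:14:47", "http://www.evidence-eliminator-example.com/"),
    ("2010-07-01 09:15:30", "http://www.evidence-eliminator-example.com/download/setup.exe"),
    ("2010-07-01 09:40:12", "file:///C:/Program%20Files/Evidence%20Eliminator/ee.exe"),
    ("2010-07-02 11:02:55", "http://www.cnn.com/"),
]

def timeline_for(keyword, records):
    """Return chronologically sorted history entries mentioning the keyword."""
    hits = [(ts, url) for ts, url in records if keyword in url.lower()]
    return sorted(hits, key=lambda rec: datetime.strptime(rec[0], "%Y-%m-%d %H:%M:%S"))

# The search, the site visit, the download link and the file:// access
# fall out in order, telling the story of the wiper's installation.
for ts, url in timeline_for("eliminator", HISTORY):
    print(ts, url)
```

Real history extraction handles dozens of timestamp formats and encodings, but the analytical move is just this: filter on the application name and read the hits in time order.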

That's why even if you have a user who is using a non-IE browser, you still want to process all of the browser history, especially those IE index.dat files, because you can still get some interesting file hits from that history.

This is one of the reasons why digital forensics feels like an art sometimes.  It’s certainly a science and we should be using the scientific method early and often in how we approach our jobs. In addition to having a strong analytical approach, one of the things I like to see in an examiner is a healthy amount of creativity and curiosity.  These are qualities that greatly assist in solving challenges that we’re continuously faced with in digital forensic examinations.

Tools

I’ve been bookmarking quite a few websites that have come up through my Twitter feed and other resources that I’d like to share with the group.

The HSTEX tool that I mentioned above is Craig Wilson’s awesome web browser history extractor.  You can find it here and it goes together with his NetAnalysis product like chocolate and peanut butter.

The first is one that I learned about from Jonathan Krause and that’s ddrescue.  Jonathan pointed out an article at howtogeek.com that talks about the use of the tool. This might be a tool that many are already familiar with especially if you are used to doing forensics in a Linux environment.  I’ve recently rediscovered the joys of using Linux in digital forensic examinations so I’m enjoying learning about tools like this.

The next tool of interest is the CAINE Live CD.  I don’t remember how I learned about this tool, but it looks interesting enough that I’d like to play with it more to see if I should add it to the toolbox. The CAINE project is managed by Nanni Bassetti and contains a whole host of forensic tools including the previously mentioned ddrescue. Another nice feature is the WinTaylor component which includes tools like the Nirsoft Mega Report which uses a variety of Nirsoft tools to extract data for a report.

Next up is FirePasswordViewer and, again, I didn’t write down where I learned about this one.  I haven’t tried this program either, but it looks like it could be a useful tool for extracting login passwords from Firefox.  This gets back to the idea of flanking the enemy when it comes to forensic examinations.  If I have an encrypted container that I can’t easily brute force, I might be able to just cut the Gordian Knot by obtaining passwords from easier to attack sources like this and using those same passwords against the encrypted container.  Sure, you used Serpent-Twofish-AES encryption on your TrueCrypt container that you didn’t want the police to examine, but you used the same password that you saved in your Firefox password container to login to your Facebook account.
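The flanking idea can be made concrete with a minimal Python sketch: de-duplicate the passwords recovered from the easy source (the browser's saved-password store), then try each one against the hard target.  The `try_unlock_container` callback is a hypothetical stand-in for illustration; a real container would be attacked with an actual tool, not this function.

```python
def build_candidates(recovered_passwords):
    """De-duplicate recovered passwords, preserving the order they were found in."""
    seen = set()
    candidates = []
    for pw in recovered_passwords:
        if pw not in seen:
            seen.add(pw)
            candidates.append(pw)
    return candidates

def attack_container(candidates, try_unlock_container):
    """Try each candidate password; return the first one that opens the container."""
    for pw in candidates:
        if try_unlock_container(pw):
            return pw
    return None

# Demo with invented passwords and a fake "container" that happens to be
# protected by one of the passwords saved in the browser.
recovered = ["hunter2", "letmein", "hunter2", "Tr0ub4dor&3"]
winner = attack_container(build_candidates(recovered),
                          lambda pw: pw == "Tr0ub4dor&3")
print(winner)
```

Trying a handful of recovered passwords first is essentially free compared to brute forcing a Serpent-Twofish-AES container, which is exactly why password reuse is such a gift to the examiner.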

Lastly, we have the Paladin Live CD from SUMURI.  The guys over at the Inside the Core Podcast (see below) talked about it on their most recent episode. I haven’t been able to test this one out yet either, but it’s a Live CD that can be used for making images.  The nice thing is that it can be used to image Macs in addition to PCs.  When people ask me about how long it takes to remove a hard drive from a Mac laptop, I tell them about 15 years.  Four years of undergraduate school, four years of medical school, five years of general surgical residency and a two year fellowship until you have the necessary surgical skills to successfully remove a hard drive from a Mac and to put it back in without the dreaded “Bag O’ Laptop”.

Podcasts and Blogs

I don’t perform Mac forensics, but given that Inside the Core defeated Forensic 4cast (where I’m a panelist from time to time) and Cyberspeak in the 4cast awards, I thought I’d give it a spin this week.  I have to say that even though this isn’t an area of digital forensics that I’m currently engaged in, I really enjoyed the podcast.  The most recent episode features an extended section on Google Chrome forensics which, even though it was geared towards examination on the Mac platform, was useful information for Chrome forensics on a PC.

Ken Pryor put up a great blog post on the SANS Forensic Blog.  Ken provided us with a nice compilation of all of the various test images that we can use for practice and research purposes.  There were quite a few that I didn’t realize were available and I’ll happily be making use of them in future research efforts.

Thursday, July 22, 2010

Detective Cindy Murphy’s Cell Phone Evidence Extraction Process

This is a special edition of the blog in which I am honored to host a full paper that has been authored by Detective Cindy Murphy of the Madison, Wisconsin Police Department.  Cindy Murphy is a lethal forensicator and if you make the decision to engage in technology related crime, I highly recommend avoiding doing so in Madison, Wisconsin. Cindy will find you and you will go to jail.  Don’t say you weren’t warned.

In addition to being a very sharp computer forensic examiner, she does cutting edge research in the area of mobile device forensics. For example, she has developed the fraternal clone method for CDMA phones which has been published in the Journal of Small Scale Digital Forensics.

You can find a PDF version of this paper over at Jesse Kornblum’s website.

Cellular Phone Evidence Data Extraction and Documentation

By Detective Cindy Murphy

Developing Process For The Examination
of Cellular Phone Evidence

Recently, digital forensic examiners have seen a remarkable increase in requests to examine data from cellular phones. The examination of cellular phones and the extraction of data from them present challenges for forensic examiners:

· The number of phones examined over time using a variety of tools and techniques may make it difficult for an examiner to recall the examination of a particular cell phone.

· There is an immense variety of cellular phones on the market, encompassing an array of proprietary operating systems and embedded file systems, applications, services, and peripherals.

· Cellular phones are designed to communicate with the phone network and other networks via Bluetooth, infrared and wireless (WiFi) networking. To best preserve the data on the phone it is necessary to isolate the phone from surrounding networks, which may not always be possible.

· Cellular phones employ many internal, removable and online data storage capabilities. In most cases, it is necessary to apply more than one tool in order to extract and document the desired data from the cellular phone and its storage media. In certain cases, the tools used to process cellular phones may report conflicting or erroneous information, thus, it is critical to verify the accuracy of data from cellular phones.

· While the amount of data stored by phones is still small when compared to the storage capacity of computers, the storage capacity of these devices continues to grow.

· The types of data cellular phones contain and the way they are being used are constantly evolving. With the popularity of smart phones, it is no longer sufficient to document only the phonebook, call history, text messages, photos, calendar entries, notes and media storage areas. The data from an ever-growing number of installed applications should be documented as these applications contain a wealth of information such as passwords, GPS locations and browsing history.

· The reasons for the extraction of data from cellular phones may be as varied as the techniques used to process them. Cellular phone data is often desired for intelligence purposes and the ability to process phones in the field is attractive. Sometimes though, only certain data is needed. In other cases full extraction of the embedded file system and the physical memory is necessary for a full forensic examination and potential recovery of deleted data.

Because of the above factors, the development of guidelines and processes for the extraction and documentation of data from cellular phones is extremely important. What follows is an overview of process considerations for the extraction and documentation of cell phone data.
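The verification concern noted above, that tools may report conflicting or erroneous information, lends itself to a simple cross-check: extract the same phone with two tools, normalize the results, and flag every field where they disagree for manual verification.  The Python sketch below uses an invented record layout for illustration; it is not the export format of any real mobile forensics tool.

```python
# Invented summaries of the same phone as reported by two different tools.
tool_a = {"IMEI": "356938035643809", "contacts": 112, "sms": 431}
tool_b = {"IMEI": "356938035643809", "contacts": 112, "sms": 407}

def conflicts(a, b):
    """Return the fields where the two extractions disagree, with both values."""
    return {k: (a.get(k), b.get(k)) for k in set(a) | set(b) if a.get(k) != b.get(k)}

# Any disagreement (here, the SMS count) is a cue to verify against the
# handset itself rather than trusting either tool's report.
for field, (va, vb) in sorted(conflicts(tool_a, tool_b).items()):
    print(f"{field}: tool A reports {va}, tool B reports {vb} -- verify manually")
```

The point is not the code but the discipline: no single tool's output is taken at face value when a second source can confirm or contradict it.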

Cellular Phone Evidence Extraction Process

Figure 1: Evidence Extraction Process

Evidence Intake Phase

The evidence intake phase involves the procedure by which requests for examinations are handled. It generally entails request forms and intake paperwork to document the chain of custody, ownership information, and the type of incident the phone was involved in, and it outlines general information regarding the type of data the requester is seeking to have extracted or documented from the phone.

A critical aspect at this phase of the examination is the development of specific objectives for each examination. This not only serves to document the examiner’s goals, but also assists in the triage of examinations and begins the documentation of the examination process for each individual phone examined.

Identification Phase

For every examination, the examiner should identify the following:

· The legal authority to examine the phone

· The goals of the examination

· The make, model and identifying information for the cellular phone itself

· Removable & external data storage

· Other sources of potential evidence

Legal Authority:

Case law surrounding the search of data contained within cellular phones is in a nearly constant state of flux. It is imperative that the examiner determines and documents what legal authority exists for the search of the cellular phone, as well as what limitations are placed on the search of the phone prior to the examination of the device:

  • If the cellular phone is being searched pursuant to a warrant, the examiner should be mindful of confining the search to the limitations of the warrant.
  • If the cellular phone is being searched pursuant to consent, the examiner should note any possible limitations of the consent (such as consent to examine the call history only) and should determine whether the consent is still valid prior to examining the phone.
  • In cases where the phone is being searched incident to arrest, the examiner needs to be particularly cautious, as current case law in this area is particularly problematic and in a state of constant change.

Particular questions as to the legal authority to search a cellular phone should be directed to a knowledgeable prosecutor or legal advisor in the examiner’s local area. (Mislan, Casey & Kessler 2010)

The Goal of the Examination:

While the general process used to examine any given cellular phone should be as consistent as possible, the goal of the examination for each phone may be significantly different. It is unlikely that any given forensics lab has either the capability or the capacity to examine every cellular phone that contains data of evidentiary value in every kind of case. For this reason, it can be useful to identify what level of examination is appropriate for any given cellular phone.

The first of two main considerations is who will be responsible for the process of documenting the data; the second is how in-depth the examination needs to be. Among the phones that are submitted to the lab for examination, there will be differences in the goals of each examination based upon the facts and circumstances of the particular case.

In some cases, evidence from cellular phones may be documented in the field, either by hand or photographically. For example, in the interest of returning a victim’s main communication lifeline while still documenting information of evidentiary value, or when documenting evidence in a misdemeanor or minor offense, field documentation is a reasonable alternative to seizing the device. In other cases, it may be sufficient to have an officer or analyst with basic training in the examination of cellular phones perform a quick dump of cellular phone data in the field, specifically for intelligence purposes, using commercially available tools designed for this purpose.

A smaller subset of cellular phones may be submitted for examination with the goal of targeted extraction of data that has evidentiary value. Specifically targeted data, such as pictures, videos, call history, or text messages, may be significant to the investigation while other stored data is irrelevant. It may also be the case that only a certain subset of the data can be examined due to legal restrictions. In any event, limiting the scope of the exam will likely make extraction and documentation of the data less time consuming.

The goal of the exam may alternatively include an attempt to recover deleted data from the memory of the phone. This is only possible if a tool is available for the particular phone that can extract data at a physical level (see the levels detailed at the end of this section). If such a tool is available, the examination will involve traditional computer forensic methods such as data carving and hex decoding, and it may be a time-consuming and technically involved endeavor, as deeper levels of examination necessitate a more complex process.
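
As a toy illustration of the data carving idea mentioned above, the sketch below scans a raw memory dump for JPEG start-of-image and end-of-image markers. It is only a sketch of the concept; real carving tools handle fragmentation, validation and false positives, which this deliberately ignores.

```python
def carve_jpegs(dump: bytes) -> list:
    """Return (start, end) byte offsets of candidate JPEGs in a raw dump.

    A naive carve: find each JPEG start-of-image marker (FF D8 FF)
    and pair it with the next end-of-image marker (FF D9).
    """
    offsets = []
    start = dump.find(b"\xff\xd8\xff")
    while start != -1:
        end = dump.find(b"\xff\xd9", start)
        if end == -1:
            break  # unterminated candidate; stop scanning
        offsets.append((start, end + 2))
        start = dump.find(b"\xff\xd8\xff", end)
    return offsets
```

Each returned pair can then be used to slice the candidate image out of the dump for inspection.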


Figure 2: Goal of the Exam

The goal of the exam can make a significant difference in what tools and techniques are used to examine the phone. Time and effort spent initially on identification of the goal of the exam with the lead investigator in the case can lead to increased efficiency in the examination process.


These types of realities should be addressed in training, and the triage process related to the initial submission of cellular phones for examination should be based upon the individual circumstances and severity of the case.

The Make, Model and Identifying Information for the Cellular Phone:

As part of the examination of any cellular phone, the identifying information for the phone itself should be documented. This enables the examiner not only to identify a particular phone at a later time, but also to determine which tools might work with the phone, as most cellular phone forensic tools provide lists of supported phones based on make and model. For all phones, the manufacturer, model number, carrier, and current phone number associated with the cellular phone should be identified and documented.

Depending upon the cellular phone technology involved, additional identifying information should be documented, if available, as follows:

CDMA cellular phones:

The Electronic Serial Number (ESN) is located under the battery of the cellular phone. It is a unique 32-bit number assigned to each mobile phone on the network. The ESN may be listed in decimal (11 digits) and/or hexadecimal (8 hex digits). The examiner should be aware that the hex version of the ESN is not a direct numeric conversion of the decimal value. An ESN converter can be found at http://www.elfqrin.com/esndhconv.html.
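
The reason the two forms do not match as a simple base conversion is that the decimal ESN is really two fields: a 3-digit manufacturer code followed by an 8-digit serial number, each converted to hex independently. A minimal sketch of the arithmetic (not any particular tool):

```python
def esn_dec_to_hex(esn_dec: str) -> str:
    # Decimal ESN = 3-digit manufacturer code + 8-digit serial number.
    mfr, serial = int(esn_dec[:3]), int(esn_dec[3:])
    # Each part converts independently, which is why the hex ESN is
    # not a direct base conversion of the full 11-digit decimal value.
    return f"{mfr:02X}{serial:06X}"

def esn_hex_to_dec(esn_hex: str) -> str:
    # Hex ESN = 2 hex digits (manufacturer) + 6 hex digits (serial).
    mfr, serial = int(esn_hex[:2], 16), int(esn_hex[2:], 16)
    return f"{mfr:03d}{serial:08d}"
```

For example, a decimal ESN of 13016000000 (manufacturer 130, serial 16000000) corresponds to the hex ESN 82F42400.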

The Mobile Equipment ID (MEID), also found under the battery cover, is a 56-bit number that replaced the ESN due to the limited supply of 32-bit ESNs. The MEID is listed in hex: the first byte is a regional code, the next three bytes are a manufacturer code, and the remaining three bytes are a manufacturer-assigned serial number. CDMA phones do not generally contain a Subscriber Identity Module (SIM) card, but some newer hybrid phones contain dual CDMA and GSM technology and can be used on either CDMA or GSM networks. These dual-technology phones include a slot for a SIM card, and the identifying information under the battery may list an IMEI number in addition to the ESN/MEID number.
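
The byte layout described above translates directly into slicing the 14 hex digits of an MEID into its three fields. A sketch (the example MEID in the comment is made up for illustration):

```python
def parse_meid(meid_hex: str) -> dict:
    # 14 hex digits = 56 bits: 1-byte region code,
    # 3-byte manufacturer code, 3-byte serial number.
    return {
        "RegionCode": meid_hex[:2],
        "ManufacturerCode": meid_hex[2:8],
        "SerialNumber": meid_hex[8:14],
    }

# e.g. parse_meid("A10000009D1234") splits into
# region "A1", manufacturer "000000", serial "9D1234".
```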

CDMA phones also have two other identifying numbers, namely, the Mobile Identification Number (MIN) and the Mobile Directory Number (MDN). The MIN is a carrier-assigned, carrier-unique 34-bit (10-digit) telephone number. When a call is placed, the phone sends the ESN and MIN to the local tower. The MDN is the globally-unique telephone number of the phone. Prior to Wireless Number Portability, the MIN and MDN were the same, but in today’s environment the customer can keep their phone number (MDN) even if they change carriers.

GSM cellular phones:

The International Mobile Equipment Identifier (IMEI) is a unique 15-digit number that identifies a GSM cellular phone handset on its network. This number is generally found under the battery of the cellular phone. The first 8 digits are the Type Allocation Code (TAC) and the next 6 digits are the Device Serial Number (DSN). The final digit is a check digit computed with the Luhn algorithm, although it is often reported as 0 when transmitted over the network.
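
The IMEI check digit is computed over the first 14 digits with the standard Luhn algorithm, which is handy for sanity-checking an IMEI documented during intake. A minimal sketch:

```python
def imei_check_digit(first14: str) -> int:
    # Luhn: double every second digit from the right, sum the digits
    # of the results, and choose the check digit that makes the total
    # a multiple of 10.
    total = 0
    for i, ch in enumerate(reversed(first14)):
        d = int(ch)
        if i % 2 == 0:  # these positions are doubled once the
            d *= 2      # check digit is appended on the right
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10
```

For example, imei_check_digit("49015420323751") returns 8, giving the full 15-digit IMEI 490154203237518.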

On a GSM phone, there will be at least a Subscriber Identity Module (SIM) card slot, which is also generally located under the battery. The SIM card may be branded with the name of the network to which the SIM is registered. Also located on the SIM card is the Integrated Circuit Card Identification (ICCID), an 18 to 20 digit number (10 bytes) that uniquely identifies each SIM card. The ICCID is tied to the International Mobile Subscriber Identity (IMSI), typically a 15-digit number stored electronically within the SIM and consisting of three parts: the Mobile Country Code (MCC; 3 digits), the Mobile Network Code (MNC; 3 digits in the U.S. and Canada, and 2 digits elsewhere), and the Mobile Station Identification Number (MSIN; 9 digits in the U.S. and Canada, and 10 digits elsewhere). The IMSI can be obtained either through analysis of the SIM or from the carrier.
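
Splitting an IMSI into its three components is simple string slicing once the regional MNC length is known. A sketch (the example IMSI below is illustrative, not a real subscriber):

```python
def parse_imsi(imsi: str, mnc_digits: int = 3) -> dict:
    # MCC is always the first 3 digits; MNC length varies by region
    # (3 digits in the U.S. and Canada, 2 in most other regions).
    return {
        "MCC": imsi[:3],
        "MNC": imsi[3:3 + mnc_digits],
        "MSIN": imsi[3 + mnc_digits:],
    }
```

For a North American IMSI such as 310150123456789, this yields MCC 310, MNC 150 and MSIN 123456789.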

The Mobile Station International Subscriber Directory Number (MSISDN) is the phone’s globally unique telephone number of up to 15 digits. The MSISDN follows the International Telecommunication Union (ITU) Recommendation E.164 telephone numbering plan, composed of a 1-3 digit country code followed by a country-specific number. In North America, the first digit is a 1, followed by a 3-digit area code.

iDen cellular phones:

iDen cellular phones contain an International Mobile Equipment Identity (IMEI) that identifies an iDen cellular phone on its network. This number is generally found under the battery of the cellular phone. iDen cellular phones also contain SIM cards, with the identifying information described above, though they are based on different technology than GSM SIM cards and are not compatible with GSM cellular phones.

Unique to iDen cellular phones is the Direct Connect Number (DCN), also known as the MOTOTalk ID, Radio-Private ID or iDen Number. The DCN is the identifying number used to communicate directly device-to-device with other iDen cellular phones. This number is formatted as ###*###*#####, where the first three digits are the Area ID (the region of the home carrier’s network), the next three digits are the Network ID (the specific iDen carrier), and the last five digits are a subscriber identification number that sometimes corresponds to the last five digits of the cellular phone number. (Punja & Mislan, 2008)
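
Because the DCN uses "*" as a field separator, splitting it into its three components is trivial. A sketch (the example DCN is made up):

```python
def parse_dcn(dcn: str) -> dict:
    # DCN format: AreaID*NetworkID*SubscriberID (###*###*#####).
    area, network, subscriber = dcn.split("*")
    return {"AreaID": area, "NetworkID": network, "SubscriberID": subscriber}

# e.g. parse_dcn("148*123*45678") -> Area ID 148, Network ID 123,
# subscriber identification number 45678.
```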

Removable / External Data Storage:

Many cellular phones currently on the market include the ability to store data to a removable storage device such as a TransFlash / microSD memory expansion card. In the event that such a card is installed in a cellular phone that is submitted for examination, the card should be removed by the examiner and processed using traditional digital forensics techniques. Processing a data storage card with cellular phone forensic tools while the card remains installed in the phone may result in the alteration of date and time stamp data for files located on the card.

Additionally, cellular phones may allow for the external storage of data within network based storage areas accessible to the phone’s user by computer. Accessing this data, which is stored on the cellular phone service provider’s network, may require further legal authority and is generally beyond the scope of the examination of a cellular phone handset. However, the potential existence of network based data storage should be taken into account by the examiner.

Many feature phones and smart phones are also designed to sync with a user’s computer to facilitate access to and transfer of data to and from the cellular phone. Full backups of the data from a phone may be found on the phone owner’s computer. The potential for the existence of these alternative storage areas should be considered by the examiner as additional sources of data originating from cellular phones.

Other Potential Evidence Collection:

Prior to beginning examination of a cellular phone, consideration should be given to whether other evidence collection issues exist. Cellular phones may be good sources of DNA, fingerprint or other biological evidence, and such evidence should be collected prior to the examination of the phone to avoid contamination issues.

Preparation PHASE

Within the Identification phase of the process, the examiner has already engaged in significant preparation for the examination of the phone. However, the preparation phase involves specific research regarding the particular phone to be examined and the appropriate tools to be used during the examination, as well as preparation of the examination machine to ensure that all of the necessary equipment, cables, software and drivers are in place for the examination.

Once the make and model of the phone have been identified, the examiner can then research the specific phone to determine what available tools are capable of extracting the desired data from the phone. Resources such as phonescoop.com and mobileforensicscentral.com can be invaluable in identifying information about cellular phones and what tools will work to extract and document the data from specific phones. The SEARCH toolbar (available as a free download from www.search.org) contains additional and regularly updated resources for cellular phone examinations.

Appropriate Tools:

The tools that are appropriate for the examination of a cellular phone will be determined by factors such as the goal of the examination, the type of cellular phone to be examined and the presence of any external storage capabilities.

A matrix, such as the one shown below, of the tools available to the examiner and the general phone technologies (GSM, CDMA, iDen, SIM card) each tool is compatible with may also be helpful.

 

Tool                     | CDMA | GSM | iDen | SIM | Logical Dump | Physical Dump
-------------------------|------|-----|------|-----|--------------|---------------
BitPim                   |  X   |     |      |     |      X       |
Data Pilot Secure View 2 |  X   |  X  |      |     |              |
Paraben Device Seizure   |  X   |  X  |      |  X  |      X       |
SIMCon                   |      |     |      |  X  |              |
iDen Media Manager       |      |     |  X   |     |              |
Manufacturer / Other     |  X   |  X  |  X   |  X  |              |
Cellebrite               |  X   |  X  |  X   |  X  |      X       |       X
CellDEK                  |  X   |  X  |  X   |  X  |      X       |       X
Oxygen Forensic Suite    |  X   |  X  |      |  X  |              |
XRY / XACT               |  X   |  X  |  X   |  X  |      X       |       X

Figure 3: Cellular Phone tool matrix (Kessler, 2008)

Cellular Phone Tool Leveling System:

When identifying the appropriate tools for the analysis of cellular phones, a useful paradigm is the Cell Phone Tool Leveling System. (Brothers, 2009) The tool leveling system is designed to categorize mobile phone and GPS forensic analysis tools by the depth to which they are capable of accessing data on a given device. As you move up the pyramid (generally):

· Methods become more “forensically sound”

· Tools become more expensive

· Methods become more technical

· Analysis takes longer

· More training is required

· Methods become more invasive

 


Figure 4: Cellular Phone Tool Leveling Pyramid – (Brothers 2009)

1. Manual Analysis – physical analysis of the phone involving manual manipulation of the keypad and photographic documentation of the data displayed on the screen.

2. Logical Analysis – connect a data cable to the handset and extract data using AT, BREW or similar commands in a client/server architecture.

3. Physical Analysis (Hex Dump) – push a boot loader into the phone, dump the memory, and analyze the resulting memory dump.

4. Physical Analysis (Chip-Off) – remove the memory chip from the device and read it in either a second phone or an EEPROM reader.

5. Physical Analysis (Micro Read) – use an electron microscope to view the state of memory.

In general, examinations gather more detailed information and take more time as one advances through the levels.

Other resources that should not be overlooked are cellular phone manufacturers’ and carriers’ websites. These can contain links to electronic versions of manuals for almost every make and model of phone, as well as cable and phone driver downloads. Manufacturers will also sometimes provide free software designed for users to access and synchronize the data on their phones with their computers. While these tools are not designed specifically for extraction and documentation of evidence, they may be useful when other tools do not work. Caution should be used when utilizing these tools, as they are generally designed to access and write data back to a cellular phone.

Isolation PHASE

Cellular phones are by design intended to communicate via cellular phone networks. They are also sometimes capable of connecting to each other and to other networks via Bluetooth, infrared and wireless (WiFi) network capabilities. For this reason, isolation of the phone from these communication sources is important prior to examination. Isolation of the phone prevents the addition of new data to the phone through incoming calls and text messages as well as the potential destruction of data through a kill signal or accidental overwriting of existing data as new calls and text messages come in.

Isolation also prevents overreaching the legal authority of the warrant or consent that covers data on the phone. If the phone is isolated from the network, the examiner cannot accidentally access voicemail, email, Internet browsing history or other data that may be stored on the service provider’s network rather than on the phone itself.

Isolation of a cellular phone in the field can be accomplished through the use of Faraday bags or radio frequency shielding cloths, which are specifically designed for this purpose. Other available items such as arson cans or several layers of tinfoil may also be used to isolate some cellular phones. One problem with these isolation methods, however, is that once they are employed it becomes difficult or impossible to work with the phone, since the examiner can neither see the screen nor operate the keypad through them. Faraday tents, rooms and enclosures exist, but are cost prohibitive.

Additionally, some laboratories within the US federal government may use signal jamming devices to prevent radio signals from being sent or received in a given area. The use of such devices is illegal for many organizations, and they should only be employed after verifying their legality for a given organization. Additional information about the FCC exceptions can be found at www.fcc.gov.

Another viable option is to wrap the cellular phone in radio frequency shielding cloth and to then place the phone into Airplane mode (Also known as Standalone, Radio Off, Standby or Phone Off Mode). Instructions for how to place a phone into Airplane mode can be found in the manufacturer’s user manual for the particular make and model of cellular phone.

Unfortunately, not all phones have an Airplane mode, and sometimes even the most seemingly foolproof isolation methods can fail. Even if a cellular phone is successfully isolated from all networks, user data can still be affected if automatic functions are set to occur, such as alarms or appointment announcements. If these situations arise the examiner should document their attempts to isolate the phone and whether any incoming calls, text messages or other data transmissions occur during the course of the examination.

Processing PHASE

Once the phone has been isolated from the cellular network and any other communication networks, the actual processing of the phone may begin. The appropriate tools to achieve the goal of the examination have been identified in the steps described previously, and they can now ideally be used to extract the desired data from the phone.

Any installed data storage/memory cards should be removed from the cellular phone and processed separately using traditional computer forensics methods, because accessing data stored on these cards through use or processing of the cellular phone may alter date and time information for files located on the card. Similarly, SIM cards should be processed separately from the cellular phone they are installed in to preserve the integrity of the data contained on the SIM card.

Consideration should be given to the order in which software and hardware tools are used during the examination of the cellular phone. Consistency in that order has advantages: it may help the examiner recall later which tools were used and when. Depending on the goals of the exam, it may also make sense to use more intrusive tools first or last. For example, if the goal is to extract deleted information from the physical memory of the phone, starting the examination with a physical dump of the memory (if tools for this function are available) would make more sense than extracting individual files or the file system of the phone.

Verification PHASE

After processing the phone, it is imperative that the examiner engages in some sort of verification of the accuracy of the data extracted from the phone. Unfortunately, it is not unusual for cellular phone tools to erroneously or incompletely report data or to have different tools report conflicting information. Verification of extracted data can be accomplished in several ways.

Comparison of Extracted Data to the Handset

Comparison of the extracted data to the handset simply means checking to be sure the data which was extracted from the cellular phone matches the data displayed by the phone itself. This is the only authoritative way to ensure that the tools are reporting the phone information correctly.

Use of More than One Tool, Compare Results

Another way to ensure the accuracy of extracted data is to use more than one tool to extract data from the cellular phone and to cross verify the results by comparing the data reported from tool to tool. If there are inconsistencies, the examiner should use other means to verify the accuracy of the data extracted from the phone. Even if two tools report information consistently, verification via manual inspection of the handset is authoritative.

Use of Hash Values

If file system extraction is supported, traditional forensic tools can be used for verification of extracted data in several ways. The forensic examiner can extract the file system of the cell phone initially, and then hash the extracted files. Any individually extracted files can then be hashed and checked against the originals to verify the integrity of individual files.

Alternatively, the examiner could extract the file system of the cell phone initially, perform the examination and then extract the file system of the phone a second time. The two file system extractions can then be hashed and the hash values compared to see what data on the phone, if any, has been altered during the examination process. Any changed files should then be examined to determine if they are system files or user files to potentially determine the reason for the changes to the files. (Murphy, 2009) It should be noted that hashing only validates the data as it exists after it has left the phone. Cases where the data extraction process actually modifies data have been documented in other papers. (Ayers, Dankar & Mislan, 2009)
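
The before-and-after comparison described above can be sketched with standard library hashing. This is only a sketch of the technique, not tied to any forensic tool; the function names and directory layout are illustrative:

```python
import hashlib
from pathlib import Path

def hash_tree(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    digests = {}
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            digests[p.relative_to(root).as_posix()] = (
                hashlib.sha256(p.read_bytes()).hexdigest()
            )
    return digests

def changed_files(before: dict, after: dict) -> list:
    """Files present in both extractions whose hashes differ."""
    return sorted(f for f in before if f in after and before[f] != after[f])
```

Running hash_tree over the first and second file system extractions and passing the results to changed_files produces the list of files altered during the examination, which can then be reviewed to determine whether they are system files or user files.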

In some cases, a combination of verification techniques may be employed to validate the integrity of the data extracted from the phone.

Documentation & Reporting PHASE

Documentation of the examination should occur throughout the process in the form of contemporaneous notes regarding what was done during the examination. Examination worksheets can be helpful in the examination process to ensure that basic information is recorded.

The examiner’s notes and documentation should include information such as:

· The date and time the examination was started

· The physical condition of the phone

· Pictures of the phone and individual components (e.g., SIM card and memory expansion card) and the label with identifying information

· The status of the phone when received (off or on)

· Make, model, and identifying information

· What tools were used during the examination

· What data was documented during the examination

Most cellular phone tools include reporting functions, but these may not be sufficient for documentation needs. At times, cellular phone tools may report inaccurate information such as the wrong ESN, MIN / MDN numbers or model, or erroneous date and time data, so care must be taken to document the correct information after data verification. For law enforcement purposes, the process used to extract data from the phone, the kinds of data extracted and documented, and any pertinent findings should be accurately documented in reports. Even if the examiner is successful in extracting the desired data using available tools, additional documentation of the information through photographs may be useful, especially for court presentation purposes.

Presentation PHASE

Consideration should be given throughout the examination as to how the extracted and documented information can be clearly presented to another investigator, a prosecutor, or a court. In many cases, the recipient may prefer to have the extracted data in both paper and electronic format so that call history or other data can be sorted or imported into other software for further analysis.

The investigator may also want to provide reference information regarding the source of date and time information, EXIF data extracted from images, or other data formats so that recipients of the data are better able to understand the information.

For court purposes, pictures or video of the data as it existed on the cellular phone may be useful or compelling as exhibits. Extracted text messages may be great evidence, but pictures of the same text messages may be more familiar and visually compelling to a jury.

It is often very useful to present a series of pictures of text messages and call history logs in chronological order via a simple PowerPoint® presentation so that the progression of communications is shown clearly to the audience, whether the audience is an investigator, prosecutor, or jury. This is especially effective if there are a number of cellular phones involved in a case.

Archiving PHASE

Preservation of the data extracted and documented from the cellular phone is an important part of the overall process. It is necessary to retain the data in a useable format for the ongoing court process, future reference, and for record keeping requirements. Some cases may endure for many years before a final resolution, and most jurisdictions require that data be retained for varying lengths of time for the purposes of appeals.

Due to the proprietary nature of the various tools on the market for the extraction and documentation of cell phone data, consideration should be given to the ability to access saved data at a later date. If possible, store data in both proprietary and non-proprietary formats on standard media so that the data can be accessed later even in the event that the original software tool is no longer available. It may also be a good practice to retain a copy of the tool itself to facilitate the viewing of the data at a later date.

Conclusion

With the growing demand for examination of cellular phones and mobile devices, a need has also developed for process guidelines for the examination of these devices. While the specific details of the examination of each device may differ, the adoption of consistent and well documented examination processes will assist the examiner in ensuring that the evidence extracted from each phone is well documented and that the results are repeatable and defensible in court. The information in this document is intended to be used as a guide for forensic examiners and digital investigators in the development of processes that fit the needs of their workplace.

The author wishes to thank the following individuals for their thoughtful and insightful additions as well as their assistance in reviewing and editing the content of this document:

· Richard Ayers – National Institute of Standards and Technology

· Sam Brothers – US Customs and Border Protection

· Richard Gilleland – Sacramento Police Department

· Michael Harrington –

· Eric Huber – A Fistful of Dongles

· Gary Kessler – Gary Kessler Associates

· Andrew Muir – Intern, Madison Police Department

· Tim O’Shea – US Attorney’s Office, Western District of Wisconsin

Bibliography

Brothers, S. (2009). How Cell Phone "Forensic" Tools Actually Work – Proposed Leveling System. Mobile Forensics World 2009. Chicago, Illinois.

Ayers, R., Dankar, A., & Mislan, R. (2009). Hashing Techniques for Mobile Device Forensics. Small Scale Digital Device Forensics Journal, 1-6.

Kessler, G. (2010). Cell Phone Analysis: Technology, Tools, and Processes. Mobile Forensics World. Chicago: Purdue University.

Mislan, R.P., Casey, E., & Kessler, G.C. (2010). The Growing Need for On-Scene Triage of Mobile Devices. Digital Investigation, 6(3-4), 112-124.

Punja, S., & Mislan, R. (2008). Mobile Device Analysis. Small Scale Digital Device Forensics Journal, 2(1), 2-4.

Murphy, C. (2009). The Fraternal Clone Method for CDMA Cell Phones. Small Scale Digital Device Forensics Journal, 4-5.

Friday, July 16, 2010

Stop, Children, What’s That Sound?

In a previous post, I outed myself as an unrepentant SANS cheerleader.  To expand a bit on that full disclosure, it would be appropriate to point out that I will be acting as Rob Lee’s teacher’s assistant for SEC408 at SANS Network Security 2010 which will be held in Las Vegas from September 20th through the 25th.  I have a passion for teaching and presenting so I’m looking forward to this opportunity.

With that out of the way, I recently completed both SEC408 and SEC508.  I won’t bother with a review of either course because you can guess what I thought of them.  I think the SEC408/508 material is some of the best digital forensics training that I’ve ever run across.   I consider SEC408 and SEC508 to essentially be two parts of the same class.  I would strongly encourage even those who are experienced forensicators to consider taking SEC408 before taking SEC508.  SANS has put together a very nice assessment test for people to determine what courses they would best benefit from.  While it’s entirely possible that someone could already have SEC408 knowledge and not need to take the course before 508, I learned quite a bit from the SEC408 course.

SEC408 provided me with additional knowledge in areas that I already had a pretty decent grasp of such as browser forensics.  It was an excellent class that helped me sharpen my edge in forensic fundamentals.  I consider SEC508 to be a transformational experience where I was given entirely new tools that I have been using with great enthusiasm now that I have them in my arsenal. The tool that I want to blog about today is what Rob Lee accurately calls the Super Timeline.

Making Use of a Super Timeline

I won’t go over how to create a Super Timeline since Rob has already covered that at a high level on the SANS Forensics Blog. What I’ve been working on recently is how to best make use of the resulting timeline. I have also discovered some interesting artifacts that never occurred to me to consider as part of a timeline.

What I’ve learned is that creating a Super Timeline is only the beginning of timeline analysis. Because the Super Timeline method captures so many time stamps, it is likely that a Super Timeline will contain too many entries to manually review line by line, especially if an examiner creates a timeline for an entire drive image. The challenge is to pin down what portions of that timeline are relevant to the examination at hand.

What I recommend is to use more tactical forensic tools to pull out specific dates and times that can then be viewed in greater detail using the Super Timeline. A classic forensic examination is one where an examiner is asked to determine whether someone removed information such as intellectual property from a computer using methods such as email or a USB device. The Super Timeline is an invaluable tool for this sort of examination, but you have to know where to look on the timeline to get the data of interest. Tools that can help an examiner do this include Digital Detective’s Net Analysis and HSTEX, Harlan’s Reg Ripper, and keyword searching via spreadsheet programs such as Excel.

I like the Net Analysis and HSTEX combo and I’ve been using both tools for many years. Craig Wilson was recently awarded a well-deserved Forensic 4cast Lifetime Achievement Award. An examiner can take the latest version of HSTEX and use it to extract web browser history from an image. If a Windows operating system is being examined, the Internet Explorer history will be of great interest: the examiner can load the HSTEX results into Net Analysis and then filter on terms like “file” to show just file access entries, or terms like “attach” to find evidence of files being uploaded to or downloaded from something such as web based email. The examiner can then take the date and time information for specific events of interest and refer to the Super Timeline to get a clearer picture of the events that surrounded that point in time.

Harlan has been doing some great work in the area of registry forensic research and tool development. Harlan’s Reg Ripper tool is one that every examiner should have in their tool box, and it’s Harlan’s regtime.pl tool that provides registry date and time data in the creation of a Super Timeline. For example, using the Reg Ripper tool to determine what types of USB devices have been connected to a system allows the examiner to then search for device-specific keywords on the Super Timeline.

Super Timelines are designed to be loaded into a spreadsheet such as Microsoft Excel.  These spreadsheets can also be used to help an examiner zero in on specific events through keyword searching.  A keyword such as “USB” can help determine when a USB-specific event occurred in the timeline.
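The same keyword pass can be scripted when the timeline gets too big for Excel to handle comfortably.  Here's a minimal Python sketch; the function name is mine and the column names are assumptions, so adjust them to match whatever your timeline export actually uses:

```python
import csv

def filter_timeline(path, keywords, column="desc"):
    """Yield super timeline rows whose description column contains any
    keyword (case-insensitive) -- mimicking an Excel text filter."""
    keywords = [k.lower() for k in keywords]
    with open(path, newline="", encoding="utf-8", errors="replace") as fh:
        for row in csv.DictReader(fh):
            text = (row.get(column) or "").lower()
            if any(k in text for k in keywords):
                yield row
```

Something like `for hit in filter_timeline("supertimeline.csv", ["usb", "usbstor"])` gets you just the USB-related events, and the dates and times from those hits are the jumping-off points back into the full timeline.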

One of the added bonuses that I've discovered from using Super Timelines is that they've shown me new artifacts to be aware of during an examination.  For example, while examining a recent Super Timeline I saw the last accessed times being updated on the .wav files for the sounds that are made when a USB device is inserted or removed.  This is a valuable thing to keep in mind when trying to determine what a user did on a particular computer.  When a user interacts with an operating system GUI like Windows, certain actions can result in sound files playing, and that can result in the last accessed time stamps of those files being updated.
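If you want to sweep those sound files quickly, a throwaway sketch like this will do it.  Assumptions here: the evidence image is mounted read-only with access times preserved, the mount point is hypothetical, and the stock Windows media directory is where the notification sounds live on your particular system:

```python
import os
from datetime import datetime, timezone

def last_access_times(directory, suffix=".wav"):
    """Report the last-accessed time of each matching file in a directory,
    e.g. the Windows hardware insert/remove notification sounds."""
    times = {}
    for name in sorted(os.listdir(directory)):
        if name.lower().endswith(suffix):
            st = os.stat(os.path.join(directory, name))
            times[name] = datetime.fromtimestamp(st.st_atime, tz=timezone.utc)
    return times

# Hypothetical mount point of the evidence image:
# for name, atime in last_access_times("/mnt/evidence/Windows/Media").items():
#     print(atime.isoformat(), name)
```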

Twitter Update

I have decided to create a separate unprotected Twitter account for the blog called @AFoDBlog, which will be dedicated exclusively to alerting readers to new blog posts and to passing along digital forensic content that I think will be of interest.  It's intended to be a low-traffic feed that emphasizes quality over quantity.  Since it's unprotected, you can see what you are getting into before following it.

I use my protected @ericjhuber account to Tweet about digital forensics. I also use it to socialize with my fellow digital forensic examiners which might not be something that readers care to read about.  Most people continue to follow that account once they start reading it, but I have noticed that some unfollow it.  I assume it’s because they aren’t necessarily interested in reading Ken Pryor and me swapping patrol stories about being bitten by cop hating dogs.  I, of course, think this is riveting stuff, but I understand others might not see it that way.

Tuesday, July 6, 2010

My Precious

SANS Digital Forensics has come up with another great idea which is the introduction of the SANS Institute Digital Forensics Lethal Forensicator Coin.  Rob explains the conditions under which the coin will be awarded over at the SANS Forensic Blog. For reasons I can’t completely explain or fully understand, I have an insatiable desire for one of these coins.  Gollum. Gollum.


The community has plenty of certifications, and there is a continuing debate about the issues of licensing and certification.  That's another post or two for another day, but what we haven't had are these sorts of awards where members of our community can be recognized for their achievements.  That's one of the reasons why I've been supporting Lee Whitfield's efforts with the Forensic 4cast awards.  It's nice to have an avenue where our best and brightest can be recognized by their peers, and Lee has done the community an invaluable service with these awards.

$FILE_NAME Discussion


Harlan has a great contribution to the discussion we've been having here on the blog about the purpose of GUI tools and what examiners use them for in their examinations.  I think I've pretty much beaten the $FILE_NAME aspect into the ground, but Harlan's post and A Thulin's comment on my previous post made me take a step back and ask myself what I use my primary GUI forensic tools for and what I expect out of them.

Some rough definitions are probably in order at this point.  When I talk about primary GUI forensic tools, I'm talking about tools that are designed and marketed as the primary tool for an examination.  These tools are designed to parse common file systems and in many cases are loaded up with other features such as browser forensics, email forensics, file carving and the like.  Tools in this category include EnCase, FTK, X-Ways Forensics, etc.

The tools that I think of as secondary GUI tools are ones like Net Analysis or Cache Back, which are designed to present information in a GUI format and allow an examiner to navigate around a particular bit of data.  These sorts of tools are designed for a limited purpose, such as parsing a particular artifact like web browser logs.  Unlike the primary GUI tools, they aren't designed to be general forensic tools.

The tools that I think of as secondary non-GUI tools are programs like Harlan's Reg Ripper, which aren't designed to allow an examiner to navigate around a particular artifact (as, for example, Registry Viewer does), but are intended to parse a particular bit of data with the goal of providing the examiner a report of its findings.

What do I use my primary GUI forensic tools for?  Not nearly as much as I did when I first started.  The tool that I tend to use the most is still EnCase, and as I thought about Harlan's post I asked myself why that is, even though FTK 3 is arguably a better product.

I think I still prefer EnCase because it feels like a sort of glorified read-only hex “editor” (I put editor in quotes since it's obviously not designed to edit anything) that is easy for me to use when I need to work at the hex level on a disk.  It's what I've “grown up” using, so it's something that I'm very comfortable with for what I do with primary GUI tools.  The rub is that I find myself using FTK 3 more and more these days since it scales much better when it comes to larger data sets and handles a lot of non-file-system artifacts like compressed files much better than EnCase.  If someone were just starting out and could only buy one, I'd tell them to get FTK 3.  I'd hate to have just one, but I suppose I'd make the same decision if I were forced to make the choice.
I just don’t tend to use all of the additional features that are loaded into programs like EnCase and FTK. Browser forensics are a good example. I don’t use FTK or EnCase to parse web history artifacts like index.dat files because there are much better programs out there such as Net Analysis.  I think that’s one of the reasons why I continue to use EnCase.  For the most part, it does what I need it to do when it comes to file system parsing and I have loads of secondary forensic (GUI and otherwise) tools that I can use by just right clicking on a file in EnCase and sending to a tool like Net Analysis.  I’m also using the SIFT Workstation a lot these days because it’s packed with all sorts of tools that greatly enhance my ability to do an examination by doing things like memory forensics and Super Timeline analysis.

I do tend to use FTK 3 for email forensic work and I think that is one of the reasons why FTK 3 is a good value.  You get the general file system forensics along with some decent secondary tools like Registry Viewer and PRTK.  You also get a nice email forensic tool built into FTK 3.  I’ve just never been that enamored with EnCase as an email forensic program largely because I don’t think the GUI works as well for email forensics compared to FTK and because I hate having to reload the email each time I reopen a case in EnCase.  EnCase just scales poorly when it comes to email, but I know Guidance is working hard to catch up based on what I learned from them at CEIC.

So the bottom line is that as I have gained more experience, I’ve relied less on my primary GUI tools and more on a whole host of other specialized tools along with the manual parsing of artifacts.  Since EnCase does a good job parsing file system data and can easily be used as a springboard for the use of other tools, I keep using it especially since it just feels very natural for me when it comes to hex level forensic work.

Wednesday, June 30, 2010

$FILE_NAME Follow Up

After I posted on the lack of NTFS $FILE_NAME data provided by the major GUI forensic tools, there were several great comments left for that post that described a variety of tools from people like Harlan, David Kovar and Mark Menz.  While these are great tools from three forensic gurus, I'm still a bit perplexed as to why the major GUI software tool makers don't just deal with parsing this data head on.

I’ve had at least one person tell me that EnCase could do this with an EnScript.  Of course, EnCase can do a lot of things with EnScripting.  The rub is that I don’t want to use an EnScript for something that should be part of the standard GUI column view along with the $STANDARD_INFORMATION time stamp values.  For example, I want to be able to quickly view the $FILE_NAME information for the files stored in a particular folder or volume for timeline purposes.  One of the primary reasons we use GUI forensic tools like EnCase and FTK is that they serve as overall file system examination tools.  We can use them to examine our evidence from a high level and then decide which of the more specialized tools we wish to employ to drill down on specific artifacts.

I don’t expect EnCase or FTK to do everything for me.  That’s why we have people like Craig Wilson, Rob Lee, Harlan Carvey, Mark McKinnon, Lee Whitfield, Paul Sanderson, Kristinn Gudjonsson and all of the rest of the fantastic forensic tool developers out there who make great tools for specific purposes that complement the major GUI tools.  However, I do expect them to parse basic $MFT record information, which includes the $FILE_NAME time stamps.

Since I made my original post, I discovered that the fine people over at Technology Pathways are doing this, at least with the free version of their Pro Discover tool.  Pro Discover Basic is a very basic GUI forensic tool, but it does what every major GUI tool should do, which is to parse both the $STANDARD_INFORMATION and $FILE_NAME time stamps in glorious column form.

EnCase doesn’t do this at all.  FTK is sort of…kind of…starting to move in this direction.  If you look in the comments section of my previous post on this issue, you’ll see that a couple of Access Data engineers were nice enough to drop by and explain that FTK 3.1 parses this data…sometimes.  I say “sometimes” because it doesn’t do it as part of the normal column view, and it reportedly only shows the data to the examiner if the $FILE_NAME values are different from the $STANDARD_INFORMATION values.  I have no idea why Access Data is making it this complex.  I absolutely do not want this level of hand holding from my forensic tools.  I want to be able to see for myself what the time stamp values are for a given file.  Concealing basic time stamp information from me because they think it’s…I guess…not important isn’t helpful.

If Guidance Software and Access Data think that having the extra $FILE_NAME columns in their standard GUI file system view would somehow confuse the examiner or clutter the interface, then they can turn them off by default and require the examiner to “opt in” to see them.

What am I missing here?

Forensic 4cast Awards

The Forensic 4cast awards are upon us! If you haven’t voted yet, you still have time before the awards presentation at the SANS Forensics and Incident Response Summit on July 8th.  You can also attend the award ceremony for free even if you aren’t attending the summit. Lastly, the fine people over at Disk Labs have sponsored the actual awards which are pretty amazing looking.

Saturday, June 19, 2010

Give Me $FILE_NAME or Give Me Death

I think we’re long past the point as a community where we should be pushing the vendors of our GUI forensic tools to provide us with the $FILE_NAME time values inside of an NTFS $MFT record.  Every tool parses the $STANDARD_INFORMATION time values, but that should no longer be considered the bare minimum for a GUI forensic tool.  Most tools do not provide the $FILE_NAME time values as part of their standard file system navigation experience.  The concern that has been expressed in the past was that adding this information would be confusing to the user.  While I can certainly understand that it might be confusing to an inexperienced or poorly trained examiner, that’s not a good reason for not presenting the information.  If an examiner doesn’t understand how an $MFT record works, then this confusion is a teachable moment that will hopefully prompt the examiner to learn more about the inner workings of an $MFT record.  The information is out there and it’s easily accessible on the Web, through training courses and books.

Yes, I can parse the data manually or by scripting with the various vendor tools.  However, it’s much more useful to me if these time stamps are parsed automatically and presented to me as part of the main user interface experience.
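For the scripting route, here's a rough sketch of what pulling both sets of stamps out of a raw $MFT record looks like in Python.  The function names are mine, the offsets follow the published NTFS attribute layout, and the sketch only handles resident attributes and skips fixup-value handling, so treat it as illustrative rather than production code:

```python
import struct
from datetime import datetime, timedelta, timezone

NTFS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_dt(ft):
    """Convert a 64-bit FILETIME (100ns ticks since 1601-01-01 UTC)."""
    return NTFS_EPOCH + timedelta(microseconds=ft // 10)

def parse_mft_timestamps(record):
    """Pull the four $STANDARD_INFORMATION and four $FILE_NAME time
    stamps out of one raw $MFT record (fixup handling omitted)."""
    results = {}
    # Offset of the first attribute lives at 0x14 in the record header.
    offset = struct.unpack_from("<H", record, 0x14)[0]
    while offset + 8 <= len(record):
        attr_type, attr_len = struct.unpack_from("<II", record, offset)
        if attr_type == 0xFFFFFFFF or attr_len == 0:
            break  # end-of-attributes marker
        non_resident = record[offset + 0x08]
        if not non_resident and attr_type in (0x10, 0x30):
            content = offset + struct.unpack_from("<H", record, offset + 0x14)[0]
            # $STANDARD_INFORMATION stamps start at the content; the
            # $FILE_NAME stamps follow the 8-byte parent directory reference.
            base = content if attr_type == 0x10 else content + 8
            stamps = struct.unpack_from("<4Q", record, base)
            key = "$STANDARD_INFORMATION" if attr_type == 0x10 else "$FILE_NAME"
            results[key] = dict(zip(
                ("created", "modified", "mft_modified", "accessed"),
                (filetime_to_dt(v) for v in stamps)))
        offset += attr_len
    return results
```

Run it across 1024-byte records carved out of a $MFT you've extracted (with something like The Sleuth Kit's icat) and diffing the two sets of stamps is a quick way to spot entries worth a closer look.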

I’m not familiar with all of the forensic tools that are available so I’ll have to rely on other people to let me know what tools might be doing this already. I’ve been using Sleuth Kit more and more these days and it parses everything (istat) because it’s Brian Carrier’s awesome tool.  I heard a long time ago that Pro Discover might present some of this information to the user also, but I’d be curious if someone could verify that for me. Any other tools that are doing this?

What do you think? Am I missing something? Why wouldn’t we want this information presented to us up front in our GUI tools?

Forensic 4cast Awards Voting Has Opened

The nominations have closed for the upcoming Forensic 4cast awards and the voting has started.  SANS announced this week that the awards will be open to everyone, so if you are in the DC area and aren’t attending the SANS Forensic and Incident Response Summit, you can still attend the awards.

New Tools

I’ve been made aware of a couple new forensic tools that I’d like to share with everyone. 

The first one is Defraser, which is a tool by the Netherlands Forensic Institute.  I learned about this tool when I was taking SEC563 at SANSFIRE recently.  This is a carving tool that will recover full and partial video data.  I have just started using it, so I can’t yet speak to how well it works, but I’m excited about the possibilities.

The second tool is called raw2vmdk.  It looks like it’s an alternative to LiveView.  I use LiveView quite a bit and I’m fond of it.  I haven’t tried raw2vmdk, but I would give it a spin if it could do something that LiveView couldn’t do for me.

Tuesday, June 15, 2010

Bacon

This post is about SANS and last week’s SANSFIRE 2010.  It also contains a review of the SEC563 Mobile Device Forensics course that I attended at the conference.
Full Disclosure: I’m a member of the GIAC Advisory Board and an advisor to the GIAC Ethics Council.
Fuller Disclosure: I’m a SANS independent contractor who is writing test questions for the GCFE (GIAC Certified Forensic Examiner) certification.  This is the certification that will be linked to the SEC408 Computer Forensics Fundamentals course.  SANS is nice enough to pay people a little bit of money for this work.  Don’t tell SANS, but I’d do it for free.
Fullest Disclosure: I’m an unrepentant SANS cheerleader.

I Heart SANS

My first SANS experience was in 2004 when I took SEC504 Hacker Techniques, Exploits and Incident Handling from Ed “Skodo Baggins” Skoudis at a smaller SANS event that was held in Phoenix.  I had no idea who Ed was before taking the class, but I certainly knew who he was after the class.  SEC504 with Ed was one of the finest training experiences that I’ve ever had, and I cherish it to this day.  I consider it a transformational experience because it opened my eyes to all of the possibilities in the information security world.  Ed essentially acted as Virgil to my Dante.  Not only was the course content fascinating, but I was amazed at what an incredible instructor Ed was.  Since that course, Ed has been an example to me of just how good an instructor can and should be.

When I took this course with Ed, it was in the days of the old certification model where GCIH candidates were required to complete a white paper before they were allowed to attempt the two tests that were necessary to pass the certification process.  Incident handling was new to me, but I somehow managed to successfully complete the paper and then was faced with the two tests.  The first test covered the incident handling process, and I scored somewhere in the 80s on that test and passed.  The second test dealt with the technical aspects of the course, and I think my score was 78.  I remember it was in the 70s and I was very glad to have achieved that score.  It was a long and difficult process, but completing it was more than worth it.

I recertified over a year ago and scored well on that test.  Because I scored over 90, I was invited to join the GIAC Advisory Board.  After I did so, more of the SANS world opened up to me because I could see the SANS staff interacting with the rest of the Advisory Board through the Board’s email list.  This was a very educational experience because it allowed me to observe Stephen Northcutt and some of the other SANS leadership in action.  Additionally, I recently had the opportunity to serve with Stephen on a project that we are both involved in.  Stephen is the face of SANS since he is the CEO of SANS, holds a position on the GIAC Board (I can’t remember if he’s the chair of the board or not.  I’m sure he told me last week, but my brain can only hold so much new information while being soaked with the SANS knowledge fire hose) and is the President of the SANS Technology Institute.

As best I can tell, Stephen is the person the Dos Equis people modeled their most recent advertising campaign on.  Stephen has got to be a contender for the information security version of the Most Interesting Man in the World(tm).  When he’s not traveling around the globe leading his merry band of SANS people, he does things like write, pontificate, snorkel, sail and live in Hawaii.


SANSFIRE 2010 Review

Last week’s SANSFIRE was my first major SANS conference.  As you can tell from the tone of this post so far, I was not disappointed.  SANS does a great job putting on these conferences and there is a lot of attention to detail.  There were legions of helpful work study facilitators who made everything run smoothly.  The major SANS conferences are a great experience because not only do you get to attend training with the top SANS instructors, but there are a whole host of networking opportunities available to you.  These conferences are attended by a large number of people with very diverse information security backgrounds.

There were plenty of after-hours events to attend, such as the very popular SANS @Night presentations where industry experts gave talks that could be attended by anyone at the conference.  SANS also provided snacks and drinks during the morning and afternoon breaks that kept everyone going.  During one of the evenings early in the conference, they provided free food (very nice hot dogs and pretzels this time) along with a live band and several cash bars.  Day 5 was ice cream day, where the afternoon snack was all sorts of frozen goodies.  One of the nice touches is that they had a cash bar available during the initial registration on Sunday evening and even provided a free drink ticket with the registration packet.  That’s right.  We got a free beer on SANS after we picked up our registration information.  It was a nice touch after more than three hours of slogging through East Coast traffic to get to Baltimore.

One of the things I found was that the SANS instructors are very approachable even if you aren’t taking their class.  I was able to meet in person a lot of the instructors whom I had previously only known through various electronic channels, such as James Tarala, Chad Tilbury and Paul Henry.  I was also able to talk to Ed Skoudis in person after corresponding with him for many years.  I’ve recently started presenting on digital forensics in conference settings, and Ed is always good for a great teaching tip or two.  The SANS staff (both the instructors and the support staff) earn their pay during these conferences because they always have to be “on” in case they run across someone like me after class.


SANS SEC 563 Mobile Device Forensics Review

The class that I was at SANSFIRE to attend was SEC563 Mobile Device Forensics.  Eoghan Casey and Terry Maguire from cmdLabs taught the class.  Eoghan has been the primary person behind the course since its inception.  Thus, those of us who took the class had the benefit of being taught by two very accomplished digital forensic examiners and instructors.  If I had only one word to describe what I thought of this course, I would pick the following word: Bacon.  Not turkey bacon.  That’s undead zombie pseudo-bacon.  We’re talking thick cut smoked bacon.  I like bacon and I liked SEC563.

Putting together a five day mobile device class is a pretty tall order given the current fluid state of the tools and methods.  There isn’t a lot of standardization in the mobile device world given all of the different phones, carriers, operating systems and third party applications.  The computer forensics world is relatively static and mature, at least to the extent that we deal with only a small number of operating systems and file systems.
The course struck a very even balance between lecture content and hands on exercises for the students.  Students are introduced to a wealth of different forensic tools and many of them are used during the practical exercises.  Because there is so much hands on work, the class is limited to no more than 25 students.  

The class was an overview of the mobile device forensics world and provided students the fundamental knowledge to get started by exposing them to the wide variety of tools and methods that are available.  I took this class because I am relatively new to mobile device forensics, and I found that I learned an immense amount.  I wish I had taken this class earlier in my studies because it would have made tool selection and process development much easier.  I came out of the course with a fundamental understanding of how to examine SIM cards and CDMA and GSM phones.  I can’t call myself an expert in mobile device forensics, and it would have been unreasonable to think that even with instructors like Eoghan and Terry I could be brought up to their level in just a week.  However, taking this course is one of the most efficient ways to gain the fundamentals that an examiner would need to pursue mastery of the subject.

This course reinforced my initial impression that mobile device forensics is basically the wild, wild west right now.  There are some useful tools out there, but the state of the tools and methods aren’t nearly as mature as they are in computer forensics.  Eoghan and Terry stressed the need to validate results and to not put all of your faith into one tool.   Manual review of mobile devices is still very necessary in some cases and validation has to be a key concern of an examiner. 

So the bad news is that the state of mobile device forensics is very fluid and complicated.  A lot of hex level examination still needs to be done in cases where tools won’t do the parsing for you. To me, this is also the good news.  I know some examiners hate it, but I enjoy working at the hex level.  It’s not practical to do it as a primary method of examination, but there’s just something I find really fulfilling when I pull a bit of useful evidence out with a hex editor.  If you like this sort of thing, you’re going to love both mobile device forensics and this class.