Wednesday, June 30, 2010

$FILE_NAME Follow Up

After I posted on the lack of NTFS $FILE_NAME data provided by the major GUI forensic tools, several great comments were left on that post describing a variety of tools from people like Harlan, David Kovar and Mark Menz. While these are great tools from three forensic gurus, I’m still a bit perplexed as to why the major GUI software tool makers don’t just deal with parsing this data head on.

I’ve had at least one person tell me that EnCase could do this with an EnScript.  Of course, EnCase can do a lot of things with EnScripting. The rub is that I don’t want to use an EnScript for something that should be part of the standard GUI column view along with the $STANDARD_INFORMATION time stamp values.  For example, I want to be able to quickly view the $FILE_NAME information for the files stored in a particular folder or volume for timeline purposes.  One of the primary reasons we use GUI forensic tools like EnCase and FTK is that they serve as overall file system examination tools.  We can use them to examine our evidence from a high level and then decide which of the more specialized tools we wish to employ to drill down on specific artifacts.
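For anyone who hasn’t dug into a raw $MFT record to see where these values live, the parsing I’m asking for isn’t exotic. Here’s a rough Python sketch of my own (an illustration, not any vendor’s code) that walks the resident attributes of a 1,024-byte record and pulls the four MACB times from both $STANDARD_INFORMATION (attribute type 0x10) and $FILE_NAME (type 0x30). A real parser would also have to apply the record’s update sequence fixups first; I’ve skipped that step for brevity.

```python
import struct
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime(raw: bytes) -> datetime:
    """Convert a 64-bit Windows FILETIME (100-ns ticks since 1601) to a
    UTC datetime. Sub-microsecond precision is truncated."""
    (ticks,) = struct.unpack('<Q', raw)
    return EPOCH + timedelta(microseconds=ticks // 10)

def mft_timestamps(record: bytes) -> dict:
    """Walk the attributes of one raw $MFT record and return the time
    stamps from $STANDARD_INFORMATION ('SI') and $FILE_NAME ('FN').
    NOTE: update sequence fixups are not applied in this sketch."""
    assert record[:4] == b'FILE', 'not an MFT FILE record'
    out = {}
    # Offset 0x14 of the record header points at the first attribute.
    offset = struct.unpack_from('<H', record, 0x14)[0]
    while True:
        attr_type = struct.unpack_from('<I', record, offset)[0]
        if attr_type == 0xFFFFFFFF:          # end-of-attributes marker
            break
        attr_len = struct.unpack_from('<I', record, offset + 4)[0]
        if attr_len == 0:                    # malformed record, bail out
            break
        resident = record[offset + 8] == 0
        if resident and attr_type in (0x10, 0x30):
            content = offset + struct.unpack_from('<H', record, offset + 0x14)[0]
            # $FILE_NAME stores its times after an 8-byte parent directory reference.
            base = content + (8 if attr_type == 0x30 else 0)
            fields = ('created', 'modified', 'mft_modified', 'accessed')
            key = 'SI' if attr_type == 0x10 else 'FN'
            out[key] = {name: filetime(record[base + i*8 : base + i*8 + 8])
                        for i, name in enumerate(fields)}
        offset += attr_len
    return out
```

Run something like that against each record of an exported $MFT and you have everything needed to populate the extra columns I’m asking for.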

I don’t expect EnCase or FTK to do everything for me. That’s why we have people like Craig Wilson, Rob Lee, Harlan Carvey, Mark McKinnon, Lee Whitfield, Paul Sanderson, Kristinn Gudjonsson and all of the rest of the fantastic forensic tool developers out there who make great tools for specific purposes that complement the major GUI tools. However, I do expect them to parse basic $MFT record information which includes $FILE_NAME time stamps.

Since I made my original post, I discovered that the fine people over at Technology Pathways are doing this, at least with the free version of their ProDiscover tool.  ProDiscover Basic is a very basic GUI forensic tool, but it does what every major GUI tool should do, which is to parse both the $STANDARD_INFORMATION and $FILE_NAME time stamps in glorious column form.

EnCase doesn’t do this at all.  FTK is sort of…kind of…starting to move in this direction.  If you look in the comments section of my previous post on this issue, you’ll see that a couple of Access Data engineers were nice enough to drop by and explain that FTK 3.1 parses this data…sometimes. I say “sometimes” because it doesn’t do it as part of the normal column view and it reportedly only shows the data to the examiner if the $FILE_NAME values are different from the $STANDARD_INFORMATION values.  I have no idea why Access Data is making it this complex.  I absolutely do not want this level of hand holding from my forensic tools.  I want to be able to see for myself what the time stamp values are for a given file.  Concealing basic time stamp information from me because they think it’s…I guess…not important isn’t helpful.
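If I understand the engineers correctly, what FTK 3.1 is doing amounts to a difference check along these lines (my speculative reconstruction in Python, not Access Data’s actual code). The comparison itself is genuinely useful as an anomaly flag, since most timestamp-altering tools only rewrite the $STANDARD_INFORMATION values, but it should supplement the raw columns, not replace them:

```python
def timestamp_anomalies(si, fn):
    """Given the $STANDARD_INFORMATION (si) and $FILE_NAME (fn) time
    stamps for one file as dicts of field name -> datetime, return the
    fields where the two attribute sets disagree. A mismatch, especially
    an SI 'created' time earlier than the FN copy, is worth a close look."""
    return {field: {'SI': si[field], 'FN': fn[field]}
            for field in si
            if field in fn and si[field] != fn[field]}
```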

If Guidance Software and Access Data think that having the extra $FILE_NAME columns in their standard GUI file system view would somehow confuse the examiner or clutter the interface, they can turn them off by default and require the examiner to “opt in” to see them.

What am I missing here?

Forensic 4cast Awards

The Forensic 4cast awards are upon us! If you haven’t voted yet, you still have time before the awards presentation at the SANS Forensics and Incident Response Summit on July 8th.  You can also attend the award ceremony for free even if you aren’t attending the summit. Lastly, the fine people over at Disk Labs have sponsored the actual awards which are pretty amazing looking.

Saturday, June 19, 2010

Give Me $FILE_NAME or Give Me Death

I think we’re long past the point as a community where we should be pushing the vendors of our GUI forensic tools to provide us with the $FILE_NAME time values inside of an NTFS $MFT record.  Every tool parses the $STANDARD_INFORMATION time values, but that should no longer be considered the bare minimum for a GUI forensic tool.  Most tools do not provide the $FILE_NAME time values as part of their standard file system navigation experience.  The concern that has been expressed in the past was that adding this information would be confusing to the user.  While I can certainly understand that it might be confusing to an inexperienced or poorly trained examiner, that’s not a good reason for not presenting the information.  If an examiner doesn’t understand how an $MFT record works, then this confusion is a teachable moment that will hopefully prompt the examiner to learn more about the inner workings of an $MFT record.  The information is out there and it’s easily accessible on the Web, through training courses and books.

Yes, I can parse the data manually or by scripting with the various vendor tools.  However, it’s much more useful to me if I can have these time stamps parsed automatically and presented to me as part of the main user interface experience.

I’m not familiar with all of the forensic tools that are available so I’ll have to rely on other people to let me know what tools might be doing this already. I’ve been using the Sleuth Kit more and more these days and it parses everything (via istat) because it’s Brian Carrier’s awesome tool.  I heard a long time ago that ProDiscover might present some of this information to the user also, but I’d be curious if someone could verify that for me. Any other tools that are doing this?

What do you think? Am I missing something? Why wouldn’t we want this information presented to us up front in our GUI tools?

Forensic 4cast Awards Voting Has Opened

The nominations have closed for the upcoming Forensic 4cast awards and the voting has started.  SANS announced this week that the awards will be open to everyone, so if you are in the DC area and aren’t attending the SANS Forensics and Incident Response Summit, you can still attend the awards.

New Tools

I’ve been made aware of a couple new forensic tools that I’d like to share with everyone. 

The first one is Defraser, a tool from the Netherlands Forensic Institute.  I learned about this tool when I was taking SEC563 at SANSFIRE recently.  This is a carving tool that will recover full and partial video data.  I have just started using it so I can’t yet speak to how well it works, but I’m excited about the possibilities.

The second tool is called raw2vmdk.  It looks like it’s an alternative to LiveView.  I use LiveView quite a bit and I’m quite fond of it.  I haven’t tried raw2vmdk, but I would potentially give it a spin if it could do something that LiveView couldn’t do for me.

Tuesday, June 15, 2010

Bacon

This post is about SANS and last week’s SANSFIRE 2010.  It also contains a review of the SEC563 Mobile Device Forensics course that I attended at the conference.
Full Disclosure: I’m a member of the GIAC Advisory Board and an advisor to the GIAC Ethics Council.
Fuller Disclosure: I’m a SANS independent contractor who is writing test questions for the GCFE (GIAC Certified Forensic Examiner) certification. This is the certification that will be linked to the SEC408 Computer Forensics Fundamentals course. SANS is nice enough to pay people who do this work a little bit of money for their work.  Don’t tell SANS, but I’d do it for free.
Fullest Disclosure: I’m an unrepentant SANS cheerleader.

I Heart SANS

My first SANS experience was in 2004 when I took SEC504 Hacker Techniques, Exploits and Incident Handling from Ed “Skodo Baggins” Skoudis at a smaller SANS event that was held in Phoenix.  I had no idea who Ed was before taking the class, but I certainly knew who he was after the class.  SEC504 with Ed was one of the finest training experiences that I’ve ever attended and I cherish the experience to this day.  I consider it a transformational experience because it opened my eyes to all of the possibilities in the information security world.  Ed essentially acted as Virgil to my Dante.  Not only was the course content fascinating, but I was amazed at what an incredible instructor Ed was.  Since that course, Ed has been an example to me of just how good an instructor can and should be.

When I took this course with Ed, it was in the days of the old certification model where GCIH candidates were required to complete a white paper before they were allowed to attempt the two tests that were necessary to pass the certification process.  Incident handling was new to me, but I somehow managed to successfully complete the paper and then was faced with the two tests.  The first test covered the incident handling process and I scored somewhere in the 80s on that test and passed.  The second test dealt with the technical aspects of the course and I think my score was 78.  I remember it was in the 70s and I was very glad to have achieved that score.  It was a long and difficult process, but completing it was more than worth it.

I recertified over a year ago and scored well on that test.  Because I scored over 90, I was invited to join the GIAC Advisory Board.  After I did so, I had more of the SANS world opened up to me because I could see the SANS staff interacting with the rest of the Advisory Board through the Board’s email list.  This was a very educational experience because it allowed me to observe Stephen Northcutt and some of the other SANS leadership in action.  Additionally, I recently had the opportunity to serve with Stephen on a project that we are both involved in.  Stephen is the face of SANS since he is the CEO of SANS, holds a position on the GIAC Board (I can’t remember if he’s the chair of the board or not. I’m sure he told me last week, but my brain can only hold so much new information while being soaked with the SANS knowledge fire hose) and is the President of the SANS Technology Institute.

The best I can tell is that Stephen is the person who the Dos Equis people modeled their most recent advertising campaign on. Stephen has got to be a contender for the information security version of the Most Interesting Man in the World(tm). When he’s not traveling around the globe leading his merry band of SANS people, he does things like write, pontificate, snorkel, sail and live in Hawaii.


SANSFIRE 2010 Review

Last week’s SANSFIRE was my first major SANS conference.  As you can tell from the tone of this post so far, I was not disappointed.  SANS does a great job putting on these conferences and there is a lot of attention to detail.  There were legions of helpful work study facilitators who made everything run smoothly.  The major SANS conferences are a great experience because not only do you get to attend training with the top SANS instructors, but there are a whole host of networking opportunities available to you. These conferences are attended by a large number of people with very diverse information security backgrounds.  There were plenty of after-hours events to attend such as the very popular SANS @Night presentations where industry experts gave talks that could be attended by anyone at the conference.  SANS also provided snacks and drinks during the morning and afternoon breaks that kept everyone going.  During one of the evenings early in the conference, they provided free food (very nice hot dogs and pretzels this time) along with a live band and several cash bars. Day 5 was ice cream day where the afternoon snack was all sorts of frozen goodies.  One of the nice touches is that they had a cash bar available during the initial registration on Sunday evening and even provided a free drink ticket with the registration packet.  That’s right.  We got a free beer on SANS after we picked up our registration information.  It was a nice touch after more than three hours of slogging through East Coast traffic to get to Baltimore.

One of the things I found was that the SANS instructors are very approachable even if you aren’t taking their class.  I was able to meet a lot of the instructors who I had only known through various electronic methods, such as James Tarala, Chad Tilbury and Paul Henry.  I was also able to talk to Ed Skoudis in person after corresponding with him for many years.  I’ve recently started presenting on digital forensics in conference settings and Ed is always good for a great teaching tip or two. The SANS staff (both the instructors and the support staff) earn their pay during these conferences because they always have to be “on” in case they run across someone like me after class.


SANS SEC 563 Mobile Device Forensics Review

The class that I was at SANSFIRE to attend was SEC563 Mobile Device Forensics.  Eoghan Casey and Terry Maguire from cmdLabs taught the class. Eoghan has been the primary person behind the course since its inception.  Thus, those of us who took the class had the benefit of being taught by two very accomplished digital forensic examiners and instructors.  If I had only one word to describe what I thought of this course, I would pick the following word: Bacon.  Not turkey bacon.  That’s undead zombie pseudo-bacon. We’re talking thick cut smoked bacon.  I like bacon and I liked SEC563.

Putting together a five day mobile device class is a pretty tall order given the current fluid state of the tools and methods. There isn’t a lot of standardization in the mobile device world given all of the different phones, carriers, operating systems and third party applications.  The computer forensics world is relatively static and mature, at least to the extent that we deal with only a relatively small number of operating systems and file systems.
The course struck a very even balance between lecture content and hands on exercises for the students.  Students are introduced to a wealth of different forensic tools and many of them are used during the practical exercises.  Because there is so much hands on work, the class is limited to no more than 25 students.  

The class was an overview of the mobile device forensics world and provided students the fundamental knowledge to get started by exposing them to the wide variety of tools and methods that are available.  I took this class because I am relatively new to mobile device forensics and I found that I learned an immense amount.  I wish I had taken this class earlier in my studies because it would have made tool selection and process development much easier.  I came out of the course with a fundamental understanding of how to examine SIM cards, CDMA and GSM phones.  I can’t call myself an expert in mobile device forensics and it would have been unreasonable to think that even with instructors like Eoghan and Terry that I could be brought up to their level in just a week.  However, taking this course is one of the most efficient ways to gain the fundamentals that an examiner would need to pursue mastery of the subject.

This course reinforced my initial impression that mobile device forensics is basically the wild, wild west right now.  There are some useful tools out there, but the state of the tools and methods aren’t nearly as mature as they are in computer forensics.  Eoghan and Terry stressed the need to validate results and to not put all of your faith into one tool.   Manual review of mobile devices is still very necessary in some cases and validation has to be a key concern of an examiner. 

So the bad news is that the state of mobile device forensics is very fluid and complicated.  A lot of hex level examination still needs to be done in cases where tools won’t do the parsing for you. To me, this is also the good news.  I know some examiners hate it, but I enjoy working at the hex level.  It’s not practical to do it as a primary method of examination, but there’s just something I find really fulfilling when I pull a bit of useful evidence out with a hex editor.  If you like this sort of thing, you’re going to love both mobile device forensics and this class.

Thursday, June 3, 2010

Forensic 4cast Awards

The 2nd annual Forensic 4cast Awards will be held at the SANS Forensic Summit on July 8th of this year.  You do not have to attend the Summit to make nominations or to vote for the awards.  The nomination period is open, but it will close on the 13th of this month so get your nominations in soon!

Lee Whitfield hosted another episode of Forensic 4cast this weekend. Mark McKinnon and I were the panelists and we discussed the recently completed CEIC conference, the Guidance Software acquisition of Tableau and many other topics such as the upcoming Summit. 

I’ll be at SANSFIRE next week absorbing powerful Eoghan-Fu when I take his SEC563 Mobile Device Forensics class.  I’m looking forward to being able to finally meet Eoghan and many of the other SANS Instructors in person as well as learning from my fellow students.  I’ll post a review of the course and SANSFIRE afterwards.  I’ll also provide reviews of the OnDemand versions of SEC408 Computer Forensics Essentials and SEC508 Computer Forensics and Incident Response in the near future.

Eoghan released his most recent book late last year which Harlan reviewed on his blog.  You can read my review as well as one from Richard Bejtlich over at Amazon. It is an excellent book that is good for those of us who are already in the field and those who are considering a career in digital forensics.

Lastly, I want to thank all of you for your support of the blog and your comments here and in other venues.  I’m very grateful for the feedback and the response has been more than I ever expected.  I use Google Analytics to get some idea of whether anyone reads this blog and the numbers have been very strong.

Saturday, May 29, 2010

My Big Fat CEIC 2010 Post

I attended CEIC 2010 last week and I think I’m still processing all that I took in during the week.  This was the first time that I attended the Guidance Software run conference and they did a magnificent job with it.  It was a very well planned and executed conference with a reported attendance of around 1,300 people.  It was held at the Red Rock Resort, which is a very modern and well run facility.  I still sort of miss the “disco elevators”.  Those of you who were there know what I’m talking about…

I don’t get out to as many conferences as I would like because of time and expense, but one of the reasons I like to travel to these conferences is that I get to meet people in person who I generally only get to communicate with via electronic methods like email, twitter and phone.  These conferences are also a great way to get a lot of information very quickly about the state of the industry which allows you to keep up on industry trends.  For example, I’ve used HTCIA conferences that I have attended in the past to ramp up on the state of mobile phone forensics.  This time I spent a lot of time learning and talking about timeline analysis and memory forensics.

I also was able to spend time speaking to various people inside of Access Data and Guidance Software.  It turns out that my previous “Don’t Panic” post circulated around Guidance Software and, fortunately, they took it in the constructive spirit it was intended rather than as just someone else running them down.  One of the things they were concerned about was that when I spoke about employees from Guidance who went over to Access Data, they didn’t want people to think that they had lost their development staff in that process.  They made a point to let me know that they didn’t suffer a development exodus and that with the addition of the Tableau development team, they are very excited about their prospects for future innovation.  Access Data also made a point to praise and promote their development team.  Given that team is responsible for FTK 3, they certainly deserve to take a victory lap.

Both Guidance Software and Access Data are working on some exciting innovations that they were generous enough to talk to me about.  My purpose as a blogger is to positively contribute to the discussions of issues important to our community and to distribute my own research work.  My purpose isn’t to “scoop” other bloggers or to make announcements that disrupt a vendor’s communication and marketing strategy by revealing information before a vendor is ready.  Doing so wouldn’t serve any useful purpose and my contacts likely wouldn’t talk to me again which means I wouldn’t have access to their industry insights. Thus, this paragraph will just have to serve as a teaser of sorts.  Talking to both camps felt like talking to two championship class NFL football teams who were gearing up for the Super Bowl.  Both companies are hard at work innovating and creating good things for the community.

Of course, what I can talk about is the information that was made public such as what was discussed in the “EnCase Forensic Roadmap” session that was held at CEIC.  This is where Ken Basore and Ashley Stockdale discussed what we can expect in the next year or so with EnCase Forensic.  Some of the high points were:

1. Guidance is working on a new indexing engine.  They are not considering licensing a third-party engine and are sticking with their in-house development team’s effort. I originally thought this wasn’t a great idea because I could never understand why they didn’t just do something like license dtSearch, but it was explained to me that when you do that, you lose a certain amount of control over your product.  What happens if your third-party tool (whether it’s indexing, file viewing, email parsing, etc.) causes software instability?  People will blame you for it when it’s an issue that needs to be addressed by the third-party technology maker.

It also is obviously going to cut into profits compared to the financial benefits you reap when you develop your own tools. However, third-party technology is appealing because you simply can’t expect your development teams with finite resources to be experts in everything. Thus, companies like Access Data and Guidance Software have difficult decisions to make when considering how to use their resources. Do you develop in house? Do you license technology? Do you just purchase it outright?

So I find myself ambivalent on this decision to continue to develop an internal indexing engine.  Maybe it’s a good idea, maybe it’s not.  We’ll know soon enough and I hope that the next version of the index engine is successful.  I don’t use the current EnCase indexing engine (I use Access Data’s FTK for all of my indexing needs) because I gave up on it after they released it before it was ready.  I intend to give it a try the next time I do an examination so that I have a basis to compare it with whatever they come up with next.

2. Multi-threaded acquisition.  This innovation has already been introduced in version 6.16.  While I haven’t had a chance to test it, I did talk to at least one person who stated that the acquisition speeds rivaled the excellent Tableau TIM product.

3. Evidence Preprocessing innovations.  Guidance is working on an evidence preprocessor that will run in the background of EnCase. It will provide examiners with intuitive options and will present evidence to the examiner in stages.  Thus, you will be able to access data as it’s processed rather than having to wait until all of the processing is over.  Since it’s going to run in the background, you’ll also be able to work on your case while the processor is running.

This is a great idea, but one of the biggest complaints that I hear from people, and that I have myself, is that when you ask EnCase to do some sort of processing, you increase your risk of encountering the dreaded “White Screen of Wait”. This is where EnCase chugs away on something, but consumes so many resources that you can’t actually do anything with the program until the resources are freed up.  Just this week I followed a twitter thread with some experienced forensic examiners who were lamenting this issue.  Thus, if this is going to be successful, it’s going to have to truly run in the background and not prevent the examiner from working with their case.  The hopeful news on this front is item 4, which is…

4. Work product storage innovation.  This is my terminology rather than Guidance’s.  I forgot the language that they used but I have the phrase “transportable cache files” in my notes. To their credit, Guidance understands that we hate having to pay for the same real estate twice, so to speak.   One of the frustrations we all have with EnCase is that when you do something like parse a container file like a Zip file, you essentially have to do the same thing all over again when you open up a case.  What Guidance is going to do is get away from the model where all of your work product is stored in just the traditional EnCase evidence file.  There will be additional container files that will contain your work product so that you just have to do processing once and not have to repeat it again.

This is huge and this is clearly an attempt to keep up with Access Data’s FTK (1 and 3) where you just have to process things once and you’re done.  In fact, FTK 3 processes a lot of data very quickly and you’re done. 

So the innovation battle lines are drawn when it comes to indexing and work product storage.

5. Evidence File V2.  The new version of the EnCase evidence file will be faster, smarter, better looking and will have a lovely singing voice.  Okay, maybe that’s not what they said, but that’s essentially what I heard.  They are also going to incorporate the option to encrypt evidence files.  The new format will still have the same metadata that we’re used to and will do MD5\CRC checks, but we’ll have the option to encrypt the data portion of it with a password.

Having the option to encrypt evidence files is nice because we don’t always have an encrypted container drive available to ship images on (you do encrypt your evidence when you ship it, right?) and the person on the other end might not have the decryption technology easily available.

6. More options for report creation. I didn’t take as many notes on this because unless Guidance tells me their reporting option will make bacon directly appear on my desk, I don’t much care. I long since gave up on using EnCase to make forensic reports.   That said, they are going to give us the option to put hyperlinks in reports and to resize/rotate pictures.   Don’t feel too bad, Guidance. I don’t use Access Data’s report function either.  I certainly like it better, but my customers don’t and they are the ones who matter.

7. Decryption.  They said that they will have the ability to decrypt Windows 7 Bitlocker soon.  This is good news and one of the things I’ve really appreciated about Guidance and Access Data is their aggressiveness in working with encryption vendors to incorporate decryption technology into their products.  It makes our lives as examiners so much easier because manual decryption processes can be long and painful.

8. Email Threading.  EnCase will have the ability to follow email threads across multiple email repositories.  This is a very nice option to have, but I suspect I won’t be using it since EnCase is pretty painful to use for email investigations compared to tools like FTK.  However, this signals to me that Guidance isn’t giving up on enticing its customers to use EnCase for email investigations and that’s a good thing.

9. Neutrino\mobile phone forensics.  Digital forensics is a very broad field with all sorts of devices, operating systems and file systems.  It’s hard enough being good at traditional hard disk file system forensics.  The innovation in the mobile device market is staggering which is why I think we haven’t seen one mobile device forensic vendor establish a dominant position in the market.  Guidance seems to understand that they just don’t have the developmental cycles to keep up on everything going on in the mobile device world so they have apparently decided to concentrate on digital forensics of smart phones like Android, iPhone, etc.

This makes good sense to me.  The market for smart phones is growing quickly and, as Guidance points out, they have a lot of experience with parsing file system artifacts. Trying to be a comprehensive mobile device forensic company and keeping up with their competitors like Access Data on the traditional disk forensics front doesn’t seem like a winning proposition.

One of the executives I was able to meet at CEIC was Robert Botchek.  Based on my discussions with him and others, I’m convinced that the Tableau purchase is a good move for Guidance and the community as a whole.  I found Rob to be unique in that he has deep technical skills, an excellent business mind and is a very personable fellow who can communicate complexity in an understandable manner.  The Tableau name will continue to exist in some form, but Tableau will be a part of Guidance Software.  The chain of command issues have already been decided and Robert is a direct report to Guidance’s CEO Victor Limongelli.  Thus, Victor and the rest of the Guidance senior executive management will get the benefit of Rob’s business background and keen insights into the digital forensics market.  The biggest issue will be the traditional one that you have in acquisitions like these which is integrating two different organizational cultures.  If Guidance can pull this off, this should be a good move for everyone involved.

Being able to finally meet Victor in person was also a treat.  He’s also a very smart and personable fellow and, along with all of the other Guidance executives and employees I spoke with, seems to genuinely want people to understand that Guidance doesn’t want to be the organization that we’ve all, unfortunately, grown to distrust if not actively dislike.  Essentially, they want people to understand that they are the new Guidance Software.  The Tableau purchase and the fruits that will hopefully come from it should help on that front.

I’ve been thinking about what companies like Guidance and Access Data can do to engage the community better. An obvious method would be to interact more with the community via social media (Access Data makes great use of Twitter, for example) and the various email lists that are popular with the community. As I thought on it more, it occurred to me that vendors who can afford it should take a page out of Guidance Software’s old play book and hire Directors of Customer Relations.  One of the darkest days of my digital forensics career was the day I learned that the great Bill Siebert had passed away.  For many years during the bad old days of Guidance Software, Bill was the face of the organization.  His title might have been Director of Customer Relations, but it was really Director of making-you-not-hate-Guidance-nearly-as-much-if-Bill-wasn’t-working-for-them.  If you had a problem with Guidance, you could go to Bill and you knew he’d tell it to you straight and do whatever it took to get the issue resolved.  He wasn’t a company line type who just told you what you wanted to hear. He’d tell you if he thought Guidance was doing something silly and then do his best to fix it for you. Once Bill left Guidance, things got pretty rocky in my relationship with them, and I think one of the biggest public relations mistakes they ever made was not filling that role.  You could never replace Bill, but they should have at least filled the role.  In my case, the relationship with Guidance was repaired through the herculean efforts of my Guidance Software sales representative.  He’s one of the few sales representatives that I’ll knowingly pick up the phone for when I think it’s him calling. I never thought I’d type that about a sales representative, but there it is.  However, I understand his role is to sell me more stuff rather than to engage the community at large.

What would that role look like today at a place like Access Data or Guidance? The person in that position would be someone who has instant credibility with the community because they were an experienced practitioner rather than someone in a sales or marketing role.  In fact, you wouldn’t have that person as part of the sales organization.  The best position on the organization chart for that person would be reporting to a senior executive manager in an operations or development role.  This person’s skip-level manager would be the CEO, and they would have access to senior executive management so that they could establish two-way communication between the company leadership and their current and potential customers.  They would be someone who would directly engage the community in the places they inhabit such as forums, email lists, blogs, podcasts, conventions and social media. Because they were part of the extended senior leadership team, they would act as a conduit between the community and senior executive leadership.

That’s enough organizational development pontificating, I think.  I also wanted to comment on some of the people I met and some of the presentations since part of what I love about conventions is meeting people in person and learning new things.

The first session that I attended was Dave Shaver’s “Defeating Advanced Hiding Techniques”. I don’t know how he did it in 90 minutes, but the course was a comprehensive review of how an experienced digital forensics examiner such as Dave approaches an incident response investigation and discovers what sort of evil has been buried in the shadows of a computer. The conference was also a treat for me because I was finally able to meet Dave and his co-conspirator from Army CID, Ryan Pittman. They’re two of the nicest guys you could hope to meet and very sharp forensic gurus. They co-authored the excellent chapter on Windows forensics in Eoghan Casey’s most recent book.

I finally got to meet Rob Lee in person after countless emails, tweets and phone conversations. It’s amazing how you never really know someone, even after all of that, until you just sit down and talk to them for a while. Rob is a big, friendly well of digital forensic knowledge and energy. He looks like he played football at the Air Force Academy, and he put that command presence to use in his “Super Timeline” class at CEIC. If you haven’t taken that class, you can get that content and a lot more by taking the SANS SEC508 class. “Super Timeline Analysis” is where Rob instructs his students in how to use tools like fls, Harlan Carvey’s Regtime.pl and Kristinn Gudjonsson’s log2timeline to make a timeline of activity on a system. The resulting timeline provides an examiner with an amazingly detailed view of what happened on a system. This is something that every digital forensic examiner needs to learn how to do.

I had the pleasure of having dinner with Larry and Lars Daniel of Guardian Digital Forensics.  They are both quality guys and excellent digital forensic examiners.  I really enjoyed talking to them about their experiences doing criminal defense work and their perspective on digital forensics in the legal system.

Adrian O’Leary of the Metropolitan Police gave a fantastic presentation on their ability to extract data from physical flash memory on mobile devices.  I don’t know how much information they want public, but I do highly recommend attending any presentations he does in the future.

I also discovered one of my new favorite conference presenters when I attended Joshua Gilliland’s “Textual Relations” presentation. Joshua is a very accomplished presenter and also sports a pretty sharp bow tie.  His presentation was an overview of legal issues involving text messages as well as illustrating how some people have scored massive legal own goals through texting things they really should not have.

Michael Webber’s memory forensics presentation was very educational, and memory forensics is an area I’m very interested in from a research perspective. He’s a very accomplished presenter who does a great job explaining complicated information in a short amount of time to a large number of people. Unfortunately, he only had 90 minutes, but he made good use of the time and I’d love to attend more training with him.

I know I’ll forget to mention all of the great people I finally got to meet in person, but it was a treat being able to connect with people like Eric Smith from Lockheed Martin and Greg Dominguez from Forensic Computers.

As you can tell, I had a wonderful whirlwind of a week at CEIC and I enjoyed the experience very much. I hope to make it to CEIC 2011 in Orlando next year. Great job, Guidance!

Saturday, May 22, 2010

TIM GCFA EWR LAS CEIC LBS



TIM
As a follow-up to my previous post about the Guidance Software purchase of Tableau, I saw that Tableau’s Robert Botchek announced on one of the digital forensic email lists that Guidance is going to remove the Tableau requirement for TIM. This means that TIM will work with write blockers other than Tableau’s. This is an amazing bit of news for all of us, especially those who have been in the industry for a while. Great job, Guidance!

GCFA
So I passed the GCFA exam this week with a 92.67% score. I’m positive that I lost some key data from my past, such as my old high school locker combination and past phone numbers, since more than a few brain cells died in the attempt. I completed the SEC508 course that is the basis for the GCFA test via the SANS OnDemand method, and I’ll post a detailed review of that course and the SEC408 OnDemand course in the near future. In the meantime, Joe Garcia of Cyber Crime 101 has posted his audio review of his SEC408 experience with the mighty Mike Murr. As a teaser, Joe announces some exciting news about the future of the SEC408 course. I won’t steal his thunder here, so you’ll have to give it a listen.

It’s not that the GCFA test is unreasonably difficult, but I had set a goal for myself to score over 90%. That means I could only miss 15 questions in a 150-question test that covers some pretty complicated material and has to be completed in 240 minutes. Proper pre-test preparation is a must because if you don’t have a strong foundation in the material, all of the books, notes and whatnot that you bring into the test facility aren’t going to save you. You just don’t have time to teach yourself new concepts on the fly. Thus, if you take the SEC508 material seriously when it’s presented to you through whatever format you choose from SANS, do a decent job with the practice tests and practice good test-taking skills (including creating a proper index), you’ll have an excellent chance of passing the test.

SANS provides you with two practice tests as part of your GIAC attempt. You can purchase additional tests for $99. The impression that I get is that the practice test question bank provides enough unique questions for roughly one and a half practice tests. Therefore, the score you get on your first practice test is going to be the best indicator of how well you can expect to do on the real test. Subsequent practice tests will result in higher scores because of repeat questions. My first practice test score was 88%, my second was 96.67% and my third was 98%. Based on my first and second scores, I was reasonably certain my final test score would fall somewhere in between, and it did. Why did I purchase a third practice test? Because SANS administers their tests in an open-book format, you can bring your SANS course material into the test with you. The rub is that you need to be able to locate specific subject matter areas quickly if you are going to research the answer to a particular question. The best way to do this is to create a proper index. The methods vary, but one of the ways to ensure your best performance on a GIAC test is to make sure that you are comfortable with your index. The reason I took the third test is that I wanted one last run where I would concentrate on training myself to use my index.

Another thing that I strongly recommend is to look up answers in cases where you are uncertain. The mistake I made during my first practice test was to answer some questions based on the thought that “it’s probably this answer”. Probably isn’t a good standard to use for GIAC tests because it means you’re going to guess wrong in some circumstances. If you know the material well and you have a good index, you should have time to look up those “probably” answers and turn them into “certainly” answers. Also, don’t be afraid to skip questions. The test engine allows you to skip five, and I used all of my skips for questions that I knew would require some extra reading and pondering.

EWR LAS CEIC LBS
I’ll be heading out to CEIC tomorrow and I’m looking forward to giving my presentation on Adobe Flash Cookies and meeting new people, as well as finally meeting in person those I normally only get to communicate with through email, Twitter and the rest of the virtual world. These conferences are a very useful way to keep up on industry trends, tools and techniques. It’s very powerful having so much of the community’s knowledge under one roof for a short amount of time.

It’s also a moral imperative that I get a Bacon N’ Egg burger at LBS.  I admit it.  I think with my stomach.  If you actually read this blog and you see me at CEIC, I’d like to know your thoughts on what you think of the blog so far.

Saturday, May 15, 2010

Don’t Panic

This was a big week for digital forensic news. We learned that Guidance Software purchased Tableau and that Access Data would be releasing FTK Imager for the Mac and Linux. All of this great digital forensic news would make for great fodder if I were going to be on the next Forensic 4cast, but I won’t be because I have a prior commitment. The forensic gods can be cruel. However, Lee and his band of merry forensic practitioners will have an excellent show for you soon where they discuss these issues. Fortunately, I have a blog that is read by many twos of examiners where I can comment on these sorts of things.

The initial reactions to the Tableau purchase from my fellow digital forensic examiners ranged from concern to opposition. Not exactly a vote of confidence for the folks over at Guidance, but having been in this business for many years now, I understand their concern. We’ve all been burned by the major forensic software vendors like Guidance. How many disastrous EnCase version releases have you lived through? I’ve been through three so far where the digital forensic community essentially paid to be beta testers until Guidance fixed their product to do what they said it would do when they sold it to us. Remember how well the indexing feature worked when V6 came out?

Access Data has evolved into Guidance’s mortal enemy, and they haven’t been immune to playing Lucy to the community’s Charlie Brown trying to kick the forensic football. FTK 2 (or “FTK ME”, if you like) was a situation where, once again, a major forensic vendor released a product that they should have known wasn’t ready for prime time and essentially expected their customers to pay to beta test it.

Back when I first started in forensics, EnCase was in version 3 (Good Ol’ 3.22g was the classic V3 version) and most people used it as their primary forensic tool and used FTK 1 for things like email and to test their keywords. Sure, some people used FTK as their primary GUI toolset, but they weren’t the majority. The world was Guidance’s oyster and they acted (and charged) like it. This attitude created a lot of hard feelings in their customer base which linger to this day.

Not too long ago, Access Data made its great leap forward when it obtained a cash and talent injection (lots of that talent came from Guidance), which resulted in a flurry of product innovations, including the wretched FTK 2 (or “FTK Vista”, if you prefer). You could see what they wanted to do with FTK 2 and how cool it could be, but it just didn’t work. For whatever reason, they released it before it was done baking, which might have been a tribute to Guidance because that’s what they had been doing to their customers for years. Eventually, they got it right and released FTK 3 (AKA FTK The Apology), which is a great tool. Access Data even made an offer to buy Guidance. I’m not sure if it was a serious offer or just a good PR stunt, but it illustrated how far Access Data had come from behind to get to where they are today.

Guidance is a publicly traded company, and as such we can review a lot of their financial data because they have to send so much of it to the SEC. Access Data isn’t a publicly traded company, so they don’t have to release much of anything. Thus, we can’t really compare financial information, but my opinion is that Access Data took the lead in the innovation competition with FTK 3.

Guidance has been doing incremental innovation with EnCase, but EnCase V6 doesn’t feel all that different to me than EnCase V3. Sure, the UI has evolved a bit and they’ve added incremental innovations over the years such as email support, Internet history support and great encryption support. The rub is that a lot of their innovations have been done better by other people with other tools (both paid and free). There isn’t much reason to use, for example, their email or Internet history support options. If I’m going to parse an index.dat file, it’s not going to be with either EnCase or FTK. For email, though, FTK still wins hands down; EnCase has never been a great email forensic tool.

FTK 3 is a big change from FTK 1. While the UI borrows quite a bit from FTK 1, the move to Oracle allowed Access Data to do a lot more with the tool, such as handling larger data sets in a more efficient manner. They have a long laundry list of innovations that they have put into FTK 3, such as fuzzy hashing, distributed processing and remote evidence mounting. You can have all of this cool technology for a pretty reasonable price. Gone are the days when FTK was a glorified email tool. You can now comfortably use FTK as your primary forensic GUI tool and not use EnCase at all if you like. This is a problem if you are Guidance Software, especially since Access Data is working very hard at closing the gap at the enterprise level.

The last thing anyone in the digital forensic community should want is for one of these companies to “win”. We don’t want to go back to the days when one was dominant and treated its customer base accordingly. I don’t know anyone who didn’t dread the idea of Access Data purchasing Guidance Software and returning us to the pre-competitive era in digital forensic GUI tools. Robert Botchek and Tableau have been doing a lot of innovation in the area of data acquisition and have rightly earned the good will of the community because of it. The TIM tool, when coupled with a Tableau product, is an amazing innovation in data acquisition, for example. I suspect that this purchase was a low-cost way for Guidance to help close the innovation gap that Access Data has opened. If Guidance essentially allows Tableau to be Tableau and continue to innovate, it should be good for Guidance and the community. I wonder if the deal that Guidance made (and this is pure speculation on my part) was essentially to tell Botchek and Tableau that GSI would provide the funding and the day-to-day operational support (HR, payroll, marketing, etc.) while the Tableau team would be free to concentrate on innovation.

We all know what the worst case scenarios could be based on past behavior. For example, TIM becomes an EnCase-only tool that you have to pay $500 more per dongle to use. That would be a Bad Thing(tm), but I suspect that Guidance knows it now lives in a world where it can’t act like it used to and continue to be successful.

My bottom line is that I like and use products from Access Data and Guidance Software. EnCase V6 is my primary GUI forensic tool, but I’m increasingly using FTK for tasks that I used to do in EnCase. I have no desire at all to return to the bad old days where one of them was dominant over the other. We should want both organizations to win rather than having one of them lose. If this Tableau purchase helps maintain a rough balance of power between the two, I think it’s going to be good for the community.

Saturday, May 8, 2010

Flip Video Forensics

Staying true to my compulsion to forensically examine anything I can connect to a computer, I decided to see what sort of information I could pull off of a Flip Video UltraHD device.

It turns out that these devices aren't terribly difficult to examine, which isn't surprising since they're narrowly purposed. They're very user friendly devices that allow easy creation and sharing of relatively high quality videos. They are designed to be plugged into a computer's USB port so that video can be pulled off and shared via the software included on the device itself.

Like the Kindle, write blocking can be accomplished by standard USB write blocking procedures. For this examination, I used the Windows USB write blocking software (essentially just an automated registry modification program) that came with the SANS 508 class disk. You should also be able to use traditional hardware write blocking methods such as the Tableau T8 USB write blocker.
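For those curious about what those simple software blockers are doing under the hood, they flip the WriteProtect value under the StorageDevicePolicies key (available on Windows XP SP2 and later). Here's a minimal sketch that generates a .reg file to toggle it; the helper name is my own, and as always you should validate the block against a test drive before touching evidence:

```python
# Sketch: generate a .reg file that enables the Windows USB write-protect
# policy at HKLM\SYSTEM\CurrentControlSet\Control\StorageDevicePolicies.
# This mirrors what the simple software write blockers automate.

def usb_write_block_reg(enable: bool = True) -> str:
    """Return .reg file text setting WriteProtect to 1 (block) or 0 (allow)."""
    value = 1 if enable else 0
    return (
        "Windows Registry Editor Version 5.00\r\n\r\n"
        "[HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Control\\StorageDevicePolicies]\r\n"
        f'"WriteProtect"=dword:{value:08x}\r\n'
    )

if __name__ == "__main__":
    print(usb_write_block_reg(True))
```

Note that the policy is read when the USB device is attached, so the value needs to be in place before you plug in the evidence drive.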

The device has one FAT32 partition that comes in at around 7.6GB with most of it being unallocated space that is used for video storage. The actual system files don't take up much more than 120MB of data and include the software needed to run the actual device as well as the software that a user would place on a computer to manage their videos. There is software on the device for both Windows and Mac.

The videos themselves are in the DCIM\100VIDEO folder and are in MPEG-4 format. The video files are numbered in the order they are created, starting with "VID00001.MP4". There aren't any surprises when it comes to deleted videos, and you can recover them like you would any other file from a FAT32 volume. Thus, a deleted video will show up as something like "_ID00007.MP4", as you would expect based on normal FAT file system behavior. I did a keyword search for the header information for MP4 videos and was able to get plenty of hits in unallocated space. A system file of interest sits in the root folder: the "INFO.BIN" file, which contains useful information such as firmware and serial number details for the device.
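The keyword search I mentioned boils down to looking for the 'ftyp' box signature that normally sits at byte offset 4 of an MP4 file. A rough carving sketch along those lines (this only flags candidates; a hit still needs to be validated by parsing the box structure, and the function name is just illustrative):

```python
# Sketch: find candidate MP4 headers in a raw image by locating 'ftyp'
# boxes. In the ISO base media file format the first box is typically
# [4-byte size]['ftyp'][brand...], so a hit at offset n suggests a
# file starting at n - 4.

def find_mp4_candidates(data: bytes) -> list[int]:
    """Return candidate file-start offsets for MP4 headers in data."""
    hits = []
    pos = data.find(b"ftyp")
    while pos != -1:
        if pos >= 4:  # need room for the preceding 4-byte box size
            hits.append(pos - 4)
        pos = data.find(b"ftyp", pos + 1)
    return hits

if __name__ == "__main__":
    # Synthetic unallocated space with one MP4-like header at offset 512.
    blob = b"\x00" * 512 + b"\x00\x00\x00\x18ftypmp42" + b"\x00" * 100
    print(find_mp4_candidates(blob))  # [512]
```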

Saturday, May 1, 2010

GIAC Certified Haiku Master

One of the many reasons why I like the SANS Institute is that the collective SANS community is made up of some pretty sharp and creative people. One of the creative things that SANS did in promoting their upcoming SANS Boston conference was to have a Twitter Haiku contest that was judged by Craig Duerr with the words provided by Stephen Northcutt.

I've always had an interest in Haiku so even though I won't be able to make the conference this year, I decided to take a shot at the competition. It turns out that Dan Crowley also has a thing for Haiku and he decided to participate also. The battle was joined and Dan and I entered what we affectionately began to call the SANS Haiku Thunderdome (Two poets enter, one leaves).

Dan won the first round and I somehow managed to win the next two rounds and emerge the winner. I'll be the first to admit that Dan is a better Haiku poet than I am, but sometimes a blind squirrel finds a nut and that just happened to be me this time around.

The spoils of victory were a certificate proclaiming me a GIAC Certified Haiku Master, and I was also awarded an Iron Kung Fu fan as my trophy. Very cool. We'll call this competition reason #214 why I love SANS.


Sunday, April 25, 2010

The Ballad of Grayson Lenik

Grayson Lenik is a relatively new member of our community who has made the decision to move from a systems administration focus to a digital forensics focus. You can follow his journey at his blog "An Eye on Forensics".

Grayson is clearly a sharp fellow. From what I can tell, he passed the SANS GCFA exam via the challenge process rather than taking any of the SANS course content. That's impressive considering the scope and difficulty of that exam. Grayson has been encouraged to contribute to the recently started Into The Boxes digital forensic online magazine and, to his credit, he's accepting the challenge and looking for a topic to research.

His comments on the research issue made me think of my own decision-making process regarding engaging in digital forensics research. I've been doing digital forensics for a relatively long time now, but it was only last year that I decided to start contributing to the community in a meaningful manner in this area.

The reasons why I hadn't done so were largely due to intimidation. I look up to people such as Harlan, Rob, Jesse and Eoghan (otherwise known as the people who don't need last names for you to know who they are) and the work that they have done advancing the field with their research, training and tool development efforts. Who was I to even think that I could play on the same field as them? I also fell into a common trap that I see with IT security people: because I didn't know everything, I thought that I didn't know anything.

Last year I stumbled across Adobe Flash Cookies while doing an examination and started to dig into them. I began to learn that some of these cookies can provide a treasure trove of information for a digital forensic examination and started to parse them out as well as I could. I made a couple of phone calls to some very experienced examiners and asked them if they had heard of these artifacts before and was told that they had not. One of those examiners was actually able to take what I told them over the phone and put it to use in a criminal investigation they were working, so I knew I had something that would be beneficial to the community.

So I decided to just plow ahead and start writing something up with the goal of having something to present at a conference like CEIC. I started to create an early overview paper.
I was lucky enough to have people like Cindy Murphy, Gary Kessler, Jimmy Weg and Mark Johnson review that paper and make suggestions on how to improve it. Cindy even managed to carve out some time from her busy schedule to do some additional research regarding a particular kind of cookie that really helped fill out my knowledge. I briefly distributed the paper through some of the email lists like IACIS and HTCC, hoping that it might get the word out and generate some additional research leads.

I sent it out to the community and heard....nothing much. I later learned this is a pretty common occurrence in our community even for Those-Who-Only-Need-A-First-Name. A digital forensics researcher will put a lot of work and effort into a project, release it out for free and ask for feedback...and will rarely get any back. I would get people thanking me for providing them the paper after I sent it to them, but then no response back to my requests for feedback on whether it was useful, whether they found any errors, how I could improve the final product, etc.

One of the notable exceptions to this was Jesse Kornblum. Some time after I had released the paper, I checked my email to see a request from Jesse for the paper. It was a classic good news/bad news situation. The good news was that Jesse Kornblum wanted to see the paper. The bad news was that Jesse Kornblum wanted to see the paper. I'll admit a certain amount of dread when I hit the send button. The short version of the story is that Jesse liked what I had done. He offered encouragement and suggestions on how to proceed. Very cool!!!

So bolstered with my new found confidence, I pressed forwards with the research project and hit a major sticking point when I encountered some very odd metadata behavior that I absolutely could not figure out. I was saved by Eoghan Casey who helped me determine that the odd behavior I was seeing was due to File System Tunneling (which I will explain at my CEIC presentation next month). Yet another of my forensic idols riding to the rescue!

Around January or so, however, I started to realize that I was in over my head. I was able to parse out the header information for these artifacts, but I didn't have the knowledge to completely parse everything. My hex-fu was okay, but it wasn't good enough to finish the project the way I wanted to finish it. The way I saw it, I could either crawl back into my hole and admit defeat or just publish what I had learned so far and hope that someone else could run with the research at a later date. I decided on the second option, with an eye on getting what I had completed published in some form.

Then on Feb 17th, 2010, I got lucky. Kristinn Gudjonsson posted some of his Adobe Flash Cookie research on the SANS Forensic blog. My initial reaction was that I had been too slow and too unknowledgeable and had just wasted months of my research life, because what he had done was so fantastic that it was better than I could have ever done. I even found that I had made at least one major error in my original header research. Woe is me, right? However, when I started to look closer, I realized that we had approached the research from different standpoints. Kristinn is an amazingly sharp incident responder and forensic examiner with an engineering background. That means he spent a lot of time looking at the hex level view of these cookies and did an exceptional job parsing them out. I approached the research from a more traditional investigative digital forensics perspective, which means I concentrated on the metadata (which is why I discovered and overcame the file system tunneling issue) and a lot of the higher level aspects of the research, such as how and when Flash cookies tend to appear on a machine. I became excited about the prospect of merging the research, but would someone like Kristinn be willing to talk to little old me? (There's that self doubt again...)

As you know from my previous blog entries, yes, he was more than willing to talk and after a flurry of emails comparing our various notes on the project, we decided it made good sense to team up and create a final research project.

The moral of the story?

1. Be like Grayson Lenik, not Eric Huber. Grayson has been a member of our community literally only for a matter of months and he's already sharing what he's learning through his educational process and he's going to do a research project for ITB. It took me years before I decided to do what Grayson is doing now.

2. Research what you know and if you get stuck, get help and continue on. There is a vast amount of research opportunities in digital forensics for all skill levels. Harlan wrote a particularly pithy bit of advice for Grayson when he said "...start writing about what you know...we'll work with you." That's essentially what I have been doing. I plow through the best I can within the range of my abilities and if I get stuck, I go ask for help. Grayson will do great because he's a sharp fellow who has the desire to do the work and he'll have people like Harlan and Don Weber to help him when he needs it. What I've found is that the gurus like Harlan and Don are very helpful if you approach them in the right way.

3. If you don't have time to complete a project, even partial research is helpful and someone else might take what you have done and run with it. I did that with my Kindle forensic research. I knew I wasn't going to have the time and probably the knowledge to completely parse every aspect of what one can find on a Kindle so I posted what I learned on this blog.

4. Provide feedback. If you don't have the time or desire to do digital forensic research, no worries. However, one thing that you can do to help those who are doing it is to provide feedback when you have found something useful that helped you in your job. Did you like a particular digital forensics book? A nice thing to do would be to post a review at a site like Amazon. Even negative feedback is welcome as long as it's constructive. If I made a mistake, I want to know about it. If what I wrote didn't make any sense, it doesn't help me develop as a writer or a researcher if I don't know what I'm doing wrong.

Tuesday, April 20, 2010

Forensic 4cast and Me

The most recent Forensic 4cast podcast is up with a brand new format. Lee has decided to test out a panel format where he brings together people from the digital forensic community to discuss the topics of the day.

This episode included a panel that consisted of Lee, Tom Yarrish, Joe Garcia and myself. Give it a listen and let Lee know what you think about the new format. I'm grateful to Lee for the opportunity and I hope I did a good job for him. I have to admit that I was a bit vexed when I heard the podcast after the fact because the sound quality from my phone wasn't remotely as good as the other panelists'. I already have a proper Skype-certified phone on order from Newegg so that next time I don't sound like the panelist who is calling from the outer reaches of Absurdistan.

Lee has also released the much anticipated presentation on Volume Shadow Copies that he was due to give at the SANS EU Forensic Summit. That summit was delayed because of, as Chad Tilbury puts it, the Krakatoa eruption in Iceland. Chad made the Krakatoa reference on Twitter this week and I've been laughing about it ever since. It's yet another reason why I like socializing with my fellow digital forensic examiners on Twitter. Chad is a very sharp fellow and one of the primary SANS digital forensics instructors.

As Lee was nice enough to mention at the end of the podcast, I will be presenting on the topic of Adobe Flash Cookies at this year's CEIC conference. Kristinn Gudjonsson and I have been working on an article to submit to an academic journal and I have crafted an overview of the research for the presentation. The presentation won't cover much of the content in the article because there just won't be enough time to do that, but it will provide examiners with enough of an understanding of these artifacts to start using them in their digital examinations. I'm looking forward to CEIC this year as there are a lot of amazing presentations such as Rob Lee's Super Timeline Analysis Lab. I also think it's a moral imperative that I have a Bacon N' Egg burger at LBS Burger.

I started this research project independently late last year and it turns out Kristinn had also been working on parsing these artifacts as part of his larger log2timeline research. He posted about them on the SANS Forensic blog earlier this year and that's when we discovered that we had been working on the same subject. We essentially had a "you got chocolate in my peanut butter" moment and decided to work together on putting together a paper that we hope will be useful to the community. Kristinn is certainly the brains behind the operation given his very robust technical background. I never would have been able to fully parse these artifacts on my own because I don't have the deep technical knowledge that Kristinn has so I'm lucky he posted on the SANS Forensic blog when he did and that he's generous with his time and knowledge.
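For those who want to poke at these artifacts themselves, Flash cookies are the .sol files stored under each browser profile's #SharedObjects directory. Here's a minimal header check, assuming the commonly documented LSO layout (a 0x00BF magic, a big-endian length, a 'TCSO' signature, then a length-prefixed object name). It's only a sketch for illustration, not a full AMF parser like Kristinn's, and you should verify the layout against real files:

```python
import struct

# Sketch: pull the object name out of a Flash Local Shared Object (.sol)
# header. Assumed layout (commonly documented; verify against real files):
#   bytes 0-1   magic 0x00 0xBF
#   bytes 2-5   big-endian length of the remaining data
#   bytes 6-9   signature b'TCSO'
#   bytes 10-15 padding/version bytes
#   bytes 16-17 big-endian name length, followed by the name itself

def sol_object_name(data: bytes) -> str:
    if data[0:2] != b"\x00\xbf" or data[6:10] != b"TCSO":
        raise ValueError("not a recognizable .sol header")
    (name_len,) = struct.unpack_from(">H", data, 16)
    return data[18:18 + name_len].decode("utf-8")

if __name__ == "__main__":
    # Synthetic header for an object named "settings".
    name = b"settings"
    body = b"TCSO" + b"\x00\x04\x00\x00\x00\x00" + struct.pack(">H", len(name)) + name
    header = b"\x00\xbf" + struct.pack(">I", len(body)) + body
    print(sol_object_name(header))  # settings
```

The interesting evidence lives in the AMF-encoded key/value pairs that follow the name, which is exactly the part that takes real parsing work.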

One of the reasons I mentioned Chad earlier in this post is that he also did some research on Adobe Flash Cookies and posted about it on the SANS Forensic blog.

Friday, April 16, 2010

Additional Thoughts on Kindle Forensics

We're in an exciting time in digital forensics. It seems like each week we have a sharp digital forensic researcher discovering some new method or creating a new tool for us. We have seen incredible advances in traditional hard drive forensics and we have the wonderful and relatively new world of mobile device forensics to explore.

I've been doing digital forensics for many years now, and one of the things I've noticed about digital forensics people is that we sometimes tend to engage in catastrophic thinking when it comes to advances in technology and the future of digital forensics. We've all seen the various predictions that hard drive sizes, thin clients, encryption and other advances would spell the end of digital forensics. In fact, these advances show that our skills will become more in demand. However, we will have to constantly keep our edge sharp or we will fall behind. There will always be some sort of digital technology that will require a digital forensics practitioner to examine. Digital forensics will no more fade away than will technology or law, but it will be a constantly changing field.

The Kindle is a great example of how technological advances will provide examiners new opportunities for their examinations, and of why examiners need to invest a considerable amount of time keeping their technological edge. The Kindle isn't a computer and it's not a cell phone, but it has qualities of both.

I recently received an Amazon gift certificate from a friend of mine. Amazon can distribute their gift certificates through email. In this case, the gift certificate was sent to my email address and included a code that I could enter into my Amazon profile to credit my account for the proper amount. Of course, I used that amount to purchase several books for my Kindle.

The Kindle book store can be accessed by the Kindle itself through the device's 3G network connection. There isn't any need to connect the device to a computer to download purchased content like you would for something like iTunes. You merely access the Kindle store via your Kindle device and you can purchase your books using your Amazon account. Another option is that you can log onto the Kindle bookstore on a computer using the Amazon website. You can then shop for Kindle books, purchase them through the website and have the content delivered to your Kindle via the wireless network. This is what I did with my gift certificate and after I had made my purchase, I picked up my Kindle and the books were on my device.

Great stuff for the consumer, but something that a forensic examiner would need to be very aware of when dealing with the Kindle as evidence. The last thing you want is to have a Kindle sitting in your evidence room waiting to be examined and to have additional content land on the device and potentially overwrite existing evidence.

My advice is to treat the Kindle like you would any other mobile device examination up to and including using a shielded environment where the device can't phone home. A good research project for someone would be to determine whether or not it's safe to keep the device outside of a shielded environment if the 3G network is disabled by the examiner.


Tuesday, April 13, 2010

A Cursory Look at Kindle Forensics

I recently purchased a Kindle which I have come to adore. It's one of those devices that make it hard to imagine what life was like before you purchased it. However, being the hopeless forensic geek that I am, I had to figure out what sort of forensics could be performed on the device. (No, I have no idea how I got someone to marry me. I really don't.)

I purchased the current generation Kindle with the 6" screen. This model provides the user the ability to plug the device into a computer via a USB port to interact with the device. Amazon accomplishes this by exposing a 1.5GB portion of the device's storage that is visible and accessible to the user as if it were a standard USB storage device.

From the research that I have conducted so far, it appears that you can treat the Kindle as you would any other USB storage device for imaging purposes. The best way to do it is to use the USB cable that Amazon provides for connecting the Kindle to a computer. You can then write block a Kindle like you would any USB device. For my research, I used a Tableau T8 USB Forensic Bridge and was able to make the image using EnCase without any problems.
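The write-blocked acquisition described above is tool-agnostic; once you have a raw image, verifying it is just a matter of hashing. A minimal sketch of chunked hash verification, assuming a raw dd-style image file (the filename is hypothetical):

```python
import hashlib

def hash_image(path, chunk_size=1024 * 1024):
    """Compute MD5 and SHA-1 of a raw image without loading it all into memory."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest(), sha1.hexdigest()

# Example (hypothetical filename):
# md5_hex, sha1_hex = hash_image("kindle_usb.dd")
```

Hashing in fixed-size chunks keeps memory flat even on multi-gigabyte images, and the two digests can be compared against the acquisition tool's own verification values.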

I haven't spent much time on the analysis portion of this research. However, I can report that the Kindle's USB-visible drive shows up as a FAT32 volume created with mkdosfs. This makes sense given that the Kindle runs some sort of Linux OS that we can't see via this USB capture process.
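That identification can be checked directly against the volume's boot sector: on a FAT volume the OEM name lives at offset 3, and on FAT32 the filesystem type string lives at offset 82. A quick sketch (the image filename is hypothetical, and the exact strings a Kindle reports are an assumption based on the mkdosfs/FAT32 observation above):

```python
def fat32_boot_info(boot_sector: bytes):
    """Pull the OEM name (offset 3) and FS type string (offset 82)
    out of the first 512 bytes of a FAT32 volume."""
    oem_name = boot_sector[3:11].decode("ascii", "replace").rstrip()
    fs_type = boot_sector[82:90].decode("ascii", "replace").rstrip()
    return oem_name, fs_type

# Example (hypothetical image filename):
# with open("kindle_usb.dd", "rb") as f:
#     oem, fstype = fat32_boot_info(f.read(512))
# A volume formatted with mkdosfs would be expected to report
# something like ("mkdosfs", "FAT32").
```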

There are some interesting artifacts of the low hanging fruit variety. For example, the "userannotlog" file located in the system folder lists the last book that I read and my position in that book, and it also includes clear text time stamp information that correlates with when I know I was reading the book in question. Very cool.
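Because those time stamps are clear text, pulling them out for a timeline doesn't require knowing the full file format. A hedged sketch, assuming an ISO-8601-style date string; the real userannotlog layout is undocumented here, so the pattern would need adjusting after inspecting an actual file:

```python
import re

# Assumed pattern: a clear text date such as 2010-04-13T21:05:33.
# This is a guess at the format, not documentation of it.
TIMESTAMP = re.compile(r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}")

def extract_timestamps(text: str):
    """Return every clear text timestamp found in the file's contents."""
    return TIMESTAMP.findall(text)
```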

The "documents" folder, as you might expect, contains the actual content that I have on my Kindle. I don't have much on it right now, but each book has an .azw file which is the actual content of the book in a proprietary format and a .mdp file that...well, I don't know what it does at this point.

There is a "search indexes" folder in the system\search indexes folder path that, one assumes, keeps track of searching done on the device. In a wine book that I bought, I did a search for the word "Pinotage" (Sigh. Yes, add "wine geek" to my list of vices...) and then used that word as a keyword for a search of the image...and came up with nothing eventful. There were about 20 hits on the word, but all of them were in the context of other words in that alphabetical range, so nothing that would show that I searched for that word.

You'll find a lot of indexed words in system\search indexes\Index.db. What I'm seeing already is that there are three bytes before each word that are clearly meaningful. For example, the word "pinewood" is preceded by 0x740008. So what we have is the word "pinetorch", then 0x740008, then the word "pinewood". I don't know what the 0x74 means or whether it's associated with "pinetorch" or "pinewood", but the 0x08 is the length of the "pinewood" entry. It's probable that this length indicator actually uses two bytes, which would make 0x0008 the bytes that indicate length. I'm seeing this behavior consistently throughout the index file: a word is preceded by byte(s) whose hex value matches the length of the word that follows. Interestingly, I'll see a block of words packed closely together, then one word will end with 0x7A instead of 0x74, and then there won't be any more words until a new block starts about 900 or so bytes later. Toward the end of the file, there is a listing of the books on the Kindle and the paths to their associated files.
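The pattern described above is easy to test programmatically. Here's a sketch that carves words out of a chunk of Index.db by looking for the observed marker-plus-length prefix; the record layout (0x74 or 0x7A, then a two-byte big-endian length, then the word) is my reading of the hex dump, not a documented format:

```python
def carve_words(data: bytes):
    """Carve ASCII words that follow the observed 74 00 LL prefix.

    Assumption: each entry looks like 0x74 (or 0x7A at the end of a
    block), then 0x00 LL as a two-byte length, then LL ASCII letters.
    This is a guess from eyeballing the file, not a specification.
    """
    words, i = [], 0
    while i < len(data) - 3:
        if data[i] in (0x74, 0x7A) and data[i + 1] == 0x00:
            length = data[i + 2]
            candidate = data[i + 3 : i + 3 + length]
            if len(candidate) == length and candidate.isalpha():
                words.append(candidate.decode("ascii"))
                i += 3 + length
                continue
        i += 1
    return words
```

Running something like this over the 900-byte blocks should quickly confirm or kill the two-byte-length theory, and would also surface any entries where the 0x74/0x7A marker behaves differently.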

There is also a reader preferences file at system\com.amazon.ebook.booklet.reader\reader.pref. It has a clear text time stamp that appears to correlate with the last time I used the Kindle. It also declares which dictionary I'm using, the type of text justification I've selected and the last book I read.

There's a white paper in here for someone somewhere.