Review of the new SANS 585 Smartphone Forensics Course

I recently had the opportunity to beta test the soon-to-be-released SANS 585 Smartphone Forensics course, and I wanted to share some thoughts about the course content and the labs.

The course page on the SANS website (http://www.sans.org/event/for585-advanced-smartphone-mobile-device-forensics/course/advanced-smartphone-mobile-device-forensics) provides an accurate overview of each day’s topics so I’ll focus more on thoughts and opinions than lists.

Overview

The course starts with an overview of cellular technology and networks and quickly moves on to explore advanced topics. The jump from the basics into topics like wear leveling, garbage collection and so on is a hallmark of a SANS forensics course, which is one of the reasons why I love these courses so much. The refresher of the basics is nice, but the integration of advanced issues – which is where many of us need the help – is nothing short of awesome. Throughout all five days, the course provides full-page examples that demonstrate the concepts explained within the content.

The initial section on parsing the contents of a SIM in hex is a smooth introduction into a course that delivers a healthy dose of hex each day.  It’s important to understand that the emphasis on hex is never “hex for the sake of using hex.”

Hex is used to locate and parse artifacts that commercial programs will not automatically parse and for digging for deleted artifacts not present in the tools’ reporting mechanisms. One lab shows a tool reporting six entries in an application. Analyzing the underlying SQLite database confirms that the table does indeed have six entries. You can then look at the SQLite database in hex to uncover how many messages were not picked up in the report. This course is full of tricks of the trade that can make huge differences in efficacy in real-world settings.

Also, the labs are all incredible and the ‘answers’ sections at the back of each lab are perfect. They don’t just give an answer; they give detailed walkthroughs with plenty of screenshots. It’s another testament as to how meticulous, knowledgeable and detail-oriented the course – and its designers – are.

Day One

The core concepts section covers the basics and continues with the overview of smartphone handling and acquisition and a tool overview. The course moves on to using FTK Imager to examine an SD card and to parsing SIM card data at the hex level. The first day ends with a section on general mobile device repair that provides an overview of resources, tools, and tips.

Day One’s appendix is a step-by-step guide to acquiring data using Cellebrite, XRY and Oxygen. Students who already perform mobile device forensics on a daily basis may not crack open this section of the book, but it contains great walkthroughs with plenty of pictures and is a great reference for those who are new to these tools.

Day Two

Day Two provides a detailed look at the Android file system, including where certain types of evidence may be located. While this section makes up the bulk of the day, the section at the end is where I’d like to share my observations.

The last part of Day Two starts off with a talk about malware and using Cellebrite PA to scan devices for malware. It also includes a few slides that introduce various Android spyware programs, available for purchase on the internet, and then show artifacts that these different applications could leave on a device. Mobile device spyware applications aren’t something that I look for on a regular basis, but this will be a fantastic resource for those times when I am in need of this information.

The appendix contains a guide to examining an image in Internet Evidence Finder and using XRY to parse a Samsung Kies backup.

Day Three

Day Three is for iOS devices and provides an in-depth look at the iOS file system and where certain types of evidence may be located. It also includes information on how to identify if a device has been jailbroken or wiped, how to recover data from third-party communication applications, and so on. In addition, there is some tool-specific content, including keyword searches and timeline generation.

Day Four

Day Four is split in half, with the first portion covering BlackBerry devices and the second covering forensics on backup files.

The BlackBerry device presentations are extremely in depth and include familiarization with BlackBerry artifacts at the hex level.

Several of the 585 labs do a solid job of reinforcing the concept that an examiner should use multiple tools to examine a device. However, one of the Day Four labs takes it to the next level by having the student examine a BlackBerry device using four different methods. The student is given a list of questions to answer, and every one of the four examination methods used in the lab will reveal artifacts that the other three do not.

Day Five

Day Five is a grab bag day that covers Windows Mobile, Nokia & Symbian, knock-off devices and third-party applications.

The Nokia & Symbian section does a great job covering the file system and artifacts down to the hex level. The next time I have a question concerning a device running these operating systems, this book will be the first thing I reach for.

The Windows Mobile forensics section covers several topics including Windows Mobile registry analysis and usage artifacts.

The knock-off section provides both a good overview of dealing with clones and some specific guidance and examples for artifact parsing.

The final section discusses different types of third-party applications on iOS and Android devices and parsing these types of applications.

The Day Five appendix gives a step-by-step walkthrough for using a Cellebrite PA with CHINEX to examine a clone phone.

Conclusion

I’ve taken multiple mobile device forensics courses, including the SANS 563, and can say with the utmost confidence that this course is phenomenal. The books will be an invaluable desk reference the next time I’m poking around inside a file system, and the labs do a great job reinforcing lessons taught in the course.

The topics covered in the course can be considered advanced but are also very practical. Topics such as parsing and searching devices not supported by commercial tools and digging in hex for deleted artifacts are extremely important and difficult to learn through trial and error.

In closing, this course is a much-needed and valuable addition to the SANS forensics course lineup.

Forensic Artifact Analysis of the Burner App for the iPhone

In April of this year, I saw a thread on Forensicfocus.com discussing a new smartphone app called “Burner” which lets users purchase disposable phone numbers for short-term use. The application has some very practical uses for online activity – such as selling items on Craigslist – but it also has some obvious implications for anybody performing digital forensics work.

At the moment, none of the commercial mobile device forensics tools available to me parse the data from the Burner application. I’m sure that will change if the app continues to grow in popularity.

I recently had an opportunity to install Burner on my iPhone 5 and examine the artifacts left on the phone after I used it. I also wrote a Python script to parse information from the Burner.sqlite file and generate an HTML report. I’ve affectionately named the script ‘Oven Mitt’. Burner leaves quite a bit of data intact on the device. Additionally, a lot of what it does cover up can be acquired by other means.

As expected, Burner stores its data in the /var/mobile/Applications directory on iOS devices. The specific sub-directory will vary from phone to phone, but the easiest way to locate the files is to search for any files with ‘burner’ in the filename. The /var/mobile/Applications/&lt;varies&gt;/Documents directory contains a SQLite file named Burner.sqlite, which contains most of the information we’ll be looking at. The /var/mobile/Applications/&lt;varies&gt;/Library/Preferences directory contains a plist file named com.adhoclabs.burner.plist, which contains some interesting configuration information, including the number of times the Burner app has been executed. However, this post is focused on examining and parsing the Burner.sqlite file.

The first table in the database worth looking at is the ‘ZBURNER’ table which contains any currently valid Burner numbers. The Z_PK field is the primary key for Burner numbers on the device. When a temporary number expires it is ‘burned’ and removed from this table. A user can also manually burn a number by hitting the burn button and agreeing to the following warning:

[Screenshot: burner_warning]

One interesting note about the Z_PK field is that primary keys don’t seem to be reused. Even when there were no entries in the table, a newly created number was assigned the Z_PK value of 4, which (correctly) indicated that I had used three previous numbers on that device.

Other fields of interest in this table include:

  • ZNUMBER: The temporary phone number associated with this burner entry
  • ZFRIENDLYNUMBER: The ZNUMBER field in a more human-readable format
  • ZABOUT: The user-provided nickname for the temporary burner number
  • ZCREATED: The date/time the burner number was created
  • ZEXPIRES: The date/time the burner number will expire
  • ZUPDATED: The date/time the burner number was updated

All timestamps within the database are in UTC and use Mac absolute time, which is the number of seconds elapsed since 1/1/2001. The following Python code performs the conversion to a human-readable format.

import datetime

bmd = datetime.datetime(2001, 1, 1, 0, 0, 0)  # Mac absolute time epoch

realtime = bmd + datetime.timedelta(seconds=int_dtn)  # int_dtn is the timestamp from the database field

The most interesting table in the database is the ZCALLITEM table. On my phone, the ZCALLITEM table contains call records and SMS text content from numbers which expired over a month ago, but is missing content from numbers which I manually burned earlier this week. It appears that numbers which burn themselves upon expiration leave their activity history intact in the ZCALLITEM table, while manually burned numbers have their content removed.

As with the ZBURNER table, you can discover that content has been removed by looking at the primary key in the Z_PK field. In the picture below you can clearly see that two records are missing. Each of these records was an SMS message sent with a Burner number which I then manually burned.
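Spotting these gaps can be automated. A minimal sketch, assuming Z_PK values are assigned sequentially and never reused, as observed on my device:

```python
def missing_primary_keys(pks):
    """Return primary key values missing from an otherwise sequential list.

    Burner's Z_PK values appear to be assigned sequentially and never
    reused, so gaps suggest records that were removed (e.g. activity from
    manually burned numbers).
    """
    if not pks:
        return []
    present = set(pks)
    return [pk for pk in range(min(pks), max(pks) + 1) if pk not in present]
```

For example, `missing_primary_keys([1, 2, 3, 6])` returns `[4, 5]`, flagging two removed records.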

[Screenshot: missing_callitem]

Noteworthy fields in the ZCALLITEM table include:

  • ZDATE: The date/time of the activity
  • ZTYPE: The activity type. Options include sms, outbound_sms, outbound and call
  • ZCALLITEMTOINBOUNDNUMBER: Foreign key to the ZINBOUNDNUMBER table, which stores the phone number that calls or SMS messages were placed to or received from. This is obviously a key field.
  • ZBODY: For SMS messages this field contains the content of the message. For missed calls where a voicemail is left, this field contains a link to the voicemail message. I’ll discuss voicemail more below.
  • ZCONNECTED: This field is ‘1’ for SMS messages or calls which connected. If a call was missed then this field is ‘0’.
  • ZCALLITEMTOBURNER: This field matches activity from this table to the ZBURNER table by primary key. If no entry exists in the ZBURNER table then this field will be blank. You might still be able to associate activity from the ZCALLITEM table with a specific number by pairing it with traditional iOS forensics data.

In the Oven Mitt screenshot below you can see that, according to the Burner.sqlite database, Burner was used to place outgoing calls to a number in the 520 area code on 7/9/2013 at 18:40 and 18:44. The table contains the numbers that were called but not the Burner number being used at the time, because that number was subsequently burned and removed from the ZBURNER table.

[Screenshot: outgoing_calls]

If an examiner wanted to determine the temporary burner number that was being used at that time, s/he could use a traditional iOS forensics tool to examine the call logs for any records matching up with the timestamps from Burner.

In the screenshot below from a commercial forensics tool, you can see two outgoing calls to a 202 number with timestamps that match the Burner records perfectly. The examiner now knows both the 520 numbers that were called and the temporary 202 number that Burner used to place the calls.

[Screenshot: outgoing_burn_num]

Voicemail:

As I mentioned earlier, the ZBODY field in the ZCALLITEM table contains a link to an audio file with the voicemail for any missed calls where a voicemail was left. A missed call from 7/10/2013 generated a link from twilio.com, while a missed call from 7/18/2013 used a link pointing to s3.amazonaws.com. I’m not sure if Burner rotates between various services or switched services this past week. Burner also stores a local copy of the audio in the /var/mobile/Applications/&lt;varies&gt;/ directory.

“Oven Mitt” was written in Python 2.7, uses only the standard library and can be downloaded here. The script is designed to be placed in a directory with the Burner.sqlite file. When run, the script will parse the contents of the ZBURNER, ZCALLITEM and ZINBOUNDNUMBER tables and generate an HTML report that shows the burner numbers currently active on the phone and the history from the ZCALLITEM table in a human-readable format.

Although this is far from a conclusive study, I wanted to take a quick look at Burner artifacts on the iPhone and create a script to help parse the contents. I encourage you to manually verify any results, as I only had access to one device for testing. Also, if there’s any interest, I can try to procure an Android device and take a look at the artifacts on that platform.

If you have any feedback or questions feel free to shoot me an email using the contact form.

Python Tool for Parsing Data from Rand McNally GPS units

I recently encountered a Rand McNally Intelliroute TND 720 GPS unit, and none of the commercial forensic tools had the ability to acquire data from the device, so I imaged the device and poked around for any interesting data files.

I found a file called DestHistory.txt which obviously piqued my interest. I opened the file in Notepad, and while it contained a lot of unusual characters, it also contained multiple recent destinations sandwiched between those characters.

I wrote a small Python script which takes the contents of DestHistory.txt and parses it into both an HTML report and a KML file which can be opened in Google Earth. The tool is called rmparse and can be downloaded here.

The project was fairly straightforward. The only hiccup was that DestHistory.txt is in Unicode (UTF-16) format, so when my script parsed the file there was a null between every character. I tried a standard B = A.replace(" ", "") command but had no luck. I ended up using B = A.replace("\x00", "") and it worked like a champ.

The rest of the script was just parsing out the relevant portions, sticking them back together and then wrapping them in HTML and XML.
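The KML side of that wrapping is equally small. A sketch of the structure (element layout per the KML 2.2 schema; note that KML orders coordinates longitude-first):

```python
def kml_placemark(name, lat, lon):
    """Return a KML Placemark; KML coordinates are lon,lat order."""
    return ("<Placemark><name>%s</name>"
            "<Point><coordinates>%f,%f</coordinates></Point>"
            "</Placemark>" % (name, lon, lat))

def kml_document(placemarks):
    """Wrap Placemark strings in a minimal KML document for Google Earth."""
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
            + "\n".join(placemarks) +
            "\n</Document></kml>")
```

Getting the lon,lat ordering right is the classic KML gotcha; swapping them drops every pin in the wrong hemisphere.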

A few notes about this script:

The script has only been tested with the listed model. If it does or does not work with any other model, feel free to post that in the comments.

The source file name, DestHistory.txt, has been hard-coded so the tool can be run from a GUI. Copy the DestHistory.txt to a directory, place the script in the same directory and run the script. It will generate the HTML report and KML file in that directory.

The script takes the lat/long coordinates from the source file, splits them, adds a period in the proper location and sticks them back together. The function that performs this assumes that the coordinates are in North America. It would be a simple modification to adjust that function to another region.
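As an illustration of that splitting-and-rejoining step, here is a sketch of the decimal insertion. The raw digit format shown is my guess for illustration, not the actual DestHistory.txt layout:

```python
def insert_decimal(raw, int_digits):
    """Insert a decimal point after int_digits digits, preserving a sign.

    Hypothetical raw value: "40748817" -> "40.748817", i.e. a latitude
    with two integer digits, as the North America assumption implies.
    """
    sign = ""
    if raw.startswith("-"):
        sign, raw = "-", raw[1:]
    return sign + raw[:int_digits] + "." + raw[int_digits:]
```

Adapting the function to another region mostly means changing how many integer digits the latitude and longitude are assumed to carry.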

Quick Update and Minor Tool Announcement

June was a fairly busy month as I knocked out my GISP and CEH. The GISP required no extra study on my part as I had just finished my CISSP exam and it’s basically an open-book CISSP. The GISP questions were more technical than the CISSP versions, which honestly made the test easier. Well, that and the open books 🙂

The CEH is fairly straightforward, with a lot of tool-specific questions, port-related questions and scenarios which test your basic network security knowledge.

The CEH was a nice one to get out of the way and the GWAPT should be the next one on my list. I just finished going through the SANS SEC 542 course in the On-Demand format and will now start spending some time with the course exercises and creating my index.

If anyone has any specific questions on my GISP or CEH prep please feel free to ask.

On another note, I recently encountered a Rand McNally GPS unit which no commercial forensic tool I had access to was able to parse. I wrote a small Python script which parses the destination history file and creates an HTML report and a KML file for Google Earth display. The tool is working, but there are a few tweaks I’d still like to make to the KML structure. I’m planning on releasing the tool for public consumption later this week.