
On the Fifth Day of APOLLO, My True Love Gave to Me – A Stocking Full of Random Junk, Some of Which Might be Useful!

Today we go over one of the stranger databases on iOS, the Aggregate Dictionary database, or ADDataStore.sqlitedb. This database is only available with a physical file system dump, in the /private/var/mobile/Library/AggregateDictionary/ directory. It also stores its data differently: instead of the “this event happened at this time” model used by many of the other databases I’ve gone over, this one aggregates data for the last seven days. It only records data on a per-day basis, so the APOLLO modules will only store a day timestamp.

This database is good for finding random bits of useful data; not every case will need it, but you might be surprised by what it is tracking. The Distributed Keys and the Scalars each keep track of seemingly obscure items. In my opinion, the Scalars are more interesting. My example database has 5,398 unique keys for Scalars and 795 keys for the Distributed table. I can’t even begin to show you everything this database tracks; it’s best to take a look at one yourself to see what might help in your own investigations. I will focus here on a few of the more interesting Scalars entries.
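If you want to poke around a copy yourself, a minimal Python sketch like the one below will dump Scalars keys matching a keyword. I am assuming a Scalars table with daysSince1970, key, and value columns from memory of my own copy, so verify the schema on your own database before trusting the query.

import sqlite3

# Assumed column names -- check the schema of your own ADDataStore.sqlitedb first.
QUERY = """
    SELECT DATE(daysSince1970 * 86400, 'unixepoch') AS day, key, value
    FROM Scalars
    WHERE key LIKE ?
    ORDER BY day
"""

conn = sqlite3.connect("ADDataStore.sqlitedb")
for day, key, value in conn.execute(QUERY, ("%CarPlay%",)):
    print(day, key, value)
conn.close()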

I’ve previously written about this database with respect to Pincodes, Passcodes, and Touch ID.

APOLLO has only two modules for the Aggregate Dictionary. Since these use only a per-day timestamp, I’ve made them easy to filter out, or you can simply choose not to run these modules; they can look a tad messy in the final output.

The first item I pulled out of the stocking is CarPlay data. The example below shows how many cars I’ve connected using CarPlay in com.apple.CarPlay.VehicleCount; I’ve connected this device to three different cars. I’m still testing the difference between the *.CarPlayCar.* and *.CarPlayPhone.* keys, but I would guess that Activations has something to do with how many times the app was selected, while ActiveTime is how long the app was in use.

The next item has to do with the Messages application. This app is collecting metrics on the different types of items sent and received over SMS or iMessage protocols.

The Settings application (com.apple.Preferences) keeps track of which settings were viewed. The example below shows I viewed the Bluetooth menu twice on the 10th and the privacy settings once on the 14th. I encourage everyone to search this output for the various bundle IDs of interest – you never know what the apps are going to store!

The Clock app (com.apple.MobileTimer) keeps track of how many alarms the device has set, whether any are active, how many repeat, and how many are named. Some apps are better at storing data like this than others.

Safari records the number of tabs open on a particular day.

The Photos app (com.apple.mobileslideshow) keeps track of how many photos there are in the albums.

Curious how many times a device was plugged in during a day? Try the com.apple.power.state.pluggedin.count key. We will revisit this action in an upcoming device state article.

When I say obscure, I mean obscure – it also keeps track of button presses…because why not!

Some settings are also stored in this database. You can filter for a couple of keywords: enabled or disabled. I’ve provided a screenshot of both. For the “Enabled” keys, I chose to filter on the main interface, Springboard (com.apple.Springboard). The value is binary, meaning on (1) or off (0). This is different for the “Disabled” keys.

If a key has the term “disabled” in it, you have to think in the opposite direction. For example, all of these Accessibility features are actually turned on, or enabled: if the disabled setting is 0, it means the feature is turned on (you can see many of these settings in the iOS screenshot, Shake to Undo and Vibration for example).
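To keep that inverted logic straight, a tiny helper like this (purely illustrative, not part of APOLLO; the key in the example call is made up) can translate a key/value pair into plain on/off terms:

def setting_state(key, value):
    # "disabled" keys are inverted: a value of 0 means the feature is ON.
    if "disabled" in key.lower():
        return "on" if value == 0 else "off"
    # "enabled" keys read naturally: 1 means ON, 0 means OFF.
    return "on" if value == 1 else "off"

print(setting_state("com.apple.Accessibility.ShakeToUndoDisabled", 0))  # -> on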

Finally, in the toe of the stocking we can get some data usage information. I’ve filtered here by the Twitter application (com.atebits.Tweetie2). The screenshot shows how much data was transferred in kilobytes over Wi-Fi and/or WWAN – incoming or outgoing.


On the Fourth Day of APOLLO, My True Love Gave to Me – Media Analysis to Prove You Listened to “All I Want for Christmas is You” Over and Over Since Before Thanksgiving

The fourth day brings us media artifacts using the knowledgeC.db and CurrentPowerlog.PLSQL databases. Each stores similar, yet somewhat different, records when it comes to audio and video usage.

Let’s get in the mood!

KnowledgeC.db 

Starting with the knowledgeC.db database we get an idea of audio inputs and outputs. Using the knowledge_audio_input_route and knowledge_audio_output_route modules we can extract this data.
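At their core, both modules run a query against the ZOBJECT table along these lines. This is a simplified sketch: the /audio/inputRoute and /audio/outputRoute stream names are what I have seen on iOS 11/12, so check the ZSTREAMNAME values in your own copy.

import sqlite3

# Timestamps are Mac absolute time, hence the 978307200-second shift to the Unix epoch.
QUERY = """
    SELECT DATETIME(ZSTARTDATE + 978307200, 'unixepoch') AS start_time,
           DATETIME(ZENDDATE + 978307200, 'unixepoch')   AS end_time,
           ZVALUESTRING                                   AS route,
           ZSTREAMNAME                                    AS stream
    FROM ZOBJECT
    WHERE ZSTREAMNAME IN ('/audio/inputRoute', '/audio/outputRoute')
    ORDER BY ZSTARTDATE
"""

conn = sqlite3.connect("knowledgeC.db")
for row in conn.execute(QUERY):
    print(row)
conn.close()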

The example below shows audio inputs. Two different devices can be seen in the screenshot: a CarPlay device and my AirPods.

In the output example we see the same two devices. In both cases you may see the route bouncing back and forth as devices are being used. For example, if you are using CarPlay rocking out to WHAM’s Last Christmas, it will bounce between playing your music and using the microphone/speaker to listen to or dictate messages as they come in.

The knowledge_audio_media_nowplaying module is where the real truth of your guilty musical pleasures makes an appearance. It provides a detailed listing of what the user was listening to at a particular time. The example below shows me listening to music, podcasts, and audiobooks. There is no denying that you started listening to holiday music entirely too early!

CurrentPowerlog.PLSQL

The powerlog_app_nowplaying module sounds like it should extract the same information as above; however, it differs in that it only shows the bundle ID of the application used to play the media.

More context can be found by using the powerlog_app_audio module. This module will extract the app/service name and/or bundle ID for the app using the audio function. In the screenshot below, I received a phone call and a bit later connected the iPhone to my car to listen to music. During that time, I was also using Siri (assistantd) to dictate messages. Fortunately for me, this database doesn’t show what I was listening to! 🤭

Similar to the module above is the powerlog_audio_routing module. This shows where the audio was routed: the Speaker, Receiver, or Car in this example.

Finally, the Powerlog populates a table just for videos. The powerlog_video module will show the bundle ID of the application the video was played with, but not what video was playing.

On the Third Day of APOLLO, My True Love Gave to Me – Application Usage to Determine Who Has Been Naughty or Nice

On this third day, we will focus on application usage. We will cover three databases:

  • KnowledgeC.db

    • Be sure to check out more detailed information on this database in my two previous articles.

    • Access to this database is limited to a file system dump; it is located in /private/var/mobile/Library/CoreDuet/Knowledge/KnowledgeC.db

  •  InteractionC.db

    • Access to this database is limited to a file system dump; it is located in /private/var/mobile/Library/CoreDuet/People/InteractionC.db

  • CurrentPowerlog.PLSQL

    • Access to this database is usually limited to a file system dump, but I recently learned that if you can perform a sysdiagnose on the iOS device, you can get this log as well! This method is certainly not forensically sound, but it will get you access to the database. You will have to rename this database from something like “powerlog_YYYY-MM-DD_##-##_########.PLSQL” to “CurrentPowerlog.PLSQL” to parse it with APOLLO (see the renaming sketch just after this list). I have plans to update the script to support this naming scheme.

    • In a file system dump, you can find this database in /private/var/containers/Shared/SystemGroup/<GUID>/Library/BatteryLife/CurrentPowerlog.PLSQL

      • It is worth a mention here that additional historical Gzipped PowerLog databases may be found in the /Archives directory in the path above. (My APOLLO script does not handle auto unarchiving of these yet, but it is planned for the future.)
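As noted in the CurrentPowerlog.PLSQL item above, a sysdiagnose copy has to be renamed before APOLLO will recognize it. A throwaway Python sketch along these lines will do it; the directory names are placeholders, and I copy rather than rename so the original sysdiagnose output stays untouched.

import glob
import shutil

# The glob pattern mirrors the powerlog_YYYY-MM-DD_##-##_########.PLSQL naming scheme.
matches = glob.glob("sysdiagnose_extract/powerlog_*.PLSQL")
if matches:
    shutil.copy2(matches[0], "data_dir/CurrentPowerlog.PLSQL")
    print("Copied", matches[0])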

Application Information, Installation, & Deletion

Let’s start by getting some basic app information. The Powerlog has a nice listing of iOS applications with their app name, executable name, bundle ID, version, and app type. These entries are not exactly an ongoing log file, but the listing does get updated frequently. The app type column can tell you what type of application it is:

  •  1 – Background iOS Service

  • 3 – iOS Native App

  • 4 – 3rdParty App

These entries will also note whether the app has been deleted. This listing can be extracted using the powerlog_app_info module.

I have a very similar module, powerlog_app_deletion, that uses the application deletion time as its key to get just the recently deleted applications.
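If you want to eyeball the underlying table yourself, the query behind these two modules boils down to something like the sketch below. The PLAPPLICATIONAGENT_EVENTNONE_ALLAPPS table and column names are what I recall from my own iOS 11/12 copies; Powerlog schemas shift between versions, so confirm them against your database first.

import sqlite3

# Assumed table/column names -- verify against your own CurrentPowerlog.PLSQL copy.
QUERY = """
    SELECT DATETIME(TIMESTAMP, 'unixepoch') AS recorded,
           APPNAME, APPEXECUTABLE, APPBUNDLEID, APPTYPE,
           CASE APPDELETEDDATE
                WHEN 0 THEN 'NOT DELETED'
                ELSE DATETIME(APPDELETEDDATE, 'unixepoch')
           END AS deleted_date
    FROM PLAPPLICATIONAGENT_EVENTNONE_ALLAPPS
"""

conn = sqlite3.connect("CurrentPowerlog.PLSQL")
for row in conn.execute(QUERY):
    print(row)
conn.close()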

To get a listing of recent application installs, we need to go to the KnowledgeC.db database. You will see in this blog series that you cannot rely on one database for all the answers. The module knowledge_app_install will extract the time of install and Bundle ID for the newly installed app. 

Application Play by Play 

Knowing what application was being used at any given moment, down to the second, can be a huge forensic win. Fortunately, we have a couple of options, each a little different.

The first is the KnowledgeC.db database. The knowledge_app_inFocus module shows the app bundle ID, day of the week, GMT offset, and the start/end of the usage time, from which I calculate the usage time in seconds.
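The usage time is simply the end date minus the start date. A stripped-down version of the query looks roughly like this (again, the /app/inFocus stream name is what I have observed on iOS 11/12; verify it on your own database):

import sqlite3

QUERY = """
    SELECT DATETIME(ZSTARTDATE + 978307200, 'unixepoch') AS start_time,
           DATETIME(ZENDDATE + 978307200, 'unixepoch')   AS end_time,
           ZVALUESTRING                                   AS bundle_id,
           (ZENDDATE - ZSTARTDATE)                        AS usage_seconds
    FROM ZOBJECT
    WHERE ZSTREAMNAME = '/app/inFocus'
    ORDER BY ZSTARTDATE
"""

conn = sqlite3.connect("knowledgeC.db")
for row in conn.execute(QUERY):
    print(row)
conn.close()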

Another option is the powerlog_app_usage module which uses the CurrentPowerlog.PLSQL database. The output contains the app Bundle ID and a few other metadata items.

You will notice a few additional columns in the output:

  • ORIGINAL_SCREEN_STATE_TIMESTAMP

  • OFFSET_TIMESTAMP

  • TIME_OFFSET

These additional columns are needed because the timestamps must be normalized. The original timestamp is stored with an offset, sometimes in the future, sometimes in the past. This offset is stored in the PLSTORAGEOPERATOR_EVENTFORWARD_TIMEOFFSET table in the CurrentPowerlog.PLSQL database and is used in many of the Powerlog tables. I want to warn other investigators not to always believe the timestamp provided in a database. You MUST test this; I’ve seen offsets from as little as a few seconds to as much as 15 minutes. In my queries I’ve done my best to test what I can, but I’m sure I’ve missed some, and they may change per iOS version (we will see an instance of this in the next module). Notice I missed one? Please let me know!
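In plain arithmetic the normalization is just the raw timestamp plus the recorded offset. A toy illustration (the offset value here is made up for the example):

from datetime import datetime

def adjust(raw_unix_ts, offset_seconds):
    # ADJUSTED_TIMESTAMP = ORIGINAL_TIMESTAMP + TIME_OFFSET, rendered as UTC.
    return datetime.utcfromtimestamp(raw_unix_ts + offset_seconds)

# Example: a record whose raw timestamp is nine minutes behind the true time.
print(adjust(1545062400, 540))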

The powerlog_app_usage_by_hour and powerlog_app_usage_by_hour_iOS12 modules extract application usage on a per-hour basis. During the research for this blog article, I was using my own iOS 12 Powerlog output from sysdiagnose to test differences with iOS 12. I came across one issue with this particular module: I determined that in iOS 12 the timestamp in this table now implements the same time offset that we saw above, whereas on iOS 11 (and older) it did not. Soon I will implement a function in the script to determine which queries to run for which iOS version; for now it will run both, so please be aware of this.

Below is an example from iOS 11.

Below is an example from iOS 12 with the time offset applied (offset columns are hidden from view). Not shown in the screenshot is a bug where the applied time offset may show the ADJUSTED_TIMESTAMP off by a single second for earlier records. If anyone has recommendations on a better SQLite query to perform this offset adjustment, drop me a line!

These modules extract an application’s usage on a per-hour basis. The query will extract the app bundle ID, screen and background time, and additional background usage with audio and location.

Context Providers 

The next step is to provide more context to each application being used. We have quite a few modules for this.

The InteractionC.db database contains lots of information for correlating contact information with application activity. The screenshot below shows only a portion of the contact data for each record, but it should give a good indication of which email was being read or which contact was being texted. This module is named interaction_contact_interactions.

Curious what the user was browsing in Safari? Try the knowledge_safari_browsing module. There are a few catches to this one: it will not record private browsing, and entries will be removed when the history is cleared (either by the system or the user).

Using the application activity streams in the KnowledgeC.db database, we can extract more information with the knowledge_app_activity module to determine items like Apple Maps directions, Mailbox or Message views, specific third-party application usage, and other native Apple iOS app activities.

I created a similar module, knowledge_app_calendar_activity, specifically for Calendar activity since it has a few more calendar-related metadata items associated with it.

Again using the KnowledgeC.db for context, I can use the knowledge_app_intents module to determine more details of application usage, including sent messages, map searches and navigation, Safari browsing, and phone calls. A keen eye will notice that the data blob in the SERIALIZED INTERACTION (HEX) column starts with the hex of a binary plist. This binary plist contains more specific contact information, such as who a message was sent to or which contact was called. I output it in hex so it can easily be copied into a hex editor or saved off as a plist file.
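If you would rather crack those blobs open in Python than in a hex editor, plistlib will take the hex straight from that column. Keep in mind the blob is an NSKeyedArchiver-style plist, so expect to dig through the keyed-archive structure after this step; the hex string in the commented example is just a placeholder.

import plistlib

def decode_serialized_interaction(hex_blob):
    # Convert the hex string from the SERIALIZED INTERACTION (HEX) column back to bytes,
    # then parse the binary plist. The result is the raw keyed-archive dictionary.
    return plistlib.loads(bytes.fromhex(hex_blob))

# archive = decode_serialized_interaction("62706c6973743030...")  # paste the full hex here
# print(archive)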

Finally, we can use the knowledge_application_portrait module to get even more contact information for messages, or account information for an associated email account. The items in GROUP ID can vary, but if it is a GUID it will usually correspond to a specific email account that can be found in the user’s Accounts database.

On the First Day of APOLLO, My True Love Gave to Me - A Python Script – An Introduction to the Apple Pattern of Life Lazy Output’er (APOLLO) Blog Series

I originally released APOLLO at the Objective by the Sea conference in early November. Since then I’ve received a surprising amount of positive feedback from analysts using this tool or the accompanying SQL queries on their file system dumps to help with a variety of investigations.

It is now time for a proper introduction. I will present this to you in what I’d like to call “The Twelve Days of APOLLO,” a (very, very loosely) holiday-themed blog series starting today!

Why

APOLLO stands for Apple Pattern of Life Lazy Output’er. I created this tool to easily correlate multiple databases, with hundreds of thousands of records, into a timeline that lets the analyst (me, mostly) tell what has happened on a device.

iOS (and macOS) have these absolutely fantastic databases that I’ve been using for years with my own personal collection of SQL queries. This is also a way for me to share my own research and queries with the community. Many of these queries have taken hours, even days, to research and compile into something useful.

My goal with this script is to put the analysis function in the SQL query itself. Each query will output a different part of the puzzle. The script itself just compiles the data into a CSV or SQLite database for viewing and filtering. While this database/spreadsheet can get very large, it is still more efficient than running queries on multiple databases and compiling the data into a timeline manually.

Because this script is based entirely on investigator-provided queries, it is highly customizable. If an investigator only wants health data, they can elect to run only those query modules. Each query can be customized by the investigator relatively easily; if you don’t need a column, remove it or comment it out! I’ve uploaded many queries that I think most investigators will find useful. It is my hope that other investigators create their own and share them with the community.

How

The script is a simple Python script that ingests what I’m calling modules. Each module is a single SQL query that pulls specific data from a database. The module also carries some metadata describing what the SQL query extracts and how it should be timelined.

The module is a text file that contains a few required items:

  • DATABASE – The exact name of the database to perform the SQL query on.

  • ACTIVITY – What this particular record is categorized as.

  • KEY_TIMESTAMP – The timestamp to be used as the key for timelining.

  • SQL Query – The query that is performed on the database. The query should extract one specific type of record from the database; thus a database may have many modules, each producing a very specific output.

Depending on what the user is looking for, they can run one, some, or all of the modules, and the output goes to either a CSV file or a SQLite database.
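To make that concrete, here is what a made-up module could look like in the ConfigParser-style format the script ingests, along with how Python reads it back. The section and key names here are my own placeholders; copy the layout from one of the modules shipped with APOLLO rather than from this sketch.

import configparser

# Illustrative module text -- placeholder section/key names, not APOLLO's exact format.
MODULE_TEXT = """
[Module Metadata]
DATABASE = knowledgeC.db
ACTIVITY = KnowledgeC App In Focus
KEY_TIMESTAMP = START

[Query]
QUERY = SELECT ZVALUESTRING FROM ZOBJECT WHERE ZSTREAMNAME = '/app/inFocus'
"""

parser = configparser.ConfigParser()
parser.read_string(MODULE_TEXT)
print(parser["Module Metadata"]["DATABASE"], "->", parser["Query"]["QUERY"])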

Example Usage

The script is a simple Python script that takes only a few arguments.

python apollo.py -output {csv, sql} <modules directory> <data directory>

There are two output options: a CSV file or a SQLite database. Please let me know if other outputs are required. Following that are the path to the modules directory and the path to the data directory. The data directory can be a single directory full of the databases to be parsed or a full file system dump; the script will find the databases by the name given in the module.
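“Find the databases by name” is conceptually just a walk of the data directory. The sketch below is a simplified stand-in for what the script does, not a copy of its code:

import fnmatch
import os

def find_database(data_dir, database_name):
    # Walk the file system dump and return every path whose filename matches the
    # DATABASE value from the module (e.g. "knowledgeC.db").
    hits = []
    for root, _dirs, files in os.walk(data_dir):
        for name in files:
            if fnmatch.fnmatch(name, database_name):
                hits.append(os.path.join(root, name))
    return hits

print(find_database("/path/to/filesystem_dump", "knowledgeC.db"))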

An example of the output is seen below. Each SQL query run will place its particular results, per query and database, in the Output column. In the example we see everything from health steps/distance, to location, to application usage. The Key column contains the timestamp that each record is organized on, and the Activity column contains the type of record. The Database and Module columns show which database and module produced that information.

Challenges 

One of the main challenges with these pattern-of-life databases is access. Most of the really good, forensically useful POL data is not easily accessible, particularly on iOS devices. With macOS devices we may need to deal with FileVault encryption or database-level encryption. This script assumes you have good data to work with; some databases may be available, some may not be.

Feedback

While I primarily created this script for my own use (don’t we all), I am open to feedback. If you want something other than ConfigParser module files, let me know. If you need different output formats, let me know. I will consider all feedback.

I would appreciate all the help I can get. I’ve primarily tested these queries with a focus on iOS 11; however, I know many will work on older versions, and some have been tested with available data from iOS 12 (iOS Health). I’ve also tested running the script on macOS with Python 2.7 installed; it may not run as expected on other platforms. (FWIW: I will upgrade to Python 3 when macOS does.)

The script was just updated to be more efficient by Sam Alptekin of @sjc_CyberCrimes. Sam made the script much faster when running against full file system dumps; he took the run time down from hours to mere minutes. Thanks, Sam!

The Next Eleven Days

Each day will have a different topic that will guide you through the usefulness of the APOLLO framework. I focus primarily on iOS in these articles, but many of the queries can be ported over to macOS as well. I do intend to work on this in the future; I will discuss more on planned improvements later.

I will cover all sorts of topics in the next couple of weeks.

  • Device State

  • Media

  • Health

  • GUI Artifacts

  • Network and Communications

  • Connections

  • Application Usage

  • So much more!

Get APOLLO here and take it for a spin!

Day 2: On the Second Day of APOLLO, My True Love Gave to Me - Holiday Treats and a Trip to the Gym - A Look at iOS Health Data

Day 3: On the Third Day of APOLLO, My True Love Gave to Me – Application Usage to Determine Who Has Been Naughty or Nice

Day 4: On the Fourth Day of APOLLO, My True Love Gave to Me – Media Analysis to Prove You Listened to “All I Want for Christmas is You” Over and Over Since Before Thanksgiving

Day 5: On the Fifth Day of APOLLO, My True Love Gave to Me – A Stocking Full of Random Junk, Some of Which Might be Useful!

Day 6: On the Sixth Day of APOLLO, My True Love Gave to Me – Blinky Things with Buttons – Device Status Analysis

Day 7: On the Seventh Day of APOLLO, My True Love Gave to Me – A Good Conversation – Analysis of Communications and Data Usage

Day 8: On the Eighth Day of APOLLO, My True Love Gave to Me – A Glorious Lightshow – Analysis of Device Connections

Day 9: On the Ninth Day of APOLLO, My True Love Gave to Me – A Beautiful Portrait – Analysis of the iOS Interface

Day 10: On the Tenth Day of APOLLO, My True Love Gave to Me – An Oddly Detailed Map of My Recent Travels – iOS Location Analysis

Day 11: On the Eleventh Day of APOLLO, My True Love Gave to Me – An Intriguing Story – Putting it All Together: A Day in the Life of My iPhone using APOLLO

Day 12: On the Twelfth Day of APOLLO, My True Love Gave to Me – A To Do List – Twelve Planned Improvements to APOLLO

Slides and Script! From Apple Seeds to Apple Pie & Introducing APOLLO: The Apple Pattern of Life Lazy Output'er

I had the privilege and honor to present at the first-ever Objective by the Sea Mac security conference yesterday in Maui (hardship, right?). It was only the first day and it was absolutely spectacular; I may have to make this one a regular! I can easily recommend attending this conference.

I presented From Apple Seeds to Apple Pie - an Apple Pattern of Life talk (mostly focused on iOS devices). You can find the slides in my Resources section.

I also just released a (very) beta version of APOLLO (Apple Pattern of Life Lazy Output’er) on my GitHub page. The TL;DR of the script: take all the creepy databases that Apple writes events to, perform individual SQL queries on them to pull out investigatively useful data, and combine the results into another SQLite database for easier/quicker analysis and correlation.

This script and its modules are still in the testing phase, so please be careful when using this on real cases. Expect more modules and testing to be released; I’m holding some back due to timestamp issues, and others are only partially written up.