Follow-on to DFIR Summit Talk: Lucky (iOS) 13: Time To Press Your Bets (via @bizzybarney)

Facial Recognition in Photos

One facet of my DFIR Summit talk I want to expand upon is a look into the Photos application, and a few of the derivative pieces of that endeavor.  While trying to focus on the topic of facial recognition, it seemed prudent to include a brief progression from snapping a photo through to a person's name being placed beside their face in the Photos application.

When you use the native Camera app and snap a photo, depending on user options, at least a few standard things occur.  The newly taken photo is ultimately written to /private/var/mobile/Media/DCIM/1**APPLE/IMG_0001.HEIC / .JPG.  As the photo is taken, the Photos.sqlite database is updated to reflect much of the metadata about the photo, which will be covered a bit later.  Additionally, the "PreviewWellImage.tiff" is created.  The "PreviewWellImage.tiff" is the photo you see when you open the Photos application and see a preview of the most recent image, which in this instance is the photo just taken by the camera.

The user's first photos reside in the ../100APPLE/ directory, but this directory iterates upward (101APPLE, 102APPLE, etc.) as more and more photos and videos are saved.  If iCloud syncing is turned on by the user, then several other behaviors occur - but that is for another time.

Let’s focus on the analysis and intelligence built into the Photos application.  I’m able to type text strings and my photos are immediately searched for matching objects within them.  There is a section of my Photos dedicated to “People” where a name has been associated with a face that Apple has analyzed.  

Some of the analysis occurring within Photos happens in the mediaanalysis.db file.  This database analyzes and scores the media files and produces results that seem to feed into other pieces of the analysis.  Some scoring results to highlight are the ones that focus on humans, faces, and pets.

Path: /private/var/mobile/Media/MediaAnalysis/mediaanalysis.db

The ‘Results’ table of the mediaanalysis.db file contains the ‘assetId’, which represents a media file; a ‘resultsType’, which is the specific analytical type and score value found in that media file; and a BLOB (binary large object), which is a binary plist (bplist).  You can see in the image below that ‘assetId’ 3 has numerous ‘resultsType’ values associated with it; the BLOB for ‘resultsType’ 1 is selected, and the bplist is shown on the right.

That BLOB can be examined by saving it out as a file and then using ‘plutil’ to print it to a text file. As you can see below beside the red star, the printed text is a clean presentation of that bplist, and it tells us that ‘resultsType’ 1 is associated with faces based on the scoring.
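For reference, a minimal command-line sketch of that step (results_type1.bplist is just a hypothetical name for the exported BLOB):

plutil -p results_type1.bplist > results_type1.txt

plutil -p prints the binary plist in a human-readable form, and the redirect captures it in a text file. Alternatively, plutil -convert xml1 results_type1.bplist produces an XML version of the same data.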

I repeated that process for the remaining pieces of data and wrote a brief description of the results type for each, although my device still had a few types that did not have any results.

After running a SQL query against this database, you can sort the results to see just the files that have the results types for ‘humanBounds’ and ‘humanConfidence’.  The ‘localIdentifier’ column from the ‘assets’ table is a UUID which matches up to the ZGENERICASSET table of Photos.sqlite.

Here is the query for the mediaanalysis.db file.  It’s big and ugly, but please test it out if you’re interested.  This piece just seems to build toward what we will see later in Photos.sqlite, where this all comes together.

select
	a.id,
	a.localIdentifier as "Local Identifier",
	a.analysisTypes as "Analysis Types",
	datetime(a.dateModified+978307200, 'unixepoch') as "Date Modified (UTC)",
	datetime(a.dateAnalyzed+978307200, 'unixepoch') as "Date Analyzed (UTC)",
CASE
	when results.resultsType = 1 then "Face Bounds / Position / Quality"
	when results.resultsType = 2 then "Shot Type"
	when results.resultsType = 3 then "Duration / Quality / Start"
	when results.resultsType = 4 then "Duration / Quality / Start"
	when results.resultsType = 5 then "Duration / Quality / Start"
	when results.resultsType = 6 then "Duration / Flags / Start"
	when results.resultsType = 7 then "Duration / Flags / Start"
	when results.resultsType = 15 then "Duration / Quality / Start"
	when results.resultsType = 19 then "Duration / Quality / Start"
	when results.resultsType = 22 then "Duration / Quality / Start"
	when results.resultsType = 23 then "Duration / Quality / Start"
	when results.resultsType = 24 then "Duration / Quality / Start"
	when results.resultsType = 25 then "Duration / Quality / Start"
	when results.resultsType = 27 then "Duration / Quality / Start"
	when results.resultsType = 36 then "Duration / Quality / Start"
	when results.resultsType = 37 then "Duration / Quality / Start"
	when results.resultsType = 38 then "Duration / Quality / Start"
	when results.resultsType = 39 then "Duration / Quality / Start"
	when results.resultsType = 48 then "Duration / Quality / Start"
	when results.resultsType = 8 then "UNK"
	when results.resultsType = 11 then "UNK"
	when results.resultsType = 13 then "UNK"
	when results.resultsType = 21 then "UNK"
	when results.resultsType = 26 then "UNK"
	when results.resultsType = 31 then "UNK"
	when results.resultsType = 42 then "UNK"
	when results.resultsType = 45 then "UNK"
	when results.resultsType = 49 then "UNK"
	when results.resultsType = 9 then "Attributes - junk"
	when results.resultsType = 10 then 'Attributes - sharpness'
	when results.resultsType = 12 then "Attributes - featureVector"
	when results.resultsType = 14 then "Attributes - Data"
	when results.resultsType = 16 then "Attributes - orientation"
	when results.resultsType = 17 then 'Quality'
	when results.resultsType = 18 then "Attributes - objectBounds"
	when results.resultsType = 20 then "Saliency Bounds and Confidence"
	when results.resultsType = 28 then "Attributes - faceId / facePrint"
	when results.resultsType = 29 then "Attributes - petsBounds and Confidence"
	when results.resultsType = 30 then "Various Scoring Values"
	when results.resultsType = 32 then "Attributes - bestPlaybackCrop"
	when results.resultsType = 33 then "Attributes - keyFrameScore / keyFrameTime"
	when results.resultsType = 34 then "Attributes - underExpose"
	when results.resultsType = 35 then "Attributes - longExposureSuggestionState / loopSuggestionState"
	when results.resultsType = 40 then "Attributes - petBounds and Confidence"
	when results.resultsType = 41 then "Attributes - humanBounds and Confidence"
	when results.resultsType = 43 then "Attributes - absoluteScore/ humanScore/ relativeScore"
	when results.resultsType = 44 then "Attributes - energyValues/ peakValues"
	when results.resultsType = 46 then "Attributes - sceneprint/ EspressoModelImagePrint"
	when results.resultsType = 47 then "Attributes - flashFired, sharpness, stillTime, texture"
end as "Results Type",
	hex(results.results) as "Results BLOB"
from assets a
left join results on results.assetId=a.id
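If you want to try it quickly from the command line, one approach (a sketch, assuming the query above has been saved to a file named mediaanalysis_results.sql - a name I made up) is to run it through the sqlite3 shell and export the output to CSV:

sqlite3 -header -csv mediaanalysis.db < mediaanalysis_results.sql > mediaanalysis_results.csv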

Before diving into the Photos.sqlite file, I first want to point out the file that records the text strings apparently produced by Apple’s object analysis of the photos.  This file stores the text results that power our ability to search text strings in the Photos application and return results.  The text strings are not necessarily a result of any specific user activity; instead, they are output from analysis Apple automatically runs against the user’s media files.

Path: /private/var/mobile/Media/PhotoData/Caches/search/psi.sqlite

The ‘word_embedding’ table within psi.sqlite contains the columns ‘word’ and ‘extended_word’, which are just strings stored in BLOBs.  Using DB Browser for SQLite, you can export the table to a CSV, and it prints the strings from the BLOBs pretty cleanly.  Separately, there is also a table named ‘collections’ that has ‘title’ and ‘subtitle’ columns which appear to be a history of the Memories and Categories that have been used, or ones that will be used.
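If you prefer the command line over DB Browser, a minimal sketch that simply casts those BLOBs to text (this assumes the BLOBs are stored as plain UTF-8 strings, which is how they appear) is:

sqlite3 -header -csv psi.sqlite "SELECT CAST(word AS TEXT) AS word, CAST(extended_word AS TEXT) AS extended_word FROM word_embedding;" > word_embedding.csv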

The last table in psi.sqlite to mention for this piece is the ‘groups’ table.  Within the groups table, the ‘content_string’ column contains some really interesting data.  I initially set out to find just the words “Green Bay”, as it was something populated in my text search for the letter “g”.  What I found was far more interesting.  I did find “Green Bay”, but I additionally found, in one of the other BLOBs, “Green Bay Packers vs. Miami Dolphins”.  That BLOB has a little extra flavor added by Apple for me.  Whether they simply used the geocoordinates baked into my photos from being at Lambeau Field, or analyzed the content of the photos and found the various Miami Dolphins jerseys - I’m not sure.  But it is a very interesting artifact, and absolutely accurate, dropped in there for me.  Thanks Apple!

Now let’s tackle Photos.sqlite, but only as it pertains to facial recognition and associating photos of people with an actual name.  Quite honestly, this single file would be nearly a full-time job if someone wanted to parse every inch of it and maintain that support.

Path: /private/var/mobile/Media/PhotoData/Photos.sqlite

My instance of Photos.sqlite is a beast, weighing in at over 300MB and containing 67 tables packed full of data about my Photos.  We are going to focus on two tables - ZDETECTEDFACE and ZPERSON.  

ZDETECTEDFACE 

This table contains values that indicate facial features, including an estimate of age, hair color, baldness, gender, eyeglasses, and facial hair.  Additionally, there are indicators for whether the left or right eyes were closed, and X and Y axis measurements for the left eye, right eye, mouth, and center.  The data in this table is extremely granular, and it was quite fun to work through.  Who doesn’t like looking at old photos?

ZPERSON

This table contains a count of the number of times Apple has been able to identify a certain face in the media files.  On my device, I am recognized by name in hundreds of photos, but there are also photos of me where it hasn’t associated my name with my face.  For each face identified, a UUID (universally unique identifier) is assigned.  So although the analytics piece may not be able to connect a face with a name, it can group all identified instances of an unknown face as being the same person.

If there is an association made between the person’s face and a saved contact, the ZCONTACTMATCHINGDICTIONARY column’s BLOB data can possibly reveal a full name and the phone number.  This again can be achieved by printing the bplist to a .txt file.
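A minimal sketch of that workflow from the command line (this assumes the sqlite3 shell, which builds in the writefile() helper; the output file names are made up):

sqlite3 Photos.sqlite "SELECT writefile('zperson_' || Z_PK || '.bplist', ZCONTACTMATCHINGDICTIONARY) FROM ZPERSON WHERE ZCONTACTMATCHINGDICTIONARY IS NOT NULL;"

plutil -p zperson_1.bplist > zperson_1.txt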

select
	zga.z_pk,
	zga.ZDIRECTORY as "Directory",
	zga.ZFILENAME as "File Name",
CASE
	when zga.ZFACEAREAPOINTS > 0 then "Yes"
	else "N/A"
	end as "Face Detected in Photo",
CASE 
	when zdf.ZAGETYPE = 1 then "Baby / Toddler"
	when zdf.ZAGETYPE = 2 then "Baby / Toddler"
	when zdf.ZAGETYPE = 3 then "Child / Young Adult"
	when zdf.ZAGETYPE = 4 then "Young Adult / Adult"
	when zdf.ZAGETYPE = 5 then "Adult"
end as "Age Type Estimate",
case
	when zdf.ZGENDERTYPE = 1 then "Male"
	when zdf.ZGENDERTYPE = 2 then "Female"
	else "UNK"
end as "Gender",
	zp.ZDISPLAYNAME as "Display Name", 
	zp.ZFULLNAME as "Full Name",
	zp.ZFACECOUNT as "Face Count",
CASE	
	when zdf.ZGLASSESTYPE = 3 then "None"
	when zdf.ZGLASSESTYPE = 2 then "Sun"
	when zdf.ZGLASSESTYPE = 1 then "Eye"
	else "UNK"
end as "Glasses Type",
CASE
	when zdf.ZFACIALHAIRTYPE = 1 then "None"
	when zdf.ZFACIALHAIRTYPE = 2 then "Beard / Mustache"
	when zdf.ZFACIALHAIRTYPE = 3 then "Goatee"
	when zdf.ZFACIALHAIRTYPE = 5 then "Stubble"
	else "UNK"
end as "Facial Hair Type",
CASE	
	when zdf.ZBALDTYPE = 2 then "Bald"
	when zdf.ZBALDTYPE = 3 then "Not Bald"
end as "Baldness",
CASE
	when zga.zlatitude = -180
	then 'N/A'
	else zga.ZLATITUDE
end as "Latitude",
CASE
	when zga.ZLONGITUDE = -180 
	then 'N/A' 
	else zga.ZLONGITUDE
end as "Longitude",
	datetime(zga.zaddeddate+978307200, 'unixepoch') as "Date Added (UTC)",
	ZMOMENT.ztitle as "Location Title"
from zgenericasset zga
left join zmoment on zmoment.Z_PK=zga.ZMOMENT
left join ZDETECTEDFACE zdf on zdf.ZASSET=zga.Z_PK
left join ZPERSON zp on zp.Z_PK=zdf.ZPERSON
where zga.ZFACEAREAPOINTS > 0

Below is a sample of the output of this analysis, paired with the photo the metadata came from.  You can see it is able to identify me and my two daughters by name, and accurately assess our genders, my sunglasses and facial hair.  

Please test, verify, and give me a shout on Twitter @bizzybarney with any questions or concerns.  

Socially Distant but Still Interacting! New and Improved Updates to macOS/iOS CoreDuet interactionC.db APOLLO Modules

The interactionC.db database certainly does not get as much attention as its CoreDuet partner in crime, knowledgeC.db. However, I think it has quite a bit of investigative potential. I’ve written about it before in a prior blog; however, I’d like to give it more attention here.

I spent this weekend updating the APOLLO modules to have more contextual support and better backwards compatibility with older iOS versions. This database was also introduced to the macOS side with 10.15. 

I’ve added a new query for this database for the ZKEYWORDS table. This table appears to capture keywords that are contained in various Calendar (com.apple.mobilecal) events. It seems to capture only certain events, as not all of my calendar events have keywords in this table.

The main interactionC.db query has many new updates, including attachments and sender/recipient correlation. In general, this database keeps track of “recent” contact interactions. As an example, I used this query on the iOS database that I copied off of my iPhone a few days ago, and it shows ~28,000 entries going all the way back to January of this year, 6+ months!
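To sanity-check how far back your own copy reaches, here is a quick sketch against the ZINTERACTIONS table (the ZSTARTDATE column name is an assumption based on the usual Core Data layout - verify it against your schema):

sqlite3 interactionC.db "SELECT COUNT(*), datetime(MIN(ZSTARTDATE)+978307200,'unixepoch'), datetime(MAX(ZSTARTDATE)+978307200,'unixepoch') FROM ZINTERACTIONS;"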

In the screenshot below is a Messages (com.apple.MobileSMS) conversation between Heather Mahalik and me. Items that are blurred are our phone numbers. The GUIDs are our Contact Person IDs; these can be correlated with information in the iOS Address Book database. Why some recipient information is blank, I’m not sure. I scrolled way back in our conversation history, and the timestamps are spot on. The content of this conversation could (and should) be correlated with the Messages database (sms.db).

This data is not just for Messages and may include other application bundle IDs. Some that I’ve seen in my data include:

  • com.apple.InCallService – Phone Calls

  • com.apple.MobileSMS - Messages

  • com.apple.Preferences - Settings

  • com.apple.ScreenshotServicesService - Screenshots

  • com.apple.mobilecal - Calendar

  • com.apple.mobilemail - Mail

  • com.apple.mobilesafari - Safari

  • com.apple.mobileslideshow - Photos

Contact Interactions & Attachments

For another example, let’s take a look at some interactions for the Photos app (com.apple.mobileslideshow). Not every interaction will have attachments associated with it. The screenshot below contains some AirDrop activity from this device. Some images were AirDropped from the Photos app itself, while one image was AirDropped from within the Messages application, as shown in the Target Bundle ID column. It also shows the contact information for the person it was sent to - helpful!

Some of these attachments have a type associated with them in the UTI column (HEIC/PNG); the other has an associated attachment ID, shown in hex (0x DCCB49C2FC74461EAD90DAB0C537DBF7). This is a hex representation of the UUID for the image. It seems not every image will have this attachment ID.

This image UUID can be searched for in the Photos database, as shown below, to determine exactly which image was AirDropped.
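A quick sketch of that lookup: re-insert the dashes into the hex value to get the standard UUID form, then search the ZGENERICASSET table (the column names match those used elsewhere in this post):

echo "DCCB49C2FC74461EAD90DAB0C537DBF7" | sed -E 's/^(.{8})(.{4})(.{4})(.{4})(.{12})$/\1-\2-\3-\4-\5/'

sqlite3 Photos.sqlite "SELECT ZDIRECTORY, ZFILENAME FROM ZGENERICASSET WHERE ZUUID = 'DCCB49C2-FC74-461E-AD90-DAB0C537DBF7';"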

This might be another good artifact to look for in addition to some of the unified logs I’ve highlighted previously in AirDropping some Knowledge. This is what Elwood’s Phone looks like after lots of AirDropping of various items for that blog. 

Disassociated but not Forgotten Attachments

It seems some attachments are no longer associated with entries in the ZINTERACTIONS table but are not necessarily removed from the ZATTACHMENTS table. This has potential forensic use as well. APOLLO will not extract these, as there do not appear to be timestamps associated with them; however, I HIGHLY recommend at least taking a brief look at this table when doing an investigation.

The example below shows many attachment files of different types that were associated with a contact interaction.

  • PDF

  • Images (PNG, HEIC, JPG)

  • Contact VCard

  • Text

  • Archives

  • Movies

  • Documents

Unfortunately, we lose the contact context here. However, we can still find useful tidbits of information like text, file types, URLs, image UUIDs, etc.

Other examples include text that was copy/pasted, or URLs like a Google search link or a YouTube video that was sent.

A Note about Accounts & Recipient Counts

One more example of Mail (com.apple.mobilemail) shows the ACCOUNT column. This GUID can be used to tie interactions to a specific email account in the Accounts databases (Accounts3.sqlite/Accounts4.sqlite).
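A hedged sketch of that correlation (ZACCOUNT, ZIDENTIFIER, ZUSERNAME, and ZACCOUNTDESCRIPTION reflect the typical Accounts3.sqlite layout and should be verified against your own copy; the GUID placeholder is whatever appears in the ACCOUNT column):

sqlite3 Accounts3.sqlite "SELECT ZIDENTIFIER, ZUSERNAME, ZACCOUNTDESCRIPTION FROM ZACCOUNT WHERE ZIDENTIFIER = '<GUID from the ACCOUNT column>';"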

The Recipient Count columns can be a bit misleading. The column I’ve labeled ‘Recipient Count’ is the number of recipients on an interaction. This example shows 2; however, that does not include myself. This is an example of an email thread between Heather, Lee, Phil, and myself. I would have thought there would be at least a 3 in this column, but that doesn’t appear to be the case. A good example to not make assumptions!

The incoming/outgoing sender/recipient count columns are even more misleading – I haven’t quite figured those out, but I believe they might be the total number of interactions versus the number of “recipients” on those interactions.

The interactionC.db database can really be useful, especially when it is used along with all the other data that APOLLO extracts – context can be everything!

Extensive knowledgeC APOLLO Updates!

While helping some investigators out, I realized that some of my APOLLO knowledgeC modules needed a bit of updating. Naturally, I thought it would be quick, but it turned into quite an extensive update. I’ve included lots of brand-new modules as well as updates to ones that I’ve had before.

Most of the updates to the older ones provided better backwards compatibility with older versions of macOS and iOS as well as adding additional contextual items to some of the queries from ZSTRUCTUREDMETADATA. Regression testing was performed on iOS 11, 12, and 13 and macOS 10.13, 10.14, and 10.15. Of course, please let me know if you run into a knowledgeC “stream” that I’ve not created a module for, or any issues that you might come across. 
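A quick way to inventory the streams in your own copy of knowledgeC.db (a minimal sketch using the commonly documented ZOBJECT/ZSTREAMNAME layout) is:

sqlite3 knowledgeC.db "SELECT ZSTREAMNAME, COUNT(*) FROM ZOBJECT GROUP BY ZSTREAMNAME ORDER BY 2 DESC;"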

I’ve highlighted a few modules below using my iOS 13.5 device. However, they may also apply to macOS and older iOS versions as well – review the modules for more documentation.

New Modules:

  • knowledge_activity_level_feedback.txt

  • knowledge_airplay_prediction.txt

  • knowledge_calendar_event_title.txt

  • knowledge_charging_smart_topoff_checkpoint.txt

  • knowledge_dasd_battery_temperature.txt

  • knowledge_device_locked_imputed.txt

  • knowledge_discoverability_usage.txt

  • knowledge_event_tombstone.txt

  • knowledge_inferred_microlocation_visit.txt

  • knowledge_knowledge_sync_addition_window.txt

  • knowledge_photos_edit_all.txt

  • knowledge_photos_deletes_all.txt

  • knowledge_photos_deletes_recent.txt

  • knowledge_photos_engagement.txt

  • knowledge_photos_share_airdrop.txt

  • knowledge_photos_share_all.txt

  • knowledge_photos_share_extension.txt

  • knowledge_segment_monitor.txt

  • knowledge_siri_activites.txt

  • knowledge_siri_flow_activity.txt

  • knowledge_sync_addition_window.txt

  • knowledge_sync_deletion_bookmark.txt

  • knowledge_user_first_backlight_after_wakeup.txt

The knowledge_app_activity_passbook.txt module was added to conveniently look for Apple Wallet (com.apple.Passbook) activity. Shown below I’m switching between my Apple Cash card and my Apple Card (yes, I got one for “research”).

The knowledge_photos_deletes_all.txt module appears to keep track of when I deleted a photo from the Photos app. This output is fairly vague. However, it could be useful in evidence destruction cases. The output of this one is similar to the other knowledge_photos_* modules.

Want to know if a thing was AirDrop’ed, copied, searched for, or otherwise interacted with from the iOS ShareSheet? The knowledge_sharesheet_feedback.txt module will help with that! Shown below, this module is keeping track of:

  • Photo Markups (com.apple.MarkupUI.Markup.MarkupPhotoExtension) via Camera App (com.apple.camera)

  • File Copies (com.apple.UIKit.activity.CopyToPasteboard) in Photos (com.apple.mobileslideshow)

  • Sending a photo in Messages (com.apple.MobileSMS) via Photos app (com.apple.mobileslideshow)

  • Finding text in a webpage (com.apple.mobilesafari.activity.findOnPage) in Safari (com.apple.mobilesafari)

  • Airdrop Activity (com.apple.UIKit.activity.AirDrop) 

Some modules are fairly self-explanatory. The knowledge_system_airplane_mode.txt module keeps track of whether Airplane Mode on the device is enabled or not.

The next two are associated with the iOS Low Power Mode functionality. The first, knowledge_device_battery_saver.txt, shows that I activated Low Power Mode via the Control Center, while knowledge_device_low_power_mode.txt shows that it was turned on about two seconds later.


Updated Modules:

  • knowledge_activity_level.txt

  • knowledge_app_activity.txt

  • knowledge_app_activity_calendar.txt

  • knowledge_app_activity_clock.txt

  • knowledge_app_activity_mail.txt

  • knowledge_app_activity_maps.txt

  • knowledge_app_activity_notes.txt

  • knowledge_app_activity_photos.txt

  • knowledge_app_activity_safari.txt

  • knowledge_app_activity_weather.txt

  • knowledge_app_install.txt

  • knowledge_app_intents.txt

  • knowledge_app_location_activity.txt

  • knowledge_audio_bluetooth_connected.txt

  • knowledge_audio_output_route.txt

  • knowledge_device_batterylevel.txt

  • knowledge_device_inferred_motion.txt

  • knowledge_device_is_backlit.txt

  • knowledge_device_locked.txt

  • knowledge_device_pluggedin.txt

  • knowledge_discoverability_signals.txt

  • knowledge_notification_usage.txt

  • knowledge_paired_device_nearby.txt

  • knowledge_portrait_entity.txt

  • knowledge_portrait_topic.txt

  • knowledge_app_relevantshortcuts.txt

  • knowledge_safari_browsing.txt

  • knowledge_settings_doNotDisturb.txt

  • knowledge_siri.txt

  • knowledge_standby_timer.txt

  • knowledge_widgets_viewed.txt

The module knowledge_app_inFocus.txt now includes extension context. The extensions below show a location sign-in alert (com.apple.AuthKitUI.AKLocationSignInAlert) via the SpringBoard (com.apple.springboard), access to the Camera (com.apple.camera) via Messages (com.apple.MobileSMS), and access to Photos (com.apple.mobileslideshow) via Messages. All the while, I was playing around with the Unc0ver Jailbreak (science.xnu.undecimus).

New with knowledge_app_webusage.txt are the “Digital Health” columns. These will show website visits and associated URLs on various apps (not just Safari or Chrome!). 

In this example, I was using Twitter (via Safari) on a device with the macOS hardware UUID (or iOS UDID) in the Device ID column - let’s say my laptop. On my iPhone, I was also on Twitter, but this time in the iOS application (com.atebits.Tweetie2), ordering a new t-shirt from Jailbreak Brewery.

Additions to knowledge_audio_media_nowplaying.txt include:

  • Is AirPlay Video

  • Playing – Values likely for Stopped, Playing, Paused – I will test those and update those in a future update.

  • Duration

  • Elapsed

  • Identifier

  • Media Type – Audio, Music, Video, Podcast

  • Output Device IDs (Binary plist in hex)

This is only a small slice of knowledgeC examples (and a very small part of APOLLO) so I hope this gives you some incentive to give it a try!

APOLLO and tvOS – It Just Works! (...and judges me for binging TV)

It’s been a while since I last jailbroke an Apple TV and had a forensic look at it. Using the checkra1n jailbreak, I decided to give it a try. The jailbreak itself was easy and went very smoothly. This was using a 4th Gen Apple TV running tvOS 13.4.

I wanted to run it through some of my APOLLO modules to see if any needed to be updated. Fortunately, none do as it acts just like iOS! (whew!) There is a noticeable lack of some files and databases compared to iOS proper, but some good ones are still accessible! 

KnowledgeC.db

Starting with my favorite database, knowledgeC.db, you will notice there are many fewer “streams” for tvOS. Even so, there are a few that are still of investigative use!

KnowledgeC.db – App InFocus 

The screenshot below shows me going back and forth between different apps and the usage time for them. I watched both recent NASA launches on NASA TV (gov.nasa.NASA) while also watching some TV on Amazon Prime (com.amazon.aiv.AIVApp). The com.apple.HeadBoard app is the main app selection screen.

KnowledgeC.db – Now Playing

(Note: this module is getting an update hopefully later this week; what you see below has some of those updates. 🤞)

This screenshot shows what binge-watching Alias on Amazon Prime looks like. After the NASA launch, back into Alias I went, episode after episode, until the “Are you still watching?” message popped up. 😆

It’s not just TV and movies for me; sometimes I’m rocking out to music! This screenshot shows me streaming Apple Music. In the middle of this, I watched some cat videos in the Photos app. Unfortunately, those do not have any metadata associated with them.

TCC.db

Next up are app permissions with TCC.db. This one is sparse compared with those of iOS and macOS but could show some useful information. kTCCServiceLiverpool is generally assumed to be part of location services and kTCCServiceUbiquity is associated with iCloud. kTCCServiceMSO is a new one to me but apparently HBO needs it. 🤷🏻‍♀️
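To dump those permissions yourself, here is a minimal sketch against the access table of TCC.db (the column set varies a bit between OS versions, so treat the columns here as an assumption to verify):

sqlite3 TCC.db "SELECT service, client FROM access ORDER BY service;"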

Locationd (cache_encryptedB.db)

You may think that Apple TVs probably do not capture much locational activity; however, they are keeping track of Wi-Fi locations in locationd’s cache_encryptedB.db. This particular Apple TV doesn’t leave my living room, but I do have others that I could travel with…if and when I travel again!

Networkd (netusage.sqlite)

Finally, all this streaming adds up in network usage, which can be seen in the netusage.sqlite database. I’ve sorted this output by Wi-Fi In. Not surprisingly, on top are processes for Netflix, HBO, and Amazon. The NASA app even made it close to the top too! 🚀

Analysis of Apple Unified Logs: Quarantine Edition [Entry 11] – AirDropping Some Knowledge

I’ve written about this before in this article but wanted to revisit it for this series. For this scenario, I want to test what certain items might look like when they are AirDrop’ed from an unknown source. Many schools have been receiving bomb threats via AirDrop; I want to see if there is a way to discover where they originated.

In my testing you will see artifacts from two iOS devices:

  • Sender: Elwood’s iPhone

  • Receiver: miPhone11 

Note: This article focuses on iOS. For macOS you will likely have to add --info to these queries to acquire similar information.

Starting with the AirDrop basics: we need to determine the AirDrop ID for each user. One thing I’ve discovered since my last analysis is that the AirDrop ID is not consistent for the life of the device; it changes all the time! The last (current) AirDrop ID can be found in /private/var/mobile/Library/Preferences/com.apple.sharingd.plist on iOS devices. I’ve even seen cases where there is no AirDrop ID in this plist due to AirDrop inactivity. The following query can provide the AirDrop IDs that are still available in the unified logs.

log show system_logs.logarchive --predicate 'eventMessage contains "AirDrop ID"'

We may also want to know what Discoverability Mode was being used at the time. A quick query for this is to look for ‘SharingDaemon State’ messages, which contain some sharingd metadata.

log show system_logs.logarchive --predicate 'eventMessage contains "SharingDaemon State"'

These messages contain a few useful items:

  • Device make/model

  • iOS Version

  • Battery Status

  • Discoverability mode (Everyone, Contacts Only, Off)

  • Screen Status

  • Unlock Status

  • Wireless Proximity Status

In our contrived scenario, we are assuming the receiver has their discoverability mode set to ‘Everyone’ during the time in question. The current mode can also be seen in the com.apple.sharingd.plist file. One more way of seeing the discoverability mode is by using this query (yes, I accidentally put in two --info arguments; that, of course, is not required):

log show --info system_logs.logarchive --predicate 'eventMessage contains "Scanning mode"'

On the Sender Device (Elwood’s iPhone):

Sharing Methods 

The first indication of AirDrop usage is how it is being initiated. This process is known as the ‘ShareSheet’. This is the window that is presented to the user to choose how they want to share an item. In this screenshot, I want to share a photo from within the Photos app. We can choose AirDrop, Messages, Mail, Notes, etc. Below that is a set of other activities that can be performed on the chosen photo.

This query can show us what application an item is being shared from. Each shared item may have different sharing options. This blog will show activity while AirDropping a photo, a note, a map, and a Safari link. 

log show system_logs.logarchive --predicate 'category = "ShareSheet" or category = "SharingUI"'

Starting at the top, a good indicator that something is about to be shared is to look for the message ‘Activating com.apple.sharing.sharesheet’. A connection will be made with the specific app that is to be shared from, in this example com.apple.mobileslideshow (Photos).

The items highlighted in red are looking for people to share with. This particular device is a test device with no contacts; therefore none were suggested. 

The items highlighted in purple and blue show the Share and Action activities that the user should see in the ShareSheet view.

In green, the “performing activity” message shows that AirDrop was selected by the user.

In pink, messages that start with “Item:” and have a GUID show that photos need a bit more preparation (file conversion, thumbnail creation, etc.). This was not seen in shared notes, maps, and Safari links. This specific activity can be filtered by using the GUID as shown below. The items highlighted in dark green provide temporary file paths used for the preparation but most importantly a filename that should be consistent with the item filename in the Photos.sqlite database (IMG_6782.JPG).

log show system_logs.logarchive --predicate 'eventMessage contains "74745469-9184-442C-B49D-5BE37CDD8CAA"'

More examples of sharing methods for an AirDropped note, map, and Safari link are shown below. Notice the differences in the activities for each application.

Note

Map

Safari Link

AirDropping an Item

Just because we see that something was attempted to be shared does not necessarily mean it was actually sent and received. The first part of this process is finding someone to AirDrop something to. I will be using the following query to go through some of these entries.

log show system_logs.logarchive --predicate 'category = "AirDrop"'

In the screenshot above, I’ve changed the default style to compact [with --style] to fit more in the screenshot. This shows the iPhone attempting to discover known and unknown contacts via Bonjour and AWDL. Highlighted are the entries that find my MacBook Air (AirDrop ID: eb4f5a53391b). Note the AWDL IPv6 addresses shown in yellow. These do appear to get cached on the receiving end (look for messages that contain “com.apple.p2p: Caching peer:”). It appears AWDL IPv6/MAC addresses get rotated fairly frequently. They are another way of pairing two devices together (along with AirDrop IDs), but these addresses are not kept forever, and you need both devices to do this analysis.
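To pull just those cached-peer entries on the receiving device, the same predicate style used throughout this post works (a sketch):

log show system_logs.logarchive --predicate 'eventMessage contains "com.apple.p2p: Caching peer:"'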

Now that we have AirDrop contacts, let’s send a photo! I sent a photo from Elwood’s iPhone to miPhone11 (IMG_6782.JPG) via Photos.

The message that starts with “startSending” (in yellow) shows what is being AirDrop’ed. It shows the item or file that is being sent, along with the Receiver ID and Session ID. The Receiver ID is the AirDrop ID for the device that this item is being sent to, while the Session ID keeps track of this AirDrop session.

In green, the AirDrop transaction is started; however, in dark green it shows that a connection could not be made. During my testing, my miPhone11’s AirDrop ID got caught in an odd cached state with the wrong ID (3603f73a17de). The first time I attempted to AirDrop this photo, it failed. It eventually discovered the correct ID (ecec57b722d8). Elwood’s iPhone showed a ‘Waiting…’ message, and the transaction would not complete.

The second time around, it actually sent. Note that the AirDrop ID for miPhone11 has changed and is now 04f30cbdcb55. These IDs change all the time. It also recognizes that the device can handle a Live Photo in HEIC file format, so it sends that instead of the JPG.

The ShareSheet information for the Live Photo is below. I recommend matching up the GUIDs to find this information. Filtering for 6C85AC86-BEF2-42BC-9862-4982211791DF (from the screenshot above) would allow me to run the query below to find the rest of the actions associated with this asset.

log show system_logs.logarchive --predicate 'eventMessage contains "A79A3C4F-AF63-486E-A7FC-4173753B12E2"'

A few more AirDrop examples, a note, a map, and a Safari link.

Note

These entries show a note with the title ‘This is a threatening note!’ was shared but no contents of the note itself.

Map

Sharing a Map item provides a title but no exact address in this example.

Safari Link

Sending a Safari link shows the domain but no specific details on the URL. 

The same Safari link as a PDF file:

Sending the same link but as a PDF shows no domain at all.

These items lose context when only the unified logs are looked at. You may have to correlate these actions with other application databases and artifacts (Photos, Notes, Maps, Safari History, etc.) to provide this context.

On the Receiving Device (miPhone11):

We will be using a generic AirDrop query to find all associated entries.

log show system_logs.logarchive --info --predicate 'category = "AirDrop"'

A user receiving files can choose to accept or decline. These responses are documented in the unified logs.

For an incoming transfer an AirDrop connection is made with an identifier (0x101495960). After this entry there are quite a few lines detailing what type of file it is and where it is coming from. To know if the user has Accepted or Declined the transfer we need to focus on the ‘userResponse’ sections. The entry shown in green is the popup alert that is presented to the user to do this action.

Once the user has selected ‘Accept’, the transfer continues and is opened up in the default application for the file. This example shows the photo being imported and opened in Photos (com.apple.mobileslideshow). Once complete, the AirDrop connection is terminated. This photo could be found by looking for the “16C753E1-309A-46FD-A742-998D7A31047E” ZUUID in the ZGENERICASSET table of the Photos.sqlite database and looking for the associated file name.
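A minimal sqlite3 sketch of that lookup:

sqlite3 Photos.sqlite "SELECT ZFILENAME, ZDIRECTORY, datetime(ZADDEDDATE+978307200,'unixepoch') AS added_utc FROM ZGENERICASSET WHERE ZUUID = '16C753E1-309A-46FD-A742-998D7A31047E';"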

If the user declines the AirDrop transfer the same message would show Declined or Cancelled and the AirDrop connection is terminated. 

How about an all-in-one query to tell me what transfers were initiated, from whom, whether they were Accepted or Declined, and opening information?

log show system_logs.logarchive --predicate 'category = "AirDrop" and (eventMessage contains "New incoming transfer" or eventMessage contains "Opening URLs:" or eventMessage contains "alertLog: idx:")' --style compact

Each of the gray highlighted sections is an incoming transfer from Elwood’s iPhone. Some transfers have an ‘Opening URLs’ entry that provides more context about what was sent, especially when it comes to Map and Safari links. It should be pointed out that just because you see a hostname like Elwood’s iPhone, you should be careful about attributing it to Elwood. Device hostnames are incredibly easy to change to, let’s say…’Jake’s Google Pixel’!

I spent ages trying to come up with a smoking gun on a ‘victim’ device to attribute an AirDrop action to a specific sender device. There really does not appear to be a static identifier that can identify a specific device. AirDrop IDs and AWDL IPv6/MAC addresses are the only way to pair these actions between devices, but you need both devices to be able to do this type of correlation. This, of course, can be tricky in most investigations. Even if you do have access to the devices, the data gets flushed fairly quickly – you may only have a few days to acquire these logs.