When we normally talk about digital forensics, we usually mean its use in law enforcement, the military or intelligence. But in today's "everything is digital" world, digital forensics is moving outside the labs and into broader use, helping to answer complex questions and understand unknown scenarios in new fields.
I became involved in assisting the Swedish Accident Investigation Authority (HavKom) after two fatal airplane crashes involving skydivers occurred in Sweden within a short time span during the pandemic. As a former skydiver, I reached out to see how I could assist in understanding the causes of the accidents. With a history of recovering data from all kinds of mobile devices, I successfully extracted a handful of devices in various stages of destruction, and that data was included in the investigations.
Earlier this year, HavKom reached out again after a traditional investigation had failed to reveal the cause of a fatal helicopter crash in early 2021. After several months of analysing the remains of the helicopter, they could not find any clues to the accident, and discussions with the manufacturer of the aircraft did not reveal anything that could shed light on it.
After discussing which digital devices were available for analysis and what kind of data they could reveal, the phones involved in the crash were selected. I focused on an iPhone that held pictures, taken from within the aircraft just before the accident, of several buildings and fields on the ground. The pictures had already been analysed by the investigators but had not brought the investigation forward at that point.
The phone was extracted using XRY, a mobile forensic tool, and all the data within it was analysed. A modern smartphone logs a plethora of data and leaves traces in various apps and logs, everything from screen interactions to barometric sensor readings and positions. I can admit to making a classic investigation error: after an initial analysis that revealed nothing useful in my book, I went straight for the hard-to-get sensor data that no one had really analysed before, hoping that it would explain what caused the accident. What I should have done was talk to an aircraft accident investigator, present ALL my initial findings, and let that investigator point out what was of interest.
And here lies one of the challenges of communication between different professions. What I assume is common knowledge about a topic I am comfortable with might be at the very edge of another professional's knowledge of that area, leading to confusion or missed opportunities.
So, after a month of combing through every possible database for any kind of position, acceleration, pressure or other data: nothing. I had reached the end of the line, with absolutely nothing to report that could assist the investigation. Grasping at straws, I went back to the pictures taken with the iPhone's camera, even though they had previously been analysed. But rather than focusing on the picture content, I targeted the metadata in the files.
In virtually all digital pictures today, it is not just the image itself that is stored in the file. A plethora of other information is embedded in the EXIF data that resides before the actual image data. The EXIF specification is quite extensive and includes shutter and aperture settings, date and time, the imaging device (camera), direction, speed and GPS data, among other things.
The GPS portion of the EXIF data CAN contain a true and correct GPS position, altitude and an extremely accurate time stamp, down to 0.0001 ms. The main issue with the GPS time stamp and the position data is detecting whether the EXIF GPS fields were populated from an actual GPS fix, or from one of the other positioning methods available in today's smartphones.
The methods a modern smartphone can use to position itself range from base-station positioning (which can be off by as much as a mile, but can also be as good as <100 ft), through Wi-Fi positioning (normally 200 ft down to 50 ft in accuracy), to true GPS with an accuracy of around 2 ft given good reception and enough time to establish a position fix.
One tell-tale sign that GPS was used to set the position fields in the EXIF data is the presence of a non-static altitude value >1 in the GPS data. GPS is the only positioning system in a smartphone today that can generate altitude data; the barometric sensors can only register relative pressure changes. All the pictures of interest had altitude data within the EXIF section, so the GPS position, time, altitude and direction were assumed to be correct.
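To illustrate the heuristic above: EXIF stores latitude and longitude as degree/minute/second rationals plus a reference letter, and altitude as a separate rational. A minimal sketch in plain Python of converting those values and applying the altitude check (field names follow the EXIF GPS tags; the sample values are invented for illustration, not from the case):

```python
# Convert EXIF-style GPS rationals to decimal degrees, and apply the
# altitude heuristic: a non-static altitude value suggests a true GPS fix.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """EXIF stores lat/lon as (deg, min, sec) rationals plus a
    reference letter (N/S/E/W); south and west are negative."""
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

def looks_like_true_gps(gps_tags):
    """Heuristic from the text: an altitude value > 1 implies the
    position came from an actual GPS fix rather than Wi-Fi or
    base-station positioning, which cannot produce altitude."""
    return gps_tags.get("GPSAltitude", 0.0) > 1.0

# Hypothetical sample tags, roughly a point in central Sweden.
tags = {
    "GPSLatitude": (59, 19, 45.0), "GPSLatitudeRef": "N",
    "GPSLongitude": (18, 4, 7.2), "GPSLongitudeRef": "E",
    "GPSAltitude": 412.7,  # metres above sea level
}
lat = dms_to_decimal(*tags["GPSLatitude"], tags["GPSLatitudeRef"])
lon = dms_to_decimal(*tags["GPSLongitude"], tags["GPSLongitudeRef"])
print(round(lat, 5), round(lon, 5), looks_like_true_gps(tags))
```

In a real extraction the tag values would come from a forensic tool or an EXIF library rather than being typed in by hand, but the conversion and the altitude check are the same.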
The GPS data was exported from the images for the flight and mapped into a Google Earth .kml file. Google Earth comes with an array of features that are very powerful for visualizing geographical data and understanding patterns and movement. Positions and other data can be stored in .kml files, which nowadays can be opened in other mapping tools as well.
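The export step is essentially string templating. A small sketch of building a minimal track file from position tuples (the element names follow the KML 2.2 schema; the coordinates are invented for illustration):

```python
# Build a minimal .kml document from (lon, lat, alt) tuples so a track
# can be opened in Google Earth or another mapping tool.

def positions_to_kml(name, positions):
    """positions: iterable of (longitude, latitude, altitude) tuples.
    Note that KML expects lon,lat,alt order, which trips up many
    first attempts at writing these files by hand."""
    coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in positions)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        f"  <Placemark><name>{name}</name>\n"
        "    <LineString><altitudeMode>absolute</altitudeMode>\n"
        f"      <coordinates>{coords}</coordinates>\n"
        "    </LineString>\n"
        "  </Placemark>\n"
        "</kml>\n"
    )

# Hypothetical track points (lon, lat, altitude in metres).
track = [(18.0686, 59.3292, 410.0), (18.0700, 59.3300, 395.0)]
print(positions_to_kml("flight-track", track))
```

Using `absolute` as the altitude mode keeps the EXIF altitudes meaningful in the 3D view instead of clamping the track to the ground.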
So, with a feeling of failure, I wrapped up the case and sent over what I had so far: a bunch of nothing, and repackaged position data from pictures that had already been analysed earlier in the investigation.
Two days later I got a call. We solved it! Thanks to the visualized position data from the pictures, the cause of the accident was obvious to an experienced avionics investigator: the helicopter had entered an unexpected slow-speed rotation, something that can happen under certain circumstances.
The takeaway for me, as many times before: do not expect others to have the knowledge you have. Always dare to ask the obvious questions, so that you do not skip researching something that may not seem useful in your opinion but, in the eyes of another person, can close the case.
Link to the investigation report (Swedish):