For decades, the forensic “gold standard” was straightforward: isolate the computer, pull the plug, and image the drive. In that era, what you saw on the screen was physically present on the magnetic platters, waiting to be extracted bit by bit. Today, that assumption is not just outdated; it is plain wrong. The rapid adoption of cloud storage services, partial on-demand synchronization, and full-disk encryption has fundamentally broken the traditional dead-box workflow, turning the simple act of powering down a suspect’s computer into a potential destroyer of evidence.
The core of the problem lies in the seamless integration of services like Microsoft OneDrive and Dropbox. With features like “Files On-Demand” and “Smart Sync” enabled by default, the file system becomes a hall of mirrors. A suspect’s folder may list gigabytes of incriminating documents, complete with metadata and thumbnails, yet physically contain zero bytes of actual data on the local disk. In this hybrid environment, the investigator faces a critical dilemma: perform a traditional disk image and recover nothing but useless pointers, or attempt live triage to access the files, inevitably altering the digital crime scene in the process.
The traditional “dead box” approach prioritizes preservation above all else, treating the computer as a static crime scene that must remain untouched. By isolating the machine and physically removing the storage media, investigators ensure that not a single bit of data is modified during acquisition. The drive is connected to a hardware write blocker, and a forensic duplicate is created – a bit-precise mirror of the original evidence. This method provides a safety net for the investigator, guaranteeing that the hash values of the original evidence and the image match. It is a slow, methodical process designed to freeze the state of the system, ensuring the chain of custody stands in court by proving the evidence is identical to the moment of seizure.
In contrast, live system forensics and digital triage prioritize speed and accessibility over absolute perfection. Working on a running, authenticated system allows the investigator to bypass barriers such as full-disk encryption (BitLocker, VeraCrypt) that would otherwise permanently lock away the data the moment power is cut. This approach yields immediate, actionable intelligence, but it comes at a steep price: the cardinal “do not touch” principle is fundamentally violated. Every interaction – plugging in a triage drive, running a tool, accessing the data – inevitably alters the system’s state, modifying memory, updating access logs, and changing timestamps. While this method opens doors that dead-box imaging cannot, it requires rigorous documentation to justify the inevitable alterations to the digital evidence.
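Because every action must be justified later, even a manual triage session benefits from machine-generated documentation. Below is a minimal sketch of a timestamped, hash-chained examiner action log; the log path and the recorded actions are hypothetical examples, and any real deployment would follow the agency’s own documentation policy.

```python
# Minimal examiner action log: each touch of the live system is recorded
# with a UTC timestamp and chained to the previous record by its hash.
# Sketch only; the log file name and actions below are placeholders.
import hashlib
import json
from datetime import datetime, timezone

LOG_PATH = "triage_actions.jsonl"  # keep on the triage drive, never the evidence disk

def log_action(description: str, prev_digest: str = "") -> str:
    entry = {
        "utc": datetime.now(timezone.utc).isoformat(),
        "action": description,
        "prev": prev_digest,  # chaining makes after-the-fact edits detectable
    }
    digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    entry["sha256"] = digest
    with open(LOG_PATH, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return digest

d = log_action("Connected triage drive to running suspect machine")
d = log_action("Started RAM capture", prev_digest=d)
```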
The illusion of local storage is powered by sophisticated file system features that effectively decouple file metadata from physical content. Services such as Microsoft OneDrive (Files On-Demand) and Dropbox (Smart Sync) leverage NTFS reparse points to populate the file system with “ghost” entries – files that appear fully intact to the user but exist only as metadata placeholders on the disk. In Windows Explorer, these files display correct names, file sizes, timestamps, and even thumbnails, indistinguishable at a glance from locally stored data. However, this is a digital sleight of hand: the actual binary content resides exclusively on the cloud provider’s servers, conserving local storage capacity until the exact moment a user – or an unsuspecting forensic investigator – attempts to open the file.
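On Windows, these placeholders can be identified from file attributes alone, without opening the files (which would trigger a download). A minimal sketch, assuming a Windows host and Python 3; the two RECALL_* constants are documented Win32 attribute flags that the standard stat module does not expose, and the scanned path is illustrative.

```python
# Flag cloud placeholder ("ghost") files from their attributes alone,
# without opening them -- opening a placeholder would trigger hydration.
import os
import stat

# Documented Win32 file attribute flags not present in the stat module.
FILE_ATTRIBUTE_RECALL_ON_OPEN = 0x00040000
FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS = 0x00400000

PLACEHOLDER_MASK = (stat.FILE_ATTRIBUTE_OFFLINE
                    | FILE_ATTRIBUTE_RECALL_ON_OPEN
                    | FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS)

def is_placeholder(path: str) -> bool:
    attrs = os.stat(path, follow_symlinks=False).st_file_attributes
    return bool(attrs & PLACEHOLDER_MASK)

for entry in os.scandir(r"C:\Users\suspect\OneDrive"):  # path is illustrative
    if entry.is_file(follow_symlinks=False) and is_placeholder(entry.path):
        print("cloud-only:", entry.name)
```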
When an investigator follows the traditional dead-box workflow and images a drive containing on-demand cloud files, the result is hit-or-miss at best. The forensic image (provided that no full-disk encryption was used) will faithfully preserve the file system structure, including the file names, sizes, and timestamps, but the underlying data streams for these cloud-only files simply do not exist on the disk. The acquisition tool captures only the reparse points – metadata markers pointing to a cloud location – leaving the investigator with files that are technically visible but forensically useless; they cannot be opened, viewed, or hashed because the binary content was never on the disk to begin with. This failure is compounded by the ubiquity of full-disk encryption systems like BitLocker. By powering down the machine to perform offline imaging, the investigator not only fails to capture the cloud data but risks permanently locking themselves out of the local data as well, potentially discarding the decryption keys held in RAM and turning the entire drive into a brick of inaccessible noise.
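The discrepancy is visible even on a live system or a mounted image: the directory entry reports the full logical size, while the space actually allocated on disk is zero. Below is a sketch using the Win32 GetCompressedFileSizeW call, which reports allocated rather than logical size; the file path is illustrative, and on an ordinary (non-sparse, non-compressed) file the two numbers would simply match.

```python
# Compare a file's logical size with the size actually allocated on disk.
# A cloud placeholder reports its full logical size yet occupies ~0 bytes.
import ctypes
import os
from ctypes import wintypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.GetCompressedFileSizeW.restype = wintypes.DWORD
kernel32.GetCompressedFileSizeW.argtypes = [wintypes.LPCWSTR,
                                            ctypes.POINTER(wintypes.DWORD)]

def size_on_disk(path: str) -> int:
    high = wintypes.DWORD(0)
    low = kernel32.GetCompressedFileSizeW(path, ctypes.byref(high))
    if low == 0xFFFFFFFF and ctypes.get_last_error() != 0:
        raise ctypes.WinError(ctypes.get_last_error())
    return (high.value << 32) + low

path = r"C:\Users\suspect\OneDrive\evidence.docx"  # illustrative
logical, physical = os.path.getsize(path), size_on_disk(path)
print(f"logical: {logical} bytes, allocated on disk: {physical} bytes")
if logical > 0 and physical == 0:
    print("placeholder only: the content was never on this disk")
```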
Accessing “on-demand” files during a live triage session triggers a cascade of file system activity that is diametrically opposed to forensic preservation. The moment an investigator attempts to preview or copy a cloud-only file, the operating system initiates an immediate download of the binary content from the server. This is not a passive read operation but a heavy write event with severe implications. The file system updates the Last Access and Modification timestamps, generates a flurry of new records in the $UsnJrnl, and begins aggressively consuming free disk space to store the incoming data. For a drive nearing capacity, this sudden influx can be catastrophic, potentially filling the remaining storage outright. More critically, as these new data blocks are written to the disk, they overwrite unallocated space – the very location where deleted evidence, potentially recoverable artifacts, and Volume Shadow Copies reside – permanently destroying potential leads.
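If hydration cannot be avoided, its footprint can at least be measured. Here is a sketch (Windows, administrative rights assumed) that snapshots free space and the NTFS change journal position before and after triage, so the report can state how much the scene moved; the exact label printed by fsutil may vary between Windows builds, so the parsing below is an assumption to verify on the target system.

```python
# Snapshot free disk space and the NTFS change journal position before
# and after triage, to quantify how much the live session wrote.
import re
import shutil
import subprocess

def snapshot(volume: str = "C:") -> tuple[int, int]:
    free = shutil.disk_usage(volume + "\\").free
    # "fsutil usn queryjournal" prints the journal state, including the next USN.
    out = subprocess.run(["fsutil", "usn", "queryjournal", volume],
                         capture_output=True, text=True, check=True).stdout
    match = re.search(r"Next Usn\s*:\s*(0x[0-9A-Fa-f]+|\d+)", out)
    return free, int(match.group(1), 0)

before = snapshot()
# ... documented triage actions happen here ...
after = snapshot()
print(f"free space consumed: {before[0] - after[0]} bytes")
print(f"change journal advanced by: {after[1] - before[1]}")
```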
Despite the destruction it causes, this aggressive approach is frequently the only viable path forward. If the suspect’s computer is the sole point of access where the cloud authentication tokens are present and the drive is unlocked, the investigator is backed into a corner. Without the account credentials needed to perform a cloud-to-cloud extraction – or even to identify the account in a legal request to the cloud provider – the data visible on that screen is the only copy accessible to law enforcement. In these high-stakes scenarios, the abstract purity of forensic soundness must yield to the practical necessity of data recovery. The investigator effectively trades the integrity of the unallocated space and file system metadata for the tangible value of the files themselves, accepting that the very process of observation inevitably alters the data.
Today, investigators must abandon the dogmatic “always pull the plug” approach in favor of a situationally aware hybrid workflow, adhering instead to the “Order of Volatility” outlined in RFC 3227. This standard dictates that evidence must be collected from the most ephemeral to the most stable, prioritizing data that vanishes the moment power is lost or a process is killed. In practice, this means the collection sequence begins with Memory, followed by System Information, Network Data, and finally Processes & Drivers.
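In a scripted triage run, that ordering can be made explicit so no step is taken out of turn. A minimal sketch follows; the four collector functions are hypothetical stand-ins for whatever acquisition tools are actually deployed.

```python
# RFC 3227 order of volatility, encoded as an explicit collection plan.
# The collector callables are hypothetical stand-ins for real tools.
from collections.abc import Callable

def capture_memory():      print("capturing RAM image...")
def capture_system_info(): print("collecting system information...")
def capture_network():     print("collecting network state...")
def capture_processes():   print("enumerating processes and drivers...")

COLLECTION_PLAN: list[tuple[str, Callable[[], None]]] = [
    ("Memory",              capture_memory),      # most volatile: gone at power-off
    ("System Information",  capture_system_info),
    ("Network Data",        capture_network),
    ("Processes & Drivers", capture_processes),   # least volatile of the live set
]

for name, collect in COLLECTION_PLAN:
    print(f"[{name}]")
    collect()
```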
Consequently, the investigator’s first move in a live environment is never to blindly click through “Files On-Demand” folders – an act that needlessly overwrites unallocated space – but to surgically capture RAM and encryption keys. Only after securing this volatile data, and subsequently capturing a bit-precise disk image to freeze the scene’s current state, should one attempt active triage. This hybrid workflow allows the investigator to extract cloud authentication tokens from memory and perform a forensically sound cloud-to-cloud extraction, securing the full file contents without turning the physical disk into a digital graveyard.
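As an illustration of that final step: once a valid OAuth bearer token has been recovered (for example, from the RAM capture), file content can be pulled server-side instead of hydrating the suspect’s disk. The sketch below targets the public Microsoft Graph OneDrive endpoints; the token value and output path are placeholders, and token recovery itself is outside the scope of this example.

```python
# Pull OneDrive content server-side with a recovered OAuth bearer token,
# leaving the suspect's disk untouched. Requires the "requests" package.
import requests

TOKEN = "<bearer token recovered from memory>"  # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

# List the root folder of the signed-in user's OneDrive.
resp = requests.get("https://graph.microsoft.com/v1.0/me/drive/root/children",
                    headers=headers, timeout=30)
resp.raise_for_status()
items = resp.json().get("value", [])
for item in items:
    print(item["name"], item.get("size"))

# Download the first item by id to the examiner's workstation.
content = requests.get(
    f"https://graph.microsoft.com/v1.0/me/drive/items/{items[0]['id']}/content",
    headers=headers, timeout=60)
content.raise_for_status()
with open("extracted_item.bin", "wb") as f:
    f.write(content.content)
```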
Elcomsoft Quick Triage is a tool designed to rapidly extract and analyze the most important evidence from a target computer or disk. It is equally effective during on-site operations and in laboratory environments, helping investigators make informed decisions at the earliest stages of an investigation.