Hello Reader,
It's day 9 of the Zeltser blog challenge and day 2 of vacation. As I write this I'm watching the waves roll onto the sand, so you'll excuse me if I keep it brief; the surf is calling (as are my kids).
In today's post we resume walking through the milestones in the progression of a digital forensic examiner. We've covered milestones 1 through 4 in prior posts, and with milestones 5 and 6 we're approaching real maturity in your progression.
Milestone 5 - You become less about the tool and more about the artifact.
The more experienced an examiner you become, the less dependent you are on your tools. As you get thrown into more time-sensitive situations you begin to carry a thumb drive of one-off tools that quickly triage artifacts, helping you identify facts, actors, and threats without needing the dongle. In some cases you can find and interpret the artifact without a tool at all! That isn't to say you'll abandon your dongle-protected tool suite; you'll just reach for it when you need the convenience and additional functionality it provides.
You've come to understand that the underlying magic you first experienced in milestone 1 was always in the artifact, not the tool; the tool was just interpreting the data for you. The most important part of this milestone is the efficiency it unlocks: when an investigation comes in, you can identify which artifacts will contain data responsive to the inquiry, returning results faster and with less random keyword searching.
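To make that concrete, here's a minimal sketch of what reading an artifact directly can look like. Chrome's browsing history, for example, is just a SQLite database, so a few lines of Python will pull recent visits without any forensic suite at all. The copied file name History_copy is hypothetical (the live file is normally locked, so work from a copy), and the query assumes Chrome's usual urls table:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical path to a copied Chrome History file taken from the image.
HISTORY_COPY = "History_copy"

def chrome_time(us):
    """Convert Chrome's timestamp (microseconds since 1601-01-01 UTC)."""
    return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=us)

conn = sqlite3.connect(HISTORY_COPY)
rows = conn.execute(
    "SELECT url, title, visit_count, last_visit_time "
    "FROM urls ORDER BY last_visit_time DESC LIMIT 20"
)
for url, title, visits, last_visit in rows:
    print(f"{chrome_time(last_visit):%Y-%m-%d %H:%M:%S} | {visits:>3} | {title} | {url}")
conn.close()
```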
Milestone 6 - You understand what's normal and what's missing for multiple versions of the same operating system.
This milestone may be one of the harder ones to achieve, given how many operating system variants are in production and how much "normal" depends on your environment. Understanding what's normal, though, lets you quickly zero in on what was left behind for you to find. Knowing what's normal includes (a minimal baselining sketch follows the list):
- Which services should be running.
- Where those services should be running from.
- What user the service should be running as.
- What log errors are normal.
- What logging is turned on by default.
- Which artifacts get created by default.
- What gets created when a user logs in via different methods.
- Where data created through user activity will exist by default.
- The default locations of application artifacts, system logs, and registry hives.
- What applications are installed by default in your environment.
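Here's the baselining sketch mentioned above. It's a minimal illustration, assuming you keep per-build snapshots of your services as JSON (the file names and format here are hypothetical, captured with whatever inventory tooling you prefer); the point is simply that once "normal" is written down, anomalies fall out of a diff:

```python
import json

# Hypothetical snapshot files: each maps a service name to the binary path
# and the account it runs as.
with open("baseline_services.json") as f:
    baseline = json.load(f)
with open("current_services.json") as f:
    current = json.load(f)

# Services that appeared since the baseline was taken
for name in sorted(set(current) - set(baseline)):
    print(f"NEW      {name}: {current[name]}")

# Known services whose path or run-as account changed
for name in sorted(set(current) & set(baseline)):
    if current[name] != baseline[name]:
        print(f"CHANGED  {name}: {baseline[name]} -> {current[name]}")

# Services that should be present but are now missing
for name in sorted(set(baseline) - set(current)):
    print(f"MISSING  {name}: {baseline[name]}")
```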
The benefits are many, but include:
- The ability to create your own custom whitelist of hashes so you can focus only on the data created by your user (see the sketch after this list).
- The ability to spot what artifacts the user deleted when trying to cover their tracks.
- Being able to quickly spot malicious processes trying to hide in plain sight.
- Being able to quickly spot out-of-place directories or logs, which shows the user has a high degree of sophistication and that you should no longer trust the system defaults.
- The ability to quickly pull out the relevant data you committed to memory in the prior list.
- The ability to find anomalous logins and accesses to a system.
- The ability to correctly estimate what data you should expect to exist before you begin your investigation so you can manage the expectations of those requesting work from you.
- The ability to quickly identify user installed applications that need to be researched before being examined.
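As a small illustration of the whitelist idea from the first bullet above, here's a minimal sketch. It assumes a text file of known-good SHA-256 hashes exported from your gold image and a directory of collected files (both names are hypothetical); anything not in the whitelist is what deserves your attention:

```python
import hashlib
from pathlib import Path

# Hypothetical inputs: one known-good SHA-256 hash per line, plus a directory
# of files collected from the suspect system.
known_good = {
    line.strip().lower()
    for line in Path("gold_image_hashes.txt").read_text().splitlines()
    if line.strip()
}
collected = Path("collected_files")

def sha256(path):
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

# Print only files that are NOT in the whitelist.
for p in sorted(collected.rglob("*")):
    if p.is_file() and sha256(p) not in known_good:
        print(p)
```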