CLIMB A directory TREE

The truth is out there
So make sure your software can list all the files there are.

This preliminary article was authored in Jan. 2022.
By the time you read it, a lot of time may have passed and the software that was tested may have been updated and might now pass the tests. You should therefore conduct tests of your own to see whether the current version passes your tests and meets your needs.

Read this article and raise your forensic intelligence level a few points. 😄

This article is the 5th in a series of articles I decided to write about software that would be useful in almost any computer forensic investigation. I put the software through its paces to see if it would pass the forensic challenges that I could foresee a wise defense making. I designed a minimal test environment and file system that would stress test the software for evidentiary reliability and consistency.

This is a list of the first four articles in this set.

   -    Forensic file copying  Article tests over 40 "forensic" file copiers.
   -    Forensic Hashing     Article tests over 30 "forensic" hash programs.
   -    ZIP-IT for forensic retention    Article tests a few zipping programs.
   -    ZIP_IT_TAKE2    More tests for your zipping capabilities.


Bottom line:
I wanted to see how many (and which) programs could produce a quick, clean and legible listing of files within a tree. Kinda like this:

C:\TEMP                                  |    0 |....D..|03/01/2019|12:00:00:000c|03/01/2019|12:00:00:000w|01/26/2022|22:19:05:970a|EST
C:\TEMP\shutdown_help.txt                | 4125 |..H....|04/01/2019|12:00:00:000c|04/04/2019|12:00:00:000w|01/26/2022|22:19:05:970a|EST
C:\TEMP\junk\Signature1.TXT              |   35 |.......|04/24/2020|11:01:05:507c|04/24/2020|11:01:05:507w|01/26/2022|22:19:05:970a|EST
C:\TEMP\junk\Signature1.TXT:ads_hash.txt |  291 |.adata.|04/24/2020|11:01:05:507c|04/24/2020|11:01:05:507w|01/26/2022|22:19:05:970a|EST
C:\TEMP\junk2\more\Signature2.txt        |   45 |.......|06/05/2021|14:06:45:816c|06/05/2021|14:07:05:861w|01/26/2022|22:19:05:970a|EST


I soon found out there were not a lot of options. That's why I wrote this article.

The directory/tree of files I set up is small, about 500 files give or take: DEMO_FILES.exe. These sample files were used to stress test the software under test. Expand this exe to a 2 GB or larger NTFS thumb drive, or directory. Read the above listed articles, which reference my test results for the various topics mentioned. I mean, who doesn't hash, copy and zip as part of their evidence procedures?

When designing the tests and choosing the software to test, I considered a number of different but generic forensic processes and scenarios which an analyst/investigator would be dealing with. Each of these scenarios might cause an analyst to choose a different software program to accomplish the task. Because of the variety of scenarios that might be encountered, I ONLY USED one scenario: one that would probably come at the end of an investigation rather than at the beginning. The other scenarios mentioned below were not considered for that reason. I only wanted to test software that would be practical in this one final scenario.

In a lot of forensic cases the analyst initially relies on a suite (which does everything but cook dinner), or a program which works on or examines a bit image of the suspect data. We all know that situations occur which require altering the process or selecting different software based on a specific need, but that is not what I set these tests up to do.

So, as best I could, I chose software which could be used without "loading" the suspect files and creating a case. I also did not wish to use software that required an image to be loaded before it could process the data. After all, if you are working on your own forensic computer, and have already isolated the important files and reports, why do you need to create or load those files into a "case"? Loading files or creating a "case" merely to obtain a list of evidence is a bit of overkill. And what about those people who need a file listing, are not forensicators, and don't have suites to rely upon? What do they use to obtain a list of all the files within a directory?

These tests hopefully exercise a single generic, universal process that could be used most of the time in the situation(s) I will explain. So, bottom line: if the software is a suite, or works solely from/off/within a bit image, it probably wasn't completely considered.


The scenarios for which an examiner might need or use a file listing/catalog program all involve analysis or processing of a SINGLE directory/tree. The tree being processed as evidence might fall into one of the following three scenarios (thus the name of this article: GO CLIMB A TREE):

   -    A: It could be a suspect's directory tree located on a large corporate server (no forensic software allowed to be loaded on the corporate computer), or
   -    B: The directory is on the analyst's computer holding forensically restored or necessary evidence files, or
   -    C: The directory is on the analyst's computer, is one of many individual "case" folders, and holds files and reports ready to deliver to the reviewing party.

Thus, the files used in these software tests are considered to be part of one of the above situations: living/residing in a single tree/directory. The software being tested is expected to handle the simple process of listing files within this single tree, and not cook dinner.

   -    The first scenario is where an investigator goes to a suspect site and is only allowed to view or obtain evidence from the suspect's directory on the much larger corporate server. This scenario of only being able to obtain/copy files from a single directory tree located on a much larger server keeps this and the other two scenarios consistent with the goal of finding a program which can catalog/list files within a single tree. See the suite stuff below.

   -    The second scenario is where the analyst has restored evidentiary directories on their analysis drive/machine. Thus we have a separate directory (hopefully forensically restored) containing evidence files belonging to this single case, on which the analyst will perform additional analysis. This situation again contains forensically restored files which must be treated as original evidence. The analyst would require a true and accurate list of all the files placed within this evidence tree, if for no other reason than to verify the analyst has everything they should be working with.

   -    The third situation is similar to the second. The analyst has either created reports for delivery to a reviewing party, or has isolated evidentiary files which would ultimately be delivered with their forensic report to the reviewer. This third situation is virtually identical to the second, in that any and all files within this third set should be treated as pristine and evidentiary in nature. And the analyst or reviewer would like to have a clean list of all the files within this tree before delivering it to the "customer".


SUITE STUFF

Since most suites can operate at one of the three levels:

   -   complete bit image of the entire drive,
   -   complete bit image of a selected tree,
   -   process a tree/directory at the file level.

I have chosen to use only one or two suites in the test of a program's capability to produce a good catalog/list of files within a directory. But, in order to keep the tests even, any suites (or images created/processed by a suite) that were tested are tested only at the logical file level of the specific evidentiary directory.

Realistically, when you are at a large server, or at your own forensic machine, and only need a single directory listing, the "logical" directory view is most likely the way you will go.
So: for my tests, bit images created by suites or by dedicated bit imaging software, which would then produce a list of files within the image, were generally not considered. The reason is that the other software would only process logical directory structures.

Thus, treating any suite's processing of a bit image as ONLY a logical directory keeps the playing field even. Most examiners aren't going to create a bit image of their own work directory (items 2 and 3 above) merely to create a "case" and load their own work into a suite to obtain a true and accurate list of the files which reside therein. That is where this test is ultimately going: get a list of files within a directory.


Hopefully my three explanations above are as clear as mud. 😀

An aside
Another reason to have good file cataloging software is this: in the real world (I hope you are there), in order to test any of the forensic software you are using, a basic first step is to know the contents of the file system or tree/directory you are feeding to your forensic procedure. If you think about it (I know, thinking is bad for the health), no matter which of the above three situations you are in, wouldn't it be nice if you could sit down at a computer and, without much ado, have a piece of software that could find and list ALL the files within your environment/directory? Whether it is the suspect directory on the server, or the directory on your own machine holding evidence and reports.

To restate the obvious, in an investigatory environment:
   -    Wouldn't it be nice to know how many files might reside within the suspect directory you are asked to analyze or image/copy files from?
   -    Or, after your investigation is finished, to provide the reviewer (attorney, manager, corporation) a full, understandable list of all the files you are providing as evidence?
   -    Or, merely to have a reliable program that provides a true and accurate list of files within a tree/directory regardless of the situation?

After all, regardless of the process you are conducting (image, analyze, report), when you are using or testing the reliability of a piece of software, if you don't know what or how many files are contained within your test directory/tree, how are you going to know whether the software you are testing was able to find, identify and process all the files that live within the test environment (tree/directory)?

As an example, if you are going to test your hashing software, wouldn't you agree (we all know what assuming means) that at a minimum you should know how many files you should be finding and hashing? And also have a separate, independent way of determining how many files were processed, and what their actual hash values are. Next is a funny (or not so funny) story about one of my hands-on hash testing classes.

I was conducting a class/seminar on testing hashing software. There were about 10 people in attendance, of all knowledge levels. I handed out the test suite of files and told the attendees I had a gift certificate to a well known breakfast restaurant for the first person who could tell me how many files were in the test data set. They were there to test hashing software, so I figured they could tell me how many files needed to be hashed. After some time, NOT a single person came up with the correct answer. Needless to say, I used the gift certificate myself.

Or if you are attempting to forensically copy files from a suspect location (sans suite) to retrieve evidentiary files, wouldn't it be nice to have a piece of software that could independently list/catalog all the files within the tree you are about to copy? Just to see if the copy was successful.

Or after you have zipped up a directory structure, and then unzipped it somewhere else, it might be nice to be able to confirm the number of files in the source and destination, to make sure all the files legally migrated. A quick count check is sketched below.
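As a minimal sketch of that count check in PowerShell (the paths here are hypothetical placeholders, not the test suite):

# Compare file counts in a source tree and an unzip destination.
# 'C:\TEMP' and 'D:\UNZIPPED' are placeholder paths.
$src = (Get-ChildItem -Path 'C:\TEMP' -Recurse -Force -File).Count
$dst = (Get-ChildItem -Path 'D:\UNZIPPED' -Recurse -Force -File).Count
"source: $src   destination: $dst"
# Note: this counts only primary data streams; alternate data streams and
# very long paths may still be missed, which is exactly what the tests below look for.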

Bottom line: You should have at your fingertips a program capable of cataloging or listing the files within a directory. You should have a piece of software that will find, catalog and list all the files which you ASSUME are within the specified tree, regardless of whether the tree is a suspect directory on a server, your evidence processing directory on your forensic computer, or the set of files provided to the reviewing party. After all, if you don't know how many files are within the tree structure your forensic software is to process, how do you know your forensic software found and processed all the files within the tree? Finding/knowing how many files are within your evidence tree should be a basic thing to be able to accomplish. DUH!

It also helps, if you ever decide to test any forensic process/software yourself (what a novel idea), to know, when you create the test directory, how many files you placed, what they are, and where you placed them. Simple.

In a forensic/evidentiary process, you want to make sure you have cataloging software which can create a reasonably formatted and accurate output listing to provide to the reviewer, attorney, court, whoever. So this might be another requirement of file cataloging software: create a nice, clean, easily manipulated/printed list of all the files within your forensic analysis production. That's what this test and article hope to accomplish: find and test a number of cataloging programs which can stand up to technical, legal and peer review scrutiny. What an idea.

Before going any further, take this challenge to see if your hash, copy and zip software passes the test. That page also has links to the test files which I used to run my tests, and which you can use as a basis to test your own cataloging software. It would be nice to see that your cataloging process finds everything it should. Wanna bet it doesn't.

Disclaimer: The mention of any program, website or algorithm in no way should be taken as an endorsement of same. And in some cases, I may even point out a flaw or limit to its actions.


BACKGROUND

The first thing I did was to set up a test suite of files which I knew, from my prior hashing and copying tests, would stress software and could cause it to provide inaccurate or incomplete results. It is, in fact, the same test suite used for the prior hash and copy tests.

The files were installed on an NTFS file system to make use of the NTFS capability for long filenames and alternate data streams. After all, most of the forensic suites today can easily create long filenames in their exports. And who among you has not seen people who create filenames as long as War and Peace?

The test suite contained several directories. Some directories were made up to include long filename paths greater than the usual 255 character WINDOWS default. (Don't get into it about the WINDOWS 255 character restriction; WINDOWS 10 can be configured without that limit. Enough said.) A quick way to check that setting is sketched below.
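As a minimal sketch (assuming Windows 10 version 1607 or later), this checks whether system-wide long path support is turned on; a value of 1 means enabled:

# Check the Win32 long path policy (1 = enabled). Requires Windows 10 1607+.
Get-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' -Name LongPathsEnabled

Even with this enabled, individual applications generally must also opt in, which is one reason some of the tested programs still missed long filename paths.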

Also contained within the test suite's file structure were some files with alternate data streams. It would be nice if, when you create a catalog of files for your evidence list, any alternate data streams were listed as well. After all, alternate data streams can contain useful evidence such as the URL that the porn came from. Who wouldn't want to know that 😄. A sketch of how such a test file can be created follows.
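For reference, here is a minimal PowerShell sketch of creating and confirming an ADS test file, using names like those in the sample listing near the top of this article (the file contents are placeholder text, not what DEMO_FILES.exe contains):

# Create a primary file, then attach an alternate data stream to it.
Set-Content -Path '.\Signature1.TXT' -Value 'primary file content'
Set-Content -Path '.\Signature1.TXT' -Stream 'ads_hash.txt' -Value 'stream content'
# Confirm: list every stream attached to the file (':$DATA' is the primary data stream).
Get-Item -Path '.\Signature1.TXT' -Stream *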

Obviously the three file dates (MAC times) needed to be recorded as part of the evidence listing. You need the three dates in your reporting so you can state when a specific action may have taken place.

There were also a few filenames with Unicode characters, but those were not tested for heavily.

The test requirements, at a minimum, are the following (a rough sketch of a listing command follows the list):
   -   ★ 1. Long filename/path identification/processing: able to find, articulate and list all files found within any long filename paths.
   -   ★ 2. Alternate Data Streams: able to find, articulate and list all alternate data streams, whether in LFN's or normal-length filenames.
   -   ★ 3. Report generation: able to provide output easily imported into a spreadsheet or database for the next step of processing.
   -   ★ 4. Time display: able to find/display and include in the report all three MAC times. A GMT time listing might also be nice.
   -    An added plus would be a log of the process for the final report, but that is not part of the testing.
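To make the requirements concrete, here is a minimal PowerShell sketch (not one of the tested programs) that covers requirements 3 and 4: spreadsheet-friendly output with the three MAC times. It does NOT, by itself, satisfy requirements 1 and 2; alternate data streams are not enumerated and very long paths may be skipped, which is exactly the kind of gap the tests below look for. The C:\TEMP path and output filename are placeholders.

# Recursive file listing with the three MAC times, exported as CSV.
# Does not enumerate alternate data streams; long paths may be missed.
Get-ChildItem -Path 'C:\TEMP' -Recurse -Force -File |
  Select-Object FullName, Length, CreationTime, LastWriteTime, LastAccessTime |
  Export-Csv -Path '.\file_list.csv' -NoTypeInformation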


TEST RESULTS

I sent word out to the lists for recommendations on disk cataloging software. Not surprisingly, I received only about three or four recommendations for software to test. As it turns out, file cataloging software that does what I was testing for is few and far between. So I only had a few programs to test.

The test results of the programs suggested to me showed that most of the more popular directory cataloging software had flaws or restrictions which caused it to fail one or more of my requirements. I generally did not test the forensic suites, as they hopefully will provide a reasonable listing of the files found within a suspect image. But it might be wise for you to run some tests on the suite you are using to see how it performs. However, I did test one imaging package to see if it could properly report the complete catalog once it "mounted" the suspect data.

Another reason I didn't concentrate on suites comes down to two things.
1. First, when you are called to a corporation for an investigation, you may not be able to either image the suspect server or load your software onto it. This leaves you with the requirement that you use software which can be run without installation and which doesn't create or need an image to work from.
2. Next, if you have completed your investigation and analysis, you obviously have a boatload (that's a colloquial term) of files to provide with/in your report to the reviewer. At this point, it would be ridiculous to have your forensic suite create a case file merely to produce a list of files within your evidentiary directory. So you need a program that is quick, easy, and in most cases does not have to be installed on a suspect computer. That is why I didn't concentrate on using suites to test/create file/directory listings.

One of the main drawbacks was that a lot of the software needs to be installed in order to work. That is fine on your own machine, but what about when you are at a suspect machine and are not allowed to install any software? This requires that you have a software package that can be run from a standalone drive, or from the command prompt, much like any of the PowerShell commands. However, even the PowerShell directory listing command failed some of the requirements.

Other software associated with forensic suite processing can either treat the suspect tree as raw evidence or "mount" the tree as if it were a mounted drive. Different results and capabilities may be seen depending on how you run the software. In most of my tests, if this option was available, I only treated the suspect tree as if it were "mounted" as a drive letter, not as a raw data tree or an image file.

The problem, or operational challenge as I call it, with the GUI programs is that you really need to learn the various settings required to obtain optimal results. Then, if the program doesn't allow you to save the settings, you need to redo the options every time you use the software. This may lead to differing output formats and, in the worst case, cause you to miss key items within the output list. So choose a program that can be set up with an ini-type configuration for "repeatability".

MD5 NOTE:

Even though calculating MD5 was not part of the test requirements, if a program had the option to calculate the MD5 of the file(s), I probably tested it. The reason is that if the program calculated the MD5 and allowed the last access date to be altered, it could possibly lead to evidence corruption. At the least, it could lead to some embarrassing defense questions about why the evidence's last access date was altered and/or not reported correctly. That is why, in some instances below, you see an MD5 / last access date comment. A quick way to spot-check your own hashing tool for this behavior is sketched below.
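As a minimal sketch (the filename is a placeholder), this records the last access time before and after a hash run and flags any change. It is only a spot check, not a substitute for testing against the full test suite:

# Record the last access time, hash the file, then check whether the timestamp moved.
$before = (Get-Item -Path '.\evidence.bin').LastAccessTime
Get-FileHash -Path '.\evidence.bin' -Algorithm MD5
$after = (Get-Item -Path '.\evidence.bin').LastAccessTime
if ($before -ne $after) { Write-Warning "Last access time changed: $before -> $after" }
# Note: Windows may disable or defer last access updates, so also verify against the test suite.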


Individual Program Results

This is a list of the programs I have tested so far. If you have one that is not on the list, let me know, and if I don't have to pay an exorbitant suite fee, I'll add it. Some were suites, some required installation, and some only examined an "image". Since these three factors could cause an individual program to pass or fail one of the four tests mentioned above, it was somewhat difficult to compare them on equal ground. So I urge you to conduct your own tests within the environment and for the requirement (A:, B: or C:) that you are operating within.

AUTOPSY
WINDOWS DIR CMD (using various options, obviously produced differing results)
DIRLISTER
DISKCAT
Forensic Explorer (FEX)
FILELIST_CREATOR
FILELIST_v2
FTK_IMAGER - ( both drive and folder mount, produced result differences)
KARENWARE
PARABEN_E3
POWERSHELL
SLEUTHKIT: FLS command (again, depending on the options, very different results)
TREESIZE_FREE
Again, because of how each program's capabilities fit into the A:, B:, or C: situations above, the test results may not be directly comparable. So you need to test in your own environment.


★ ★ ★ ★ The first program tested passed all the tests. (LFN's, ADS's, Report Generation, Time display) ★ ★ ★ ★

C:\TMP\FILETEST\FILE.C                57857 A...... 12/01/2021 11:16:47:075c 01/09/2022 14:02:22:298w 01/09/2022 14:02:22:298a EST
C:\TMP\FILETEST\FILE.C:ads_file.txt     317 .adata. 12/01/2021 11:16:47:075c 01/09/2022 14:02:22:298w 01/09/2022 14:02:22:298a EST
C:\TMP\FILETEST\FILE.H                 4005 A...... 12/01/2021 11:16:50:090c 12/01/2021 11:16:50:090w 01/08/2022 22:20:51:511a EST
C:\TMP\FILETEST\FILE_ONLY.C           24655 A...... 12/01/2021 11:16:47:075c 12/01/2021 11:16:47:075w 01/08/2022 22:20:51:527a EST
   -   ★ Created formatted tree structure easily imported to spreadsheet or database
   -   ★ Found long filenames/paths
   -   ★ Found alternate data streams
   -   ★ Was able to display GMT time as well as local timezones.
   -   ★ Command line program, so it is batch file compatible.


This program is a full GUI and requires installation.
It was run in two ways:

FULL PHYSICAL DRIVE run
   -    ★ showed all files, LFN's, ADS's and the output tree properly, BUT:
   -    ☹ output to the report contained much superfluous data which had to be removed before publishing.
   -    ★ did output proper MAC dates; even after hash calculation it did not alter the original access date.

LOGICAL FOLDER/TREE run
   -    ☹ Not able to output a full tree list, only a single directory at a time.
   -    ☹ Was not able to output MAC times. Probably I am just not familiar enough with its output options.

Both report/case creation runs required an extensive learning curve on which options and processes to use.


The WINDOWS DIR command with the correct options (/S recurse subdirectories, /R display alternate data streams, /C thousands separators, /4 four-digit years, /B bare format):

C:>dir \path\path\*     /S /R /C /4
C:>dir \path\path\* /B /S /R /C /4
   -    ☹ proved to have some forensic failures.
   -    ☹ NO single group of options passed all the tests.

   -    ★ showed all files, including ADS's, BUT
   -    ☹ when using the /B (bare) option it had many shortcomings.
   -    ☹ failed to find/show any files with long filename paths.
   -    ☹ failed to show dates of the ADS's; see below.
   -    ☹ only showed one date at a time; could not produce all 3 at once.
   -    ☹ separated each directory listing, which was not spreadsheet compatible.

 
Directory of H:\ORIGINALS\CATALOGERS\SNAP2HTML
12/14/2021  09:25 AM    <DIR>          .
12/14/2021  09:25 AM    <DIR>          ..
12/14/2021  09:54 AM    <DIR>          Snap2HTML
12/14/2021  09:22 AM           204,650 Snap2HTML.zip
                                    80 Snap2HTML.zip:Zone.Identifier:$DATA
12/14/2021  09:25 AM           246,926 TEST_OF_F.html
               2 File(s)        451,576 bytes
The problem with the DIR command is twofold.
If you use /B, which gets a nice path/filename listing (see the three lines below), BUT BUT BUT:
H:\ORIGINALS\CATALOGERS\SNAP2HTML\TEST_OF_F.html
H:\ORIGINALS\CATALOGERS\SNAP2HTML\Snap2HTML\ReadMe.txt
H:\ORIGINALS\CATALOGERS\SNAP2HTML\Snap2HTML\Snap2HTML.exe
   -    ☹ You lose the ability to include ADS's (alternate data streams, /R), and
   -    ☹ You lose the ability to include any of the file dates (/T:W).
   -    ☹ Still not able to show long filenames.
   -    ☹ ☹ Basically you lose all file information except the filename/path.


This directory listing program:
   -    created text output which segregated each folder's contents within a separate section of the output. See the example below.
   -    ☹ This made it difficult to combine the items from all the directories into a single list that could be imported into a spreadsheet. See the sample below.
The two main failures appear to be:
   -   ☹ Failed to find and list any long filename files.
   -   ☹ Failed to find any alternate data stream names. A major failure for these tests.

Output sample #1: Notice the format makes it virtually impossible to import into a spreadsheet or database. Also note that there is no last access date in the output; I couldn't find any option to include the access date.

===========
F:\SOURCE2\
===========
ALTERNATE_STREAM_FILE.TXT :: 48B :: Created: 2019-01-01 07:34:56, modified: 2019-01-01 07:34:56
all other files within this directory shown here, and then:

=========================
F:\SOURCE2\CYRILLIC_COPY\
=========================
Cyrillic.7z :: 12.59KB :: Created: 2019-01-01 07:34:56, modified: 2019-01-01 07:34:56
etc. etc. etc. and so on for all the directories in the tree


The next program:
   -    ★ Mostly worked OK, and was one of the two that did, but with the following problems:
   -   ☹ It did not offer any choice/option to include the three MAC times in the output.
   -   ☹ It included all the system-type $I30 and other system $EXTENT type entries, which increased counts unnecessarily. Not of any use to attorneys.
Major problem - inconsistent data:
   -   ☹☹ It included within the output a listing for a single file which was deleted, and did not indicate the deleted status.
   -    ☹ Within the GUI file list window it properly showed the file as deleted, but when exporting the list it included it as if it were a normal file.
   -    ☹ This could prove very problematic if more deleted files were inadvertently included in the output.


The next program:
   -    ★ Found and listed all the long filename parent files.
   -   ☹ It did not find or list any of the alternate data stream files.
   -    It did offer many ways to set the output date format and output file format.

Major problem:
Even though these two items were not a test requirement, the behaviors mentioned here could severely hinder your evidentiary process and results.

   -   ☹ When doing MD5's, it did not calculate MD5's for LFN files and left that column empty. Poor display practice.
   -   ☹☹ It altered the last access date of any file it calculated the MD5 of.


Next:
This is a full forensic suite, so I didn't think there would be any problems. However, I gave it a shot.
It worked fine except I could find no way of creating a complete tree structure output.
   -    ☹ All I could do was export a listing one directory at a time. This is not exactly what I was looking for.
   -   ★ It did find and list ALL the files, including ADS's and long filenames.
   -   ★ It provided all necessary MAC dates.
   -   ☹ For the ADS's, however, it did not fill in the date fields. That might be a slight drawback when you need file dates.
   -   ★ So, except for the fact that I was unable to produce a complete one-step full tree list, it provided adequate output and information.


The next program had output that looked like:

FOLDER	C:\TMP\TEST_USB\D1\	-------	2	15	772,744	772,744
FILE	---A---X	1/1/2019 07:34	1/1/2019 07:34	1/1/2019 07:34	54	_RESET_D1.BAT
FILE	---A----	1/1/2019 07:34	1/1/2019 07:34	1/1/2019 07:34	48	ALTERNATE_STREAM_FILE.TXT
FOLDER	C:\TMP\TEST_USB\D1\CYRILLIC_NAMES\	-------	0	5	226,341	226,341
FILE	--------	1/1/2019 07:34	1/1/2019 07:34	1/1/2019 07:34	12,889	Cyrillic.7z
FILE	--------	1/1/2019 07:34	1/1/2019 07:34	1/1/2019 07:34	93,971	Cyrillic_NAMES_W_ADS_PK.zip
   -   ★ It did find LFN's, which I was very surprised at.
   -   ☹ But when asked to hash, it had more problems with ADS's that were attached to LFN files.
   -   ☹ Again, this output doesn't lend itself very easily to import into Excel or a database.
   -   ★ It did, however, capture the three file dates, but did not allow output of seconds or GMT time.
   -   ☹ Missed 28 ADS's.


POWERSHELL dir command

The PowerShell command structure is worse than an engineer's nightmare. I had a hard time with the options, but managed some success.
I had significant problems finding options that fit all the requirements at once; some options that fit one requirement cancelled out another. So, in order not to be slanted, I have provided the name so that you can test this yourself without downloading any other software. If you have a command line that fits all the requirements (1, 2, 3, 4) at once, please let me know.

   -   ☹ FAILED to show/list alternate data streams.
                 -    Total listed ==== 124, missed the ADS's.
   -   ☹ No apparent command to show ADS's, or any date other than the write date.
   -    There may be scripts that can do this, but I couldn't find them. (One possible approach is sketched after the sample output below.)
   -   ☹ Again: its output was poorly designed for additional processing.

    Directory: F:\source2

Mode                 LastWriteTime         Length Name                                                                                                                                                 
----                 -------------         ------ ----                                                                                                                                                 
d-----        12/30/2019   6:07 PM                CYRILLIC_COPY                                                                                                                                        
d-----        12/30/2019   6:07 PM                CYRILLIC_NAMES                                                                                                                                       
d-----        12/30/2019   6:07 PM                top_of_lfn_folders                                                                                                                                   
------          1/1/2019   7:34 AM             48 ALTERNATE_STREAM_FILE.TXT
    
    Directory: F:\source2\CYRILLIC_COPY

Mode                 LastWriteTime         Length Name                                                                                                                                                 
----                 -------------         ------ ----                                                                                                                                                 
------          1/1/2019   7:34 AM          12889 Cyrillic.7z                                                                                                                                          
------          1/1/2019   7:34 AM          25894 CYRILLIC_NAMES_W_ADS.7z 
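For completeness, here is a minimal sketch (not the plain dir/Get-ChildItem listing tested above) of one way alternate data streams can be enumerated in PowerShell 3.0 and later, using Get-Item -Stream. It still inherits PowerShell's long path and formatting limitations, so treat it as a starting point, not a tested solution; F:\source2 follows the sample above.

# Walk the tree and list every named alternate data stream (skip the primary ':$DATA' stream).
Get-ChildItem -Path 'F:\source2' -Recurse -Force -File |
  ForEach-Object { Get-Item -LiteralPath $_.FullName -Stream * } |
  Where-Object { $_.Stream -ne ':$DATA' } |
  Select-Object FileName, Stream, Length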

Next:
   -   ★ The next one seemed to work very well as far as providing a good format for additional processing.
   -   ★ It provided a lot of information on the screen.
   -   ☹ However: it failed to list the ADS files, yet did list their sizes; couldn't figure that one out. ????
   -   ☹ When asked to export the list: failed to include ADS names.
   -    Could not find any options to show file times.
   -   ☹ Showed a column for ADS's, but didn't show them. Different listings for normal filenames and long filenames: curious.


★ ★ ★ The next program tested passed ALMOST all the tests. (LFN's, Report Generation, Time display) ★ ★ ★

   -   ★ Created formatted tree structure easily imported to spreadsheet or database.
   -   ★ Found long filenames/paths.
   -   ★ Command line program, so it is batch file compatible.
   -   ★ When it performed a SHA256 calculation, it properly restored the last access date.

   -   ☹ Its only failure was that it didn't find or process any of the alternate data stream files.


The next program proved it could be used as a true file listing program. See the two sample runs below.

However, the output had no column headings. Research of the documentation did explain which columns are produced, which means you would have to manually add column headings to the output data.

When I asked it to output the file dates and sizes, it provided sufficient data, but again, without appropriate column headings in the output, it would be necessary to add them to the output file manually. An easy script could fix this (a sketch follows), but it should not be necessary.
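A minimal PowerShell sketch of that fix (the header names and file names are assumptions based on the fields described here, not the program's own labels):

# Prepend an assumed header row so the tool's tab-delimited output imports cleanly.
$header = "entry`tpath`ttime1`ttime2`ttime3`ttime4`tsize`tuid`tgid"
$header | Set-Content -Path '.\listing_with_headers.txt'
Get-Content -Path '.\raw_listing.txt' | Add-Content -Path '.\listing_with_headers.txt'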

The time zone EST shown here was shortened from EASTERN STANDARD TIME for display purposes, and the tab field delimiter was replaced with the pipe (|), also for display purposes. Again, manually inserted column headings are necessary to determine which time is which.
   -   ★ With the correct gigantic command line it properly found and listed all files with paths.
   -   ★ It found and displayed proper MAC file times, and was capable of showing GMT times also.
   -   ★ Its output was compatible with import to a spreadsheet, with little effort.
   -   ☹☹ A major drawback is that it seems to have to be run on an image.
   -   ☹ And I could not find a way of having it list only a single tree's information. It seems to be designed to do an entire drive/image.
   -   ☹ To obtain MD5 values, I had to forgo the MAC times for the decimal (epoch) value of the MAC times, which was not usable.

Possibly a little more research could make this a viable "image" file cataloger. But when listing files within a full blown disk image, I really don't want to have to deal with a list of a few million records.

Overall, this is a usable program, but again, it must be run on an image. Not useful for getting single folder data off a server.

r/r 41-128-3:	ARTICLES/ads.htm
r/r 41-128-5:	ARTICLES/ads.htm:TEST_REPORT_COPY_stripped.xlsx
r/r 42-128-3:	ARTICLES/copy_that.htm
r/r 43-128-3:	ARTICLES/get_testy.htm
r/r 44-128-3:	ARTICLES/Get_Testy.pptx
r/r 45-128-3:	ARTICLES/hash_faqs.htm
r/r 46-128-3:	ARTICLES/hash_it_out.htm
r/r 47-128-3:	ARTICLES/hash_it_out_article.htm
r/r 47-128-5:	ARTICLES/hash_it_out_article.htm:TEST_REPORT_HASHING_stripped.xlsx
r/r 48-128-3:	ARTICLES/testing_invite.htm
r/r 49-128-3:	ARTICLES/testing_invite1.htm
r/r 50-128-3:	ARTICLES/what_time_is_it.htm
r/r 51-128-3:	ARTICLES/ZIP_IT.htm
r/r 52-128-3:	ARTICLES/ZIP_IT_TAKE2.htm
A second, alternate command line provides full time information, WITHOUT HEADERS.
r/r 41-128-3:|ARTICLES/ads.htm|2020-01-01 07:34:56 (EST)|2020-01-01 07:34:56 (EST)|2022-01-10 16:47:56 (EST)|2020-01-01 07:34:56 (EST)|19559|0|0

r/r 41-128-5:|ARTICLES/ads.htm:TEST_REPORT_COPY_stripped.xlsx|2020-01-01 07:34:56 (EST)|2020-01-01 07:34:56 (EST)|2022-01-10 16:47:56 (EST)|2020-01-01 07:34:56 (EST)|13633|0|0

r/r 42-128-3:|ARTICLES/copy_that.htm|2020-03-03 07:34:56 (EST)|2020-03-03 07:34:56 (EST)|2022-01-10 16:47:56 (EST)|2020-03-03 07:34:56 (EST)|18937|0|0


OTHER TESTS CONDUCTED

Here are other studies I performed on "forensic" copy and "forensic" hashing programs, and the results. Generally I only tested the free or eval versions, as I'm not about to pay for a piece of software just to add it to my tests. If anyone wants to provide me a paid version of the software, feel free, and I'll test it.
I attempted to be as accurate and consistent in my testing as possible, but stuff happens. So if you test any of the mentioned software and obtain a different result, please let me know so I can retest.

  DMARES.COM home page.
  A TRUE FORENSIC COPY PROGRAM
Associated articles and programs of interest:
  hash  program to calculate hash values.
  HASH_IT_OUT  an article discussing forensic hashing of evidence.
  ZIP_IT  an article regarding use of zipping software for forensics.
  ZIP_IT_TAKE2  an article explaining the testing of zipping software.



A challenge   (6/2020) for you to test your forensic hash/copy/zip software for forensic and evidentiary reliability.


Before closing you might want to look at some of these articles:
Using File Hashes to Ensure good forensic processes (article).
ZIP-IT for forensic retention (article)  and   ZIP_IT_TAKE2  Test your zipping capabilities.
MATCH FILE HASHES  with various Maresware software.


I would appreciate any comments or input you have regarding this article. Thank you. dan at dmares dot com.