orlikoski / cdqr

The Cold Disk Quick Response (CDQR) tool is a fast and easy-to-use forensic artifact parsing tool that works on disk images, mounted drives, and extracted artifacts from Windows, Linux, MacOS, and Android devices.

License: GNU General Public License v3.0

Python 86.47% Dockerfile 2.18% Shell 5.74% PowerShell 5.61%

cdqr's People

Contributors

dadokkio, lansatac, orlikoski, rough007


cdqr's Issues

No Win EXE

There doesn't appear to be an executable (cdqr.exe) file anywhere in the master branch.

ZIP Not Found

ERROR: "Test/[NameOfTheZip].zip" cannot be found by the system. Please verify filename and path are correct.

I'm consistently getting this error even though the ZIP exists at that location. I can also confirm that the hostname does not contain '-'.

root@skadi:/home/skadi# cdqr -z --max_cpu Test/[NameOfTheZip] --es_kb Issac1
Assigning CDQR to the host network
The Docker network can be changed by modifying the "DOCKER_NETWORK" environment variable
Example (default Skadi mode): export DOCKER_NETWORK=host
Example (use other Docker network): export DOCKER_NETWORK=skadi-backend
docker run --network host aorlikoski/cdqr:5.0.0 -y -z --max_cpu Test/[NameOfTheZip] --es_kb Issac1
CDQR Version: 5.0
Plaso Version: 20190331
Using parser: win
Number of cpu cores to use: 4
ERROR: "Test/[NameOfTheZip]" cannot be found by the system. Please verify filename and path are correct.
Exiting...

CDQR Parsing when Timesketch Elastic Search Not running

I've been running into problems where cdqr.py throws errors when Timesketch is not running.

@chunderstruck will look into this issue and submit a pull request.

Removing uncompressed files in directory: Results/artifacts/

Process to export to ElasticSearch started
Exporting results in TimeSketch format to the ElasticSearch server
"psort.py" "-o" "timesketch" "--status_view" "linear" "--name" "bumblebee" "--index" "bumblebee" "Results/BUMBLEBEE.plaso"
ERROR: There was a problem. See details in log.

Process not completing

Having trouble getting CDQR (v3.1.3) to complete its processing on some images/mounted devices.
I was successful with a 16GB thumb drive (9 min) and with the 3GB nps-2008-jean.E01 test image (1h 27min).

I created a 3.5GB .E01 image using FTK Imager Lite v3.1.1.8 on a Windows 7, x86, VMware Virtual Machine. Image was located on an external HDD connected via USB 3.0. Forensic Machine was running an updated Kali Linux VM with 8GB RAM as used with the thumb drive and jean.E01 above.
./cdqr.py -p win --max_cpu /media/root/1TB_SSD/Evidence/FinalProject.E01 --export
I force-stopped CDQR after 16 hours. The logfile created during the process was over 90 MB, but most of it looks like the below:

Identifier PID Status Sources Events File
Main 12733 aborted 77245 (0) 1901267 (0)
Worker_00 12741 running 40510 (0) 579900 (0) VSS2:TSK:/Windows/winsxs/Backup/x86_microsoft-windows-i..ltinstall.resources_31bf3856ad364e35_6.1.7600.16385_en-us_b0b31218d2ed84e5.manifest
Worker_01 12747 running 35189 (0) 37180 (0) VSS2:TSK:/Windows/winsxs/Backup/x86_microsoft-windows-f..truetype-angsananew_31bf3856ad364e35_6.1.7600.16385_none_63cb9dea654e41ff.manifest
Worker_02 12751 running 45059 (0) 71998 (0) VSS2:TSK:/Windows/winsxs/Backup/x86_microsoft-windows-feclient_31bf3856ad364e35_6.1.7600.16385_none_beb0674eb8e86a51.manifest
Worker_03 12755 running 18853 (0) 1272473 (0) VSS2:TSK:/Windows/winsxs/Backup/x86_microsoft-windows-filtermanager-core_31bf3856ad364e35_6.1.7600.16385_none_10dfc9158c1fa6f6.manifest

I subsequently moved to a Dell XPS13 running Windows 10, 8GB RAM (no VM) and started CDQR.
cdqr.py -p win --max_cpu G:\Evidence\FinalProject.E01
After 37 minutes it completed, but produced only two reports: MFT and File System. Neither appeared to contain the targeted information, compared with what was available in jean.E01.

I currently have the image mounted, with CDQR pointed at F:.
cdqr.py -p win --max_cpu F:
It has been running for two hours but has not yet started the reporting stage. The log file is empty, and the tmp0g4cpp and tmpuldq_5 folders are also empty. This attempt may be down to a syntax error.

Error when Unknown parser or plugin names

Hi Alan,

First of all, thanks for this great tool.

I am using CDQR version 5 with the following command:

"cdqr.exe -p win file.zip" (where "file.zip" is the result of CyLR)

It gives me an error, and the message that the log records is the following:

"Unknown parser or plugin names in element(s): "windows_typed_urls, ccleaner" of parser filter expression: bencode, binary_cookies, ccleaner, chrome_cache, chrome_preferences, czip, esedb, esedb/msie_webcache, filestat, firefox_cache, java_id, action msiecf, olecf, opera_global, opera_typed_history, eg, plist/safari_history, prefetch, recycle_bin, recycle_bin_info2, sccm, sophos_av, sqlite, sqlite/chrome_27_history, sqlite/chrome_8_history, sqlite/chrome_iesofrometekrometekromete_temptole, chrome_ies_dromete_temperature_square sqlite/firefox_downloads, sqlite/firefox_history, symantec_scanlog, windows_typed_urls, winevt, winevtx, winfirewall, winjob, winreg"

What could it be?
Thank you

Add support to accept defaults

I would really appreciate it if you could add the ability to automatically accept all the default answers to the questions asked when running CDQR.

TypeError: 'encoding' is an invalid keyword argument for this function

# log2timeline.py -V
plaso - log2timeline version 1.5.1

# python cdqr.py -p win --nohash --max_cpu disk.E01 Testing/
CDQR Version: 3.0
Plaso Version: 1.5
Using parser: win
Number of cpu cores to use: 8
Source data: disk.E01
Destination Folder: Testing
Database File: Testing/disk.E01.db
SuperTimeline CSV File: Testing/disk.E01.SuperTimeline.csv

Testing/disk.E01.log
Processing started at: 2017-01-25 18:16:14.127854
Parsing image
"log2timeline.py" "-p" "--partition" "all" "--vss_stores" "all" "--parsers" "appcompatcache,bagmru,binary_cookies,ccleaner,chrome_cache,chrome_cookies,chrome_extension_activity,chrome_history,chrome_preferences,explorer_mountpoints2,explorer_programscache,filestat,firefox_cache,firefox_cache2,firefox_cookies,firefox_downloads,firefox_history,google_drive,java_idx,mcafee_protection,mft,mrulist_shell_item_list,mrulist_string,mrulistex_shell_item_list,mrulistex_string,mrulistex_string_and_shell_item,mrulistex_string_and_shell_item_list,msie_zone,msiecf,mstsc_rdp,mstsc_rdp_mru,network_drives,opera_global,opera_typed_history,prefetch,recycle_bin,recycle_bin_info2,rplog,safari_history,symantec_scanlog,userassist,usnjrnl,windows_boot_execute,windows_boot_verify,windows_run,windows_sam_users,windows_services,windows_shutdown,windows_task_cache,windows_timezone,windows_typed_urls,windows_usb_devices,windows_usbstor_devices,windows_version,winevt,winevtx,winfirewall,winjob,winlogon,winrar_mru,winreg,winreg_default" "--hashers" "none" "--workers" "8" "Testing/disk.E01.db" "disk.E01"
Parsing ended at: 2017-01-25 18:44:51.130077
Parsing duration was: 0:28:37.002223

Creating the SuperTimeline CSV file
"psort.py" "-o" "l2tcsv" "Testing/disk.E01.db" "-w" "Testing/disk.E01.SuperTimeline.csv"
SuperTimeline CSV file is created

Creating the individual reports
Traceback (most recent call last):
File "cdqr.py", line 616, in
create_reports(dst_loc,csv_file)
File "cdqr.py", line 204, in create_reports
rpt_evt = open(rpt_evt_name,'a+', encoding='utf-8')
TypeError: 'encoding' is an invalid keyword argument for this function
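This TypeError is characteristic of running under Python 2, whose built-in open() does not accept an `encoding` keyword. As a sketch (the helper name here is hypothetical, mirroring the failing call in create_reports()), io.open() accepts `encoding` on both Python 2 and 3 and is a drop-in replacement:

```python
import io

# io.open() accepts the `encoding` keyword on both Python 2 and Python 3,
# unlike the Python 2 built-in open() that raised the TypeError above.
def open_report(rpt_evt_name):
    # hypothetical helper mirroring the failing call in create_reports()
    return io.open(rpt_evt_name, 'a+', encoding='utf-8')
```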

Execution of cdqr.exe requires log2timeline.exe

Hello,

I was executing cdqr.exe; however, a message pops up asking me to provide the path for log2timeline.exe. Where can I get the executable for log2timeline.exe? I navigated to the Plaso path that was mentioned, but I didn't find the executable there. Do I need to modify the code so that only Python code is taken as input?

Thanks and Regards
Tej Gandhi

Database Filename Issues on Windows

If I run a command like

cdqr.exe g:\

CDQR attempts to create a database file named g:.db; however, ':' is not a valid character in a Windows filename. This causes log2timeline to error out:

IOError: [Errno 13] Permission denied: u'g:.db' Failed to execute script log2timeline

but CDQR continues to run, so it just appears to run forever.

Could you either strip the characters that are illegal on Windows
\ / : * ? " < > |
or let the user set the name of the db file somehow?

My use case is examining a multipart DD image mounted by another program.
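A minimal sketch of the requested sanitization (the helper name and fallback are hypothetical): replace the Windows-illegal characters before deriving the database name, so that a source of "g:\" yields "g.db" instead of the invalid "g:.db".

```python
import re

# Hypothetical helper: replace characters that are illegal in Windows
# filenames (\ / : * ? " < > |) before deriving the database name, so a
# source of "g:\" yields "g.db" instead of the invalid "g:.db".
def safe_db_name(source, suffix='.db'):
    cleaned = re.sub(r'[\\/:*?"<>|]', '_', source).strip('_ .')
    return (cleaned or 'image') + suffix
```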

What's the right way to run dead box collection?

Hi there,

I'm attempting to run CDQR against an .E01 image on an external hard drive. However, I started an analysis job and it's been running for over 24 hours. Is there a guide somewhere to manually gathering just the necessary artifacts from a dead box image and then running CDQR against that?

Thanks,
Brian

log2timeline.py: error: unrecognized arguments: Results/artifacts/host1

Hi Alan,

CDQR Version: 20191226 errors out when used with Plaso Version: 20220428. Replicated on Ubuntu 20.04 and Kali 2022.2.

  • Error message: "log2timeline.py: error: unrecognized arguments: Results/artifacts/host1"
  • Troubleshooting suggests this is because this version of log2timeline.py requires the '--storage_file' flag before the path to the Plaso DB.

Full Error Output:

user@vm:~/CDQR/src/Results$ cat host1.log 
usage: log2timeline.py [-h] [--troubles] [-V] [--artifact_definitions PATH]
                       [--custom_artifact_definitions PATH] [--data PATH]
                       [--artifact_filters ARTIFACT_FILTERS]
                       [--artifact_filters_file PATH] [--preferred_year YEAR]
                       [--process_archives] [--skip_compressed_streams]
                       [-f FILE_FILTER] [--hasher_file_size_limit SIZE]
                       [--hashers HASHER_LIST]
                       [--parsers PARSER_FILTER_EXPRESSION]
                       [--yara_rules PATH] [--partitions PARTITIONS]
                       [--volumes VOLUMES] [--language LANGUAGE_TAG]
                       [--no_extract_winevt_resources] [-z TIME_ZONE]
                       [--no_vss] [--vss_only] [--vss_stores VSS_STORES]
                       [--credential TYPE:DATA] [-d] [-q] [-u] [--info]
                       [--use_markdown] [--no_dependencies_check]
                       [--logfile FILENAME] [--status_view TYPE] [-t TEXT]
                       [--buffer_size BUFFER_SIZE] [--queue_size QUEUE_SIZE]
                       [--single_process] [--process_memory_limit SIZE]
                       [--temporary_directory DIRECTORY] [--vfs_back_end TYPE]
                       [--worker_memory_limit SIZE] [--worker_timeout MINUTES]
                       [--workers WORKERS] [--sigsegv_handler]
                       [--profilers PROFILERS_LIST]
                       [--profiling_directory DIRECTORY]
                       [--profiling_sample_rate SAMPLE_RATE]
                       [--storage_file PATH] [--storage_format FORMAT]
                       [--task_storage_format FORMAT]
                       [SOURCE]
log2timeline.py: error: unrecognized arguments: Results/artifacts/host1
CDQR Version: 20191226
Plaso Version: 20220428
Using parser: win
Number of cpu cores to use: 4
Destination Folder: Results
Source data: Results/artifacts/host1
Log File: Results/host1.log
Database File: Results/host1.plaso
SuperTimeline CSV File: Results/host1.SuperTimeline.csv

Start time  was: 2022-06-25 18:20:55.086696
Processing started at: 2022-06-25 18:20:55.086861
Parsing image
"log2timeline.py" "--partition" "all" "--vss_stores" "all" "--status_view" "linear" "--parsers" "bash_history,bencode,czip,esedb,filestat,lnk,mcafee_protection,olecf,pe,prefetch,recycle_bin,recycle_bin_info2,sccm,sophos_av,sqlite,symantec_scanlog,winevt,winevtx,webhist,winfirewall,winjob,winreg,zsh_extended_history" "--hashers" "md5" "--workers" "4" "--logfile" "Results/host1_log2timeline.gz" "Results/host1.plaso" "Results/artifacts/host1" "--no_dependencies_check"
ERROR: There was a problem. See details in log.
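Based on the usage text above (which lists `--storage_file PATH` and a single positional `SOURCE`), a sketch of how the command line could be rebuilt for newer Plaso; the helper name is hypothetical:

```python
# Sketch of rebuilding the log2timeline command for newer Plaso, which
# (per the usage text above) takes the output file via --storage_file
# rather than as the first positional argument.
def build_l2t_args(plaso_db, source, extra=()):
    return ['log2timeline.py', *extra, '--storage_file', plaso_db, source]

args = build_l2t_args('Results/host1.plaso', 'Results/artifacts/host1')
```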

Thank you for all the work you put into your tools to make forensics more accessible!

Hope this helps,
Alex

CDQR does not parse Windows Event correctly to Kibana

Hi, I tried to parse CyLR output from a Windows target machine into Skadi using CDQR. After parsing, it seems the Windows event logs were not parsed completely.

The main columns, such as timestamp and event ID, were correct, but the XML strings were left as-is in one giant column. Is this not yet supported in the current version?

Unknown parser or plugin names in element(s): "bash"

Hi,
I have log2timeline plaso - log2timeline version 20190916 and CDQR 5.1.0 in Ubuntu 18.04
I got this error:
2019-10-29 10:27:14,880 [ERROR] (MainProcess) PID:2845 Unknown parser or plugin names in element(s): "bash" of parser filter expression: bash,bencode,binary_cookies,chrome_cache,chrome_preferences,czip,esedb,esedb/msie_webcache,filestat,firefox_cache,java_idx,lnk,mcafee_protection,msiecf,olecf,opera_global,opera_typed_history,pe,plist/safari_history,prefetch,recycle_bin,recycle_bin_info2,sccm,sophos_av,sqlite,sqlite/chrome_27_history,sqlite/chrome_8_history,sqlite/chrome_autofill,sqlite/chrome_cookies,sqlite/chrome_extension_activity,sqlite/firefox_cookies,sqlite/firefox_downloads,sqlite/firefox_history,symantec_scanlog,winevt,winevtx,winfirewall,winjob,winreg,zsh_extended_history

when I launch:
./cdqr.py disk.dd

cdqr breaks on unicode characters

Running cdqr v.4.4.0 via skadi 2019.2 docker setup.

CDQR exits with an "unable to extract" error.

david@skadi:~$ cdqr in:CASE-DEVICE.zip out:CASE-DEVICE -p datt -z --max_cpu --es_kb CASE-DEVICE --es_ts CASE-DEVICE
docker run -v /etc/hosts:/etc/hosts:ro --network host -v /home/david/CASE-DEVICE.zip:home/david/CASE-DEVICE.zip -v /home/david/CASE-DEVICE:/home/david/CASE-DEVICE -v /etc/timesketch.conf:/etc/timesketch.conf aorlikoski/cdqr:4.4.0 -y /home/david/CASE-DEVICE.zip /home/david/CASE-DEVICE -p datt -z --max_cpu --es_kb CASE-DEVICE --es_ts CASE-DEVICE
CDQR Version: 4.4
Plaso Version: 20190131
WARNING!! Known compatible version of Plaso NOT detected. Attempting to use default parser list. Try using the --no_dependencies_check if Plaso dependancies are the issue.
Using parser: datt
Number of cpu cores to use: 4
Destination Folder: /home/david/CASE-DEVICE
Attempting to extract source file: /home/david/CASE-DEVICE.zip
Unable to extract file: /home/david/CASE-DEVICE.zip
'ascii' codec can't encode character '\u2310' in position 135: ordinal not in range(128)

Make MFT and USNJRNL Optional

Due to the quality of the results and how long it takes to process MFT and USNJRNL entries with Plaso, consider making them either opt-in or opt-out of the default parser lists.
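A minimal sketch of the opt-out idea (the helper name is hypothetical): filter the slow parsers out of a comma-separated `--parsers` expression before invoking log2timeline.

```python
# Sketch: make the slow mft/usnjrnl parsers opt-in by filtering them out
# of a comma-separated --parsers expression (names as used by Plaso).
def drop_parsers(parser_expr, unwanted=('mft', 'usnjrnl')):
    return ','.join(p for p in parser_expr.split(',') if p not in unwanted)
```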

Error when Results folder already exists

Recently built Ubuntu 16.04 Server LTS image, using the Skadi signed installer [as at 21/03/2019].

When attempting to run cdqr.py when there is already a Results folder, the following error is displayed:

david@skadi:~$ cdqr.py -p datt --max_cpu --es_kb HCS -z CYLRTRIAGEIMAGE.zip
CDQR Version: 4.3
Plaso Version: 20190131
WARNING!! Known compatible version of Plaso NOT detected. Attempting to use default parser list. Try using the --no_dependencies_check if Plaso dependancies are the issue.
Using parser: datt
Number of cpu cores to use: 2
Traceback (most recent call last):
  File "/usr/local/bin/cdqr.py", line 1771, in <module>
    main()
  File "/usr/local/bin/cdqr.py", line 1684, in main
    if not query_yes_no("\n"+dst_loc+" already exists. Would you like to use that directory anyway?","yes"):
  File "/usr/local/bin/cdqr.py", line 672, in query_yes_no
    if args.confirmAll:
NameError: name 'args' is not defined
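The traceback shows query_yes_no() reading a module-level `args` that main() never defines. A simplified sketch of the fix (function body abbreviated, not CDQR's actual implementation): pass the confirm-all flag in explicitly instead of relying on a global.

```python
# Simplified sketch of the bug in the traceback above: query_yes_no()
# read a module-level `args` that was never defined, raising NameError.
# Passing the flag explicitly removes the hidden global dependency.
def query_yes_no(question, default='yes', confirm_all=False):
    if confirm_all:          # was `if args.confirmAll:` -> NameError
        return True
    # interactive prompt elided for brevity
    return default == 'yes'
```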

add skip compressed file parameter

I've been running cdqr since Sept 13th against an 80GB disk.
In the log file I see log2timeline processing java/zip files, and it is taking very long.
Sometimes it would be helpful to skip the analysis of compressed files.
What do you think?
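Newer Plaso's log2timeline.py advertises a `--skip_compressed_streams` flag in its usage text (quoted in another issue above). A sketch of exposing it as an opt-in switch when building the command line; the helper name is hypothetical:

```python
# Sketch: append Plaso's --skip_compressed_streams flag on request when
# building the log2timeline command line.
def l2t_args(source, skip_compressed=False):
    args = ['log2timeline.py']
    if skip_compressed:
        args.append('--skip_compressed_streams')
    args.append(source)
    return args
```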

Bypass pause at the end of CDQR processing

Alan, can you please add a switch to let users bypass the pause at the end of a CDQR run? As you are aware, I have included CDQR in my scripting processes and have automated CaseFramework processing (with the exception of the Volatility setup prior to processing, which requires minimal user interaction). Bypassing the pause would let me continue additional processing without requiring the user to press a key to continue.
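A sketch of such a switch with argparse (the `maybe_pause` helper is hypothetical; CDQR's Docker wrapper already passes a `-y` flag, as seen in the command lines quoted in other issues here):

```python
import argparse

# Sketch: a -y/--yes switch that auto-accepts prompts and skips the
# final "press any key" pause, for use in scripted pipelines.
parser = argparse.ArgumentParser(prog='cdqr')
parser.add_argument('-y', '--yes', action='store_true',
                    help='assume yes for all prompts and skip the final pause')

def maybe_pause(opts):
    # only block for keyboard input in interactive runs
    if not opts.yes:
        input('Press any key to continue...')
```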

Manage Timeout

During "Exporting results in Kibana format to the ElasticSearch server" I got this error:

2018-09-24 21:42:32,266 [WARNING] (MainProcess) PID:8896 <base> POST http://12.3.4:9200/case_cdqr-mydisk1/plaso_event/_bulk [status:N/A request:300.019s]
Traceback (most recent call last):
  File "site-packages\elasticsearch\connection\http_urllib3.py", line 114, in perform_request
  File "site-packages\urllib3\connectionpool.py", line 649, in urlopen
  File "site-packages\urllib3\util\retry.py", line 333, in increment
  File "site-packages\urllib3\connectionpool.py", line 600, in urlopen
  File "site-packages\urllib3\connectionpool.py", line 388, in _make_request
  File "site-packages\urllib3\connectionpool.py", line 308, in _raise_timeout
ReadTimeoutError: HTTPConnectionPool(host=u'1.2.3.4', port=9200): Read timed out. (read timeout=300)
Traceback (most recent call last):
  File "tools\psort.py", line 68, in <module>
  File "tools\psort.py", line 54, in Main
  File "plaso\cli\psort_tool.py", line 561, in ProcessStorage
  File "plaso\multi_processing\psort.py", line 1017, in ExportEvents
  File "plaso\multi_processing\psort.py", line 534, in _ExportEvents
  File "plaso\multi_processing\psort.py", line 431, in _ExportEvent
  File "plaso\multi_processing\psort.py", line 594, in _FlushExportBuffer
  File "plaso\output\interface.py", line 118, in WriteEventMACBGroup
  File "plaso\output\interface.py", line 73, in WriteEvent
  File "plaso\output\shared_elastic.py", line 292, in WriteEventBody
  File "plaso\output\shared_elastic.py", line 218, in _InsertEvent
  File "plaso\output\shared_elastic.py", line 112, in _FlushEvents
  File "site-packages\elasticsearch\client\utils.py", line 73, in _wrapped
  File "site-packages\elasticsearch\client\__init__.py", line 1174, in bulk
  File "site-packages\elasticsearch\transport.py", line 312, in perform_request
  File "site-packages\elasticsearch\connection\http_urllib3.py", line 122, in perform_request
elasticsearch.exceptions.ConnectionTimeout: ConnectionTimeout caused by - ReadTimeoutError(HTTPConnectionPool(host=u'1.2.3.4', port=9200): Read timed out. (read timeout=300))
Failed to execute script psort

This is probably due to high ES load. Do I now have to delete the ES index (~37M docs, ~22GB) and retry from scratch?
It would be useful to manage the timeout, e.g. "retry n times on timeout", and maybe also to support resuming (which is probably more complicated).
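A sketch of the requested retry behaviour (the helper is hypothetical, and TimeoutError stands in for elasticsearch's ConnectionTimeout): retry the bulk export a fixed number of times with growing backoff instead of aborting the whole run.

```python
import time

# Sketch: retry a bulk export call up to `retries` times on timeout,
# sleeping longer after each failed attempt, instead of aborting.
# TimeoutError stands in for elasticsearch's ConnectionTimeout here.
def bulk_with_retry(send, retries=3, backoff=5):
    for attempt in range(retries):
        try:
            return send()
        except TimeoutError:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))
```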

Can't parse zip if hostname contains '-'

First off - cool tool 👍

My hostname has two '-' characters in it, and this causes cdqr to fail at position 113.

skadi@skadi:~$ cdqr in:NOT-MY-HOSTNAME.zip out:Results -p win --max_cpu -z
Assigning CDQR to the host network
The Docker network can be changed by modifying the "DOCKER_NETWORK" environment variable
Example (default Skadi mode): export DOCKER_NETWORK=host
Example (use other Docker network): export DOCKER_NETWORK=skadi-backend
docker run  --network host  -v /home/skadi/NOT-MY-HOSTNAME.zip:/home/skadi/NOT-MY-HOSTNAME.zip -v /home/skadi/Results:/home/skadi/Results aorlikoski/cdqr:5.1.0 -y /home/skadi/NOT-MY-HOSTNAME.zip /home/skadi/Results -p win --max_cpu -z
CDQR Version: 5.0
Plaso Version: 20190331
Using parser: win
Number of cpu cores to use: 4
Destination Folder: /home/skadi/Results
Attempting to extract source file: /home/skadi/NOT-MY-HOSTNAME.zip
Unable to extract file: /home/skadi/NOT-MY-HOSTNAME.zip
'ascii' codec can't encode character '\u2013' in position 113: ordinal not in range(128)

The \u2013 character is the en dash ('–'): https://www.fileformat.info/info/unicode/char/2013/index.htm
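The failure can be reproduced directly (the sample name below is illustrative): encoding a name containing U+2013 to ASCII raises the same UnicodeEncodeError, while UTF-8 handles any Unicode text.

```python
# Reproducing the failure: encoding a name that contains U+2013 (en dash)
# to ASCII raises the same UnicodeEncodeError, while UTF-8 succeeds.
name = 'NOT\u2013MY\u2013HOSTNAME.zip'
try:
    name.encode('ascii')
    ascii_ok = True
except UnicodeEncodeError:
    ascii_ok = False

utf8_bytes = name.encode('utf-8')  # succeeds for any Unicode text
```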

Not able to generate individual CSV files help me

Plaso version: 1.4.0 Release

Operating system Plaso is running on: Windows 7

Description of problem:

I got the generated .db file, but I am not able to generate the individual CSV files. Please help, as soon as possible if you can.

value out of range
[ERROR] Unable to copy 8378110232981209856 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378110232981209856 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378110232981209856 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378110232981209856 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378121015794462777 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378121015794530608 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378121015794661429 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378648688585023858 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378673109732614202 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8378677572206928246 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8379513036173553264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8399962630399972454 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8411810075432968750 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8434944808433363510 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8439958577042025589 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8447053898000436037 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8450411854277059429 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8458219050600409096 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8458219050600409096 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8518571447666484596 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8583311014230429528 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8591754146788160101 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8594572151534069614 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8594572151534069614 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8595128521853384031 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8609712713317859328 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8609712713317859431 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8611808730372752904 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8663823465366630521 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8666627607517475399 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8731018281124574318 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8731018281124574318 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8735594369897606253 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8740091451076981553 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8808409356843886948 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8808409356843886948 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8808409356843886948 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8808409356843886948 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8808409356843886948 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8813848390023104008 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8813848390023104008 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8813848390023104008 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8813848390023104008 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8841300321372195078 to a datetime object with error: date value out of range
[ERROR] Unable to copy 8939877146065568892 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9154526533530086620 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9206932593046046216 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9240788109554383359 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9439857534320624385 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9450563123376473031 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9523187321566512648 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9590061817797748577 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9791040504015627805 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9797244691458117902 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9847734146204471227 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9853854227379254513 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9894897215608114824 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9913299544659576328 to a datetime object with error: date value out of range
[ERROR] Unable to copy 9939857530394120056 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10083578031867000541 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10108913072317066314 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10223587223576102408 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10240120813717497254 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10338638747030757372 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10393353228237767891 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10488740624457176785 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10646627337393804219 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10711569826296718099 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10770706921350676993 to a datetime object with error: date value out of range
[ERROR] Unable to copy 10941357210281297416 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11202710936523554971 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11309358565153004693 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11407764457095883251 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11441476671244977672 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11441476671244977672 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11659261648254887637 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11675845096727635881 to a datetime object with error: date value out of range
[ERROR] Unable to copy 11700491379558871167 to a datetime object with error: date value out of range
[ERROR] Unable to copy 12212834748250371440 to a datetime object with error: date value out of range
[ERROR] Unable to copy 12484946163585452748 to a datetime object with error: date value out of range
[ERROR] Unable to copy 12537763983824454776 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13026051081429436823 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13093039967066734416 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13344385581339849458 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13419619099931288297 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13459082866923749195 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13500498977837386347 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13967528769758019584 to a datetime object with error: date value out of range
[ERROR] Unable to copy 13977340349366593948 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14024985342528702832 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14117589017205261832 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14303972000890317429 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14670673538803115433 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14794764293904813278 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14893090895897309988 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14909593410259559116 to a datetime object with error: date value out of range
[ERROR] Unable to copy 14999051346928498082 to a datetime object with error: date value out of range
[ERROR] Unable to copy 15272740937592059539 to a datetime object with error: date value out of range
[ERROR] Unable to copy 15416721789605011684 to a datetime object with error: date value out of range
[ERROR] Unable to copy 15712939374547955685 to a datetime object with error: date value out of range
[ERROR] Unable to copy 15952641062218334931 to a datetime object with error: date value out of range
[ERROR] Unable to copy 15956788365783282840 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16044906716129835036 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16060983515880736264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16060983515880736264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16060983515880736264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16060983515880736264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16060983515880736264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16060983515880736264 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16270856394576843160 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16281855883216718985 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16329604955889501659 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16388456692791771542 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16420201675374301806 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16433156886369758529 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16489706494824587995 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16548111423035854027 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16579708170927926317 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16599966975918121724 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16737837482992651562 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16777773824840613374 to a datetime object with error: date value out of range
[ERROR] Unable to copy 16847281535263874031 to a datetime object with error: date value out of range
[ERROR] Unable to copy 17022428369045276168 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17090698145525515016 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17348130281417043577 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17391062608280086992 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17678013276618343432 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17678013276618343432 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17678013276618343432 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17678936420003825205 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17681889966300943275 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17854066264658587272 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 17931355049309095432 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18010297785160155912 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18071948832602479032 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18195557796009138387 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18244516511671060495 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18279817475589259264 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18279817475589259372 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18395699398406737657 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18431474510272775688 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18431474510272775688 to a datetime object with error: dat
e value out of range
[ERROR] Unable to copy 18435099600109551615 to a datetime object with error: dat
e value out of range
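The raw values in these errors look like Windows FILETIME timestamps (counts of 100-nanosecond intervals since 1601-01-01 UTC). The specific values logged here are so large that the corresponding date falls far beyond year 9999, the maximum Python's `datetime` can represent, which is what produces the "date value out of range" message. A minimal sketch of that conversion (the `filetime_to_datetime` helper below is illustrative, not CDQR's or plaso's actual code):

```python
from datetime import datetime, timedelta

# FILETIME counts 100-nanosecond intervals since the Windows epoch.
FILETIME_EPOCH = datetime(1601, 1, 1)

def filetime_to_datetime(filetime: int) -> datetime:
    """Convert a Windows FILETIME value to a Python datetime (naive UTC)."""
    return FILETIME_EPOCH + timedelta(microseconds=filetime // 10)

# A plausible FILETIME resolves to an ordinary modern date:
print(filetime_to_datetime(132223104000000000))  # a date around 2020

# One of the values from the log above works out to roughly year 43,000,
# past datetime.max (year 9999), so the addition overflows:
try:
    filetime_to_datetime(13344385581339849458)
except OverflowError as exc:
    print(exc)  # date value out of range
```

So the errors indicate corrupt or non-timestamp data in individual fields rather than a failure of the run itself; the "Processing completed." line and event counters below confirm the rest of the timeline was still produced.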
Processing completed.

*********************************** Counter ************************************

Stored Events : 16967253
Events Included : 16967253

Duplicate Removals : 14346020

C:\Windows\system32>

Source data:

Error in creating Individual CSV report.
