log2timeline / dfdatetime
Digital Forensics date and time
License: Apache License 2.0
Currently we have PosixTimeInMicroseconds; we should have an equivalent for milliseconds.
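A milliseconds variant would mirror the existing microseconds class. As a minimal standalone sketch (illustrative name, not dfdatetime code), the conversion from a POSIX timestamp in milliseconds to a date and time value could look like:

```python
from datetime import datetime, timezone

def posix_milliseconds_to_datetime(timestamp):
  """Converts a POSIX timestamp in milliseconds to a UTC datetime."""
  seconds, milliseconds = divmod(timestamp, 1000)
  date_time = datetime.fromtimestamp(seconds, tz=timezone.utc)
  return date_time.replace(microsecond=milliseconds * 1000)
```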
Improve fraction support to handle arbitrary fractions. For example, to support a 7-digit fraction: 2012-03-05T20:40:00.0000000Z for log2timeline/plaso#1187
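A standalone sketch of parsing an ISO 8601 UTC string with any number of fraction digits, truncating beyond microsecond precision (illustrative names, not dfdatetime's API):

```python
import re
from datetime import datetime, timezone

_ISO8601_RE = re.compile(
    r'(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})(?:\.(\d+))?Z')

def parse_iso8601_with_fraction(time_string):
  """Parses an ISO 8601 UTC string with any number of fraction digits."""
  match = _ISO8601_RE.fullmatch(time_string)
  if not match:
    raise ValueError('unsupported time string: {0:s}'.format(time_string))
  year, month, day, hours, minutes, seconds = (
      int(value) for value in match.groups()[:6])
  # Zero-pad or truncate the fraction to microsecond precision.
  fraction = (match.group(7) or '').ljust(6, '0')[:6]
  return datetime(
      year, month, day, hours, minutes, seconds, int(fraction),
      tzinfo=timezone.utc)
```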
One option could be to preserve the information, e.g. in a time zone object. This would allow "local time zone" to be defined semantically. The question here would be how to compare different date time values using the normalized timestamp.
Add set precision method
Remove the deprecated CopyFromString around mid-2018.
GetPlasoTimestamp is needed for the migration of plaso to dfdatetime, deprecate it when no longer needed.
Golang serializes timestamps (time.Time structs) in the following binary form [0]
Epoch/zero value is January 1, year 1, 00:00:00.000000000 UTC [1]
Per the conversation in https://codereview.appspot.com/335010043/: to distinguish a "string representation of the object" from a "date time string", the decision was to rename CopyToString to CopyToDatetimeString. CopyFromString should match this naming convention for consistency.
Add a CopyFromDatetime to TimeElements to deprecate plaso PythonDatetimeEvent. Also see: https://github.com/log2timeline/plaso/blob/master/plaso/containers/time_events.py#L64
Add CopyToDateTimeStringISO8601 method
Have DelphiDateTime check for maximum supported date time
Additional context log2timeline/plaso#3153
From: https://ci.appveyor.com/project/joachimmetz/dfdatetime/build/289/job/dpv7r38vvuc1bgs3
%PYTHON%\python.exe %PYTHON%\Scripts\pywin32_postinstall.py -install
Traceback (most recent call last):
File "C:\Python36\Scripts\pywin32_postinstall.py", line 594, in <module>
install()
File "C:\Python36\Scripts\pywin32_postinstall.py", line 328, in install
LoadSystemModule(lib_dir, "pywintypes")
File "C:\Python36\Scripts\pywin32_postinstall.py", line 166, in LoadSystemModule
mod = imp.load_dynamic(modname, filename)
File "C:\Python36\lib\imp.py", line 343, in load_dynamic
return _load(spec)
File "<frozen importlib._bootstrap>", line 684, in _load
File "<frozen importlib._bootstrap>", line 658, in _load_unlocked
File "<frozen importlib._bootstrap>", line 571, in module_from_spec
File "<frozen importlib._bootstrap_external>", line 922, in create_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
ImportError: DLL load failed: The specified module could not be found.
Online references indicate that the DLL could require the Microsoft Visual C++ 2010 Redistributable Package.
Set the fat_date_time attribute when FatDateTime is set from a string.
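For reference, a FAT date and time packs its elements into two 16-bit values; a standalone decoding sketch, assuming a combined 32-bit layout with the date in the lower word and the time in the upper word (the on-disk word order varies by context):

```python
def fat_date_time_to_tuple(fat_date_time):
  """Deconstructs a 32-bit FAT date and time value into its elements.

  Assumes the date in the lower 16 bits and the time in the upper 16 bits.
  """
  day_of_month = fat_date_time & 0x1f
  month = (fat_date_time >> 5) & 0x0f
  year = 1980 + ((fat_date_time >> 9) & 0x7f)
  # The time of day is stored with 2-second granularity.
  seconds = ((fat_date_time >> 16) & 0x1f) * 2
  minutes = (fat_date_time >> 21) & 0x3f
  hours = (fat_date_time >> 27) & 0x1f
  return (year, month, day_of_month, hours, minutes, seconds)
```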
https://tools.ietf.org/html/rfc2579
These timestamps are used in IPP cache files (https://tools.ietf.org/html/rfc2911).
DateAndTime ::= TEXTUAL-CONVENTION
DISPLAY-HINT "2d-1d-1d,1d:1d:1d.1d,1a1d:1d"
STATUS current
DESCRIPTION
"A date-time specification.
field octets contents range
----- ------ -------- -----
1 1-2 year* 0..65536
2 3 month 1..12
3 4 day 1..31
4 5 hour 0..23
5 6 minutes 0..59
6 7 seconds 0..60
(use 60 for leap-second)
7 8 deci-seconds 0..9
8 9 direction from UTC '+' / '-'
9 10 hours from UTC* 0..13
10 11 minutes from UTC 0..59
* Notes:
- the value of year is in network-byte order
- daylight saving time in New Zealand is +13
For example, Tuesday May 26, 1992 at 1:30:15 PM EDT would be
displayed as:
1992-5-26,13:30:15.0,-4:0
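A standalone sketch decoding the 11-byte DateAndTime value described above into its display form (illustrative only, not dfdatetime's implementation):

```python
import struct

def rfc2579_bytes_to_string(byte_stream):
  """Decodes an 11-byte RFC 2579 DateAndTime value into its display form."""
  (year, month, day, hours, minutes, seconds, deciseconds, direction,
   offset_hours, offset_minutes) = struct.unpack('>H6Bc2B', byte_stream)
  return '{0:d}-{1:d}-{2:d},{3:d}:{4:d}:{5:d}.{6:d},{7:s}{8:d}:{9:d}'.format(
      year, month, day, hours, minutes, seconds, deciseconds,
      direction.decode('ascii'), offset_hours, offset_minutes)
```

Applied to the RFC's example value, this yields the `1992-5-26,13:30:15.0,-4:0` display string.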
add semantic time convenience classes
_GetDateValues
Add support for OLE Automation date (aka Floatingtime or Application Time)
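An OLE Automation date stores a floating point number of days since 1899-12-30, with the fractional part encoding the time of day; a minimal conversion sketch (ignoring the inverted time fraction some implementations use for negative values):

```python
from datetime import datetime, timedelta

_OLE_AUTOMATION_EPOCH = datetime(1899, 12, 30)

def ole_automation_date_to_datetime(ole_date):
  """Converts an OLE Automation date to a datetime.

  Assumes a non-negative number of days since 1899-12-30.
  """
  return _OLE_AUTOMATION_EPOCH + timedelta(days=ole_date)
```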
dfvfs tests are failing with latest dfdatetime
======================================================================
FAIL: testGetStat (vfs.cpio_file_entry.CPIOFileEntryTest)
Tests the GetStat function.
----------------------------------------------------------------------
Traceback (most recent call last):
File "dfvfs/tests/vfs/cpio_file_entry.py", line 75, in testGetStat
self.assertFalse(hasattr(stat_object, 'mtime_nano'))
AssertionError: True is not false
======================================================================
FAIL: testGetStat (vfs.gzip_file_entry.GZIPFileEntryTest)
Tests the GetStat function.
----------------------------------------------------------------------
Traceback (most recent call last):
File "dfvfs/tests/vfs/gzip_file_entry.py", line 70, in testGetStat
self.assertFalse(hasattr(stat_object, 'mtime_nano'))
AssertionError: True is not false
======================================================================
FAIL: testGetStat (vfs.tar_file_entry.TARFileEntryTest)
Tests the GetStat function.
----------------------------------------------------------------------
Traceback (most recent call last):
File "dfvfs/tests/vfs/tar_file_entry.py", line 75, in testGetStat
self.assertFalse(hasattr(stat_object, 'mtime_nano'))
AssertionError: True is not false
Per [0], timestamps can be formatted using nanosecond precision:
RFC3339Nano = "2006-01-02T15:04:05.999999999Z07:00"
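Since Python's datetime is limited to microseconds, one option is to parse such a string into a POSIX timestamp in nanoseconds; a sketch handling only the UTC (Z) form for brevity (illustrative names):

```python
import re
from datetime import datetime, timezone

def parse_rfc3339_nano(time_string):
  """Parses an RFC 3339 UTC string with up to 9 fraction digits.

  Returns the corresponding POSIX timestamp in nanoseconds.
  """
  match = re.fullmatch(
      r'(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})(?:\.(\d{1,9}))?Z', time_string)
  if not match:
    raise ValueError('unsupported RFC 3339 string: {0:s}'.format(time_string))
  date_time = datetime.strptime(
      match.group(1), '%Y-%m-%dT%H:%M:%S').replace(tzinfo=timezone.utc)
  # Right-pad the fraction to 9 digits so e.g. ".5" means 500000000 ns.
  nanoseconds = int((match.group(2) or '0').ljust(9, '0'))
  return int(date_time.timestamp()) * 10**9 + nanoseconds
```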
Add public method to retrieve the date and time of day
When running plaso (latest from head at the time the issue was sent out) against a set of files collected from a Mac OS X system, an error occurs:
Traceback (most recent call last):
File "<my_pyenv_path>/plaso_env/bin/log2timeline.py", line 4, in <module>
__import__('pkg_resources').run_script('plaso==20220816', 'log2timeline.py')
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/pkg_resources/__init__.py", line 667, in run_script
self.require(requires)[0].run_script(script_name, ns)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/pkg_resources/__init__.py", line 1464, in run_script
exec(code, namespace, namespace)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/EGG-INFO/scripts/log2timeline.py", line 99, in <module>
if not Main():
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/EGG-INFO/scripts/log2timeline.py", line 73, in Main
tool.ExtractEventsFromSources()
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/cli/extraction_tool.py", line 730, in ExtractEventsFromSources
processing_status = self._ProcessSources(session, storage_writer)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/cli/extraction_tool.py", line 587, in _ProcessSources
storage_file_path=self._storage_file_path)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/extraction_engine.py", line 1017, in ProcessSources
source_configurations, storage_writer, session_identifier)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/extraction_engine.py", line 591, in _ProcessSources
self._ScheduleTasks(storage_writer, session_identifier)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/extraction_engine.py", line 705, in _ScheduleTasks
self._MergeTaskStorage(storage_writer, session_identifier)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/extraction_engine.py", line 476, in _MergeTaskStorage
maximum_number_of_containers=self._maximum_number_of_containers)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/extraction_engine.py", line 388, in _MergeAttributeContainers
container = merge_helper.GetAttributeContainer()
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/merge_helpers.py", line 64, in GetAttributeContainer
container = next(self._generator)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/multi_process/merge_helpers.py", line 47, in _GetAttributeContainers
container_type):
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/storage/sqlite/sqlite_file.py", line 401, in _GetAttributeContainersWithFilter
container_type, column_names, row, 1)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/storage/sqlite/sqlite_file.py", line 332, in _CreatetAttributeContainerFromRow
attribute_value = self._serializer.ReadSerialized(attribute_value)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/serializer/json_serializer.py", line 493, in ReadSerialized
return cls.ReadSerializedDict(json_dict)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/serializer/json_serializer.py", line 517, in ReadSerializedDict
json_object = cls._ConvertJSONToValue(json_dict)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/plaso-20220816-py3.7.egg/plaso/serializer/json_serializer.py", line 294, in _ConvertJSONToValue
return convert_function(json_dict)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/dfdatetime/serializer.py", line 34, in ConvertDictToDateTimeValues
return cls.ConvertJSONToDateTimeValues(json_dict)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/dfdatetime/serializer.py", line 178, in ConvertJSONToDateTimeValues
date_time = factory.Factory.NewDateTimeValues(class_name, **json_dict)
File "<my_pyenv_path>/plaso_env/lib/python3.7/site-packages/dfdatetime/factory.py", line 46, in NewDateTimeValues
return date_time_values_type(**kwargs)
TypeError: __init__() got an unexpected keyword argument 'time_zone_offset'
This happens in the factory code: when NewDateTimeValues instantiates the date time class RFC2579DateTime, the kwargs sent by plaso do not conform to the structure that RFC2579DateTime expects, which is:
struct {
  uint16_t year,
  uint8_t month,
  uint8_t day_of_month,
  uint8_t hours,
  uint8_t minutes,
  uint8_t seconds,
  uint8_t deciseconds,
  char direction_from_utc,
  uint8_t hours_from_utc,
  uint8_t minutes_from_utc
}
Instead, the kwargs have the form:
{'rfc2579_date_time_tuple': [2022, 1, 17, 12, 55, 51, 0], 'time_zone_offset': 0}
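One illustrative way a factory could guard against such a mismatch is to filter keyword arguments against the class signature before instantiating; this is a hypothetical sketch with made-up names, not necessarily the actual dfdatetime fix:

```python
import inspect

class DemoRFC2579DateTime(object):
  """Illustrative stand-in for a class without time zone offset support."""

  def __init__(self, rfc2579_date_time_tuple=None):
    self.rfc2579_date_time_tuple = rfc2579_date_time_tuple

def new_date_time_values(date_time_class, **kwargs):
  """Instantiates a date time class, dropping unsupported keyword arguments."""
  parameters = set(inspect.signature(date_time_class.__init__).parameters)
  parameters.discard('self')
  supported_kwargs = {
      name: value for name, value in kwargs.items() if name in parameters}
  return date_time_class(**supported_kwargs)
```

Note this silently drops information such as the time zone offset; the cleaner fix is for the class itself to support the keyword argument.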
Running tests with Python3 produces a number of exceptions such as this one:
======================================================================
ERROR: testAtMaximumDepth (registry_searcher.FindSpecTest)
Tests the AtMaximumDepth function.
----------------------------------------------------------------------
Traceback (most recent call last):
File "/<<PKGBUILDDIR>>/tests/registry_searcher.py", line 103, in testAtMaximumDepth
key_path=u'HKEY_CURRENT_USER\\Software\\Microsoft')
File "./dfwinreg/registry_searcher.py", line 46, in __init__
if len(key_path_arguments) > 1:
TypeError: object of type 'filter' has no len()
... and this one:
======================================================================
FAIL: testSplitKeyPath (key_paths.KeyPathTest)
Tests the SplitKeyPath function.
----------------------------------------------------------------------
Traceback (most recent call last):
File "/<<PKGBUILDDIR>>/tests/key_paths.py", line 22, in testSplitKeyPath
self.assertEqual(path_segments, expected_path_segments)
AssertionError: <filter object at 0x7f0a21bb17b8> != ['HKEY_CURRENT_USER', 'Software', 'Microsoft']
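Under Python 3, filter() returns a lazy iterator rather than a list, which breaks len() and equality comparisons; the usual fix is to materialize the result, sketched here with an illustrative helper:

```python
def split_key_path(key_path, path_separator='\\'):
  """Splits a Windows Registry key path into segments.

  list() materializes the filter object so the result supports len()
  and comparisons on both Python 2 and Python 3.
  """
  return list(filter(None, key_path.split(path_separator)))
```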
Add copy from RFC 822 / 1123 / 2822 string functions
Also see:
Add information about RFC 822 / 1123 / 2822 to the documentation
Improve support of parsing ISO 8601 date and time strings.
APFS timestamps represent the number of nanoseconds elapsed since January 1, 1970.
Ref: "Decoding the APFS file system"
http://cyberforensicator.com/wp-content/uploads/2017/11/DIIN_698_Revisedproof.1-min-ilovepdf-compressed.pdf
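A standalone conversion sketch for such a timestamp (truncating to microseconds, datetime's maximum precision; illustrative name):

```python
from datetime import datetime, timezone

def apfs_time_to_datetime(timestamp):
  """Converts an APFS timestamp (nanoseconds since 1970-01-01) to a datetime."""
  seconds, nanoseconds = divmod(timestamp, 10**9)
  date_time = datetime.fromtimestamp(seconds, tz=timezone.utc)
  return date_time.replace(microsecond=nanoseconds // 1000)
```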
After #165 add a script to generate date and time values documentation from source
Context from log2timeline-dev@
I wrote a simple script that uses regular expressions to convert from a standard asctime format to a string that can be accepted by dfdatetime CopyFromDateTimeString:
https://gist.github.com/SamuelePilleri/3ffae24b3cdd6dd04d01790519c2846b
Consider adding functionality to set date and time elements from ctime string
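Instead of regular expressions, time.strptime can handle the asctime layout directly; a sketch of such a convenience conversion (illustrative name, assuming the C locale weekday and month names):

```python
import time

def ctime_to_date_time_string(ctime_string):
  """Converts an asctime/ctime formatted string to YYYY-MM-DD hh:mm:ss."""
  time_elements = time.strptime(ctime_string, '%a %b %d %H:%M:%S %Y')
  return time.strftime('%Y-%m-%d %H:%M:%S', time_elements)
```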
This would be useful for things that optionally have a timestamp, to be able to represent something like "this thing never expires".
I had some weird errors processing CSV output from psort, and tracked them down to a couple of lines having date stamps of "01/00/1971" or "01/00/1974", which seems wrong. Outputting JSON from psort instead, I got the timestamps in POSIX format, and tracked the error to dfdatetime.
The original error occurred on an up-to-date Fedora 29, running plaso-20181219-3 out of LiFTeR on Python 2.7.15, but the error is still there when running the current master of dfdatetime (0b23e03).
$ python2
>>> from dfdatetime import posix_time as dfdatetime_posix_time
>>> dfdatetime_posix_time.PosixTime(31511228).CopyToDateTimeString()
u'1971-01-00 17:07:08'
>>> dfdatetime_posix_time.PosixTime(126189827).CopyToDateTimeString()
u'1974-01-00 12:43:47'
The same bug is present in the PosixTimeInMilliseconds() and PosixTimeInMicroseconds() interfaces (which is what psort actually uses).
date says those timestamps were on the 31st of December the year before, not the zeroth of January:
$ env LC_TIME=en_US TZ=UTC date --date='@31511228'
Thu Dec 31 17:07:08 UTC 1970
$ env LC_TIME=en_US TZ=UTC date --date='@126189827'
Mon Dec 31 12:43:47 UTC 1973
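The zeroth-of-January strings suggest an off-by-one in the day-of-year handling. For comparison, a conversion based on datetime.fromtimestamp yields the same days that `date` reports (a standalone check, not the dfdatetime code path that produced the wrong strings):

```python
from datetime import datetime, timezone

def posix_time_to_date_time_string(timestamp):
  """Converts a POSIX timestamp in seconds to a date and time string."""
  date_time = datetime.fromtimestamp(timestamp, tz=timezone.utc)
  return date_time.strftime('%Y-%m-%d %H:%M:%S')

# posix_time_to_date_time_string(31511228) -> '1970-12-31 17:07:08'
# posix_time_to_date_time_string(126189827) -> '1973-12-31 12:43:47'
```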
CopyToStatTimeTuple is a leftover from the early plaso days; deprecate it.
Depends on log2timeline/dfvfs#610