usarmyresearchlab / dshell
Dshell is a network forensic analysis framework.
License: Other
Greetings, I am using Arch Linux and I wanted to install your framework. Has anyone experienced the following error when running 'make'?
My Arch Version
My Python Version
IPy is installed
pcap is installed
Crypto is installed
dpkt is installed
python /usr/share/Dshell/bin/generate-dshellrc.py /usr/share/Dshell
File "/usr/share/Dshell/bin/generate-dshellrc.py", line 28
except Exception, e:
^
SyntaxError: invalid syntax
Makefile:8: recipe for target 'rc' failed
make: *** [rc] Error 1
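This SyntaxError is the classic sign of Python 3 running Python 2 code: `except Exception, e:` is Python 2-only syntax, and Arch's default `python` is Python 3. A minimal sketch of the two spellings (assuming the goal is either to port the script or to point the Makefile at a `python2` binary):

```python
# Python 2-only form, as used on line 28 of generate-dshellrc.py
# (rejected by Python 3 with exactly this SyntaxError):
#     except Exception, e:
#         ...
# Portable form, valid in Python 2.6+ and Python 3:
caught = None
try:
    value = int("not a number")
except Exception as e:
    caught = e
print("caught:", caught)
```

Until the scripts are ported, running `make` with `python2` explicitly is the likelier workaround.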
Hey! I hope this message finds you well.
So, I'm trying to run some plugins against a live capture on a properly configured interface, and I get no results.
The Dshell prompt is presented again after I execute the decode command.
Dshell> decode -i ens5f0 -d search --search_expression UPDATE
Dshell>
Any help?
Best regards.
Hi
I can't seem to get dshell working with pcaps saved to disk or with traffic from an interface.
In Wireshark the traffic has a PPPoE layer above the Ethernet layer, and another layer above the PPPoE one, which Wireshark calls 'Point-to-Point Protocol' and which is 2 bytes long.
I've started dshell with --strip set to every number from 1 to 6, with nothing happening, and the pcap definitely has DNS traffic in it. --strip requires an int, so what should I give it?
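For reference, here is the byte layout Wireshark is describing. Whatever unit --strip expects (I am not asserting its semantics here), these are the extra bytes sitting between the Ethernet header and the IP header in a PPPoE-session frame, per RFC 2516 and RFC 1661:

```python
# Header sizes in a PPPoE-session frame (illustration only, not a
# statement about Dshell's --strip argument):
ETH_HDR = 14    # dst MAC (6) + src MAC (6) + EtherType (2; 0x8864 = PPPoE session)
PPPOE_HDR = 6   # ver/type (1) + code (1) + session ID (2) + payload length (2)
PPP_PROTO = 2   # the 2-byte field Wireshark labels "Point-to-Point Protocol"

ip_offset = ETH_HDR + PPPOE_HDR + PPP_PROTO
print(ip_offset)  # 22: where the IP header starts
```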
Many thanks
It appears that the blob_handler() function isn't getting called at certain critical junctures. My understanding was that the core code would call this function (if defined in a plugin) as it processed packets, every time the stream changed direction. It actually looks like the blob_handlers aren't called until the connection closes and the blobs are formed/iterated.
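To make the expectation concrete, here is an illustrative sketch of what "a blob per direction change" means; the names are mine, not Dshell's API:

```python
# Illustrative only: a "blob" is a maximal run of packets flowing in
# one direction. The report above is that the per-blob callback fires
# only at connection close, rather than at each direction change.
def split_into_blobs(packets):
    """packets: iterable of (direction, payload); direction in {'cs', 'sc'}."""
    blobs, cur_dir, cur = [], None, []
    for direction, payload in packets:
        if cur and direction != cur_dir:
            blobs.append((cur_dir, b"".join(cur)))
            cur = []
        cur_dir = direction
        cur.append(payload)
    if cur:
        blobs.append((cur_dir, b"".join(cur)))
    return blobs

result = split_into_blobs([("cs", b"GET "), ("cs", b"/"), ("sc", b"200")])
print(result)  # [('cs', b'GET /'), ('sc', b'200')]
```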
When using the rip-http module with a pcap file, I noticed that it put all the ripped files in the install directory. There only seem to be input controls for rip-http, and I'm not seeing an -o or -tmpdir option like in some of the other modules.
Is this something that exists already? If so, what is the option to enable it, and if not, can it be added as an option of rip-http?
Also I have been unable to find an option to enable resolving network addresses to hostnames. I have it set in tcpdump and Wireshark to resolve and find it helpful tracking down issues. In my digging around I have been unable to find the option to enable this. Does this option exist (or is set to exist in Dshell) and if it does how does one go about enabling it?
I would find a flag to enable network address resolution quite helpful. Loving the project so far.
It would be nice to have an overview page for the rip-http decoder. I'm thinking of a page much like the one for chaosreader http://chaosreader.sourceforge.net/Chaos01/index.html but with a more modern look, with filtering and searching. It could be based on http://www.datatables.net/ for example. I'm unsure where in the architecture this would fit. Maybe it would be appropriate to create a post-processing step that kicks in when all streams are processed. I think an overview would be useful for other decoders as well. Another way to solve this is to have the decoder create a file with a summary of all streams and associated files, which could be processed by a standalone tool to create an overview.
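The summary-file idea could be sketched like this; the record fields are invented for illustration, not taken from any rip-http code:

```python
# Hedged sketch of the proposed post-processing step: the decoder
# emits one JSON record per stream, and a standalone tool later
# renders the records as a filterable/searchable overview page.
import json

def stream_record(stream_id, client, server, files):
    # Field names are hypothetical placeholders.
    return {"stream": stream_id, "client": client,
            "server": server, "files": files}

summary = [stream_record(1, "10.0.0.1", "93.184.216.34", ["index.html"])]
line = json.dumps(summary)
print(line)
```

A DataTables-based page could then load such a file directly as its data source.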
If you want to analyze huge pcap files, wouldn't it be better to do it in C/C++?
Hello,
I need your help on two points:
I try the following command:
decode -d writer -i INTERFACE -o pcap FILE NAME
It gives me the following error message:
WARNING:writer:rawHandler() got an unexpected keyword argument 'smac'
Do you know what it is?
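A guess at the cause, not verified against the writer decoder's source: the framework passes extra per-packet metadata (such as smac=) that the handler's signature doesn't accept, and the mismatch is downgraded to that warning. A handler that accepts **kwargs is tolerant of such new fields:

```python
# Sketch only; the real rawHandler is a method on a Dshell decoder
# and its exact positional parameters may differ.
def rawHandler(pktlen, pkt, ts, **kwargs):
    # Unknown metadata (e.g. kwargs.get("smac")) is simply ignored.
    return pktlen, ts

print(rawHandler(60, b"\x00" * 60, 1.0, smac="aa:bb:cc:dd:ee:ff"))  # (60, 1.0)
```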
GeoIP.dat, GeoIPv6.dat, GeoIPASNum.dat, GeoIPASNumv6.dat
The linked URL "https://dev.maxmind.com/geoip/geoip2/geolite2/" only has the GeoLite2 databases left:
GeoLite2 City, ... Country, ... ASN
These downloads contain databases as well, but no longer the files mentioned above.
Does this still work?
I haven't done network data capture and analysis for quite a while, mainly because I've been lazy.
Is it possible to include the pcap files mentioned in the README file?
I believe the expected output should be a realtime view of the DNS queries.
I tried decode -i eth0 -d dns > log && tail -f log
but have the same problem.
The only invocation of setfilter() on the capture device (pcapy.Reader class) is based on the initial_bpf from the first plugin in the chain.
Because of this, any efforts to expand the filter are moot. Narrowing the filter seems effective through manipulating compiled BPF filters on the plugin objects, but only the packets pulled from the wire or file (governed by the pcapy.Reader filter) are ever passed to the feed_plugin_chain function.
It seems we may need a mechanism to update the Reader filter when BPF filters are changed in the plugin chain. But this is not trivial, because recompiling BPF happens in the plugin object and the instantiated Reader appears only in decode.py.
I initially noticed this because the automatic vlan wrapper wasn't working with any plugin on vlan tagged PCAP files, but it has potential effects also in chained plugins and plugins that dynamically alter their bpf filters.
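One possible direction, sketched here with a hypothetical helper (merge_bpf is my name, not anything in Dshell): union every plugin's BPF expression with 'or' before the single setfilter() call, so the Reader admits packets for the whole chain and each plugin still narrows with its own compiled filter.

```python
# Hypothetical sketch: combine per-plugin BPF expression strings into
# one capture filter. Each expression is parenthesized so operator
# precedence inside the individual filters is preserved.
def merge_bpf(filters):
    exprs = ["(%s)" % f for f in filters if f]
    return " or ".join(exprs)

combined = merge_bpf(["tcp port 80", "udp port 53", None])
print(combined)  # (tcp port 80) or (udp port 53)
```

The resulting string would still need to be recompiled and pushed to the Reader whenever a plugin changes its filter, which is the non-trivial part noted above.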
I cannot download these from a Mac.
Would you mind adding support for Mac?
BTW, this looks brilliant.
https://github.com/kbandla/dpkt
The location has moved for the wiki:
https://github.com/necrose99/necromancy-overlay/blob/master/app-forensics/dshell/dshell-9999.ebuild
Wish list: add release versions. So far, 9999 = live from source; having versioned tarballs on hand means there is something to fall back on if an update fries the live build.
In the file Dshell/decode.py, you have a function expandCompressedFile with the parameters fname (the file name), verbose (a seemingly boolean value indicating verbose output), and tmpdir (a temporary directory the files are output to). However, there's a comment at the beginning of the function saying: Expand a file compressed with gzip, bzip2, or zip. Only handles zip files with one file. Need to add handling for zip files containing multiple files.
I haven't learned Python or what it can do, but it looks like the relevant statement is:
elif ext == '.zip':
    print "Enter password for .zip file [default:none]:",
    pswd = raw_input()
    z = zipfile.ZipFile(fname)
    f = z.open(z.namelist()[0], 'r', pswd)
Here f is the first file inside fname (z being the actual ZIP archive). To handle archives with more than one member, a coder would need a for loop that walks every file index of z, from 0 (the first file) through the last (I don't know Python well enough to write it, but I do know VB, and in VB terms the last index would be z.Items - 1).
The function itself would have to be converted to use and output string arrays rather than a simple string. I don't know how to code it, but someone else might.
Hi there,
I'm a researcher studying software evolution. As part of my current research, I'm studying the implications of open-sourcing proprietary software, for instance, whether the project succeeds in attracting newcomers. Dshell was on my list. However, I observed that the software history from when Dshell was developed as proprietary software was not kept after the transition to GitHub.
Knowing that software history is indispensable for developers (e.g., developers need to refer to history several times a day), I would like to ask the Dshell developers the following four brief questions:
Thanks in advance for your collaboration,
Gustavo Pinto, PhD
http://www.gustavopinto.org
When trying to extract some files from an HTTP stream, I get the following error:
WARNING:rip-http:local variable 'contenttype' referenced before assignment
Traceback (most recent call last):
File "~/Dshell/bin/decode", line 960, in <module>
main(*sys.argv[1:])
File "~/Dshell/bin/decode", line 916, in main
decoder.cleanConnectionStore()
File "~/Dshell/lib/dshell.py", line 337, in cleanConnectionStore
self.close(conn)
File "~/Dshell/lib/dshell.py", line 306, in close
self.blobHandler(conn, conn.blobs[-1])
File "~/Dshell/lib/httpdecoder.py", line 62, in blobHandler
1], response=None, requesttime=self.requests[conn][0], responsetime=blob.starttime)
File "~/Dshell/decoders/http/rip-http.py", line 82, in HTTPHandler
contenttype, filename, data = self.POSTHandler(request.body)
File "~/Dshell/decoders/http/rip-http.py", line 77, in POSTHandler
return contenttype, filename, l
UnboundLocalError: local variable 'contenttype' referenced before assignment
Upon some further digging, it seems that if the request is thought to contain content that should be extracted, request.body is passed to POSTHandler(). The problem lies in POSTHandler() trying to extract the Content-Type header from the body. Dpkt has already extracted it and placed content-type in request.headers. From what I see, POSTHandler() is no longer necessary and can share the same parsing code that is used for extracting the file from response.
Hello everyone,
I'm trying to compile the source code on Mac OS X.
Do you have any guide for this?
I'm getting this error running Dshell on a new install on Fedora 21, kernel 3.19.17-200:
Traceback (most recent call last):
File "/root/Dshell/trunk/bin/decode", line 891, in main
decoder.capture = pcap.pcap(pcapfile)
AttributeError: 'module' object has no attribute 'pcap'
I have all the Fedora equivalents of the prereqs, as far as I know, and have this running on another Fedora 21 box without error. Has anyone seen this?
Code is:
try:
    if not pcap:
        raise NotImplementedError("pcap support not implemented")
    decoder.capture = pcap.pcap(pcapfile)
Running Python 2.7.8, have libpcap, libpcap-devel and pylibpcap installed.
The large-flows.py decoder appears to have a bug in the code on line 24. The decoder is designed for flows of 1 MB or larger, which would be 1048576 bytes. Line 24 incorrectly omits the last digit:
self.min = 104857 * self.size
should be
self.min = 1048576 * self.size
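A quick sanity check of the constant confirms the impact of the dropped digit:

```python
# 1 MB here means 1 MiB = 1024 * 1024 bytes.
MIB = 1024 * 1024
print(MIB)            # 1048576
print(MIB == 104857)  # False: the truncated constant is roughly a tenth
                      # of the intended threshold, so flows around 100 KB
                      # would wrongly count as "large"
```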
I get the following error running decode -l:
pcap not available: decoders requiring pcap are not usable
Exception loading module 'decoders.nbns.nbns': 'bool' object has no attribute 'dObj'
Exception loading module 'decoders.dns.reservedips': 'bool' object has no attribute 'dObj'
What library am I missing? Thanks...
Is there a recommended install location? It looks like we're supposed to install under your user's home directory.
Sorry if this is the wrong place to put this.
This project is really nice, and seems like a really powerful development base for network forensics.
Since it was uploaded to GitHub, it has gathered quite a lot of attention, and I think that many people would be eager to contribute. Because of this, I think that a proper test suite would be really helpful for Dshell.
I am willing to do this myself, and I'm adding this ticket to know if anyone else has started to do some work on this matter.
Does anyone have initial ideas on how to properly add a test suite to Dshell?
Before I go digging through the code, I thought I would ask this question. When I run the followstream decoder on a pcap I quickly captured on my computer, it shows 6 connections (4 distinct IPs, 3 conversations with one of the IPs). Yet when I pull that same pcap up in Wireshark, it shows 13 IP conversations and 23 TCP conversations. Am I not correctly understanding how the followstream decoder works? It seems to me it should list 23 different connections if I have 23 TCP conversations in Wireshark.
Hello, I'm getting the following warning running decode -p dns...
/usr/local/lib/python3.8/site-packages/dshell/decode.py:490: DeprecationWarning: PY_SSIZE_T_CLEAN will be required for '#' formats
header, packet = capture.next()
Running Python 3.8.5 and just pulled down the latest release of Dshell today. From Googling, it looks like the issue is with PyArg_Parse? Thanks.
README.md states:
Should that be /home/[username]/share/GeoIP
or
sudo mkdir /usr/local/share/GeoIP/ ?
It is a bit unclear in README.md.
DESTDIR isn't cleanly defined in the Makefile, so it's a royal pest to package Dshell on many distros.
Gentoo uses a sandbox (/var/portage/$Packagename/$package-version/build); portage would fake the root based on DESTDIR. Otherwise I have to force ${ROOT} to build a root-fs tree in the sandbox, and that works about as well as a cluster-F####. (9999 is customary for a live git version.)
Once built, the result gets copied over to the actual install tree and the temp paths are removed.
Unfortunately .dshellrc, dshell-decode, and dshell get the real paths baked in and have to be fixed; basically the build jumps out of the security sandbox, so it doesn't package well as is.
Python distutils packages, however, mostly go up without a hitch; some require nominal patching to behave.
A: Being distro- and system-agnostic is the power of distutils: https://docs.python.org/2/distutils/setupscript.html
i.e. python setup.py
/Dshell/share/GeoIP: copy if exists /usr/share/GeoIP, file symlinks, etc.
B: Makes for easier agnostic packaging by Linux distros, and adds consistent directories for third-party add-on modules.
C: Can define USER or SYSTEM mode installs.
Default system install: .dshell > /etc/skel/, so it is copied over to many users on login (user: /home/$username/Dshell-path/.......
Symlink dshell-decode and dshell from /opt/bin/Dshell or /usr/bin/Dshell/ into /usr/bin.
Dshell/docs to /usr/share/doc/Dshell, as many of the docs are dynamically generated.
D: RPM/DEB/MSI etc. are a nice added bonus of python distutils. In theory Dshell could just as easily be patched for Windows boxes with a good and proper Python setup, and/or even macOS natives. Add-on modules that do extra packaging (RPM, DEB, etc.) could be added on to the main modules, and system packages generated later for users' convenience.
E: Find any missing docs or apply updates, new modules, etc.
https://pythonhosted.org/setuptools/python3.html
https://github.com/pypa/sampleproject
https://docs.python.org/2.0/dist/creating-rpms.html
https://ghantoos.org/2008/10/19/creating-a-deb-package-from-a-python-setuppy/
http://cyrille.rossant.net/create-a-standalone-windows-installer-for-your-python-application/
[install]
prefix=/usr/bin/Dshell
install_lib=/usr/bin/Dshell/lib
install_scripts=/usr/bin/Dshell/bin
[bdist_wininst]
prefix=c:/Dshell
install_lib=/some/lib/path
install_scripts=/some/bin/path
There seems to be an issue with installing pcapy (helpsystems/pcapy#73) which is preventing installation of Dshell.
Collecting pcapy
Downloading pcapy-0.11.4.tar.gz (37 kB)
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [6 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/tmp/pip-install-nqz_zlei/pcapy_e1a8e2d5b3794862af12f48d4e4fdfdc/setup.py", line 49, in <module>
save_init_posix = sysconfig._init_posix
AttributeError: module 'distutils.sysconfig' has no attribute '_init_posix'
[end of output]
It appears unlikely that pcapy will release a fix for the issue, since the latest commit was back in 2019. A possible workaround is to use pcapy-ng (https://github.com/stamparm/pcapy-ng/) instead of pcapy. Update pcapy to pcapy-ng in setup.py:
install_requires=[
"geoip2",
"pcapy-ng",
"pypacker",
],
** NOTE:to any fellow Gentoo USERS/DEVS **
https://github.com/necrose99/necromancy-overlay/blob/master/app-forensics/dshell/dshell-9999.ebuild
https://github.com/necrose99/Dshell/blob/master/install-gentoo.py Proof of concept
PROVED WORKING
(Going to test tomorrow evening; anyone else who cares to verify install-gentoo.py, its pull request is open.)
If a package is missing (I mainly cloned the existing Debian/Ubuntu script and replaced the calls with Gentoo's emerge),
you can install these dependencies as well.
DEPS = dev-python/pycrypto, dev-python/dpkt, dev-python/ipy, dev-python/pypcap, dev-python/pygeoip, and for documents dev-python/epydoc
GENTOO EBUILD covers Sabayon Linux, Funtoo, Calculate Linux, Pentoo, Google Chrome OS (i.e. Chromebooks), Zentoo, CoreOS, and others.
emerge -av app-forensics/dshell .. not quite there yet? Victims; I mean testers, welcome.
layman -a necromancy-overlay (many alpha ebuilds here; just a warning, some may not work yet, while some do and were rescued from now-gone overlays).
Any Gentoo users/devs (and related variants/forks): pull requests and fixes are welcome.
The ebuild is a proof of concept / pre-alpha trial; going to test shortly.
Currently testing them myself, but any fixes to the script and/or the ebuild are quite welcome.
A bug was found in decode.py that prevents the --parallel flag from functioning properly. Below is the exception seen:
Traceback (most recent call last):
File "/home/user/lib/python3.6/multiprocessing/process.py", line 249, in _bootstrap
self.run()
File "/home/user/lib/python3.6/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
File "/home/user/Dshell/build/lib/dshell/decode.py", line 430, in process_files
input0 = inputs.pop(0)
AttributeError: 'str' object has no attribute 'pop'
To fix, add a comma after [i] in line 386:
multiprocessing.Process(target=process_files, args=([i],), kwargs=kwargs)
We will permanently correct this bug after finalizing our review of pull request #120
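The one-character fix works because of how Python parses tuples; a quick illustration of why the trailing comma matters:

```python
# Parentheses alone do not make a tuple, so args=([i]) passes the list
# itself as the argument *sequence*: multiprocessing unpacks it, and
# process_files receives a bare string instead of a one-element list
# (hence 'str' object has no attribute 'pop').
i = "capture.pcap"
assert ([i]) == ["capture.pcap"]      # still just a list
assert ([i],) == (["capture.pcap"],)  # a 1-tuple wrapping the list
print("ok")
```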