
Comments (23)

cybert79 commented on August 20, 2024

Here is some more info on an ip I can publish

[+] Execute Nmap against 81.82.222.25
[*] nmap -Pn -sT -A -r --initial-rtt-timeout 300ms --min-rtt-timeout 200ms --max-rtt-timeout 1000ms --max-scan-delay 200ms --max-retries 3 -oX nmap_result_81.82.222.25.xml 81.82.222.25

[*] Start time: 2018/07/17 22:07:31
[*] Port scanning: 81.82.222.25 [Elapsed time: 0 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 5 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 10 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 15 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 20 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 25 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 30 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 35 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 40 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 45 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 50 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 55 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 60 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 65 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 70 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 75 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 80 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 85 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 90 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 95 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 100 s]
[*] Port scanning: 81.82.222.25 [Elapsed time: 105 s]
[*] End time : 2018/07/17 22:09:23
[+] Get port list from nmap_result_81.82.222.25.xml.
[+] Get exploit list.
[*] Loading exploit list from local file: /opt/machine_learning_security/DeepExploit/data/exploit_list.csv
[+] Get payload list.
[*] Loading payload list from local file: /opt/machine_learning_security/DeepExploit/data/payload_list.csv
[+] Get exploit tree.
[*] Loading exploit tree from local file: /opt/machine_learning_security/DeepExploit/data/exploit_tree.json
[+] Get target info.
[+] Check web port.
[*] Target URL: http://81.82.222.25:81
[*] Port "81" is web port. status=200
[*] Target URL: http://81.82.222.25:1723
[!] Port "1723" is not web port.
[*] Target URL: https://81.82.222.25:1723
[!] Port "1723" is not web port.
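The web-port check above simply probes each open port over HTTP (and, on failure, HTTPS) and looks at the response status. A rough sketch of the idea, as hypothetical code rather than DeepExploit's actual implementation:

```python
import urllib.request
import urllib.error

def is_web_port(host, port, scheme='http'):
    """Return (True, status) if an HTTP server answers on host:port."""
    url = '{}://{}:{}/'.format(scheme, host, port)
    try:
        with urllib.request.urlopen(url, timeout=5) as res:
            return True, res.status
    except urllib.error.HTTPError as err:
        # An HTTP error status (4xx/5xx) still means a web server answered.
        return True, err.code
    except (urllib.error.URLError, OSError):
        # Connection refused, timeout, or a non-HTTP protocol on the port.
        return False, None
```

This matches the log above: port 81 answers with status 200, while port 1723 (PPTP) rejects both the http:// and https:// probes.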
2018-07-17 22:09:29 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: scrapybot)
2018-07-17 22:09:29 [scrapy.utils.log] INFO: Versions: lxml 4.2.3.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 3.5.3 (default, Jan 19 2017, 14:11:04) - [GCC 6.3.0 20170118], pyOpenSSL 18.0.0 (OpenSSL 1.1.0h 27 Mar 2018), cryptography 2.2.2, Platform Linux-4.9.0-3-amd64-x86_64-with-debian-9.5
2018-07-17 22:09:29 [scrapy.crawler] INFO: Overridden settings: {'SPIDER_LOADER_WARN_ONLY': True, 'FEED_URI': 'crawl_result/20180717220928_crawl_result.json', 'FEED_FORMAT': 'json'}
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.logstats.LogStats',
'scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.memusage.MemoryUsage']
[*] Save log to /opt/machine_learning_security/DeepExploit/crawl_result/81.82.222.25_81.log
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-07-17 22:09:29 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2018-07-17 22:09:29 [scrapy.core.engine] INFO: Spider opened
2018-07-17 22:09:29 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-17 22:09:29 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-17 22:09:29 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-17 22:09:29 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-17 22:09:29 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-17 22:09:29 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 490,
'downloader/request_count': 2,
'downloader/request_method_count/GET': 2,
'downloader/response_bytes': 4589,
'downloader/response_count': 2,
'downloader/response_status_count/200': 1,
'downloader/response_status_count/302': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2018, 7, 17, 20, 9, 29, 508799),
'log_count/DEBUG': 3,
'log_count/INFO': 7,
'memusage/max': 55042048,
'memusage/startup': 55042048,
'response_received_count': 1,
'scheduler/dequeued': 2,
'scheduler/dequeued/memory': 2,
'scheduler/enqueued': 2,
'scheduler/enqueued/memory': 2,
'start_time': datetime.datetime(2018, 7, 17, 20, 9, 29, 201870)}
2018-07-17 22:09:29 [scrapy.core.engine] INFO: Spider closed (finished)
Traceback (most recent call last):
  File "DeepExploit.py", line 2064, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 525, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 159, in run_spider
    dict_json = json.load(fin)
  File "/usr/lib/python3.5/json/__init__.py", line 268, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

from machine_learning_security.

13o-bbr-bbq commented on August 20, 2024

I fixed the null-character problem when loading the file.
Please try again.
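For context, a minimal sketch of that kind of fix, stripping NUL characters before parsing (an assumed shape; the actual util.py change may differ):

```python
import json

def load_json_ignoring_nul(path):
    # Drop NUL characters that would otherwise make json.loads()
    # fail with "Expecting value: line 1 column 1 (char 0)".
    with open(path, encoding='utf-8') as fin:
        return json.loads(fin.read().replace('\0', ''))
```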


cybert79 commented on August 20, 2024

OK, so should I do a git update first?


cybert79 commented on August 20, 2024

Did you add a function? :)

[!] 1638/1792 unix/webapp/squirrelmail_pgp_plugin module is danger (rank: manual). Can't load.


cybert79 commented on August 20, 2024

It went a little bit better now, but there are still errors:

[*] 1174/1174 exploit:linux/smtp/haraka, targets:2
[*] Saved exploit tree.
[+] Get target info.
[+] Check web port.
[*] Target URL: http://81.82.222.25:81
[*] Port "81" is web port. status=200
[*] Target URL: http://81.82.222.25:1723
[!] Port "1723" is not web port.
[*] Target URL: https://81.82.222.25:1723
[!] Port "1723" is not web port.
2018-07-18 09:36:14 [scrapy.utils.log] INFO: Scrapy 1.5.0 started (bot: scrapybot)
2018-07-18 09:36:14 [scrapy.utils.log] INFO: Versions: lxml 4.2.3.0, libxml2 2.9.8, cssselect 1.0.3, parsel 1.5.0, w3lib 1.19.0, Twisted 18.7.0, Python 3.5.3 (default, Jan 19 2017, 14:11:04) - [GCC 6.3.0 20170118], pyOpenSSL 18.0.0 (OpenSSL 1.1.0h 27 Mar 2018), cryptography 2.2.2, Platform Linux-4.9.0-3-amd64-x86_64-with-debian-9.5
2018-07-18 09:36:14 [scrapy.crawler] INFO: Overridden settings: {'SPIDER_LOADER_WARN_ONLY': True, 'FEED_URI': 'crawl_result/20180718093613_crawl_result.json', 'FEED_FORMAT': 'json'}
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.logstats.LogStats',
'scrapy.extensions.memusage.MemoryUsage',
'scrapy.extensions.feedexport.FeedExporter',
'scrapy.extensions.telnet.TelnetConsole']
[*] Save log to /opt/machine_learning_security/DeepExploit/crawl_result/81.82.222.25_81.log
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2018-07-18 09:36:14 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2018-07-18 09:36:14 [scrapy.core.engine] INFO: Spider opened
2018-07-18 09:36:14 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-18 09:36:14 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-18 09:36:14 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-18 09:36:14 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-18 09:36:14 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-18 09:36:14 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 490,
'downloader/request_count': 2,
'downloader/request_method_count/GET': 2,
'downloader/response_bytes': 4589,
'downloader/response_count': 2,
'downloader/response_status_count/200': 1,
'downloader/response_status_count/302': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2018, 7, 18, 7, 36, 14, 923712),
'log_count/DEBUG': 3,
'log_count/INFO': 7,
'memusage/max': 54947840,
'memusage/startup': 54947840,
'response_received_count': 1,
'scheduler/dequeued': 2,
'scheduler/dequeued/memory': 2,
'scheduler/enqueued': 2,
'scheduler/enqueued/memory': 2,
'start_time': datetime.datetime(2018, 7, 18, 7, 36, 14, 607720)}
2018-07-18 09:36:14 [scrapy.core.engine] INFO: Spider closed (finished)
Traceback (most recent call last):
  File "DeepExploit.py", line 2082, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 525, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 159, in run_spider
    dict_json = json.loads(fin.read().replace('\0', ''))
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)


13o-bbr-bbq commented on August 20, 2024

Hi,

Did you add a function? :)

[!] 1638/1792 unix/webapp/squirrelmail_pgp_plugin module is danger (rank: manual). Can't load.

Yes, I did.
This is not a problem: DeepExploit only uses safe modules, such as "rank=excellent, great, good".
In the future, I'll allow users to choose the rank.

It went a little bit better now, but there are still errors:

Ugh...
The problem is fixed in my environment, so I'm still looking into why it occurs for you. Please wait a moment.


cybert79 commented on August 20, 2024

Well, I just installed it on Debian 9 with Python 3.
Do you need more info about my system?


13o-bbr-bbq commented on August 20, 2024

For now I don't need more info.
I released an updated util.py; it can remove control characters such as NUL, EOT, and DEL.

If your environment still raises the error, please send me your yyyymmddhhmmss_crawl_result.json.
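A minimal sketch of what such a control-character filter could look like (a hypothetical implementation; the released util.py may differ):

```python
import re

# Matches the C0 control characters (NUL, EOT, etc.) and DEL, but keeps
# tab, newline, and carriage return, which are legal whitespace in JSON.
CTRL_CHAR_RE = re.compile(r'[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]')

def delete_ctrl_char(text):
    # Strip control characters so the crawl result can be fed to json.loads().
    return CTRL_CHAR_RE.sub('', text)
```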


cybert79 commented on August 20, 2024

Mmmm, still the same issue, and the JSON file is empty...

2018-07-19 10:33:09 [scrapy.core.engine] INFO: Spider opened
2018-07-19 10:33:09 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2018-07-19 10:33:09 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2018-07-19 10:33:09 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-19 10:33:09 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-19 10:33:09 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-19 10:33:09 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 490,
'downloader/request_count': 2,
'downloader/request_method_count/GET': 2,
'downloader/response_bytes': 4589,
'downloader/response_count': 2,
'downloader/response_status_count/200': 1,
'downloader/response_status_count/302': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2018, 7, 19, 8, 33, 9, 576632),
'log_count/DEBUG': 3,
'log_count/INFO': 7,
'memusage/max': 55201792,
'memusage/startup': 55201792,
'response_received_count': 1,
'scheduler/dequeued': 2,
'scheduler/dequeued/memory': 2,
'scheduler/enqueued': 2,
'scheduler/enqueued/memory': 2,
'start_time': datetime.datetime(2018, 7, 19, 8, 33, 9, 248249)}
2018-07-19 10:33:09 [scrapy.core.engine] INFO: Spider closed (finished)
Traceback (most recent call last):
  File "DeepExploit.py", line 2094, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 539, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 169, in run_spider
    dict_json = json.loads(self.delete_ctrl_char(fin.read()))
  File "/usr/lib/python3.5/json/__init__.py", line 319, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3.5/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python3.5/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
root@debian:/opt/machine_learning_security/DeepExploit# ls
config.ini crawl_result CreateReport.py data DeepExploit.py deep_plugin img LICENSE __pycache__ README.md report requirements.txt Spider.py trained_data util.py
root@debian:/opt/machine_learning_security/DeepExploit# cd data/
root@debian:/opt/machine_learning_security/DeepExploit/data# ls
exploit_list.csv exploit_tree.json payload_list.csv
root@debian:/opt/machine_learning_security/DeepExploit/data# cd ..
root@debian:/opt/machine_learning_security/DeepExploit# cd trained_data/
root@debian:/opt/machine_learning_security/DeepExploit/trained_data# ls
checkpoint DeepExploit.ckpt.data-00000-of-00001 DeepExploit.ckpt.index DeepExploit.ckpt.meta

root@debian:/opt/machine_learning_security/DeepExploit/trained_data# cd ..
root@debian:/opt/machine_learning_security/DeepExploit# cd crawl_result/
root@debian:/opt/machine_learning_security/DeepExploit/crawl_result# ls
20180719103307_crawl_result.json 81.82.222.25_81.log
root@debian:/opt/machine_learning_security/DeepExploit/crawl_result# cat 20180719103307_crawl_result.json
root@debian:/opt/machine_learning_security/DeepExploit/crawl_result#
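The empty `cat` output above is the real culprit: `json.loads('')` raises exactly `Expecting value: line 1 column 1 (char 0)`, and removing control characters from an empty file still leaves an empty string. A defensive sketch (hypothetical code, not the project's actual fix) that treats an empty crawl result as "no pages found":

```python
import json

def load_crawl_result(path):
    # The feed file can be left empty when nothing was scraped, as in the
    # session above; json.loads('') raises JSONDecodeError, so check first.
    with open(path, encoding='utf-8') as fin:
        text = fin.read().strip()
    return json.loads(text) if text else []
```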


cybert79 commented on August 20, 2024

here is the result of the other file in the crawl directory: the .log file

root@debian:/opt/machine_learning_security/DeepExploit/crawl_result# cat 81.82.222.25_81.log

<title>Blue Iris Login</title> <script type="text/javascript"> var login_version = "17"; var bi_version = "4.7.6.8"; var combined_version = login_version + "-" + bi_version; </script> <script type="text/javascript"> window.onerror = function (msg, url, line, charIdx) { try { var versionStr = "unknown"; if (typeof login_version != "undefined") versionStr = login_version; var biVersionStr = "unknown"; if (typeof bi_version != "undefined") biVersionStr = bi_version; url = url.replace(/\/\/.*?\//, '//censored_hostname/'); alert("An unexpected error has occurred in Blue Iris Login (v " + versionStr + " / " + biVersionStr + "). If you wish to report the error, please SCREENSHOT the browser now.\n\n" + msg + "\nat " + url + " [" + line + ":" + charIdx + "]\n" + navigator.userAgent); } catch (ex) { alert(ex); } }; </script> <style type="text/css"> body { font-family: sans-serif; background: #212325; }
            #loginLoading
            {
                    display: none;
                    text-shadow: 0 0 10px rgba(0,0,0,0.3);
                    position: absolute;
                    text-align: center;
                    top: 40%;
                    width: 100%;
                    color: #FFFFFF;
            }

                    #loginLoading h1
                    {
                            margin: 0 0 20px 0;
                            font-size: 32px;
                    }

                    #loginLoading div
                    {
                            font-size: 20px;
                    }

            #login
            {
                    display: none;
            }
    </style>

ITL-CAM-01

Loading login page...

ITL-CAM-01

Log in automatically:
<script type="text/javascript"> var loadingOpacity = 0; function IncreaseLoadingOpacity() { loadingOpacity += 0.05; if (loadingOpacity > 1) loadingOpacity = 1; var ele = document.getElementById('loginLoading'); ele.style.display = "block"; ele.style.opacity = loadingOpacity; if (loadingOpacity < 1) showLoadingMessageTimeout = setTimeout(IncreaseLoadingOpacity, 33); } var showLoadingMessageTimeout = setTimeout(IncreaseLoadingOpacity, 67);
            document.write('<link href="applet/loginStyles.css?v=' + combined_version + '" rel="stylesheet" />'
                    + '<script src="applet/loginScripts.js?v=' + combined_version + '"><\/script>');
    </script>
    <script type="text/javascript">
            var autologin_timeout_1 = null;
            var autologin_timeout_2 = null;
            var existingSession = "";
            var isStoredDataLoaded = false;
            var windowUnloading = false;
            var cookiesEnabled;
            var localStorageEnabled;
            $(function ()
            {
                    cookiesEnabled = areCookiesEnabled();
                    localStorageEnabled = isLocalStorageEnabled();
                    if (UrlParameters.Get("autologin") === "0")
                    {
                            SetPersistedValue("bi_override_disable_auto_login_once", "1");
                            location.href = location.href.replace(/autologin=0&?/gi, '');
                            return;
                    }
                    clearTimeout(showLoadingMessageTimeout);
                    $("#loginLoading").hide();
                    $("#login").show();
                    if (typeof window.JSON === 'undefined')
                    {
                            $("#login").html("<div>Your web browser is too old to use the Blue Iris web interface properly.<br><br>To proceed with this browser, disable the \"Secure only\" requirement within Blue Iris's web server settings.</div>");
                            $("#login").css("color", "#EEEEEE").css("margin", "8px");
                            return;
                    }
                    if (!cookiesEnabled && !localStorageEnabled)
                            $("#cbLoginAutomatically").parent().text("Note: Cookies and Local Storage are disabled in your browser.").css("color", "#EEEEEE");
                    SetupLoginContextMenu();
                    SetStatus();
                    $(window).resize(resized);
                    resized();
                    window.onbeforeunload = function ()
                    {
                            windowUnloading = true;
                            cbLoginAutomaticallyClicked();
                            return;
                    }
                    var skipAutoLogin = GetPersistedValue("bi_override_disable_auto_login_once") == "1";
                    if (skipAutoLogin)
                    {
                            SetPersistedValue("bi_override_disable_auto_login_once", "0");
                    }
                    // Handle automatic login
                    if (GetPersistedValue("bi_rememberMe") == "1")
                    {
                            $("#cbLoginAutomatically").attr('checked', 'checked');
                            $("#txtUn").val(Base64.decode(GetPersistedValue("bi_username")));
                            $("#txtPw").val(Base64.decode(GetPersistedValue("bi_password")));

                            if (!skipAutoLogin)
                            {
                                    if ($("#txtUn").val() != "" && $("#txtPw").val() != "")
                                    {
                                            if (GetAutoLoginInstantly())
                                            {
                                                    if ($("#cbLoginAutomatically").is(":checked"))
                                                            login();
                                            }
                                            else
                                            {
                                                    $("#btnLogin").val("Logging in, in 2 seconds");
                                                    autologin_timeout_1 = setTimeout(function () { $("#btnLogin").val("Logging in, in 1 second"); }, 1000);
                                                    autologin_timeout_2 = setTimeout(function ()
                                                    {
                                                            if ($("#cbLoginAutomatically").is(":checked"))
                                                                    login();
                                                    }, 2000);
                                            }
                                    }
                            }
                    }
                    else
                    {
                            $("#cbLoginAutomatically").removeAttr('checked');
                            SetPersistedValue("bi_username", "");
                            SetPersistedValue("bi_password", "");
                    }

                    // Check for existing session
                    ExecJSON({ cmd: "login", session: $.cookie("session") }, function (response)
                    {
                            if (response.result == "fail")
                            {
                                    // No existing session.
                                    var anonymousAvailable = response.data && response.data["auth-exempt"];
                                    if (anonymousAvailable)
                                    {
                                            // Attempt Anonymous login, to get the permission level.
                                            var myResponse = md5("Anonymous:" + response.session + ":");
                                            ExecJSON({ cmd: "login", session: response.session, response: myResponse }, function (response)
                                            {
                                                    if (response.result == "success")
                                                    {
                                                            existingSession = response.session;
                                                            SetStatus("An anonymous " + (response.data.admin ? "administrator" : "user") + ' session is available. <a href="javascript:LeaveLoginPage()">Click here to use it.</a>');
                                                    }
                                            },
                                                    function (jqXHR, textStatus, errorThrown)
                                                    {
                                                            HandleError("Unable to contact Blue Iris server");
                                                    });
                                    }
                            }
                            else if (response.result == "success")
                            {
                                    existingSession = response.session;
                                    SetStatus("An existing " + (response.data.admin ? "administrator" : "user") + ' session is available. <a href="javascript:LeaveLoginPage()">Click here to use it.</a>');
                            }
                    },
                            function (jqXHR, textStatus, errorThrown)
                            {
                                    HandleError("Unable to contact Blue Iris server");
                            });

                    // Set focus on first empty field
                    if (!$("#txtUn").val())
                            $("#txtUn").get(0).focus();
                    else if (!$("#txtPw").val())
                            $("#txtPw").get(0).focus();
                    else
                            $("#btnLogin").get(0).focus();

                    isStoredDataLoaded = true;
            });
            function login()
            {
                    cbLoginAutomaticallyClicked();
                    $("#btnLogin").val("Logging in ...");
                    SetStatus();
                    ExecJSON({ cmd: "login" }, function (response)
                    {
                            var myResponse = md5($("#txtUn").val() + ":" + response.session + ":" + $("#txtPw").val());
                            ExecJSON({ cmd: "login", session: response.session, response: myResponse }, function (response)
                            {
                                    if (response.result == "success")
                                    {
                                            $("#btnLogin").attr("disabled", "disabled").val("Redirecting...");
                                            existingSession = response.session;
                                            LeaveLoginPage();
                                    }
                                    else
                                    {
                                            $("#cbLoginAutomatically").removeAttr('checked');
                                            SetPersistedValue("bi_rememberMe", "0");
                                            SetPersistedValue("bi_username", "");
                                            SetPersistedValue("bi_password", "");
                                            $("#btnLogin").val("Log in");
                                            HandleError(response.data ? response.data.reason : "Login failed but Blue Iris did not provide a reason.");
                                    }
                            },
                                    function (jqXHR, textStatus, errorThrown)
                                    {
                                            HandleError("Unable to contact Blue Iris server");
                                            $("#btnLogin").val("Log in");
                                    });
                    },
                            function (jqXHR, textStatus, errorThrown)
                            {
                                    HandleError("Unable to contact Blue Iris server");
                                    $("#btnLogin").val("Log in");
                            });
            }
            function LeaveLoginPage()
            {
                    var page = UrlParameters.Get("page");
                    if (page == "")
                            page = "/";
                    if (cookiesEnabled)
                            $.cookie("session", existingSession, { path: "/" });
                    else
                            page += (page.indexOf("?") < 0 ? "?" : "&") + "session=" + existingSession;
                    location.href = page + location.hash;
            }
            function cancelAutoLogin()
            {
                    if (autologin_timeout_1 != null)
                    {
                            clearTimeout(autologin_timeout_1);
                            autologin_timeout_1 = null;
                    }
                    if (autologin_timeout_2 != null)
                    {
                            clearTimeout(autologin_timeout_2);
                            autologin_timeout_2 = null;
                    }
                    $("#btnLogin").val(windowUnloading ? "Redirecting..." : "Log in");
            }
            function cbLoginAutomaticallyClicked()
            {
                    cancelAutoLogin();
                    if (!isStoredDataLoaded)
                            return;
                    var isChecked = $("#cbLoginAutomatically").is(":checked");
                    SetPersistedValue("bi_rememberMe", isChecked ? "1" : "0");
                    SetPersistedValue("bi_username", isChecked ? Base64.encode($("#txtUn").val()) : "");
                    SetPersistedValue("bi_password", isChecked ? Base64.encode($("#txtPw").val()) : "");
            }
            function GetPersistedValue(key)
            {
                    var value;
                    if (localStorageEnabled)
                            value = localStorage.getItem(key);
                    else if (cookiesEnabled)
                            value = $.cookie(key);
                    if (!value)
                            value = "";
                    return value;
            }
            function SetPersistedValue(key, value)
            {
                    if (localStorageEnabled)
                            return localStorage.setItem(key, value);
                    else if (cookiesEnabled)
                            return $.cookie(key, value, { expires: 365 });
            }
            function pwKeypress(ele, e)
            {
                    var keycode;
                    if (window.event) keycode = window.event.keyCode;
                    else if (typeof e != "undefined" && e) keycode = e.which;
                    else return true;

                    if (keycode == 13)
                    {
                            login();
                            return false;
                    }
                    else
                            return true;
            }
            function resized()
            {
                    if ($("#status").is(":visible"))
                    {
                            $("#status").hide();
                            $("#status").css("max-width", $("#login").width() + "px");
                            $("#status").show();
                    }
                    $('#login').css({ position: 'absolute', left: ($(window).width() - $('#login').outerWidth()) / 2, top: ($(window).height() - $('#login').outerHeight()) / 2 });
                    $("#status").css("max-width", $("#login").width() + "px");
                    var heightTotal = 0;
                    $("#login").children().each(function (idx, ele)
                    {
                            heightTotal += $(ele).outerHeight(true);
                    });
                    if (heightTotal > $(window).height())
                    {
                            if ($("#status").parent().attr("id") != "status_wrapper_upper")
                                    $("#status_wrapper_upper").append($("#status"));
                    }
                    else
                    {
                            if ($("#status").parent().attr("id") != "status_wrapper_lower")
                                    $("#status_wrapper_lower").append($("#status"));
                    }
                    $("#lblLoginAutomatically").parent().css("padding-left", (($('#login').outerWidth() - $("#lblLoginAutomatically").outerWidth(true)) / 2) + "px");
            };
            function HandleError(error)
            {
                    SetStatus(error, "#FF6262");
            }
            function SetStatus(html, color)
            {
                    if (typeof html == "undefined" || html == null || html == "")
                    {
                            html = "";
                            $("#status").hide();
                    }
                    else
                            $("#status").show();
                    if (typeof color == "undefined" || color == null || color == "")
                            color = "#FFFFFF";
                    $("#status").html(html);
                    $("#status").css("color", color);
                    resized();
            }
            function GetAutoLoginInstantly()
            {
                    return GetPersistedValue("bi_autoLoginInstant") == "1";
            }
            function GetPreferredContextMenuTrigger()
            {
                    return GetPersistedValue("ui3_contextMenus_longPress") == "1" ? "longpress" : "right";
            }
            function SetupLoginContextMenu()
            {
                    var onTriggerContextMenu = function (e)
                    {
                            if (GetAutoLoginInstantly())
                                    $("#autoLoginNoDelay").text("Auto Login: Instant (click to change)");
                            else
                                    $("#autoLoginNoDelay").text("Auto Login: Delayed (click to change)");
                            return true;
                    }
                    var onContextMenuAction = function ()
                    {
                            if (this.data.alias == "autoLoginNoDelay")
                                    SetPersistedValue("bi_autoLoginInstant", GetAutoLoginInstantly() ? "0" : "1");
                    }
                    var menuOptions =
                            {
                                    alias: "cmroot_login", width: "300px", items:
                                    [
                                            { text: '<span id="autoLoginNoDelay"></span>', icon: "", alias: "autoLoginNoDelay", action: onContextMenuAction }
                                    ]
                                    , onContextMenu: onTriggerContextMenu
                                    , clickType: GetPreferredContextMenuTrigger()
                            };
                    $(".checkboxWrapper,#btnLogin").contextmenu(menuOptions);
            }
    </script>

from machine_learning_security.

cybert79 commented on August 20, 2024

Ok, I tried it on a Kali machine with Metasploit installed locally: same error.
On a Kali machine with Metasploit installed on Debian: same error.
On Debian with Metasploit: same error.

(Phew, it's not my machine's error :) )

Am I maybe missing a module that is running on your setup but not on mine?

13o-bbr-bbq commented on August 20, 2024

Thanks for providing your info.
I understood the cause of this problem.
In this case, there are two problems.

  1. Why does the JSON exception occur?
    DeepExploit didn't handle an empty file.
    As a result, DeepExploit raised an exception while reading the JSON.

  2. Why is the JSON empty?
    Currently, Scrapy (the crawling module) can only parse HTML, not JavaScript.
    As a result, Scrapy couldn't extract any links.

I fixed problem 1).
Please update util.py.

However, I cannot quickly fix problem 2), because I need to redesign the Scrapy integration.
I'll support parsing JavaScript in the near future; one moment please.
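The guard for problem 1) might look roughly like this (a minimal sketch, not the actual util.py patch; `load_crawl_result` is a hypothetical helper name):

```python
import json
import os

def load_crawl_result(path):
    """Return the parsed crawl result, or an empty list when Scrapy
    produced a missing or empty file instead of valid JSON."""
    # Hypothetical helper: DeepExploit's real fix lives in util.py; this
    # only sketches the guard against an empty crawl_result JSON file.
    if not os.path.exists(path) or os.path.getsize(path) == 0:
        return []
    with open(path, "r", encoding="utf-8") as f:
        text = f.read().strip()
    if not text:
        return []
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        # Treat a half-written or corrupt file the same as an empty crawl.
        return []
```

The point is simply that an empty crawl (no links extracted) should degrade to "no web targets" rather than crash the whole run.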

cybert79 commented on August 20, 2024

mmmm, it seems to be in a loop now,...

[*] Save learned data: local_thread1
[*] Save learned data: local_thread1
[*] Save learned data: local_thread1
[*] Save learned data: local_thread1
[*] Save learned data: local_thread1
[*] Save learned data: local_thread1
....

The crawl_result JSON file is still empty...

13o-bbr-bbq commented on August 20, 2024

Hi,

> if I look at the code, I cannot find the line where the program should write to the crawl_result JSON file.

crawl_result.json is created by Scrapy (an external library).
Currently, Scrapy (the crawling module) can only parse HTML, not JavaScript.
As a result, Scrapy couldn't extract any links.

> mmmm, it seems to be in a loop now,...

I'll check this problem.
Please wait just a moment.
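To illustrate problem 2): an HTML-only crawler extracts zero links from a page whose navigation is entirely JavaScript-driven. A minimal stdlib sketch (the page body here is invented for illustration, in the spirit of the login page above):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href targets from <a> tags, the way an HTML-only
    crawler (like Scrapy without a JS engine) would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# All navigation happens via JavaScript; there are no plain <a href> links,
# so an HTML-only crawler finds nothing to follow.
js_driven_page = """
<html><body>
<input id="btnLogin" type="button" value="Log in" onclick="login()">
<script>function login() { window.location = "/live.htm"; }</script>
</body></html>
"""

collector = LinkCollector()
collector.feed(js_driven_page)
print(collector.links)  # → []
```

With no links extracted, Scrapy writes nothing into crawl_result.json, which is exactly the empty file seen above.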

cybert79 commented on August 20, 2024

'start_time': datetime.datetime(2018, 7, 22, 10, 21, 40, 603686)}
2018-07-22 12:21:40 [scrapy.core.engine] INFO: Spider closed (finished)
[!] [crawl_result/20180722122139_crawl_result.json] is empty.
[*] Gather HTTP responses.
[*] 2018-07-22 12:21:40 http://81.82.222.25:81/
[*] Analyzing port 21/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 22/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 25/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 53/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 80/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 81/tcp, unknown/1.1, Available exploit modules:0
[*] Analyzing port 443/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 1723/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 3000/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 10000/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 55555/tcp, unknown/0.0, Available exploit modules:0
[*] Saved target tree.
[+] Executing start: local_thread1

cybert79 commented on August 20, 2024

I see that you made some updates, so I updated too; let's see what happens. I'll keep you posted :)

cybert79 commented on August 20, 2024

When doing it on another machine to test:

{'downloader/request_bytes': 9908,
'downloader/request_count': 37,
'downloader/request_method_count/GET': 37,
'downloader/response_bytes': 2086690,
'downloader/response_count': 37,
'downloader/response_status_count/200': 36,
'downloader/response_status_count/404': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2018, 7, 23, 12, 19, 50, 783575),
'httperror/response_ignored_count': 1,
'httperror/response_ignored_status_count/404': 1,
'item_scraped_count': 34,
'log_count/DEBUG': 72,
'log_count/ERROR': 1,
'log_count/INFO': 11,
'memusage/max': 230842368,
'memusage/startup': 228425728,
'offsite/filtered': 16,
'request_depth_max': 1,
'response_received_count': 37,
'scheduler/dequeued': 37,
'scheduler/dequeued/memory': 37,
'scheduler/enqueued': 37,
'scheduler/enqueued/memory': 37,
'spider_exceptions/UnicodeDecodeError': 1,
'start_time': datetime.datetime(2018, 7, 23, 12, 17, 13, 295134)}
2018-07-23 14:19:50 [scrapy.core.engine] INFO: Spider closed (finished)
Traceback (most recent call last):
  File "DeepExploit.py", line 2086, in <module>
    target_tree = env.get_target_info(rhost, proto_list, info_list)
  File "DeepExploit.py", line 533, in get_target_info
    web_target_info = self.util.run_spider(rhost, web_port_list, self.client)
  File "/opt/machine_learning_security/DeepExploit/util.py", line 182, in run_spider
    if target_ip == util.parse_url(item).host:
  File "/usr/local/lib/python3.6/dist-packages/urllib3/util/url.py", line 199, in parse_url
    raise LocationParseError(url)
urllib3.exceptions.LocationParseError: Failed to parse: javascript:;
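The crash happens because a `javascript:;` pseudo-link reaches urllib3's `parse_url`. A guard could be sketched roughly like this (a hypothetical helper, not DeepExploit's actual code; note it also rejects relative links, which would first need to be resolved with `urljoin` against the page URL):

```python
from urllib.parse import urlparse

def is_crawlable(link):
    """True only for absolute http/https links; rejects javascript:,
    mailto:, and other pseudo-URLs that urllib3's parse_url chokes on."""
    # Hypothetical filter to apply before util.parse_url(item).host
    # in run_spider; relative links (empty scheme) are rejected too.
    scheme = urlparse(link.strip()).scheme.lower()
    return scheme in ("http", "https")
```

Filtering extracted links through such a predicate before calling `parse_url` would skip `javascript:;` instead of raising `LocationParseError`.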

cybert79 commented on August 20, 2024

I also noticed that if I run DeepExploit against a machine on my own network (private IP 192.168.1.10), its reaction is completely different than against a public IP. On a private network it seems to work fine.

cybert79 commented on August 20, 2024

2018-07-25 20:23:52 [scrapy.downloadermiddlewares.redirect] DEBUG: Redirecting (302) to <GET http://81.82.222.25:81/login.htm?page=%2F> from <GET http://81.82.222.25:81/>
2018-07-25 20:23:52 [scrapy.core.engine] DEBUG: Crawled (200) <GET http://81.82.222.25:81/login.htm?page=%2F> (referer: None)
2018-07-25 20:23:52 [scrapy.core.engine] INFO: Closing spider (finished)
2018-07-25 20:23:52 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 490,
'downloader/request_count': 2,
'downloader/request_method_count/GET': 2,
'downloader/response_bytes': 4589,
'downloader/response_count': 2,
'downloader/response_status_count/200': 1,
'downloader/response_status_count/302': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2018, 7, 25, 18, 23, 52, 451097),
'log_count/DEBUG': 3,
'log_count/INFO': 7,
'memusage/max': 205688832,
'memusage/startup': 205688832,
'response_received_count': 1,
'scheduler/dequeued': 2,
'scheduler/dequeued/memory': 2,
'scheduler/enqueued': 2,
'scheduler/enqueued/memory': 2,
'start_time': datetime.datetime(2018, 7, 25, 18, 23, 51, 525123)}
2018-07-25 20:23:52 [scrapy.core.engine] INFO: Spider closed (finished)
[!] [crawl_result/20180725202348_crawl_result.json] is empty.
[*] Gather HTTP responses.
[*] 2018-07-25 20:23:53 http://81.82.222.25:81/
[*] Analyzing port 81/tcp, unknown/1.1, Available exploit modules:0
[*] Analyzing port 1723/tcp, unknown/0.0, Available exploit modules:0
[*] Saved target tree.
[+] Executing start: local_thread1
[+] Executing start: local_thread2
[+] Executing start: local_thread3
[+] Executing start: local_thread4
[+] Executing start: local_thread5
[+] Executing start: local_thread6
[+] Executing start: local_thread7
[+] Executing start: local_thread8
[+] Executing start: local_thread9
[+] Executing start: local_thread10

cybert79 commented on August 20, 2024

So it starts the threads, but then nothing happens anymore :(

13o-bbr-bbq commented on August 20, 2024

This problem doesn't occur in my environment (Kali Linux 2018.2 or 2017.3).
But I have never tried a public IP address (only private IPs).

Please wait just a moment.
I'll examine the problem.

cybert79 commented on August 20, 2024

Well, it seems to happen on "most" public IPs. You can always try mine (81.82.222.25).

Trustgirl commented on August 20, 2024

I confirm the problem. Against the local Metasploitable it works normally, but against public ("white") IP addresses:

...
[*] Analyzing port 8100/tcp, unknown/0.0, Available exploit modules:0
[*] Analyzing port 8084:0/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 80:1/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 443:2/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 8000:3/tcp, unknown/8.5, Available exploit modules:0
[*] Analyzing port 8000:4/tcp, unknown/8.5, Available exploit modules:0
[*] Analyzing port 8087:5/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 443:6/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 80:7/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 8084:8/tcp, unknown/7.5, Available exploit modules:0
[*] Analyzing port 8087:9/tcp, unknown/7.5, Available exploit modules:0
[*] Saved target tree.
[+] Executing start: local_thread1
[+] Executing start: local_thread2
[+] Executing start: local_thread3
[+] Executing start: local_thread4
[+] Executing start: local_thread6
[+] Executing start: local_thread7
[+] Executing start: local_thread5
[+] Executing start: local_thread9
[+] Executing start: local_thread8
[+] Executing start: local_thread11
[+] Executing start: local_thread12
[+] Executing start: local_thread14
[+] Executing start: local_thread16
[+] Executing start: local_thread17
[+] Executing start: local_thread15
[+] Executing start: local_thread18
[+] Executing start: local_thread10
[+] Executing start: local_thread13
[+] Executing start: local_thread19
[+] Executing start: local_thread20

And that's all. Nothing more.
