thehive-project / cortex

Cortex: a Powerful Observable Analysis and Active Response Engine

Home Page: https://thehive-project.org

License: GNU Affero General Public License v3.0

Scala 46.84% JavaScript 26.34% HTML 19.64% Shell 4.91% Python 0.49% Dockerfile 0.06% SCSS 1.46% EJS 0.26%
analysis analyzer api cortex cyber-threat-intelligence dfir digital-forensics engine free free-software incident-response iocs observable open-source python response rest scala security-incidents thehive

cortex's People

Contributors

8ear, adl1995, cemasirt, garanews, jeromeleonard, mthlvt, nadouani, o101010, saadkadhi, stephen-oleary, to-om, vdebergue, vulnbe


cortex's Issues

Cortex removes the input details from failure reports

Request Type

Bug

Work Environment

Question Answer
Cortex version 1.1.3

Problem Description

Using the cortexutils library, analyzers include their input in the failure report to help users investigate errors that may occur.

It seems there is a regression in Cortex that removes the input attribute from failure reports and keeps only the success and errorMessage properties.

As an example, the failure report should look like

{
    "input": {
        "tlp": 1,
        "dataType": "file",
        "filename": "output.swf",
        ...
        "config": {
            ...
        }
    },
    "errorMessage": "An error occurred during the file scan",
    "success": false
}
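
The expected behaviour can be sketched in a few lines. This mimics what the cortexutils error helper produces (input carried alongside success and errorMessage); it is an illustration of the report shape, not the actual cortexutils implementation.

```python
import json

def failure_report(job_input, message):
    """Build a failure report that keeps the analyzer input."""
    return {
        "input": job_input,       # the attribute the regression drops
        "errorMessage": message,
        "success": False,
    }

report = failure_report(
    {"tlp": 1, "dataType": "file", "filename": "output.swf"},
    "An error occurred during the file scan",
)
print(json.dumps(report))
```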

Error when parsing analyzer failure report

Request Type

Bug

Work Environment

Question Answer
Cortex version / git hash 1.1.2
Package Type Any

Problem Description

Cortex doesn't correctly handle the case where an analyzer fails and returns a JSON report that includes an errorMessage field.

We end up with Cortex returning an incoherent error message

Persistence and Report Caching

Request Type

Feature Request

Work Environment

NA

Problem Description

The current version of Cortex (1.0.0) has no persistence. When it is restarted, all jobs are lost and the associated results are no longer available.

Moreover, if an analyst runs an analyzer through the Web UI or the REST API, they can't retrieve the report from TheHive or any other 3rd party tool that leverages the API; they need to re-run the analysis. This is a serious problem:

  • the report may contain useful information that would be lost if the analysis were executed again (time-based results)
  • some analyzers must not be run more often than required (quota-based services)

Possible Solutions

  1. Implement persistence so that analysts can retrieve past reports even after a restart
  2. Implement report caching, with specific API replies to tell TheHive or a 3rd party tool that a report is already available, so it can be retrieved instead of re-running the analysis.
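
The caching idea in solution 2 can be sketched as a store keyed by analyzer and observable with a TTL. The class, key shape, and analyzer name below are illustrative assumptions, not Cortex's actual design.

```python
import time

class ReportCache:
    """Cache analysis reports so identical jobs can reuse a recent result."""

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self._store = {}  # (analyzer, dataType, data) -> (timestamp, report)

    def put(self, analyzer, data_type, data, report):
        self._store[(analyzer, data_type, data)] = (time.time(), report)

    def get(self, analyzer, data_type, data):
        entry = self._store.get((analyzer, data_type, data))
        if entry is None:
            return None
        ts, report = entry
        if time.time() - ts > self.ttl:   # stale: force a fresh run
            return None
        return report

cache = ReportCache(ttl_seconds=600)
cache.put("MaxMind_GeoIP", "ip", "8.8.8.8", {"success": True})
cached = cache.get("MaxMind_GeoIP", "ip", "8.8.8.8")
```

A client such as TheHive would first ask the cache and only submit a new job on a miss, which addresses both the time-based and quota-based concerns above.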

After upgrade from Cortex 1.0.2 to 1.1.1, the system does not come up

Request Type

Bug

Work Environment

Question Answer
OS version (server) Debian 8
Cortex version / git hash 1.1.1
Package Type Debian Package

Problem Description

After the upgrade, Cortex does not come up.
It looks as if something is missing (maybe in the config?).

Error Messages:

2017-05-19 09:54:56,987 [INFO] from akka.event.slf4j.Slf4jLogger in application-akka.actor.default-dispatcher-2 - Slf4jLogger started
2017-05-19 09:54:58,701 [ERROR] from akka.actor.OneForOneStrategy in application-akka.actor.default-dispatcher-3 - Unable to provision, see the following errors:

1) Error injecting constructor, java.util.NoSuchElementException: None.get
  at services.MispSrv.<init>(MispSrv.scala:32)
  at services.MispSrv.class(MispSrv.scala:21)
  while locating services.MispSrv
    for parameter 1 at services.AnalyzerSrv.<init>(AnalyzerSrv.scala:12)
  at services.AnalyzerSrv.class(AnalyzerSrv.scala:12)
  while locating services.AnalyzerSrv
    for parameter 1 at services.JobActor.<init>(JobSrv.scala:109)
  while locating services.JobActor

1 error
akka.actor.ActorInitializationException: akka://application/user/JobActor: exception during creation
        at akka.actor.ActorInitializationException$.apply(Actor.scala:174)
        at akka.actor.ActorCell.create(ActorCell.scala:607)
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:461)
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:483)
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:282)
        at akka.dispatch.Mailbox.run(Mailbox.scala:223)
        at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: com.google.inject.ProvisionException: Unable to provision, see the following errors:

1) Error injecting constructor, java.util.NoSuchElementException: None.get
  at services.MispSrv.<init>(MispSrv.scala:32)
  at services.MispSrv.class(MispSrv.scala:21)
  while locating services.MispSrv
    for parameter 1 at services.AnalyzerSrv.<init>(AnalyzerSrv.scala:12)
  at services.AnalyzerSrv.class(AnalyzerSrv.scala:12)
  while locating services.AnalyzerSrv
    for parameter 1 at services.JobActor.<init>(JobSrv.scala:109)
  while locating services.JobActor

1 error
        at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1025)
        at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1051)
        at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:405)
        at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:400)
        at play.api.libs.concurrent.ActorRefProvider$$anonfun$1.apply(Akka.scala:210)
        at play.api.libs.concurrent.ActorRefProvider$$anonfun$1.apply(Akka.scala:210)
        at akka.actor.TypedCreatorFunctionConsumer.produce(IndirectActorProducer.scala:87)
        at akka.actor.Props.newActor(Props.scala:213)
        at akka.actor.ActorCell.newActor(ActorCell.scala:562)
        at akka.actor.ActorCell.create(ActorCell.scala:588)
        ... 9 common frames omitted
Caused by: java.util.NoSuchElementException: None.get
        at scala.None$.get(Option.scala:347)
        at scala.None$.get(Option.scala:345)
        at services.MispSrv.<init>(MispSrv.scala:34)
        at services.MispSrv$$FastClassByGuice$$52b14c8e.newInstance(<generated>)
        at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)
        at com.google.inject.internal.DefaultConstructionProxyFactory$1.newInstance(DefaultConstructionProxyFactory.java:61)
        at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:105)
        at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
        at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:267)
        at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
        at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1103)
        at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
        at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:145)
        at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
        at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:38)
        at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:62)
        at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:104)
        at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
        at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:267)
        at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
        at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1103)
        at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
        at com.google.inject.internal.SingletonScope$1.get(SingletonScope.java:145)
        at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
        at com.google.inject.internal.SingleParameterInjector.inject(SingleParameterInjector.java:38)
        at com.google.inject.internal.SingleParameterInjector.getAll(SingleParameterInjector.java:62)
        at com.google.inject.internal.ConstructorInjector.provision(ConstructorInjector.java:104)
        at com.google.inject.internal.ConstructorInjector.construct(ConstructorInjector.java:85)
        at com.google.inject.internal.ConstructorBindingImpl$Factory.get(ConstructorBindingImpl.java:267)
        at com.google.inject.internal.InjectorImpl$2$1.call(InjectorImpl.java:1016)
        at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1092)
        at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1012)
        ... 18 common frames omitted

Configuration:

# Secret key
# ~~~~~
# The secret key is used to secure cryptographics functions.
# If you deploy your application to several instances be sure to use the same key!
http.port=9001
play.crypto.secret="XXXXXXXXXXXXXXXXXXX"
analyzer {
  path = "/opt/Cortex-Analyzers/analyzers"
  config {
    global {
      proxy {
        http="http://x:8080",
        https="http://y:8080"
      }
    }
    DNSDB {
      server="https://api.dnsdb.info"
      key="..."
    }
    DomainTools {
      username="..."
      key="..."
    }
[...]
    PassiveTotal {
      key="..."
      username=".."
    }
    MISP {
      url="https://misp.local.dom"
      key="..."
      certpath=["/etc/ssl/private/misp.local.crt", ""]
      name="instance-1"
    }
  }
}

Cortex behind proxy

Feature Request

Work Environment

OS version (server) : Centos 7

Problem Description

My server is not public; it is on an internal network that uses a proxy server. How do I configure Cortex analyzer jobs to run behind a proxy?

Example: running an OTXQuery job, the connection stays in SYN_SENT (because I'm using a proxy) and the job returns an error.

Add proxy authentication

Play's HTTP client can use the proxy settings of the JVM but can't handle authentication. Please add support for proxy authentication.
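
On the analyzer side (the analyzers are Python scripts), an authenticated proxy can already be honoured via the standard proxy environment variables or an explicit ProxyHandler; the host names and credentials below are placeholders, and this does not cover Play's own HTTP client.

```python
import os
import urllib.request

# Credentials can be embedded in the proxy URL (user:pass@host); libraries
# such as urllib and requests pick these variables up automatically.
PROXY = "http://user:secret@proxy.internal:8080"
os.environ["HTTP_PROXY"] = PROXY
os.environ["HTTPS_PROXY"] = PROXY

# For explicit control, build an opener with a ProxyHandler instead of
# relying on environment variables.
handler = urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
opener = urllib.request.build_opener(handler)
# opener.open("https://...") would now route requests through the proxy
```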

Problem starting Cortex on Ubuntu 16.04

Request Type

Bug

Work Environment

Question Answer
OS version (server) Debian, Ubuntu
Cortex version / git hash 1.1.2
Package Type deb

Problem Description

After installing the Cortex package (deb), the service does not start and shows a Java stack trace

Steps to Reproduce

  1. Install a new Ubuntu 16.04.02 server and update to the latest kernel
  2. Following the guide, install Cortex from the .deb package
  3. Start Cortex (systemctl start cortex)

Possible Solutions

After installing Cortex, run these commands:

sudo apt-get remove openjdk-9-jre-headless
sudo apt-get install openjdk-8-jre-headless
sudo apt-get install cortex

Complementary information

The problem is that on Ubuntu 16.04 the Cortex package installs OpenJDK 9 by default, but Cortex requires OpenJDK 8.

Thanks to Nabil Adouani for fixing the problem :)

Error 500 in TheHive when a job is submitted to Cortex

Request Type

Bug

Work Environment

Question Answer
Cortex version 1.1.1
Package Type any

Problem Description

When a job is submitted, Cortex returns the created job, serialized in JSON. The serializer uses the wrong format (incorrect attributes), which makes TheHive raise a JsResultException.

Complementary information

The error is introduced by commit 9f8f1eb in file app/models/JsonFormat.scala.

Missing install directory

Request Type

Bug

Work Environment

Binary

Problem Description

Hello, when trying to test Cortex from the binaries with TheHive, I noticed that there is no install directory in the following packages:
thehive-cortex-latest.zip and cortex-latest.zip

Thanks

Option to disable misp-modules

Request Type

Feature Request by @crackytsi

Work Environment

Question Answer
OS version (server) Debian
OS version (client) 8.9
TheHive version / git hash 2.13.1
Package Type DEB

Problem Description

Currently, misp-modules can only be enabled or disabled as a whole.
It would be helpful to be able to disable selected modules,
especially when some modules make no sense in a given setup or do not work as they should.

Error when clicking out of the "New Analysis" box


Request Type

Bug

Work Environment

Question Answer
OS version (server) CentOS
OS version (client) Win 7
Cortex version / git hash 1.1.4, hash of the commit
Package Type Binary
Browser type & version Chrome 61

Problem Description

When clicking out of the "New Analysis" box, the following error appears in the bottom right corner (screenshot attached to the issue).

Steps to Reproduce

  1. Click "New Analysis" on header bar
  2. Once information box pops up click anywhere else on the browser window to make it go away
  3. Error pops up in corner

"TLP is higher than allowed" message with different analyzers

Request Type

Bug

Work Environment

Question Answer
OS version (server) Ubuntu 16.04
OS version (client) Windows 7
Cortex version / git hash 1.1.3-1
Package Type Binary (.deb)
Browser type & version Opera 46.0

Problem Description

I just finished installing the TheHive-Cortex bundle on my server, and I installed the analyzers as well, following the official documentation.
Some analyzers work well, but for a few of them I get the message "TLP is higher than allowed", coming from the analyzer's Python file, and I don't know why.

For example, I get this message when I run the VMRay analyzer on a file observable, or the WOT_Lookup analyzer on a URL.

I couldn't find any information on this message; sorry if it's obvious...

Steps to Reproduce

  1. Install The Hive
  2. Install Cortex
  3. Install Analyzers
  4. Test analyzers
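
The message comes from the TLP gate that cortexutils-style analyzers apply before running: when TLP checking is enabled and the observable's TLP exceeds the analyzer's max_tlp, the job is rejected. The sketch below mimics that logic; the exact config keys and defaults may differ per analyzer.

```python
def check_tlp_allowed(config, job_tlp):
    """Return True if the job may run under the analyzer's TLP policy."""
    if not config.get("check_tlp", False):
        return True                       # checking disabled: always allowed
    return job_tlp <= config.get("max_tlp", 3)

# An analyzer configured with max_tlp=1 rejects a TLP:AMBER (2) observable,
# producing the "TLP is higher than allowed" error.
print(check_tlp_allowed({"check_tlp": True, "max_tlp": 1}, 2))   # rejected
print(check_tlp_allowed({"check_tlp": False}, 3))                # allowed
```

So the fix is usually to raise max_tlp (or disable check_tlp) in the affected analyzers' configuration, or to lower the TLP of the observable.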

Complementary information

(screenshot: bugcortex1)

Redirect to jobs list when a job is not found

Request Type

Bug

Work Environment

Any

Problem Description

If a user accesses the details page of a deleted job, Cortex should redirect to the jobs list instead of displaying an empty job details page

Local, LDAP, AD and API Key Authentication

Request Type

Feature Request

Work Environment

NA

Problem Description

As stated in #2, anyone can access Cortex with no authentication. Anonymous users/services can run analyzers and consume quotas/queries and that is not desirable.

Possible Solutions

  1. Implement local, LDAP and AD authentication on the Web UI
  2. Implement local, LDAP and AD authentication on the REST API
  3. Implement API key authentication on the REST API for TheHive and 3rd party services

Complementary information

It must be possible to change or lock down the API key if it is compromised/leaked.
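
A minimal sketch of solution 3, including the ability to lock down or rotate a compromised key: keys are compared in constant time and revoked by replacing the stored value. The service names, storage, and helper functions are illustrative assumptions, not Cortex's actual implementation.

```python
import hmac
import secrets

API_KEYS = {"thehive": secrets.token_hex(32)}  # service name -> current key

def rotate_key(service):
    """Invalidate a compromised/leaked key by issuing a fresh one."""
    API_KEYS[service] = secrets.token_hex(32)
    return API_KEYS[service]

def authenticate(service, presented_key):
    expected = API_KEYS.get(service)
    if expected is None:
        return False
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(expected, presented_key)

old_key = API_KEYS["thehive"]
new_key = rotate_key("thehive")   # lock-down: old key stops working
```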

Wrong MISP config in conf/application.sample

Change the sample configuration for MISP in application.sample.

The new configuration should look like this:

      #url=["https://mymispserver_1", "https://mymispserver_2"]
      #key=["mykey_1", "mykey_2" ]
      #certpath=["", ""]
      #name=["misp_server_name_1", "misp_server_name_2"]

Provide Secret Key auth to upstream service


Request Type

Feature Request

Work Environment

NA

Problem Description

Currently, Cortex doesn't require any sort of authentication. It would be nice if authentication from an upstream service could be required, so that people can't just hit the Cortex URL and anonymously run analyzers.

At a minimum, using a secret key in TheHive config that is passed in all requests to Cortex would be a start; that would require some documentation on enabling HTTPS for Cortex (I was able to mimic the HTTPS setup for TheHive, so that won't be an issue).

A more verbose way would be to provide the auth information from TheHive and pass it to Cortex, which then accesses the Elasticsearch backend (or another auth backend) to gather keys or other info, record metrics, etc.

Application.conf doesn't have Yeti config nor allows for API Auth

Request Type

Bug

Work Environment

Question Answer
OS version (server) Ubuntu
OS version (client) OS X
Cortex version / git hash latest
Package Type Binary (Debian Package)
Browser type & version N/A

Problem Description

The Yeti analyzer doesn't have the default config in the application.conf

The Yeti analyzer doesn't allow for configuration of an API key. By default, Yeti is configured with NO auth. By deleting the default Yeti user, access to the UI / API requires an API key. The analyzer should be configured to allow API level access via token auth.

Steps to Reproduce

  1. Install Yeti
  2. Create a new user in Yeti
  3. Generate a new API key for the new Yeti users
  4. Delete the default Yeti user.

Possible Solutions

Have a config syntax in /etc/cortex/application.conf that provides:

  1. The URL of the Yeti server
  2. An API key for the Yeti API
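
Such a block might look like the following, written in the same style as the other analyzer sections of application.conf; the url and api_key field names are assumptions about a possible schema, not the actual one.

```
    Yeti {
      url = "https://yeti.local:5000"
      api_key = "..."
    }
```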

Add page loader

Request Type

Enhancement

Problem Description

This issue is about adding a page loader to the Cortex UI, similar to the one in TheHive.

Scala code cleanup

The aim of this task is to optimize imports, follow the code style guide, add return types for public methods, etc.

Provide way to reload conf file for new API keys without shutdown.


Request Type

Feature Request

Problem Description

Currently, as far as I can tell, if we get a new configuration for analyzers, we have to stop and start Cortex. This can cause failures in currently running analyses that shouldn't have to happen. Instead, a mechanism to either check the conf file every N seconds (configurable) for changes and reload on change, or to manually request a config reload, would help ensure currently running analyses are not interrupted.
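
The polling variant described above can be sketched as a watcher that compares the conf file's modification time every N seconds and fires a reload callback only when it changes, so running analyses are never restarted. The class name, callback, and demo file are illustrative.

```python
import os
import tempfile
import time

class ConfWatcher:
    """Poll a config file's mtime and trigger a reload callback on change."""

    def __init__(self, path, on_change, interval=30):
        self.path = path
        self.on_change = on_change
        self.interval = interval              # the configurable N seconds
        self._last_mtime = os.path.getmtime(path)

    def poll_once(self):
        """Check the file once; invoke the callback if it changed."""
        mtime = os.path.getmtime(self.path)
        if mtime != self._last_mtime:
            self._last_mtime = mtime
            self.on_change(self.path)
            return True
        return False
    # In a real service, a background thread/timer would call poll_once()
    # every `interval` seconds.

# demo: create a temp conf, simulate an edit, observe a single reload
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".conf")
tmp.write(b'analyzer { path = "/opt/analyzers" }')
tmp.close()

changed = []
watcher = ConfWatcher(tmp.name, on_change=changed.append, interval=30)
now = time.time()
os.utime(tmp.name, (now + 10, now + 10))   # bump the mtime, like an edit
watcher.poll_once()
```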

File_Info issue

Request Type

Bug

Work Environment

Question Answer
OS version (server) RedHat 6.9
OS version (client) N/A
Cortex version / git hash 1.1.4
Package Type RPM
Browser type & version N/A

Problem Description

The File_Info analyzer errors out while processing a file

Steps to Reproduce

  1. Opened the File_Info analyzer in Cortex
  2. Selected TLP:White, Data Type: File, then submitted a file containing a sample trojan
  3. Got the following failure report:

{
    "errorMessage": "Unexpected Error: 'module' object has no attribute 'hash_file'",
    "input": {
        "tlp": 0,
        "dataType": "file",
        "content-type": "application/pdf",
        "filename": "secured document.pdf",
        "file": "/tmp/cortex-4878520860249705835-datafile",
        "config": {
            "max_tlp": 3,
            "check_tlp": false,
            "service": ""
        }
    },
    "success": false
}

Missing logos and favicons

Request Type

Bug

Work Environment

Question Answer
Cortex version 1.1.0
Package Type Any

Problem Description

Release 1.1.0 should have included the new Cortex logo, which was not packaged during the release process.

Job status of jobs within Cortex is not updated when the status changes

Request Type

Bug

Work Environment

Question Answer
OS version (server) Debian 8
Cortex version / git hash 1.1.1-2
Package Type Debian Package

Problem Description

If you manually submit a job in Cortex via the web frontend, the job status is not refreshed: the new status only appears after you click the job list again.

By the way, at the very bottom of the page it says VERSION: Snapshot;
I would expect version 1.1.1...

MISP integration

Description

Add an API to communicate with MISP and run analyzers and misp-modules.

Display analyzers metadata

Request Type

Enhancement

Problem Description

The idea is to display, in the list of analyzers, the author and license of each analyzer when available.

Display analyzers only if necessary configuration values are set

Request Type

FR

Idea

If data the analyzer needs, such as a username, password or API key, is defined through a config file (the JSON file or similar), the analyzer should only be displayed if all config values marked as required are set.

Maybe this is something to consider together with

From my point of view, this is on the Cortex side, so I added the issue here.

API: Resource not found by Assets controller

Request Type

Bug

Work Environment

Question Answer
OS version (server) Debian stretch/sid
OS version (client) MacOS
Cortex version / git hash cortex 1.1.4-1
Package Type Binary, thehive 2.13.2-1

Problem Description

I installed TheHive and can't get the Cortex API to give me anything other than

A client error occurred on GET /api/analyzer : Resource not found by Assets controller

Accessing Cortex and TheHive through the Web UI works just fine, but API calls do not.

Curl request:

$ curl -v http://cortex-1-1.company.com:9000/api/analyzer
*   Trying 10.4.24.12...
* TCP_NODELAY set
* Connected to cortex-1-1.company.com (10.4.24.12) port 9000 (#0)
> GET /api/analyzer HTTP/1.1
> Host: cortex-1-1.company.com:9000
> User-Agent: curl/7.54.0
> Accept: */*
>
< HTTP/1.1 404 Not Found
< Set-Cookie: XSRF-TOKEN=e72aad4a879067855b4debd3cd2d2b2ff4c2cfa3-1509762781220-fcb9c302f8723d6ff4ac6b00; Path=/
< Date: Sat, 04 Nov 2017 02:33:01 GMT
< Content-Type: text/plain; charset=UTF-8
< Content-Length: 86
<
* Connection #0 to host cortex-1-1.company.com left intact
A client error occurred on GET /api/analyzer : Resource not found by Assets controller

Library request:

>>> from cortex4py.api import CortexApi
>>> api = CortexApi('http://cortex-1-1.company.com:9000', cert=False)
>>> api.get_analyzers()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.6/site-packages/cortex4py/api.py", line 75, in get_analyzers
    self.__handle_error(e)
  File "/usr/local/lib/python3.6/site-packages/cortex4py/api.py", line 51, in __handle_error
    raise_from(CortexException("Unexpected exception"), exception)
  File "/usr/local/lib/python3.6/site-packages/future/utils/__init__.py", line 398, in raise_from
    exec(execstr, myglobals, mylocals)
  File "<string>", line 1, in <module>
cortex4py.api.CortexException: Unexpected exception
Unexpected exception

Steps to Reproduce

  1. Make any API call

Possible Solutions

I don't know. I can't find anything in the logs to help point to an issue.

Complementary information

I tailed these log files and then made the API requests (curl, cortex4py).

thehive@cortex-1-1:~$ tail -f /var/log/cortex/application.log /var/log/elasticsearch/hive*.log /var/log/thehive/application.log
==> /var/log/cortex/application.log <==
2017-11-01 00:10:02,290 [INFO] from services.ExternalAnalyzerSrv in application-akka.actor.default-dispatcher-200 - Register analyzer DomainTools_WhoisLookup_IP 2.0 (DomainTools_WhoisLookup_IP_2_0)
2017-11-01 00:10:02,291 [INFO] from services.ExternalAnalyzerSrv in application-akka.actor.default-dispatcher-200 - Register analyzer DomainTools_ReverseNameServer 2.0 (DomainTools_ReverseNameServer_2_0)
2017-11-01 00:10:02,296 [INFO] from services.ExternalAnalyzerSrv in application-akka.actor.default-dispatcher-200 - Register analyzer DomainTools_ReverseIP 2.0 (DomainTools_ReverseIP_2_0)
2017-11-01 00:10:02,300 [INFO] from services.ExternalAnalyzerSrv in application-akka.actor.default-dispatcher-200 - Register analyzer DomainTools_ReverseWhois 2.0 (DomainTools_ReverseWhois_2_0)
2017-11-01 00:10:02,303 [INFO] from services.ExternalAnalyzerSrv in application-akka.actor.default-dispatcher-200 - Register analyzer MISP 2.0 (MISP_2_0)
2017-11-01 00:10:02,305 [INFO] from services.ExternalAnalyzerSrv in application-akka.actor.default-dispatcher-200 - Register analyzer CERTatPassiveDNS 2.0 (CERTatPassiveDNS_2_0)
2017-11-01 00:21:59,290 [INFO] from services.ExternalAnalyzerSrv in application-analyzer-216 - Execute sh -c "./WOT_lookup.py"  in WOT
2017-11-01 00:21:59,298 [INFO] from services.ExternalAnalyzerSrv in application-analyzer-215 - Execute sh -c "./passivetotal_analyzer.py"  in PassiveTotal
2017-11-01 00:21:59,302 [INFO] from services.ExternalAnalyzerSrv in application-analyzer-217 - Execute sh -c "./hippo.py"  in Hippocampe
2017-11-01 00:21:59,314 [INFO] from services.ExternalAnalyzerSrv in application-analyzer-218 - Execute sh -c "./safebrowsing_analyzer.py"  in GoogleSafebrowsing

==> /var/log/elasticsearch/hive-2017-09-27.log <==
[2017-09-27T06:06:21,193][INFO ][o.e.n.Node               ] [Mu2gqAX] starting ...
[2017-09-27T06:06:23,040][INFO ][o.e.t.TransportService   ] [Mu2gqAX] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2017-09-27T06:06:26,537][INFO ][o.e.c.s.ClusterService   ] [Mu2gqAX] new_master {Mu2gqAX}{Mu2gqAXqQz23sDcdhTBChw}{0AMSnT2YRIy82MvbxWVWiA}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-09-27T06:06:26,655][INFO ][o.e.h.n.Netty4HttpServerTransport] [Mu2gqAX] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2017-09-27T06:06:26,655][INFO ][o.e.n.Node               ] [Mu2gqAX] started
[2017-09-27T06:06:26,694][INFO ][o.e.g.GatewayService     ] [Mu2gqAX] recovered [0] indices into cluster_state
[2017-09-27T06:06:47,812][INFO ][o.e.n.Node               ] [Mu2gqAX] stopping ...
[2017-09-27T06:06:47,950][INFO ][o.e.n.Node               ] [Mu2gqAX] stopped
[2017-09-27T06:06:47,950][INFO ][o.e.n.Node               ] [Mu2gqAX] closing ...
[2017-09-27T06:06:48,000][INFO ][o.e.n.Node               ] [Mu2gqAX] closed

==> /var/log/elasticsearch/hive-2017-10-24.log <==
[2017-10-24T14:53:17,177][INFO ][o.e.n.Node               ] [Mu2gqAX] starting ...
[2017-10-24T14:53:17,403][INFO ][o.e.t.TransportService   ] [Mu2gqAX] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2017-10-24T14:53:20,494][INFO ][o.e.c.s.ClusterService   ] [Mu2gqAX] new_master {Mu2gqAX}{Mu2gqAXqQz23sDcdhTBChw}{OdYfKRLoTsubjaTXvWQUEw}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-10-24T14:53:20,516][INFO ][o.e.h.n.Netty4HttpServerTransport] [Mu2gqAX] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2017-10-24T14:53:20,516][INFO ][o.e.n.Node               ] [Mu2gqAX] started
[2017-10-24T14:53:20,527][INFO ][o.e.g.GatewayService     ] [Mu2gqAX] recovered [0] indices into cluster_state
[2017-10-24T14:54:28,661][INFO ][o.e.n.Node               ] [Mu2gqAX] stopping ...
[2017-10-24T14:54:28,734][INFO ][o.e.n.Node               ] [Mu2gqAX] stopped
[2017-10-24T14:54:28,734][INFO ][o.e.n.Node               ] [Mu2gqAX] closing ...
[2017-10-24T14:54:28,789][INFO ][o.e.n.Node               ] [Mu2gqAX] closed

==> /var/log/elasticsearch/hive-2017-10-26.log <==
[2017-10-26T15:19:23,592][INFO ][o.e.n.Node               ] [Mu2gqAX] starting ...
[2017-10-26T15:19:24,108][INFO ][o.e.t.TransportService   ] [Mu2gqAX] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2017-10-26T15:19:27,232][INFO ][o.e.c.s.ClusterService   ] [Mu2gqAX] new_master {Mu2gqAX}{Mu2gqAXqQz23sDcdhTBChw}{AmjP6OeeSnqyHpWGT7-emQ}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-10-26T15:19:27,315][INFO ][o.e.g.GatewayService     ] [Mu2gqAX] recovered [0] indices into cluster_state
[2017-10-26T15:19:27,329][INFO ][o.e.h.n.Netty4HttpServerTransport] [Mu2gqAX] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2017-10-26T15:19:27,329][INFO ][o.e.n.Node               ] [Mu2gqAX] started
[2017-10-26T15:21:13,944][INFO ][o.e.n.Node               ] [Mu2gqAX] stopping ...
[2017-10-26T15:21:14,063][INFO ][o.e.n.Node               ] [Mu2gqAX] stopped
[2017-10-26T15:21:14,063][INFO ][o.e.n.Node               ] [Mu2gqAX] closing ...
[2017-10-26T15:21:14,104][INFO ][o.e.n.Node               ] [Mu2gqAX] closed

==> /var/log/elasticsearch/hive_deprecation.log <==
[2017-10-26T15:16:00,891][WARN ][o.e.d.e.NodeEnvironment  ] ES has detected the [path.data] folder using the cluster name as a folder [/var/lib/elasticsearch], Elasticsearch 6.0 will not allow the cluster name as a folder within the data path
[2017-10-26T15:16:03,511][WARN ][o.e.d.c.s.Settings       ] [script.inline] setting was deprecated in Elasticsearch and will be removed in a future release! See the breaking changes documentation for the next major version.
[2017-10-26T15:19:10,728][WARN ][o.e.d.e.NodeEnvironment  ] ES has detected the [path.data] folder using the cluster name as a folder [/var/lib/elasticsearch], Elasticsearch 6.0 will not allow the cluster name as a folder within the data path
[2017-10-26T15:19:16,387][WARN ][o.e.d.c.s.Settings       ] [script.inline] setting was deprecated in Elasticsearch and will be removed in a future release! See the breaking changes documentation for the next major version.
[2017-10-30T21:51:07,057][WARN ][o.e.d.e.NodeEnvironment  ] ES has detected the [path.data] folder using the cluster name as a folder [/var/lib/elasticsearch], Elasticsearch 6.0 will not allow the cluster name as a folder within the data path
[2017-10-30T21:51:13,085][WARN ][o.e.d.c.s.Settings       ] [script.inline] setting was deprecated in Elasticsearch and will be removed in a future release! See the breaking changes documentation for the next major version.
[2017-10-30T21:58:29,717][WARN ][o.e.d.e.NodeEnvironment  ] ES has detected the [path.data] folder using the cluster name as a folder [/var/lib/elasticsearch], Elasticsearch 6.0 will not allow the cluster name as a folder within the data path
[2017-10-30T21:58:38,293][WARN ][o.e.d.c.s.Settings       ] [script.inline] setting was deprecated in Elasticsearch and will be removed in a future release! See the breaking changes documentation for the next major version.
[2017-10-30T22:27:24,122][WARN ][o.e.d.e.NodeEnvironment  ] ES has detected the [path.data] folder using the cluster name as a folder [/var/lib/elasticsearch], Elasticsearch 6.0 will not allow the cluster name as a folder within the data path
[2017-10-30T22:27:30,237][WARN ][o.e.d.c.s.Settings       ] [script.inline] setting was deprecated in Elasticsearch and will be removed in a future release! See the breaking changes documentation for the next major version.

==> /var/log/elasticsearch/hive_index_indexing_slowlog.log <==

==> /var/log/elasticsearch/hive_index_search_slowlog.log <==

==> /var/log/elasticsearch/hive.log <==
[2017-10-30T22:27:29,207][INFO ][o.e.p.PluginsService     ] [Mu2gqAX] no plugins loaded
[2017-10-30T22:27:37,468][INFO ][o.e.d.DiscoveryModule    ] [Mu2gqAX] using discovery type [zen]
[2017-10-30T22:27:39,227][INFO ][o.e.n.Node               ] initialized
[2017-10-30T22:27:39,228][INFO ][o.e.n.Node               ] [Mu2gqAX] starting ...
[2017-10-30T22:27:39,877][INFO ][o.e.t.TransportService   ] [Mu2gqAX] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
[2017-10-30T22:27:43,069][INFO ][o.e.c.s.ClusterService   ] [Mu2gqAX] new_master {Mu2gqAX}{Mu2gqAXqQz23sDcdhTBChw}{nEUlWNRUS1Suvo8R4uZ3Ig}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
[2017-10-30T22:27:43,153][INFO ][o.e.h.n.Netty4HttpServerTransport] [Mu2gqAX] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}
[2017-10-30T22:27:43,154][INFO ][o.e.n.Node               ] [Mu2gqAX] started
[2017-10-30T22:27:43,864][INFO ][o.e.g.GatewayService     ] [Mu2gqAX] recovered [1] indices into cluster_state
[2017-10-30T22:27:44,973][INFO ][o.e.c.r.a.AllocationService] [Mu2gqAX] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[the_hive_11][2]] ...]).

==> /var/log/thehive/application.log <==
2017-11-04 03:30:47,587 [INFO] from org.elasticsearch.plugins.PluginsService in main - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2017-11-04 03:30:47,588 [INFO] from org.elasticsearch.plugins.PluginsService in main - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
2017-11-04 03:30:47,588 [INFO] from org.elasticsearch.plugins.PluginsService in main - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
2017-11-04 03:30:47,588 [INFO] from org.elasticsearch.plugins.PluginsService in main - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2017-11-04 03:30:49,813 [INFO] from io.netty.util.internal.PlatformDependent in main - Your platform does not provide complete low-level API for accessing direct buffers reliably. Unless explicitly requested, heap buffer will always be preferred to avoid potential system instability.
2017-11-04 03:30:51,561 [INFO] from connectors.cortex.services.CortexClient in main - new Cortex(LOCAL CORTEX, http://localhost:9999, ) Basic Auth enabled: false
2017-11-04 03:30:51,591 [INFO] from connectors.cortex.services.CortexSrv in main - Search for unfinished job ...
2017-11-04 03:30:51,970 [INFO] from connectors.cortex.services.CortexSrv in application-akka.actor.default-dispatcher-3 - 0 jobs found
2017-11-04 03:30:53,246 [INFO] from play.api.Play in main - Application started (Prod)
2017-11-04 03:30:53,934 [INFO] from play.core.server.AkkaHttpServer in main - Listening for HTTP on /0:0:0:0:0:0:0:0:9000

Limit Rates and Respect Quotas

Request Type

Feature Request

Work Environment

NA

Problem Description

The current version of Cortex (1.0.0) supports neither rate limiting nor quota alerts for analyzers that rely on quota-based services.

This is a problem, as analysts may exhaust the number of queries allowed per time period (day, month...) or inadvertently abuse free services by sending too many queries.

Possible Solutions

  1. Implement configurable rate limiting to cap the number of analyzer executions according to the number of queries available per time period (depending on the subscription level of the service) or a reasonable query volume for a free service.
  2. Produce configurable quota alerts to warn analysts when the number of remaining queries falls below a certain threshold.

Complementary Information

The quota alerts should be shown in the Web UI next to each analyzer description and provided as a response to an API query (so that it can be displayed in TheHive).

TheHive should also have a page where analysts can view the query consumption per analyzer and how many queries are left, without having to connect to the Cortex Web UI.
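The rate-limiting idea above could be implemented as a token bucket per analyzer. A minimal sketch in Python; the class and its parameters are illustrative, not Cortex's API:

```python
import time

class TokenBucket:
    """Illustrative per-analyzer rate limiter (hypothetical, not Cortex's API)."""

    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity              # max queries allowed in a burst
        self.refill_per_sec = refill_per_sec  # quota regained per second
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        """Return True if one more analyzer run fits within the quota."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

    def remaining(self):
        return int(self.tokens)
```

A quota alert (the second suggestion) would then be a check of `remaining()` against a configured threshold before or after each run.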

errorMessage": "Error: Invalid output\nCapTipper v0.3 b13

Request Type

Bug

Work Environment

Question Answer
OS version (client) Ubuntu

Problem Description

I dump files from HTTP requests in a PCAP file using the Python tool CapTipper. After dumping the files into a folder, I scan the hash of each file using the virustotal Python module. I get this error:

{
  "errorMessage": "Error: Invalid output\nCapTipper v0.3 b13 - Malicious HTTP traffic explorer tool\nCopyright 2015 Omri Herscovici <[email protected]>\n\n[A] Analyzing PCAP: /tmp/cortex-5967164689721189551-datafile\n\n[+] Traffic Activity Time:  Wed, 07/07/10 03:16:19\n[+] Conversations Found:\n\n0: \u001b[35m / \u001b[0;0m -> text/html \u001b[0;0m(0.html)\u001b[0;0m [41.4 KB]  (Magic: \u001b[1mGZ\u001b[22m)\n1: \u001b[35m /sd/idlecore-tidied.css?T_2_5_0_300 \u001b[0;0m -> text/css \u001b[0;0m(idlecore-tidied.css)\u001b[0;0m [0.0 B] \n2: \u001b[35m /sd/print.css?T_2_5_0_300 \u001b[0;0m -> text/css \u001b[0;0m(print.css)\u001b[0;0m [0.0 B] \n3: \u001b[35m /sd/twitter_icon.png \u001b[0;0m -> image/png \u001b[32m(twitter_icon.png)\u001b[0;0m [0.0 B] \n4: \u001b[35m /sd/facebook_icon.png \u001b[0;0m -> image/png \u001b[32m(facebook_icon.png)\u001b[0;0m [3.4 KB]  (Magic: \u001b[1mPNG\u001b[22m)\n5: \u001b[35m /sd/cs_sic_controls_new.png?T_2_5_0_299 \u001b[0;0m -> image/png \u001b[32m(cs_sic_controls_new.png)\u001b[0;0m [0.0 B] \n6: \u001b[35m /sd/cs_i2_gradients.png?T_2_5_0_299 \u001b[0;0m -> image/png \u001b[32m(cs_i2_gradients.png)\u001b[0;0m [0.0 B] \n7: \u001b[35m /sd/logo2.png \u001b[0;0m -> image/png \u001b[32m(logo2.png)\u001b[0;0m [0.0 B] \n\n GZIP Decompression of object 0 (0.html) successful!\n New object created: 5\n\n Object 0 written to /home/asfandyar/Documents/Cap_Tipper/0-0.html\n\u001b[31m\n[E] Object: 1 (idlecore-tidied.css) : Response body was empty\u001b[0;0m\n\n Object 1 written to /home/asfandyar/Documents/Cap_Tipper/1-idlecore-tidied.css\n\u001b[31m\n[E] Object: 2 (print.css) : Response body was empty\u001b[0;0m\n\n Object 2 written to /home/asfandyar/Documents/Cap_Tipper/2-print.css\n\u001b[31m\n[E] Object: 3 (twitter_icon.png) : Response body was empty\u001b[0;0m\n\n Object 3 written to /home/asfandyar/Documents/Cap_Tipper/3-twitter_icon.png\n Object 4 written to /home/asfandyar/Documents/Cap_Tipper/4-facebook_icon.png\n Object 5 written to 
/home/asfandyar/Documents/Cap_Tipper/5-ungzip-0.html\n{\"artifacts\": [{\"type\": \"url\", \"value\": \"https://www.virustotal.com/file/c087d878b15538b92d20ea650317c36acc26cdaf5bb87408da144f856fd266be/analysis/1510913190/\"}, {\"type\": \"url\", \"value\": \"https://www.virustotal.com/file/e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855/analysis/1511443342/\"}, {\"type\": \"url\", \"value\": \"https://www.virustotal.com/file/054b71965ff26b3cfc5305a2e5d7fa29ebf123518f3992e2a7a29c251410aadc/analysis/1282392885/\"}, {\"type\": \"url\", \"value\": \"https://www.virustotal.com/file/e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855/analysis/1511443342/\"}, {\"type\": \"url\", \"value\": \"https://www.virustotal.com/file/b4a88d47cd933f7c0de51ac243ef8ff9b89554c2a88d2395e170f36ea3042e07/analysis/1510916881/\"}, {\"type\": \"url\", \"value\": \"https://www.virustotal.com/file/e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855/analysis/1511443342/\"}], \"full\": {\"RESULT\": [{\"Antivirus' total : \": 60, \"Permalink : \": \"https://www.virustotal.com/file/c087d878b15538b92d20ea650317c36acc26cdaf5bb87408da144f856fd266be/analysis/1510913190/\", \"Antivirus's positives : \": 0, \"Resource's status : \": \"Scan finished, information embedded\", \"FILE NAME : \": \"/home/asfandyar/Documents/Cap_Tipper/0-0.html\"}, {\"Antivirus' total : \": 61, \"Permalink : \": \"https://www.virustotal.com/file/e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855/analysis/1511443342/\", \"Antivirus's positives : \": 0, \"Resource's status : \": \"Scan finished, information embedded\", \"FILE NAME : \": \"/home/asfandyar/Documents/Cap_Tipper/3-twitter_icon.png\"}, {\"Antivirus' total : \": 42, \"Permalink : \": \"https://www.virustotal.com/file/054b71965ff26b3cfc5305a2e5d7fa29ebf123518f3992e2a7a29c251410aadc/analysis/1282392885/\", \"Antivirus's positives : \": 0, \"Resource's status : \": \"Scan finished, information embedded\", \"FILE 
NAME : \": \"/home/asfandyar/Documents/Cap_Tipper/4-facebook_icon.png\"}, {\"Antivirus' total : \": 61, \"Permalink : \": \"https://www.virustotal.com/file/e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855/analysis/1511443342/\", \"Antivirus's positives : \": 0, \"Resource's status : \": \"Scan finished, information embedded\", \"FILE NAME : \": \"/home/asfandyar/Documents/Cap_Tipper/1-idlecore-tidied.css\"}, {\"Antivirus' total : \": 59, \"Permalink : \": \"https://www.virustotal.com/file/b4a88d47cd933f7c0de51ac243ef8ff9b89554c2a88d2395e170f36ea3042e07/analysis/1510916881/\", \"Antivirus's positives : \": 0, \"Resource's status : \": \"Scan finished, information embedded\", \"FILE NAME : \": \"/home/asfandyar/Documents/Cap_Tipper/5-ungzip-0.html\"}, {\"Antivirus' total : \": 61, \"Permalink : \": \"https://www.virustotal.com/file/e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855/analysis/1511443342/\", \"Antivirus's positives : \": 0, \"Resource's status : \": \"Scan finished, information embedded\", \"FILE NAME : \": \"/home/asfandyar/Documents/Cap_Tipper/2-print.css\"}]}, \"success\": true, \"summary\": {}}\n",
  "success": false
}

Shodan Analyzer Fails - Module cortexutils Not Found

Request Type

Bug

Work Environment

Question Answer
OS version (server) Ubuntu 16.04 LTS
OS version (client) OS X
Cortex version / git hash Latest
Package Type Source
Browser type & version Chrome (N/A)

Problem Description

Enabling the Shodan analyzer fails with an error of:

Traceback (most recent call last):
  File "./shodan_analyzer.py", line 3, in <module>
    from cortexutils.analyzer import Analyzer
ImportError: No module named 'cortexutils'

To confirm it is installed:

root@ip-10-10-2-40:/opt/Cortex-Analyzers/analyzers/Shodan# pip search cortexutils
cortexutils (1.2.0)  - A Python library for including utility classes for Cortex analyzers
  INSTALLED: 1.2.0 (latest)

Steps to Reproduce

  1. Install Cortex
  2. pip install requirements.txt (under Shodan Analyzer folder)
  3. Execute the python script OR run the analyzer via WebUI
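`pip search` only shows the `INSTALLED` marker for the Python that `pip` itself is bound to, which may not be the interpreter that actually runs the analyzer. As a sketch, this checks what a given interpreter really sees:

```python
import importlib.util
import sys

# Which interpreter is running, and can it import cortexutils?
print("interpreter:", sys.executable)
spec = importlib.util.find_spec("cortexutils")
print("cortexutils:", spec.origin if spec else "not importable from this interpreter")
```

If the second line reports the module as not importable, cortexutils was installed for a different interpreter (e.g. Python 2 vs Python 3); installing it with the matching pip (e.g. `pip3 install cortexutils`) usually resolves the ImportError.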

Initialize MISP modules at startup

MISP modules initialization can take time. To speed up the startup of Cortex, this initialization is lazy (done only when it is needed).
This makes the first access to the Cortex analyzer list very slow.

Initialization should begin when Cortex starts, but in a separate thread, so that Cortex startup is not slowed down.
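A sketch of that approach, assuming a hypothetical `loader` callable that performs the slow MISP modules enumeration: start it eagerly in a daemon thread, and have the analyzer-list call block only if loading has not finished yet.

```python
import threading

class MispModuleRegistry:
    """Sketch of eager background initialization (hypothetical, not Cortex's actual code)."""

    def __init__(self, loader):
        self._modules = None
        self._ready = threading.Event()
        # Kick off the slow enumeration at startup without blocking it
        threading.Thread(target=self._initialize, args=(loader,), daemon=True).start()

    def _initialize(self, loader):
        self._modules = loader()
        self._ready.set()

    def list(self, timeout=None):
        """Block until initialization finishes; later callers return immediately."""
        self._ready.wait(timeout)
        return self._modules
```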

Fix page scroll issues

Request Type

Bug

Problem Description

Navigating from one page to another should automatically scroll to the top of the new page.

Create an Analyzer Store

Request Type

Feature Request

Work Environment

N/A

Problem Description

Cortex as of 1.x doesn't have a (simple) process for:

  1. Getting notifications when a new version of an existing analyzer is available or a new analyzer has been published
  2. Updating existing public analyzers to the latest versions (or select a specific version and lock updating)
  3. Installing new public analyzers

Moreover, Cortex does not currently allow teams to:

  1. Add analyzers from private sources
  2. Package analyzers in a simple way and share them easily with public or private communities (without having to use pull requests for public analyzers)
  3. Specify the license of their contributed analyzers and monetize them if they want to

Possible Solutions

Create an analyzer store, much like a mobile app store, that allows teams to:

  1. Browse through the existing analyzers, see their license, author(s), report samples, required input...
  2. Subscribe to different sources (public, private communities) of analyzers to install them, get notifications about new versions, and lock their update process to stay at a given version if they so please
  3. Install analyzers and configure them on-the-fly without having to restart Cortex
  4. Package and push new analyzers choosing where to share them (public, single or multiple private communities)
  5. Buy analyzers
  6. Obtain support when applicable

Provide alternative paths for analyzers in addition to standard path.

Request Type

Feature Request

Problem Description

In the application config, there is a "path to analyzers" field. It would be nice to be able to provide N paths to analyzers so that we could continue to use the built-in analyzers while also providing other locations that are checked. Each analyzer path would have its own config associated with it (we would have to determine whether "global" is truly global, or global per analyzer path).

This would allow folks to create alternate repos of analyzers that could be shared in private circles, whether trust groups or internally at an org, etc., without having to go through a merging process every time new analyzers are added to the main repo. Analyzer paths would be checked in order, and when the config is parsed, analyzer names would have to be globally unique (failure causes exit).

This would make upgrading, adding, and managing analyzers much easier at scale.
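The ordered lookup with global name uniqueness described above could be sketched like this (hypothetical helper, not the actual Cortex loader):

```python
from pathlib import Path

def discover_analyzers(paths):
    """Scan analyzer directories in order; analyzer names must be globally unique."""
    found = {}
    for base in paths:
        for entry in sorted(Path(base).iterdir()):
            if not entry.is_dir():
                continue
            if entry.name in found:
                # Name collision across paths: fail fast, as suggested above
                raise SystemExit(f"duplicate analyzer name: {entry.name}")
            found[entry.name] = entry
    return found
```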

No module named cortexutils.analyzer in abusefinder.py

Request Type

Bug

Work Environment

Question Answer
OS version (server) CentOS 7 (Kernel 3.10.0-514)
Cortex version / git hash 1.1.1
Package Type Binary

Problem Description

The Abuse_Finder analyzer fails with "ImportError: No module named cortexutils.analyzer".

Steps to Reproduce

  1. Open Cortex GUI
  2. launch Abuse_Finder_1_0
  3. Go to Job report

{
  "errorMessage": "Error: Invalid output\nTraceback (most recent call last):\n  File \"./abusefinder.py\", line 8, in <module>\n    from cortexutils.analyzer import Analyzer\nImportError: No module named cortexutils.analyzer\n",
  "success": false
}

Possible Solutions

Change the #!/usr/bin/env python shebang to use the Python interpreter of the virtual environment that has to be used with TheHive and Cortex.

Complementary information

(screenshot attachment: cortex_bug)

Cortex and MISP unclear and error-loop

Request Type

Bug

Work Environment

Question Answer
OS version (server) Debian 8
Cortex version / git hash 1.1.1-2
Package Type Debian Package

Problem Description

Hello,
Thanks a lot for your really good work!
Sorry, maybe it's my fault, but I have run out of ideas, so I am reporting it this way:

  1. My Cortex instance permanently loops on these messages:
May 19 21:02:46 debian-8-user cortex[19470]: import misp_modules
May 19 21:02:46 debian-8-user cortex[19470]: ImportError: No module named 'misp_modules'
May 19 21:02:46 debian-8-user cortex[19470]: [#033[37minfo#033[0m] application - GET /api/analyzer returned 500
May 19 21:02:46 debian-8-user cortex[19470]: java.lang.RuntimeException: Nonzero exit value: 1
May 19 21:02:46 debian-8-user cortex[19470]: at scala.sys.package$.error(package.scala:27)
May 19 21:02:46 debian-8-user cortex[19470]: at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.slurp(ProcessBuilderImpl.scala:132)
May 19 21:02:46 debian-8-user cortex[19470]: at scala.sys.process.ProcessBuilderImpl$AbstractBuilder.$bang$bang(ProcessBuilderImpl.scala:102)
May 19 21:02:46 debian-8-user cortex[19470]: at services.MispSrv.list$lzycompute(MispSrv.scala:46)
May 19 21:02:46 debian-8-user cortex[19470]: at services.MispSrv.list(MispSrv.scala:45)
May 19 21:02:46 debian-8-user cortex[19470]: at services.AnalyzerSrv.list(AnalyzerSrv.scala:18)
May 19 21:02:46 debian-8-user cortex[19470]: at controllers.AnalyzerCtrl$$anonfun$list$1.apply(AnalyzerCtrl.scala:19)
May 19 21:02:46 debian-8-user cortex[19470]: at controllers.AnalyzerCtrl$$anonfun$list$1.apply(AnalyzerCtrl.scala:18)
May 19 21:02:46 debian-8-user cortex[19470]: at play.api.mvc.ActionBuilder$$anonfun$apply$13.apply(Action.scala:371)
May 19 21:02:46 debian-8-user cortex[19470]: at play.api.mvc.ActionBuilder$$anonfun$apply$13.apply(Action.scala:370)
May 19 21:02:47 debian-8-user cortex[19470]: Traceback (most recent call last):
May 19 21:02:47 debian-8-user cortex[19470]: File "/opt/cortex/contrib/misp-modules-loader.py", line 10, in <module>

  2. Why are there now two different stanzas? What do they affect?
analyzer {
  path = "/opt/Cortex-Analyzers/analyzers"
  config {
    ...
  }
}

AND

misp.modules {
  enabled = true

  config {
    ...
  }
}
  3. Is it possible to provide one "full" config file containing all possible parameters (as comments)?
    I'm not sure if the old configuration of MISP (as a source) is still correct:

    MISP {
      url = "https://server"
      key = "mykey"
      certpath = ["/etc/ssl/private/misp.local.crt", ""]
      name = "instance-1"
    }

endless loop of cortex analyser call

Request Type

Bug

Work Environment

Question Answer
OS version (server) Debian 8
Cortex version / git hash 1.1.3-1
Package Type Binary

Problem Description

A Cortex analysis using geoip_country leads to an endless loop.
Excerpt:

Jun 30 13:59:37 server cortex[29365]: at [Source: 2017-06-30 13:57:12,616 - geoip_country - DEBUG - 62.210.15.114
Jun 30 13:59:37 server cortex[29365]: {"error": "GeoIP resolving error"}
Jun 30 13:59:37 server cortex[29365]: ; line: 1, column: 6]
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1586)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:521)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:450)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportMissingRootWS(ParserMinimalBase.java:466)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._verifyRootSpace(ReaderBasedJsonParser.java:1598)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._parsePosNumber(ReaderBasedJsonParser.java:1248)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:705)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3847)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:3765)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2050)
Jun 30 13:59:37 server cortex[29365]: [#033[37minfo#033[0m] application - GET /api/job/aOXIJkMHdn51RHsP/waitreport?atMost=1%20minute returned 500
Jun 30 13:59:37 server cortex[29365]: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): Expected space separating root-level values
Jun 30 13:59:37 server cortex[29365]: at [Source: 2017-06-30 13:57:12,616 - geoip_country - DEBUG - 62.210.15.114
Jun 30 13:59:37 server cortex[29365]: {"error": "GeoIP resolving error"}
Jun 30 13:59:37 server cortex[29365]: ; line: 1, column: 6]
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1586)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:521)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:450)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.base.ParserMinimalBase._reportMissingRootWS(ParserMinimalBase.java:466)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._verifyRootSpace(ReaderBasedJsonParser.java:1598)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._parsePosNumber(ReaderBasedJsonParser.java:1248)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:705)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.databind.ObjectMapper._initForReading(ObjectMapper.java:3847)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:3765)
Jun 30 13:59:37 server cortex[29365]: at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2050)
Jun 30 13:59:37 server cortex[29850]: [#033[37minfo#033[0m] a.e.s.Slf4jLogger - Slf4jLogger started
Jun 30 13:59:37 server cortex[29850]: [#033[37minfo#033[0m] s.MispSrv - MISP modules is enabled, loader is /opt/cortex/contrib/misp-modules-loader.py
Jun 30 13:59:37 server cortex[29850]: [#033[37minfo#033[0m] play.api.Play - Application started (Prod)
Jun 30 13:59:37 server cortex[29850]: [#033[37minfo#033[0m] p.c.s.NettyServer - Listening for HTTP on /0:0:0:0:0:0:0:0:9001
Jun 30 13:59:38 server cortex[29850]: [#033[37minfo#033[0m] application - GET /api/job/aOXIJkMHdn51RHsP/waitreport?atMost=1%20minute returned 404
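The JsonParseException above occurs because the analyzer's DEBUG log line and its JSON report are interleaved on stdout, and Cortex then fails to parse the combined output. A minimal sketch of the usual fix (illustrative, not the actual geoip_country code): send diagnostics to stderr so that stdout carries nothing but the JSON report.

```python
import json
import logging
import sys

# Diagnostics go to stderr; stdout is reserved for the JSON report
logging.basicConfig(stream=sys.stderr, level=logging.DEBUG,
                    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s")
log = logging.getLogger("geoip_country")

def report(result):
    log.debug("emitting report")
    out = json.dumps(result)
    sys.stdout.write(out + "\n")
    return out

report({"error": "GeoIP resolving error"})
```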

Use new logo and favicon

Request Type

Enhancement

Objectives

This task aims to include the new logo in the Cortex user interface.

Global section in configuration file is ignored

Request Type

Bug

Work Environment

any

Problem Description

The section "analyzer.config.global" in configuration file is not used to run analyzers. This section contains common items for all analyzers, in particular proxy configuration.

Workaround

Duplicate the content of the global section in each analyzer section.
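The intended merge semantics can be sketched as follows (assumption: analyzer-specific keys should override global ones):

```python
def effective_config(global_cfg, analyzer_cfg):
    """Merge analyzer.config.global with one analyzer's section; the analyzer wins on conflicts."""
    merged = dict(global_cfg)
    merged.update(analyzer_cfg)
    return merged
```

With this behaviour, the proxy settings would only need to appear once, in the global section.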

Disable analyzer in configuration file

Provide a way to disable analyzers in application.conf instead of removing analyzer folders after each update.

Possible Solutions

Add the list in /etc/cortex/application.conf:

[..]
analyzers {
    config {
        [..]
    }
    disabled = [
        analyzer_name_1,
        analyzer_name_4,
        analyzer_name_12,
        analyzer_name_N
    ]
}
[..]
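The effect of such a `disabled` list would simply be a filter applied after analyzer discovery; a sketch (hypothetical helper, not Cortex code):

```python
def enabled_analyzers(discovered, disabled):
    """Drop any analyzer whose name appears in the configured disabled list."""
    blocked = set(disabled)
    return [name for name in discovered if name not in blocked]
```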

Using python virtualenv for analyzers

Virtualenv support for analyzer: update documentation or add configuration option

Request Type

Feature request

Work Environment

Debian, Cortex with systemd

Problem Description

Cortex analyzers are run using the Python interpreter found in the PATH.
There is no configuration option to specify the Python interpreter or virtualenv to use.

Steps to Reproduce

  1. Install the analyzers' Python requirements in a virtualenv, e.g. /opt/cortex/virtual_env_path
  2. Run an analyzer with Cortex
  3. Errors occur, as cortexutils is not on the system Python's path

Possible Solutions

  1. Add a hint in the documentation saying:
    Change the systemd cortex.service to include the following in the [Service] section:
  # use a Cortex virtualenv for analyzers
  Environment=PATH=/opt/cortex/virtual_env_path/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
  2. Add a configuration option in application.conf to handle a virtualenv for analyzers

Cortex 1.1.0 doesn't work with TheHive 2.11.0

Request Type

Bug Maybe

Work Environment

Question Answer
OS version (server) Ubuntu
OS version (client) Ubuntu 16.04
Cortex version / git hash 1.1.0
Package Type Binary & Deb
Browser type & version Firefox

Problem Description

Yesterday I upgraded TheHive from 2.10 to 2.11 and all went well. I am using the "binary" deployment and everything was OK.

The Cortex engine was 1.0.1 and worked with TheHive 2.11; I have my own analyzers and they worked fine. But then I tried to upgrade Cortex from 1.0.1 to 1.1.0, first from the binary and later from the deb package.

  1. First problem: when I start Cortex, it doesn't show any analyzers. I checked application.conf and everything seems OK.

  2. From the deb package, the same problem: when I start Cortex 1.1.0, no analyzers show up.

I rolled back to Cortex 1.0.1 and it works well. Maybe some bugs? Are there any new config files to be configured?

Thanks in advance; I want to "release" my analyzers soon!
