thalesgroup-cert / watcher

797 stars, 41 watchers, 119 forks, 13.69 MB

Watcher - Open Source Cybersecurity Threat Hunting Platform. Developed with Django & React JS.

Home Page: https://thalesgroup-cert.github.io/Watcher

License: GNU Affero General Public License v3.0

Dockerfile 0.20% Python 42.86% JavaScript 38.39% CSS 17.75% HTML 0.31% Makefile 0.12% Batchfile 0.15% Shell 0.22%
cybersecurity threat-hunting django reactjs rss-bridge misp thehive security incident-response threat-detection

watcher's People

Contributors

dependabot[bot], eciavatta, felix83000, romainpisters


watcher's Issues

[Feature Request] Integration with pystemon for scraping pastebin-like sites

It's exciting to see the capabilities of Watcher.
I notice the implementation has a custom Pastebin scraping tool.
It might be worth considering the very modular pystemon as a backend, allowing many more paste-like websites to be scraped.
Multiple methods for exchanging data are already available (database, Redis, MongoDB, ...), and more are easy to integrate if needed.

searx-checker: The image for the service you're trying to recreate has been removed.

Describe the bug
Pulling searx-checker (searx/searx-checker:)...
ERROR: The image for the service you're trying to recreate has been removed. If you continue, volume data could be lost. Consider backing up your data before continuing.
Continue with the new image? [yN]y
Pulling searx-checker (searx/searx-checker:)...
ERROR: pull access denied for searx/searx-checker, repository does not exist or may require 'docker login': denied: requested access to the resource is denied

To Reproduce
Steps to reproduce the behavior:
docker-compose up -d
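
Note: the searx/searx-checker image appears to have been removed from Docker Hub, which is why the pull is denied. As a hedged workaround (not an official fix), if the checker service is not essential to your deployment you can comment out or delete the searx-checker service block in docker-compose.yml and recreate the remaining services:

# Hedged workaround: run after removing the searx-checker service from docker-compose.yml.
# --remove-orphans cleans up the container left over from the removed service definition.
docker-compose up -d --remove-orphans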

Authentication credentials were not provided

Describe the bug
Authentication popups are displayed even after logging in.

To Reproduce
Steps to reproduce the behavior:

  1. Connect to Watcher
  2. Authenticate with a valid user
  3. Browse the web interface tabs

Screenshots
(screenshot attached, taken 2020-10-22 at 17:27:29)

Trendy words and "word cloud" not updating

Hi

First of all, thanks for this great tool!

I initially installed Watcher on 11 January; since then, the "trendy words" and "word cloud" have not been updating, and no new words are showing up.

Any idea what is wrong, or how to trigger a manual update?

Cheers,
Dago

Unknown

Describe the bug
A clear and concise description of what the bug is.

To Reproduce
Steps to reproduce the behavior:

  1. Go to '...'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error

Expected behavior
A clear and concise description of what you expected to happen.

Screenshots
If applicable, add screenshots to help explain your problem.

Desktop (please complete the following information):

  • OS: [e.g. iOS]
  • Browser [e.g. chrome, safari]
  • Version [e.g. 22]

Additional context
Add any other context about the problem here.

Mastodon support

Hi

RSS-Bridge should be capable of monitoring Mastodon. Could you please help by providing the correct format of the link to add?

Like for Twitter it is:

hxxp://xx.xx.xx.xx/?action=display&bridge=Twitter&context=By+username&u=xxxxxx&norep=on&nopic=on&noimg=on&noimgscaling=on&format=Mrss

Is this the correct format for Mastodon?

hxxp://xx.xx.xx.xx/?action=display&bridge=MastodonBridge&canusername=xxxxxx&norep=on&noboost=on&signaturetype=noquery&format=Mrss

Many thanks for your help and this great tool!

[Feature Request] Bulk registration of new RSS feeds (or any other data)

Is your feature request related to a problem? Please describe.
I have a bunch of RSS feeds to add to my current Watcher instance, and adding them one by one is slow.

Describe the solution you'd like
A way to upload a text file of RSS feed URLs (with a validation step checking that each feed returns valid XML), or to fetch the list from a URL. (A rough import sketch is given at the end of this issue.)

Describe alternatives you've considered
n/a

Additional context
n/a
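
A rough sketch of the kind of bulk import requested, run from the Django shell inside the watcher container (docker-compose run watcher python manage.py shell). The model and field names (Source, url) are assumptions; check threats_watcher/models.py for the real ones.

# Hypothetical bulk import of RSS feeds from a text file (one URL per line).
# Model/field names are guesses; adapt them to Watcher's actual models.
import feedparser  # pip install feedparser - used here only to validate each feed

from threats_watcher.models import Source  # assumed import path

with open("feeds.txt") as handle:
    for line in handle:
        feed_url = line.strip()
        if not feed_url:
            continue
        parsed = feedparser.parse(feed_url)
        if parsed.bozo:  # malformed or non-XML feed, skip it
            print(f"skipping invalid feed: {feed_url}")
            continue
        Source.objects.get_or_create(url=feed_url)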

Watcher frontend doesn't load

Describe the bug
I did a clean install and enabled the default RSS feeds, but the frontend doesn't load. In the browser console I see:
SyntaxError: invalid for/in left-hand side
The administration interface works.
Am I doing something wrong?

To Reproduce
Fresh installation, just followed the steps.

Desktop (please complete the following information):

  • OS: CentOS 7
  • Browser Firefox

Alternative DNS servers?

This is not a bug report nor a feature request, but a question...
I'm using specific DNS resolvers with my containers. However, due to the large number of requests performed by Watcher, I'd like to use alternative DNS resolvers so as not to flood the current ones.
I presume that the container called 'watcher' is the one that generates most of the DNS requests, right?
Happy Holidays!
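
For reference, docker-compose supports per-service DNS resolvers via the dns key, so the watcher service (most likely the main DNS consumer) can be pointed at dedicated resolvers. A minimal sketch; the resolver IPs are placeholders:

# docker-compose.yml excerpt - custom resolvers for the watcher service only.
# The IP addresses below are placeholders; substitute your own resolvers.
services:
  watcher:
    dns:
      - 9.9.9.9
      - 149.112.112.112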

MySQLdb._exceptions.OperationalError: Illegal mix of collations

Describe the bug
Hello, thank you for the 7/7 and 24/24 fix; it works, but since the update I am getting errors in the logs.

Thank you very much for your help.

To Reproduce

docker-compose rm watcher
docker pull felix83000/watcher:latest
docker-compose run watcher bash
python manage.py migrate
exit
docker-compose up -d

Log
(all entries logged at 15:25:39)
Job "main_watch (trigger: cron[day_of_week='mon-sun', minute='*/30'], next run at: 2022-05-14 15:30:00 CEST)" raised an exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 89, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/mysql/base.py", line 75, in execute
    return self.cursor.execute(query, args)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/connections.py", line 254, in query
    _mysql.connection.query(self, query)
MySQLdb._exceptions.OperationalError: (1267, "Illegal mix of collations (utf8mb4_0900_ai_ci,IMPLICIT) and (utf8_general_ci,COERCIBLE) for operation '='")

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/apscheduler/executors/base.py", line 125, in run_job
    retval = job.func(*job.args, **job.kwargs)
  File "/app/Watcher/threats_watcher/core.py", line 76, in main_watch
    focus_on_top(settings.WORDS_OCCURRENCE)
  File "/app/Watcher/threats_watcher/core.py", line 216, in focus_on_top
    if TrendyWord.objects.filter(name=word):
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 324, in __bool__
    self._fetch_all()
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 1507, in _fetch_all
    self._result_cache = list(self._iterable_class(self))
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 57, in __iter__
    results = compiler.execute_sql(
  File "/usr/local/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1361, in execute_sql
    cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 67, in execute
    return self._execute_with_wrappers(
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 80, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 89, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/utils.py", line 91, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 89, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/mysql/base.py", line 75, in execute
    return self.cursor.execute(query, args)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/connections.py", line 254, in query
    _mysql.connection.query(self, query)
django.db.utils.OperationalError: (1267, "Illegal mix of collations (utf8mb4_0900_ai_ci,IMPLICIT) and (utf8_general_ci,COERCIBLE) for operation '='")
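
The error says one side of the comparison uses utf8mb4_0900_ai_ci while the other uses utf8_general_ci, i.e. the tables or columns were created with mixed character sets. A common remedy, given here only as a hedged sketch (the table name is a guess based on Django's app_model naming, and you should back up the database first), is to convert the affected table(s) to a single utf8mb4 collation:

-- Hedged sketch, not an official fix: align the (guessed) affected table with the
-- utf8mb4_0900_ai_ci collation used elsewhere. Back up the database before running.
ALTER TABLE threats_watcher_trendyword
  CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_0900_ai_ci;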

Cannot create super user

Describe the bug
After installation (on Ubuntu 20.04), when I try to create a superuser using:

docker-compose down
docker-compose run watcher bash
python manage.py createsuperuser

I get the following error message: django.db.utils.ProgrammingError: (1146, "Table 'db_watcher.auth_user' doesn't exist"). A possible fix is sketched at the end of this issue.

To Reproduce
Steps to reproduce the behavior:

  1. Type: docker-compose down
  2. Type: docker-compose run watcher bash

Expected behavior
I cannot create the superuser (but I can connect to the web portal).

Screenshots

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "manage.py", line 22, in <module>
    execute_from_command_line(sys.argv)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 419, in execute_from_command_line
    utility.execute()
  File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 413, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 354, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/usr/local/lib/python3.8/site-packages/django/contrib/auth/management/commands/createsuperuser.py", line 79, in execute
    return super().execute(*args, **options)
  File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 398, in execute
    output = self.handle(*args, **options)
  File "/usr/local/lib/python3.8/site-packages/django/contrib/auth/management/commands/createsuperuser.py", line 100, in handle
    default_username = get_default_username(database=database)
  File "/usr/local/lib/python3.8/site-packages/django/contrib/auth/management/__init__.py", line 141, in get_default_username
    auth_app.User._default_manager.db_manager(database).get(
  File "/usr/local/lib/python3.8/site-packages/django/db/models/manager.py", line 85, in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 431, in get
    num = len(clone)
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 262, in __len__
    self._fetch_all()
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 1324, in _fetch_all
    self._result_cache = list(self._iterable_class(self))
  File "/usr/local/lib/python3.8/site-packages/django/db/models/query.py", line 51, in __iter__
    results = compiler.execute_sql(chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size)
  File "/usr/local/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1169, in execute_sql
    cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 98, in execute
    return super().execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 66, in execute
    return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 75, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/utils.py", line 90, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/local/lib/python3.8/site-packages/django/db/backends/mysql/base.py", line 73, in execute
    return self.cursor.execute(query, args)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
  File "/usr/local/lib/python3.8/site-packages/MySQLdb/connections.py", line 259, in query
    _mysql.connection.query(self, query)
django.db.utils.ProgrammingError: (1146, "Table 'db_watcher.auth_user' doesn't exist")

Desktop (please complete the following information):

  • OS: Ubuntu 20.04.2 LTS
  • Browser Firefox 78.11esr
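
The ProgrammingError ("Table 'db_watcher.auth_user' doesn't exist") usually means the initial migrations were never applied. A likely fix, assembled from the commands already used elsewhere in these issues, is to run the migrations inside the watcher container before creating the superuser:

docker-compose run watcher bash
python manage.py migrate          # creates auth_user and the other Django tables
python manage.py createsuperuser
exit
docker-compose up -d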

400 Bad request

Hi everyone.

I'm still new to Docker, which is required to install Watcher. I installed both docker and docker-compose following the guide provided here: https://felix83000.github.io/Watcher/README.html.
Watcher is also installed following the same guide, but once I reach the step "Try Access Watcher on http://0.0.0.0:9002 or http://yourserverip:9002." I get a "400 Bad Request" error.
I can also see the error messages in the shell of the Watcher VM:
"GET / HTTP/1.1" 400 143
"GET /favicon.ico HTTP/1.1" 400 143

Am I missing something?

Thanks in advance.

Question: Monitor for information leaks -> what sources?

Hi

Please allow me to ask the following questions:

  • What exact information sources are being checked for the Data Leak feature? It says "Monitor for information leaks, for example in Pastebin & other IT content exchange websites (stackoverflow, github, gitlab, bitbucket, apkmirror, npm...)." What is the exact list, and where can it be found?

  • And is it possible to add/remove sources? If yes, how?

Thank you very much for this tool and your response!

Dago

[Feature Request] Use Watcher to "watch" topics of interest

Watcher is really an excellent idea, covering good use cases for discovering cybersecurity threats; thanks for your work. Building on the idea that Watcher automatically finds "trendy" words in the configured RSS feeds, would it be possible to also add words of interest by hand? Suppose your company works with five technologies and you want to know as fast as possible whenever any of the RSS feeds Watcher constantly reviews publishes a vulnerability/exploit for one of these technologies, a new version of them, or similar. Do you think this feature would be interesting to add to Watcher?

Running behind a reverse proxy?

Not a feature request or a bug, but a request for advice...
Is anybody already running Watcher behind a reverse proxy (Traefik in my case)?
In the official docker-compose file, IP addresses are assigned to the containers. My idea is to also connect the 'watcher' container to my Traefik network so it can be reached by the reverse proxy.
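
For what it's worth, a minimal Traefik v2 sketch: attach the watcher service to the Traefik network and declare the router and backend port (9002, per the compose file) via labels. The network name traefik_proxy and the hostname watcher.example.org are placeholders:

# docker-compose.yml excerpt - hedged sketch for exposing watcher through Traefik v2.
services:
  watcher:
    networks:
      - default
      - traefik_proxy
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.watcher.rule=Host(`watcher.example.org`)"
      - "traefik.http.services.watcher.loadbalancer.server.port=9002"

networks:
  traefik_proxy:
    external: true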

[Improvements] Suggesting some improvements

First of all thanks for creating and sharing such a great project. 💯

I have some suggestions regarding the code and the documentation.
For the documentation, it would be useful to state which Docker and docker-compose versions are compatible with and used by the project.

Regarding the code:
The Django apps contain a core.py; it may be better to use a different filename, since 'core' in Django also refers to Django's built-in core modules. I'd suggest services.py or utils.py, which would make the code clearer and more readable.

Regarding the threats_watcher app: why are you saving only tokenized words in TrendyWord and not the complete post content? IMHO it's more useful to keep the full content of a post.

While checking your code I found that RSS-Bridge is used for Twitter. If the container is used only for Twitter, maybe you could use a tool like https://github.com/InQuest/ThreatIngestor, which can extract and aggregate threat intelligence from RSS feeds, Twitter, Pastebin and other sources. This would streamline the project, since a Docker image like RSS-Bridge is overkill if it is only used for Twitter.

In the site_monitoring app's model, rtir (the RTIR identification number) should be a non-required field, as some people might not have the RTIR platform (https://github.com/bestpractical/rt) configured or in place. Also, the Expiry Date field is not always the best option: you might want to monitor a certain domain until it is taken down, and you don't know in advance when that will happen.

Screenshots:
In site_monitoring and dns_finder, it would be an awesome feature to take screenshots of the monitored domains and also check for changes in the image. You could use https://hub.docker.com/r/scrapinghub/splash to take screenshots and https://pypi.org/project/ImageHash/ to check how much the image has changed.
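
A rough sketch of that suggestion, assuming a Splash container reachable at http://splash:8050 (the hostname is a placeholder) and the ImageHash and Pillow packages:

# Minimal sketch of the suggested screenshot + image-diff approach.
import io

import imagehash          # pip install ImageHash
import requests
from PIL import Image     # pip install Pillow


def take_screenshot(url: str, splash_base: str = "http://splash:8050") -> Image.Image:
    """Render a page through Splash's /render.png endpoint and return it as a PIL image."""
    resp = requests.get(
        f"{splash_base}/render.png",
        params={"url": url, "wait": 2},
        timeout=60,
    )
    resp.raise_for_status()
    return Image.open(io.BytesIO(resp.content))


def has_changed(previous: Image.Image, current: Image.Image, threshold: int = 5) -> bool:
    """Compare perceptual hashes; a distance above `threshold` means the page changed."""
    distance = imagehash.average_hash(previous) - imagehash.average_hash(current)
    return distance > threshold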

Why are you using MySQL as the database? Alternatives such as PostgreSQL have built-in ArrayField and JSONField support, which could simplify some model implementations.

Hope this helps.

Problem parsing some trendy words starting with '

A trendy word that is sometimes preceded by an apostrophe (') in the tweet text is detected and stored with the leading apostrophe included.

For example:
Trendy Word: 'windows
Text in Twitter:
"A remote code execution vulnerability exists when Windows Network Address Translation (NAT) fails to properly handle UDP traffic, aka 'Windows NAT Remote Code Execution Vulnerability'." On NVD CVE, it's an RCE but on Microsoft website it's a DoS.

Snapshots:

[ Feature request ] Add THE_HIVE_VERIFY_SSL

Hello,

Is it possible to add an option to disable SSL verification for TheHive, as already exists for MISP?
Example:

Extract of docker-compose

      THE_HIVE_URL: https://${HOSTNAME}/thehive
      THE_HIVE_VERIFY_SSL: "False"
      THE_HIVE_KEY: ${THE_HIVE_KEY}
      THE_HIVE_CASE_ASSIGNEE: ${THE_HIVE_CASE_ASSIGNEE}
      THE_HIVE_TAGS: ${THE_HIVE_TAGS}
      MISP_URL: https://${HOSTNAME}/misp/
      MISP_VERIFY_SSL: "False"
      MISP_KEY: ${MISP_KEY}
      MISP_TICKETING_URL: ${MISP

I love your project :)

Monitor products or companies

Hello,

I would like to be able to monitor only specific companies or products.
By monitor, I mean that instead of getting random information in the Watcher dashboard, I would like to receive information about vulnerabilities or attacks that have been publicly disclosed for the companies I have defined.
I am not sure if this is currently possible, or at least I do not think so, but if it is, could you please explain how to do it?

Thank you very much for your help

Have a good day

Raumin

[Improvements] Data Leak Keywords - RegEX support

Is your feature request related to a problem? Please describe.
It would be really helpful to have pattern matching alongside exact matching. Currently we can only search for exact keywords; in the real world an exact match often does not suffice, so pattern-matching support would be a big improvement.

Describe the solution you'd like
A regex pattern could be entered in the GUI and used to find data matching the pattern. This would be really useful for finding leaked credentials and other data. (An illustrative sketch is given at the end of this issue.)

Describe alternatives you've considered
Currently Watcher does not support pattern matching so no alternatives from Watcher end.
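
To illustrate the request (this is not existing Watcher functionality), a user-defined pattern for credential-like strings could replace an exact keyword like this:

# Illustration only: a hypothetical user-supplied regex matched against paste content.
import re

# Hypothetical pattern a user might register instead of an exact keyword.
LEAK_PATTERN = re.compile(r"(?i)(api[_-]?key|secret)\s*[:=]\s*['\"]?[A-Za-z0-9_\-]{16,}")

def matches_leak(content: str) -> bool:
    """Return True if the paste/page content matches the configured pattern."""
    return LEAK_PATTERN.search(content) is not None

print(matches_leak('api_key = "AB12cd34EF56gh78IJ90"'))  # True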

Word Cloud and Trendy Words are not populating

Describe the bug
The Word Cloud and Trendy Words on the home page are not populated (they never have been).

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'www.localhost.com:9002'
  2. Click on 'Watcher icon (Home Page)'

Expected behavior
I expect to see the Word Cloud and Trendy Words on the home page.

Desktop (please complete the following information):

  • OS: Centos7
  • Browser : Firefox
  • Version: fresh install via docker-compose

Additional context
I did a fresh installation via docker-compose, performed the migration and database-population steps, then ran docker-compose up. The word cloud and trendy words still did not populate (I waited for hours). In the logs, the Trendy Words API returns a 200 response code.

Tweets not found

Describe the bug
The Twitter search-engine request returns an error, so results cannot be displayed in Watcher.

To Reproduce
Steps to reproduce the behavior:

  1. Go to '0.0.0.0:9002'
  2. Click on a 'Trendy Word'
  3. Scroll down to 'Tweet(s) related to ***'
  4. See the problem
  5. Check the problem when starting the service on the line "searx-checker | twitter....ERROR No result"

Screenshots
(screenshot attached: twitter_error)

Searx error when executing 'docker-compose up'

When running docker-compose up I'm getting the following error in standard output.
Not sure if it's causing additional errors.

watcher          | [04/Nov/2020 22:04:19] "GET /api/dns_finder/alert/ HTTP/1.1" 200 2
watcher          | [04/Nov/2020 22:04:19] "GET /api/dns_finder/dns_monitored/ HTTP/1.1" 200 2
watcher          | [04/Nov/2020 22:04:25] "POST /api/dns_finder/dns_monitored/ HTTP/1.1" 201 85
searx-checker    | 2020/11/04 11:04:59 Check with the parameters : -o html/data/status.json http://10.10.10.3:8080
searx-checker    | Searx version : 0.17.0-224-1b42d426
searx-checker    | Testing 108 engines of http://10.10.10.3:8080
searx-checker    | This might take a while...
searx-checker    | 1337x.Traceback (most recent call last):
searx-checker    |   File "./checker/checker.py", line 228, in <module>
searx-checker    |     main(args.url, args.o)
searx-checker    |   File "./checker/checker.py", line 215, in main
searx-checker    |     state = check_engines_state(instance_url, engines)
searx-checker    |   File "./checker/checker.py", line 168, in check_engines_state
searx-checker    |     provides_results = _request_results(instance_url, engine)
searx-checker    |   File "./checker/checker.py", line 156, in _request_results
searx-checker    |     engine_result = _query_engine_result(instance_url, engine, query)
searx-checker    |   File "./checker/checker.py", line 146, in _query_engine_result
searx-checker    |     return _check_response(resp)
searx-checker    |   File "./checker/checker.py", line 124, in _check_response
searx-checker    |     resp_json = resp.json()
searx-checker    |   File "/usr/local/lib/python3.8/site-packages/requests/models.py", line 898, in json
searx-checker    |     return complexjson.loads(self.text, **kwargs)
searx-checker    |   File "/usr/local/lib/python3.8/json/__init__.py", line 357, in loads
searx-checker    |     return _default_decoder.decode(s)
searx-checker    |   File "/usr/local/lib/python3.8/json/decoder.py", line 337, in decode
searx-checker    |     obj, end = self.raw_decode(s, idx=_w(s, 0).end())
searx-checker    |   File "/usr/local/lib/python3.8/json/decoder.py", line 355, in raw_decode
searx-checker    |     raise JSONDecodeError("Expecting value", s, err.value) from None
searx-checker    | json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
searx-checker    | Sleep 86316 seconds
watcher          | 2020-11-04 22:05:00.001430 - CRON TASK : Fetch searx & pastebin
watcher          | 2020-11-04 22:05:00.004532 - Querying Searx for:  "data breach"

searx container always restarting

Describe the bug
I'm using the latest Watcher version (retrieved yesterday from GitHub).

To Reproduce
Steps to reproduce the behavior:

  1. Launch docker-compose up using proxy defined in .env

Expected behavior
A clear and concise description of what you expected to happen.

Screenshots

root@vm:/opt/Watcher# docker ps
CONTAINER ID   IMAGE                             COMMAND                  CREATED          STATUS                          PORTS                                                  NAMES
7e6c9c22e48f   felix83000/watcher:latest         "sh -c '/tmp/wait-fo…"   56 minutes ago   Up 2 minutes                    0.0.0.0:443->9002/tcp, :::443->9002/tcp                watcher
398a095b5489   searx/searx-checker               "/sbin/tini -- /usr/…"   16 hours ago     Up 2 minutes                                                                           searx-checker
07bb317190bf   searx/searx:0.18.0-341-ae0b621e   "/sbin/tini -- /usr/…"   16 hours ago     Restarting (1) 28 seconds ago                                                              searx
9cfdc6e5d94b   rssbridge/rss-bridge:latest       "docker-php-entrypoi…"   16 hours ago     Up 2 minutes                    80/tcp                                                 rss-bridge
44046aff6a26   mysql:latest                      "docker-entrypoint.s…"   16 hours ago     Up 2 minutes                    33060/tcp, 0.0.0.0:3307->3306/tcp, :::3307->3306/tcp   db_watcher

Desktop (please complete the following information):

  • OS: Ubuntu 20.04
  • Browser Firefox 78.22esr
  • Version [e.g. 22]

Additional context
On the Watcher page, we don't have tweet information.
On the Data Leak page, we don't have any information either.
watcher.txt

Question: Move to 7/7 days and 24/24h

Hello,

Thank you for your project. If possible, I would like to know how to change main_watch to run 7 days a week and 24 hours a day (7/7 and 24/24) for the word updates.

main_watch (trigger: cron[day_of_week='mon-fri', hour='7-18', minute='*/30'], next run at: 2022-04-25 15:00:00 CEST)" raised an exception

Thank you very much for your work.
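
For context, the trigger in that log line is an APScheduler cron trigger. A sketch of the change being asked about, with the caveat that where exactly Watcher registers this job is an assumption (the traceback in the collation issue above points at threats_watcher/core.py):

# Sketch of the kind of APScheduler cron trigger shown in Watcher's logs.
from apscheduler.schedulers.background import BackgroundScheduler

def main_watch():
    ...  # Watcher's existing job

scheduler = BackgroundScheduler()

# Original behaviour: weekdays only, 07:00-18:00, every 30 minutes.
# scheduler.add_job(main_watch, 'cron', day_of_week='mon-fri', hour='7-18', minute='*/30')

# 7/7 and 24/24: every 30 minutes, every day, all hours.
scheduler.add_job(main_watch, 'cron', day_of_week='mon-sun', minute='*/30')
scheduler.start()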

[Improvements] SMTP auth/TLS?

Hi! I love Watcher and use it to keep track of a few sites. I was wondering if it is possible to have the Django email use SMTP auth and/or TLS? I want to point it at an external email server, but that server needs authentication and TLS to work. I looked over the docs but didn't see anything.
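
For reference, SMTP authentication and TLS are covered by stock Django settings; whether Watcher currently exposes them through its .env / docker-compose environment is an assumption. A minimal sketch of the relevant settings.py entries (the environment variable names are hypothetical):

# Stock Django SMTP settings; the env var names and default host are placeholders.
import os

EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = os.environ.get("EMAIL_HOST", "smtp.example.org")
EMAIL_PORT = int(os.environ.get("EMAIL_PORT", 587))
EMAIL_USE_TLS = True                                  # STARTTLS
EMAIL_HOST_USER = os.environ.get("EMAIL_HOST_USER", "")
EMAIL_HOST_PASSWORD = os.environ.get("EMAIL_HOST_PASSWORD", "")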

Bad request

Well, I am getting Bad Request (400) even though I have set: ALLOWED_HOST=['*']

Creating network "watcher_default" with the default driver
Creating rss-bridge    ... done
Creating searx         ... done
Creating searx-checker ... done
Creating db_watcher    ... done
Creating watcher       ... done
Attaching to searx, searx-checker, db_watcher, rss-bridge, watcher
db_watcher       | 2020-10-15 17:10:14+02:00 [Note] [Entrypoint]: Entrypoint script for MySQL Server 8.0.21-1debian10 started.
db_watcher       | 2020-10-15 17:10:14+02:00 [Note] [Entrypoint]: Switching to dedicated user 'mysql'
db_watcher       | 2020-10-15 17:10:14+02:00 [Note] [Entrypoint]: Entrypoint script for MySQL Server 8.0.21-1debian10 started.
db_watcher       | 2020-10-15T15:10:14.957461Z 0 [System] [MY-010116] [Server] /usr/sbin/mysqld (mysqld 8.0.21) starting as process 1
db_watcher       | 2020-10-15T15:10:14.968635Z 1 [System] [MY-013576] [InnoDB] InnoDB initialization has started.
db_watcher       | 2020-10-15T15:10:15.471750Z 1 [System] [MY-013577] [InnoDB] InnoDB initialization has ended.
db_watcher       | 2020-10-15T15:10:15.878610Z 0 [System] [MY-011323] [Server] X Plugin ready for connections. Bind-address: '::' port: 33060, socket: /var/run/mysqld/mysqlx.sock
db_watcher       | 2020-10-15T15:10:16.032563Z 0 [Warning] [MY-010068] [Server] CA certificate ca.pem is self signed.
db_watcher       | 2020-10-15T15:10:16.032781Z 0 [System] [MY-013602] [Server] Channel mysql_main configured to support TLS. Encrypted connections are now supported for this channel.
db_watcher       | 2020-10-15T15:10:16.040243Z 0 [Warning] [MY-011810] [Server] Insecure configuration for --pid-file: Location '/var/run/mysqld' in the path is accessible to all OS users. Consider choosing a different directory.
db_watcher       | 2020-10-15T15:10:16.092680Z 0 [System] [MY-010931] [Server] /usr/sbin/mysqld: ready for connections. Version: '8.0.21'  socket: '/var/run/mysqld/mysqld.sock'  port: 3306  MySQL Community Server - GPL.
searx            | searx version 0.17.0-181-e78bfd4d
searx            |
searx            | Use existing /etc/searx/uwsgi.ini
searx            | Use existing /etc/searx/settings.yml
rss-bridge       | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 10.10.10.7. Set the 'ServerName' directive globally to suppress this message
rss-bridge       | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 10.10.10.7. Set the 'ServerName' directive globally to suppress this message
rss-bridge       | [Thu Oct 15 17:10:15.598192 2020] [mpm_prefork:notice] [pid 1] AH00163: Apache/2.4.38 (Debian) PHP/7.4.11 configured -- resuming normal operations
rss-bridge       | [Thu Oct 15 17:10:15.598255 2020] [core:notice] [pid 1] AH00094: Command line: 'apache2 -D FOREGROUND'
searx-checker    | Sleep 64 seconds
searx            | Listen on 10.10.10.3:8080
searx            | [uWSGI] getting INI configuration from /etc/searx/uwsgi.ini
watcher          | db_watcher is up, starting Watcher.
watcher          | Performing system checks...
watcher          |
watcher          | System check identified no issues (0 silenced).
watcher          | October 15, 2020 - 17:10:18
watcher          | Django version 3.1.1, using settings 'watcher.settings'
watcher          | Starting development server at http://0.0.0.0:9002/
watcher          | Quit the server with CONTROL-C.
watcher          | [15/Oct/2020 17:10:32] "GET / HTTP/1.1" 400 143
watcher          | [15/Oct/2020 17:10:32] "GET /favicon.ico HTTP/1.1" 400 143

Ubuntu 18.04

Watcher does not respond to the configuration change for the host address in the .env file.

Describe the bug
Hello, I don't know if this is a general bug or a particular case, but when the Watcher containers start, the application server is initialized with the default address http://0.0.0.0:9002/.
However, the installation guide says this can be modified by placing the new address in the "ALLOWED_HOST=" entry of the .env file and then running the commands:
docker-compose down
docker-compose up
After performing these actions, the Watcher server still comes up with the default address.
In the same way, the changes are not applied when the mail server is configured.

Screenshots
Address after changing the host address in the .env config file: (screenshots attached)
Docker version on which it runs: (screenshot attached)

Desktop (please complete the following information):

  • OS: CentOS 8
  • Browser Firefox
  • Version CentOS Linux release 8.4.2105

I would appreciate any recommendation to solve this problem.
Thanks

Scrape error Watcher

I am getting this output when Watcher is scraping for data leaks.

watcher | 2021-04-19 22:50:08.063406 - HTTPSConnectionPool(host='10.10.10.3', port=8080): Max retries exceeded with url: /search?q=%22xxxxx%22&engines=gitlab%2Cgithub%2Cbitbucket%2Capkmirror%2Cgentoo%2Cnpm%2Cstackoverflow%2Choogle&format=json (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1125)')))

[Feature Request] monitor domains and get email account data breaches (pwned) with h8mail or HIBP

Hello,

I was looking at Watcher's functionality and it really works very well. I thought it would be a great addition to be able to monitor domains or email accounts that have been pwned or that appear in a data-breach list, for example by integrating with h8mail or directly with HIBP. It would be quite interesting for those who want to carry out this type of monitoring.

Regards,
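
A sketch of the suggested HIBP side of this (not part of Watcher today): query the Have I Been Pwned v3 API for a single account. An HIBP API key is required, and the endpoint returns 404 when the account is not found in any breach.

# Sketch of a possible HIBP integration; the API key value is a placeholder.
import requests

HIBP_API_KEY = "..."  # obtained from haveibeenpwned.com

def breaches_for(account):
    """Return the names of breaches an email account appears in (empty list if none)."""
    resp = requests.get(
        f"https://haveibeenpwned.com/api/v3/breachedaccount/{account}",
        headers={"hibp-api-key": HIBP_API_KEY, "user-agent": "watcher-poc"},
        timeout=30,
    )
    if resp.status_code == 404:      # HIBP returns 404 when the account is not found
        return []
    resp.raise_for_status()
    return [breach["Name"] for breach in resp.json()]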

Questions about watcher

I have some questions.
On the homepage, how should I configure it so that it looks the same as in your picture?
Thanks

Twitter feeds are no longer refreshed using the latest Docker build

The Twitter Feeds are no longer refreshed using the latest docker build.

Error entry from the rss-bridge container's log:

[Mon Jan 10 16:31:07.478972 2022] [php7:notice] [pid 22] [client x.x.x.x:xxxxx] Exception: Could not parse guest token in /app/lib/error.php:24\nStack trace:\n#0 /app/lib/error.php(42): returnError()\n#1 /app/bridges/TwitterBridge.php(551): returnServerError()\n#2 /app/bridges/TwitterBridge.php(584): TwitterBridge->getApiKey()\n#3 /app/bridges/TwitterBridge.php(241): TwitterBridge->getApiContents()\n#4 /app/actions/DisplayAction.php(135): TwitterBridge->collectData()\n#5 /app/index.php(38): DisplayAction->execute()\n#6 {main},
X.X.X.X - - [10/Jan/2022:16:31:06 +0100] "GET /?action=display&bridge=Twitter&context=By+username&u=CSAsingapore&norep=on&nopic=on&noimg=on&noimgscaling=on&format=Mrss HTTP/1.1" 200 1487 "-" "python-requests/2.26.0"

IPs are redacted.

After testing the Twitter feed at https://wtf.roflcopter.fr/rss-bridge/#bridge-Twitter it appears to be an rss-bridge issue, but I'm not 100% sure...
