debricked / dmarc-visualizer
Analyse and visualize DMARC results using open-source tools
License: Apache License 2.0
Everything used to work fine. After redeploying on another server, we noticed that the IMAP email account connection no longer works.
docker logs dmarc-visualizer_parsedmarc_1
lists many occurrences of the following error:
ModuleNotFoundError: No module named 'msgraph'
Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 5, in <module>
from parsedmarc.cli import _main
File "/usr/local/lib/python3.9/site-packages/parsedmarc/__init__.py", line 31, in <module>
from parsedmarc.mail import MailboxConnection
File "/usr/local/lib/python3.9/site-packages/parsedmarc/mail/__init__.py", line 2, in <module>
from parsedmarc.mail.graph import MSGraphConnection
File "/usr/local/lib/python3.9/site-packages/parsedmarc/mail/graph.py", line 10, in <module>
from msgraph.core import GraphClient
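A workaround that appears later in this thread is to pin an older msgraph-core in parsedmarc/Dockerfile, since newer releases dropped the `msgraph.core.GraphClient` import path that this parsedmarc version expects. A hedged sketch, based on the Dockerfile posted further down (the exact pin that works may depend on your parsedmarc version):

```dockerfile
FROM python:3.9-alpine3.16
RUN apk add --update --no-cache libxml2-dev libxslt-dev
# Pin msgraph-core below 1.0.0 so `from msgraph.core import GraphClient`
# keeps working; newer msgraph-core restructured the package.
RUN apk add --update --no-cache --virtual .build_deps build-base libffi-dev \
    && pip install parsedmarc 'msgraph-core<1.0.0' \
    && apk del .build_deps
COPY parsedmarc.ini /
CMD ["parsedmarc"]
```

Rebuild the image afterwards (docker-compose build parsedmarc) so the pin takes effect.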
Can anyone share a working Gmail API example? Everything was working fine with local files, but I want to move to reading over the API instead.
I've currently set:
command: parsedmarc -c /parsedmarc.ini /input/* --verbose --debug
added
COPY oauth.json /
to parsedmarc/Dockerfile, and added
[gmail_api]
credentials_file = /oauth.json
include_spam_trash = False
But when parsedmarc boots, it just hangs, apparently waiting. I presume it's waiting for a user-driven authentication process, but I can't work out how to access it. What am I missing? I also tried using a service-account JSON file, but that failed too.
Problems I see / my guesstimate:
If anyone has this working I'd appreciate a nudge in the right direction!
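For what it's worth, a hang on first boot is consistent with an OAuth consent flow waiting for interactive input. One approach, assuming your parsedmarc version's gmail_api section supports a token_file option (check the parsedmarc docs for your release), is to complete the flow once interactively and persist the resulting token somewhere a volume keeps it:

```ini
[gmail_api]
credentials_file = /oauth.json
; Assumed option: persist the OAuth token under the mounted /output
; volume so the consent flow only runs once, not on every boot.
token_file = /output/gmail_token.json
include_spam_trash = False
```

Running the container once in the foreground (docker-compose run parsedmarc) should let you see and complete the consent prompt.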
Maybe I am missing something, but every time I restart this, I get an empty database.
Where do I have to add a volume so that I don't lose my data?
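Elasticsearch keeps its indexes under /usr/share/elasticsearch/data inside the container, so a bind mount there (as used in compose files elsewhere in this thread) survives restarts. A minimal sketch of the relevant fragment:

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.5
    environment:
      - discovery.type=single-node
    volumes:
      # Bind-mount the data directory so indexes survive container restarts
      - ./elastic_data:/usr/share/elasticsearch/data
```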
Hi, I just spun up a new VM (Ubuntu 20.04) and installed Docker 20.10.
I pulled this image, and on first start I ran into problems: Grafana is up, but without dashboards or data.
In the log I see:
parsedmarc_1 | urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f49ef6c3890>: Failed to establish a new connection: [Errno 111] Connection refused
parsedmarc_1 | elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7ffb51ceba90>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7ffb51ceba90>: Failed to establish a new connection: [Errno 111] Connection refused)
parsedmarc_1 | FileNotFoundError: [Errno 2] No such file or directory: '/output/aggregate.json'
and somewhere I saw an error saying it got permission denied for the dashboard file.
Where am I going wrong? I did everything as in the tutorial.
Looks like this repo is abandoned (at least for now), so here is my fork with fixes, provided as-is in case somebody needs them: https://github.com/Keramblock/dmarc-visualizer. Feel free to take them if you want.
For the author: I've created a PR (#38) in case you want to merge these fixes. Also, feel free to close this issue once this repo is fixed. Thank you for your work anyway!
I'm starting my journey into SPF/DKIM/DMARC and have installed this dmarc-visualizer - really cool project, by the way.
I'm seeing a discrepancy in this early stage and know it will most likely diverge even further as I collect more data. Looking at the screenshot below, why is it that the Total Message Count doesn't agree with the other highlighted areas?
Total Message Count is showing 40 but all others agree at 34 - what happened to the other 6?
So I have been using this for a while, through all of the fixes and updates.
My method of usage is to spin up the Docker stack, let it update from my DMARC email inbox, take a look to see if there are any issues, and then shut it all down.
It may stay down for a month or so; then I spin it back up to check on things.
I'm realizing that when I spin up the Docker stack, it only shows the NEW reports found in my email inbox.
The aggregate.csv file in the output folder is being updated and contains all emails going back to 2020, but they're not displayed in Grafana. I've double-checked, and Grafana is set to show data from the last 5 years.
When parsedmarc grabs reports from the email inbox, it processes them and, I assume, saves them in the aggregate CSV and JSON files.
Are those files read by Elasticsearch or Grafana? Or does it need to read the reports from the email, and once it's shut down, does it lose them from its current inventory?
Hello.
Could you create some meaningful documentation?
If you want what you offer to be a product, this is one big gap: right now, only the maintainers know exactly how it works.
Have a nice day.
The parsedmarc container is continuously throwing this exception and then restarting shortly afterwards.
parsedmarc_1 | elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7a72bff966a0>: Failed to establish a new connection: [Errno -3] Try again) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7a72bff966a0>: Failed to establish a new connection: [Errno -3] Try again)
Also, in the Grafana dashboard, the following error pop-up is being thrown in the top right corner:
Templating
Template variable service failed Elasticsearch error: Bad Gateway
An error occurred in the parsedmarc container when I ran docker-compose up -d.
Are some settings missing?
I copied parsedmarc/parsedmarc.sample.ini to parsedmarc/parsedmarc.ini.
I also updated parsedmarc/Dockerfile to add 'msgraph-core<1.0.0' after parsedmarc, because ModuleNotFoundError: No module named 'msgraph' had happened.
Before:
RUN apk add --update --no-cache --virtual .build_deps build-base libffi-dev \
    && pip install parsedmarc \
After:
RUN apk add --update --no-cache --virtual .build_deps build-base libffi-dev \
    && pip install parsedmarc 'msgraph-core<1.0.0' \
docker-compose up -d
2024-02-01 11:46:14 INFO:cli.py:802:Starting parsedmarc
2024-02-01 11:46:14 Traceback (most recent call last):
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connection.py", line 174, in _new_conn
2024-02-01 11:46:14 conn = connection.create_connection(
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/util/connection.py", line 95, in create_connection
2024-02-01 11:46:14 raise err
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/util/connection.py", line 85, in create_connection
2024-02-01 11:46:14 sock.connect(sa)
2024-02-01 11:46:14 ConnectionRefusedError: [Errno 111] Connection refused
2024-02-01 11:46:14
2024-02-01 11:46:14 During handling of the above exception, another exception occurred:
2024-02-01 11:46:14
2024-02-01 11:46:14 Traceback (most recent call last):
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch/connection/http_urllib3.py", line 251, in perform_request
2024-02-01 11:46:14 response = self.pool.urlopen(
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 799, in urlopen
2024-02-01 11:46:14 retries = retries.increment(
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 525, in increment
2024-02-01 11:46:14 raise six.reraise(type(error), error, _stacktrace)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py", line 770, in reraise
2024-02-01 11:46:14 raise value
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 715, in urlopen
2024-02-01 11:46:14 httplib_response = self._make_request(
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 416, in _make_request
2024-02-01 11:46:14 conn.request(method, url, **httplib_request_kw)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connection.py", line 244, in request
2024-02-01 11:46:14 super(HTTPConnection, self).request(method, url, body=body, headers=headers)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/http/client.py", line 1285, in request
2024-02-01 11:46:14 self._send_request(method, url, body, headers, encode_chunked)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/http/client.py", line 1331, in _send_request
2024-02-01 11:46:14 self.endheaders(body, encode_chunked=encode_chunked)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/http/client.py", line 1280, in endheaders
2024-02-01 11:46:14 self._send_output(message_body, encode_chunked=encode_chunked)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/http/client.py", line 1040, in _send_output
2024-02-01 11:46:14 self.send(msg)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/http/client.py", line 980, in send
2024-02-01 11:46:14 self.connect()
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connection.py", line 205, in connect
2024-02-01 11:46:14 conn = self._new_conn()
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/urllib3/connection.py", line 186, in _new_conn
2024-02-01 11:46:14 raise NewConnectionError(
2024-02-01 11:46:14 urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f1a311b4070>: Failed to establish a new connection: [Errno 111] Connection refused
2024-02-01 11:46:14
2024-02-01 11:46:14 During handling of the above exception, another exception occurred:
2024-02-01 11:46:14
2024-02-01 11:46:14 Traceback (most recent call last):
2024-02-01 11:46:14 File "/usr/local/bin/parsedmarc", line 8, in
2024-02-01 11:46:14 sys.exit(_main())
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/parsedmarc/cli.py", line 821, in _main
2024-02-01 11:46:14 elastic.migrate_indexes(aggregate_indexes=[es_aggregate_index],
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/parsedmarc/elastic.py", line 241, in migrate_indexes
2024-02-01 11:46:14 if not Index(aggregate_index_name).exists():
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch_dsl/index.py", line 414, in exists
2024-02-01 11:46:14 return self._get_connection(using).indices.exists(index=self._name, **kwargs)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch/client/utils.py", line 168, in _wrapped
2024-02-01 11:46:14 return func(*args, params=params, headers=headers, **kwargs)
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch/client/indices.py", line 332, in exists
2024-02-01 11:46:14 return self.transport.perform_request(
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch/transport.py", line 413, in perform_request
2024-02-01 11:46:14 raise e
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch/transport.py", line 381, in perform_request
2024-02-01 11:46:14 status, headers_response, data = connection.perform_request(
2024-02-01 11:46:14 File "/usr/local/lib/python3.9/site-packages/elasticsearch/connection/http_urllib3.py", line 266, in perform_request
2024-02-01 11:46:14 raise ConnectionError("N/A", str(e), e)
2024-02-01 11:46:14 elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7f1a311b4070>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7f1a311b4070>: Failed to establish a new connection: [Errno 111] Connection refused)
I'm trying to spin up this Docker Compose file, but I get this error:
# docker-compose up
Pulling elasticsearch (docker.elastic.co/elasticsearch/elasticsearch:7.9.1)...
7.9.1: Pulling from elasticsearch/elasticsearch
f1feca467797: Pull complete
dcfca94e7428: Pull complete
d2bf8b28bdf5: Pull complete
5efd10fdc328: Pull complete
71948c71bf56: Pull complete
3d79fd8021d0: Pull complete
3561742200e5: Pull complete
2811408f56d0: Pull complete
cb5a557b51ee: Pull complete
Digest: sha256:0a5308431aee029636858a6efe07e409fa699b02549a78d7904eb931b8c46920
Status: Downloaded newer image for docker.elastic.co/elasticsearch/elasticsearch:7.9.1
Building parsedmarc
ERROR: Cannot locate specified Dockerfile: Dockerfile
When attempting to run docker compose, I got this error. Is something perhaps out of date?
=> CANCELED [internal] load metadata for docker.io/grafana/grafana:8.5.4 0.1s
failed to solve: failed to compute cache key: failed to calculate checksum of ref moby::a42aczzth8djhp41cdvyx253y: "/parsedmarc.ini": not found
Is there a way to get an example docker-compose.yml file with persistent volumes? Every time I do a docker-compose down, all my data is gone.
Thanks
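One hedged sketch: use a named volume instead of a bind mount, since named volumes survive docker-compose down (they are only deleted if you pass -v). The service definition mirrors the one used elsewhere in this thread; the volume name esdata is arbitrary:

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.5
    environment:
      - discovery.type=single-node
    volumes:
      # Named volume: survives `docker-compose down` (removed only with -v)
      - esdata:/usr/share/elasticsearch/data

volumes:
  esdata:
```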
When installing dmarc-visualizer, I get the following error.
Error: Service 'parsedmarc' failed to build: COPY failed: file not found in the build context or excluded by .dockerignore: stat parsedmarc.ini: file does not exist
Installed on vanilla ubuntu 20.04. Here is the video I followed: https://www.youtube.com/watch?v=a9CUTfzADLQ
Instructions:
Run each of these lines one at a time
########### Installing Docker on Ubuntu
sudo apt update
sudo apt install apt-transport-https ca-certificates curl software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo apt-key fingerprint 0EBFCD88
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
sudo apt update
apt-cache policy docker-ce
sudo apt install docker-ce
sudo systemctl status docker
############ Install Docker compose
sudo curl -L "https://github.com/docker/compose/releases/download/1.26.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker-compose --version
############ Running Hello World
sudo docker run hello-world
############ Running the YML file
sudo docker-compose up -d
sudo docker-compose up
Just re-built now, and in the browser UI it no longer renders stats. Instead there's a pop-up error saying:
⚠️ Templating [fromdomain]
Error updating options: Support for Elasticsearch versions after their end-of-life (currently versions < 7.10) was removed.
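This Grafana message means the Elasticsearch data-source plugin no longer talks to clusters older than 7.10. Assuming the compose file still pins 7.9.1 (as it does earlier in this thread), one fix is to bump the image tag, for example to the 7.17.5 tag other posts here use:

```yaml
services:
  elasticsearch:
    # Grafana removed support for Elasticsearch < 7.10; use a newer 7.x tag
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.5
```

Note that jumping major versions (7.x to 8.x) may require reindexing, so staying on a late 7.x tag is the smaller change.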
Aggregate report data being sent by zoho.com is failing to parse with error (filename shortened):
ERROR:cli.py:560:Failed to parse /input/21062413zoho.com.xml.gz - Not a valid aggregate or forensic report
Differences from a working file:
My question: what format standard is required?
Hi,
As I attempt to configure the dmarc-visualizer on my server, parsedmarc container restarts because of
elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7c92fbbe6100>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7c92fbbe6100>: Failed to establish a new connection: [Errno 111] Connection refused)
INFO:cli.py:1031:Starting parsedmarc
/usr/local/lib/python3.9/site-packages/elasticsearch/connection/base.py:208: ElasticsearchWarning: Elasticsearch built-in security features are not enabled. Without authentication, your cluster could be accessible to anyone. See https://www.elastic.co/guide/en/elasticsearch/reference/7.17/security-minimal-setup.html to enable security.
version: '3.5'
services:
  parsedmarc:
    build: ./parsedmarc/
    volumes:
      - ./files:/input:ro
      - ./output_files:/output
    command: parsedmarc -c /parsedmarc.ini /input/* --debug
    depends_on:
      - elasticsearch
    restart: on-failure
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.5
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
    ports:
      - 9200:9200
    volumes:
      - ./elastic_data:/var/lib/elasticsearch/data
  grafana:
    build: ./grafana/
    ports:
      - 3000:3000
    user: root
    environment:
      GF_INSTALL_PLUGINS: grafana-piechart-panel,grafana-worldmap-panel
      GF_AUTH_ANONYMOUS_ENABLED: 'true'
Would you kindly assist me in fixing the problem?
Error
ModuleNotFoundError: No module named 'msgraph'
Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 5, in <module>
from parsedmarc.cli import _main
File "/usr/local/lib/python3.9/site-packages/parsedmarc/__init__.py", line 31, in <module>
from parsedmarc.mail import MailboxConnection
File "/usr/local/lib/python3.9/site-packages/parsedmarc/mail/__init__.py", line 2, in <module>
from parsedmarc.mail.graph import MSGraphConnection
File "/usr/local/lib/python3.9/site-packages/parsedmarc/mail/graph.py", line 10, in <module>
from msgraph.core import GraphClient
ModuleNotFoundError: No module named 'msgraph'
I got this error even after changing the Dockerfile to:
RUN apk add --update --no-cache --virtual .build_deps build-base libffi-dev \
&& pip install parsedmarc 'msgraph-core<1.0.0' \
docker ps | grep dmarc
9f4c981247a7 d715d60fffa8 "parsedmarc -c /pars…" 58 minutes ago Restarting (1) 44 seconds ago dmarc_parsedmarc
ed59739f918d docker.elastic.co/elasticsearch/elasticsearch:7.17.5 "/bin/tini -- /usr/l…" 58 minutes ago Up 54 minutes 9200/tcp, 9300/tcp dmarc_elasticsearch
90bfc2b8a257 7ba0f5152237 "/run.sh" About an hour ago Up About an hour 0.0.0.0:3000->3000/tcp, :::3000->3000/tcp dmarc_grafana
docker-compose file
version: '3.5'
services:
  parsedmarc:
    container_name: dmarc_parsedmarc
    build: ./parsedmarc/
    volumes:
      - ./files:/input:ro
      - ./output_files:/output
    command: parsedmarc -c /parsedmarc.ini /input/* --debug
    depends_on:
      - elasticsearch
    restart: always
  elasticsearch:
    container_name: dmarc_elasticsearch
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.5
    environment:
      - discovery.type=single-node
    volumes:
      - ./elastic_data:/usr/share/elasticsearch/data
    restart: always
  grafana:
    container_name: dmarc_grafana
    build: ./grafana/
    ports:
      - 3000:3000
    user: root
    environment:
      GF_INSTALL_PLUGINS: grafana-piechart-panel,grafana-worldmap-panel
      GF_AUTH_ANONYMOUS_ENABLED: 'true'
    restart: always
parsedmarc/Dockerfile
# Use the official Python 3.9 Alpine image
FROM python:3.9-alpine3.16
# Install system dependencies
RUN apk add --update --no-cache libxml2-dev libxslt-dev
# Install build dependencies
RUN apk add --update --no-cache --virtual .build_deps build-base libffi-dev \
&& pip install parsedmarc msgraph-core==0.2.2 \
&& apk del .build_deps
# Copy configuration file
COPY parsedmarc.ini /
# Command to run the parsedmarc tool
CMD ["parsedmarc"]
Hello,
I'm getting an error when I try to parse my email.
I want to watch my DMARC reports directly from my mailbox.
My email account is on outlook.office365.com. I cannot watch or parse my email when using either IMAP or Microsoft Graph.
It says that I have a login error in Python.
I don't know if I did something wrong.
Can someone explain how to set this up correctly, please?
I just wanted to point out a slight change in docker-compose.yml to have parsedmarc restart on failure. Since Elasticsearch may take a while until it's up, parsedmarc can then retry a few times.
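The change amounts to this fragment. depends_on only orders container startup and does not wait for Elasticsearch to actually accept connections, so restart: on-failure lets parsedmarc retry until it succeeds:

```yaml
services:
  parsedmarc:
    depends_on:
      - elasticsearch
    # depends_on only orders startup; on-failure retries until ES is ready
    restart: on-failure
```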
I can't seem to get this running in my instance after uncommenting the appropriate line in parsedmarc/Dockerfile, downloading the GeoLite2-Country.mmdb file and placing it beside parsedmarc.ini (I also tried the top level).
ERROR:
WARNING:utils.py:296:GeoLite2-Country.mmdb is missing. Please follow the instructions at https://dev.maxmind.com/geoip/geoipupdate/ to get the latest version.
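For reference, the Dockerfile line to uncomment (it appears commented out in the Dockerfile quoted elsewhere in this thread) copies the database into the path parsedmarc checks. The .mmdb file must sit beside the Dockerfile, since COPY can only see the build context, and the image must be rebuilt afterwards (docker-compose build parsedmarc):

```dockerfile
# GeoLite2-Country.mmdb must be in the build context (beside this Dockerfile)
COPY GeoLite2-Country.mmdb /usr/share/GeoIP/GeoLite2-Country.mmdb
```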
Elasticsearch stops working after less than 20 seconds, so Grafana cannot load data.
Here is my docker-compose:
version: '3.5'
services:
  parsedmarc:
    build: ./parsedmarc/
    volumes:
      - ./files:/input:ro
      - ./output_files:/output
    command: parsedmarc -c /parsedmarc.ini /input/* --debug
    depends_on:
      - elasticsearch
    restart: on-failure
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.5
    environment:
      - discovery.type=single-node
    volumes:
      - ./elastic_data:/usr/share/elasticsearch/data
  grafana:
    build: ./grafana/
    ports:
      - 3000:3000
    user: root
    environment:
      GF_INSTALL_PLUGINS: grafana-piechart-panel,grafana-worldmap-panel
      GF_AUTH_ANONYMOUS_ENABLED: 'true'
And here is my parsedmarc.ini:
[general]
save_aggregate = True
save_forensic = True
output = /output/
[elasticsearch]
hosts = elasticsearch:9200
ssl = False
So, I just found out after hours of troubleshooting that you cannot write anything to the root folders in Big Sur. I even tried turning off SIP, and I still got "Read-only file system".
So the line "COPY GeoLite2-Country.mmdb /usr/share/GeoIP/GeoLite2-Country.mmdb" will not run.
I believe moving the GeoLite installation to the /usr/local/share folder would correct the problem.
Hi,
is there any quick fix for:
Failed to establish a new connection: [Errno 111] Connection refused
What do I need to change, and where?
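In this thread, the usual causes are parsedmarc pointing at the wrong Elasticsearch host, or parsedmarc starting before Elasticsearch is ready. The relevant setting is the [elasticsearch] section of parsedmarc.ini, as used in configs posted elsewhere in this thread:

```ini
[elasticsearch]
; Use the compose service name, not localhost: each container
; has its own network namespace.
hosts = elasticsearch:9200
ssl = False
```

If the host is already correct, adding restart: on-failure to the parsedmarc service in docker-compose.yml lets it retry until Elasticsearch finishes booting.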
docker-compose up
Creating network "dmarcvisualizer_default" with the default driver
Creating dmarcvisualizer_grafana_1 ...
Creating dmarcvisualizer_elasticsearch_1 ...
Creating dmarcvisualizer_elasticsearch_1
Creating dmarcvisualizer_grafana_1 ... done
Creating dmarcvisualizer_parsedmarc_1 ...
Creating dmarcvisualizer_parsedmarc_1 ... done
Attaching to dmarcvisualizer_elasticsearch_1, dmarcvisualizer_grafana_1, dmarcvisualizer_parsedmarc_1
grafana_1 | ✔ Downloaded grafana-piechart-panel v1.6.2 zip successfully
grafana_1 |
grafana_1 | Please restart Grafana after installing plugins. Refer to Grafana documentation for instructions if necessary.
grafana_1 |
parsedmarc_1 | Traceback (most recent call last):
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 174, in _new_conn
parsedmarc_1 | conn = connection.create_connection(
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/util/connection.py", line 95, in create_connection
parsedmarc_1 | raise err
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
parsedmarc_1 | sock.connect(sa)
parsedmarc_1 | ConnectionRefusedError: [Errno 111] Connection refused
parsedmarc_1 |
parsedmarc_1 | During handling of the above exception, another exception occurred:
parsedmarc_1 |
parsedmarc_1 | Traceback (most recent call last):
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/connection/http_urllib3.py", line 255, in perform_request
parsedmarc_1 | response = self.pool.urlopen(
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 785, in urlopen
parsedmarc_1 | retries = retries.increment(
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/util/retry.py", line 525, in increment
parsedmarc_1 | raise six.reraise(type(error), error, _stacktrace)
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/packages/six.py", line 770, in reraise
parsedmarc_1 | raise value
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 703, in urlopen
parsedmarc_1 | httplib_response = self._make_request(
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 398, in _make_request
parsedmarc_1 | conn.request(method, url, **httplib_request_kw)
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 239, in request
parsedmarc_1 | super(HTTPConnection, self).request(method, url, body=body, headers=headers)
parsedmarc_1 | File "/usr/local/lib/python3.10/http/client.py", line 1282, in request
parsedmarc_1 | self._send_request(method, url, body, headers, encode_chunked)
parsedmarc_1 | File "/usr/local/lib/python3.10/http/client.py", line 1328, in _send_request
parsedmarc_1 | self.endheaders(body, encode_chunked=encode_chunked)
parsedmarc_1 | File "/usr/local/lib/python3.10/http/client.py", line 1277, in endheaders
parsedmarc_1 | self._send_output(message_body, encode_chunked=encode_chunked)
parsedmarc_1 | File "/usr/local/lib/python3.10/http/client.py", line 1037, in _send_output
parsedmarc_1 | self.send(msg)
parsedmarc_1 | File "/usr/local/lib/python3.10/http/client.py", line 975, in send
parsedmarc_1 | self.connect()
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 205, in connect
parsedmarc_1 | conn = self._new_conn()
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 186, in _new_conn
parsedmarc_1 | raise NewConnectionError(
parsedmarc_1 | urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f646bdfc8b0>: Failed to establish a new connection: [Errno 111] Connection refused
parsedmarc_1 |
parsedmarc_1 | During handling of the above exception, another exception occurred:
parsedmarc_1 |
parsedmarc_1 | Traceback (most recent call last):
parsedmarc_1 | File "/usr/local/bin/parsedmarc", line 8, in
parsedmarc_1 | sys.exit(_main())
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/parsedmarc/cli.py", line 684, in _main
parsedmarc_1 | elastic.migrate_indexes(aggregate_indexes=[es_aggregate_index],
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/parsedmarc/elastic.py", line 244, in migrate_indexes
parsedmarc_1 | if not Index(aggregate_index_name).exists():
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch_dsl/index.py", line 414, in exists
parsedmarc_1 | return self._get_connection(using).indices.exists(index=self._name, **kwargs)
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/client/utils.py", line 347, in _wrapped
parsedmarc_1 | return func(*args, params=params, headers=headers, **kwargs)
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/client/indices.py", line 371, in exists
parsedmarc_1 | return self.transport.perform_request(
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/transport.py", line 417, in perform_request
parsedmarc_1 | self._do_verify_elasticsearch(headers=headers, timeout=timeout)
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/transport.py", line 606, in _do_verify_elasticsearch
parsedmarc_1 | raise error
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/transport.py", line 569, in _do_verify_elasticsearch
parsedmarc_1 | _, info_headers, info_response = conn.perform_request(
parsedmarc_1 | File "/usr/local/lib/python3.10/site-packages/elasticsearch/connection/http_urllib3.py", line 280, in perform_request
parsedmarc_1 | raise ConnectionError("N/A", str(e), e)
parsedmarc_1 | elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7f646bdfc8b0>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7f646bdfc8b0>: Failed to establish a new connection: [Errno 111] Connection refused)
It starts and then stops, with the following line in the log:
No log line matching the '' filter
What causes this, and how can I resolve it?
Hello, I started using this project recently and just noticed that the blog you link to for instructions is no longer available, 404 error.
parsedmarc/Dockerfile
FROM python:alpine
RUN apk add build-base libffi-dev libxml2-dev libxslt-dev \
&& pip install parsedmarc
COPY parsedmarc.ini /
#COPY GeoLite2-Country.mmdb /usr/share/GeoIP/GeoLite2-Country.mmdb
First, thanks very much for this project.
The parsedmarc component fails to build on initial docker run with the following error:
docker-compose up -d
Building with native build. Learn about native build in Compose here: https://docs.docker.com/go/compose-native-build/
Building parsedmarc
Sending build context to Docker daemon 4.081MB
Step 1/4 : FROM python:alpine
---> 2d64a2341b7c
Step 2/4 : RUN apk add build-base libxml2-dev libxslt-dev && pip install parsedmarc
---> Running in cf9e4635c3b0
fetch https://dl-cdn.alpinelinux.org/alpine/v3.13/main/x86_64/APKINDEX.tar.gz
fetch https://dl-cdn.alpinelinux.org/alpine/v3.13/community/x86_64/APKINDEX.tar.gz
**ERROR: https://dl-cdn.alpinelinux.org/alpine/v3.13/main: temporary error (try again later)
WARNING: Ignoring https://dl-cdn.alpinelinux.org/alpine/v3.13/main: No such file or directory
ERROR: https://dl-cdn.alpinelinux.org/alpine/v3.13/community: temporary error (try again later)
WARNING: Ignoring https://dl-cdn.alpinelinux.org/alpine/v3.13/community: No such file or directory**
ERROR: unable to select packages:
build-base (no such package):
required by: world[build-base]
libxml2-dev (no such package):
required by: world[libxml2-dev]
libxslt-dev (no such package):
required by: world[libxslt-dev]
The command '/bin/sh -c apk add build-base libxml2-dev libxslt-dev && pip install parsedmarc' returned a non-zero code: 3
ERROR: Service 'parsedmarc' failed to build
I assume the reason for this is that the docker container is having some internet access issue. I've tried setting:
network_mode: host
for the parsedmarc service in the docker-compose file, but it made no difference. Also, the machine Docker is running on can reach the two alpinelinux.org locations without issue in a browser, and I have a number of other Docker containers running normally.
Any help is appreciated.
Thanks, Robby
After the three containers are started, I get the following message:
parsedmarc_1 | ERROR:cli.py:696:IMAP Error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate (_ssl.c:1131) dmarc-visualizer_parsedmarc_1 exited with code 1
That is weird, because my mail server has no self-signed certificate, but a genuine one which expires sometime in late October.
My parsedmarc.ini settings are:
[general]
save_aggregate = true
save_forensic = true
output = /output/
[imap]
host = mailserver.domain.tld
port = 993
user = [email protected]
password = password
watch = true
reports_folder = INBOX
[elasticsearch]
hosts = elasticsearch:9200
ssl = false
Setting ssl = False and skip_certificate_verification = True makes no difference.
Any reason why parsedmarc does not start and gives this error?
Running this on CentOS 8, switched SE Linux to permissive just in case, running firewalld.
parsedmarc keeps dying. I see it in the docker ps output, but it dies a few seconds after restarting. I get this in the logs:
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/bin/parsedmarc", line 8, in
sys.exit(_main())
File "/usr/local/lib/python3.9/site-packages/parsedmarc/cli.py", line 502, in _main
elastic.migrate_indexes(aggregate_indexes=[es_aggregate_index],
File "/usr/local/lib/python3.9/site-packages/parsedmarc/elastic.py", line 244, in migrate_indexes
if not Index(aggregate_index_name).exists():
File "/usr/local/lib/python3.9/site-packages/elasticsearch_dsl/index.py", line 414, in exists
return self._get_connection(using).indices.exists(index=self._name, **kwargs)
File "/usr/local/lib/python3.9/site-packages/elasticsearch/client/utils.py", line 153, in _wrapped
return func(*args, params=params, headers=headers, **kwargs)
File "/usr/local/lib/python3.9/site-packages/elasticsearch/client/indices.py", line 332, in exists
return self.transport.perform_request(
File "/usr/local/lib/python3.9/site-packages/elasticsearch/transport.py", line 413, in perform_request
raise e
File "/usr/local/lib/python3.9/site-packages/elasticsearch/transport.py", line 381, in perform_request
status, headers_response, data = connection.perform_request(
File "/usr/local/lib/python3.9/site-packages/elasticsearch/connection/http_urllib3.py", line 264, in perform_request
raise ConnectionError("N/A", str(e), e)
elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7fa2d6ca5610>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7fa2d6ca5610>: Failed to establish a new connection: [Errno 111] Connection refused)
Elasticsearch itself looks OK; it's up and responding.
I read the blog post and tried moving to an older version of Elasticsearch, but no luck.
If it matters, this is running in a VM, and the only other thing on it is nginx as a reverse proxy for HTTP, with auth and HTTPS.
Any advice would be appreciated.
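A "Connection refused" at startup usually means parsedmarc raced ahead of Elasticsearch: the container came up and tried to migrate indexes before Elasticsearch was listening. One way to serialise the startup (a sketch only; the service names and healthcheck endpoint are assumptions based on the stock docker-compose.yml, not a verified fix) is to gate parsedmarc on an Elasticsearch healthcheck:

```yaml
# docker-compose.yml fragment (service names are assumptions)
services:
  elasticsearch:
    healthcheck:
      # consider the cluster up once the HTTP API answers
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 10s
      retries: 12
  parsedmarc:
    depends_on:
      elasticsearch:
        condition: service_healthy
```

Note that `depends_on` with `condition: service_healthy` requires a Compose version that supports the Compose Specification (e.g. Compose v2); legacy version-3 files ignore it.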
I tried to use this, but I get the following error and it never parses any data.
parsedmarc_1 | ERROR:cli.py:615:IMAP Error: open() takes 3 positional arguments but 4 were given
dmarc-visualizer_parsedmarc_1 exited with code 1
0it [00:00, ?it/s]
(the same three log lines repeat as the container restarts)
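For what it's worth, a message like "open() takes 3 positional arguments but 4 were given" coming out of parsedmarc's IMAP code often points at a version mismatch between parsedmarc and the mail libraries baked into the image, rather than at the configuration. A possible direction (purely a sketch; the base image and the idea that upgrading helps are assumptions, not a verified fix) is to rebuild the image so parsedmarc and its mail dependencies are resolved together at compatible versions:

```dockerfile
# parsedmarc/Dockerfile (sketch; base image is an assumption)
FROM python:3.9-slim
# install a current parsedmarc so its IMAP/mail dependencies
# are pulled in at matching versions
RUN pip install --no-cache-dir --upgrade parsedmarc
```

Then rebuild without the cache (`docker compose build --no-cache parsedmarc`) so the old layers are not reused.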
I can see the Grafana interface in the browser; however, I can't see any statistics. I suspect my Elasticsearch and parsedmarc containers are not talking to each other. I got this error:
elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7fac6e587dc0>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7fac6e587dc0>: Failed to establish a new connection: [Errno 111] Connection refused)
Any idea how to resolve this? Thanks.
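One thing worth double-checking is the `hosts` value in parsedmarc.ini: it must be reachable *from inside* the parsedmarc container, so it should name the Elasticsearch compose service rather than `localhost`. A sketch (assuming the service is called `elasticsearch`, as in the stock compose file):

```ini
[elasticsearch]
; "localhost" here would point at the parsedmarc container itself;
; use the compose service name instead
hosts = elasticsearch:9200
ssl = False
```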
When I bring the stack up (docker compose up), this error appears:
parsedmarc_1 |
parsedmarc_1 | During handling of the above exception, another exception occurred:
parsedmarc_1 |
parsedmarc_1 | Traceback (most recent call last):
parsedmarc_1 | File "/usr/local/bin/parsedmarc", line 8, in <module>
parsedmarc_1 | sys.exit(_main())
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/parsedmarc/cli.py", line 821, in _main
parsedmarc_1 | elastic.migrate_indexes(aggregate_indexes=[es_aggregate_index],
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/parsedmarc/elastic.py", line 241, in migrate_indexes
parsedmarc_1 | if not Index(aggregate_index_name).exists():
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/elasticsearch_dsl/index.py", line 414, in exists
parsedmarc_1 | return self._get_connection(using).indices.exists(index=self._name, **kwargs)
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/elasticsearch/client/utils.py", line 168, in _wrapped
parsedmarc_1 | return func(*args, params=params, headers=headers, **kwargs)
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/elasticsearch/client/indices.py", line 332, in exists
parsedmarc_1 | return self.transport.perform_request(
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/elasticsearch/transport.py", line 413, in perform_request
parsedmarc_1 | raise e
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/elasticsearch/transport.py", line 381, in perform_request
parsedmarc_1 | status, headers_response, data = connection.perform_request(
parsedmarc_1 | File "/usr/local/lib/python3.9/site-packages/elasticsearch/connection/http_urllib3.py", line 266, in perform_request
parsedmarc_1 | raise ConnectionError("N/A", str(e), e)
parsedmarc_1 | elasticsearch.exceptions.ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7fb07b9cb880>: Failed to establish a new connection: [Errno -2] Name does not resolve) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7fb07b9cb880>: Failed to establish a new connection: [Errno -2] Name does not resolve)
And the dashboard shows "Elasticsearch error: Bad Gateway".
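"[Errno -2] Name does not resolve" is a DNS failure rather than a refused connection: inside the parsedmarc container, the hostname configured for Elasticsearch cannot be resolved at all. With Compose that usually means the two services are not on a shared network, or the hostname in parsedmarc.ini doesn't match the Elasticsearch service name. A minimal sketch (service and network names are assumptions):

```yaml
# docker-compose.yml fragment -- both services on one network so the
# service name "elasticsearch" resolves inside the parsedmarc container
services:
  elasticsearch:
    networks: [dmarc]
  parsedmarc:
    networks: [dmarc]
networks:
  dmarc: {}
```

(If no networks are declared at all, Compose puts all services of one file on a shared default network, so this mostly matters when the services come from different compose files or custom networks.)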
Hi there,
thank you so much for your blog post; it helped me a lot in getting my own DMARC visualiser up and running. I've created a fork at https://github.com/dwt/dmarc-visualizer where I've simplified the Dockerfiles to build fewer local containers, using bind mounts instead to get the config files into the containers. I've also added a Makefile to drive configuration and, in particular, to allow use with podman-compose for easy rootless deployment.
What do you think? I'd love to turn this into a pull request, or maybe get the parsedmarc maintainer to add it to his documentation.
Best Regards,
Martin Häcker
Hello, I'm trying to use IMAP to read emails directly from the inbox. I have changed the parsedmarc.ini file:
[general]
save_aggregate = True
save_forensic = True
output = /output/
[imap]
host = (my value)
port = 993
user = (my value)
password = (my value)
watch = True
reports_folder = INBOX
[elasticsearch]
hosts = elasticsearch:9200
ssl = False
Do I need to change anything else, maybe something in docker-compose.yml?
Thank you for your help!
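The [imap] section itself looks plausible. One thing that may also need changing is the container command: the stock setup runs parsedmarc against local files (`/input/*`), and with `watch = True` in the ini you would drop that file glob so parsedmarc idles on the mailbox instead. A sketch of the docker-compose.yml change (assuming the stock command mentioned earlier in this thread):

```yaml
# docker-compose.yml fragment (sketch)
services:
  parsedmarc:
    # no /input/* argument: with [imap] + watch=True, parsedmarc
    # reads reports from the mailbox instead of local files
    command: parsedmarc -c /parsedmarc.ini --verbose
```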