Filebeat configuration with Elastalert rules
- docker run --restart=always -e TZ=Asia/Ho_Chi_Minh -p 5601:5601 -p 9200:9200 -p 5044:5044 -d --name elk sebp/elk:771
- curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.7.1-x86_64.rpm
- sudo rpm -vi filebeat-7.7.1-x86_64.rpm
- Edit /etc/filebeat/filebeat.yml to point the output to Logstash (see filebeat.yml uploaded in the git repo)
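The key filebeat.yml change is replacing the default Elasticsearch output with a Logstash output; a minimal sketch, assuming the ELK container runs on the same host and logs are gathered from /var/log/apache-sample/ (both the host and the path are assumptions, the repo's uploaded filebeat.yml is authoritative):

```
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/apache-sample/*.log   # assumed folder Filebeat watches

# Send events to Logstash instead of the default Elasticsearch output.
output.logstash:
  hosts: ["localhost:5044"]          # assumed host; port 5044 matches the docker run above
```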
Step 3. Configure Logstash on the ELK container to accept requests from Filebeat, and set up a grok pattern for the Apache log file.
- Exec into the ELK Docker container ("docker exec -it elk /bin/bash")
- Edit /etc/logstash/conf.d/02-beats-input.conf to add a grok pattern that parses the Apache log file (see 02-beats-input.conf uploaded in the git repo)
- Edit /etc/logstash/conf.d/30-output.conf so Logstash outputs to Elasticsearch (see 30-output.conf uploaded in the git repo)
- Restart the container to apply the configuration
- Start Filebeat from Step 2
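The two Logstash files can be sketched as follows; the grok pattern and index name are assumptions for a standard Apache access log (the files uploaded in the git repo are authoritative):

```
# 02-beats-input.conf -- accept Beats traffic and parse Apache lines (sketch)
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    # COMBINEDAPACHELOG covers the common Apache access-log format
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

# 30-output.conf -- ship parsed events to Elasticsearch (sketch)
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"   # assumed index naming
  }
}
```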
- Install python3 and pip3
- pip3 install elastalert
- Create a Slack channel, create an incoming webhook, and add it to the channel
- Edit config.yaml so Elastalert points to the ES node (see config.yaml uploaded in the git repo)
- Create rule files in rules_folder with the Slack webhook URL (see rules_folder uploaded in the git repo)
- Change directory to the elastalert directory (containing the config file and rules_folder)
- Start elastalert "python3 -m elastalert.elastalert --verbose"
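A sketch of the two Elastalert files; the option names below are standard Elastalert settings, but the thresholds, rule name, and webhook URL are placeholders, not the repo's actual values:

```
# config.yaml (sketch) -- point Elastalert at the ES node
es_host: localhost
es_port: 9200
rules_folder: rules_folder
run_every:
  minutes: 1
buffer_time:
  minutes: 15
writeback_index: elastalert_status

# rules_folder/frequency_rule.yaml (sketch, hypothetical rule)
name: apache-500-spike
type: frequency
index: filebeat-*
num_events: 50            # placeholder threshold
timeframe:
  minutes: 5
filter:
- term:
    response: "500"       # assumes the grok pattern extracted a "response" field
alert:
- slack
slack_webhook_url: "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder
```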
- wget https://devopsplayground.blob.core.windows.net/playground/sample-access-log-20200515.log
- Split the sample log into smaller files: "split sample-access-log-20200515.log split.log" (split defaults to 1000 lines per output file)
- Put the split log files into the folder Filebeat watches; Filebeat ships them to Logstash
- Check the Slack channel for alerts
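split's default chunk size is 1000 lines, which is what the step above relies on; a quick local check of that behavior (sample.log here is a generated stand-in, not the downloaded access log):

```shell
# Generate a 2500-line stand-in for the access log.
seq 2500 > sample.log

# Default split: 1000 lines per chunk, named split.log.aa, split.log.ab, split.log.ac
split sample.log split.log.

# Show the chunk sizes: 1000, 1000, 500
wc -l split.log.*
```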
Exported Kibana dashboard: export.ndjson (uploaded in the git repo)