
Tango's Introduction

About

Tango is a set of scripts and Splunk apps that help organizations and users quickly and easily deploy honeypots and then view and analyze attacker sessions. Two scripts facilitate the installation of the honeypots and/or the Splunk Universal Forwarder. One script, uf_only.sh, installs the Splunk Universal Forwarder along with the necessary input and output configuration files. The other, sensor.sh, installs the Splunk Universal Forwarder together with the Cowrie honeypot required for the Tango Honeypot Intelligence app to work.

Version 2.0

Version 2.0 adds support for the Cowrie honeypot and updates the sensor forwarders to Splunk 6.3.0.

Before You Begin

There are a few things that should be noted before you install:

  • When you deploy the input app on a sensor, the app contacts ipv4.icanhazip.com to determine the sensor's external IP address. This information is used by the sensor management portion of the app. Feel free to remove this if you would rather not communicate with that site, but note that many of the "Sensor Management" fields will then be blank.
  • The Tango Honeypot Intelligence Splunk app is built to use JSON-formatted data from Cowrie by Michel Oosterhof, which can be found on his GitHub.
  • You will need to add your own VirusTotal API key to the Splunk app, which is configured in /opt/splunk/etc/apps/tango/bin/vt.py. The API key is free to obtain; follow the procedure on the VirusTotal website to receive one. Note that you are limited to 4 requests per minute; requests beyond that limit return no information. This pertains to the File Analysis section of the Tango Honeypot Intelligence app.
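Before deploying, you can run the same external-IP lookup by hand to confirm the sensor can reach the site (the `unavailable` fallback below is our own addition):

```shell
#!/bin/sh
# Query ipv4.icanhazip.com for this host's external IPv4 address -- the
# same lookup the Tango input app performs for Sensor Management.
external_ip=$(curl -s --max-time 10 ipv4.icanhazip.com 2>/dev/null)
external_ip=${external_ip:-unavailable}
echo "External IP: $external_ip"
```

If this prints `unavailable`, the sensor cannot reach the site and the corresponding "Sensor Management" fields will stay blank.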

Installation

Sensor Installation (Cowrie and Splunk Universal Forwarder)

This script has been tested on brand-new installs of Ubuntu 14.04 and CentOS 7 with no reported issues.

To get started, run the commands below and follow the prompts to enter the necessary input.

git clone https://github.com/aplura/Tango.git /tmp/tango; chmod +x /tmp/tango/sensor.sh
cd /tmp/tango/
./sensor.sh

There are some options you can change in /opt/cowrie/cowrie.cfg if you choose; however, some of them, such as changing the listening port from 2222, will break the forwarding of logs. There are also extra modules you can enable, such as MySQL or XMPP logging, and you can change the hostname of the honeypot.
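For example, a safe edit in /opt/cowrie/cowrie.cfg might look like this (option names vary by Cowrie version; compare against cowrie.cfg.dist before editing):

```ini
[honeypot]
# Changing the hostname presented to attackers is safe.
hostname = svr04

# Do NOT change the listening port from 2222: the Tango forwarder
# configuration expects it, and changing it breaks log forwarding.
```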

Cowrie is highly configurable, so if you wish to add extra commands or output modules, there are plenty of resources on GitHub and elsewhere that can help you do that.

The script will install the required packages based on the OS, then install Cowrie, and lastly install the Splunk Universal Forwarder.

Sensor Installation (Splunk UF Only)

If you already have Cowrie honeypots deployed and wish to start analyzing their logs in the Tango Honeypot Intelligence Splunk app, you can run the uf_only.sh script, which will install the Splunk UF on your host and configure the inputs and outputs necessary to start viewing your logs.

To get started, run the commands below and follow the prompts to enter the necessary input.

git clone https://github.com/aplura/Tango.git /tmp/tango; chmod +x /tmp/tango/uf_only.sh
cd /tmp/tango/
./uf_only.sh

Server Installation

In order to view the logs you are sending from Cowrie, you will need to install Splunk Enterprise on a server and install the Tango Honeypot Intelligence for Splunk app from this repo. There are plenty of guides on Splunk's website for getting Splunk Enterprise running; the basic gist of setting up a server is this:

  • Download Splunk Enterprise from Splunk
  • Copy the Tango Honeypot Intelligence for Splunk App into $SPLUNK_HOME/etc/apps/
  • Create a Splunk listener on port 9997 (it doesn't have to be 9997, but the scripts are configured to use that port, so if you change it, change it everywhere)
  • Add your VirusTotal API key to /opt/splunk/etc/apps/tango/bin/vt.py
  • Copy the requests library into the Tango app's bin directory, /opt/splunk/etc/apps/tango/bin/. Requests can be found on Kenneth Reitz's GitHub and is needed for the VirusTotal lookup.
  • Restart Splunk
  • You'll need to allow users to search the 'honeypot' index by default. To do this, go into “Settings”, then “Access Controls”, then “Roles”, “Admin”, then scroll all the way down to “Indexes Searched by Default”, then add honeypot to the right-hand column.
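For the listener step, the receiving port can also be enabled via configuration instead of the UI. A minimal sketch of the indexer-side inputs.conf (standard Splunk stanza syntax):

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf on the Splunk Enterprise server
[splunktcp://9997]
disabled = 0
```

Restart Splunk after adding the stanza so the listener comes up.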

Once in Splunk, you can start using the Tango app to analyze your Honeypot logs.

Tango Honeypot Intelligence for Splunk App

Now that you have your sensors and server running, you'll want to use the Tango Splunk App to analyze your logs and start identifying what the attackers are doing on your systems. Start by logging into Splunk and clicking on the "Tango Honeypot Intelligence App" on the left-hand side.

Once you enter the app, you'll be first taken to the "Attack Overview" portion of the app, which shows a broad overview of the attacks against your sensors. This includes Attempts vs. Successes, Latest Logins, Attackers logging into multiple locations, etc.

You'll notice at the top of the app, in the navigation pane, there are multiple categories of reports available to you, which include:

  • Attack Analysis
  • File Analysis
  • Network Analysis
  • Sensor Management
  • Threat Feed

Below we will go through each section and describe some of the data available in each section.

Attack Analysis

Attack Overview

This dashboard shows a broad overview of the attacks against your sensors. This includes Attempts vs. Successes, Latest Logins, Attackers logging into multiple locations, etc.

Session Playlog

This is one of the most beneficial dashboards in the app, since it shows you exactly what the attacker did on your honeypot. At the top of the dashboard, you can see the most recent sessions along with a filter to select a particular sensor. Clicking on a session populates the panels below, which include the passwords attempted/accepted, the commands entered, any files downloaded during the session, and the raw logs for the session.

Attacker Profile

Using this dashboard, you can query a particular IP; if it has been seen by the app, you can retrieve valuable information about it, including:

  • Geolocational data
  • Times seen
  • SSH Client versions
  • Sessions seen
  • Files Downloaded

Session Analysis

This series of dashboards contains analytical information, including the percentage of sessions with interaction, the various SSH versions seen, environment details extracted from each session, and a Human vs. Bot identification dashboard.

Location Overview

In this section, you are able to see various geographical data related to each session and attacker. There are currently three dashboards available:

  • Top countries from which attackers have logged in
  • Top countries from which attackers have scanned
  • Top sensors that have been attacked

We also include a map which includes the location of attackers seen.

Username/Password Analysis

Currently, this dashboard contains the top usernames and passwords seen being attempted by the attackers, as well as the top username/password combinations.

Malware Analysis

File Analysis

Starting at the top of this page, you can see the latest files downloaded by attackers, including:

  • URL of the file
  • SHA256 hash of the file
  • Sensor on which the file was downloaded
  • Identifier of the session in which the file was downloaded
  • Time the file was downloaded

Below that are the latest "attempted" file downloads: URLs seen in a session that have no corresponding SHA256 hash (which would indicate a successful download). This can be due to a server error on the hosting site, a misspelled filename, or the URL appearing elsewhere in the command, perhaps as an argument or the target site of the malware.

Lastly, there is a panel where you can look up a previously downloaded SHA256 hash in VirusTotal to retrieve the following information:

  • Date Scanned
  • SHA256 Hash
  • How many AV vendors identified this file
  • The various signatures of the file

Please note that the VirusTotal API is limited to 4 requests per minute. Within that limit, you can use this panel to quickly look up the file hashes seen in your sessions.

This lookup populates a local "cache" used by other dashboards, so it's useful to run lookups on any malware you see. The cache was created due to limitations in the VirusTotal API and will be used as a workaround for the time being.
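Since every query counts against the 4-per-minute quota, one way to warm the cache for a batch of hashes is to pace the requests yourself. A minimal sketch against the VirusTotal v2 file/report endpoint (the helper name and 15-second pacing are our own; substitute your real API key):

```shell
#!/bin/sh
# Throttled VirusTotal v2 file/report lookups: one request every 15
# seconds keeps a batch under the free tier's 4-requests-per-minute cap.
VT_API_KEY="${VT_API_KEY:-YOUR_API_KEY_HERE}"

vt_report_url() {
    # Pure string construction of the report URL for one SHA256 hash.
    printf 'https://www.virustotal.com/vtapi/v2/file/report?apikey=%s&resource=%s' \
        "$VT_API_KEY" "$1"
}

for hash in "$@"; do          # pass SHA256 hashes as arguments
    curl -s "$(vt_report_url "$hash")"
    echo
    sleep 15                  # 60s / 4 requests
done
```

Usage: `sh vt_batch.sh <sha256> <sha256> ...` prints one JSON report per hash.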

Malware Analysis

This dashboard shows the Top 10 malware signatures seen over time, as well as the most recently confirmed malware. It is populated from the local VirusTotal "cache" built on the File Analysis page. The dashboard will also show you files that were downloaded but produced no signatures in VirusTotal.

Malware Campaigns

This set of reports gives you information on possible campaigns associated with your sessions. Currently this includes:

  • Potential Malware Campaigns (By URL)
  • Potential Malware Campaigns (By Domain)
  • Potential Malware Campaigns (By Filename)
  • Potential Malware Campaigns (By SHA Hash)

This section will continue to be developed to include other forms of campaign attribution based on the TTPs associated with each session, such as the commands entered during each session and terminal variables (size, language, SSH keys, etc.). For now, we can see the URLs, domains, and filenames that have been used by multiple attackers.

Network Analysis

This dashboard currently includes reports on the following:

  • Top Domains Seen
  • Same URI on Multiple Domains
  • Latest IP Addresses Seen

Sensor Management

Sensor Status

This dashboard provides status and geographical information for each sensor currently deployed. You will find the following information available in this dashboard:

  • Sensor Name
  • Last Active
  • Sensor IP Address (External IP)
  • ASN
  • ASN Country
  • Network Name
  • Network Range

This dashboard also provides you with a map populated with the locations of all your sensors deployed.

Edit Sensor

In this dashboard, you can edit a few fields for each of your sensors:

  • Owner
  • Owner Email
  • Comment

Threat Feed

Lastly, this dashboard contains feeds that you can download and integrate with other network monitoring solutions; this will hopefully be automated in the future.

The feeds currently available are:

  • IP Addresses
  • Potentially Malicious URLs
  • SHA File Hashes
  • Potentially Malicious Domains
  • File Names

Screenshots

Below are some screenshots which illustrate the features of Tango:

Attack Overview

Session Analysis

Malware Campaigns

Session Playlog

IOC Feed

Network Analysis

Malware Analysis

To-Do

  • Utilize Data Models to speed up searches
  • Auto-extract indicators inside of malware
  • TOR Exit Node Identifier

Credits


Contributors

brianwarehime, covertserver, jonathanphillips, sqearl, voodoologic


Tango's Issues

Tango App Not Processing Data

Sorry if this is more of a Splunk issue but wanted to start here. :)

I have my sensors set up and logging back to Tango within Splunk. In Tango, I can see my sensors within Sensor Mgmt. If I go to Splunk Search and Reporting, I can see the kippojson logs.

However, when navigating through the dashboards, I get no results found.

homepg

Most of the logs I see are logon attempts, but when I go to the Username/Password Analysis page, I get no results there as well.

logons

I followed the server instructions on the main page; the honeypot index is added to the default search indexes for my user.

Monitor and restart kippo

Has anyone found a good way to monitor the Cowrie process and restart it if it dies?

I know this is not Tango specific; but thought I'd start here. I looked at Upstart but not all of my machines are Ubuntu (though the majority are).

Tango App: No results found.

Maybe someone can tell me: what should I do to check whether my Splunk and Tango app work properly?

My Cowrie honeypot seems to work, because in /log/lastlog.txt I can see accepted logins. I can also see entries in cowrie.log:

[cowrie.ssh.factory.CowrieSSHFactory] New connection: 195.62.52.62:37814 (94.177.248.74:2222) [HoneyPotSSHTransport,11,195.62.52.62] Remote SSH version: SSH-2.0-ssh2js0.3.6
[HoneyPotSSHTransport,11,195.62.52.62] kex alg, key alg: 'diffie-hellman-group14-sha1' 'ssh-rsa'
[HoneyPotSSHTransport,11,195.62.52.62] outgoing: '3des-cbc' 'hmac-sha1' 'none'
[HoneyPotSSHTransport,11,195.62.52.62] incoming: '3des-cbc' 'hmac-sha1' 'none'
[HoneyPotSSHTransport,11,195.62.52.62] NEW KEYS
[HoneyPotSSHTransport,11,195.62.52.62] starting service 'ssh-userauth'
[SSHService 'ssh-userauth' on HoneyPotSSHTransport,11,195.62.52.62] 'root' trying auth 'password'
[SSHService 'ssh-userauth' on HoneyPotSSHTransport,11,195.62.52.62] login attempt [root/111111] succeeded
[SSHService 'ssh-userauth' on HoneyPotSSHTransport,11,195.62.52.62] 'root' authenticated with 'password'
[SSHService 'ssh-userauth' on HoneyPotSSHTransport,11,195.62.52.62] starting service 'ssh-connection'
[HoneyPotSSHTransport,11,195.62.52.62] avatar root logging out
[HoneyPotSSHTransport,11,195.62.52.62] connection lost
[HoneyPotSSHTransport,11,195.62.52.62] Connection lost after 0 seconds

But my Splunk with the Tango Honeypot app still shows empty data (http://imgur.com/a/45Whs). I do not know what I should do next.

Event processing

Hello, are there any reported issues with the current versions of Cowrie/Splunk/UF? I seem to be having an issue processing events: when configured under "Forwarding and Receiving" no events are processed, and when configured as a TCP listener under "Data Inputs" all I get is cooked data that isn't processed. This is on a fresh install of Splunk and Tango. Just wanted to make sure before I spend any more time on the issue.

Add in Tango Splunk App to server.sh

Create Tango Honeypot Intelligence Splunk App

Needs indexes.conf to set up honeypot index
Needs the dashboards, searches, menu navigation, etc.

Once complete, include in server.sh to fetch from repo and install
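A minimal indexes.conf stanza for the honeypot index might look like this (the paths follow Splunk's usual layout; adjust to taste):

```ini
# $SPLUNK_HOME/etc/apps/tango/default/indexes.conf
[honeypot]
homePath   = $SPLUNK_DB/honeypot/db
coldPath   = $SPLUNK_DB/honeypot/colddb
thawedPath = $SPLUNK_DB/honeypot/thaweddb
```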

Keylogging (human vs. bot)

Try to integrate playlog.py and identify whether commands were entered by a human or a bot; a backspace would indicate a human-entered command.
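The backspace heuristic could be sketched like this (classify_session is a hypothetical helper, not part of Tango, and real tty logs would need fuller parsing):

```shell
#!/bin/sh
# Crude human-vs-bot heuristic: a 0x08 (BS) or 0x7f (DEL) byte in a
# recorded session suggests a human correcting a typo; scripted bots
# replay canned command strings and rarely emit either.
classify_session() {
    if LC_ALL=C grep -q "$(printf '\010')\|$(printf '\177')" "$1"; then
        echo human
    else
        echo bot
    fi
}
```

Usage: `classify_session session.log` prints `human` or `bot`.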

Universal Forwarder Configuration failed

When installing through the sensor.sh or uf_only.sh script I get the error: "Universal Forwarder Configuration failed. Please check /var/log/tango_install.log for more details." I have attached the log file:
UF Error Log.txt

This has also happened when trying to use the sensor.sh script on a brand-new Ubuntu 14.04.4 LTS server.

Uploading JSON Files

Hey,

Can I just upload the completed JSON files? If yes, can you let me know how, please?

Tango App empty-need assistance troubleshooting

I've got the Tango app installed on my Splunk server, plus 3 kippo honeypots. 2 already existed, and I just installed the UF only on them; 1 is brand new, with a fresh kippo and UF install per the default Tango installation.

I'm not seeing anything show up in the Tango app on Splunk. I don't have much experience with Splunk yet... What logs in Splunk can I look at for troubleshooting?

Cowrie - listen on 22

In order to get Cowrie to listen on 22, I had to update the last line of the start.sh file to include "authbind --deep" at the beginning.
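For reference, the authbind setup that grants that permission can be sketched as follows (the helper only prints the root commands so they can be reviewed before running; the helper name is ours):

```shell
#!/bin/sh
# Print the root commands that let the unprivileged cowrie user bind a
# low port (22) through authbind; run the output as root, then prefix
# the twistd line in start.sh with "authbind --deep" as described above.
authbind_steps() {
    port="$1"
    printf 'touch /etc/authbind/byport/%s\n' "$port"
    printf 'chown cowrie /etc/authbind/byport/%s\n' "$port"
    printf 'chmod 770 /etc/authbind/byport/%s\n' "$port"
}

authbind_steps 22
```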

Installation for Ubuntu < 18.04

The dependencies are not right for Ubuntu < 18.04 installations. Here is how to solve the problems in uf_only.sh:

Line 148:
apt-get -y install python-dev python-openssl python-pyasn1 authbind git libgnutls28-dev libcurl4-gnutls-dev libssl-dev python-pip curl &>> $logfile

Line 25 (with the correct path):
read -e -p "[?] Enter the full path to where your Cowrie logs are stored: (example:/home/cowrie/cowrie/var/log/cowrie/) " KIPPO_LOG_LOCATION

Consider adding additional third party intel

E.g. add a section with Robtex info on the attacker profile page.

Also on the main dashboard, consider an integration with other bad IP lists and RBLs. So the dashboard would list all of the IPs which hit your honeypots where the source IP is already present on another bad IP list or RBL.

Splunk not getting any logs

Hey,
Thank you for the tango add on!
I'm having some issues with Splunk receiving logs from the honeypot; can you please provide some help? I have followed your readme and installed all of the required packages.
Been on this all day :(

Splunk home directory creation issue

When running the initial setup from the instructions on what would be the sensor, the directory /home/splunk doesn't get created, which kept causing my install to fail. Edit: after manually creating /home/splunk before running the setup scripts, everything works fine.

Perhaps this was just something wrong with 14.04.2 LTS?

Kippo Version Issues

Hello, I'm having issues getting Tango to run (and unfortunately I lost my good install). You mention Michel Oosterhof's fork is necessary; however, he has renamed his project "cowrie". After fixing the download link I can't get cowrie to run (even after replacing "cowrie" with "kippo", hoping that's all that changed). Can someone either hook me up with the last working version of Kippo or tell me how to fix the cowrie installation?

Additional dashboards

Thought of some additional ways to use the already existing data which would help with cyber intel shops.

  1. A panel which shows if there is a unique sample of malware which hit one honeypot but not the rest.
  2. A panel which shows a timeline vs geography of a particular IP.. maybe under Attacker Profile. So if it's a bot let's say, a panel which shows that it hits all of your honeypots from east to west.
  3. An additional panel that shows the above but not restricted to a single IP. Have it show the pattern for all source IPs. (if pattern exists but I would bet it does).

Get external IP of sensor

Use curl icanhazip.com to get the external IP address of the sensor to add geo data for sensor distribution.

Add scripted input into tango app that runs every x hours and gets IP.
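Such a scripted input might look like this in the input app's inputs.conf (the script name, interval, and sourcetype below are illustrative, not Tango's actual names):

```ini
# inputs.conf in the Tango input app on each sensor
[script://$SPLUNK_HOME/etc/apps/tango/bin/get_external_ip.sh]
# run every 4 hours (14400 seconds)
interval = 14400
sourcetype = tango_external_ip
index = honeypot
```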

Pivot on userdb

An idea for a new panel.

When a person logged into kippo creates a new account or changes the password, the password is added to userdb.txt. So create a panel which shows all of the new passwords created on your sensors. Then you can click on a certain password, and another panel shows you all of the sensors which have that password added and all of the IPs that have connected using it.

Doesn't appear to connect to VT

Hi, Great app.

Problem is I can't seem to get my VirusTotal connection working. The API key is in the .py file and the 'requests' files are in the specified directory. However, in Splunk, the VirusTotal SHA Lookup dropdown box is greyed out. From the images I've seen, I would expect a list of the SHA hashes of the files downloaded/uploaded to be in there.

Any thoughts ?

Tango install fails - "cp: cannot stat 'cowrie.cfg.dist': No such file or directory"

Hey, fresh install of Ubuntu 18.04 on Vultr. Apt-get update, followed by installing build-essential, then the commands for the full install from the readme - fails with the above.

Is this likely to work? Are the instructions still valid?

I've included the whole tango_install.log here, sorry if it's obnoxiously long, hoping that GitHub hides it :)

[] Checking for root privs..
[] We are root.
[] System is x86_64. Downloading: splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz to /opt..
[] Splunk Forwarder Download successfully completed.
[] Installing required packages via apt-get..
[] Apt Package Installation successfully completed.
[] Installed pip
[] Installing required python packages via pip..
[] Python pip successfully completed.
[] Checking for splunk user and group..
[] Creating splunk user and group..
[] Splunk user and group creation successfully completed.
[] Checking for cowrie user and group..
[] Creating cowrie user and group..
[] Cowrie user and group creation successfully completed.
[] Installing Cowrie Honeypot..
[] Cloned Cowrie Repository from GitHub successfully completed.
[] Configured Cowrie Honeypot
[] SSH Service Restarted successfully completed.
[] Cowrie started successfully failed. Please check /var/log/tango_install.log for more details.
302 Moved Temporarily
Location: https://download.splunk.com/products/universalforwarder/releases/6.3.0/linux/splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz [following]
--2018-09-18 20:57:19-- https://download.splunk.com/products/universalforwarder/releases/6.3.0/linux/splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz
Resolving download.splunk.com (download.splunk.com)... 52.85.255.142, 52.85.255.186, 52.85.255.93, ...
Connecting to download.splunk.com (download.splunk.com)|52.85.255.142|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 15936760 (15M) [application/x-gzip]
Saving to: ‘/opt/splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz’

(wget download progress output trimmed)
7900K .......... .......... .......... .......... .......... 51% 79.9M 0s
7950K .......... .......... .......... .......... .......... 51% 63.3M 0s
8000K .......... .......... .......... .......... .......... 51% 90.8M 0s
8050K .......... .......... .......... .......... .......... 52% 19.7M 0s
8100K .......... .......... .......... .......... .......... 52% 1.54M 0s
8150K .......... .......... .......... .......... .......... 52% 11.7M 0s
8200K .......... .......... .......... .......... .......... 53% 50.3M 0s
8250K .......... .......... .......... .......... .......... 53% 7.15M 0s
8300K .......... .......... .......... .......... .......... 53% 9.03M 0s
8350K .......... .......... .......... .......... .......... 53% 20.3M 0s
8400K .......... .......... .......... .......... .......... 54% 16.3M 0s
8450K .......... .......... .......... .......... .......... 54% 59.3M 0s
8500K .......... .......... .......... .......... .......... 54% 15.7M 0s
8550K .......... .......... .......... .......... .......... 55% 16.0M 0s
8600K .......... .......... .......... .......... .......... 55% 799K 0s
8650K .......... .......... .......... .......... .......... 55% 65.0M 0s
8700K .......... .......... .......... .......... .......... 56% 3.49M 0s
8750K .......... .......... .......... .......... .......... 56% 10.8M 0s
8800K .......... .......... .......... .......... .......... 56% 50.3M 0s
8850K .......... .......... .......... .......... .......... 57% 18.1M 0s
8900K .......... .......... .......... .......... .......... 57% 54.1M 0s
8950K .......... .......... .......... .......... .......... 57% 8.21M 0s
9000K .......... .......... .......... .......... .......... 58% 74.1M 0s
9050K .......... .......... .......... .......... .......... 58% 8.86M 0s
9100K .......... .......... .......... .......... .......... 58% 61.7M 0s
9150K .......... .......... .......... .......... .......... 59% 9.33M 0s
9200K .......... .......... .......... .......... .......... 59% 10.2M 0s
9250K .......... .......... .......... .......... .......... 59% 13.7M 0s
9300K .......... .......... .......... .......... .......... 60% 19.8M 0s
9350K .......... .......... .......... .......... .......... 60% 44.0M 0s
9400K .......... .......... .......... .......... .......... 60% 54.0M 0s
9450K .......... .......... .......... .......... .......... 61% 40.5M 0s
9500K .......... .......... .......... .......... .......... 61% 20.3M 0s
9550K .......... .......... .......... .......... .......... 61% 37.7M 0s
9600K .......... .......... .......... .......... .......... 62% 74.4M 0s
9650K .......... .......... .......... .......... .......... 62% 34.5M 0s
9700K .......... .......... .......... .......... .......... 62% 58.7M 0s
9750K .......... .......... .......... .......... .......... 62% 33.4M 0s
9800K .......... .......... .......... .......... .......... 63% 74.6M 0s
9850K .......... .......... .......... .......... .......... 63% 66.4M 0s
9900K .......... .......... .......... .......... .......... 63% 35.4M 0s
9950K .......... .......... .......... .......... .......... 64% 64.6M 0s
10000K .......... .......... .......... .......... .......... 64% 69.3M 0s
10050K .......... .......... .......... .......... .......... 64% 82.7M 0s
10100K .......... .......... .......... .......... .......... 65% 65.1M 0s
10150K .......... .......... .......... .......... .......... 65% 68.5M 0s
10200K .......... .......... .......... .......... .......... 65% 63.7M 0s
10250K .......... .......... .......... .......... .......... 66% 76.6M 0s
10300K .......... .......... .......... .......... .......... 66% 49.3M 0s
10350K .......... .......... .......... .......... .......... 66% 26.6M 0s
10400K .......... .......... .......... .......... .......... 67% 97.8M 0s
10450K .......... .......... .......... .......... .......... 67% 60.2M 0s
10500K .......... .......... .......... .......... .......... 67% 61.1M 0s
10550K .......... .......... .......... .......... .......... 68% 71.3M 0s
10600K .......... .......... .......... .......... .......... 68% 51.5M 0s
10650K .......... .......... .......... .......... .......... 68% 63.4M 0s
10700K .......... .......... .......... .......... .......... 69% 74.7M 0s
10750K .......... .......... .......... .......... .......... 69% 44.8M 0s
10800K .......... .......... .......... .......... .......... 69% 67.5M 0s
10850K .......... .......... .......... .......... .......... 70% 78.4M 0s
10900K .......... .......... .......... .......... .......... 70% 85.9M 0s
10950K .......... .......... .......... .......... .......... 70% 88.2M 0s
11000K .......... .......... .......... .......... .......... 71% 10.4M 0s
11050K .......... .......... .......... .......... .......... 71% 71.1M 0s
11100K .......... .......... .......... .......... .......... 71% 106M 0s
11150K .......... .......... .......... .......... .......... 71% 127M 0s
11200K .......... .......... .......... .......... .......... 72% 117M 0s
11250K .......... .......... .......... .......... .......... 72% 138M 0s
11300K .......... .......... .......... .......... .......... 72% 77.2M 0s
11350K .......... .......... .......... .......... .......... 73% 52.5M 0s
11400K .......... .......... .......... .......... .......... 73% 66.3M 0s
11450K .......... .......... .......... .......... .......... 73% 114M 0s
11500K .......... .......... .......... .......... .......... 74% 88.0M 0s
11550K .......... .......... .......... .......... .......... 74% 74.6M 0s
11600K .......... .......... .......... .......... .......... 74% 61.2M 0s
11650K .......... .......... .......... .......... .......... 75% 128M 0s
11700K .......... .......... .......... .......... .......... 75% 72.1M 0s
11750K .......... .......... .......... .......... .......... 75% 128M 0s
11800K .......... .......... .......... .......... .......... 76% 70.3M 0s
11850K .......... .......... .......... .......... .......... 76% 93.8M 0s
11900K .......... .......... .......... .......... .......... 76% 22.4M 0s
11950K .......... .......... .......... .......... .......... 77% 4.86M 0s
12000K .......... .......... .......... .......... .......... 77% 56.5M 0s
12050K .......... .......... .......... .......... .......... 77% 78.7M 0s
12100K .......... .......... .......... .......... .......... 78% 94.6M 0s
12150K .......... .......... .......... .......... .......... 78% 104M 0s
12200K .......... .......... .......... .......... .......... 78% 118M 0s
12250K .......... .......... .......... .......... .......... 79% 114M 0s
12300K .......... .......... .......... .......... .......... 79% 120M 0s
12350K .......... .......... .......... .......... .......... 79% 95.4M 0s
12400K .......... .......... .......... .......... .......... 79% 36.4M 0s
12450K .......... .......... .......... .......... .......... 80% 73.4M 0s
12500K .......... .......... .......... .......... .......... 80% 120M 0s
12550K .......... .......... .......... .......... .......... 80% 96.0M 0s
12600K .......... .......... .......... .......... .......... 81% 114M 0s
12650K .......... .......... .......... .......... .......... 81% 115M 0s
12700K .......... .......... .......... .......... .......... 81% 42.9M 0s
12750K .......... .......... .......... .......... .......... 82% 17.1M 0s
12800K .......... .......... .......... .......... .......... 82% 57.0M 0s
12850K .......... .......... .......... .......... .......... 82% 76.4M 0s
12900K .......... .......... .......... .......... .......... 83% 68.5M 0s
12950K .......... .......... .......... .......... .......... 83% 79.5M 0s
13000K .......... .......... .......... .......... .......... 83% 86.5M 0s
13050K .......... .......... .......... .......... .......... 84% 71.0M 0s
13100K .......... .......... .......... .......... .......... 84% 94.1M 0s
13150K .......... .......... .......... .......... .......... 84% 88.7M 0s
13200K .......... .......... .......... .......... .......... 85% 100M 0s
13250K .......... .......... .......... .......... .......... 85% 111M 0s
13300K .......... .......... .......... .......... .......... 85% 97.4M 0s
13350K .......... .......... .......... .......... .......... 86% 65.2M 0s
13400K .......... .......... .......... .......... .......... 86% 21.5M 0s
13450K .......... .......... .......... .......... .......... 86% 69.0M 0s
13500K .......... .......... .......... .......... .......... 87% 23.7M 0s
13550K .......... .......... .......... .......... .......... 87% 44.7M 0s
13600K .......... .......... .......... .......... .......... 87% 53.3M 0s
13650K .......... .......... .......... .......... .......... 88% 64.7M 0s
13700K .......... .......... .......... .......... .......... 88% 50.8M 0s
13750K .......... .......... .......... .......... .......... 88% 48.6M 0s
13800K .......... .......... .......... .......... .......... 88% 18.7M 0s
13850K .......... .......... .......... .......... .......... 89% 84.0M 0s
13900K .......... .......... .......... .......... .......... 89% 137M 0s
13950K .......... .......... .......... .......... .......... 89% 106M 0s
14000K .......... .......... .......... .......... .......... 90% 8.48M 0s
14050K .......... .......... .......... .......... .......... 90% 9.48M 0s
14100K .......... .......... .......... .......... .......... 90% 6.41M 0s
14150K .......... .......... .......... .......... .......... 91% 65.2M 0s
14200K .......... .......... .......... .......... .......... 91% 117M 0s
14250K .......... .......... .......... .......... .......... 91% 94.0M 0s
14300K .......... .......... .......... .......... .......... 92% 138M 0s
14350K .......... .......... .......... .......... .......... 92% 94.5M 0s
14400K .......... .......... .......... .......... .......... 92% 105M 0s
14450K .......... .......... .......... .......... .......... 93% 50.0M 0s
14500K .......... .......... .......... .......... .......... 93% 27.2M 0s
14550K .......... .......... .......... .......... .......... 93% 114M 0s
14600K .......... .......... .......... .......... .......... 94% 87.6M 0s
14650K .......... .......... .......... .......... .......... 94% 93.3M 0s
14700K .......... .......... .......... .......... .......... 94% 62.0M 0s
14750K .......... .......... .......... .......... .......... 95% 45.1M 0s
14800K .......... .......... .......... .......... .......... 95% 19.9M 0s
14850K .......... .......... .......... .......... .......... 95% 52.9M 0s
14900K .......... .......... .......... .......... .......... 96% 71.5M 0s
14950K .......... .......... .......... .......... .......... 96% 33.1M 0s
15000K .......... .......... .......... .......... .......... 96% 17.1M 0s
15050K .......... .......... .......... .......... .......... 97% 42.4M 0s
15100K .......... .......... .......... .......... .......... 97% 48.8M 0s
15150K .......... .......... .......... .......... .......... 97% 43.1M 0s
15200K .......... .......... .......... .......... .......... 97% 50.8M 0s
15250K .......... .......... .......... .......... .......... 98% 45.0M 0s
15300K .......... .......... .......... .......... .......... 98% 48.7M 0s
15350K .......... .......... .......... .......... .......... 98% 9.29M 0s
15400K .......... .......... .......... .......... .......... 99% 4.61M 0s
15450K .......... .......... .......... .......... .......... 99% 76.5M 0s
15500K .......... .......... .......... .......... .......... 99% 87.0M 0s
15550K .......... ... 100% 65.0M=0.6s

2018-09-18 20:57:20 (26.3 MB/s) - ‘/opt/splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz’ saved [15936760/15936760]

Hit:1 http://archive.ubuntu.com/ubuntu bionic InRelease
Hit:2 http://archive.ubuntu.com/ubuntu bionic-updates InRelease
Hit:3 http://archive.ubuntu.com/ubuntu bionic-backports InRelease
Hit:4 http://security.ubuntu.com/ubuntu bionic-security InRelease
Reading package lists...
Reading package lists...
Building dependency tree...
Reading state information...
authbind is already the newest version (2.1.2).
libffi-dev is already the newest version (3.2.1-8).
openssh-server is already the newest version (1:7.6p1-4).
python-dev is already the newest version (2.7.15~rc1-1).
python-openssl is already the newest version (17.5.0-1ubuntu1).
python-pyasn1 is already the newest version (0.4.2-3).
git is already the newest version (1:2.17.1-1ubuntu0.1).
libcurl4-gnutls-dev is already the newest version (7.58.0-2ubuntu3.3).
libssl-dev is already the newest version (1.1.0g-2ubuntu4.1).
0 upgraded, 0 newly installed, 0 to remove and 65 not upgraded.
Collecting pip
Using cached https://files.pythonhosted.org/packages/5f/25/e52d3f31441505a5f3af41213346e5b6c221c9e086a166f3703d2ddaf940/pip-18.0-py2.py3-none-any.whl
Installing collected packages: pip
Found existing installation: pip 18.0
Uninstalling pip-18.0:
Successfully uninstalled pip-18.0
Successfully installed pip-18.0
Collecting pycrypto
Using cached https://files.pythonhosted.org/packages/60/db/645aa9af249f059cc3a368b118de33889219e0362141e75d4eaf6f80f163/pycrypto-2.6.1.tar.gz
Requirement already satisfied: cryptography in /usr/lib/python2.7/dist-packages (2.1.4)
Collecting service_identity
Using cached https://files.pythonhosted.org/packages/29/fa/995e364220979e577e7ca232440961db0bf996b6edaf586a7d1bd14d81f1/service_identity-17.0.0-py2.py3-none-any.whl
Collecting requests
Using cached https://files.pythonhosted.org/packages/65/47/7e02164a2a3db50ed6d8a6ab1d6d60b69c4c3fdf57a284257925dfc12bda/requests-2.19.1-py2.py3-none-any.whl
Collecting ipwhois
Using cached https://files.pythonhosted.org/packages/50/a4/cef165da087eae4d91f11f1f42ca356ce9410fee8145af76484a9589c447/ipwhois-1.0.0-py2.py3-none-any.whl
Collecting twisted
Using cached https://files.pythonhosted.org/packages/90/50/4c315ce5d119f67189d1819629cae7908ca0b0a6c572980df5cc6942bc22/Twisted-18.7.0.tar.bz2
Requirement already satisfied: pyopenssl>=0.12 in /usr/lib/python2.7/dist-packages (from service_identity) (17.5.0)
Requirement already satisfied: pyasn1 in /usr/lib/python2.7/dist-packages (from service_identity) (0.4.2)
Collecting attrs (from service_identity)
Using cached https://files.pythonhosted.org/packages/3a/e1/5f9023cc983f1a628a8c2fd051ad19e76ff7b142a0faf329336f9a62a514/attrs-18.2.0-py2.py3-none-any.whl
Collecting pyasn1-modules (from service_identity)
Using cached https://files.pythonhosted.org/packages/19/02/fa63f7ba30a0d7b925ca29d034510fc1ffde53264b71b4155022ddf3ab5d/pyasn1_modules-0.2.2-py2.py3-none-any.whl
Requirement already satisfied: idna<2.8,>=2.5 in /usr/lib/python2.7/dist-packages (from requests) (2.6)
Collecting certifi>=2017.4.17 (from requests)
Using cached https://files.pythonhosted.org/packages/df/f7/04fee6ac349e915b82171f8e23cee63644d83663b34c539f7a09aed18f9e/certifi-2018.8.24-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests)
Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3<1.24,>=1.21.1 (from requests)
Using cached https://files.pythonhosted.org/packages/bd/c9/6fdd990019071a4a32a5e7cb78a1d92c53851ef4f56f62a3486e6a7d8ffb/urllib3-1.23-py2.py3-none-any.whl
Collecting dnspython (from ipwhois)
Using cached https://files.pythonhosted.org/packages/a6/72/209e18bdfedfd78c6994e9ec96981624a5ad7738524dd474237268422cb8/dnspython-1.15.0-py2.py3-none-any.whl
Collecting ipaddr; python_version < "3.3" (from ipwhois)
Collecting zope.interface>=4.4.2 (from twisted)
Collecting constantly>=15.1 (from twisted)
Using cached https://files.pythonhosted.org/packages/b9/65/48c1909d0c0aeae6c10213340ce682db01b48ea900a7d9fce7a7910ff318/constantly-15.1.0-py2.py3-none-any.whl
Collecting incremental>=16.10.1 (from twisted)
Using cached https://files.pythonhosted.org/packages/f5/1d/c98a587dc06e107115cf4a58b49de20b19222c83d75335a192052af4c4b7/incremental-17.5.0-py2.py3-none-any.whl
Collecting Automat>=0.3.0 (from twisted)
Using cached https://files.pythonhosted.org/packages/a3/86/14c16bb98a5a3542ed8fed5d74fb064a902de3bdd98d6584b34553353c45/Automat-0.7.0-py2.py3-none-any.whl
Collecting hyperlink>=17.1.1 (from twisted)
Using cached https://files.pythonhosted.org/packages/a7/b6/84d0c863ff81e8e7de87cff3bd8fd8f1054c227ce09af1b679a8b17a9274/hyperlink-18.0.0-py2.py3-none-any.whl
Collecting PyHamcrest>=1.9.0 (from twisted)
Using cached https://files.pythonhosted.org/packages/9a/d5/d37fd731b7d0e91afcc84577edeccf4638b4f9b82f5ffe2f8b62e2ddc609/PyHamcrest-1.9.0-py2.py3-none-any.whl
Requirement already satisfied: setuptools in /usr/local/lib/python2.7/dist-packages (from zope.interface>=4.4.2->twisted) (40.4.1)
Requirement already satisfied: six in /usr/lib/python2.7/dist-packages (from Automat>=0.3.0->twisted) (1.11.0)
Building wheels for collected packages: pycrypto, twisted
Running setup.py bdist_wheel for pycrypto: started
Running setup.py bdist_wheel for pycrypto: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/27/02/5e/77a69d0c16bb63c6ed32f5386f33a2809c94bd5414a2f6c196
Running setup.py bdist_wheel for twisted: started
Running setup.py bdist_wheel for twisted: finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/a9/85/24/fc82998fb686cb31e65a26c027a20120fd1219c9f1e925913a
Successfully built pycrypto twisted
Installing collected packages: pycrypto, attrs, pyasn1-modules, service-identity, certifi, chardet, urllib3, requests, dnspython, ipaddr, ipwhois, zope.interface, constantly, incremental, Automat, hyperlink, PyHamcrest, twisted
Successfully installed Automat-0.7.0 PyHamcrest-1.9.0 attrs-18.2.0 certifi-2018.8.24 chardet-3.0.4 constantly-15.1.0 dnspython-1.15.0 hyperlink-18.0.0 incremental-17.5.0 ipaddr-2.2.0 ipwhois-1.0.0 pyasn1-modules-0.2.2 pycrypto-2.6.1 requests-2.19.1 service-identity-17.0.0 twisted-18.7.0 urllib3-1.23 zope.interface-4.5.0
Cloning into 'cowrie'...
cp: cannot stat 'cowrie.cfg.dist': No such file or directory
sudo: ./start.sh: command not found
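The two failures at the end of this log come from upstream Cowrie restructuring its repository after sensor.sh was written: cowrie.cfg.dist now lives under etc/, and start.sh was replaced by the bin/cowrie wrapper. A minimal sketch of the corrected steps (run here against a scratch directory so it is self-contained; on a real sensor you would run the same cp/start commands inside the actual /opt/cowrie clone):

```shell
# Simulate the newer Cowrie layout in a scratch dir; a real install would
# operate on /opt/cowrie instead.
COWRIE=/tmp/cowrie-layout-demo
mkdir -p "$COWRIE/etc" "$COWRIE/bin"
printf '[honeypot]\nhostname = svr04\n' > "$COWRIE/etc/cowrie.cfg.dist"

cd "$COWRIE"
# Old script did: cp cowrie.cfg.dist cowrie.cfg   (file used to be at the repo root)
cp etc/cowrie.cfg.dist etc/cowrie.cfg
# Old script did: sudo ./start.sh
# New equivalent: sudo -u cowrie bin/cowrie start  (start.sh no longer exists)
test -f etc/cowrie.cfg && echo "config staged"
```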

Ubuntu 18.04 Installer Fix

This script resolves the known issues (path locations, Cowrie startup, etc.) that cause the stock sensor.sh installation script to fail with errors on Ubuntu 18.04.

#!/bin/bash
#Tango Sensor Install
#Should be compatible with Ubuntu and Debian


#Disclaimer. Continues for yes, quits for no. 
while true; do
    read -p "[!] You are about to install Cowrie and the Splunk Universal Forwarder. By running this installer, you accept Splunk's EULA. Do you wish to proceed? (Yes/No)" yn
    case $yn in
        [Yy]* ) break;;
        [Nn]* ) exit;;
        * ) echo "Please answer Yes or No.";;
    esac
done

########################################
 
#User input variables
#Splunk Indexer hostname/IP address from user
read -e -p "[?] Enter the Splunk Indexer to forward logs to: (example: splunk.test.com:9997) " SPLUNK_INDEXER

#Sensor hostname from user
read -e -p "[?] Enter Sensor name. (example: hp-US-Las_Vegas-01) " HOST_NAME

#SSH Port number from user
read -e -p "[?] Enter a new SSH port number, since the honeypot will take over the default SSH port. (example: 1337) " SSH_PORT

########################################

# Logging setup. This is done to log all the output from commands executed in the script to a file. 
#This provides us troubleshooting data if the script fails.
logfile=/var/log/tango_install.log
mkfifo ${logfile}.pipe
tee < ${logfile}.pipe $logfile &
exec &> ${logfile}.pipe
rm ${logfile}.pipe

########################################

#metasploit-like print statements. Status messages, error messages, good status returns.
# I added in a notification print for areas users should definitely pay attention to.

function print_status ()
{
    echo -e "\x1B[01;34m[*]\x1B[0m $1"
}

function print_good ()
{
    echo -e "\x1B[01;32m[*]\x1B[0m $1"
}

function print_error ()
{
    echo -e "\x1B[01;31m[*]\x1B[0m $1"
}

function print_notification ()
{
    echo -e "\x1B[01;33m[*]\x1B[0m $1"
}
########################################

#The script does a lot of error checking, so an error-check function is defined once here.
# If a task returns a non-zero status code, something very likely went wrong.

function error_check
{

if [ $? -eq 0 ]; then
    print_good "$1 successfully completed."
else
    print_error "$1 failed. Please check $logfile for more details."
exit 1
fi

}

########################################

#BEGIN MAIN#

########################################



# These Variables Need to be set! #

#SPLUNK_INDEXER: This is the box that is going to process your splunk logs. 
#Can be a hostname or an IP address. The default port is 9997/tcp. #
#SPLUNK_INDEXER="splunkserver.yourdomain.com:9997"

#HOST_NAME: This controls what name your honeypot sensor will have when reviewing its
# data in the Tango Splunk App. Use unique names. 
# Suggestion: "hp-{country code}-{city}-{number}" such as: hp-US-Las_Vegas-01 #
#HOST_NAME="hp-countrycode-city-01"


#SSH_PORT: This port will replace the default SSH port (22), so that the honeypot can take it over and you'll still be able
# to access the host using SSH.
#SSH_PORT= "1337"


########################################

# Set the directory we are initially executing the script in.
execdir=`pwd`

########################################

#We need root privs to run most of this, this is a quick check to ensure that we are root. If not, bail.

print_status "Checking for root privs.."
if [ "$(whoami)" != "root" ]; then
    print_error "This script must be run with sudo or root privileges."
    exit 1
else
    print_good "We are root."
fi
     
########################################    

#We check what architecture the system is and download the correct Splunk Universal Forwarder for that CPU arch.

arch=`uname -m`

if [[ $arch == "x86_64" ]]; then
    INSTALL_FILE="splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz"
    print_notification "System is $arch. Downloading: $INSTALL_FILE to /opt.."
    wget -O /opt/splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz 'http://www.splunk.com/bin/splunk/DownloadActivityServlet?architecture=x86_64&platform=linux&version=6.3.0&product=universalforwarder&filename=splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz&wget=true' &>> $logfile    
    error_check 'Splunk Forwarder Download'
elif [[ $arch == "i686" ]]; then
    INSTALL_FILE="splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-i686.tgz"
    print_notification "System is $arch. Downloading: $INSTALL_FILE to /opt.."
    wget -O /opt/splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-i686.tgz 'http://www.splunk.com/bin/splunk/DownloadActivityServlet?architecture=x86&platform=linux&version=6.3.0&product=universalforwarder&filename=splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-i686.tgz&wget=true' &>> $logfile    
    error_check 'Splunk Forwarder Download'
else
    print_error "System arch is not x86_64 or i686. Tango Honeypot is not yet supported on other CPU architectures."
    exit 1
fi
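The arch check above can also be factored into a small function, which keeps the file-name mapping in one place and makes it testable; the file names mirror the 6.3.0 forwarder builds used by the script, and `splunk_pkg` is a hypothetical helper name, not part of the original installer:

```shell
# Hypothetical refactor of the arch -> forwarder-package mapping above.
splunk_pkg () {
    case "$1" in
        x86_64) echo "splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-x86_64.tgz" ;;
        i686)   echo "splunkforwarder-6.3.0-aa7d4b1ccb80-Linux-i686.tgz" ;;
        *)      return 1 ;;   # unsupported architecture
    esac
}

if INSTALL_FILE=$(splunk_pkg "$(uname -m)"); then
    echo "would download $INSTALL_FILE"
else
    echo "unsupported arch: $(uname -m)"
fi
```

A `case` statement like this also makes it obvious where to add new architectures later, instead of growing the if/elif chain.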

########################################

# Based on the OS (Debian or Redhat based), use the OS package manager to install required packages

if [ -f /etc/debian_version ]; then
    apt-get -y update &>> $logfile
    print_notification "Installing required packages via apt-get.."
    apt-get -y install python-dev python-openssl python-pyasn1 authbind git libcurl4-gnutls-dev libssl-dev libffi-dev openssh-server &>> $logfile
    error_check 'Apt Package Installation'
    
    curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py" &>> $logfile
    python get-pip.py &>> $logfile
    print_notification "Installed pip"
    print_notification "Installing required python packages via pip.."
    pip install pycrypto cryptography service_identity requests ipwhois twisted &>> $logfile
    error_check 'Python pip'
    # Redirect inbound port 22 to Cowrie's listener on port 2222
    iptables -t nat -A PREROUTING -p tcp --dport 22 -j REDIRECT --to-port 2222 &>> $logfile
elif [ -f /etc/redhat-release ]; then
    yum -y update &>> $logfile
    print_notification "Installing required packages via yum.."
    yum -y install wget python-devel python-zope-interface unzip git gnutls-devel gcc gcc-c++ &>> $logfile
    error_check 'Yum Package Installation'
    
    print_notification "Installing required python packages via easy_install.."
    easy_install pycrypto pyasn1 twisted requests &>> $logfile
    error_check 'Python easy_install'
else
    print_error "Unable to determine correct package manager to use. This script currently supports apt-based Operating Systems (Debian, Ubuntu, Kali) and yum-based Operating Systems (Redhat, CentOS, etc.) and relies on either /etc/redhat-release or /etc/debian_version being present to determine the correct package manager to use."
    exit 1
fi


########################################

# Adding splunk user for service to run as. Shell is set to /bin/false.

print_status "Checking for splunk user and group.."

getent passwd splunk &>> $logfile
if [ $? -eq 0 ]; then
    print_status "splunk user exists. Verifying group exists.."
    id -g splunk &>> $logfile
    if [ $? -eq 0 ]; then
        print_notification "splunk group exists."
    else
        print_notification "splunk group does not exist. Creating.."
        groupadd splunk &>> $logfile
        usermod -G splunk splunk &>> $logfile
        error_check 'Creation of Splunk group and Addition of Splunk user to group'
    fi
else
    print_status "Creating splunk user and group.."
    groupadd splunk &>> $logfile
    useradd -g splunk splunk -d /home/splunk -s /bin/false &>> $logfile
    mkdir /home/splunk
    chown -R splunk:splunk /home/splunk
    error_check 'Splunk user and group creation'
    
fi

chown -R splunk:splunk /home/splunk &>> $logfile

########################################

# Adding cowrie user for the honeypot to run as. Shell is set to /bin/false.

print_status "Checking for cowrie user and group.."

getent passwd cowrie &>> $logfile
if [ $? -eq 0 ]; then
    print_status "cowrie user exists. Verifying group exists.."
    id -g cowrie &>> $logfile
    if [ $? -eq 0 ]; then
        print_notification "cowrie group exists."
    else
        print_notification "cowrie group does not exist. Creating.."
        groupadd cowrie &>> $logfile
        usermod -G cowrie cowrie &>> $logfile
        error_check 'Creation of cowrie group and Addition of cowrie user to group'
    fi
else
    print_status "Creating cowrie user and group.."
    groupadd cowrie &>> $logfile
    useradd -g cowrie cowrie -d /home/cowrie -s /bin/false &>> $logfile
    mkdir -p /home/cowrie
    chown -R cowrie:cowrie /home/cowrie
    error_check 'Cowrie user and group creation'
    
fi


########################################

# Installing Cowrie Honeypot

print_notification "Installing Cowrie Honeypot.."
cd /opt
git clone https://github.com/micheloosterhof/cowrie.git &>> $logfile
error_check "Cloned Cowrie Repository from GitHub"
cd cowrie
cd etc
cp cowrie.cfg.dist cowrie.cfg &>> $logfile
# Legacy Kippo-era config edits, kept commented out; Cowrie's defaults work with the port-22 redirect set earlier
#sed -i "s/#listen_port = 2222/listen_port = 22/" cowrie.cfg &>> $logfile
#sed -i "s/#\[database_jsonlog\]/\[database_jsonlog\]/" cowrie.cfg &>> $logfile
#sed -i "s/#logfile = log\/kippolog.json/logfile = log\/kippolog.json/" cowrie.cfg &>> $logfile
#sed -i "s/\[output_jsonlog\]/#\[output_jsonlog\]/" cowrie.cfg &>> $logfile
#sed -i "s/logfile = log\/kippo.json/#logfile = log\/kippo.json/" cowrie.cfg &>> $logfile
print_notification "Configured Cowrie Honeypot"

########################################

# Changing Default SSH Port

# Move the real sshd to the port chosen above so the honeypot can own port 22.
# Note: this branches on the OS, not the CPU architecture; the sshd_config
# default line and the service name differ between Debian- and Redhat-based systems.
if [ -f /etc/debian_version ]; then
    cd /etc/ssh/
    sed -i "s/^#\?Port 22/Port $SSH_PORT/" sshd_config &>> $logfile
    service ssh restart &>> $logfile
    error_check 'SSH service restart'
elif [ -f /etc/redhat-release ]; then
    cd /etc/ssh/
    sed -i "s/^#\?Port 22/Port $SSH_PORT/" sshd_config &>> $logfile
    service sshd restart &>> $logfile
    error_check 'SSH service restart'
    cd /tmp
    git clone https://github.com/tootedom/authbind-centos-rpm.git &>> $logfile
    error_check 'Clone of authbind repo from GitHub'
    cd authbind-centos-rpm/authbind/RPMS/x86_64/
    rpm -i authbind-2.1.1-0.x86_64.rpm &>> $logfile
    error_check 'Authbind installation'
else
    print_error "Unable to determine OS. This script supports Debian-based and Redhat-based systems, detected via /etc/debian_version or /etc/redhat-release."
    exit 1
fi

########################################

# Setting up authbind to allow the cowrie user to bind to a privileged port
#print_notification "Configuring Authbind"
#touch /etc/authbind/byport/22 &>> $logfile
#chown cowrie:cowrie /etc/authbind/byport/22 &>> $logfile
chown -R cowrie:cowrie /opt/cowrie &>> $logfile
cd /opt/cowrie/
#sed -i "s,twistd -y kippo.tac -l log/kippo.log --pidfile kippo.pid,authbind --deep twistd -y kippo.tac -l log/kippo.log --pidfile kippo.pid," start.sh &>> $logfile
sudo -u cowrie bin/cowrie start &>> $logfile
error_check 'Cowrie startup'
#print_notification "Authbind Configured to use Port 22"

########################################

# Installing Splunk Universal Forwarder and setting it to persist on reboot

print_notification "Installing Splunk Universal Forwarder.."
cd /opt
tar -xzf $INSTALL_FILE &>> $logfile
chown -R splunk:splunk splunkforwarder &>> $logfile
sudo -u splunk /opt/splunkforwarder/bin/splunk start --accept-license --answer-yes --auto-ports --no-prompt &>> $logfile
error_check 'Universal Forwarder Configuration'
/opt/splunkforwarder/bin/splunk enable boot-start -user splunk &>> $logfile
error_check 'Universal Forwarder Install' 

########################################

#Check to see if the user tried to execute uf_only outside of the Tango directory. Yell at them if they did. 
# Grab tango_input from the Tango directory (if it's there), configure inputs.conf, start up the forwarder. We done here.

print_notification "Installing tango_input.."

if [ ! -d "$execdir/tango_input" ]; then
        print_error "Unable to find the tango_input directory in $execdir. tango_input should be in the same directory as this script. Please correct this and run the script again."
        exit 1
else
    cp -r "$execdir/tango_input" /opt/splunkforwarder/etc/apps &>> $logfile
fi

print_notification "Configuring /opt/splunkforwarder/etc/apps/tango_input/default/inputs.conf and outputs.conf.."

cd /opt/splunkforwarder/etc/apps/tango_input/default 
sed -i "s/test/$HOST_NAME/" inputs.conf &>> $logfile
sed -i "s/test/$SPLUNK_INDEXER/" outputs.conf &>> $logfile

chown -R splunk:splunk /opt/splunkforwarder &>> $logfile
/opt/splunkforwarder/bin/splunk restart &>> $logfile
error_check 'Tango_input installation'
# cowrie.json is not created until cowrie logs its first event, so guard the chmod; 644 is enough for the splunk user to read it
[ -f /opt/cowrie/var/log/cowrie.json ] && chmod 644 /opt/cowrie/var/log/cowrie.json &>> $logfile

print_notification "If the location of your Cowrie log files changes or the hostname/IP of the indexer changes, you will need to modify /opt/splunkforwarder/etc/apps/tango_input/default/inputs.conf and outputs.conf respectively."

print_good "Install completed. The Splunk forwarder should now be sending data to your indexer. The log file is located at /var/log/tango_install.log"

exit 0
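
For reference, after the two `sed` substitutions the tango_input app's configuration files contain stanzas along these lines. This is an illustrative sketch only: the monitor path matches the layout sensor.sh installs, but the example hostname, indexer address, and stanza details are assumptions, so check the actual copies shipped with tango_input.

```ini
# inputs.conf -- monitor cowrie's JSON log; "sensor-01" stands in for
# the value the script substitutes from $HOST_NAME
[monitor:///opt/cowrie/var/log/cowrie.json]
host = sensor-01

# outputs.conf -- forward events to the indexer; "10.0.0.5" stands in
# for the value substituted from $SPLUNK_INDEXER (9997 is Splunk's
# default receiving port)
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = 10.0.0.5:9997
```

This is why the `sed` commands above replace the literal string "test": it is the placeholder value shipped in both files.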

Cowrie not producing logs

I have tried a few fresh installs of Tango, but Cowrie is not producing logs. During the install I can see it fail to chmod the log file because the file does not exist. Even after performing some test logins against the honeypot, it still does not appear to log.

Has anyone else had this issue?

Cowrie: ERROR: You must not run cowrie as root!

Hi guys,

I'm trying to run the start.sh bash file in the cowrie directory, but the terminal responds:

Starting cowrie with extra arguments [ ] ...
ERROR: You must not run cowrie as root!

Now, when I switch to the newly added user "test", which has no root permissions, I get this error:

Starting cowrie with extra arguments [ ] ...
Unhandled Error
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/twisted/application/app.py", line 648, in run
    runApp(config)
  File "/usr/local/lib/python2.7/dist-packages/twisted/scripts/twistd.py", line 25, in runApp
    _SomeApplicationRunner(config).run()
  File "/usr/local/lib/python2.7/dist-packages/twisted/application/app.py", line 379, in run
    self.application = self.createOrGetApplication()
  File "/usr/local/lib/python2.7/dist-packages/twisted/application/app.py", line 439, in createOrGetApplication
    ser = plg.makeService(self.config.subOptions)
--- <exception caught here> ---
  File "/home/cowrie/twisted/plugins/cowrie_plugin.py", line 142, in makeService
    globals(), locals(), ['output']).Output(cfg)
  File "/home/cowrie/cowrie/output/jsonlog.py", line 50, in __init__
    self.outfile = twisted.python.logfile.DailyLogFile(base, dirs, defaultMode=0o664)
  File "/usr/local/lib/python2.7/dist-packages/twisted/python/logfile.py", line 42, in __init__
    self._openFile()
  File "/usr/local/lib/python2.7/dist-packages/twisted/python/logfile.py", line 252, in _openFile
    BaseLogFile._openFile(self)
  File "/usr/local/lib/python2.7/dist-packages/twisted/python/logfile.py", line 85, in _openFile
    self._file = open(self.path, "w+", 1)
exceptions.IOError: [Errno 13] Permission denied: u'log/cowrie.json'

Traceback (most recent call last):
  File "/usr/local/bin/twistd", line 11, in <module>
    sys.exit(run())
  File "/usr/local/lib/python2.7/dist-packages/twisted/scripts/twistd.py", line 29, in run
    app.run(runApp, ServerOptions)
  File "/usr/local/lib/python2.7/dist-packages/twisted/application/app.py", line 648, in run
    runApp(config)
  File "/usr/local/lib/python2.7/dist-packages/twisted/scripts/twistd.py", line 25, in runApp
    _SomeApplicationRunner(config).run()
  File "/usr/local/lib/python2.7/dist-packages/twisted/application/app.py", line 379, in run
    self.application = self.createOrGetApplication()
  File "/usr/local/lib/python2.7/dist-packages/twisted/application/app.py", line 439, in createOrGetApplication
    ser = plg.makeService(self.config.subOptions)
  File "/home/cowrie/twisted/plugins/cowrie_plugin.py", line 160, in makeService
    factory.portal = portal.Portal(core.realm.HoneyPotRealm(cfg))
  File "/home/cowrie/cowrie/core/realm.py", line 61, in __init__
    self.pckl = pickle.load(file(cfg.get('honeypot', 'filesystem_file'), 'rb'))
  File "/usr/lib/python2.7/pickle.py", line 1378, in load
    return Unpickler(file).load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 880, in load_eof
    raise EOFError
EOFError

What next should I do?
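
The second traceback is the telling one: `pickle.load` is reading the file named by the `filesystem_file` setting in cowrie.cfg (typically `data/fs.pickle`), and an empty or truncated file raises exactly this `EOFError`. The first error is simpler: the unprivileged user cannot write to the `log/` directory, so its ownership needs to be changed to that user. The pickle failure is easy to reproduce with a quick sketch (this assumes `python3` is on the PATH; the temp file stands in for a corrupt fs.pickle):

```shell
# Reproduce the EOFError above: unpickling a zero-byte file fails the
# same way cowrie's HoneyPotRealm fails on an empty fs.pickle.
tmpfile=$(mktemp)   # stands in for an empty/corrupt data/fs.pickle
python3 -c 'import pickle, sys; pickle.load(open(sys.argv[1], "rb"))' "$tmpfile" 2>&1 \
    | grep -q EOFError && echo "empty pickle raises EOFError, as in the traceback"
rm -f "$tmpfile"
```

If `ls -l data/fs.pickle` in your cowrie directory shows a zero-byte file, restoring that file from a fresh clone of the cowrie repository should get past this error.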
