tradytics / surpriver

Find big moving stocks before they move using machine learning and anomaly detection

Home Page: https://www.tradytics.com/

License: GNU General Public License v3.0

Languages: Python 99.75%, Batchfile 0.15%, Shell 0.10%
Topics: machine-learning, finance-application, trading, trading-algorithms, algotrading, anomaly-detection, ai, investment, stock-analysis, stock

surpriver's Issues

Volume*Price instead of just volume filtering

Hello, I really like what you have done here and I've been messing around with it a bit. However, it is a bit limiting that you can only filter on volume and not on volume * price. For example, 100k volume in a $2 stock over a 5-day period amounts to $200,000 traded, whereas 10 shares of Berkshire Hathaway A amount to $3,000,000 traded in a single day. With the current filter, Berkshire Hathaway A is filtered out and the penny stock is kept.
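
A dollar-volume filter along these lines would keep Berkshire Hathaway A while dropping thinly traded penny stocks. This is only a minimal sketch, not the project's code; the function name, the 5-day window and the 1,000,000 dollar threshold are illustrative assumptions:

import yfinance as yf

def passes_dollar_volume_filter(symbol, min_dollar_volume=1_000_000):
    # Average dollar volume (shares traded * closing price) per bar over a 5-day window.
    history = yf.Ticker(symbol).history(period="5d", interval="1d")
    if history.empty:
        return False
    average_dollar_volume = (history["Volume"] * history["Close"]).mean()
    return average_dollar_volume >= min_dollar_volume

# Keeps BRK-A (tiny share volume, huge dollar volume) and drops illiquid penny stocks.
print(passes_dollar_volume_filter("BRK-A"))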

Integration with Ray?

Hi, I found this project on the GitHub trending chart. Exciting work! 😀

I wonder whether data volume and processing speed have become a consideration for the project. If so, maybe we can work together to integrate it with the Ray project (which I am part of). This would potentially allow the program to run on multiple cores and multiple machines.
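
A minimal sketch of what a Ray integration could look like for the download step (the ticker list, period and interval are placeholders, and this is not surpriver's code):

import ray
import yfinance as yf

ray.init()  # on a cluster: ray.init(address="auto")

@ray.remote
def fetch_history(symbol):
    # Each task fetches one ticker's history in its own worker process.
    return symbol, yf.Ticker(symbol).history(period="1mo", interval="60m")

symbols = ["AAPL", "MSFT", "TSLA"]  # placeholder list; surpriver loads thousands of tickers from file
histories = dict(ray.get([fetch_history.remote(s) for s in symbols]))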

Process stopped with an error after fetching

I get the same error whether I run from Docker or from source.

100%|████████████████████████████████████████| 5693/5693 [10:14<00:00, 9.27it/s]
Traceback (most recent call last):
  File "detection_engine.py", line 357, in <module>
    supriver.find_anomalies()
  File "detection_engine.py", line 196, in find_anomalies
    features, historical_price_info, future_prices, symbol_names = self.dataEngine.collect_data_for_all_tickers()
  File "/usr/src/app/data_loader.py", line 211, in collect_data_for_all_tickers
    features, historical_price_info, future_price_info, symbol_names = self.remove_bad_data(features, historical_price_info, future_price_info, symbol_names)
  File "/usr/src/app/data_loader.py", line 249, in remove_bad_data
    most_common_length = length_dictionary[0]
IndexError: list index out of range
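
For what it's worth, this IndexError is what indexing an empty list looks like, i.e. no ticker survived the data-quality checks (for example because every download returned nothing). A hypothetical illustration of the failure mode, not the project's actual remove_bad_data code:

import collections

price_series_lengths = []  # empty because no ticker returned usable data
most_common = collections.Counter(price_series_lengths).most_common(1)
most_common_length = most_common[0]  # IndexError: list index out of range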

Error while processing

Hello, I keep getting the following error while the script is running.

Exception __init__() got an unexpected keyword argument 'n'

Then, when finished, it gives this error:

Exception in thread Thread-2123:
Traceback (most recent call last):
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/multitasking/__init__.py", line 102, in _run_via_pool
    return callee(*args, **kwargs)
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/yfinance/multi.py", line 167, in _download_one_threaded
    actions, period, interval, prepost, proxy, rounding)
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/yfinance/multi.py", line 182, in _download_one
    rounding=rounding, many=True)
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/yfinance/base.py", line 156, in history
    data = data.json()
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/site-packages/requests/models.py", line 900, in json
    return complexjson.loads(self.text, **kwargs)
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/json/__init__.py", line 348, in loads
    return _default_decoder.decode(s)
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/home/sage/anaconda3/envs/Cerebus/lib/python3.7/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

I'm running it on Anaconda and all of the requirements are installed.
I've also tried this, in case it was a Python version issue.

python3 detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 1 --future_bars 25

I also found these in detection_engine.py; the imports marked below are the ones that are not working:

# Basic libraries
import os
import ta                                    # not working
import sys                                   # not working
import json
import math                                  # not working
import pickle                                # not working
import random                                # not working
import requests                              # not working
import collections
import numpy as np
from os import walk, path                    # walk not working
import pandas as pd                          # not working
import yfinance as yf                        # not working
import datetime as dt
from scipy.stats import linregress           # not working
from datetime import datetime, timedelta     # neither working
import matplotlib.pyplot as plt
from sklearn.ensemble import IsolationForest
from data_loader import DataEngine
import warnings
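
When this many imports fail, the interpreter running the script is usually not the environment where the requirements were installed. A quick, self-contained check; the package list below is an illustrative assumption, adjust it to requirements.txt:

import importlib

for name in ["ta", "pandas", "yfinance", "scipy", "sklearn", "matplotlib"]:
    try:
        importlib.import_module(name)
        print(name, "-> OK")
    except ImportError as error:
        print(name, "-> MISSING:", error)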

exec: "/usr/src/app/entry_point.sh": permission denied

Hi,

I am trying to run surpriver using Docker. I followed all the steps in the README but got stuck at the last step.
I am getting the following error when I run docker-compose up -d:

Starting surpriver ... error

ERROR: for surpriver  Cannot start service surpriver: OCI runtime create failed: container_linux.go:367: starting container process caused: exec: "/usr/src/app/entry_point.sh": permission denied: unknown

ERROR: for surpriver  Cannot start service surpriver: OCI runtime create failed: container_linux.go:367: starting container process caused: exec: "/usr/src/app/entry_point.sh": permission denied: unknown
ERROR: Encountered errors while bringing up the project.

Did anyone face the same problem and manage to get past it? Please share if so :)

Information:
Using the latest commit as of today, 5241766.
System:
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 20.10
Release: 20.10
Codename: groovy

Same dictionary different results

Is there some sort of time decay in the score factor? If I run it now versus an hour from now using the same dictionary, I get different symbols, and the symbols that do appear in both runs have different scores. What's going on?

python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 1 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 0 --is_test 0 --future_bars 0 --output_format JSON

[latest_date] => 2020-09-03 15:30:00-04:00
[Symbol] => SHLO
[Anomaly Score] => -0.090577053896931
[Today Volume] => 191.8M
[Average Volume 5d] => 4.4M
[Average Volume 20d] => 1.6M
[Volatility 5bars] => 0.039062185261644
[Volatility 20bars] => 0.16062981405984

VS
[latest_date] => 2020-09-03 15:30:00-04:00
[Symbol] => SHLO
[Anomaly Score] => -0.089904823370707
[Today Volume] => 191.8M
[Average Volume 5d] => 4.4M
[Average Volume 20d] => 1.6M
[Volatility 5bars] => 0.039062185261644
[Volatility 20bars] => 0.16062981405984

This was number 4 the first time:
[latest_date] => 2020-09-03 15:30:00-04:00
[Symbol] => SBPH
[Anomaly Score] => -0.064138750572796
[Today Volume] => 178.03K
[Average Volume 5d] => 129.7K
[Average Volume 20d] => 102.2K
[Volatility 5bars] => 0.0067650775893322
[Volatility 20bars] => 0.090846754019149

7th the second time:
[latest_date] => 2020-09-03 15:30:00-04:00
[Symbol] => SBPH
[Anomaly Score] => -0.054286909962896
[Today Volume] => 178.03K
[Average Volume 5d] => 129.7K
[Average Volume 20d] => 102.2K
[Volatility 5bars] => 0.0067650775893322
[Volatility 20bars] => 0.090846754019149
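
One explanation consistent with this behaviour (an assumption, not confirmed against the code) is that the Isolation Forest is re-fit on every run without a fixed seed, so anomaly scores, and therefore rankings, drift slightly even on an identical dictionary. A minimal sketch of pinning the seed for reproducible scores:

import numpy as np
from sklearn.ensemble import IsolationForest

features = np.random.RandomState(0).rand(100, 8)  # stand-in for the real feature matrix

# With random_state fixed, repeated fits on the same features give identical scores.
model = IsolationForest(n_estimators=100, random_state=42)
scores = model.fit(features).decision_function(features)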

No file or directory: dictionaries/data_dict.npy

If you encounter this error, change the first command from this:
python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0

to this (the only change is removing the quotes around the dictionary path):
python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path dictionaries/data_dict.npy --is_save_dictionary 1 --is_test 0 --future_bars 0

How do I fix my JSON error when it doesn't show loading data for all stocks?

Hey everyone,

I've been using surpriver for a little while and this was the first time that this has happened. Thanks!

Surpriver has been initialized...
Data engine has been initialized...
Loading all stocks from file...
Total number of stocks: 5693
Technical Indicator Engine has been initialized
Loading data for all stocks...
  0%|                                                  | 0/5693 [00:00<?, ?it/s]
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.8/dist-packages/multitasking/__init__.py", line 102, in _run_via_pool
    return callee(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/yfinance/multi.py", line 166, in _download_one_threaded
    data = _download_one(ticker, start, end, auto_adjust, back_adjust,
  File "/usr/local/lib/python3.8/dist-packages/yfinance/multi.py", line 178, in _download_one
    return Ticker(ticker).history(period=period, interval=interval,
  File "/usr/local/lib/python3.8/dist-packages/yfinance/base.py", line 155, in history
    data = data.json()
  File "/usr/local/lib/python3.8/dist-packages/requests/models.py", line 898, in json
    return complexjson.loads(self.text, **kwargs)
  File "/usr/lib/python3/dist-packages/simplejson/__init__.py", line 518, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 370, in decode
    obj, end = self.raw_decode(s)
  File "/usr/lib/python3/dist-packages/simplejson/decoder.py", line 400, in raw_decode
    return self.scan_once(s, idx=_w(s, idx).end())
simplejson.errors.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
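
The JSONDecodeError means Yahoo returned something that is not JSON (typically an empty or HTML error response, e.g. when rate limited or when the endpoint changed). A quick way to check whether downloads work at all, outside of surpriver; the ticker, period and interval are placeholders:

import yfinance as yf

# If this also fails or returns an empty frame, the problem is between yfinance and Yahoo,
# not in surpriver itself; updating yfinance is usually the first thing to try.
data = yf.Ticker("AAPL").history(period="5d", interval="60m")
print(data.tail())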

an update of docker

Hi, I have forked your repo and added a feature so it can be built with Docker.
If you are interested, may I open a pull request?

Dependencies

It looks like I ran into a bunch of issues due to outdated versions of the requirements, and I couldn't resolve them. Is this system offered in your new platform?

requests.exceptions.ConnectionError

Hi,

I am going through the readme section to set up and run the package with docker.
I am getting the following error when I run docker-compose up -d:

Traceback (most recent call last):
  File "urllib3/connectionpool.py", line 677, in urlopen
  File "urllib3/connectionpool.py", line 392, in _make_request
  File "http/client.py", line 1277, in request
  File "http/client.py", line 1323, in _send_request
  File "http/client.py", line 1272, in endheaders
  File "http/client.py", line 1032, in _send_output
  File "http/client.py", line 972, in send
  File "docker/transport/unixconn.py", line 43, in connect
PermissionError: [Errno 13] Permission denied

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "requests/adapters.py", line 449, in send
  File "urllib3/connectionpool.py", line 727, in urlopen
  File "urllib3/util/retry.py", line 410, in increment
  File "urllib3/packages/six.py", line 734, in reraise
  File "urllib3/connectionpool.py", line 677, in urlopen
  File "urllib3/connectionpool.py", line 392, in _make_request
  File "http/client.py", line 1277, in request
  File "http/client.py", line 1323, in _send_request
  File "http/client.py", line 1272, in endheaders
  File "http/client.py", line 1032, in _send_output
  File "http/client.py", line 972, in send
  File "docker/transport/unixconn.py", line 43, in connect
urllib3.exceptions.ProtocolError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "docker/api/client.py", line 214, in _retrieve_server_version
  File "docker/api/daemon.py", line 181, in version
  File "docker/utils/decorators.py", line 46, in inner
  File "docker/api/client.py", line 237, in _get
  File "requests/sessions.py", line 543, in get
  File "requests/sessions.py", line 530, in request
  File "requests/sessions.py", line 643, in send
  File "requests/adapters.py", line 498, in send
requests.exceptions.ConnectionError: ('Connection aborted.', PermissionError(13, 'Permission denied'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "docker-compose", line 3, in <module>
  File "compose/cli/main.py", line 80, in main
  File "compose/cli/main.py", line 189, in perform_command
  File "compose/cli/command.py", line 70, in project_from_options
  File "compose/cli/command.py", line 153, in get_project
  File "compose/cli/docker_client.py", line 43, in get_client
  File "compose/cli/docker_client.py", line 170, in docker_client
  File "docker/api/client.py", line 197, in __init__
  File "docker/api/client.py", line 222, in _retrieve_server_version
docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', PermissionError(13, 'Permission denied'))
[194412] Failed to execute script docker-compose

My system is Ubuntu 20.04.2 LTS. Any recommendations on how to make it run? Thank you.

new

My name is Luis. I'm a big-data machine-learning developer, a fan of your work, and I regularly check your updates.

I was afraid that my savings would be eaten by inflation, so I created a powerful tool based on past technical patterns (volatility, moving averages, statistics, trends, candlesticks, support and resistance, stock index indicators): all the ones you know (RSI, MACD, STOCH, Bollinger Bands, SMA, DEMARK, Japanese candlesticks, Ichimoku, Fibonacci, Williams %R, balance of power, Murrey math, etc.) and more than 200 others.

The tool creates prediction models of correct trading points (buy and sell signals, so that every stock is traded well in time and direction). For this I have used big-data tools such as pandas, stock-market libraries such as tablib, TAcharts and pandas_ta for data collection and calculation, and machine-learning libraries such as Sklearn RandomForest, Sklearn GradientBoosting, XGBoost, Google TensorFlow and TensorFlow LSTM.

With models trained on a selection of the best technical indicators, the tool is able to predict trading points (where to buy, where to sell) and send real-time alerts to Telegram or mail. The points are calculated by learning from the correct trading points of the last 2 years (including the change to a bear market after the rate hike).

I think it could be useful to you. I would like to share it with you to improve it; if you are interested in improving and collaborating, I am willing as well, and if not, just file it away.

If you want, please read the readme, and in case of any problem you can contact me. If you are convinced, try to install it following the documentation: https://github.com/Leci37/LecTrade/tree/develop. I appreciate the feedback.

list index out of range

After running the below:

python detection_engine.py --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0

Here is the error message:

Traceback (most recent call last):
  File "detection_engine.py", line 357, in <module>
    supriver.find_anomalies()
  File "detection_engine.py", line 196, in find_anomalies
    features, historical_price_info, future_prices, symbol_names = self.dataEngine.collect_data_for_all_tickers()
  File "/Users/Home/Documents/Coding/surpriver-master/data_loader.py", line 211, in collect_data_for_all_tickers
    features, historical_price_info, future_price_info, symbol_names = self.remove_bad_data(features, historical_price_info, future_price_info, symbol_names)
  File "/Users/Home/Documents/Coding/surpriver-master/data_loader.py", line 249, in remove_bad_data
    most_common_length = length_dictionary[0]
IndexError: list index out of range

ImportError: No module named ta

:~/Downloads/surpriver-master$ sudo python detection_engine.py -v --top_n 25 --min_volume 5000 --data_granularity_minutes 60 --history_to_use 14 --is_load_from_dictionary 0 --data_dictionary_path 'dictionaries/data_dict.npy' --is_save_dictionary 1 --is_test 0 --future_bars 0
Traceback (most recent call last):
  File "detection_engine.py", line 3, in <module>
    import ta
ImportError: No module named ta
