datasig-ac-uk / esig


esig python package

Home Page: https://esig.readthedocs.io/en/latest/

License: GNU General Public License v3.0

Languages: Shell 1.64%, Dockerfile 0.26%, Python 49.36%, C 4.62%, C++ 40.52%, CMake 3.61%

Topics: hut23, hut23-171

esig's Introduction

esig

The Python package esig provides a toolset (previously called sigtools) for transforming vector time series in stream space to signatures in effect space. It is based on the libalgebra C++ library.


Installation

esig can be installed from a wheel using pip in most cases. The wheels contain all of the dependencies and thus make it easy to use the package. For example, on Python 3.8, you can install esig using the following console command:

python3.8 -m pip install esig

(You may need to tweak this command based on your platform, Python version, and preferences.)

esig can be compiled from source, but this is not advised. More information can be found in the documentation.

Basic usage

esig provides a collection of basic functions for computing the signature of a data stream given as a NumPy array. The stream2sig function computes the signature of a data stream up to a specified depth. For example, we can create a very simple data stream and compute its signature as follows.

import numpy as np
import esig

stream = np.array([
    [1.0, 1.0],
    [3.0, 4.0],
    [5.0, 2.0],
    [8.0, 6.0]
])
depth = 2

sig = esig.stream2sig(stream, depth) # compute the signature
print(sig) # prints "[1.0, 7.0, 5.0, 24.5, 19.0, 16.0, 12.5]"

The signature is returned as a flat NumPy array that contains the terms of the signature (which is fundamentally a higher-dimensional tensor) in degree order. The first element is always 1.0, which corresponds to the empty tensor key. In this case the dimension is 2 (given by the number of columns in the stream array), so the next two elements are the signature elements corresponding to the words (1) and (2); these are the depth 1 words. The final 4 elements are the depth 2 words (1,1), (1,2), (2,1), and (2,2). esig provides the sigkeys function to generate these labels for you based on the parameters of the data.

width = 2
sig_keys = esig.sigkeys(width, depth)
print(sig_keys) # prints " () (1) (2) (1,1) (1,2) (2,1) (2,2)"
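
As a quick sanity check of this layout, the following identities hold for any path and can be verified against the numbers above. This continues the example, reusing np, stream, and sig; it is an illustrative check, not part of the esig API.

increments = stream[-1] - stream[0]  # total change in each coordinate: [7.0, 5.0]
assert np.allclose(sig[1:3], increments)            # depth 1 terms are the increments
assert np.isclose(sig[3], increments[0] ** 2 / 2)   # (1,1) term: 7**2 / 2 = 24.5
assert np.isclose(sig[6], increments[1] ** 2 / 2)   # (2,2) term: 5**2 / 2 = 12.5
assert np.isclose(sig[4] + sig[5], increments[0] * increments[1])  # shuffle identity: 19 + 16 = 7 * 5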

To compute the log signature of a data stream, use the stream2logsig function. This works in a similar manner to stream2sig: it takes a NumPy array (the data) and a depth, and returns a flat NumPy array containing the elements of the log signature in degree order.

log_sig = esig.stream2logsig(stream, depth)
print(log_sig) # prints "[7.  5.  1.5]"

Here the first two elements are the depth 1 Lie elements (corresponding to the letters 1 and 2) and the third element is the coefficient of the Hall basis element [1,2]. Again, esig provides a utility function, logsigkeys, that returns the keys corresponding to the log signature coefficients in order.

log_sig_keys = esig.logsigkeys(width, depth)
print(log_sig_keys) # prints " 1 2 [1,2]"
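
Relating this back to the full signature: at depth 2, the [1,2] coefficient is the antisymmetric part of the (1,2) and (2,1) signature terms, i.e. the Lévy area of the path. Continuing the same example (an illustrative check, not part of the esig API):

assert np.isclose(log_sig[2], (sig[4] - sig[5]) / 2)  # (19.0 - 16.0) / 2 = 1.5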

There are two additional utility functions for computing the size of a signature or log signature with a specified dimension and depth: sigdim and logsigdim. These functions return an integer that is the length of the NumPy array returned by stream2sig or stream2logsig, respectively.
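
For the running example (width 2, depth 2) these sizes match the array lengths printed above; a short check, assuming the same (width, depth) argument order as sigkeys and logsigkeys:

print(esig.sigdim(width, depth))     # prints "7": 1 + 2 + 4 terms, the length of sig
print(esig.logsigdim(width, depth))  # prints "3": 2 letters plus [1,2], the length of log_sig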

esig also provides another function, recombine, which performs a reduction of a measure defined on a large ensemble so that the resulting measure has the same total mass but is supported on a (relatively) small subset of the original ensemble. In particular, the expected value over the ensemble with respect to the new measure agrees with that of the original measure.
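
To make that guarantee concrete, here is a small NumPy-only illustration of the property being described. It does not call recombine itself; the reduced support and weights are chosen by hand purely to show what "same mass, same expectation, smaller support" means:

points = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # an ensemble of 5 points
weights = np.full(5, 0.2)                      # uniform measure, total mass 1.0

reduced_idx = np.array([0, 4])                 # support is a subset of the ensemble
reduced_weights = np.array([0.5, 0.5])         # same total mass

assert np.isclose(weights.sum(), reduced_weights.sum())                     # mass preserved
assert np.isclose(points @ weights, points[reduced_idx] @ reduced_weights)  # expectation preserved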

Using alternative computation backends

By default, esig uses libalgebra as the backend for computing signatures and log signatures. However, the computation backend can be switched to an alternative library by using the set_backend function and providing the name of the backend that you wish to use. For example, we can switch to using the iisignature package as a backend by first installing the iisignature package and then running the command

import esig
esig.set_backend("iisignature")

To make it easier to install and use iisignature as a backend, it is offered as an optional extra when installing esig:

python3.8 -m pip install esig[iisignature]

You can also define your own backend for performing calculations by creating a class derived from esig.backends.BackendBase and implementing the describe_path (log signature) and signature methods, along with any related methods.
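
As a rough sketch of the shape such a class might take (the method names below follow the description above, but the exact interface required by BackendBase is an assumption; check esig.backends before relying on it):

from esig.backends import BackendBase

class MyBackend(BackendBase):
    # Hypothetical custom backend skeleton.

    def describe_path(self, stream, depth):
        # Return the log signature of `stream` up to `depth`.
        raise NotImplementedError

    def signature(self, stream, depth):
        # Return the signature of `stream` up to `depth`.
        raise NotImplementedError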

esig's People

Contributors

dependabot[bot], inakleinbottle, irustandi, nbarlowati, pafoster, rolyp, terrylyons


esig's Issues

Parallelise remaining Mac builds

Mac builds for Python 3.5-3.8 run as a single job, but without much work could be split into separate jobs, which would then run in parallel in GitHub. 2.7 and 3.4 already work like this.

(Re-)Dockerise Windows build

We dropped Docker for simplicity when moving the Windows build from Azure to GitHub Actions. It probably makes sense to containerise again because the Windows build is particularly slow. See Original Dockerfile.

To do:

  • get working inside container
    • docker run --rm -it -v ${PWD}:C:\data esig_builder_windows_64 and cd \data\build\Windows
    • .\windows-wheel-builder.ps1 14.1 64 C:\Users\runneradmin\AppData\Local\Programs\Python\Python35 3.5.4/python-3.5.4-amd64.exe
      • Python path (install target in windows-2019 GitHub VM) doesn’t exist in Docker image
        • windows-wheel-builder.ps1 to fail if $py_install_dir doesn’t exist
        • use C:\Users\ContainerAdministrator\AppData\Local\Programs\Python\Python35 instead
    • C:\data\build\recombine doesn't exist
      • add $ErrorActionPreference = "Stop" (fails fast because pushd is shorthand for a cmdlet)?
      • manually run git-preamble.sh
    • MSBuild.exe fails building recombine
      • download/install from built asset instead (see
      • BuildTools\MSBuild\Microsoft\VC\v160\Microsoft.Cpp.Default.props not found
      • error MSB8036: The Windows SDK version 8.1 was not found.
      • switch to Visual Studio 2017 (15.0) and install Windows 8.1 SDK component – recombine uses 2019

Done/dropped:

  • create Docker for Windows container using mcr.microsoft.com/windows/servercore:ltsc2019 base image
  • attempt to run rest of build-wheel.ps1 monolithically to start with – can I see trace output?
  • Invoke-WebRequest : Cannot process command because of one or more missing mandatory parameters: Uri. in doall-windows.ps1 – add .exe to curl
  • cmake not found
    • try .NET image ❌
    • install chocolatey and use to install cmake
  • The CMAKE_C_COMPILER: cl is not a full path and was not found in the PATH.
    • run C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat ❌ can’t find path
    • install VSBuildTools
      • iex command to download choco installer no longer works – ah, previous script resets escape to `
    • use ENTRYPOINT with VsDevCmd.bat
  • launch wheel build inside container
    • extract windows-wheel-builder.sh to do the bit that runs inside the container
    • correct volume specification
  • automation step to build Docker image inside Azure VM

Error installing esig 0.7.2 with pip

(Looks like @inakleinbottle fixed this in setup.py.)

  • treat recombine DLL as a data file that gets installed (into site_packages) along with the wheel
  • install esig into fresh Windows VM (make runtime dependencies explicit rather than inheriting from dev env)? – not high priority, might come back to this

Original bug report:

Hi,

I am having trouble installing esig 0.7.2 with the python package manager.

A note on my configuration first:
Python: Python 3.7.4
System: Windows Server 2016 64-Bit
Boost: I have downloaded and installed the precompiled binaries for the latest release of Boost (1.73.0) with compiler msvc 14.2

Once I installed esig with pip, I checked whether the library is loaded by running the command

>> import esig
>> esig.is_library_loaded()

And I get the following error message

False

I have also tried running the tests with the following command

>> from esig import tests

which results in

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\jdupin\Documents\Python\lib\site-packages\esig\tests\__init__.py", line 3, in <module>
    from . import test_package
  File "C:\Users\jdupin\Documents\Python\lib\site-packages\esig\tests\test_package.py", line 3, in <module>
    from esig import tosig as ts
ImportError: DLL load failed: The specified module could not be found.

Thanks.

recombine integration (32-bit)

Have commented out 32-bit Linux/Windows builds for now. Revisit when we have terrylyons/recombine#3.

  • Windows:
    • Python 3.5: test_recombine non-zero exit code – moved to #89
    • Python 3.6, 3.7
    • Python 3.8, 3.9: test_recombine non-zero exit code – moved to #89
      • setup.py shouldn’t add DLLs to eager_resources – apparently needed, harmless anyway
      • run Windows build in Azure VM for faster feedback ❌ conda install fails in Azure VM
    • Add Python 3.9 build (failing)
  • Linux: add 32-bit Python 3.9 build
  • push another PyPI release (0.9.0)
    • bump version number
    • update package documentation – stored in another repo
    • fix build badge (pointing to wrong branch?)

Done/dropped:

  • (16 Feb) verify 64-bit builds work with latest recombine
  • (17 Feb) 64-bit Windows builds have stopped working
  • Windows
    • work out what tests are actually being run (looks like unittest.TestLoader().discover being used to find tests)
    • report on what tests are being run
    • is run_tests in __init__.py running? no, python -m unittest discover -s /data/esig/tests is run directly
  • Linux
    • No package boost148-devel available in yum install – doesn’t seem to be a problem now
      • [guesswork] delete “linux32” entrypoint
    • virtual memory exhausted: Operation not permitted
      • bump Docker memory to 12GB ❌ doesn’t help
      • bump Docker memory to 16GB and swap to 4GB ❌ doesn’t help, fails at exactly the same point
      • reduce number of entries in _default_spec in switch_generator.py
        • drop just to first entry in list ✅ fixes memory problem
        • just comment one more entry (the penultimate (17,40) one) ✅
    • ValueError: Cannot repair wheel, because required library “libiomp5.so” could not be located
      • list contents of (value of) OMP_LIB_DIRECTORY just before auditwheel step ✅ libiomp5.so exists
      • explicitly prepend to LD_LIBRARY_PATH ❌ doesn’t help
      • fix use of symlinks in OMP_LIB_DIRECTORY in recombine manylinux2010 builds ❌ didn’t help
      • inspect contents of readelf -d ${HOME}/lyonstech/lib64/librecombine.so (see below)
      • eliminate .. from librecombine.so's rpath

Result of readelf:

Dynamic section at offset 0xf1a360 contains 33 entries:
  Tag        Type                         Name/Value
 0x0000000000000001 (NEEDED)             Shared library: [libiomp5.so]
 0x0000000000000001 (NEEDED)             Shared library: [libpthread.so.0]
 0x0000000000000001 (NEEDED)             Shared library: [libdl.so.2]
 0x0000000000000001 (NEEDED)             Shared library: [libgomp.so.1]
 0x0000000000000001 (NEEDED)             Shared library: [libstdc++.so.6]
 0x0000000000000001 (NEEDED)             Shared library: [libm.so.6]
 0x0000000000000001 (NEEDED)             Shared library: [libgcc_s.so.1]
 0x0000000000000001 (NEEDED)             Shared library: [libc.so.6]
 0x000000000000000e (SONAME)             Library soname: [librecombine.so]
 0x000000000000000f (RPATH)              Library rpath: [/opt/intel/mkl/../compiler/lib/intel64_lin]

pip install fails on Ubuntu 20.04

Submitted by Paul Moore ([email protected]). Could think about testing the pip install in an Ubuntu 20.04 VM (see #74).

Hi Sam,

I just got the error copied below when installing esig. It could be a
problem with my environment, but could you let me know if there is any
issue with pip install esig at the moment. It’s on Ubuntu 20.04.

All the best,

Paul.

paul@carbon:~$ workon dale
(dale) paul@carbon:~$ python -m pip install esig
Traceback (most recent call last):
  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/paul/.virtualenvs/dale/lib/python3.6/site-packages/pip/__main__.py", line 16, in <module>
    from pip._internal.main import main as _main  # isort:skip # noqa
  File "/home/paul/.virtualenvs/dale/lib/python3.6/site-packages/pip/_internal/main.py", line 13, in <module>
    from pip._internal.cli.autocompletion import autocomplete
  File "/home/paul/.virtualenvs/dale/lib/python3.6/site-packages/pip/_internal/cli/autocompletion.py", line 11, in <module>
    from pip._internal.cli.main_parser import create_main_parser
  File "/home/paul/.virtualenvs/dale/lib/python3.6/site-packages/pip/_internal/cli/main_parser.py", line 7, in <module>
    from pip._internal.cli import cmdoptions
  File "/home/paul/.virtualenvs/dale/lib/python3.6/site-packages/pip/_internal/cli/cmdoptions.py", line 19, in <module>
    from distutils.util import strtobool
  File "/home/paul/.virtualenvs/dale/lib/python3.6/distutils/__init__.py", line 44, in <module>
    from distutils import dist, sysconfig  # isort:skip
ImportError: cannot import name 'dist'

Simplify install_helpers.py

Strip all the extraneous cruft out of install_helpers.py. Started at 936 LOC; currently 227.

  • purge (obsolete) __author__, __date__ and __version__ metadata
  • remove silly self argument documentation
  • delete __run_checks()
    • remove (redundant) Python version-checking
    • remove (unused) sys.argv[1] == ‘install’ behaviour
  • package_abs_root shenanigans
    • setter not required – remove use in setup.py and initialise inside InstallationConfiguration.__init__
    • getter not required (unused)
  • get_platform shenanigans
    • inline get_platform
    • delete OTHER platform
    • purge Enum nonsense
  • delete message_printer debacle – only used in ComponentChecker
    • make MESSAGE_PREFIX local
  • delete (unused) ComponentChecker shenanigans
    • delete (unused) check_libraries insanity
  • include_dirs shenanigans
    • don’t even need InstallExtensionCommand
    • delete CPATH treatment
  • library_dirs shenanigans
    • delete library_dirs setter
    • delete (unused) __library_dirs field

Replace libalgebra files by submodule

  • identify baseline version of libalgebra inlined into esig
  • import baseline version of libalgebra as a submodule instead of directly including files
  • delete libalgebra files from src and add libalgebra to include dirs in install_helpers.py
  • get .tar.gz using setup.py sdist rather than by manual downloading step
  • pip wheel from subdirectory rather than .tar.gz? - see below
  • delete .egg stuff?
  • bump VERSION to 0.7.0
  • verify 32-bit Linux build
  • verify OSX build
  • extract “get esig sources” to shared script

Considered whether we can/should build the wheels directly from sources, rather than going via an intermediate source distribution. Concluded the latter is preferable as we are then building from the sources as shipped via PyPI.

CI for Windows build experiment

Experiment with GitHub Actions:

  • windows-2016 (but look here for some imminent lifecycle issues)
    • curl to download Visual Studio 2015
    • switch windows_wheel_maker from .bat to .ps1
    • check installs to C:\Program Files (x86)\Microsoft Visual Studio 14.0
      • how to make dir emit to console from batch script? – not needed in PowerShell
    • migrate steps from Dockerfile to windows_wheel_maker.ps1:
      • 64-bit and 32-bit boost libraries for VS 14.0
        • migrate wget commands to curl
        • add -L to curl commands to follow redirections
        • install
        • Move-Item to boost directory
      • Python 3.5.4 installation
      • set path correctly to pick up 3.5.4
        • find out where 3.5.4 installed
      • pip install [numpy, wheel, delocate, setuptools]
      • can’t find libalgebra/libalgebra.h
        • git submodule update fails on Windows – actually fine; just reinstate
      • boost headers not found
        • use BOOST_ROOT variable internally
        • set $ENV:BOOST_ROOT
        • add trailing slash
        • use absolute path as per Nick’s original script
        • oops, also need to download and unzip boost sources
          • 403 Forbidden trying to download sources from boostorg
          • use SourceForge page instead
      • link error with libboost_thread-vc141-mt-x64-1_68.lib – switch to VS 14.1 boost libs
      • problem with numpy/cythonize.py
      • install virtualenv
      • test
  • for now enable Windows CI only for windows-build branch
  • remove del *.whl (fails when there are no wheels, and not needed anyway)
  • SSH login to Windows VM on GitHub Actions – not possible

Experiment with Travis

  • enable GitHub integration
  • build won’t rerun on commit – doesn’t tell you if .travis.yml doesn’t parse

Experiment with CircleCI

  • enable GitHub integration - request access from alan-turing-institute owners
  • initial Windows Server 2019 image building on windows-build only
  • extract git submodule preamble to top-level build script
  • fix ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate error (need to install certificates?)

Windows build working on Azure esig_builder

Get build running on our Azure VM.

  • rename data folder to esig and mount from root of esig package
  • fix pip wheel command to build from root of esig package
  • build wheel:
    • No module named pyparsing building wheel
    • can’t find Visual Studio C++ 14.0
      • where is the wheel-building step looking for Visual Studio & what is the binary?
      • looks like vs_buildtools.exe is failing silently (see here and here)
      • investigate problem with vs_buildtools.exe and microsoft/windowsservercore:10.0.14393.1593 - poss. unrelated (Visual Studio Build Tools 2017 not 2019)
      • check VS version numbers – looks like I’m downloading “latest” vs_buildtools (Visual Studio 2019), whereas Visual Studio 2015 is required for 14.0
  • install Visual Studio 2015
    • ADD to Docker container
    • manually install (command-line options) and check in C:\Program Files (x86)\Microsoft Visual Studio 14.0 – seems to silently fail, as with vs_buildtools.exe
    • investigate this problem with Visual Studio 2017 Build Tools
  • get logging working for vs_buildtools.exe
    • locate manual URL for Collect.exe
    • download into Docker container
    • run Collect.exe – fails with Program ‘Collect.exe’ failed to run: The specified executable is not a valid application for this OS platform.
      • run as Administrator:
        • hostname to get computer name (a9b2247ed255)
        • net user to find out name of administrator user (Administrator)
        • runas /user:"a9b2247ed255\Administrator" .\Collect.exe – don't know Administrator password for Docker container (and should be running with admin rights anyway?)
  • remove use of sdist in Linux build
  • reinstate 2.7 and 3.4 support in Docker file

Done/dropped:

  • try building with Nick’s prebuilt images
    • fix/diagnose filesystem layer verification failed for digest error
  • subscription expired for Azure VM
    • request Azure subscription for esig
    • cost estimate ($80/mo)
    • migrate existing VM
      • raise Azure subscription request for REG to reenable (otherwise move won’t work)
      • move VM and all associated resources to build resource group of new subscription
      • start VM – failed (see error in comment below)
        • check with Oscar – mentioned possible availability issue with Azure
        • run diagnostics (see PDF below) – seems to support Oscar’s conjecture
        • try again later (conjecture verified)
      • check I can connect ok
    • create VM with same spec as before (Windows Server 2019 Datacenter with Containers)
  • Drop Python 2.7 and 3.4 support from Docker file:
    • installation of Python 2.7 (64- and 32-bit)
    • installation of VS C++ compiler for python 2.7
    • anything to do with MSVC 9.0

Problem building esig on manylinux 64x2010 build

Hi,

I am unable to build esig on a manylinux64x2010 system. It says there is a problem importing the _recombine module.

I have pasted the error below.

Is there an issue here that can be fixed?

src/C_tosig.c:16:10: fatal error: _recombine.h: No such file or directory
   #include "_recombine.h"
            ^~~~~~~~~~~~~~
  compilation terminated.
  error: command 'gcc' failed with exit status 1
  ----------------------------------------
  ERROR: Failed building wheel for esig
  Running setup.py clean for esig
  Building wheel for pyyaml (setup.py): started
  Building wheel for pyyaml (setup.py): finished with status 'done'
  Created wheel for pyyaml: filename=PyYAML-5.3.1-cp36-cp36m-linux_x86_64.whl size=44619 sha256=f40b91b3c43e73883aee35350154ac0b14002e89c2ca55ad735a05cc694285d0
  Stored in directory: /root/.cache/pip/wheels/e5/9d/ad/2ee53cf262cba1ffd8afe1487eef788ea3f260b7e6232a80fc
  Building wheel for contextvars (setup.py): started
  Building wheel for contextvars (setup.py): finished with status 'done'
  Created wheel for contextvars: filename=contextvars-2.4-py3-none-any.whl size=7664 sha256=5fa0ad18194c801762fe732f1ba7e9955e7b6a981bbc930125bd6b65b854cd3b
  Stored in directory: /root/.cache/pip/wheels/41/11/53/911724983aa48deb94792432e14e518447212dd6c5477d49d3
  Building wheel for psutil (setup.py): started
  Building wheel for psutil (setup.py): finished with status 'done'
  Created wheel for psutil: filename=psutil-5.7.3-cp36-cp36m-linux_x86_64.whl size=285183 sha256=042b20793fd4d24e15cbedb2a48c8a56efe7ffd37aec6e1d76d3c3c7e3b65289
  Stored in directory: /root/.cache/pip/wheels/fa/ad/67/90bbaacdcfe970960dd5158397f23a6579b51d853720d7856d
  Building wheel for locket (setup.py): started
  Building wheel for locket (setup.py): finished with status 'done'
  Created wheel for locket: filename=locket-0.2.0-py3-none-any.whl size=4039 sha256=5edb32449913743f9201834b71b6ff7c1a3b7dc5eb7251f8f00307a5185ec88f
  Stored in directory: /root/.cache/pip/wheels/21/ca/95/1e41f9a9a7a06ba06874ad2bf0eac88e7ba02fe9a6ff26c77d
Successfully built catch22 pyyaml contextvars psutil locket
Failed to build esig
Installing collected packages: catch22, cython, numpy, esig, llvmlite, numba, six, python-dateutil, pytz, pandas, scipy, urllib3, patsy, statsmodels, threadpoolctl, joblib, scikit-learn, pmdarima, pyparsing, packaging, py, zipp, importlib-metadata, toml, pluggy, attrs, iniconfig, pytest, coverage, pytest-cov, cycler, kiwisolver, pillow, matplotlib, seaborn, scikit-posthocs, pyyaml, toolz, locket, partd, fsspec, dask, tqdm, chardet, certifi, idna, requests, immutables, contextvars, msgpack, sortedcontainers, heapdict, zict, click, tornado, cloudpickle, psutil, tblib, distributed, tsfresh
    Running setup.py install for esig: started
    Running setup.py install for esig: finished with status 'error'
    ERROR: Command errored out with exit status 1:
     command: /opt/_internal/cpython-3.6.12/bin/python3.6 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-ele2643x/esig/setup.py'"'"'; __file__='"'"'/tmp/pip-install-ele2643x/esig/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-4rzp71_e/install-record.txt --single-version-externally-managed --compile --install-headers /opt/_internal/cpython-3.6.12/include/python3.6m/esig
         cwd: /tmp/pip-install-ele2643x/esig/
    Complete output (18 lines):
    esig_install> Starting esig installer...
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-ele2643x/esig/setup.py", line 2, in <module>
        from tools import install_helpers as helpers
      File "/tmp/pip-install-ele2643x/esig/tools/install_helpers.py", line 935, in <module>
        CONFIGURATION = InstallationConfiguration()
      File "/tmp/pip-install-ele2643x/esig/tools/install_helpers.py", line 569, in __init__
        self.__run_checks()
      File "/tmp/pip-install-ele2643x/esig/tools/install_helpers.py", line 601, in __run_checks
        ComponentChecker.check_libraries(self.library_dirs)
      File "/tmp/pip-install-ele2643x/esig/tools/install_helpers.py", line 287, in check_libraries
        message_printer(missing_libraries_message, is_failure=True, terminate=True)
      File "/tmp/pip-install-ele2643x/esig/tools/install_helpers.py", line 115, in message_printer
        error_message=get_error_message())
      File "/tmp/pip-install-ele2643x/esig/tools/install_helpers.py", line 59, in get_error_message
        f = open(filename, 'r')
    FileNotFoundError: [Errno 2] No such file or directory: '/tmp/pip-install-ele2643x/esig/tools/ERROR_MESSAGE'
    ----------------------------------------

Python 3.4 and 3.7 broken in Linux build

Python 3.4 no longer seems to be included out-of-the-box with our manylinux1 base images. Removed from python_versions.txt for now.

The build also now fails with Python 3.7:
ERROR: auditwheel 3.1.0 has requirement wheel==0.34.1, but you’ll have wheel 0.31.1 which is incompatible.

  • try rolling back to tag 2020-01-31-d8fa357
  • reinstate 3.4 in python_versions.txt

Error building esig 0.7.2 from source

Hi,

I am having trouble building esig 0.7.2 from source

A note on my configuration first:
Python: Python 3.7.4
System: Windows Server 2016 64-Bit
Boost: I have downloaded and installed the precompiled binaries for the latest release of Boost (1.73.0) with compiler msvc 14.2

Once I downloaded and extracted the source from pypi.org, I ran the following command into my shell within the extracted esig directory

python setup.py install

And got the following error trace

c:\Users\jdupin\Documents\Python\lib\site-packages\setuptools\distutils_patch.py:26: UserWarning: Distutils was imported before Setuptools. This usage is discouraged and may exhibit undesirable behaviors or errors. Please use Setuptools' objects directly or at least import Setuptools first.
  "Distutils was imported before Setuptools. This usage is discouraged "
Ignoring attempt to add_dll_directory.
esig_install> Starting esig installer...
Traceback (most recent call last):
  File "setup.py", line 2, in <module>
    from esig import install_helpers as helpers
  File "n:\esig-0.7.2\esig\install_helpers.py", line 932, in <module>
    CONFIGURATION = InstallationConfiguration()
  File "n:\esig-0.7.2\esig\install_helpers.py", line 576, in __init__
    self.__run_checks()
  File "n:\esig-0.7.2\esig\install_helpers.py", line 608, in __run_checks
    ComponentChecker.check_libraries(self.library_dirs)
  File "n:\esig-0.7.2\esig\install_helpers.py", line 774, in library_dirs
    raise "MKLROOT not defined."
TypeError: exceptions must derive from BaseException

I wasn't sure why the MKLROOT environment variable is required (do I need to install the Intel Math Kernel Library? this is not specified anywhere in the online documentation), so I decided to comment out the following part of the code (including the expanduser part, since this seems to have hardcoded values).

            if not('MKLROOT' in os.environ):
                raise "MKLROOT not defined."
            # not sure why this is only needed on Windows
            return_list.append(os.path.join(os.environ['MKLROOT'], "lib", "intel64"))
            # todo: lose hardcoded knowledge of recombine installation dir
            from os.path import expanduser
            recombine_lib_dir = os.path.join(expanduser("~"), "lyonstech", "lib")
            os.listdir(recombine_lib_dir)
            return_list.append(recombine_lib_dir)

Running the setup.py script again now results in the following

Ignoring attempt to add_dll_directory.
esig_install> Starting esig installer...
esig_install> Required libraries found! Hurrah!
esig_install> Include files successfully found.
running install
running bdist_egg
running egg_info
writing esig.egg-info\PKG-INFO
writing dependency_links to esig.egg-info\dependency_links.txt
writing requirements to esig.egg-info\requires.txt
writing top-level names to esig.egg-info\top_level.txt
reading manifest file 'esig.egg-info\SOURCES.txt'
reading manifest template 'MANIFEST.in'
c:\Users\jdupin\Documents\Python\lib\site-packages\setuptools\distutils_patch.py:26: UserWarning: Distutils was imported before Setuptools. This usage is discouraged and may exhibit undesirable behaviors or errors. Please use Setuptools' objects directly or at least import Setuptools first.
  "Distutils was imported before Setuptools. This usage is discouraged "
warning: no previously-included files found matching '.DS_Store'
writing manifest file 'esig.egg-info\SOURCES.txt'
installing library code to build\bdist.win-amd64\egg
running install_lib
running build_py
copying esig\install_helpers.py -> build\lib.win-amd64-3.7\esig
running build_ext
esig_install> Running extra esig pre-build commands...
building 'esig.tosig' extension
C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\VC\Tools\MSVC\14.26.28801\bin\HostX86\x64\cl.exe /c /nologo
 /Ox /W3 /GL /DNDEBUG /MD -Ic:\Users\jdupin\Documents\Python\include -I.\src\ -I.\libalgebra\ -I.\recombine\ -
I.\build\recombine\recombine\ -IC:\local\boost_1_73_0_14_0 -Ic:\Users\jdupin\Documents\Python\include -
Ic:\Users\jdupin\Documents\Python\include -Ic:\Users\jdupin\Documents\Python\lib\site-packages\numpy\core\include "-
IC:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\VC\Tools\MSVC\14.26.28801\include" "-IC:\Program Files 
(x86)\Windows Kits\NETFXSDK\4.8\include\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\ucrt" "-
IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\shared" "-IC:\Program Files (x86)\Windows 
Kits\10\include\10.0.18362.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.18362.0\winrt" "-IC:\Program Files 
(x86)\Windows Kits\10\include\10.0.18362.0\cppwinrt" /Tcsrc/C_tosig.c /Fobuild\temp.win-amd64-3.7\Release\src/C_tosig.obj 
/EHsc /DWINVER=0x0601 /D_WIN32_WINNT=0x0601 /D_SCL_SECURE_NO_WARNINGS /bigobj
C_tosig.c
src/C_tosig.c(16): fatal error C1083: Cannot open include file: '_recombine.h': No such file or directory
error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\BuildTools\\VC\\Tools\\MSVC\\14.26.28801\\bin\\HostX86\\x64\\cl.exe' failed with exit status 2

I cannot find a '_recombine.h' file in the recombine folder.
Note that I have also tried directly using the most recent commit from the esig GitHub repo, which does include a _recombine.h file. The install fails again, this time with the script also looking for a /recombine/recombine.h file, which doesn't seem to be present.

Thanks

Validity of pulse reflections: Implement additive signal model

The model assumes that a pulse is reflected off either the drone's body or the drone's propeller, and that it is possible to align these reflections (separately) with the incident signal. A more realistic approach should combine the body and propeller reflections, instead of considering them separately.

Remaining Windows builds as GitHub Actions

After successful #21 with GitHub Actions, and failure at #12, have decided to migrate the rest of the build to GitHub Actions (perhaps migrating to Travis or CircleCI at some point if they prove better for debugging).

Misc:

  • reuse git-preamble.sh
  • pull .\git-preamble.sh into PowerShell script
  • cd build\Windows in as well?
  • parameterise 3.5 install to allow for 32-bit
    • basic parameterisation
    • still trying to link against libboost_thread-vc141-mt-x64-1_68.lib
      • 32-bit Python not installing properly
        • mention Python executable directly instead of setting PATH
          • use Invoke-Expression to run command via variable
        • enforce expected Python version – not needed if we invoke explicitly
        • folder C:\Users\runneradmin\AppData\Local\Programs\Python\Python35_32 missing
          • check parent Python folder – should be 35-32 not 35_32
  • windows-2019 Windows Server VM
    • curl to download Visual Studio 2015 fails – check again
  • windows-2016 (but look here for some imminent lifecycle issues)
    • try omitting VS 2015 installation
    • omit Python 3.5.4 setup from build-Windows.yml
  • enable on master

Versions still to do (but note 3.4 not implemented by Nick):

  • 2.7, 32-bit
  • 2.7, 64-bit
    • basic configuration
    • install Visual Studio 9.0
  • 3.5, 32-bit
  • 3.6, 32-bit
  • 3.6, 64-bit
  • 3.7, 32-bit
  • 3.7, 64-bit
  • 3.8, 32-bit
  • 3.8, 64-bit

OSX not building on Python 3.4

Extract new task to track 3.4 build challenges.

  • install Python 3.4:
    • via homebrew-python ❌ - 3.4 no longer exists
    • via Anaconda ❌ (see #23 for notes, not much investigation so far)
    • in GitHub Mac VM via setup-python@v1 ❌ – 3.4 doesn’t exist
    • via pyenv install ❌ – won’t compile from sources (see #23 for notes, much investigation)
    • via MacPorts
  • manually verify variant of build-27-wheels.sh with 3.4 (MacPorts):
    • sudo port install py34-pip
    • pip install steps fail with macports Could not install packages due to an EnvironmentError: [Errno 13] Permission denied – need to run as sudo
      • experiment with manually running those steps as sudo
      • pip wheel fails because boost headers not found
        • sudo port install boost (brings in Python 3.7, so probably wrong version..)
  • generalise build-27-wheels.sh so also works with 3.4 (MacPorts):
    • download MacPorts into macos-10.15 vm
    • install MacPorts for 10.14 from command line
      • conditionally install either 10.14 or 10.15, depending on OS version
    • install Python 3.4 via MacPorts
    • verify GitHub runner doesn't require sudo password
    • parameterise pip install steps on sudo
    • sudo port install boost, py34-pip
    • extract mac_wheel_builder-27-34.sh
    • sudo port contents python34 on GitHub build to see where MacPorts installs 3.4
    • grep location from output of sudo port contents python34 – not needed, seems to be universally installed to /opt/local/bin/python3.4
    • factor out commonality between 2.7 and 3.4 – not worth the effort
    • enforce Python versions are as expected (2.7 and 3.4) before building wheel
    • only check major/minor versions when enforcing Python versions – fix later if needed
  • further consolidation of new 2.7 build script:
    • rename to remove 2.7 from name
    • consolidate new job into build-OSX.yml so all Mac builds must succeed together
      • keep as separate jobs, add dummy final job with other jobs as dependents
      • turn into steps of a single job – less parallel
      • split 2.7/3.4 into separate scripts; rename build-wheels.sh to build-wheels-rest.sh
      • enable on master
      • actions/setup-python@v1 steps still needed?
      • shell: bash needed?
    • consolidate new build script to remove redundancy with old scripts – lost will to live
  • add xcode-select --install to build steps/documentation? (avoid fatal error: ‘X11/Xlib.h’ file not found)

installing visual studio C++ build tools via command line

For VS2017, can download vs_buildtools.exe installer. Not clear how to run it to actually get build tools in desired place:
C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC
have tried
.\vs_buildtools.exe --add Microsoft.VisualStudio.Workload.VCTools;includeOptional
and also the same thing with powershell "..." but it's not clear what works - running a docker container and executing these commands by hand usually works eventually.

recombine integration (Mac)

Imported from terrylyons/recombine#12. First rough pass over recombine integration. Monstrous enough to require one issue per platform.

Basic structure seems to be as follows:

  • C_toSig.h and C_toSig.c specify a new API, pyrecombine

  • pyrecombine wraps a lot of scary stuff around two calls to _recombineC:

    • one with dummy arguments to obtain required buffer size
    • another with real arguments
    • not sure how _recombineC is resolved, though
  • _recombineC:

    • is defined in _recombine.h and _recombine.cpp
    • has a sister function _recombine, but can’t see that being called anywhere
    • uses some stuff from the TestVec folder (RdToPowers.h, EvaluateAllMonomials.h)
  • put Terry’s code into an experimental recombine branch of esig

  • compile and link esig

    • error: use of undeclared identifier ‘ptrdiff_t’ – include <stddef.h>
    • mismatched signatures of pyrecombine – add PyObject* keywds to header
    • [Mac/Linux] ImportError: dlopen(…/tosig.so, 2): Symbol not found: __recombineC
      • add recombine/_recombine.cpp to Extension.sources in setup.py
      • fatal error: recombine/recombine.h: No such file or directory
      • ImportError: dlopen(/…/tosig.cpython-[…].so, 2): Symbol not found: _Recombine
        • nm -gU to verify that _Recombine is exported by /lyonstech/recombine.framework/Versions/C/recombine

Install recombine:

  • new install-recombine.sh script, call from build-wheel-other.sh
  • enable set -u on all scripts
    • set various virtualenv variables to ‘’ to prevent failure – disable -u in places
  • clone repo – could not read Username for ‘https://github.com’: Device not configured
    • switch to HTTPS (don’t know how to get SSH keys onto virtual machine)
    • specify user name, password as part of HTTPS URL (password as environment variable)
    • try a Personal Access Token instead of my password
    • store Personal Access Token as secret in esig repo
    • use GitHub checkout action in workflow and regular clone locally?
  • build recombine in build/recombine subfolder of esig
  • add build/recombine/recombine to esig include directories
  • ImportError: dlopen(../tosig.so, 2): Symbol not found: _RdToPowers – add recombine/TestVec/RdToPowers2.cpp to Extension.sources
  • ‘TestVec/OstreamContainerOverloads.h’ file not found
  • #include “_recombine.h” into C_toSig.cpp
  • repeat for build-wheel-27.sh and build-wheel-34.sh
  • -Wno-unused-but-set-variable option not recognised – who cares

Delocate step to pick up recombine:

  • delocate-listdeps to verify dependency on recombine ❌ not listed
  • Library not loaded: @rpath/recombine.framework/Versions/C/recombine – image not found
    • delocate warnings: Couldn’t find @rpath/recombine.framework/Versions/C/recombine on paths and Not processing required path @rpath/recombine.framework/Versions/C/recombine because it begins with @
      • add -Wl,-rpath,/Users/rperera/lyonstech/ to LDFLAGS in setup.py
    • temporarily put recombine.framework into Library/Frameworks – library found ok

recombine integration (64-bit Linux)

Remaining linux-pyXX-64bit tasks

  • 27: /opt/python/cp27-cp27m/bin/python: No such file or directory
  • 27u: /opt/python/cp27-cp27mu/bin/python: No such file or directory
  • 34: /opt/python/cp34-cp34m/bin/python: No such file or directory
  • 37: ModuleNotFoundError: No module named wheel.wheelfile – needs wheel==0.34.2
  • integrate Terry’s PyTestRecombine.py into esig/tests as recombinetests.py
    • switch formatting to str.format-style format strings rather than Python 3.6-style f-string interpolation
    • break into 3 separate tests – not easy (shared prepared data), but emit test number for clarity
    • check tests run on:
      • mac-py27 – ImportError: cannot import name tests (importing esig tests into recombine)
      • mac-py34 – ImportError: cannot import name tests (importing esig tests into recombine)
      • mac-py35
      • mac-py36 ❌ test2 fails with no of points left = 2401 – determinise by fixing random seed
      • mac-py37
      • mac-py38
      • linux-py3[5--8]-64bit ❌ process killed
        • run OMP configuration script ❌ no change
        • verify OMP environment variables still set in virtualenv for tests
        • bump Docker daemon memory from 8Gb to 16Gb – ✅ yes, fixes local problem
        • still failing in GitHub Runner
          • set --memory=16g on Docker command-line ❌ doesn’t help
          • as per comment, set dimension = 240 for a “small” test

Occasionally, installation of sudo fails with [Errno -1] repomd.xml does not match metalink for epel, but this seems to be a sporadic networking failure.


Done/dropped:

Compilation:

  • error: ‘for’ loop initial declarations are only allowed in C99 mode – add -std=c99 to extra_compile_args in install_helpers.py (assumes gcc)
  • error: ’nullptr’ was not declared in this scope
    • replace nullptr by NULL for now
  • error: This file requires compiler and library support for the ISO C++ 2011 standard.
    • replace c99 by c++11 in above
    • fix c99 error by refactoring for loops
  • ‘ptrdiff_t’ does not name a type in lapack_fortran_definitions.h – include stddef.h
  • call doall-linux.sh build script for recombine from esig build-wheel.sh

auditwheel step to pick up recombine:
  • auditwheel show to check dependency on recombine not yet present
  • add recombine to used_libraries in install_helpers.py
  • ld: cannot find -lrecombine (as expected)
    • add $HOME/lyonstech/lib to library_dirs in install_helpers.py ❌ not visible inside Docker container
    • move doall-linux.sh step to Dockerfile – sounds like more opacity I don’t need
    • run doall-linux.sh inside Docker container
      • wget fails with Unable to establish SSL connection – probably problem with manylinux
      • switch Linux build of recombine so it only uses curl, not wget
      • apt-key fails (manylinux uses yum, not apt) – migrated recombine to manylinux2014
    • run doall-linux.sh outside Docker container
      • copy .so files to location visible to container
        • cp fails with exit code 1 without -r if source contains a directory
  • examine output of auditwheel show (see below)
  • skipping incompatible /data/build/lib//librecombine.so when searching for -lrecombine – 32-bit only
  • Cannot repair wheel, because required library “librecombine.so.1” could not be located
    • set LD_LIBRARY_PATH=/data/build/lib:$LD_LIBRARY_PATH in setup.py ❌ wrong environment?
    • set in linux_wheel_builder.sh instead
    • no longer need explicit inclusion of /data/build/lib into library_dirs?
  • check audited wheel is consistent with platform tag manylinux1_x86_64 (see below)
  • error: cannot repair “tmp/esig-0.7.1-cp34-cp34m-linux_x86_64.whl” to “manylinux1_x86_64” ABI because of the presence of too-recent versioned symbols
    • get recombine building on manylinux2014 (terrylyons/recombine#19)
    • esig to build recombine using doall-manylinux.sh (now uses Docker container to build)
    • cannot access ‘/home/runner/lyonstech/lib64’: No such file or directory
      • recombine builder is installing to a home directory inside the recombine Docker container
        • pull make install step out of doall.sh and specify install dir explicitly? ❌ make install fails with /usr/lib64/python2.7/site-packages/cmake/data/bin/cmake: No such file or directory (installer depends on something inside Docker container?)
        • first migrate esig to manylinux2014
        • build recombine local image and have esig Docker file extend that

+ auditwheel show tmp/esig-0.7.1-cp37-cp37m-linux_x86_64.whl

esig-0.7.1-cp37-cp37m-linux_x86_64.whl is consistent with the
following platform tag: "linux_x86_64".

The wheel references external versioned symbols in these system-
provided shared libraries: libgcc_s.so.1 with versions {'GCC_3.0'},
libc.so.6 with versions {'GLIBC_2.4', 'GLIBC_2.2.5', 'GLIBC_2.14',
'GLIBC_2.16'}, libm.so.6 with versions {'GLIBC_2.2.5'}, libstdc++.so.6
with versions {'GLIBCXX_3.4', 'CXXABI_1.3'}, libpthread.so.0 with
versions {'GLIBC_2.3.2', 'GLIBC_2.2.5'}, libdl.so.2 with versions
{'GLIBC_2.2.5'}, libiomp5.so with versions {'VERSION'}

This constrains the platform tag to "manylinux2014_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.

Automate deployment to PyPI

Set up GitHub Actions to build and deploy Python packages.

  • new release branch as default; delete master
  • integrate all jobs into a single build.yml with top-level job
  • single badge for new unified workflow
  • resolve sporadic build failures (#45)
  • store built artifacts
  • reinstate Python 3.4 build for Linux
  • new top-level build script for Linux
  • add source distribution
  • build detritus like numpy should not be in output folder
  • retrieve built artifacts
  • add PyPI and TestPyPI passwords to repo secrets
  • fix Mac artifact output path
  • upload to TestPyPI
  • try out with 0.7.1 release
  • switch from TestPyPI to PyPI
  • only publish when building on release
  • delete esig_maintainer and ptype_maintainer accounts from [Test]PyPI

OSX build working for Python 3.5-3.8

  • pip wheel --no-binary missing argument
  • X11 headers not found so openssl can’t be built – installed XCode
  • top-level build-wheels.sh
  • pull Python version loop out of mac_wheel_builder.sh
  • 2.7.10 and 3.4.8 fail: can’t find file string in #include <string> - red herring; compiling on wrong Python environment because pyenv install fails?
  • build wheels for v3.5-3.7
  • 3.8.1

Sync Docker build to Feb 12 Windows Server Update

Re-running the Docker build on 2 March showed Python installations failing which were previously working. While investigating, running docker build --no-cache revealed that even step 2 (the Chocolatey installation) fails. Further investigation reveals that a Feb 12 update to the Windows Server base image needed to be pulled. This in turn caused the Docker build to run out of disk space.

  • ouch, Dockerfile no longer builds
  • Docker build fails with out of memory/file not found:
    • system prune (reclaimed 31Gb)
    • check available space using wmic logicaldisk get size,freespace,caption (~31Gb+8Gb)
    • retry build - strangely prune seems to have deleted cache as restarted from step 2 - still runs out of memory
    • look at docker system df to see reclaimable space - 75Gb, 9 unused images
    • prune leaves unused images untouched - use image prune -a
    • try again with docker build with no images, containers or cache

delocate-wheel failing on MacOS Python 3.4

This was silently failing, but now that I'm running all Bash scripts with -e, it breaks the build. The delocate installation issues this warning:

Installing collected packages: delocate
  WARNING: The scripts delocate-addplat, delocate-fuse, delocate-listdeps, delocate-patch, delocate-path and delocate-wheel are installed in '/opt/local/Library/Frameworks/Python.framework/Versions/3.4/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed delocate-0.8.0
  • fix script so it can find delocate-wheel
  • reinstate mac-py34 build in build.yml

Read the Docs integration

Read the Docs is probably currently configured with webhooks to publish from here (or perhaps here). We should make the present repo the authoritative version and set up Read the Docs to publish from here.

Python 3.8 support (esig 0.7.0)

Python 2.7 reached end of life on 1 January 2020. Python 3.4 is also end of life (March 2019, according to the warning message).

  • kill support for 2.7 and 3.4 from install_helpers.py (e.g. msvc-9.0 and msvc-10.0)
  • Linux x86_64
  • Linux i686
  • OSX: add 3.8.1 to python_versions.txt and build wheel
  • Windows 64-bit
  • Windows 32-bit
  • bump esig revision number

Python 3.4 Windows build

Nick’s original build scripts didn’t support Python 3.4 on Windows. I also haven’t yet been able to build it. Since we didn’t support this platform previously, I’ve split it out from #30.

  • install Visual Studio 10.0 – looks difficult (need MSDN or Visual Studio Dev Essentials)
  • 3.4, 32-bit
  • 3.4, 64-bit
    • support .msi installer
    • verify install location
    • install numpy at 1.16 (1.17 onwards requires Python 3.5+) ❌ fails probably because wrong Visual Studio version

Windows development environment

I don’t have a Windows machine, which takes developing Docker for Windows scripts from painful to painful squared. Debugging by trial-and-error through the GitHub VM means a 5-45min feedback loop for something like a missed quote.

  • reactivate Azure subscription (total wait: 8 days)
    • email Tomas
    • raise ticket on Turing Complete
    • re-request subscription (credits used up)
  • check Docker ok in Azure VM
    • not enough disk space after 1h15m – delete local copy of boost
    • various “re-exec” errors – purge Docker caches (maybe some stale images from old versions) ✅
      • hcsshim::ImportLayer - failed failed in Win32: The system cannot find the path specified
      • the command ‘cmd /S /C C:\TEMP\vs_buildtools.exe …‘ returned a non-zero code: 1
      • hcsshim::PrepareLayer - failed failed in Win32: The parameter is incorrect. (0x57)
      • hcsshim::PrepareLayer - failed failed in Win32: network resource or device no longer available. (0x37)
  • document usability issues of running in Azure:
    • screen doesn’t always update automatically – need to “poll” by pressing return
    • scrolling with mouse wheel sometimes scrolls wrong window (“inner” window rather than outer)
    • copy and paste shortcuts don’t work reliably
      • upgrade to new Remote Desktop for MacOS – apparently improves this
      • don’t seem to be able to paste into Docker terminal
    • Windows seems to run in all three of my monitors, but inside one virtual window
    • need to stop VM when not being used; destroys task context which is difficult to construct
  • temporary Windows laptop (total wait: 15 days and counting) ❌ abort
    • get approval from Martin
    • consider James’ suggestion of running local Windows VM ❌ will grind my machine to a halt
    • initial request ❌ accidentally used Bristol email address, can't access issue on Turing Complete
    • raise new issue, cross-reference old one
    • neither issue exists in Turing Complete any more – chase again
    • IT Services looking into it; wait to hear back, then decide whether to proceed ❌ no response

Does msiexec do anything?

For example, does the line RUN powershell "msiexec /i /quiet VCForPython27.msi ALLUSERS=1" actually install anything?
Running the command manually in the Docker container gives no output, even with the /Lv option.

Cache dependencies in GitHub Actions

Building Docker images in GitHub Actions is somewhat self-defeating because each Docker build runs in its own VM so there is no caching. We could publish images to Dockerhub but the downloading overhead could still be significant.

See https://github.com/actions/cache. However, of the many examples, none mention Docker. On the other hand much of the GitHub Actions support for Docker does mention caching, e.g. here. Looks incredibly tedious.

Done/dropped:

Sporadic build failures on GitHub Actions

The build now runs ~20 jobs for various configurations, but one or two jobs often stall and never resume. Sometimes this may be down to bandwidth or download issues, but often it looks like the job has just stalled in the GitHub Actions runner. The workaround is to cancel the build when this happens and run again until all jobs complete.

Basic problem. In recent days, it seems that, with Python 3.6+ on Linux, test_recombine can no longer use 240 dimensions – the process hangs (as opposed to just taking a long time). 240 used to be a small enough number; now 90 reliably fails. 60 works, so that is the current setting. Further investigation is required to see if there is an underlying problem, or whether this is related to (changes in) the resources available to GitHub Actions.

Automate OSX and Linux builds through GitHub actions

  • Linux:
    • basic workflow based on ubuntu-18.04 virtual environment
    • experiment with building i686 Docker image
    • run build
    • build to fail if wheel not built
  • MacOS:
    • basic workflow based on macos-10.15 virtual environment
    • fix build
      • use virtualenv to run sdist in Python 3.5.5 (macos-10.15 has 2.7, we need >=3.5)
      • install Cython
      • get libalgebra submodule
    • build to fail if wheel not built

Switch to pre-built recombine (Mac)

Use the pre-built recombine binary (terrylyons/recombine#25). Mac only for now as pre-built binaries for Linux and Windows are WIP. May significantly reduce the case for #73 and #75.

  • retrieve installer via artifacts API – looks like a huge pain in the butt
  • retrieve installer via dawidd6/action-download-artifact
    • repo argument parses weirdly (see here – looks like terrylyons/recombine is correct syntax)
    • whatever string I pass to repo, Action fails with Error: Not Found (=404?); possible reasons:
      • secrets.GITHUB_TOKEN not found – no, that reports Input required and not supplied: github_token
      • release branch not found – no, that reports RunID: and then Not Found
      • artifact called build not found ❌ omitting name argument will download all artifacts
    • authenticate: secrets.RECOMBINE_LOGIN works (isn’t that a password, not an authentication token?)
  • retrieve installer (specific version for now) from the command line using curl
  • remove git-preamble.sh step
  • run cmake --install step
    • INSTALL cannot find /Users/runner/work/recombine/recombine/build/recombine/recombine.framework
    • otool -l to inspect binaries for value(s?) of LC_RPATH in recombine binary:
      • recombine build, before test install: /opt/intel/compilers_and_libraries/mac/mkl/../compiler/lib
      • recombine build, after test install: /Users/runner/lyonstech/recombine.framework/Versions/C
      • esig pre-built version, before install: /opt/intel/compilers_and_libraries/mac/mkl/../compiler/lib
    • otool -D to look at “install name”: recombine/recombine.framework/Versions/C/recombine
    • decompose: path of recombine build (/Users/runner/work/recombine/recombine/build) + install name
  • add test_recombine step

pip -b deprecation

The pip wheel step complains with:

DEPRECATION: The -b/--build/--build-dir/--build-directory option is deprecated. pip 20.3 will remove support for this functionality. A possible replacement is use the TMPDIR/TEMP/TMP environment variable, possibly combined with --no-clean. You can find discussion regarding this at pypa/pip#8333.

  • Mac build
  • Linux build
  • Windows build

RuntimeError: Python version >= 3.6 required running sdist

The source distribution script sdist no longer seems to work. Unfortunately, Googling for the error message produces zero hits.

  • name specific Cython version used in last successful build ❌ didn’t help
  • bump Python version to 3.6 in GitHub runner

Migrate Windows build to mcr.microsoft.com/dotnet/framework/sdk:4.8

The base image microsoft/dotnet-framework:4.7.1 no longer exists, so we needed to switch to a new Docker for Windows base image. The new image uses PowerShell as the shell language rather than CMD. I didn't know about the SHELL Docker command to switch shells, which might have allowed us to continue with most of the old script; instead I ended up porting the old Dockerfile to PowerShell, which required several changes.

Done/dropped:

  • build Docker image:
    • remove python27_64 and python27_32 from python_versions.txt
    • “The container operating system does not match the host operating system” running Dockerfile - switch Dockerhub repo from microsoft/dotnet-framework:4.7.1 to mcr.microsoft.com/dotnet/framework/sdk:4.8
    • copy commands fail (.\local\boost_1_68_0\lib64-msvc-14.0\*.lib doesn’t exist) - use Start-Process and wait to run installers synchronously
    • (Possibly harmless) The term ‘Stop’ is not recognized as the name of a cmdlet error - don’t use nested powershell
    • The string is missing the terminator: “. when copying C:\Program Files (x86)\Windows Kits\8.1\bin\x86\rc.exe (Windows Kit 8.0 not installed/available?)
    • Cannot find path ‘C:\Python27’ because it does not exist at step 80
      • run .msi files with msiexec
      • use Start-Process -Wait to run msiexec
    • update pathenv_ files to PowerShell

OSX not building on Python 2.7

Can’t build for Python 2.7.10 or 3.4.8 on my Mac or on the GitHub macos-10.15 virtual machine. Installation fails with a compilation error, apparently because of the wrong version of the SSL library.

  • remove sdist step and use of pyenv
    • verify unnecessary
    • [Errno 63] File name too long (circular file path?)
      • try pip wheel from esig and esig/build
    • new build script for 2.7 only initially
      • use virtualenv directly without pyenv
      • remove virtual environment after use
    • new GitHub job without 3.5 installation step in build-OSX-python-2.7.yml
    • extract $p from python --version so it will work with 3.4 too
    • consolidate temporary folder structure
    • new GitHub job to run 2.7 build script in a Python 3.4 environment ❌ (3.4 not supported)
  • experiment with dropping pyenv:
    • with Python 2.7, run python -m pip install -U pip virtualenv
      • fails with ERROR:root:code for hash md5 was not found., probably to do with my openssl shenanigans
  • Anaconda experiment to have multiple Python versions without requiring pyenv, at least for 2.7 and 3.4:
    • install Anaconda for Python 3.7
    • install Python 2.7 virtual environment
    • install Python 3.4 virtual environment
    • run wheel build in Python 3.4 virtual environment ❌
      • RuntimeError: Python 3.5 or later is required
    • run wheel build in Python 3.5 virtual environment ❌
      • fatal error: ‘string’ file not found: #include <string>
        • set CPATH=/Users/rperera/anaconda3/envs/py35/include/c++/v1/ before wheel build
      • ld: warning: directory not found for option ‘-L/opt/local/lib/’; ld: library not found for -lstdc++
      • give up for now and uninstall (seemed to be breaking pyenv install)
  • discuss PyPI download statistics with Terry
  • ask Peter to try building 2.7.10 and 3.4.8
  • fail build if any Python installation fails
  • temporarily enable OSX build on mac-build branch only
  • enable SSH access to GitHub Actions build – haven’t needed it yet
  • compilation error _ssl.c:684:35: error: incomplete definition of type ‘struct X509_name_entry_st’ (same error with Python 2.7 and 3.4) ❌
    • experiment to force Python 2.7 build to use [libressl-2.2.7](https://www.bountysource.com/issues/47297948-error-the-python-ssl-extension-was-not-compiled-missing-the-openssl-lib) ❌
      • put libressl-2.2.7 somewhere local using curl -O https://ftp.openbsd.org/pub/OpenBSD/LibreSSL/libressl-2.2.7.tar.gz then tar xvf libressl-2.2.7.tar.gz
      • try setting CFLAGS include dir to libressl-2.2.7 - failed, argument not passed to clang, this code which ignores CFLAGS is probably relevant, also homebrew bumped openssl from 1.0 to 1.1 on 27 Nov 2019
      • try ln -s ~/Repo/esig/build/OSX/libressl-2.2.7 /usr/local/opt/[email protected] - seems to unstick compiler but fails with linker error ld: library not found for -lssl
      • copy /usr/local/opt/[email protected]/lib/* to libressl-2.2.7/lib - seems to unstick that link step, but fails with another linker error ld: framework not found QuickTime (and also highly dubious!)
      • improve slightly by instead trying ln -s ~/Repo/esig/build/OSX/libressl-2.2.7/include /usr/local/opt/[email protected]/include (linking include dir only) - should reproduce QT error
      • apparently --no-opencv will drop the QT requirement, but it's not clear what this needs to be passed to ❌ – no need to continue this inquiry as we're no longer trying to build 2.7 from source
      • install QuickTime and symlink into XCode installation ❌ – still apparently can’t find Quicktime.framework
    • experiment with force-removing openssl - later problem with ImportError: No module named zipp
    • experiment with PYTHON_CONFIGURE_OPTS="--with-openssl-dir=./libressl-2.2.7"
    • set brew installations to specific versions - probably possible by pointing to specific .rb files but prob unnecessary
    • upgrade pyenv-virtualenv to 1.1.5 fails because openssl not symlinked??
    • remove anaconda installation so openssl isn’t picked up from there
      • removing anaconda means Python falls back to default 2.7 installation that comes with MacOS, and pip fails with ERROR:root:code for hash md5 was not found
      • reinstall system-level Python 3:
        • requires xcode-select --install (thought I already did that?)
        • brew install python3
  • try with Python 2.7.1 and 2.7.17 - still fails

Test published artifact

Extend the build process to install the built binary into a fresh VM and do a deployment-environment test. See #60.

recombine integration (64-bit Windows)

  • C_tosig.obj : error LNK2001: unresolved external symbol _recombineC
  • Cannot open include file: ‘recombine/recombine.h’
    • git-preamble.sh needs to be supplied with recombine password
  • run doall-windows.ps1 to build recombine locally
    • Cannot find path ‘D:\a\esig\esig\build\recombine’ because it does not exist.
      • inline git-preamble.sh (adjusting to PowerShell as necessary) to check that recombine was cloned properly
      • run git-preamble.sh using bash explicitly?
  • reweight.h(93): error C3001: ‘simd’: expected an OpenMP directive name
    • how is recombine compiled on Windows in standalone build? ➡️ Visual Studio 16 2019
    • what compiler is being used to build recombine as part of esig? ➡️ Visual Studio 15 2017
    • bump 64-bit esig build for Windows from windows-2016 to windows-2019
  • fatal error LNK1104: cannot open file ‘mkl_intel_ilp64.lib’
    • locate mkl_intel_ilp64.lib ➡️ $env:MKLROOT/lib/intel64
    • install_helpers.py to add this to library_dirs (this and the recombine library below are sketched after this list)
  • (linking) error LNK2001: unresolved external symbol __imp_Recombine (the __imp_ prefix apparently indicates an import from a DLL)
    • add recombine as a library in Windows case of used_libraries in install_helpers.py
    • fatal error LNK1181: cannot open input file ‘recombine.lib’
    • add ~\lyonstech\lib to library_dirs in install_helpers.py
  • (testing wheel) ImportError: DLL load failed: The specified module could not be found.
    • scan MKLROOT for any potentially needed DLLs -- there don't seem to be any, although the CMake build uses C:/Program Files (x86)/IntelSWTools/compilers_and_libraries/windows/redist/intel64/compiler/libiomp5md.dll
    • set PATH to pick up recombine.dll from ~\lyonstech\bin (analogue of setting LD_LIBRARY_PATH)
    • still fails with Python 3.8 -- use add_dll_directory (breaking change to DLL resolution in Python 3.8)
    • add_dll_directory not available before Python 3.8 -- conditionalise using sys.version_info
    • also not defined on non-Windows platforms -- drop conditionalisation, handle AttributeError instead (see the second sketch after this list)
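
For the two link errors above, the fix amounts to extending the library search paths and libraries that install_helpers.py feeds to setuptools on Windows. A hedged sketch in setuptools terms (install_helpers.py itself is not reproduced here; the module and source names are illustrative placeholders):

```python
# Hedged sketch of the Windows link configuration described above; the
# extension module name and source list are illustrative placeholders.
import os
from setuptools import Extension

mkl_lib_dir = os.path.join(os.environ["MKLROOT"], "lib", "intel64")  # resolves LNK1104
recombine_lib_dir = os.path.expanduser(r"~\lyonstech\lib")           # resolves LNK1181

ext = Extension(
    "esig.tosig",                     # illustrative module name
    sources=["src/C_tosig.c"],        # illustrative source list
    library_dirs=[mkl_lib_dir, recombine_lib_dir],
    libraries=["mkl_intel_ilp64", "recombine"],  # recombine added to fix __imp_Recombine (LNK2001)
)
```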
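
The DLL resolution change in the last three bullets can be handled with a single try/except, as described. A minimal sketch (the recombine bin directory is illustrative):

```python
# Minimal sketch: make recombine.dll discoverable on Python 3.8+ on Windows,
# and fall back to PATH-based resolution everywhere else.  The directory is
# illustrative.
import os

recombine_bin = os.path.expanduser(r"~\lyonstech\bin")

try:
    if os.path.isdir(recombine_bin):
        os.add_dll_directory(recombine_bin)  # new in Python 3.8, Windows only
except AttributeError:
    # Older Pythons and non-Windows platforms: nothing to do; the loader
    # falls back to PATH / LD_LIBRARY_PATH resolution instead.
    pass
```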

iisignature import error when trying to use esig==0.9.4

Hi! I'm receiving the following error when I try to import esig using the 0.9.4 install.

My version of numpy is 1.1.9; the traceback is below.

Traceback:
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/esig/backends.py:18: in <module>
    import iisignature
E   ModuleNotFoundError: No module named 'iisignature'

During handling of the above exception, another exception occurred:
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/transformations/tests/test_all_transformers.py:24: in <module>
    ALL_TRANSFORMERS = all_estimators(estimator_types="transformer", return_names=False)
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/utils/__init__.py:98: in all_estimators
    module = import_module(module_name)
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/classification/signature_based/__init__.py:4: in <module>
    from sktime.classification.signature_based._signature_classifier import (
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/classification/signature_based/_signature_classifier.py:17: in <module>
    from sktime.transformations.panel.signature_based._checks import (
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/transformations/panel/signature_based/__init__.py:6: in <module>
    from sktime.transformations.panel.signature_based._signature_method import (
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/transformations/panel/signature_based/_signature_method.py:4: in <module>
    from sktime.transformations.panel.signature_based._compute import (
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/transformations/panel/signature_based/_compute.py:10: in <module>
    from sktime.transformations.panel.signature_based._rescaling import (
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/transformations/panel/signature_based/_rescaling.py:14: in <module>
    _check_soft_dependencies("esig")
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/sktime/utils/validation/_dependencies.py:23: in _check_soft_dependencies
    import_module(package)
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/esig/__init__.py:28: in <module>
    from esig.backends import get_backend, set_backend, list_backends
../../../../hostedtoolcache/Python/3.7.10/x64/lib/python3.7/site-packages/esig/backends.py:21: in <module>
    raise ImportError("No available backend for signature calculations, please reinstall esig.")

Parallelise Linux builds, build from sources

To improve the feedback cycle, we should parallelise the Linux builds a bit. This isn't particularly easy because of the use of Docker (building a Docker image is slow unless it can rely on a cache). However, since there is a separate Docker build for each architecture (i686 vs x86_64), this offers an obvious parallelisation opportunity.

  • delete setup-python step (no longer required?)
  • use git-preamble.sh
  • switch to source builds
    • mount Docker container at root of package and cd into build/Linux
    • pip wheel to build from root of package, not archive
    • drop dependency of auditwheel step on ver extracted from name of archive
    • drop sdist
  • run all Bash scripts with -e, remove redundant defensive coding
  • parameterise build-wheels.sh on i686 vs x86_64
