sxs-collaboration / gwsurrogate

An easy to use interface to gravitational wave surrogate models

License: Other

Languages: Jupyter Notebook 86.53%, Python 7.63%, Shell 0.01%, C 0.29%, MATLAB 0.23%, HTML 5.31%
Topics: surrogate-models, python, gravitational-waveforms

gwsurrogate's Introduction

Welcome to GWSurrogate!

GWSurrogate is an easy to use interface to gravitational wave surrogate models.

Surrogates provide a fast and accurate evaluation mechanism for gravitational waveforms that would otherwise be found by solving differential equations. These equations must be solved in the "building" phase, which was performed using other codes. For details see

[1] Scott Field, Chad Galley, Jan Hesthaven, Jason Kaye, and Manuel Tiglio. "Fast prediction and evaluation of gravitational waveforms using surrogate models". Phys. Rev. X 4, 031006 (2014). arXiv:1308.3565 [gr-qc]

If you find this package useful in your work, please cite reference [1] and, if available, the relevant paper describing the specific surrogate used.

All available models can be found in gwsurrogate.catalog.list()

gwsurrogate is available at https://pypi.python.org

Installation

Dependencies

gwsurrogate requires:

  1. gwtools. If you are installing gwsurrogate with pip you will automatically get gwtools. If you are installing gwsurrogate from source, please see https://bitbucket.org/chadgalley/gwtools/

  2. gsl. For speed, the long (hybrid) surrogates use gsl's spline function. To build gwsurrogate you must have gsl installed. Fortunately, this is a common library and can be easily installed with a package manager.

Note that at runtime (i.e., when you do import gwsurrogate) you may need to let gsl know where your BLAS library is installed. This can be done by setting your LD_PRELOAD or LD_LIBRARY_PATH environment variable. A relevant example:

>>> export LD_PRELOAD=~/anaconda3/envs/python27/lib/libgslcblas.so

From pip

The python package pip supports installing from PyPI (the Python Package Index). gwsurrogate can be installed to the standard location (e.g. /usr/local/lib/pythonX.X/dist-packages) with

>>> pip install gwsurrogate

If there is no binary/wheel package already available for your operating system, the installer will try to build the package from the sources. For that, you would need to have gsl installed already. The installer will look for GSL inside /opt/local/. You may provide additional paths with the CPPFLAGS and LDFLAGS environment variables.

In the case of a Homebrew installation, you may install the package like this:

>>> export HOMEBREW_HOME=`brew --prefix`
>>> export CPPFLAGS="-I$HOMEBREW_HOME/include/"
>>> export LDFLAGS="-L$HOMEBREW_HOME/lib/"
>>> pip install gwsurrogate

From conda

gwsurrogate is on conda-forge, and can be installed with

>>> conda install -c conda-forge gwsurrogate

Note: As of Feb 9th 2024, installation with Python 3.12 with conda doesn't work. Please either use Python <= 3.11 or install with pip instead.

From source (pip)

First, please ensure you have the necessary dependencies installed (see above). Next, git clone this project to any folder of your choosing. Then run

git submodule init
git submodule update

For a "proper" installation, run the following commands from the top-level gwsurrogate folder containing setup.py

>>> python -m pip install .            # option 1
>>> python -m pip install --editable . # option 2

where the "--editable" installs an editable (development) project with pip. This allows your local code edits to be automatically seen by the system-wide installation.

From source (tar.gz)

Please note this is not the recommended installation strategy, and certain functionality may not work.

You can download and unpack gwsurrogate-X.X.tar.gz to any folder gws_folder of your choosing. The gwsurrogate module can be used by adding

import sys
sys.path.append('absolute_path_to_gws_folder')

at the beginning of any script/notebook which uses gwsurrogate.

Alternatively, if you are a bash or sh user, edit your .profile (or .bash_profile) file and add the line

export PYTHONPATH=absolute_path_to_gws_folder:$PYTHONPATH

Usage

Available models

To get a list of all available surrogate models, do:

>>> import gwsurrogate
>>> gwsurrogate.catalog.list()
>>> gwsurrogate.catalog.list(verbose=True)      # Use this for more details

Current NR models

The most up-to-date models trained on numerical relativity data are listed below, along with links to example notebooks.

Current point-particle blackhole perturbation theory models

The most up-to-date models trained on point-particle blackhole perturbation data and calibrated to numerical relativity (NR) in the comparable mass regime.

Download surrogate data and load it

Pick a model, say NRSur7dq4, and download the data. Note this only needs to be done once.

gwsurrogate.catalog.pull('NRSur7dq4')       # This can take a few minutes

Load the surrogate; this only needs to be done once at the start of a script:

sur = gwsurrogate.LoadSurrogate('NRSur7dq4')

Evaluate the surrogate

q = 4           # mass ratio, mA/mB >= 1.
chiA = [-0.2, 0.4, 0.1]         # Dimensionless spin of heavier BH
chiB = [-0.5, 0.2, -0.4]        # Dimensionless spin of lighter BH
dt = 0.1                        # timestep size, Units of total mass M
f_low = 0                # initial frequency, f_low=0 returns the full surrogate

# h is dictionary of spin-weighted spherical harmonic modes
# t is the corresponding time array in units of M
# dyn stands for dynamics, do dyn.keys() to see contents
t, h, dyn = sur(q, chiA, chiB, dt=dt, f_low=f_low)

There are many more options, such as using MKS units, returning the polarizations instead of the modes, etc. Read the documentation for more details.

help(sur)
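For example, here is a hedged sketch of an MKS-units call. The keyword names below are taken from the issue reports later on this page, but the specific values are illustrative; consult help(sur) for the authoritative signature and return conventions.

# Hedged example of evaluating in physical (MKS) units; see help(sur) for details.
q = 4
chiA = [-0.2, 0.4, 0.1]
chiB = [-0.5, 0.2, -0.4]
t, h, dyn = sur(q, chiA, chiB,
                M=60,             # total mass in solar masses
                dist_mpc=100,     # luminosity distance in Mpc
                dt=1.0/4096,      # timestep in seconds when units='mks'
                f_low=20,         # starting frequency in Hz
                inclination=0.5,  # if given, modes are combined into polarizations (see help(sur))
                phi_ref=0.0,
                units='mks')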

Jupyter notebooks located in tutorial/website give a more comprehensive overview of individual models.

Tests

If you have downloaded the entire project as a tar.gz file, it's a good idea to run some regression tests.

>>> cd test                              # move into the folder test
>>> python download_regression_models.py # download all surrogate models to test
>>> python test_model_regression.py      # (optional - if developing a new test) generate regression data locally on your machine
>>> cd ..                                # move back to the top-level folder
>>> pytest                               # run all tests
>>> pytest -v -s                         # run all tests with high verbosity

NSF Support

This package is based upon work supported by the National Science Foundation under PHY-1316424, PHY-1208861, and PHY-1806665.

Any opinions, findings, and conclusions or recommendations expressed in gwsurrogate are those of the authors and do not necessarily reflect the views of the National Science Foundation.

gwsurrogate's People

Contributors

ahnitz, duetosymmetry, duncanmmacleod, jblackma, jyoo1042, kbarkett, profplum, raffienficiaud, sfield17, tousifislam, vijayvarma392

Stargazers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

Watchers

 avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar  avatar

gwsurrogate's Issues

Using hdf5 training info consistently within gwsurrogate (from Bitbucket)

Each surrogate hdf5 file has a param_space group which contains information about the training region: parameter names, dimensionality, and the min/max value of each parameter. This information is used in different ways; the purpose has been to let the hdf5 file summarize the surrogate's training region, and within the code it is used to instantiate a param_space object. Since the tidal model's region of validity is different from the aligned-spin model's, the hdf5 file's values are not correct for the tidal model. Because the limits are explicitly set, and the model's docstring is clear, I don't think this is a problem, but I wonder if there should be a cleaner solution.

Personally, I like the idea of the hdf5 file containing some model metadata, but I'm not sure how this would be set without creating an entirely new hdf5 file that repeats all of the information besides the model intervals, since the tidal model is somewhat special in this respect. One option that doesn't require modifying the hdf5 file would be to have the SurrogateEvaluator base class check whether the hdf5 file's limits equal the soft_param_lims, but allow derived models to override the method for special cases. Any thoughts on this or other solutions?

flow + dt or times array, but not both

When passing an array of times, models require the user to supply something for f_low. Probably the behavior should be one of

  1. times (as an np array)
  2. dt and f_low

but input like times and dt, or times and f_low, should raise an error. If times together with f_low were to be allowed, f_low should only be used as a consistency check (i.e., verify that the frequency at the starting time is lower than f_low).
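A minimal sketch of the proposed validation (a hypothetical helper, not existing library code):

def _check_time_inputs(times=None, dt=None, f_low=None):
    # Proposed rule: accept either a times array, or (dt, f_low), but not a mix.
    if times is not None:
        if dt is not None or f_low is not None:
            raise ValueError("Pass either `times`, or `dt` and `f_low`, not both.")
    elif dt is None or f_low is None:
        raise ValueError("When `times` is not given, both `dt` and `f_low` are required.")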

'wget' dependency not mentioned in the documentation

Documentation should mention the dependency on 'wget'. Alternatively, pull() should fail with an appropriate error message if 'wget' cannot be found/executed. At the moment it fails silently.

Suggestion:
In line 252 of catalog.py, check the return value of os.system('wget ...') through os.WEXITSTATUS() and fail if it is not zero, i.e.

if os.WEXITSTATUS(os.system('wget -q --directory-prefix=' + sdir + ' ' + surr_url)) > 0:
    raise ValueError("wget not installed")

modes with old interface (from Bitbucket)

This is for the old API, which is supported for older models.

When mode_sum is True:  t, hp, hc = ...
When mode_sum is False: modes, t, hp, hc = ...

It would be better to consistently return t, h, ... and have h be a dictionary of modes.
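A rough sketch of the unified convention, assuming the old interface returns per-mode hp/hc arrays in the mode_sum=False case (names and shapes here are illustrative):

def unify_modes(modes, t, hp, hc):
    # Pack per-mode plus/cross arrays into one dictionary keyed by (ell, m),
    # so both code paths can return (t, h) with h a dict of complex modes.
    h = {}
    for i, (ell, m) in enumerate(modes):
        h[(ell, m)] = hp[i] - 1j * hc[i]
    return t, h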

ERROR using SimpleH5Object._read_subordinates

I have been using NRHybSur3dq8 with gwsurrogate version 0.9.9 for several months in a Python 3.7 conda environment, but that environment is unable to install pymultinest because of conflicts with several parallelization libraries (mpi, mpich2, openmp, and more), so I'm trying to make a new environment that can use both gwsurrogate and pymultinest.

In trying to use gwsurrogate.LoadSurrogate('NRHybSur3dq8') in a new environment, I'm getting an error from the file "gwsurrogate/new/saveH5Object.py" when it attempts to use SimpleH5Object._read_h5.
ERROR MESSAGE:
/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py:106: H5pyDeprecationWarning: dataset.value has been deprecated. Use dataset[()] instead.
v = item.value
Traceback (most recent call last):
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 205, in _read_subordinates
getattr(self, k)._read_h5(f[k])
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 192, in _read_h5
self.h5_prepare_subs()
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/nodeFunction.py", line 94, in h5_prepare_subs
self.fitFunc = evaluate_fit.getFitEvaluator(self.fit_data)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/eval_pysur/evaluate_fit.py", line 252, in getFitEvaluator
gpr_predictObject = GPRPredictor(res)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/eval_pysur/evaluate_fit.py", line 39, in init
if not StrictVersion(sklearn.version) >= StrictVersion('0.19.1'):
File "/home/seth/anaconda3/envs/gw/lib/python3.8/distutils/version.py", line 40, in init
self.parse(vstring)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/distutils/version.py", line 137, in parse
raise ValueError("invalid version number '%s'" % vstring)
ValueError: invalid version number '0.22.2.post1'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 205, in _read_subordinates
getattr(self, k)._read_h5(f[k])
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 233, in _read_h5
self.object_list[i]._read_h5(g)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 193, in _read_h5
self._read_subordinates(f)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 207, in _read_subordinates
raise Exception("%s could not be read: %s"%(k, e))
Exception: node_function could not be read: invalid version number '0.22.2.post1'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 205, in _read_subordinates
getattr(self, k)._read_h5(f[k])
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 277, in _read_h5
v._read_h5(f[_list_item_string(idx)])
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 193, in _read_h5
self._read_subordinates(f)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 207, in _read_subordinates
raise Exception("%s could not be read: %s"%(k, e))
Exception: node_functions could not be read: node_function could not be read: invalid version number '0.22.2.post1'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 205, in _read_subordinates
getattr(self, k)._read_h5(f[k])
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 277, in _read_h5
v._read_h5(f[_list_item_string(idx)])
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 193, in _read_h5
self._read_subordinates(f)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 207, in _read_subordinates
raise Exception("%s could not be read: %s"%(k, e))
Exception: func_subs could not be read: node_functions could not be read: node_function could not be read: invalid version number '0.22.2.post1'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "", line 1, in
File "/home/seth/research/gw_detection_home/multimode_waveform_generator.py", line 39, in
SUR_HYB = gwsurrogate.LoadSurrogate('NRHybSur3dq8') # non-precessing hybrid surrogate model
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/surrogate.py", line 2235, in new
return SURROGATE_CLASSESsurrogate_name
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/surrogate.py", line 1963, in init
super(NRHybSur3dq8, self).init(self.class.name,
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/surrogate.py", line 1376, in init
self._sur_dimless = self._load_dimless_surrogate()
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/surrogate.py", line 1979, in _load_dimless_surrogate
sur.load(self.h5filename)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 148, in load
self._read_h5(f)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 193, in _read_h5
self._read_subordinates(f)
File "/home/seth/anaconda3/envs/gw/lib/python3.8/site-packages/gwsurrogate/new/saveH5Object.py", line 207, in _read_subordinates
raise Exception("%s could not be read: %s"%(k, e))
Exception: sur_subs could not be read: func_subs could not be read: node_functions could not be read: node_function could not be read: invalid version number '0.22.2.post1'
END ERROR MESSAGE

I have tried installing the dependencies in many different orders in new conda environments, but this same error keeps occurring. I have also tried updating conda and all packages. I have also tried initializing the environment with all the packages at once, but it gives me the following conflict error message:

conda_newenv_output.txt

Capturing logs using logging module

Currently, the logs printed by the gwsurrogate package are simple print statements and offer no way to turn them off. Using the logging module (or similar) could be considered for better management of the logs.
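A minimal sketch of what that could look like (illustrative only, not current gwsurrogate code):

import logging

logger = logging.getLogger("gwsurrogate")

def pull(model_name):
    # Replace bare print() calls with logger calls so users can control verbosity.
    logger.info("Downloading surrogate data for %s ...", model_name)
    # ... existing download logic ...

# A user (or the package itself) could then silence or redirect the messages:
logging.getLogger("gwsurrogate").setLevel(logging.WARNING)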

[ENH] Support versioning of surrogate models

In the vacuum call today, @keefemitman and I were discussing that it would be nice to be able to publish a surrogate model and later add improvements to it. In a sense this is already supported by appending version strings to the model name, e.g. "model_v1", "model_v2". But for the end user it would probably also be nice to be able to request simply "model" and get whatever version is the latest.
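One possible resolution scheme, sketched under the assumption that versioned models are named with a "_vN" suffix (illustrative helper only):

import re

def resolve_latest(name, available_models):
    # If versioned entries like "model_v1", "model_v2" exist, return the highest
    # one; otherwise return the name unchanged.
    pattern = re.compile(re.escape(name) + r"_v(\d+)$")
    matches = []
    for s in available_models:
        m = pattern.match(s)
        if m:
            matches.append((int(m.group(1)), s))
    return max(matches)[1] if matches else name

# resolve_latest("NRSur7dq4", ["NRSur7dq4_v1", "NRSur7dq4_v2"])  ->  "NRSur7dq4_v2"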

Clean up from Pull request #2 (from Bitbucket code)

Pull request #2 (bitbucket) was a bit of a rush job to get the code working. Three key pieces of cleanup work remain:

  1. param nudging might be best suited for TensorSplineGrid class which is what actually needs nudging. It will make independent calls to this class easier too

  2. consistent coding style.

  3. remove memoize decorator (see Jonathan's email "fast spline" for how to do this correctly)

pull() fails for directories with characters "in-need-of-quotes"

For reasons beyond the scope of this comment, my home directory has a space (' ') in it. This causes the wget command in the pull() function to fail, as the argument to --directory-prefix is not put in quotes.

Suggestion:

In line 252 in catalog.py replace

os.system('wget -q --directory-prefix='+sdir+' '+surr_url)

by

os.system('wget -q --directory-prefix="'+sdir+'" '+surr_url)

(Note the additional quotes now surrounding the 'sdir'!)
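An alternative sketch that sidesteps shell quoting altogether by passing the arguments as a list (using subprocess, which a later issue on this page also suggests):

import subprocess

# No shell is involved, so spaces or other special characters in sdir are safe,
# and a non-zero exit status raises CalledProcessError instead of failing silently.
subprocess.check_call(["wget", "-q", "--directory-prefix=" + sdir, surr_url])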

handle missing modes more gently (from Bitbucket)

This is from Richard:

  • Right now, EvaluateSurrogate accepts 'ell_m', a list of modes to load
  • Until I look at the surrogate file itself, I don't know what modes are available (e.g., single mode vs multi mode)
  • If I pass an 'ell_m' list that includes modes that aren't available in the surrogate (e.g., all modes below some lmax; modes with m<0, for a reflection-symmetric model), the code fails to find them in the .h5 file, and fails.

Would it be possible to modify 'CreateManyEvaluateSingleModeSurrogate' (old interface) to handle requested-but-missing modes more gently, by taking the intersection of the modes that are requested and the modes that are available? This is at about line 780 in surrogate.py.

Specifically, the following

   ### compile list of available modes ###
  modes_available =[]
  for kk in fp.keys():
      splitkk = kk.split('_')
      if splitkk[0][0] == 'l' and splitkk[1][0] == 'm':
        ell = int(splitkk[0][1])
        emm = int(splitkk[1][1:])
        if not (ell, emm) in exc_modes:
          modes_available.append((ell,emm))
  if ell_m is None:
    mode_keys = modes_available
  else:
    mode_keys = []
    for i, mode in enumerate(ell_m):
      if mode in exc_modes:
        print "WARNING: Mode (%d,%d) is both included and excluded! Excluding it."%mode 
      else:
        if mode in modes_available:
          mode_keys.append(mode)
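A short sketch of the requested gentler behavior (illustrative only, not existing library code): intersect the requested modes with the available ones and warn about, rather than fail on, anything missing.

def select_modes(requested, available):
    # Keep only modes that the surrogate actually provides; warn about the rest.
    selected = []
    for mode in requested:
        if mode in available:
            selected.append(mode)
        else:
            print("WARNING: Mode (%d,%d) not available in this surrogate; skipping." % mode)
    return selected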

Problems downloading surrogates with fresh install

I installed from conda-forge and tried to run a few lines from @vijayvarma392, but got a couple errors when it tried to download the surrogate data. The first error was because the output directory didn't exist. So I suggest following this line

sdir = os.path.abspath(sdir)

with

  if not os.path.isdir(sdir):
    os.makedirs(sdir)

The second error was because, although I have wget installed on my machine, the conda environment the code was running in doesn't have it. But because wget was called with os.system, nothing happened when the shell errored out, and gwsurrogate continued as if it had succeeded, which made this bug pretty hard to track down. Python now recommends using the subprocess module instead of os.system, so I suggest replacing this line

os.system('wget -q --directory-prefix='+sdir+' '+surr_url)

with

    print(subprocess.check_output('wget -q --directory-prefix='+sdir+' '+surr_url, shell=True, stderr=subprocess.STDOUT))

(along with import subprocess somewhere up above). This should error out if wget fails, and it will show what happened.

interoperability with PyCBC

I'm interested in adding support so that gwsurrogate has better interoperability with the PyCBC GW data analysis package. The primary use case is for gwsurrogate to act as a source of waveform models for PyCBC's generic Python waveform interface, and thereby be available as templates or injections for both parameter estimation and searches (or simply make use in notebooks easier through a common interface).

A while back we added a plugin interface which should allow gwsurrogate to advertise what waveforms it can provide and harmonize the interface. For an example of what interface code is needed, see the reference example plugin.
https://github.com/gwastro/example-waveform-plugin

To be clear, this requires no explicit dependency on PyCBC, but would mean that if you have both packages installed, e.g.

pip install gwsurrogate pycbc

then PyCBC programs and functions would immediately be aware of the waveforms from gwsurrogate and be able to use them.

I am willing to put together a PR. The changes would be limited to a new module that contains the interface functions, plus small additions to setup.py so that gwsurrogate advertises its capabilities globally.

What do gwsurrogate developers think?
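For concreteness, the setup.py addition might look roughly like the sketch below; the entry-point group and the module/function names are assumptions patterned on the linked example plugin, not a confirmed interface.

from setuptools import setup

setup(
    # ... existing gwsurrogate setup arguments ...
    entry_points={
        # Group name assumed from the example plugin; PyCBC would discover these
        # entry points and register the waveforms under the given approximant names.
        "pycbc.waveform.td": [
            "NRSur7dq4 = gwsurrogate.pycbc_plugin:nrsur7dq4_td",  # hypothetical module/function
        ],
    },
)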

BUG: NRSur7dq4 replacing f_low != 0 with f_ref

SETUP:
sur = gwsurrogate.LoadSurrogate('NRSur7dq4')
t, wfdic, _ = sur(m1/m2, chi1, chi2, M=m1+m2,
                  dt=my_dt, f_low=my_f_low, f_ref=my_f_ref, units='mks',
                  inclination=None, phi_ref=None)
true_f_low = instantaneous_frequency(wfdic[(2, 2)], time_index=0)

t_0, wfdic_0, _ = sur(m1/m2, chi1, chi2, M=m1+m2,
                      dt=my_dt, f_low=0, f_ref=None, units='mks',
                      inclination=None, phi_ref=None)
min_f_low = instantaneous_frequency(wfdic_0[(2, 2)], time_index=0)
min_f_ref = min_f_low

WHAT WORKS:
--> for my_f_ref = None & my_f_low = 0, f_ref & f_low inherit the value of min_f_low
--> for my_f_ref = None & my_f_low > 0, f_ref inherits the value of my_f_low
--> for any my_f_ref != None, f_ref keeps the value of my_f_ref
--> for any my_f_ref > min_f_ref, my_f_low = 0 ==> t = t_0
(i.e., setting a value for f_ref does not affect the length of the returned waveform when f_low=0)
--> for any my_f_ref < min_f_ref, sur() throws Exception: got omega_ref = ... too small!
(regardless of my_f_low, as intended)

WHAT IS BROKEN:
--> for fixed my_f_ref > min_f_ref,
any my_f_low > 0 results in the same t & true_f_low
--> for different my_f_ref > min_f_ref,
fixed my_f_low > 0 results in different t & true_f_low
(for any fixed my_f_low -- can be > or < my_f_ref and/or min_f_low)

this indicates that for any my_f_ref != None and my_f_low > 0,
the internal f_low is set to my_f_ref rather than my_f_low

that is, my_f_low=0 indeed sets the internal f_low to min_f_low and the internal f_ref
is set to my_f_ref != None (or it inherits the internal f_low when my_f_ref=None), BUT
whenever my_f_low != 0, the internal f_low looks first to see if you have passed
my_f_ref != None and inherits that value instead of my_f_low

I think this was probably implemented purposely as a guard against
throwing exceptions for 0 < my_f_low < min_f_low if the user has also passed
my_f_ref > min_f_low, because it makes sense for the internal f_low to inherit my_f_ref
in that case

however, this can result in a nasty analysis bug in the following example:

you want to set f_ref = 50 Hz to make the reference frames line up with results from a different
waveform model; and, in order to get consistent results over some region of physical parameter
space, you want to make sure that you only count SNR starting at the frequency of the
maximal min_f_low that is forced by the surrogate's length constraints over that region of
(q, chi1, chi2) -- maybe there are plenty of parameter combinations for which the surrogate
has min_f_low = 20 Hz, but there are also some parameters for which the best the model
can do is min_f_low = 30 Hz; so you set my_f_low = 30 and my_f_ref = 50, and then
you unknowingly lose all the SNR from 30 Hz to 50 Hz

I know that the documentation recommends using f_low=0 for NRSur7dq4, and I'm not saying that the model needs to accommodate this scenario of setting my_f_ref != None and my_f_low != 0; I'm just saying that if my_f_ref != None is going to behave this way, then the documentation needs to be changed.

Also, if the behavior is actually unintended, then it could be resulting in other silent bugs.
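For reference, a sketch of the precedence this report argues for (illustrative pseudo-logic, not the library's actual code):

def resolve_frequencies(f_low, f_ref, min_f_low):
    # f_ref should only set the reference frame; it should never override an
    # explicitly supplied f_low > 0.
    if f_low == 0:
        f_low = min_f_low        # f_low=0 means "return the full surrogate"
    if f_ref is None:
        f_ref = f_low            # default the reference frequency to the start
    return f_low, f_ref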

pip install error on v1.0.7

When trying to install gwsurrogate via pip in a Google Colab env, I get the following error:

! pip install gwsurrogate
Collecting gwsurrogate
  Downloading https://files.pythonhosted.org/packages/46/fa/c693450133333cc219d54d16776e8933cc50a71d7e1f1e57e90b16309afb/gwsurrogate-1.0.7.tar.gz (5.2MB)
     |████████████████████████████████| 5.2MB 8.5MB/s 
Collecting gwtools
  Downloading https://files.pythonhosted.org/packages/f5/22/237ff44970e0e0bf790b9fe88399e7fea905253431db3fbfb88e710884cb/gwtools-1.0.7.tar.gz
Building wheels for collected packages: gwsurrogate, gwtools
  Building wheel for gwsurrogate (setup.py) ... error
  ERROR: Failed building wheel for gwsurrogate
  Running setup.py clean for gwsurrogate
  Building wheel for gwtools (setup.py) ... done
  Created wheel for gwtools: filename=gwtools-1.0.7-cp36-none-any.whl size=33918 sha256=75f1bc9ec95868d26538968d7584964dc4f9f7a727919bc6333fadf75e51e357
  Stored in directory: /root/.cache/pip/wheels/0d/89/0a/f1af3daa156476d50813859496ad72cf87da64195a4cf1a511
Successfully built gwtools
Failed to build gwsurrogate
Installing collected packages: gwtools, gwsurrogate
    Running setup.py install for gwsurrogate ... error
ERROR: Command errored out with exit status 1: /usr/bin/python3 -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-n4kkdbwv/gwsurrogate/setup.py'"'"'; __file__='"'"'/tmp/pip-install-n4kkdbwv/gwsurrogate/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-6_ei33as/install-record.txt --single-version-externally-managed --compile Check the logs for full command output.

NRSur7dq4 BUG in time array (and maybe elsewhere)

I first noticed that when I called the precessing surrogate with dt=0.001 (units='mks'), only one value was being returned in the waveform array. After thoroughly checking that I was giving the intended values for all the other parameters, I still got this result, so I tried calling the model without specifying dt, and I noticed that the total waveform length was less than 0.001 seconds. Then I started calling it with the exact same parameters several times in succession, and I noticed that every new time I call it with the same parameters, the total waveform duration T = t[-1] - t[0] goes down by orders of magnitude. Here is an example:

from gwsurrogate import LoadSurrogate
SUR_PRE = LoadSurrogate('NRSur7dq4')

t1, wf1, dyn1 = SUR_PRE(1, [0.05, 0.0, 0.3], [0, 0.01, -0.1], M=150, f_ref=0, dt=None,
                        inclination=None, phi_ref=None, ellMax=None, f_low=0, units='mks',
                        dist_mpc=1, skip_param_checks=True, df=None, freqs=None, times=None,
                        mode_list=None, precessing_opts=None, tidal_opts=None,
                        par_dict=None, taper_end_duration=None)
print(t1[-1] - t1[0])

If I call this second block of code 5 times in succession, here are the outputs:
3.250824076870587
0.0024017857224370586
1.774496103171364e-06
1.3110396946548902e-09
9.686271375231078e-13
7.156446409452046e-16

This will keep going indefinitely, each time dividing by exactly
0.0007388236538315364.
Further, it doesn't matter what parameters I use or what I call the variables. The overall scale of the time array continues to decrease by exactly this factor even if I call it 5 different times with 5 different sets of parameters and assign the output to 5 different sets of variable names. It also doesn't matter whether or not I specify ellMax and/or the orientation angles.

When I restart the kernel, the cycle resets. For the first call, the waveform looks reasonable, then for the second call, it looks like it has been zoomed in, and for the third call, it zooms in again. From the fourth call onward, the plot stays the same as after the third call except that the time axis scale is multiplied by a factor of 0.0007388236538315364 with each successive call to SUR_PRE.

It seems that somewhere in the generator, state from previous calls in the current Python kernel is being saved, and something like the following is happening:
total_time *= 0.0007388236538315364

Installation problem with setup.py

From @oshaughn.

Question re gwsurrogate maintenance: I'm hitting an error like this one in this autobuild, where pip install of the released gwsurrogate fails due to

#8 2.508 long_description = pypandoc.convert('README.md', 'rst')
#8 2.508 AttributeError: module 'pypandoc' has no attribute 'convert'

which seems to rely on a deprecated use of the convert routine in the setup file. Is this something that could be fixed upstream?
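A possible upstream fix, sketched: newer pypandoc releases expose convert_file() in place of the removed convert(), and a fallback keeps pip installs working when pandoc is absent.

# setup.py sketch (assumes a pypandoc version that provides convert_file):
try:
    import pypandoc
    long_description = pypandoc.convert_file('README.md', 'rst')
except (ImportError, OSError):
    # Fall back to the raw README if pypandoc or the pandoc binary is missing.
    with open('README.md') as f:
        long_description = f.read()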
