ROBelgium / MSNoise


A Python Package for Monitoring Seismic Velocity Changes using Ambient Seismic Noise | http://www.msnoise.org

License: European Union Public License 1.1

Shell 0.18% Python 92.39% CSS 0.27% HTML 6.49% PowerShell 0.35% Batchfile 0.32%
python seismic noise research passive signal-processing data-mining volcanology seismology

msnoise's People

Contributors

asyates, gitter-badger, pgervais, rantanplan77, thomaslecocq, xavierbeguin


msnoise's Issues

hardcoded MSEED

Hi Esteban,

Yes, this line is the culprit:
https://github.com/ROBelgium/MSNoise/blob/master/msnoise/s05compute_mwcs.py#L109

It should be replaced with a dynamic extension coming from the database (a sketch is given below the quoted message); it will be in 1.3.2.

Thomas

On 7/05/2015 04:14, Esteban Chaves wrote:

Hi All;

Just wondering if we came up with a solution for this:

IOError: [Errno 2] No such file or directory: 'STACKS/01/REF/ZZ/YZ_HRIU_YZ_JUDI.MSEED'

  1. Running: compute_mwcs
  2. If one has SAC data and the output format is SAC, not miniseed.
    Thomas, I did modify this part in the previous version, but for this one
    it seems like it isn't going through the if block...? Any thoughts?

Cheers;

-Esteban
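For illustration, a minimal sketch of the kind of change described above, assuming the export format is available from the configuration (the helper name and the directory layout below are assumptions, not MSNoise's actual API):

    import os

    def stack_filename(base, filterid, components, export_format):
        """Build a REF stack path whose extension matches the configured
        export format ("SAC" vs "MSEED"). Sketch only: the real MSNoise
        config key and directory layout may differ."""
        extension = ".SAC" if export_format.upper() == "SAC" else ".MSEED"
        return os.path.join("STACKS", "%02i" % filterid, "REF", components,
                            base + extension)

    # stack_filename("YZ_HRIU_YZ_JUDI", 1, "ZZ", "SAC")
    # -> 'STACKS/01/REF/ZZ/YZ_HRIU_YZ_JUDI.SAC'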

traitsui dependency should be versioned

Database setup utility (and possibly other gui apps) fails with a
segmentation fault when running under Ubuntu 12.04. Tracking this down
showed that it stems from exceptions occurring internally within
traitsui. I.e. this is not a msnoise issue per se.

Replacing the system traitsui (4.0) with upstream (4.5) resolves this
issue.

setup.py (line 17) should be updated so that the traitsui dependency is
versioned. I cannot determine if this behaviour is present with versions
4.1 to 4.4. So:

    traitsui>=4.5   will definitely work
    traitsui>4.0    might work
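For reference, a minimal sketch of the corresponding setup.py excerpt (only the install_requires entry matters here; the other arguments are placeholders):

    from setuptools import setup

    setup(
        name="msnoise",
        # ... other arguments unchanged ...
        install_requires=[
            "traitsui>=4.5",  # 4.0 segfaults on Ubuntu 12.04; >=4.5 is known to work
        ],
    )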

Add interstation distance to SAC export

This issue proposes a tiny change to the add_corr calls: pass the station objects s1 and s2 so that the interstation distance can later be written to the SAC headers:

 # Add stations s1, s2 for keep_all :
add_corr(s1, s2, db, station1.replace('.', '_'), station2.replace('.', '_'), 
               filterid, thisdate, thistime, min30 / fe, components, corr, fe)

 # Add stations s1, s2 for keep_days :
add_corr(s1, s2,  db, station1.replace('.', '_'), station2.replace('.', '_'), filterid,
               thisdate, thistime, min30 / fe, components, corr, fe, day=True, ncorr=ncorr)

in database tools, edit:

def add_corr(s1, s2, db, station1, station2, filterid, date, time, duration, components, CF, sampling_rate, day=False, ncorr=0):
    ...
     if sac:
          export_sac(s1, s2, db, path, pair, components, filterid, CF/ncorr, ncorr)

and in export_sac too:

def export_sac(s1, s2, db, filename, pair, components, filterid, corr, ncorr=0, sac_format=None, maxlag=None, cc_sampling_rate=None):
    ....
    dist, azim, bazim = gps2DistAzimuth(s1.Y, s1.X, s2.Y, s2.X)
    ....
    tr.SetHvalue('DIST', dist)
    tr.SetHvalue('AZ', azim)
    tr.SetHvalue('BAZ', bazim)

scan_archive, mtime, and data overlap

scan_archive flags files in the data_availability table as 'M' (modified) even if they haven't changed, as long as they fall within the time range identified by 'mtime'. This is apparently related to the date precision when Python reads the metadata and compares it to the starttime and endtime columns in MySQL.
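If the sub-second precision mismatch is indeed the cause, a comparison along these lines would avoid the spurious 'M' flags (a sketch only, not MSNoise's actual code; function and argument names are illustrative):

    def _to_second(t):
        # drop sub-second precision, matching what MySQL stores
        return t.replace(microsecond=0)

    def metadata_changed(file_start, file_end, db_start, db_end):
        """Compare the file metadata with the starttime/endtime stored in
        the database while ignoring microseconds, so unchanged files are
        not re-flagged as modified."""
        return (_to_second(file_start) != _to_second(db_start) or
                _to_second(file_end) != _to_second(db_end))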

Move (pre-)processing methods to obspy

The title says it all. Preprocessing and processing functions need to be generalized and moved to obspy. This shouldn't be too hard, but requires testing!

DOC !

Write the documentation in rst and link it to the current development. Once done, either build the static pages for WP on msnoise.org, or host the doc on GitHub Pages.

min/max time lags definition

Because of #21, the min&max time lags can't be defined inline anymore, so...

Step 1:
implement the "relative time lag window" vs interstation distance (cf OVPF install)

Step 2:
Modify the configuration/Configurator to enable/disable this feature.

examples:

if enabled:
    velocity: 1 km/s
    window: 30 s
    sides: both

else:
    minlag: 10 s
    maxlag: 40 s
    sides: both

remaining parameters should be shared:
minCoh = 0.5
maxErr = 0.1
maxDt = 0.5
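A minimal sketch of what Step 1's "relative time lag window" could look like, using the parameter names from the example above (a sketch under these assumptions, not the final implementation):

    def lag_window(distance_km, velocity_kms=1.0, window_s=30.0):
        """Return (minlag, maxlag) in seconds for one station pair: the
        window starts at the expected arrival time (distance / velocity)
        and extends window_s seconds further out, applied to both sides
        of the correlation when sides is "both"."""
        minlag = distance_km / velocity_kms
        return minlag, minlag + window_s

    # e.g. a 25 km pair with the config above -> (25.0, 55.0) seconds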

compute_cc problem in merging traces

this should never happen:

2015-04-02 15:22:48 [INFO] *** Starting: Compute CC ***
2015-04-02 15:22:48 [INFO] Will compute ZZ
2015-04-02 15:22:48 [INFO] New CC Job: 2006-01-25 (1 pairs with 2 stations)
2015-04-02 15:22:55 [INFO] Pre-Whitening Traces
2015-04-02 15:22:56 [INFO] Processing CC
2015-04-02 15:22:56 [INFO] Job Finished. It took 7.45 seconds
2015-04-02 15:22:56 [INFO] New CC Job: 2006-01-26 (1 pairs with 2 stations)
2015-04-02 15:23:03 [INFO] Pre-Whitening Traces
2015-04-02 15:23:03 [INFO] Processing CC
2015-04-02 15:23:03 [INFO] Job Finished. It took 7.37 seconds
2015-04-02 15:23:03 [INFO] New CC Job: 2006-01-27 (1 pairs with 2 stations)
2015-04-02 15:23:10 [INFO] Pre-Whitening Traces
2015-04-02 15:23:11 [INFO] Processing CC
2015-04-02 15:23:11 [INFO] Job Finished. It took 7.30 seconds
2015-04-02 15:23:11 [INFO] New CC Job: 2006-01-28 (1 pairs with 2 stations)
Traceback (most recent call last):
  File "/home/silvio/anaconda/bin/msnoise", line 9, in <module>
    load_entry_point('msnoise==1.3.1', 'console_scripts', 'msnoise')()
  File "/home/silvio/anaconda/lib/python2.7/site-packages/msnoise/scripts/msnoise.py", line 393, in run
    cli(obj={})
  File "/home/silvio/anaconda/lib/python2.7/site-packages/click-4.0-py2.7.egg/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/home/silvio/anaconda/lib/python2.7/site-packages/click-4.0-py2.7.egg/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/home/silvio/anaconda/lib/python2.7/site-packages/click-4.0-py2.7.egg/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/silvio/anaconda/lib/python2.7/site-packages/click-4.0-py2.7.egg/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/silvio/anaconda/lib/python2.7/site-packages/click-4.0-py2.7.egg/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/home/silvio/anaconda/lib/python2.7/site-packages/msnoise/scripts/msnoise.py", line 174, in compute_cc
    main()
  File "/home/silvio/anaconda/lib/python2.7/site-packages/msnoise/s03compute_cc.py", line 271, in main
    basetime, tramef_Z = preprocess(db, stations, comps, goal_day, params, tramef_Z)
  File "/home/silvio/anaconda/lib/python2.7/site-packages/msnoise/s03compute_cc.py", line 130, in preprocess
    stream[gap[0]] = stream[gap[0]].__add__(stream[gap[1]], method=0, fill_value="interpolate")
  File "/home/silvio/anaconda/lib/python2.7/site-packages/obspy-0.10.1-py2.7-linux-x86_64.egg/obspy/core/trace.py", line 681, in __add__
    raise TypeError("Sampling rate differs")
TypeError: Sampling rate differs
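A hedged sketch of a guard that could sit just before the failing __add__ call in preprocess(); whether resampling the offending segment or simply skipping the gap is the right policy is a separate question:

    # Sketch: make both segments share a sampling rate before filling the gap.
    sr0 = stream[gap[0]].stats.sampling_rate
    sr1 = stream[gap[1]].stats.sampling_rate
    if sr0 != sr1:
        # bring the second segment onto the first segment's sampling rate
        stream[gap[1]].resample(sr0)
    stream[gap[0]] = stream[gap[0]].__add__(stream[gap[1]], method=0,
                                            fill_value="interpolate")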

Documentation

There are still some empty parts in the documentation that should be fixed for the next release!

Add argparse to all scripts

Replace the current way of providing options to scripts with argparse. Example for stack.py:
python 04.stack.py --interval=5
will search for CC jobs done in the last 5 days, instead of the last day (the default).
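A minimal argparse sketch for stack.py (the option name follows the example above):

    import argparse

    def parse_args():
        parser = argparse.ArgumentParser(description="Stack the CC results.")
        parser.add_argument("--interval", type=float, default=1.0,
                            help="search for CC jobs done in the last INTERVAL "
                                 "days (default: 1, i.e. the last day)")
        return parser.parse_args()

    if __name__ == "__main__":
        args = parse_args()
        print("Stacking CC jobs done in the last %.1f days" % args.interval)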

Avoid exporting mov_stacks?

Would it make sense NOT to export the stacks, but only use the 1-day stack, and have the MWCS step "dynamically" compute the moving averages?

Currently the MWCS step opens N stack files per day (N = the number of mov_stack values). With the new approach, it would read, N times, a different number of files (1, 2, 30?, i.e. the mov_stack value)...

  • the first time it would make sense to load the whole "matrix" and process from there
  • the next times, just load the necessary date span

Or always read the whole matrix and compute the sliding window on the fly?
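A rough sketch of that last option, assuming the 1-day CCFs for a pair are already loaded into a 2-D array (one row per day); this is only meant to illustrate the idea, not the actual MWCS code:

    import numpy as np

    def moving_stack(daily_ccfs, mov_stack):
        """Average mov_stack consecutive daily CCFs on the fly, without
        writing intermediate stacks to disk. daily_ccfs: (n_days, n_lags)
        array; rows with fewer than mov_stack previous days are averaged
        over whatever exists."""
        out = np.empty_like(daily_ccfs, dtype=float)
        for i in range(daily_ccfs.shape[0]):
            start = max(0, i - mov_stack + 1)
            out[i] = np.nanmean(daily_ccfs[start:i + 1], axis=0)
        return out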

Add --outfile to the 7 default plots

OK, one last one before MSNoise 1.3 goes out: add support for the following options (a click sketch follows below):

--show : bool (default True)
--outfile: str (default None): if "?.png" is provided, ? is replaced by: "pair-components-filterid-mov_stack" automatically.
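A sketch of what the two options could look like on one of the click plot commands (the command name and body below are placeholders, not the actual plotting code):

    import click

    @click.command()
    @click.option("--show/--no-show", default=True,
                  help="display the figure interactively (default: True)")
    @click.option("--outfile", default=None,
                  help='if "?.png" is given, "?" is replaced by '
                       '"pair-components-filterid-mov_stack"')
    def ccftime(show, outfile):
        # placeholder body: a real plot command would build the figure here
        if outfile and outfile.startswith("?"):
            outfile = outfile.replace("?", "pair-components-filterid-mov_stack", 1)
        click.echo("show=%s, outfile=%s" % (show, outfile))

    if __name__ == "__main__":
        ccftime()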

Problem with stack

[41] ∰ msnoise stack -r -m -i 100
Lets STACK !
/Users/ech/github/obspy/obspy/__init__.py:150: ObsPyDeprecationWarning: Module 'obspy.sac' is deprecated and will stop working with the next ObsPy version. Please import module 'obspy.io.sac' instead.
ObsPyDeprecationWarning)
/Users/ech/github/obspy/obspy/core/util/deprecation_helpers.py:55: ObsPyDeprecationWarning: Function 'obspy.core.util.gps2DistAzimuth' is deprecated and will stop working with the next ObsPy version. Please use 'obspy.geodetics.gps2dist_azimuth' instead.
ObsPyDeprecationWarning)
2015-10-14 12:28:11 [DEBUG] Starting the ref stack
2015-10-14 12:28:11 [DEBUG] Processing YZ_INDI:YZ_LAFE-ZZ-1
2015-10-14 12:28:11 [DEBUG] Found 057 updated days
2015-10-14 12:28:11 [DEBUG] New Data for YZ_INDI:YZ_LAFE-ZZ-1
2015-10-14 12:28:11 [DEBUG] Processing YZ_INDI:YZ_SAJU-ZZ-1
2015-10-14 12:28:11 [DEBUG] Found 057 updated days
2015-10-14 12:28:11 [DEBUG] New Data for YZ_INDI:YZ_SAJU-ZZ-1
2015-10-14 12:28:11 [DEBUG] Processing YZ_LAFE:YZ_SAJU-ZZ-1
2015-10-14 12:28:11 [DEBUG] Found 057 updated days
2015-10-14 12:28:11 [DEBUG] New Data for YZ_LAFE:YZ_SAJU-ZZ-1
2015-10-14 12:28:12 [DEBUG] Starting the step stack
Traceback (most recent call last):
  File "/Users/ech/anaconda/bin/msnoise", line 9, in <module>
    load_entry_point('msnoise==1.3', 'console_scripts', 'msnoise')()
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/scripts/msnoise.py", line 392, in run
    cli(obj={})
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 610, in __call__
    return self.main(*args, **kwargs)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 590, in main
    rv = self.invoke(ctx)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 936, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 782, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 416, in invoke
    return callback(*args, **kwargs)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/scripts/msnoise.py", line 191, in stack
    main('step', interval)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/s04stack.py", line 117, in main
    mov_stacks = [int(mi) for mi in mov_stack.split(',')]
ValueError: invalid literal for int() with base 10: ''
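The crash presumably comes from an empty element in the comma-separated mov_stack configuration value (an empty setting or a trailing comma makes int('') fail). A hedged one-line fix at s04stack.py line 117 could be to skip empty items:

    mov_stacks = [int(mi) for mi in mov_stack.split(',') if mi.strip()]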

MWCS jobs could start n times

Case:

  1. One process starts the MWCS (DTT) jobs, gets STA1-STA2, marks the corresponding jobs as "In Progress" in the DB
  2. Another process starts before all jobs are marked "In Progress", and gets the same STA1-STA2 job list.

Result:
MWCS is calculated two times (or more, if N processes are used)...
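One way to avoid the duplicated work would be to claim jobs atomically, i.e. with an UPDATE that only succeeds for jobs still marked "T" (to do). A rough SQLAlchemy-flavoured sketch, assuming a session and a mapped Job class with ref/flag/jobtype columns (names are assumptions, not MSNoise's actual schema):

    # Sketch only: claim one DTT job atomically.
    job = session.query(Job).filter(Job.jobtype == "DTT",
                                    Job.flag == "T").first()
    if job is not None:
        claimed = (session.query(Job)
                          .filter(Job.ref == job.ref, Job.flag == "T")
                          .update({Job.flag: "I"}, synchronize_session=False))
        session.commit()
        if claimed == 0:
            # another worker claimed it between our SELECT and UPDATE:
            # move on to the next job instead of recomputing this one
            pass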

msnoise scan_archive fail

First time trying to update with new data (I had already done an initialization), with an sqlite database:
msnoise scan_archive

scan_archive
2016-10-11 13:48:26 [INFO] *** Starting: Scan Archive ***
2016-10-11 13:48:26 [INFO] Will work on 1 threads

% ADDED THIS PRINT STATEMENT

b'data/NC/MCB/MCB.NC..HHZ.2016.281\ndata/NC/MCB/MCB.NC..HHZ.2016.282\ndata/NC/MCB/MCB.NC..HHZ.2016.283\ndata/NC/MCB/MCB.NC..HHZ.2016.284\n'

%

Traceback (most recent call last):
  File "/usr/local/bin/msnoise", line 9, in <module>
    load_entry_point('msnoise==1.4.1', 'console_scripts', 'msnoise')()
  File "/usr/local/lib/python3.5/dist-packages/msnoise-1.4.1-py3.5.egg/msnoise/scripts/msnoise.py", line 681, in run
    cli(obj={})
  File "/home/ashton/.local/lib/python3.5/site-packages/click/core.py", line 716, in __call__
    return self.main(*args, **kwargs)
  File "/home/ashton/.local/lib/python3.5/site-packages/click/core.py", line 696, in main
    rv = self.invoke(ctx)
  File "/home/ashton/.local/lib/python3.5/site-packages/click/core.py", line 1060, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ashton/.local/lib/python3.5/site-packages/click/core.py", line 889, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/ashton/.local/lib/python3.5/site-packages/click/core.py", line 534, in invoke
    return callback(*args, **kwargs)
  File "/home/ashton/.local/lib/python3.5/site-packages/click/decorators.py", line 17, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/msnoise-1.4.1-py3.5.egg/msnoise/scripts/msnoise.py", line 325, in scan_archive
    main(init, threads=ctx.obj['MSNOISE_threads'])
  File "/usr/local/lib/python3.5/dist-packages/msnoise-1.4.1-py3.5.egg/msnoise/s01scan_archive.py", line 182, in main
    files = sorted(stdout.replace('\r', '').split('\n'))
TypeError: a bytes-like object is required, not 'str'

Looks like the file list is returned as a bytes-type object and needs to be decoded to a string before being converted into a list. From what I remember this was a change in subprocess (?).

So, in s01scan_archive.py, line 181, change:

                if len(stdout) != 0:
                    files = sorted(stdout.replace('\r', '').split('\n'))

to:

                if len(stdout) != 0:
                    if not isinstance(stdout, str):
                        # Python 3: subprocess returns bytes, decode first
                        stdout = stdout.decode("utf-8")
                        # keep only the file name (the BUD-relative path is
                        # prepended again later, see the note below)
                        files = sorted(line.split('/')[-1]
                                       for line in stdout.split('\n'))
                    else:
                        files = sorted(stdout.replace('\r', '').split('\n'))

This was using the BUD directory structure, and since the find mtime returned the relative path into the BUD data directory (e.g. 'data/NETWORK/STA/file'), it was necessary to parse just the file name out, so that we didn't end up with 'data/NETWORK/STA/data/NETWORK/STA/file' when the folder was later prepended.

New Jobs inserts (doesn't update)

The fact that the new job procedure inserts and doesn't update existing rows yields an error in the following procedures, which expect unique day-pair-jobtype keys...
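A hedged sketch of an update-or-insert that would keep the day-pair-jobtype key unique (class and column names are assumptions):

    # update the existing job when the (day, pair, jobtype) key already
    # exists, insert a new row otherwise
    job = (session.query(Job)
                  .filter(Job.day == day, Job.pair == pair,
                          Job.jobtype == jobtype)
                  .first())
    if job is None:
        session.add(Job(day=day, pair=pair, jobtype=jobtype, flag="T"))
    else:
        job.flag = "T"
    session.commit()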

Adding station manually in admin

seems to fail

Hi

I'm now trying to create a new station in the configuration page in the web browser, but there's a 'builtins.TypeError' bug.

Bug says:
"TypeError: init() missing 8 required positional arguments: 'net', 'sta', 'X', 'Y', 'altitude', 'coordinates', 'instrument', and 'used' "

Is this about instantiating the class?
How can I fix the problem?

I'm a beginner in both Python and MSNoise, and this bug really worries me.
I'm using CentOS 6.5 and Python 3.5 (Anaconda).

Shuye

Make MSNoise a real Python Package

MSNoise should be importable like any other Python package: installable using package managers, pip, easy_install, eggs or whatever.

This requires:

  • Moving all code into main() functions, calling main() from if __name__ == '__main__' (see the sketch below this list)
  • Propagating the db.ini path location to all code so it finds the right location
  • Really using the output_directory (currently only used for CROSS_CORRELATIONS when keep_all is True)
  • Adding information about that in the documentation
  • Provide examples

This will allow MSNoise to be installed by sysadmins, and used by users, with no output placed in the folder of the package.
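The pattern referred to in the first bullet is the standard one (sketch):

    # every step module exposes main(); running the file directly calls it,
    # while importing the module from elsewhere does not.
    def main():
        print("actual processing goes here")

    if __name__ == '__main__':
        main()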

Py3.5 Werkzeug 0.11.1

werkzeug should be 0.10.4 in order to avoid

an operation was attempted on something that is not a socket

when running msnoise admin

check dependencies

Not sure all dependencies are properly identified (in particular for msnoise admin)!

finalize logging

logging needs:

  • logging level defined by the verbosity level of the top-level msnoise command (-v -vv -vvv -vvvv); see the sketch below this list
  • logging to the console only; the user can redirect to a file if needed
  • multiprocess logging (is that doable?)
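A minimal sketch of the verbosity-to-level mapping from the first bullet, with console-only logging as in the second (sketch only):

    import logging

    def setup_logging(verbosity):
        """Map the number of -v flags on the top-level msnoise command to a
        logging level; log to the console only, users can redirect to a
        file themselves."""
        levels = [logging.WARNING, logging.INFO, logging.DEBUG]
        level = levels[min(verbosity, len(levels) - 1)]
        logging.basicConfig(level=level,
                            format="%(asctime)s [%(levelname)s] %(message)s")

    # msnoise -v -> INFO, msnoise -vv (or more) -> DEBUG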

PEP8

Review the code to respect PEP8

MSNoise Wish List

  • Admin interface via web interface (e.g. #25)
  • Instrument response removal (e.g. #16)
  • Support for Inventory/StationXML/DatalessSeed
  • Support for Plugins
  • Preprocessing in ObsPy (e.g. #29)
  • Threading Stacks (e.g. #12)
  • Data Availability Path as relative to the archive configuration value

Switch from explicit MySQL to "any" using SQLAlchemy

In order to simplify the installation process, AND because the database usage is somewhat limited, it doesn't really make sense to require a full MySQL server to run MSNoise. Switching to SQLAlchemy will allow using sqlite, for example.
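For illustration: with SQLAlchemy the same ORM code runs against SQLite or MySQL, and only the connection URL changes (paths and credentials below are placeholders):

    from sqlalchemy import create_engine

    engine = create_engine("sqlite:///msnoise.sqlite")  # file-based, no server needed
    # engine = create_engine("mysql+pymysql://user:password@host/msnoise")  # full MySQL server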

make stack faster

[ ] - only load data for the needed dates ± max(mov_stacks), which should be enough, instead of loading the whole dataset into a huge matrix and only using a few rows from it (see the sketch below).
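A sketch of the date-span computation, assuming the list of updated days is already known (names are illustrative):

    from datetime import timedelta

    def dates_to_load(updated_days, mov_stacks):
        """Return the (start, end) span actually needed: the updated days
        padded by the longest moving stack, instead of the whole dataset.
        updated_days is a list of datetime.date objects."""
        pad = timedelta(days=max(mov_stacks))
        return min(updated_days) - pad, max(updated_days) + pad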

Database_tools_distance_computation

# CORRELATIONS

In add_corr (add info about s1 and s2, as shown in s03compute_cc):

def add_corr(s1, s2, db, station1, station2, filterid, date, time, duration, components, CF, sampling_rate, day=False, ncorr=0):
    ...
    # add info about s1 and s2
    if sac:
        export_sac(s1, s2, db, path, pair, components, filterid, CF/ncorr, ncorr)

In export_sac (add info about s1 and s2, compute dist/azim/bazim and write them to the headers):

def export_sac(s1, s2, db, filename, pair, components, filterid, corr, ncorr=0, sac_format=None, maxlag=None, cc_sampling_rate=None):
    ...
    # Compute dist, azim, bazim
    dist, azim, bazim = gps2DistAzimuth(s1.Y, s1.X, s2.Y, s2.X)
    ...
    # Add info in the headers
    tr.SetHvalue('DIST', dist)
    tr.SetHvalue('AZ', azim)
    tr.SetHvalue('BAZ', bazim)

Stack_distance_computation

# add station1, station2
if sac:
    export_sac(
        station1, station2, db, filename, pair, components, filterid, corr,
        maxlag=maxlag, cc_sampling_rate=cc_sampling_rate)

# add station1, station2
if sac:
    export_sac(
        station1, station2, db, filename, pair, components, filterid, stack_total)

Custom data_structure

Because of #21, it's no longer possible to edit the data_structure.py. I need to find a way to let users define their own structure AND parser for s002 and s01 !
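One possible shape for this, sketched below: let the user provide a path template that is expanded the same way as the built-in structures. The placeholder names mimic the built-in templates, but the exact keys MSNoise expects (and where the template would be stored) are assumptions here:

    custom_structure = "YEAR/NET/STA/CHAN.TYPE/NET.STA.LOC.CHAN.TYPE.YEAR.DAY"

    def expand(template, **fields):
        """Replace each placeholder (YEAR, NET, STA, ...) with its value."""
        path = template
        for key, value in fields.items():
            path = path.replace(key, str(value))
        return path

    # expand(custom_structure, YEAR=2016, DAY=281, NET="NC", STA="MCB",
    #        LOC="", CHAN="HHZ", TYPE="D")
    # -> '2016/NC/MCB/HHZ.D/NC.MCB..HHZ.D.2016.281'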

Problem reading Sac files

There's a problem when exporting/creating sac files during Cross-correlation:

msnoise compute_cc
/Users/ech/github/obspy/obspy/__init__.py:150: ObsPyDeprecationWarning: Module 'obspy.sac' is deprecated and will stop working with the next ObsPy version. Please import module 'obspy.io.sac' instead.
  ObsPyDeprecationWarning)
/Users/ech/github/obspy/obspy/core/util/deprecation_helpers.py:55: ObsPyDeprecationWarning: Function 'obspy.core.util.gps2DistAzimuth' is deprecated and will stop working with the next ObsPy version. Please use 'obspy.geodetics.gps2dist_azimuth' instead.
  ObsPyDeprecationWarning)
2015-10-13 19:48:12 [INFO] *** Starting: Compute CC ***
2015-10-13 19:48:12 [INFO] Will compute ZZ
2015-10-13 19:48:12 [INFO] New CC Job: 2012-01-01 (1 pairs with 2 stations)
2015-10-13 19:48:16 [INFO] Pre-Whitening Traces
2015-10-13 19:48:16 [INFO] Processing CC
Traceback (most recent call last):
  File "/Users/ech/anaconda/bin/msnoise", line 9, in <module>
    load_entry_point('msnoise==1.3', 'console_scripts', 'msnoise')()
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/scripts/msnoise.py", line 392, in run
    cli(obj={})
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 610, in __call__
    return self.main(*args, **kwargs)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 590, in main
    rv = self.invoke(ctx)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 936, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 782, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 416, in invoke
    return callback(*args, **kwargs)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/scripts/msnoise.py", line 173, in compute_cc
    main()
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/s03compute_cc.py", line 363, in main
    thisdate, thistime, params.min30 / params.goal_sampling_rate, 'ZZ', daycorr, params.goal_sampling_rate, day=True, ncorr=ndaycorr)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/api.py", line 851, in add_corr
    ncorr)
  File "/Users/ech/anaconda/lib/python2.7/site-packages/msnoise-1.3-py2.7.egg/msnoise/api.py", line 891, in export_sac
    tr = SacIO(filename)
  File "/Users/ech/github/obspy/obspy/io/sac/sacio.py", line 298, in __init__
    self.read_sac_file(filen)
  File "/Users/ech/github/obspy/obspy/io/sac/sacio.py", line 608, in read_sac_file
    self.hf = from_buffer(fh.read(4 * 70), dtype=native_str('<f4'))
AttributeError: 'str' object has no attribute 'read'
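Judging only from the traceback, this SacIO build calls .read() on whatever it is given, i.e. it seems to expect an already-open (binary) file object rather than a path string. If that is the case, a workaround at api.py line 891 might be (a guess, not necessarily the proper fix):

    with open(filename, "rb") as fh:
        tr = SacIO(fh)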

Update documentation

  • Remove instructions for mysql-python, no longer needed (thanks to pymysql)
  • Correct typos identified at NTU workshop
  • Add doc for web admin
  • Add doc for plugins
  • Complete the "TODO" parts in the doc (mostly in the how-to's)
  • Comments from Z. Spica (phpmyadmin install) - might not be needed with webmin, check
  • Update readme !
  • Document Phase Weighted Stack (configuration and effect)
  • Add example of the dt/t plot
  • Instrument Correction
