
olgaliak / seismic-deeplearning


This project is a fork of microsoft/seismic-deeplearning.

3 stars · 1 watcher · 0 forks · 2.48 MB

Deep Learning for Seismic Imaging and Interpretation

License: MIT License

Python 56.13% Makefile 0.38% Shell 3.58% Jupyter Notebook 39.91%

seismic-deeplearning's People

Contributors

dciborow, dcstwh, georgeaccnt-gh, maxkazmsft, microsoftopensource, msalvaris, msftgits, sharatsc, vapaunic, yalaudah


seismic-deeplearning's Issues

TestLoader's support for custom paths

Currently the code assumes hardcoded "test" folders and splits.
Supporting custom paths would make it easy to test on separate volumes of seismic data.
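A minimal sketch of what this could look like: a test loader that takes its data root and split name as parameters instead of assuming a hardcoded "test" folder. The class and parameter names (`TestSectionLoader`, `data_dir`, `split`) are illustrative, not the repo's actual API.

```python
import os


class TestSectionLoader:
    """Illustrative loader whose data location is configurable."""

    def __init__(self, data_dir, split="test1"):
        # data_dir can point at any seismic volume, not just a default "test" folder
        self.data_dir = data_dir
        self.split_file = os.path.join(data_dir, "splits", f"section_{split}.txt")

    def sections(self):
        # Read the section IDs listed in the split file, one per line
        with open(self.split_file) as f:
            return [line.strip() for line in f if line.strip()]
```

With this shape, testing on a separate volume is just `TestSectionLoader("/path/to/other_volume")`.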

Select contiguous data splits for test and train

Currently prepare_dutchf3.py selects the edge of the cube as the test set and the core as the train set.

It is important to train on all parts of the cube, so that when we run tests/validation we have more diverse data.
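One hedged way to achieve this (a sketch, not the repo's implementation): assign alternating contiguous blocks of inline indices to train and test, so both splits sample the whole cube rather than reserving only the edge for test. `block` and `test_every` are hypothetical parameters.

```python
def alternating_splits(n_lines, block=100, test_every=2):
    """Split [0, n_lines) into contiguous blocks, sending every
    `test_every`-th block to the test set and the rest to train."""
    train, test = [], []
    for start in range(0, n_lines, block):
        chunk = list(range(start, min(start + block, n_lines)))
        if (start // block) % test_every == test_every - 1:
            test.extend(chunk)
        else:
            train.extend(chunk)
    return train, test
```

For a 400-line axis with `block=100` and `test_every=2`, test blocks come from both the middle and the far end of the cube, while train still covers the remaining regions.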

Updates to prepare_dutchf3 script

  1. In the staging branch, patch is not used to create the crosslines or inlines, but it is in the master branch. When using this script, the indices generated in the txt files seem to go past our data dimension boundaries. We want to use patch to create the crosslines and inlines.

     [screenshot]

  2. The first patch highlighted is used, but the second is not actually used in the script, as seen above. We should rename the second patch to patch_size to avoid confusion.

     [screenshot]

  3. I created splits for two different volumes and compared them. With dimensions of, for example, (400, 500, 350), a stride of 50, and a patch of 100, the master branch computes the crossline starts as horz_locations = range(0, 400 - 100, 50), i.e. 0 to 250 in steps of 50. But since we have data up to 400, the last valid patch start is 300, and range(0, 300, 50) excludes it. Changing the stop to 400 - 100 + 1 would include 300, so I recommend adding a plus one there in case the dimensions land exactly on a mark like this.

     [screenshot]
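The off-by-one in item 3 is easy to demonstrate. Because Python's `range` excludes its stop value, `dim - patch` drops the last valid patch start when the dimensions divide evenly by the stride; adding one to the stop recovers it:

```python
dim, patch, stride = 400, 100, 50

# Current behavior: stop at dim - patch = 300, which range() excludes
current = list(range(0, dim - patch, stride))
# current == [0, 50, 100, 150, 200, 250]  -- start 300 is missing

# Proposed fix: add 1 so the last valid start (300) is included
proposed = list(range(0, dim - patch + 1, stride))
# proposed == [0, 50, 100, 150, 200, 250, 300]
```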

Add example on how to resume model training

Modify existing code samples and configs to include the option of resuming training.
For example: given a checkpoint from epoch 100, run the training for 50 more epochs, restarting where training left off.

Update: for the Dutch_F3 notebook:

  • Re-use the min_epoch param in the config.
  • Add a param for the weights path to resume from (double-check which param we reuse).
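The epoch bookkeeping for the example above can be sketched framework-agnostically. This is an illustrative helper, not code from the repo; the checkpoint-loading step itself would use whatever the training framework provides.

```python
def epochs_to_run(checkpoint_epoch, extra_epochs):
    """Given a checkpoint saved at `checkpoint_epoch`, return the epoch
    numbers to execute so training continues for `extra_epochs` more
    epochs instead of restarting from epoch 1."""
    start = checkpoint_epoch + 1
    end = checkpoint_epoch + extra_epochs
    return list(range(start, end + 1))


# e.g. resume a run checkpointed at epoch 100 for 50 more epochs:
remaining = epochs_to_run(100, 50)  # epochs 101 through 150
```

A resume-capable training loop would load the weights from the configured weights path, then iterate over exactly these epochs so schedulers and logs stay consistent with the original run.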
