Comments (6)

maawoo commented on July 28, 2024

Hi Eoghan!

Maybe this discussion is helpful for you. The processing of Sentinel-1 data was also mentioned in the release notes of v3.0 here (I can't link to the bullet point, so you have to scroll down a bit).

Cheers, Marco


davidfrantz commented on July 28, 2024

Hi Eoghan,

force-level2 can only deal with Landsat and Sentinel-2. The key to making Sentinel-1 data available to the higher-level routines is to

  • preprocess it in some way (you have done this), and
  • format it such that FORCE will digest the data (thanks Marco for stepping in, these links are spot-on).

The last part includes proper cubing; you can use force-cube for this.
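
As a rough illustration, the cubing call might look like the line below. The positional arguments (input raster, datacube directory, resampling method, resolution) are an assumption based on FORCE 3.x-era usage, and the paths are placeholders; check force-cube -h for your version:

    # hedged sketch, not confirmed FORCE usage; verify with `force-cube -h`
    # assumed arguments: input raster, datacube directory, resampling, resolution
    force-cube sentinel1_sigma0.tif /path/to/datacube near 10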

Cheers,
David


Ekeany commented on July 28, 2024

Hi Guys,

Thanks so much for the materials. It is great news that the higher-level FORCE features can be used with Sentinel-1 data.

I apologize, but I am completely new to the remote sensing scene, and after reading through the methodology I still have a few doubts about how to prepare the data correctly.

  1. At the end of my preprocessing pipeline in snappy, which takes the images from Level 1 to Level 2, I currently convert the backscatter intensities to dB units. Should I instead skip this final step and just convert the processed backscatter intensity values to Int16 by multiplying them by a factor of 10,000 and then replacing the no-data values with -9,999? (A sketch of this conversion follows the list.)

  2. You mention in the documentation that the data needs to have two bands: VV and VH. Should I rename the bands, or are the original names "Intensity_VH" and "Intensity_VV" okay?

  3. With force-cube, the datacube-definition.prj needs to exist in the datacube-dir. If I use the existing datacube-definition.prj that I used to cube my Sentinel-2 data, will both data cubes line up, even though the Sentinel-1 data does not cover as much area as the Sentinel-2 data?
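
A minimal sketch of the conversion asked about in question 1, assuming FORCE expects linear (non-dB) backscatter scaled to Int16; the scale factor and no-data value are the ones proposed above, and the helper name is hypothetical:

    import numpy as np

    SCALE = 10000   # scale factor proposed in question 1 (not a confirmed FORCE convention)
    NODATA = -9999  # no-data value proposed in question 1

    def backscatter_to_int16(sigma0, invalid):
        """Hypothetical helper: scale linear backscatter (sigma0 array)
        to Int16, marking no-data pixels given by the boolean mask."""
        out = np.round(sigma0 * SCALE).astype(np.int16)
        out[invalid] = NODATA
        return out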

Thanks a million,
Eoghan.


Ekeany commented on July 28, 2024

Hi,

Sorry to bother you again, but I am completely stuck at the moment. I have followed Pekka's workflow from the discussion linked above for preparing Sentinel-1 higher-level products. So far I have:

  1. Downloaded all Sentinel-1A/1B IW Level-1 GRD products from the Copernicus Open Access Hub for my study area, in both ascending and descending orbits, for a time period of one year.

  2. Run my preprocessing script in Python snappy on the L1 data to create the Level 2 data. I used a scaling factor of 1,000 and -9999 as the no-data value to convert the output to the required Int16 format. The output files were also named according to the naming convention you pointed out, e.g. 20170522_LEVEL2_S1BID_SIG.tif, 20170523_LEVEL2_S1AID_SIG.tif (a sketch of this naming step follows the list).

  3. Run force-cube to dice and resample the L2 data into my grid (the same one I used for the Sentinel-2 data), using nearest-neighbor resampling to TM65 / Irish Grid.
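
For reference, a tiny sketch of the naming step (2); the helper is hypothetical but follows the convention quoted above:

    from datetime import date

    def level2_name(day: date, sensor: str) -> str:
        """Build a Level 2 filename such as 20170522_LEVEL2_S1BID_SIG.tif."""
        return f"{day:%Y%m%d}_LEVEL2_{sensor}_SIG.tif"

    print(level2_name(date(2017, 5, 22), "S1BID"))  # 20170522_LEVEL2_S1BID_SIG.tif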

However, when I run the TSA.prm file to produce the higher-level features, nothing happens; only the citeme.txt and data-cube projection files are created in the output directory, as seen below:

[screenshot "tsa": listing of the output directory]

I am using the same parameters as in the Sentinel-2 tsa.prm, except for the following:

SENSORS = S1AID S1BID
SCREEN_QAI = NODATA
INDEX = VV VH

This is the output from gdalinfo on one of my data cubes; I have a feeling something may be wrong here:

[screenshots "issue_gdal_info" and "issue_gdal_info_2": gdalinfo output for one tile]

I am not sure what I am doing wrong; any help would be very much appreciated.
Eoghan.


ernstste commented on July 28, 2024

Hi Eoghan,

The most likely reason for force-higher-level to run without creating output is that there are no Level 2 products to process. I'd start by checking whether the paths are correct and whether the processing criteria match the files on disk:

  • Does the tile range in the parameter file include your tiles? Is there a tile whitelist?
  • Does the date range / DOY range include your products? Note that only DOYs within the date range are considered.

If you don't find the error there, you may want to post the full parameter file.
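
For orientation, a sketch of the parameter lines these checks refer to; the values are placeholders, and names may differ slightly between FORCE versions:

    X_TILE_RANGE = 0 10
    Y_TILE_RANGE = 0 10
    FILE_TILE = NULL
    DATE_RANGE = 2017-01-01 2017-12-31
    DOY_RANGE = 1 365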


Ekeany commented on July 28, 2024

Hi Stefan,

Thanks for the quick response. After pulling my hair out over the weekend, I finally figured out the problem.

The problem was that I had opened the tsi.prm file in Visual Studio Code on my Windows machine before uploading it to the Linux server, which presumably left Windows (CRLF) line endings that the parameter parser could not handle.

Once I recreated the parameter file on the server, it ran smoothly and the output looks great.
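
For anyone hitting the same problem, a minimal sketch that rewrites a parameter file with Unix line endings, assuming CRLF endings were indeed the culprit:

    # rewrite tsi.prm in place with Unix (LF) line endings
    with open("tsi.prm", "rb") as f:
        text = f.read().replace(b"\r\n", b"\n")
    with open("tsi.prm", "wb") as f:
        f.write(text)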

