pytroll / satpy

Python package for earth-observing satellite data processing

Home Page: http://satpy.readthedocs.org/en/latest/

License: GNU General Public License v3.0

Python 99.80% Gherkin 0.20%
python satellite weather hacktoberfest dask xarray closember

satpy's Introduction

Satpy


The Satpy package is a Python library for reading and manipulating meteorological remote sensing data and writing it to various image and data file formats. Satpy comes with the ability to make various RGB composites directly from satellite instrument channel data or higher-level processing output. The pyresample package is used to resample data to different uniform areas or grids.

The documentation is available at http://satpy.readthedocs.org/.

Installation

Satpy can be installed from PyPI with pip:

pip install satpy

It is also available from conda-forge for conda installations:

conda install -c conda-forge satpy

Code of Conduct

Satpy follows the same code of conduct as the PyTroll project. For reference it is copied to this repository in CODE_OF_CONDUCT.md.

As stated in the PyTroll home page, this code of conduct applies to the project space (GitHub) as well as the public space online and offline when an individual is representing the project or the community. Online examples of this include the PyTroll Slack team, mailing list, and the PyTroll twitter account. This code of conduct also applies to in-person situations like PyTroll Contributor Weeks (PCW), conference meet-ups, or any other time when the project is being represented.

Any violations of this code of conduct will be handled by the core maintainers of the project including David Hoese, Martin Raspaud, and Adam Dybbroe. If you wish to report one of the maintainers for a violation and are not comfortable with them seeing it, please contact one or more of the other maintainers to report the violation. Responses to violations will be determined by the maintainers and may include one or more of the following:

  • Verbal warning
  • Ask for public apology
  • Temporary or permanent ban from in-person events
  • Temporary or permanent ban from online communication (Slack, mailing list, etc)

For details see the official CODE_OF_CONDUCT.md.

satpy's People

Contributors

adybbroe, ameraner, bengtrydberg, benr0, djhoese, gerritholl, ghiggi, isotr0py, jhbravo, joleenf, lahtinep, loerum, mherbertson, mraspaud, pdebuyl, pnuu, pre-commit-ci[bot], rdaruwala, samain-eum, seenno, sfinkens, simonrp84, sjoro, strandgren, tommyjasmin, wroberts4, youvaeumex, yufeizhu600, yukaribbba, zxdawn


satpy's Issues

Load metadata without loading data

It would be beneficial to users of satpy, and for code clarity, if satpy's readers could load the metadata for a dataset without having to load the data.

  1. If a user loads just the metadata first, how do they then ask for the data?
  2. How does satpy use previously loaded metadata when data is loaded? Does it waste the time to recompute it? Cache? Where is it cached? Caching in the main level reader probably makes the most sense since it would have the fully combined/merged metadata and could then take the concatenated data arrays and create a new Dataset object.
  3. What happens to areas? Are they loaded with the rest of the metadata? Is there another keyword to load only metadata but not with areas? What's the default?

Any other issues with trying to make this work?
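Question 2 above (caching the merged metadata in the main-level reader) could look roughly like this. The class and method names here are invented for illustration and are not satpy API:

```python
# A minimal sketch (not satpy's actual API): cache metadata in the reader so a
# later data load can reuse it instead of recomputing it.

class MetadataCachingReader:
    """Hypothetical reader that caches per-dataset metadata."""

    def __init__(self, file_metadata):
        # file_metadata: dict mapping dataset name -> metadata dict
        self._file_metadata = file_metadata
        self._cache = {}

    def load_metadata(self, name):
        # Return cached metadata if we already computed/merged it.
        if name not in self._cache:
            self._cache[name] = dict(self._file_metadata[name])
        return self._cache[name]

    def load(self, name, data):
        # Reuse the cached metadata when the actual data is requested.
        meta = self.load_metadata(name)
        return {"data": data, "attrs": meta}
```

The key point is that the second call to `load_metadata` returns the cached object, so asking for the data later does not repeat the merge work.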

High resolution 'true_color' composite issue for Himawari-8 AHI

Code Sample, a minimal, complete, and verifiable piece of code

from satpy.scene import Scene
from datetime import datetime, timedelta


ppp_config_directory = "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/etc/"
path = "/Users/andyprata/Documents/python/himawari-8/hsd"
path_output = "/Users/andyprata/Documents/python/himawari-8/png/"

time_slot = datetime(2017, 11, 26, 23)
time_slot_str = time_slot.strftime("%Y%m%d%H%M")
fn_out = 'global_hima_'+time_slot_str
print "Processing: "+time_slot_str
global_scene = Scene(platform_name="Himawari-8", sensor="ahi", reader="ahi_hsd",
                     start_time=time_slot,
                     end_time=time_slot + timedelta(minutes=10),
                     base_dir=path,
                     ppp_config_dir=ppp_config_directory)

rgb_composite = 'true_color'
print "Loading composite: "+rgb_composite
global_scene.load([rgb_composite])
print "Saving file..."
global_scene.save_dataset(rgb_composite, path_output+fn_out+'_'+rgb_composite+'.png')
print "Done. File saved here: " + path_output+fn_out+'_'+rgb_composite+'.png'

Problem description

The issue has been described here: https://groups.google.com/forum/#!topic/pytroll/pN9JLV2j-88
The above code results in the following error:
KeyError: "No dataset matching 'true_color' found"
The following workaround can be used to avoid the above error, but the output is not the expected result:

local_scene = global_scene.resample(global_scene['B03'].info['area'])
local_scene.save_dataset(rgb_composite, path_output+fn_out+'_'+rgb_composite+'.png')

Expected Output

image
Note: this image was generated using the 'true_color_reducedsize' composite.

Actual Result, Traceback if applicable

image

Versions of Python, package at hand and relevant dependencies

Python version: 2.7.14
NumPy version: 1.13.1

Proposal: Give warning when sensor is not implemented

When constructing an empty Scene for an unknown sensor, there is no warning that the sensor's reader config could not be found in the ppp_config_dir.

Consider adding a warning so that

scn = Scene(sensor='foo')

produces a debug message like "No reader config found called foo.yaml".

This would mirror the composites, where exactly this logic already exists: calling with sensor 'foo' currently produces the debug message "No composite config found called foo.yaml".
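A rough sketch of the proposed check, assuming reader configs live in a directory of <sensor>.yaml files (the function name and layout are illustrative, not satpy internals):

```python
import os
import warnings

def check_sensor_config(sensor, config_dir):
    """Warn if no reader config exists for the given sensor."""
    # Hypothetical convention: one <sensor>.yaml reader config per sensor.
    config_path = os.path.join(config_dir, sensor + ".yaml")
    if not os.path.isfile(config_path):
        warnings.warn("No reader config found called {}.yaml".format(sensor))
        return False
    return True
```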

3.7 reflectance RGB recipes are wrong or missing, in particular for SEVIRI

Code Sample, a minimal, complete, and verifiable piece of code

from satpy.scene import Scene
from satpy.utils import debug_on
debug_on()
from glob import glob
FILES = glob("/home/a000680/data/hrit/20150420/*201504201000*__")
glbscene = Scene(filenames=FILES, reader='hrit_msg')
glbscene.load(['day_microphysics'])
glbscene.show('day_microphysics')

Problem description

Loading the day_microphysics RGB for SEVIRI wrongly uses the HRVIS channel, which results in a KeyError when running the above code. The reason is that the recipe is described only in the visir.yaml file.

Furthermore, there is no wintertime recipe for the above RGB, and there is no snow RGB either. Also, the shortwave (VIS/NIR) channels are not sun-zenith corrected, which they should be.

Expected Output

Actual Result, Traceback if applicable

The above example results in a KeyError:

/home/a000680/usr/src/satpy/satpy/scene.pyc in show(self, dataset_id, overlay)
    593 
    594         from satpy.writers import get_enhanced_image
--> 595         get_enhanced_image(self[dataset_id], overlay=overlay).show()
    596 
    597     def images(self):

/home/a000680/usr/src/satpy/satpy/scene.pyc in __getitem__(self, key)
    305     def __getitem__(self, key):
    306         """Get a dataset."""
--> 307         return self.datasets[key]
    308 
    309     def __setitem__(self, key, value):

/home/a000680/usr/src/satpy/satpy/readers/__init__.pyc in __getitem__(self, item)
    181         key = self.get_key(item)
    182         if key is None:
--> 183             raise KeyError("No dataset matching '{}' found".format(str(item)))
    184         return super(DatasetDict, self).__getitem__(key)
    185 

KeyError: "No dataset matching 'day_microphysics' found"

Versions of Python, package at hand and relevant dependencies

SatPy version 0.7.7
Python 2.7.5

Merge area definition files

Newer versions of pyresample should be able to load multiple area definition files together. Right now satpy only loads the package-provided one. It should merge the package-provided file and any user-provided ones.
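The intended merge semantics could look like the sketch below (this is illustrative only, not pyresample's API): user-provided area definitions override package-provided ones with the same name.

```python
# Sketch: merge area-definition dicts, later (user) files winning on clashes.

def merge_area_defs(package_areas, *user_areas):
    """Merge area-definition dicts; user-provided entries override package ones."""
    merged = dict(package_areas)
    for areas in user_areas:
        merged.update(areas)
    return merged
```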

Update Documentation

Problem description

The project documentation at satpy.readthedocs.io is lacking a lot of information. I'm currently rewriting some of it, but wanted to keep a running public TODO list of what needs to be added or updated. Everyone, feel free to comment ideas and I'll add them to the list. If anyone wants to start one of these tasks, please let me know first.

  • Mention optional dependencies in installation instructions
  • In index mention that although the interfaces aren't stable, satpy is used operationally
  • Add overview page providing a quick summary of each of the main components of satpy: reading, compositing, resampling, enhancing, writing, datasets (DataArrays), and configuration files.
  • Add reader creation tutorial
  • Add writer creation tutorial
  • Add composite creation tutorial
  • Add custom enhancement tutorial
  • Add various examples of how the Scene can be created
  • Add page for file and reader finding utility function (if not a section to the above)
  • Add tutorial with overlays or other writer customizations
  • Add contribution guide (have in root of project and linked in documentation?)

What tutorials go in the docs directly versus go in a jupyter notebook in pytroll/pytroll-examples?

Resample adds junk to image

I'm using satpy to resample to some different areas. Sometimes the image gets green junk in it.
Input is an MSG3 .nat file.

I get the feeling that it is related to 'overview' and night time images.

[image: untitled 9] Broken one. 00:30Z

[image: untitled 6] Other broken one. 03:30Z

[image: untitled 7] This is basically what I want. 04:30Z

Do you have any ideas how I can mitigate this?

Separate file discovery from Scene init

There are currently two methods for initializing a Scene:

  1. Provide the filenames for the files to be read as input. Optionally provide the name (or instance) of the reader to be used to open the provided files.
  2. Provide information for sorting through available on-disk files in a base_dir and discover which reader should be used and which files match the parameters.

It has been determined from off-GitHub conversation that the 2nd method should be separated out into its own function. Having the keywords required for both methods in the same __init__ method has caused confusion among users, especially when compared to the legacy mpop interface.

Questions to answer:

  1. Should the discovery function be available as a separate utility function or Scene classmethod? If utility function, where should it be located and should it be added to satpy/__init__.py?

  2. What keywords related to file and reader loading are presented in Scene.__init__?

    reader
    filenames
    reader_kwargs
    others?

  3. Is reader still optional in the Scene init? Making it optional would allow the user to provide files in multiple formats and have multiple readers created to read those files, a feature that isn't heavily tested but is intended to work.

  4. Does the discovery function return a list of files or a dictionary of reader_name: [filenames]?

  5. Do we allow for filtering based on metadata in the Scene.__init__? Does it have to be metadata only available inside the file or can it be metadata available from the filename? I believe filtering on filename metadata is currently available via the discovery functionality.

  6. Does separating these interfaces stop us or make it more difficult to add certain features in the future?

  7. Do we also change **metadata to just metadata, or even remove it altogether? Its function is to add metadata to the Scene.info dictionary.
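For question 4, the dictionary-returning option could be sketched as below. The reader names and filename patterns are invented for illustration; satpy's real readers declare their file patterns in YAML configs:

```python
# Sketch of a standalone discovery helper mapping reader names to matching
# files. READER_PATTERNS is a stand-in for the YAML-configured patterns.
import fnmatch

READER_PATTERNS = {
    "ahi_hsd": ["HS_H08_*.DAT"],
    "hdfeos_l1b": ["MOD02*.hdf", "MOD03*.hdf", "MYD02*.hdf", "MYD03*.hdf"],
}

def find_files_by_reader(filenames, patterns=READER_PATTERNS):
    """Group filenames under the first reader whose pattern they match."""
    assigned = {}
    for fname in filenames:
        for reader, globs in patterns.items():
            if any(fnmatch.fnmatch(fname, g) for g in globs):
                assigned.setdefault(reader, []).append(fname)
                break
    return assigned
```

Returning reader_name -> [filenames] (rather than a flat list) lets the Scene construct one reader instance per format.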

`resolution` keyword to Scene.load not having desired effect

When loading a composite with the resolution keyword set, the chosen resolution is not always respected. This behaviour was seen when running on MODIS data in both Python 2.7 and 3.6: in 2.7 the data loaded was 1000 m; in 3.6 it was the maximum resolution available.

Here is the test script:

    from datetime import datetime
    from satpy.scene import Scene

    scn = Scene(platform_name="Aqua", sensor="modis",
                base_dir="/home/a001673/data/satellite",
                reader='hdfeos_l1b',
                start_time=datetime(2016, 8, 5, 0, 0),
                end_time=datetime(2016, 8, 15, 0, 0))

    scn.load(['true_color'], resolution=1000)

Adding coastlines does not preserve transparency

When adding coastlines to an image, the image is no longer transparent. Perhaps this is inevitable, as I guess the coastlines/overlay is stored in the alpha channel?

See e.g. the example at
pytroll-examples/satpy/viirs_smoke.ipynb

Creating composites post-load

I'd like to create composites after resampling, not at load time. This would save both memory and CPU time, especially when large native-resolution data is resampled to a much smaller domain. There's always the option to use mpop.image.GeoImage or trollimage.Image, but I think it'd be cleaner to use the existing built-in mechanics.
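The idea can be sketched with a toy recipe registry (nothing here is satpy or mpop API): composite recipes are plain functions applied to channels after resampling, so the expensive work happens on the small resampled arrays:

```python
# Toy sketch: register composite recipes and apply them post-resampling.
COMPOSITE_RECIPES = {}

def register_composite(name):
    def deco(func):
        COMPOSITE_RECIPES[name] = func
        return func
    return deco

@register_composite("difference")
def difference(channels):
    # Per-pixel channel difference, as a stand-in for a real recipe.
    return [a - b for a, b in zip(channels["IR_108"], channels["IR_120"])]

def generate_composite(name, resampled_channels):
    """Build a composite from already-resampled channels."""
    return COMPOSITE_RECIPES[name](resampled_channels)
```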

Issue loading and saving 'true_color' with MODIS data

Code Sample, a minimal, complete, and verifiable piece of code

from satpy.utils import debug_on; debug_on()
from glob import glob
from satpy.scene import Scene

ppp_config_directory = "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/etc/"
fnames = glob("/Users/andyprata/Documents/python/MODIS/2017_nt_fires/T20173430145/MOD*hdf")
path_output = '/Users/andyprata/Documents/python/MODIS/2017_nt_fires/T20173430145/'
scn = Scene(platform_name="EOS-Terra",
            sensor="modis",
            filenames=fnames,
            reader='hdfeos_l1b',
            ppp_config_dir=ppp_config_directory)
rgb_composite = 'true_color'
scn.load([rgb_composite])
scn.save_dataset(rgb_composite, path_output+rgb_composite+'.png')

Problem description

Based on the debug output I think the issue is something to do with resampling the RGB bands.

Expected Output

Rayleigh-corrected true color image.

Actual Result, Traceback if applicable

[DEBUG: 2017-12-19 15:21:55 : satpy.readers] Reading ['/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/etc/readers/hdfeos_l1b.yaml', '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/etc/readers/hdfeos_l1b.yaml', '/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/etc/readers/hdfeos_l1b.yaml']
[DEBUG: 2017-12-19 15:21:56 : satpy.readers.yaml_reader] Assigning to hdfeos_l1b: ['/Users/andyprata/Documents/python/MODIS/2017_nt_fires/T20173430145/MOD03.A2017343.0145.061.2017343070358.hdf', '/Users/andyprata/Documents/python/MODIS/2017_nt_fires/T20173430145/MOD021KM.A2017343.0145.061.2017343133022.hdf', '/Users/andyprata/Documents/python/MODIS/2017_nt_fires/T20173430145/MOD02QKM.A2017343.0145.061.2017343133022.hdf', '/Users/andyprata/Documents/python/MODIS/2017_nt_fires/T20173430145/MOD02HKM.A2017343.0145.061.2017343133022.hdf']
[DEBUG: 2017-12-19 15:21:57 : satpy.composites] Looking for composites config file modis.yaml
[DEBUG: 2017-12-19 15:21:57 : satpy.composites] Looking for composites config file visir.yaml
[DEBUG: 2017-12-19 15:21:57 : satpy.readers.yaml_reader] No coordinates found for DatasetID(name='longitude', wavelength=None, resolution=1000, polarization=None, calibration=None, modifiers=())
[DEBUG: 2017-12-19 15:21:57 : satpy.readers.yaml_reader] No coordinates found for DatasetID(name='latitude', wavelength=None, resolution=1000, polarization=None, calibration=None, modifiers=())
[DEBUG: 2017-12-19 15:21:57 : satpy.readers.yaml_reader] No coordinates found for DatasetID(name='longitude', wavelength=None, resolution=500, polarization=None, calibration=None, modifiers=())
[DEBUG: 2017-12-19 15:21:57 : satpy.readers.yaml_reader] No coordinates found for DatasetID(name='latitude', wavelength=None, resolution=500, polarization=None, calibration=None, modifiers=())
[DEBUG: 2017-12-19 15:21:59 : satpy.composites] Applying sun zen correction
[DEBUG: 2017-12-19 15:21:59 : satpy.composites] Interpolating coszen calculations for higher resolution band
[DEBUG: 2017-12-19 15:21:59 : satpy.composites] Apply the standard sun-zenith correction [1/cos(sunz)]
[DEBUG: 2017-12-19 15:22:00 : satpy.composites] Sun-zenith correction applied. Computation time:   1.3 (sec)
[DEBUG: 2017-12-19 15:22:00 : satpy.composites] Applying sun zen correction
[DEBUG: 2017-12-19 15:22:00 : satpy.composites] Apply the standard sun-zenith correction [1/cos(sunz)]
[DEBUG: 2017-12-19 15:22:00 : satpy.composites] Sun-zenith correction applied. Computation time:   0.2 (sec)
[WARNING: 2017-12-19 15:22:00 : satpy.scene] Delaying generation of DatasetID(name='1', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=('sunz_corrected', 'rayleigh_corrected')) because of incompatible areas
[WARNING: 2017-12-19 15:22:00 : satpy.scene] Delaying generation of DatasetID(name='true_color', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=None) because of dependency's delayed generation: DatasetID(name='1', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=('sunz_corrected', 'rayleigh_corrected'))
[WARNING: 2017-12-19 15:22:00 : satpy.scene] Missing prerequisite for 'DatasetID(name='true_color', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=None)': 'DatasetID(name='1', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=('sunz_corrected', 'rayleigh_corrected'))'
[DEBUG: 2017-12-19 15:22:00 : satpy.composites] Applying sun zen correction
[DEBUG: 2017-12-19 15:22:00 : satpy.composites] Apply the standard sun-zenith correction [1/cos(sunz)]
[DEBUG: 2017-12-19 15:22:01 : satpy.composites] Sun-zenith correction applied. Computation time:   0.3 (sec)
[INFO: 2017-12-19 15:22:01 : satpy.composites] Removing Rayleigh scattering and aerosol absorption
[INFO: 2017-12-19 15:22:01 : pyspectral.rayleigh] Atmosphere chosen: midlatitude summer
[DEBUG: 2017-12-19 15:22:01 : pyspectral.rayleigh] LUT filename: /Users/andyprata/.local/share/pyspectral/rayleigh_only/rayleigh_lut_midlatitude_summer.h5
[DEBUG: 2017-12-19 15:22:01 : pyspectral.rsr_reader] Filename: /Users/andyprata/.local/share/pyspectral/rsr_modis_EOS-Terra.h5
[DEBUG: 2017-12-19 15:22:01 : pyspectral.rsr_reader] Filename: /Users/andyprata/.local/share/pyspectral/rsr_modis_EOS-Terra.h5
[DEBUG: 2017-12-19 15:22:02 : pyspectral.rayleigh] Time - Interpolation: 0.130395
[DEBUG: 2017-12-19 15:22:02 : satpy.composites] Applying sun zen correction
[DEBUG: 2017-12-19 15:22:02 : satpy.composites] Interpolating coszen calculations for higher resolution band
[DEBUG: 2017-12-19 15:22:02 : satpy.composites] Apply the standard sun-zenith correction [1/cos(sunz)]
[DEBUG: 2017-12-19 15:22:03 : satpy.composites] Sun-zenith correction applied. Computation time:   1.2 (sec)
[WARNING: 2017-12-19 15:22:03 : satpy.scene] Delaying generation of DatasetID(name='1', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=('sunz_corrected', 'rayleigh_corrected')) because of incompatible areas
[WARNING: 2017-12-19 15:22:03 : satpy.scene] Delaying generation of DatasetID(name='4', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=('sunz_corrected', 'rayleigh_corrected')) because of incompatible areas
[WARNING: 2017-12-19 15:22:03 : satpy.scene] The following datasets were not created: DatasetID(name='true_color', wavelength=None, resolution=None, polarization=None, calibration=None, modifiers=None)
Traceback (most recent call last):
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython/core/interactiveshell.py", line 2481, in safe_execfile
    self.compile if kw['shell_futures'] else None)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython/utils/py3compat.py", line 289, in execfile
    builtin_mod.execfile(filename, *where)
  File "/Users/andyprata/PycharmProjects/himawari/src/plot_modis.py", line 39, in <module>
    scn.save_dataset(rgb_composite, path_output+rgb_composite+'.png')
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/scene.py", line 622, in save_dataset
    writer.save_dataset(self[dataset_id],
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/scene.py", line 303, in __getitem__
    return self.datasets[key]
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/satpy-0.7.5-py2.7.egg/satpy/readers/__init__.py", line 182, in __getitem__
    raise KeyError("No dataset matching '{}' found".format(str(item)))
KeyError: "No dataset matching 'true_color' found"

Versions of Python, package at hand and relevant dependencies

Python version: 2.7.14
NumPy version: 1.13.1

Proposal: Add "filters"

Currently SatPy has the idea of readers, writers, compositors, and modifiers. I would like to add a "filter" object type. @mraspaud and I have been talking about this on and off for a while so I thought I'd make the discussion a little more official.

A Filter is an optional object or function that when enabled checks the currently loaded datasets against some condition. If the condition is satisfied then the Dataset does not continue on in processing. Otherwise the Dataset continues on as normal. A Filter should not modify the values inside a Dataset (only modifiers should do this). However, a Filter may filter out individual pixels if it makes sense.

During our last discussion it was decided that Filters are executed both before and after composites are generated. I'm not sure whether the "before" execution is done inside the readers.

Use cases (please add more as they come up):

  1. Day or night filter: useful for reflectances that shouldn't be generated if a scene is more than X% night time. Or 'fog' like composites that are only useful if it is X% night time. How does this differ from per-pixel filtering?
  2. Time of year filter: Some products or composites are most useful at a certain time of year. I've seen one case like this with Polar2Grid where there is a certain stretching/enhancement applied to a product that makes it "summer land surface temperature", but another stretching is just "land surface temperature". Not sure if there are others.
  3. Quality filtering: Some data arrive degraded after subnominal reception. The filter would check some parameters (high frequencies in the data, for example) and prevent bad-quality products from being generated.
  4. Pressure level filtering: Some retrieval products have a third, vertical dimension. This could arguably need its own more complex feature set added to the satpy readers for selecting a dataset based on pressure.
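A minimal sketch of the Filter idea using the day/night condition from use case 1. All names and thresholds are illustrative; nothing here is satpy API:

```python
# Sketch: a filter that drops a dataset (returns None) if too much of the
# scene is night, and otherwise passes it through unchanged (never modified).

def make_night_fraction_filter(max_night_fraction):
    """Return a filter dropping datasets whose night fraction is too high."""
    def night_filter(dataset):
        # dataset: dict with a 'sun_zenith' list of per-pixel angles in degrees.
        angles = dataset["sun_zenith"]
        night = sum(1 for a in angles if a >= 90.0)
        fraction = night / len(angles)
        return None if fraction > max_night_fraction else dataset
    return night_filter
```

Returning the dataset untouched (or None) keeps the contract stated above: filters decide whether processing continues, while only modifiers change values.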

Loading SEVIRI channel data which are not available in (native format) file

The MSG SEVIRI native format comes in one single file, which may not include all channels.
Suppose the user tries to load a channel which is not present in the file: at the moment the get_dataset method in the native_msg reader raises a KeyError. So the issue is taken care of in principle, as it is possible to load other channels available in the file with the same Scene instance. But the KeyError is not propagated properly upwards in the code, resulting in somewhat messy exception printouts, which are then repeated every time a new channel is loaded.

As the HRIT data comes in segments with one file per channel, this is not an issue there, but it could potentially be an issue for other data with multiple channels in the same file.

We see two possible options:

  1. Properly handle the KeyError upstream
  2. When creating the scene object from the native format the header is read, and at that point missing and available datasets/channels are known. This information could be used before going into the channel loading. In that case the get_dataset would never need to return None or raise a KeyError.

Sauli Joro & Adam Dybbroe
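Option 2 above could be sketched as follows; the class and header structure are invented for illustration and are not the native_msg reader's actual code:

```python
# Sketch of option 2: the channel list is known from the file header, so
# requested channels can be validated before any loading is attempted.

class NativeFileSketch:
    def __init__(self, available_channels, data):
        self.available = set(available_channels)  # known from the header
        self._data = data

    def load_channels(self, requested):
        """Load only channels present in the file; report the missing ones."""
        missing = [ch for ch in requested if ch not in self.available]
        loaded = {ch: self._data[ch] for ch in requested if ch in self.available}
        return loaded, missing
```

With this shape, get_dataset never needs to return None or raise a KeyError: the caller can warn once about `missing` instead.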

OMPS NPP NMSO2

I'm a new user of satpy and am having trouble loading some OMPS data downloaded from the NASA EarthData website. The file names are as follows:

OMPS-NPP_NMSO2-L2_2017m0707t235247_o29505_2017m0708t125922.h5

I thought that I would need to modify the omps_edr.yaml file to address this, but I continue to get an error when trying to load it, like so:

gs = Scene(platform_name='NPP',sensor='omps',filenames=files)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-7-2d834f8862df> in <module>()
----> 1 gs = Scene(platform_name='NPP',sensor='OMPS',filenames=files)

/data/aqf2/barryb/anaconda2/lib/python2.7/site-packages/satpy/scene.py in __init__(self, filenames, ppp_config_dir, reader, base_dir, sensor, start_time, end_time, area, reader_kwargs, **metadata)
     99                                                     base_dir=base_dir,
    100                                                     reader=reader,
--> 101                                                     reader_kwargs=reader_kwargs)
    102         self.info.update(self._compute_metadata_from_readers())
    103         self.datasets = DatasetDict()

/data/aqf2/barryb/anaconda2/lib/python2.7/site-packages/satpy/scene.py in create_reader_instances(self, filenames, base_dir, reader, reader_kwargs)
    145                       sensor=self.info.get("sensor"),
    146                       filenames=filenames,
--> 147                       reader_kwargs=reader_kwargs)
    148 
    149     @property

/data/aqf2/barryb/anaconda2/lib/python2.7/site-packages/satpy/readers/__init__.pyc in __call__(self, filenames, sensor, reader, reader_kwargs)
    350                     remaining_filenames)))
    351         if not reader_instances:
--> 352             raise ValueError("No supported files found")
    353         return reader_instances
    354 

ValueError: No supported files found

Calculate solar angles if requested on load()

Some of the readers can already return solar zenith/azimuth (and satellite zenith) angles if requested in the load() call. In these cases the data are readily available in the files, but it would be awesome to have them available for all the readers. At least the solar zenith angles can be calculated using pyorbital, and are already used in many composites for adjusting illumination. The illumination correction is done using modifiers for each channel, and I can't see whether that mechanism can be used to blend two channels together.

An alternate solution I see for the DayNightCompositor is to make the solar_zenith_angle an optional prerequisite and calculate the angles if they are not available. The upside of this method is that the memory footprint is somewhat lower, as the angles are present only while the composite is generated; but then again the angles would need to be cached for other composites using DayNightCompositor.

In any case, the load()-able procedural/virtual/implicit/indirect/deducible/inferred datasets would be quite a useful feature.

Any thoughts?
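For readers without angle datasets, the solar zenith angle can be computed from time and position alone. The sketch below uses a simple stdlib-only approximation (good to roughly a degree); satpy/pyorbital would use a more precise ephemeris:

```python
import math

def solar_zenith_angle(lat_deg, lon_deg, utc_hours, day_of_year):
    """Approximate solar zenith angle in degrees (no refraction correction)."""
    # Solar declination, simple cosine approximation.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Local solar time and hour angle (15 degrees per hour from solar noon).
    solar_time = utc_hours + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat = math.radians(lat_deg)
    d = math.radians(decl)
    cos_sza = (math.sin(lat) * math.sin(d) +
               math.cos(lat) * math.cos(d) * math.cos(hour_angle))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sza))))
```

Computing this per pixel from the lon/lat arrays is exactly what a load()-able "solar_zenith_angle" dataset would wrap.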

Fix coordinates array order [MODIS L1B LDAAC data].

SatPy failed to hash coordinates due to the array ordering; log below. Working with MODIS L1B LDAAC data and the hdfeos_l1b reader, Python 3.6.3. Same results for the develop and master branches. It is possible that the issue should be attributed to the hdfeos_l1b reader, not satpy.resample. The data processed comes from NASA LADSWEB.

Traceback (most recent call last):
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/resample.py", line 108, in hash_area
    area_hash = "".join((hashlib.sha1(json.dumps(area.proj_dict,
AttributeError: 'SwathDefinition' object has no attribute 'proj_dict'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/snap/pycharm-community/39/helpers/pydev/pydevd.py", line 1683, in <module>
    main()
  File "/snap/pycharm-community/39/helpers/pydev/pydevd.py", line 1677, in main
    globals = debugger.run(setup['file'], None, None, is_module)
  File "/snap/pycharm-community/39/helpers/pydev/pydevd.py", line 1087, in run
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "/snap/pycharm-community/39/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/home/mikhail/Projects/PyTrollDemo/test_satpy.py", line 40, in <module>
    rescn = scn.resample(area_def)
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/scene.py", line 558, in resample
    **resample_kwargs)
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/dataset.py", line 572, in resample
    new_data = resample(source_area, data, destination_area, **kwargs)
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/resample.py", line 524, in resample
    return resampler.resample(data, **kwargs)
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/resample.py", line 176, in resample
    cache_id = self.precompute(cache_dir=cache_dir, **kwargs)
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/resample.py", line 251, in precompute
    epsilon=epsilon)
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/resample.py", line 141, in get_hash
    the_hash = "".join((self.hash_area(source_geo_def),
  File "/home/mikhail/.virtualenvs/def36/lib/python3.6/site-packages/satpy/resample.py", line 128, in hash_area
    hashlib.sha1(lons).hexdigest(),
ValueError: ndarray is not C-contiguous
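The final ValueError can be reproduced and worked around with numpy alone, assuming numpy is available: hashlib needs a C-contiguous buffer, so the array should be normalized before hashing. This is a workaround sketch, not the actual satpy fix:

```python
# Sketch: hashlib.sha1 rejects non-C-contiguous ndarrays, so make the
# array contiguous (a copy only when needed) before hashing its bytes.
import hashlib
import numpy as np

def hash_lonlats(arr):
    """Hash an array's bytes, tolerating non-C-contiguous input."""
    return hashlib.sha1(np.ascontiguousarray(arr)).hexdigest()
```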

Allow wavelength ranges as composite prerequisites

Right now, when a composite depends on a given wavelength and there is no channel covering that wavelength, the composite building will fail.
By allowing wavelength ranges as composite prerequisites, this could be solved by taking a nearby acceptable channel.
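The proposed matching could work as sketched below: accept any channel whose central wavelength falls inside the requested range, preferring the one closest to the range midpoint. The channel names and wavelengths are illustrative:

```python
# Sketch: resolve a wavelength-range prerequisite to the best available channel.

def pick_channel(channels, wl_min, wl_max):
    """channels: dict name -> central wavelength (um). Return best match or None."""
    mid = (wl_min + wl_max) / 2.0
    candidates = [(abs(wl - mid), name)
                  for name, wl in channels.items() if wl_min <= wl <= wl_max]
    return min(candidates)[1] if candidates else None
```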

SEVIRI HRIT reading: More user-friendly warning when no EPI/PRO files are present

Code Sample, a minimal, complete, and verifiable piece of code

from satpy.scene import Scene
from satpy.utils import debug_on
debug_on()
seviri = Scene(filenames=..., reader='hrit_msg')

Problem description

It often happens that a user forgets to copy the EPI and PRO files to the data directory. When instantiating the Scene object you will get a RuntimeError exception as depicted below.
Running with debug info on you will also get some warnings like:

[WARNING: 2018-01-05 10:50:09 : satpy.readers.yaml_reader] Missing requirements for /data/lang/satellit/geo/hrit/2015/201511/16/12/H-000-MSG1__-MSG1________-IR_120___-000005___-201511161200-__

But it is not obvious that it comes from the missing prologue and epilogue files.

Expected Output

A warning message saying that this is most probably due to missing header/trailer files:

WARNING: No EPI/PRO files found!
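Such a warning could come from a quick scan of the provided filenames before loading, since MSG HRIT prologue/epilogue files carry "PRO"/"EPI" in the segment field of their names. This is a hypothetical sketch; the filename layout is an assumption based on the example above.

```python
# Hypothetical check: warn early if no prologue/epilogue file is among the
# given filenames. The "-PRO"/"-EPI" substrings are assumed from the MSG HRIT
# naming convention, not taken from satpy's reader code.

def warn_missing_pro_epi(filenames):
    has_pro = any("-PRO" in f for f in filenames)
    has_epi = any("-EPI" in f for f in filenames)
    if not (has_pro and has_epi):
        return "WARNING: No EPI/PRO files found!"
    return None

print(warn_missing_pro_epi(
    ["H-000-MSG1__-MSG1________-IR_120___-000005___-201511161200-__"]))
```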

Actual Result, Traceback if applicable

    102                                                     reader_kwargs=reader_kwargs,
    103                                                     metadata=metadata)
--> 104         self.info.update(self._compute_metadata_from_readers())
    105         self.datasets = DatasetDict()
    106         self.cpl = CompositorLoader(self.ppp_config_dir)

/home/a000680/usr/src/satpy/satpy/scene.pyc in _compute_metadata_from_readers(self)
    119         if self.readers:
    120             mda['start_time'] = min(x.start_time
--> 121                                     for x in self.readers.values())
    122             mda['end_time'] = max(x.end_time
    123                                   for x in self.readers.values())

/home/a000680/usr/src/satpy/satpy/scene.pyc in <genexpr>((x,))
    119         if self.readers:
    120             mda['start_time'] = min(x.start_time
--> 121                                     for x in self.readers.values())
    122             mda['end_time'] = max(x.end_time
    123                                   for x in self.readers.values())

/home/a000680/usr/src/satpy/satpy/readers/yaml_reader.pyc in start_time(self)
    410     def start_time(self):
    411         if not self.file_handlers:
--> 412             raise RuntimeError("Start time unknown until files are selected")
    413         return min(x[0].start_time for x in self.file_handlers.values())
    414 

RuntimeError: Start time unknown until files are selected

Versions of Python, package at hand and relevant dependencies

Python 2.7.5
Satpy 0.7.7

load composite causes strange scene object behaviour

Case: first load a composite, then load a channel. Calling load on the scene object with the same channel again makes satpy read the channel from the file even though it was already read once. Also, with compute=True (the default for the load method) when loading the channel, the composite is recomputed as well.

The unload call also didn't seem to have any effect.

Example work flow:

scn = Scene(...)
scn.load(composite_name)
scn.load([0.6])
scn.load([0.6])
scn.unload()
print(scn.datasets.keys())
scn.show(composite_name)

Fix meteosat 10 area

As discussed on the mailing list and slack the Meteosat 10 native satellite area produced by the hrit_msg reader is incorrect compared to the mpop reader. @adybbroe was able to come up with the following area from mpop:

In [6]: glbd['VIS006'].area
Out[6]: 
Area ID: Meteosat-10seviri[-5567248.07417344 -5570248.47733926  5570248.47733926  5567248.07417344](3712, 3712)
Description: On-the-fly area
Projection ID: geos
Projection: {'a': '6378169.00', 'b': '6356583.80', 'h': '35785831.00', 'lat_0': '0.00', 'lon_0': '0.00', 'proj': 'geos'}
Number of columns: 3712
Number of rows: 3712
Area extent: (-5567248.074173444, -5570248.4773392612, 5570248.4773392612, 5567248.074173444)

And from satpy:

In [7]: glbd['overview'].info['area']
Out[7]: 
Area ID: some_area_name
Description: On-the-fly area
Projection ID: geosmsg
Projection: {'a': '6378169.0', 'b': '6356583.8', 'h': '35785831.0', 'lon_0': '0.0', 'proj': 'geos', 'units': 'm'}
Number of columns: 3712
Number of rows: 3712
Area extent: (5568748.28340708, 5568748.28340708, -5568748.6866856618, -5568748.6866856618)

So even taking into consideration that one of the readers keeps the area "upside down" and the other doesn't, the extents are still different. The satpy area produces an image clipped at the edges when resampled and written to a PNG; the mpop result produces the expected full disk image.

Not sure who is best to assign this to ( @mraspaud, @adybbroe, @pnuu, or maybe me).

Identify which file it is that the viirs_sdr reader fails at!

Code Sample, a minimal, complete, and verifiable piece of code

from glob import glob
from satpy.scene import Scene
from satpy.utils import debug_on

debug_on()
FILES2 = glob(BASEDIR + "npp_20170527_1010_28915/*h5")  # BASEDIR defined elsewhere
scn = Scene(filenames=FILES2, reader='viirs_sdr')
scn.load(['true_color_lowres'])

Problem description

I just ran into a small issue: we downloaded VIIRS SDR data from CLASS. I have a small download script that can be interrupted and restarted, but it was not smart enough to detect that the last file being downloaded when I killed the job hadn't been completely written to disk. I restarted the script immediately after (with nohup, and went home). The result was that one h5 file was corrupt, and when running satpy I got an error from h5py. I immediately guessed what had happened, but I needed to go through all the files with:

import h5py

for f in FILES2:
    try:
        with h5py.File(f, 'r') as h5f:
            pass
    except IOError:
        print(f)

in order to identify the file.

Expected Output

I would have been helped by seeing which file it was that h5py had problems reading.

Actual Result, Traceback if applicable

/home/a000680/usr/src/satpy/satpy/readers/hdf5_utils.pyc in __init__(self, filename, filename_info, filetype_info)
     43         super(HDF5FileHandler, self).__init__(filename, filename_info, filetype_info)
     44         self.file_content = {}
---> 45         file_handle = h5py.File(self.filename, 'r')
     46         file_handle.visititems(self.collect_metadata)
     47         self._collect_attrs('', file_handle.attrs)

/home/a000680/.local/lib/python2.7/site-packages/h5py/_hl/files.pyc in __init__(self, name, mode, driver, libver, userblock_size, swmr, **kwds)
    270 
    271                 fapl = make_fapl(driver, libver, **kwds)
--> 272                 fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
    273 
    274                 if swmr_support:

/home/a000680/.local/lib/python2.7/site-packages/h5py/_hl/files.pyc in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
     90         if swmr and swmr_support:
     91             flags |= h5f.ACC_SWMR_READ
---> 92         fid = h5f.open(name, flags, fapl=fapl)
     93     elif mode == 'r+':
     94         fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/tmp/pip-4rPeHA-build/h5py/_objects.c:2684)()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper (/tmp/pip-4rPeHA-build/h5py/_objects.c:2642)()

h5py/h5f.pyx in h5py.h5f.open (/tmp/pip-4rPeHA-build/h5py/h5f.c:1930)()

IOError: Unable to open file (Truncated file: eof = 21665792, sblock->base_addr = 2048, stored_eoa = 25384409)

Versions of Python, package at hand and relevant dependencies

Python 2.7.5

Remove sunz corrected modifier for IR channels for VIIRS SDR reader

@pnuu pointed out that the VIIRS SDR reader incorrectly states that the IR channels are sunz corrected. This is not true for the IR channels. Likely a copy-and-paste error by me, or me not knowing that this was a thing. Anyway, removing the "modifiers" lines for each IR dataset from the viirs_sdr reader's yaml file should do the trick.

Slowness of Scene() creation

When creating the Scene object with base_dir given, the creation is very, very slow if there are plenty of files in the directory. For example, with 6337 files in my data directory, the scene creation using the command
glbl = Scene(platform_name="Meteosat-10", sensor="seviri", reader="hrit_msg", start_time=tslot, base_dir="/path/to/data")
takes 42.7 seconds. The given parameters should already contain all the information needed to select the correct set of files without opening every file in the directory. In comparison,
glbl = Scene(platform_name="Meteosat-10", sensor="seviri", reader="hrit_msg", start_time=tslot, filenames=glob("/path/to/data/*201409051145*"))
takes only 0.8 seconds.

I haven't yet looked under the hood, but maybe it would be possible to add a shortcut for the cases where the files can be selected based on the given parameters alone, or, if only a smaller set of files needs to be opened to get the exact time (etc.), handle those as a special case.
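The shortcut could be as simple as string matching on the requested time slot before any file is opened. A minimal sketch, with made-up filenames and pattern; satpy's actual file selection uses trollsift patterns, which this does not reproduce.

```python
import fnmatch
from datetime import datetime

# Illustrative shortcut: filter candidate filenames by the requested time slot
# using pure string matching, so no file has to be opened.

def select_filenames(filenames, tslot):
    pattern = "*%s*" % tslot.strftime("%Y%m%d%H%M")
    return [f for f in filenames if fnmatch.fnmatch(f, pattern)]

names = [
    "H-000-MSG1__-MSG1________-IR_120___-000005___-201409051145-__",
    "H-000-MSG1__-MSG1________-IR_120___-000005___-201409051200-__",
]
hits = select_filenames(names, datetime(2014, 9, 5, 11, 45))
```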

Re-use of calculated resampling parameters

At the moment satpy doesn't re-use the calculated resampling parameters for subsequent channels with identical geometries ("fingerprints") if the cache_dir kwarg is not given. With cache_dir set, the resampling parameters are written to disk the first time and then reloaded from disk for each of the matching datasets.

My suggestion is to find a way to resample the datasets in such order that all matching datasets are processed consecutively and the resampling parameters could be kept in-memory for that duration. In this way the parameters would not need to be reloaded (cache_dir set) or recalculated (cache_dir not set) for each dataset. The extra time used is even more pronounced when using bilinear sampling and/or large areas.

For testing:

# Set debugging messages on
from satpy.utils import debug_on
debug_on()
# Create a scene from some files
glbl = Scene(filenames=fnames)
# Load all available datasets
glbl.load(glbl.available_dataset_names())
# Resample without cache: the resampling parameters are recalculated
# for each dataset
lcl = glbl.resample('euron1')
# Resample with cache: the parameters are reloaded from disk, if already
# calculated and saved, for each dataset
lcl = glbl.resample('euron1', cache_dir='/tmp')

NetCDF writer doesn't work

Code Sample, a minimal, complete, and verifiable piece of code

from satpy import Scene
from satpy.utils import debug_on
import glob

debug_on()
fnames = glob.glob("/home/lahtinep/data/satellite/viirs/*b30342*")
glbl = Scene(reader="viirs_sdr", filenames=fnames)
glbl.load(['I05'])
lcl = glbl.resample('euro4')

# This works
lcl.save_datasets(writer='geotiff', filename='/tmp/foo.tif')

# This doesn't
lcl.save_datasets(writer='cf', filename='/tmp/foo.nc')

Problem description

As seen from the traceback below, the NetCDF writer uses cf.Domain, which seems to have been removed from the cf-python package, and therefore fails to save the data.

Actual Result, Traceback if applicable

[INFO: 2017-12-12 08:14:04 : satpy.writers.cf_writer] Saving datasets to NetCDF4/CF.
[INFO: 2017-12-12 08:14:04 : satpy.writers.cf_writer] No longitude and latitude data to save.
[INFO: 2017-12-12 08:14:04 : satpy.writers.cf_writer] No grid mapping to save.
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-24-58ece08c15f3> in <module>()
----> 1 lcl.save_datasets(writer='cf', filename='/tmp/foo.nc')

/home/lahtinep/Software/pytroll/packages/satpy/satpy/scene.pyc in save_datasets(self, writer, datasets, **kwargs)
    632             datasets = self.datasets.values()
    633         writer = self.get_writer(writer, **kwargs)
--> 634         writer.save_datasets(datasets, **kwargs)
    635 
    636     def get_writer(self, writer="geotiff", **kwargs):

/home/lahtinep/Software/pytroll/packages/satpy/satpy/writers/cf_writer.pyc in save_datasets(self, datasets, filename, **kwargs)
    176                     coords = [line_coord, pixel_coord]
    177 
--> 178                 domain = cf.Domain(dim=coords,
    179                                    aux=aux,
    180                                    ref=grid_mapping)

AttributeError: 'module' object has no attribute 'Domain'

Versions of Python, package at hand and relevant dependencies

SatPy 0.7.5, cf-python 2.1.2, Python 2.7.12

Python 2 Cruft

This issue describes the various code pieces that should be removed or modified when Python 2 support is eventually removed from SatPy in order to keep the code clean. It will likely be difficult to keep this issue up to date, but I figure it would be good to make an attempt. SatPy maintainers should update the bulleted list below. SatPy contributors (no issue editing permissions) should comment with code lines/blocks that need to modified and how.

These changes don't all have to occur in one PR, but they can't be merged until SatPy's Python 2.7 support is dropped, which will likely be some time shortly after numpy drops its Python 2 support.

Add satellite zenith angle filter for geostationary imagery

Certain users (US National Weather Service forecasters) would like geostationary full disk or mesoscale images (GOES-16+ and H8+) to be masked where the satellite zenith angle is greater than a certain threshold (maybe >83°). Points beyond this threshold should be masked and treated the same as space pixels in the final output.

This is a desired feature because on certain visualization tools (AWIPS, etc) the image may be viewed in multiple projections and a pixel near the edge of the earth in the geostationary projection gets very distorted in other projections ("spiky"). The easiest example is a full disk GOES-16 image in a polar-stereographic projection near Alaska. The forecasters would rather have no data near the outer edge of the disk than have the huge distorted spikes they are seeing.

The question is how this gets applied in SatPy. Right now we don't have a defined interface for this type of functionality, except maybe a modifier. I think in the past @mraspaud and I talked about a filter versus a modifier, where a filter is just a modifier that modifies the data in place. Maybe in-place writes aren't something we want. This feature will also be desired for GOES and Himawari satellite imagery, so it doesn't make much sense to add it to a specific reader. The only issue I have with making it a modifier is that it will be difficult for the user to request it; that is, unless scn.load([...], modifiers=('satellite_zenith_filter',)) is possible.
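Whatever the interface ends up being, the masking itself only needs the satellite geometry. A hedged sketch using simple spherical-earth geometry for a geostationary satellite; all names and constants are illustrative, and this is not an existing satpy modifier.

```python
import numpy as np

# Mask pixels whose satellite zenith angle exceeds a threshold (sketch).

R_EARTH = 6371.0    # mean earth radius [km]
SAT_ALT = 35786.0   # geostationary altitude [km]

def sat_zenith_deg(lat, lon, sub_lon=0.0):
    """Approximate satellite zenith angle (degrees) at lat/lon (degrees)."""
    lat_r = np.radians(lat)
    lon_r = np.radians(lon - sub_lon)
    cos_beta = np.cos(lat_r) * np.cos(lon_r)   # angle to sub-satellite point
    sin_beta = np.sqrt(1.0 - cos_beta ** 2)
    r_sat = R_EARTH + SAT_ALT
    # slant range from satellite to ground point (law of cosines)
    d = np.sqrt(R_EARTH ** 2 + r_sat ** 2 - 2.0 * R_EARTH * r_sat * cos_beta)
    # law of sines in the earth-centre / ground-point / satellite triangle
    return np.degrees(np.arcsin(np.clip(r_sat * sin_beta / d, -1.0, 1.0)))

lats = np.array([0.0, 45.0, 80.0])
zen = sat_zenith_deg(lats, np.zeros_like(lats))
masked = zen > 83.0   # treat these pixels like space pixels
```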

Print a list of available sensors/readers

Is there currently a way to print a list of supported sensors/readers? That would make it easier to get started working with the data (e.g. when a user downloads newly supported data, it would be more convenient to get the list of readers directly in a shell/IDE instead of looking into the etc/readers directory).

Something like

    from satpy import readers
    readers.list_readers()

    >>> ['sar-c', 'olci', ... ]
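One way such a helper could work is to collect reader names from the YAML configuration files shipped in a config directory. A sketch under the assumption that each reader has one `<name>.yaml` file; the function name and config layout mirror the request above, not a real satpy API.

```python
import os
import tempfile

# Hypothetical list_readers(): reader names from the YAML files in a directory.

def list_readers(config_dir):
    return sorted(os.path.splitext(f)[0]
                  for f in os.listdir(config_dir) if f.endswith(".yaml"))

# Demo with a throwaway directory standing in for satpy's etc/readers.
demo = tempfile.mkdtemp()
for name in ("viirs_sdr.yaml", "hrit_msg.yaml", "olci.yaml", "README"):
    open(os.path.join(demo, name), "w").close()
print(list_readers(demo))  # ['hrit_msg', 'olci', 'viirs_sdr']
```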

Installation fails

Hi,
when I try to install satpy using "pip install satpy" I get the error
"Collecting satpy
Could not find a version that satisfies the requirement satpy (from versions: )
No matching distribution found for satpy"

It seems that https://pypi.python.org/pypi/satpy/2.0.0a1 contains only meta data, not the actual package...

When I download the package directly from github and do a "python setup.py install" I get
"[...]
Processing dependencies for satpy==2.0.0a1
Searching for pykdtree
Reading https://pypi.python.org/simple/pykdtree/
error: [Errno 2] No such file or directory"

It seems that handling the package dependencies does not work correctly.
So I have installed the following packages with pip: pykdtree trollimage trollsift pyproj

Now I can run "python setup.py install" and successfully "import satpy", but when I try to access SEVIRI data using
global_scene = Scene(platform_name="Meteosat-10", sensor="seviri", start_time=..., base_dir=...)
I get the following error:
ImportError: Could not import reader class 'satpy.readers.mipp_xrit.XritReader' for reader 'mipp_xrit': No module named satin.helper_functions

How can this be fixed?

P.S.: I used the Anaconda Python distribution under SuSE Linux x86_64.

Add default null log handler

Code Sample, a minimal, complete, and verifiable piece of code

>>> from satpy import Scene
>>> scn = Scene(...)
No handlers could be found for logger "satpy.readers.yaml_reader"

Problem description

Many users get confused by the above warning message. To be clear for any users who see this thinking it is an error, this is just a warning. Code should still work perfectly fine assuming you have the input files you think you have and they are supported by satpy.

Expected Output

No warning message about loggers so users don't confuse it with an error message.

Possible Solution

If I recall correctly we can add a NullHandler to the satpy logger: https://docs.python.org/2/howto/logging.html#configuring-logging-for-a-library
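The standard-library recommendation is exactly that: a library attaches a NullHandler to its top-level logger so the "No handlers could be found" message is never printed, while applications remain free to configure real handlers. A minimal sketch of what would typically go in satpy/__init__.py:

```python
import logging

# Attach a no-op handler to the package logger so importing the library never
# triggers "No handlers could be found for logger ..." warnings.
logging.getLogger("satpy").addHandler(logging.NullHandler())
```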

include some custom composite examples in satpy

It would be great to implement some custom composite examples as known from mpop documentation.
It would be nice to have the custom .yaml file(s) with the changes in generic.cfg for these composites:

- natural: [1.6], [0.8], [0.6]
- overview: [0.6], [0.8], [10.8]
- severe storms: R: VIS0.8, range 0-100 [%], gamma 1.0; G: VIS0.8, range 0-100 [%], gamma 1.0; B: IR10.8-IR3.9, range -60 to -40 [K], gamma 2.0
- clouds convection: R: VIS0.8, range 0-100 [%], gamma 1.0; G: VIS0.8, range 0-100 [%], gamma 1.0; B: IR10.8, range 323-203 [K], gamma 1.0

and the same with the HRV channel (luminance):
HRV overview [HRV], HRV natural, HRV severe storms, HRV clouds convection.
A simple code example rendering all the custom composites with the euro1 area should be provided.

Maybe the number of examples could be reduced and some of these composites added as defaults?
Some HRV composites could be available as fixed composites in satpy.
BTW, the snow reflectance composite seems not to be implemented any more (it was present in mpop)?

Regards,

Christian

Make a land-sea compositor

In the same fashion as we have a day-night compositor which merges a composite for daytime and another for nighttime, we would like to have the possibility to merge a composite over land with another composite over sea.

Maybe this can even be made to a MaskCompositor which takes any mask and takes the masked pixel from another composite.

In order to compute the land-sea mask, the idea is to use pycoast.
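The core of the proposed MaskCompositor reduces to picking pixels from one composite where a boolean mask is true and from the other elsewhere. A minimal sketch with hypothetical names; this is not an existing satpy class.

```python
import numpy as np

# Sketch of the MaskCompositor idea: blend two RGB composites via a mask.

def mask_composite(mask, foreground, background):
    """Pick foreground pixels where mask is True, background otherwise."""
    return np.where(mask[..., np.newaxis], foreground, background)

land_mask = np.array([[True, False], [False, True]])
land_rgb = np.full((2, 2, 3), 0.8)   # e.g. a composite tuned for land
sea_rgb = np.full((2, 2, 3), 0.2)    # e.g. a composite tuned for sea
combined = mask_composite(land_mask, land_rgb, sea_rgb)
```

A day-night-style compositor could reuse the same mechanics with a fractional mask and a weighted blend instead of a hard switch.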

msg_native example

Can someone write a small example on how to use the msg_native driver?

I tried:
global_scene = Scene(filenames="MSG1-SEVI-MSG15-0201-NA-20080131234240.880000000Z-20110506121050-1224354-2.nat", platform_name="Meteosat-10", sensor="seviri", reader="native_msg",base_dir="/mnt/LC3950L4/")

But I guess it is related to the
file_patterns: ['{satid:4s}-{instr:4s}-MSG{product_level:2d}-0100-NA-{processing_time1:%Y%m%d%H%M%S.%f}000Z-{processing_time2:%Y%m%d%H%M%S}-{order_id:d}.nat']

Can I override this id somehow and just send in the file?

Definition and accuracy of reflectance

While trying to get reflectances from MSG SEVIRI solar channel HRIT files using satpy I encountered several problems.

First of all, for calibration=='reflectance' you do not actually get reflectance according to the usual definition, but reflectance * cos(SZA) * d^-2, where SZA is the solar zenith angle and d is the sun-earth distance in astronomical units. Nevertheless the standard_name is set to "toa_bidirectional_reflectance" -- this is confusing. Moreover, reflectance * cos(SZA) * d^-2 is not really a useful quantity...

It would be great if satpy could calculate real reflectances and it would be even better if the accuracy of these reflectances was sufficiently high for applications like data assimilation.

There is some code to account for cos(SZA) in the composite module. However, this code computes SZA for a fixed time, the "start_time". Ignoring the variation of the solar zenith angle during image acquisition can cause large errors when the sun is low and causes errors of a few percent in the northern hemisphere even during the day (SEVIRI scans from south to north, so start_time is wrong by about 11 minutes in Europe). For this reason I am computing cos(SZA) like this:

mu0 = np.ma.zeros((ny, nx))
Tacq = gs.datasets[chan].info["end_time"] - gs.datasets[chan].info["start_time"]
for j in range(ny):
    tacq = gs.datasets[chan].info["start_time"] + timedelta(seconds=(j / float(ny)) * Tacq.seconds)
    mu0[j, :] = np.ma.masked_outside(cos_zen(tacq, lon2d[j, :], lat2d[j, :]), 0.035, 1, copy=False)

Ignoring variations in the sun-earth distance can cause reflectance errors of up to +-3%.
To account for the variation I am now using
reflectance *= float(sunpy.sun.sunearth_distance( t=time_slot ) / astropy.units.AU)**2
Maybe sunearth_distance could be integrated in pyorbital?

Only with these two corrections do I get reflectances that are compatible (error < 1%) with what I got using other satellite software packages.
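If pulling in sunpy/astropy is undesirable, a standard low-order approximation of the sun-earth distance (accurate to roughly 0.1%) would suffice for the d^2 correction. The formula below is a generic textbook approximation, not taken from satpy or pyorbital.

```python
import math
from datetime import datetime

# Approximate sun-earth distance in astronomical units from the day of year,
# using the ~0.0167 orbital eccentricity and perihelion around Jan 3.

def sun_earth_distance_au(when):
    doy = when.timetuple().tm_yday
    return 1.0 - 0.0167 * math.cos(2.0 * math.pi * (doy - 3) / 365.25)

# The reflectance correction factor would then be d**2.
corr = sun_earth_distance_au(datetime(2015, 7, 4)) ** 2
```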

Slicing failed for 'Europe Canary' area

Providing the area argument for a scene object (tested with the MTG-LI reader) results in an incorrectly sliced result array when using the Europe Canary area. In contrast, the eurol area works correctly. It seems that the slice cuts at the middle of the image and returns only the lower part (see the attached image; the sliced part is in white).
satpy_slicing_europecanary_mtgli

AHI true color imagery - poor stretch

For a code sample see the Jupyter notebook here

Problem description

Observation by Andy Prata:

Thanks for the response. I have installed the new version of satpy (0.7.5) and re-ran my script using 'true_color_reducedsize' for the rgb_composite string. Note: I also needed to comment out Dave's workaround, which is this line: local_scene = global_scene.resample(global_scene['B03'].info['area']) for the code to run without errors.

I've attached the output. It looks basically the same as before and quite different compared to the output of your jupyter noteboook example. I've also attached the new debug output.

Expected Output

Image shown in the above mentioned notebook example.

Actual Result, Traceback if applicable

Log indicates that only a standard stretch and no CIRA stretch is performed.

Versions of Python, package at hand and relevant dependencies

Verified using Python 2.7.5, NumPy 1.13.1

Allow default resampling options based on dataset metadata

Similar to how enhancements work (or should work) the users should be able to do scn.resample('area') without any resampling parameters and have SatPy load a configuration file to determine what resampling would be best. In some cases it may be possible to have the resampling be smart about choosing the parameters dynamically. For example, choose nearest neighbor search radius based on nadir or limb resolutions of a swath. Or even more obvious, if defaulting to non-nearest resampling then make sure to not interpolate category data (flag_meanings in metadata).

As mentioned this should be similar to enhancements where there can be a default resampling section, otherwise the best options could be selected by metadata like standard_name or name or units.
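The selection logic could be a small table of metadata predicates evaluated in order. A hedged sketch: the attribute names mirror CF conventions, but the selection table and function are invented for illustration, not satpy behaviour.

```python
# Metadata-driven resampling defaults (sketch): nearest neighbour for
# categorical data (flag_meanings present), bilinear otherwise.

DEFAULTS = [
    (lambda attrs: "flag_meanings" in attrs, "nearest"),    # category data
    (lambda attrs: attrs.get("units") == "K", "bilinear"),  # continuous BTs
]

def pick_resampler(attrs, fallback="bilinear"):
    for predicate, resampler in DEFAULTS:
        if predicate(attrs):
            return resampler
    return fallback

print(pick_resampler({"flag_meanings": "clear cloudy"}))  # nearest
print(pick_resampler({"units": "K"}))                     # bilinear
```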

NOAA-20 platform name wrong (=J01)

Code Sample, a minimal, complete, and verifiable piece of code

from glob import glob
from satpy.scene import Scene
from satpy.utils import debug_on
debug_on()

FILES = glob("/home/a000680/data/JPSS-1/*h5")

scn = Scene(filenames=FILES, reader='viirs_sdr')
composite = 'true_color_lowres'
scn.load([composite])

Problem description

Running the above code generates a nice-looking true color image. However, in the log I see the output below! Pyspectral cannot find the correct RSR file, since the platform name is wrong ('J01').

Expected Output

Actual Result, Traceback if applicable

[INFO: 2017-12-22 09:57:57 : satpy.composites] Removing Rayleigh scattering and aerosol absorption
[INFO: 2017-12-22 09:57:58 : pyspectral.rayleigh] Atmosphere chosen: us-standard
[DEBUG: 2017-12-22 09:57:58 : pyspectral.rayleigh] LUT filename: /home/a000680/data/pyspectral/marine_clean_aerosol/rayleigh_lut_us-standard.h5
[DEBUG: 2017-12-22 09:57:58 : pyspectral.rsr_reader] Filename: /home/a000680/data/pyspectral/rsr_viirs_J01.h5
[WARNING: 2017-12-22 09:57:58 : pyspectral.rsr_reader] No rsr file /home/a000680/data/pyspectral/rsr_viirs_J01.h5 on disk
[ERROR: 2017-12-22 09:57:58 : pyspectral.rayleigh] No spectral responses for this platform and sensor: J01 viirs
Traceback (most recent call last):
  File "/home/a000680/usr/src/pyspectral/pyspectral/rayleigh.py", line 134, in get_effective_wavelength
    rsr = RelativeSpectralResponse(self.platform_name, self.sensor)
  File "/home/a000680/usr/src/pyspectral/pyspectral/rsr_reader.py", line 91, in __init__
    raise IOError(errmsg)
IOError: pyspectral RSR file does not exist! Filename = /home/a000680/data/pyspectral/rsr_viirs_J01.h5
Files matching instrument and satellite platform: []

Versions of Python, package at hand and relevant dependencies

Python 2.7.5, Pyspectral v0.5.1

Update VIIRS L1B reader to support new standard file naming scheme

Problem description

The official NASA VIIRS L1B file naming scheme has now been finalized and has changed from what SatPy previously supported. See https://ladsweb.modaps.eosdis.nasa.gov/api/v1/productPage/product=VNP02MOD

A coworker provided me with the following mapping:

VNP02MOD -> L1B moderate resolution SNPP
VNP03MOD -> GEO moderate resolution SNPP
VNP02IMG -> L1B imager resolution SNPP
VNP03IMG -> GEO imager resolution SNPP 
VNP02DNB -> L1B day night band SNPP
VNP03DNB -> GEO day night band SNPP

Note that the ESDTs for JPSS-1 will be very similar except the NP gets switched to J1 so:

VJ102MOD
VJ103MOD
VJ102IMG
VJ103IMG
VJ102DNB
VJ103DNB

I think this should just require updating the file pattern in viirs_l1b.yaml.
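The new product codes above can be captured by a single pattern. Written here as a plain regex rather than a trollsift pattern; the example filename follows the LAADS convention (product.AYYYYDDD.HHMM.collection.production.nc) as an assumption, so the exact field widths should be checked against real files.

```python
import re

# Illustrative pattern for the finalized NASA VIIRS L1B/GEO names
# (SNPP "VNP" and JPSS-1 "VJ1" variants, MOD/IMG/DNB resolutions).

PATTERN = re.compile(
    r"(?P<product>V(NP|J1)0[23](MOD|IMG|DNB))"
    r"\.A(?P<date>\d{7})\.(?P<time>\d{4})"
    r"\.(?P<collection>\d{3})\.(?P<production>\d{13})\.nc$"
)

m = PATTERN.match("VNP02MOD.A2017201.1712.001.2017302184857.nc")
print(m.group("product"), m.group("date"))  # VNP02MOD 2017201
```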

Warning without problem for NWC SAF products

The products ct and ct_pal are read correctly, but the logging ends with the warning "The following datasets were not created:" with no product listed.

[DEBUG: 2017-03-29 08:08:16 : satpy.readers.nc_nwcsaf_msg] Reading ct.
[DEBUG: 2017-03-29 08:08:16 : satpy.readers.yaml_reader] No coordinates found for DatasetID(name='ct_pal', wavelength=None, resolution=3000, polarization=None, calibration=None, modifiers=())
[DEBUG: 2017-03-29 08:08:16 : satpy.readers.nc_nwcsaf_msg] Reading ct_pal.
[WARNING: 2017-03-29 08:08:16 : satpy.scene] The following datasets were not created:

Add background color option to simple image writer

As mentioned in this thread: https://groups.google.com/forum/#!topic/pytroll/5UuL4jjpm3U

When saving a dataset to a JPEG image (which doesn't support transparency) the default output is a white background. We should provide a keyword argument to the save_dataset method of the simple image (PILWriter) writer to do any necessary logic to set the background colour. A possible solution would be to overwrite masked values with the requested colour values. This may have to be added to trollimage.
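The underlying operation is standard alpha compositing onto an opaque background before writing the non-transparent format. A minimal sketch with illustrative names; a real implementation would live in the PIL writer or trollimage.

```python
import numpy as np

# Blend an RGBA float image (values in 0-1) onto an opaque background colour.

def flatten_rgba(rgba, background=(0.0, 0.0, 0.0)):
    rgb, alpha = rgba[..., :3], rgba[..., 3:4]
    bg = np.asarray(background, dtype=float)
    return rgb * alpha + bg * (1.0 - alpha)

img = np.zeros((1, 2, 4))
img[0, 0] = (1.0, 1.0, 1.0, 1.0)   # opaque white pixel
img[0, 1] = (1.0, 1.0, 1.0, 0.0)   # fully transparent pixel
flat = flatten_rgba(img, background=(0.0, 0.0, 0.0))
```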

Warnings

Some warnings that I am not sure how to solve:

  1. global_scene = Scene
    No handlers could be found for logger "satpy.readers.yaml_reader"

  2. global_scene.load([band_name]) - for overview of HRV meteosat
    /packages/pytroll/local/lib/python2.7/site-packages/satpy/readers/hrit_msg.py:1000: RuntimeWarning: divide by zero encountered in reciprocal
    data.data[:] **= -1

  3. global_scene.save_dataset(band_name, 'my_nice_overview.png')
    /packages/pytroll/local/lib/python2.7/site-packages/trollimage/image.py:847: RuntimeWarning: invalid value encountered in power
    (1.0 / gamma),

How to save the raw data to disk?

This project is really helpful for reading Himawari 8 data in HSD format, thanks a lot!
I used "global_scene.save_dataset" to save the data, but I found that the data was converted to 8-bit when saved to disk. How can I save the raw data (16-bit)? Thank you!
