punch-mission / regularizepsf

A Python package for manipulating and correcting variable point spread functions. Maintainer: @jmbhughes

Home Page: https://punch-mission.github.io/regularizepsf/

License: Other

Python 94.32% Cython 5.68%
astronomical-algorithms astronomy astronomy-astrophysics astronomy-software image-processing point-spread-function psf psf-estimation solar-physics punch

regularizepsf's Introduction

regularizepsf


A package for manipulating and correcting variable point spread functions.

Below is an example of correcting model data using the package. An initial image of a simplified starfield (a) is synthetically observed with a slowly varying PSF (b), then regularized with this technique (c). The final image visually matches a direct convolution of the initial image with the target PSF (d). The panels are gamma-corrected to highlight the periphery of the model PSFs.

[Example result image]

Getting started

pip install regularizepsf and then follow along with the documentation.
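
For a feel of the workflow before diving into the docs, here is a minimal sketch of the array-model pipeline, pieced together from the public API calls that appear in the issues below; the file name and parameter values are illustrative only.

import numpy as np
import regularizepsf as rpsf

psf_size = 32      # size of the PSF model in pixels
patch_size = 256   # side length of the square patch the PSF is applied over
sigma = 3.25 / 2.355  # Gaussian sigma for a target FWHM of 3.25 pixels

# Define the target PSF: a symmetric 2D Gaussian centered in the patch
@rpsf.simple_psf
def target(x, y, x0=patch_size / 2, y0=patch_size / 2, sigma=sigma):
    return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

target_evaluation = target(*np.meshgrid(np.arange(patch_size), np.arange(patch_size)))

# Extract stars from the image(s) and build an ArrayCorrector
cpc = rpsf.CoordinatePatchCollection.find_stars_and_average(
    ["image.fits"], psf_size, patch_size)
corrector = cpc.to_array_corrector(target_evaluation)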

Contributing

We encourage all contributions. If you have a problem with the code or would like to see a new feature, please open an issue, or submit a pull request.

If you're contributing code please see this package's development guide.

License

See the LICENSE file.

Need help?

Please ask a question in our discussions.

Citation

Please cite the associated paper if you use this technique:

@article{Hughes_2023,
  doi = {10.3847/1538-3881/acc578},
  url = {https://dx.doi.org/10.3847/1538-3881/acc578},
  year = {2023},
  month = {apr},
  publisher = {The American Astronomical Society},
  volume = {165},
  number = {5},
  pages = {204},
  author = {J. Marcus Hughes and Craig E. DeForest and Daniel B. Seaton},
  title = {Coma Off It: Regularizing Variable Point-spread Functions},
  journal = {The Astronomical Journal}
}

If you use this software, please also cite the package with the specific version used. Zenodo always has the most up-to-date citation.

regularizepsf's People

Contributors

dependabot[bot], jmbhughes, pre-commit-ci[bot], sumanchapai, svank, taniavsn


regularizepsf's Issues

`np.complex` is deprecated

I was trying to use the code again to compute an array corrector and found

cpc.to_array_corrector(target_evaluation)

results in

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[78], line 1
----> 1 cpc.to_array_corrector(target_evaluation)

File ~/.pyenv/versions/3.9.16/lib/python3.9/site-packages/regularizepsf/fitter.py:426, in CoordinatePatchCollection.to_array_corrector(self, target_evaluation)
    423     corrected_patch[np.isnan(corrected_patch)] = 0
    424     evaluation_dictionary[(identifier.x, identifier.y)] = corrected_patch
--> 426 array_corrector = ArrayCorrector(evaluation_dictionary, target_evaluation)
    427 return array_corrector

File ~/.pyenv/versions/3.9.16/lib/python3.9/site-packages/regularizepsf/corrector.py:220, in ArrayCorrector.__init__(self, evaluations, target_evaluation)
    217     raise EvaluatedModelInconsistentSizeError("The target and evaluations must have the same shape.")
    219 values = np.array([v for v in self._evaluations.values()], dtype=float)
--> 220 self.target_fft, self.psf_i_fft = _precalculate_ffts(self._target_evaluation, values)

File ~/.pyenv/versions/3.9.16/lib/python3.9/site-packages/regularizepsf/helper.pyx:70, in regularizepsf.helper._precalculate_ffts()

File ~/.pyenv/versions/3.9.16/lib/python3.9/site-packages/regularizepsf/helper.pyx:74, in regularizepsf.helper._precalculate_ffts()

File ~/.pyenv/versions/3.9.16/lib/python3.9/site-packages/numpy/__init__.py:305, in __getattr__(attr)
    300     warnings.warn(
    301         f"In the future `np.{attr}` will be defined as the "
    302         "corresponding NumPy scalar.", FutureWarning, stacklevel=2)
    304 if attr in __former_attrs__:
--> 305     raise AttributeError(__former_attrs__[attr])
    307 # Importing Tester requires importing all of UnitTest which is not a
    308 # cheap import Since it is mainly used in test suits, we lazy import it
    309 # here to save on the order of 10 ms of import time for most users
    310 #
    311 # The previous way Tester was imported also had a side effect of adding
    312 # the full `numpy.testing` namespace
    313 if attr == 'testing':

AttributeError: module 'numpy' has no attribute 'complex'.
`np.complex` was a deprecated alias for the builtin `complex`. To avoid this error in existing code, use `complex` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.complex128` here.
The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
    https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations

Seems like I need to update some code!
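
For reference, the shape of the fix is exactly what the NumPy message suggests; the offending np.complex usage (here in helper.pyx) just needs one of these replacements (shapes below are illustrative):

import numpy as np

# arr = np.zeros((4, 4), dtype=np.complex)  # deprecated alias; raises on NumPy >= 1.24
arr = np.zeros((4, 4), dtype=complex)        # use the builtin instead
arr = np.zeros((4, 4), dtype=np.complex128)  # or the explicit NumPy scalar type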

Class __init__ docstrings don't go into docs

I just saw this problem with my own docs, and I was checking here to see if you had a solution, since I copied your docs setup. When a class has a docstring and its __init__ also has a docstring, only the class docstring goes into the docs, and the __init__ docstring gets eaten. See, e.g., FunctionalCorrector, whose __init__ docstring goes nowhere. This means the input arguments to the class constructor don't get documented.

One solution is to add autoapi_python_class_content = 'both' to docs/source/conf.py, to concatenate the __init__ docstring to the class docstring.
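
Concretely, that is a one-line addition using sphinx-autoapi's autoapi_python_class_content option (its default is 'class'):

# docs/source/conf.py
autoapi_python_class_content = 'both'  # include both the class and __init__ docstrings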

Single image vs. multiple images while making a model (question)

The code gives the option of using multiple images to make a model. If using multiple images, do they all have to be aligned so that the star patterns are in the same pixel positions? If so, does the code take care of removing zeros when taking the median? Some of our images end up with zeros at the edges after alignment (this varies from image to image and night to night, depending on how far the telescope's position was from the reference image).

@sumanchapai initial results inquiry

[screenshot of our corrected FITS image]

The image above is a screenshot of our corrected FITS file. I see two things: 1. bigger patches (sized 128 × 128, it looks like), and 2. smaller patches around stars (sized 32, it looks like).

I had used the following parameters in making the model.

psf_size = 32
patch_size = 256
target_fwhm = 3.25

# Define the target PSF
center = patch_size / 2
sigma = target_fwhm / 2.355

and a 2D symmetric Gaussian as the target model.

My rough understanding is that the bigger patch results from applying the PSF function over a patch of size 128 × 128, and the smaller one from the star itself. I am not super clear about this and would appreciate it if you could clarify or point me to a place where I could find more info about it.

Transition from `lmfit` to `astropy.modeling`

When this package was astropy affiliated, @WilliamJamieson suggested we transition the functional model fitting from lmfit to astropy.modeling. This has the benefit of removing a dependency from the package, since we are already using astropy. This issue is a reminder to look into this transition.

We might benefit from looking into custom models.
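
As a rough sketch of what that could look like with astropy.modeling's custom_model factory (the model below is illustrative, not the package's actual functional form):

import numpy as np
from astropy.modeling import custom_model

# An elliptical 2D Gaussian as a custom astropy model; arguments with
# defaults become fittable parameters, the rest become model inputs
@custom_model
def elliptical_gaussian(x, y, x0=0.0, y0=0.0, sigma_x=1.0, sigma_y=1.0):
    return np.exp(-((x - x0) ** 2 / (2 * sigma_x ** 2)
                    + (y - y0) ** 2 / (2 * sigma_y ** 2)))

model = elliptical_gaussian(x0=16, y0=16, sigma_x=2.0, sigma_y=2.0)
xx, yy = np.meshgrid(np.arange(32), np.arange(32))
evaluation = model(xx, yy)  # fitting would use, e.g., astropy.modeling.fitting.LevMarLSQFitter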

Return results information when building a model

As it works right now, you only get the model back and no information about the process of building the model. For example, it would potentially be good to know how many stars went into each patch when building the model. This could help you diagnose problems when building the model. Additionally, it would be nice to have a way to visualize a model more easily.

Add a more robust star background removal code

When you build a model, if there is structure around the stars, you're not sampling the PSF's true response to a delta function; it's contaminated by the contribution of the background. We need a way of removing that. One approach is to simply make the PSF size very small and zero out the rest of the patch, but that may not always be ideal.
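
A minimal sketch of that simple approach, assuming a square patch array and hypothetical names:

import numpy as np

def zero_outside_psf_window(patch: np.ndarray, psf_size: int) -> np.ndarray:
    """Keep only a psf_size window at the patch center; zero everything else."""
    out = np.zeros_like(patch)
    center = patch.shape[0] // 2
    half = psf_size // 2
    window = slice(center - half, center + half)
    out[window, window] = patch[window, window]
    return out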

HDF5 path specification issue

Ran across an issue trying to set up this repository locally on an ARM Mac. Details below:

  • Installed HDF5 library using homebrew - note that this defaults to /opt/homebrew/ on ARM Macs rather than /usr/local/
  • Cloned regularizepsf repository locally from GitHub
  • On setting up the virtualenv, an error was triggered when PyCharm was attempting to install the requirement deepdish:

.. ERROR:: Could not find a local HDF5 installation. You may need to explicitly state where your local HDF5 headers and library can be found by setting the ``HDF5_DIR`` environment variable or by using the ``--hdf5`` command-line option.

On setting the environment variable HDF5_DIR in my .zprofile startup file to point to /opt/homebrew/opt/hdf5, this at first did not seem to work within PyCharm's virtualenv setup. After a reboot of my machine, however, it did work.

Array vs functional model confusion.

I'm confused by https://punch-mission.github.io/regularizepsf/quickstart.html. The article mentions two models, a "functional model or an array model," and later says that "this quickstart tutorial walks through the array model PSF model form". However, later, in the "Building the target PSF" section, there's the sentence "This is a functional model and is thus decorated with the simple_psf decorator". Is the word "functional" in the last sentence used in the same sense as the former? What are the other possible target models?

Add scheduled workflow runs

As part of glancing through regularizePSF for your affiliated package request in astropy/astropy.github.com#528, I noticed that your CI was only being run on pull requests. It is wonderful that you have GitHub Actions CI set up!

I do have one suggestion for your CI: you should also run it on a cron schedule (see astropy's workflows for an example), though you don't have to run it as regularly as what I linked to. This is not formally part of the review or its requirements; however, it may be useful to have CI run automatically, as this package probably doesn't have a constant stream of PRs. Doing so will give you some warning when one of your dependencies starts causing issues.

[Error] "RuntimeWarning: invalid value encountered in divide" while making the correction model

I have been getting this warning when making an array correction model:

[REDACTED]\Python310\site-packages\regularizepsf\corrector.py:239: RuntimeWarning: invalid value encountered in divide
  [v / v.sum() for v in self._evaluations.values()], dtype=float)

but what's been surprising is that I only get this when making the model from some images and not others.

I am using this code:

from typing import List
from pathlib import Path
import regularizepsf as rpsf
import numpy as np

COMA_PSF_SIZE = 16 # size of the PSF model to use in pixels
COMA_PATCH_SIZE = 128  # square side dimension PSF will be applied over
COMA_ALPHA = 3  # see paper
COMA_EPSILON = 0.3  # see paper


fwhm_x_target, fwhm_y_target = 2.855, 2.8514
IMAGES_FOLDER = Path("G:/Raw Data/Summer 2003/September 8, 2003/m23")


def make_coma_correction_model(
    images: List[str] | List[Path], xfwhm_target: float, yfwhm_target: float
) -> rpsf.ArrayCorrector:
    # Define the target PSF
    @rpsf.simple_psf
    def target(
        x,
        y,
        x0=COMA_PATCH_SIZE / 2,
        y0=COMA_PATCH_SIZE / 2,
        sigma_x=xfwhm_target / 2.355,
        sigma_y=yfwhm_target / 2.355,
    ):
        return np.exp(
            -(
                np.square(x - x0) / (2 * np.square(sigma_x))
                + np.square(y - y0) / (2 * np.square(sigma_y))
            )
        )

    target_evaluation = target(
        *np.meshgrid(np.arange(COMA_PATCH_SIZE), np.arange(COMA_PATCH_SIZE))
    )

    # Extract all the stars from that image and create a PSF model with a target PSF
    image_paths = [str(p) for p in images]  # No support for pathlib.Path yet
    cpc = rpsf.CoordinatePatchCollection.find_stars_and_average(
        image_paths, COMA_PSF_SIZE, COMA_PATCH_SIZE
    )

    return cpc.to_array_corrector(target_evaluation)


def ac1():
    ac_1_images = [ IMAGES_FOLDER / f"m23_3.5-{x}.fit" for x in range (101, 121)]
    return make_coma_correction_model(
        ac_1_images, fwhm_x_target, fwhm_y_target
    )

def ac2():
    ac_2_images = [ IMAGES_FOLDER / f"m23_3.5-{x}.fit" for x in range (521, 541)]
    return make_coma_correction_model(ac_2_images, fwhm_x_target, fwhm_y_target)


if __name__ == "__main__":
    # Note that when generating the array corrector model using ac1, we don't get any warning,
    # while when using ac2, we do. See this by running just ac1 with ac2 commented out, and vice versa.
    ac1()
    ac2()

Note that I am getting this warning when running one of these functions but not the other, even though the only difference between the two is the images used. The images used are available at: https://drive.google.com/drive/folders/1XuYReGwjFqnEm1GV0O_uTO31aczy8BHT?usp=sharing

The impact of this is that when applying a correction using the affected model, my images have a value of NaN in certain boxes. These are the black rectangles in the screenshot below.

[screenshot of a corrected image with black rectangles]

Are these regions of the original images where it wasn't possible to generate a PSF model?

Relax version pins

It's annoying to have == everywhere. ~= should suffice to keep us up to date.

Parallelize model building with Dask

I believe the process of extracting all the stars from an image could be parallelized using Dask. This would speed up model building. In general, we should optimize the model building process some.
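
A minimal sketch of the idea with dask.delayed, where extract_patches is a hypothetical per-image helper:

import dask

@dask.delayed
def extract_patches(image_path):
    ...  # find stars in one image and cut out their patches

image_paths = ["image1.fits", "image2.fits"]  # illustrative
tasks = [extract_patches(p) for p in image_paths]
results = dask.compute(*tasks)  # runs the per-image extractions in parallel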

Create warning when building the model if a patch is empty

As mentioned in #93, the model gets zeroed out or set to NaNs, so that when it is applied it produces blank regions in the corrected image.

  • Warn a user who creates a model if insufficient stars were available in a region.
  • Warn a user who uses a model with insufficient stars that a region will be blank
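
A minimal sketch of the first warning, assuming a hypothetical star_counts mapping from patch identifier to the number of stars averaged into that patch:

import warnings

star_counts = {(0, 0): 12, (0, 128): 0}  # illustrative
for identifier, n_stars in star_counts.items():
    if n_stars == 0:
        warnings.warn(f"Patch {identifier} contains no stars; "
                      "the corrected image will be blank there.")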

Create guide on adjusting parameters on new data (and FAQ)

It would be good to create a guide for new users on how to adjust parameters (PSF window size, target PSF parameters, alpha, epsilon, etc.) to deal with commonly seen problems (boxes around stars, Fourier ringing, black circles around stars) when using the technique on a new dataset. This stems from discussion in #47. @sumanchapai may have insight that they learned from this procedure on their data. @svank also might have ideas from his work.

Are there problems installing on M-series Macs?

It looks to be somehow related to hdf5. Maybe I just haven't set up my laptop completely?

I got this when I tried:

Collecting regularizepsf
  Using cached regularizepsf-0.2.1.tar.gz (80 kB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Requirement already satisfied: astropy in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from regularizepsf) (5.3.3)
Collecting sep
  Using cached sep-1.2.1-cp39-cp39-macosx_11_0_arm64.whl (230 kB)
Collecting cython
  Using cached Cython-3.0.2-py2.py3-none-any.whl (1.2 MB)
Requirement already satisfied: matplotlib in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from regularizepsf) (3.8.0)
Collecting lmfit
  Using cached lmfit-1.2.2-py3-none-any.whl (102 kB)
Requirement already satisfied: scipy in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from regularizepsf) (1.11.2)
Collecting scikit-image
  Using cached scikit_image-0.21.0-cp39-cp39-macosx_12_0_arm64.whl (12.4 MB)
Collecting deepdish
  Using cached deepdish-0.3.7-py2.py3-none-any.whl (37 kB)
Requirement already satisfied: numpy in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from regularizepsf) (1.26.0)
Collecting dill
  Using cached dill-0.3.7-py3-none-any.whl (115 kB)
Requirement already satisfied: PyYAML>=3.13 in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from astropy->regularizepsf) (6.0.1)
Requirement already satisfied: pyerfa>=2.0 in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from astropy->regularizepsf) (2.0.0.3)
Requirement already satisfied: packaging>=19.0 in /Users/jhughes/Desktop/repos/punchbowl/venv/lib/python3.9/site-packages (from astropy->regularizepsf) (23.1)
Collecting tables
  Using cached tables-3.8.0.tar.gz (8.0 MB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'error'

  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [11 lines of output]
      <string>:19: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
      ld: library 'hdf5' not found
      clang: error: linker command failed with exit code 1 (use -v to see invocation)
      cpuinfo failed, assuming no CPU features: 'flags'
      * Using Python 3.9.6 (default, Aug 11 2023, 19:44:49)
      * Found cython 3.0.2
      * USE_PKGCONFIG: True
      .. ERROR:: Could not find a local HDF5 installation.
         You may need to explicitly state where your local HDF5 headers and
         library can be found by setting the ``HDF5_DIR`` environment
         variable or by using the ``--hdf5`` command-line option.
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

[notice] A new release of pip available: 22.3.1 -> 23.2.1
[notice] To update, run: pip install --upgrade pip

Fix numpy 2.0 breaks

Describe the bug
NumPy 2.0 seems to have broken things... go see the CI failures.

sep may still be built against an earlier version of NumPy, and things aren't passed correctly between NumPy 2.0 in regularizepsf and sep; I get: ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject.

Remove unused extract() method

PatchCollectionABC (and also CoordinatePatchCollection) has an .extract() method that doesn't appear to be used; the contents of extract() are more or less reproduced inside find_stars_and_average. I'm betting either find_stars_and_average should use extract, or extract should be removed as unused.

If you keep .extract(), the test test_coordinate_patch_collection_extraction_many_coordinates in test_fitter.py generates warnings (from inside PatchCollectionABC.add()) because duplicate coordinates are generated for the test case. Those can be silenced by de-duplicating the generated coordinates by adding coords = list(set(coords)) at the start of the test function (or maybe there's a cleaner way by adjusting how hypothesis generates the coordinates).

Working with microscopy images

Hi,

I'm attempting to fix coma in microscopy images. I took an image of a calibration slide with a grid of points:
[image of the calibration slide's grid of points]

The image is originally in tif format. When trying to process it using find_stars_and_average(), I encounter errors.

When providing the image in fits format (after converting from tif):

    537 def _calculate_pad_shape(self, size: int) -> Tuple[int, int]:
    538     print(size, self.size)
--> 539     pad_amount = size - self.size
    540     if pad_amount < 0:
    541         raise InvalidSizeError(f"The average window size (found {size})" 
    542                                "must be larger than the existing patch size"
    543                                f"(found {self.size}).")

TypeError: unsupported operand type(s) for -: 'int' and 'NoneType'

When providing the image as a numpy array:


--> 406 background = sep.Background(image)
    407 image_background_removed = image - background
    408 image_star_coords = sep.extract(image_background_removed, 
    409                                 star_threshold, 
    410                                 err=background.globalrms,
    411                                 mask=star_mask)

File sep.pyx:415, in sep.Background.__cinit__()
File sep.pyx:313, in sep._parse_arrays()
File sep.pyx:227, in sep._get_sep_dtype()
ValueError: input array dtype not supported: uint16

However, when I use tests/data/DASH.fits as an input, the function works without any errors.
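
For the second error, sep rejects uint16 input, so one possible workaround (a sketch, assuming the data otherwise loads fine; file name is illustrative) is to cast the image to a supported float dtype, which also normalizes the byte order, before passing it in:

import numpy as np
import sep
from astropy.io import fits

image = fits.getdata("calibration_grid.fits").astype(np.float64)  # cast uint16 -> float64
background = sep.Background(image)  # no longer rejects the dtype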

Switch to GNU LGPLv3 license

After some discussion, we have decided to change the license from MIT to GNU LGPLv3.

  • get approval from contributors
  • merge new license

Resolve confusion on x/y and array row/column

CoordinateIdentifier is defined as storing x and y coordinates. After star identification, a bunch of CoordinateIdentifiers are instantiated, with sep's x going into the y slot and vice versa, so the x/y slots of CoordinateIdentifier are really being used as row/column slots. And indeed, when making the stellar cutouts, x is used for the vertical axis and y for the horizontal axis.

I'm not sure what the ultimate goal is for CoordinateIdentifier, but I think either its fields should be renamed to row/column or similar, or the x and y fields should be used consistently as vertical and horizontal coordinates. (This bit me while visualizing my patches.)
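
A sketch of the renaming option (hypothetical fields; the real class may carry additional members, such as an image index):

from dataclasses import dataclass

@dataclass(frozen=True)
class CoordinateIdentifier:
    row: int  # vertical axis (sep's y)
    col: int  # horizontal axis (sep's x)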

Improve star detection feedback

As identified in #93, there is currently little way to know which stars are used when building a model.

  • Create a response from building a model that shows exactly which stars were used from each image

Windows installation issue

I am running into the following issue trying to install regularizepsf on Windows. Is this a known issue?

[screenshot of the installation error]

It says something about requiring "Microsoft Visual C++ 14.0 or greater". I just wanted to make sure that that's the issue and not something else.
