
Provides a forced photometry pipeline for ZTF based on ztfquery and ztflc

Home Page: https://github.com/simeonreusch/fpbot

License: BSD 3-Clause "New" or "Revised" License

Languages: Python 99.71%, Dockerfile 0.24%, Shell 0.05%
Topics: astronomy, astrophysics, ztf, forced-photometry

fpbot's Introduction


fpbot

This package provides a forced photometry pipeline based on ztfquery and ztflc. It needs IPAC access to download the images, as well as access to the AMPEL archive to obtain information on the transients.

If you are planning to run forced photometry on many ZTF transients, this is the right tool for you!

Note: Requires Python >= 3.10. Also requires a MongoDB instance for storing the metadata, reachable on port 27017 (this can be changed in database.py).
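Once a MongoDB instance is running (see the installation steps below), its reachability can be verified with a minimal sketch like the following (assuming the default host and port; not part of fpbot):

# Minimal sketch, not part of fpbot: check that MongoDB answers on port 27017.
from pymongo import MongoClient
from pymongo.errors import ServerSelectionTimeoutError

client = MongoClient("localhost", 27017, serverSelectionTimeoutMS=2000)
try:
    client.admin.command("ping")  # raises if no server answers within the timeout
    print("MongoDB is reachable on port 27017")
except ServerSelectionTimeoutError:
    print("No MongoDB instance reachable on port 27017; see the installation steps")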

Installation

  1. Note that libpq-dev needs to be present. On Debian/Ubuntu, issue sudo apt install libpq-dev. On macOS, run brew install postgresql.

  2. Then install via: pip install fpbot. Alternatively, clone this repo and install it with poetry. To do so, run

git clone https://github.com/simeonreusch/fpbot.git
cd fpbot
poetry install
  3. If MongoDB is not present, it can easily be installed. On Debian/Ubuntu, just follow this instruction set. After that, make sure the daemon runs. Issue
sudo systemctl start mongod
sudo systemctl enable mongod

On macOS, make sure brew is present. To do so, follow this tutorial.

  4. fpbot requires an environment variable to know where to store the data. Add a line to your .bashrc or .zshrc like export ZTFDATA='/absolute/path/to/ZTF-data-folder/'. If you don't need AMPEL access, you are done!

  5. If you want to use the AMPEL API for alert data (you don't have to!), you need credentials for the API. You can get these here.

  6. NOTE: If you are planning to run fpbot on a headless system which does not provide the luxury of a systemwide keychain, please add export ZTFHUB_MODE='HEADLESS' to your .bashrc or .zshrc. The pipeline will then use ztfquery's base64-obfuscated password storage. A quick check of both environment variables is sketched below.
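To confirm that the environment variables from steps 4 and 6 are picked up, a quick check (a sketch, not part of fpbot) could look like this:

# Minimal sketch, not part of fpbot: verify the environment setup from the steps above.
import os

ztfdata = os.environ.get("ZTFDATA")
if ztfdata is None:
    raise RuntimeError("ZTFDATA is not set; add the export line to your .bashrc/.zshrc")
print(f"Forced-photometry data will be stored under {ztfdata}")

# Only relevant on headless systems (step 6):
if os.environ.get("ZTFHUB_MODE") == "HEADLESS":
    print("ztfquery will use its base64-obfuscated password storage")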

ALTERNATIVE: Use Docker container

fpbot comes shipped with a Dockerfile and a docker-compose.yml. Use them to build the docker container (this includes all dependencies as well as a MongoDB instance). Note: You have to provide a .ztfquery file in the fpbot directory containing access data for ztfquery (see ztfquery or ztflc for details).

First, clone the repository:

git clone https://github.com/simeonreusch/fpbot.git
cd fpbot

Then, in the directory containing 1) the Dockerfile, 2) the docker-compose.yml and 3) the .ztfquery credentials file, build and run the container:

docker-compose build
docker-compose run -p 8000:8000 fpbot

This exposes the web API on port 8000 of your local machine.
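To check from outside the container that the API port is actually exposed, a simple socket probe (a sketch, not specific to fpbot) is enough:

# Minimal sketch: probe whether anything is listening on localhost:8000.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(2)
    reachable = sock.connect_ex(("localhost", 8000)) == 0

print("Port 8000 is open" if reachable else "Nothing is listening on port 8000")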

Troubleshooting

In case far too few images are downloaded, check your IRSA credentials. These are stored in ~/.ztfquery. If there is a problem with them, ztfquery will not complain but will silently download only the publicly accessible images.
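A quick way to spot a missing credential file (a sketch, not part of fpbot):

# Minimal sketch: warn if the ~/.ztfquery credential file is absent.
from pathlib import Path

credfile = Path.home() / ".ztfquery"
if not credfile.is_file():
    print("No ~/.ztfquery found; ztfquery will silently fall back to public images only")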

Usage

By importing the class

All functionality of the command-line tool is also available through the ForcedPhotometryPipeline class. Just call its methods as defined in pipeline.py.

For example:

from fpbot.pipeline import ForcedPhotometryPipeline

# Run forced photometry for a single transient, using data from the last
# 90 days and 24 parallel processes.
pl = ForcedPhotometryPipeline(
    file_or_name="ZTF19aatubsj",
    daysago=90,
    nprocess=24,
)

pl.download()  # download the difference images from IPAC
pl.psffit()    # perform the PSF-photometry fit
pl.plot()      # plot the lightcurve
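ForcedPhotometryPipeline also accepts ra and dec keyword arguments, which correspond to the -radec command-line option described below. A sketch, reusing the coordinates from the examples section:

from fpbot.pipeline import ForcedPhotometryPipeline

# Forced photometry at an arbitrary position; the name can be chosen freely
# when coordinates (in decimal degrees) are given.
pl = ForcedPhotometryPipeline(
    file_or_name="this_looks_interesting",
    ra=143.3123,
    dec=66.42342,
    daysago=10,
    nprocess=16,
)

pl.download()
pl.psffit()
pl.plot()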

By systemwide command (fp name -operations --options)

Always:

name Either a single ZTF name, an ASCII file containing one ZTF name per line, or an arbitrary name if the -radec option is given.

Optionally:

-radec [RA DEC] If this is given, the name can be chosen arbitrarily (but a name MUST be provided). RA and Dec must be given in a format that can be parsed by astropy, e.g. -radec 218.487548 +40.243758.
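For illustration, a coordinate string in this format can be checked with astropy before passing it to fp (a sketch, not part of fpbot):

# Minimal sketch: parse the example coordinates with astropy.
from astropy.coordinates import SkyCoord
import astropy.units as u

coord = SkyCoord("218.487548 +40.243758", unit=(u.deg, u.deg))
print(coord.ra.deg, coord.dec.deg)  # 218.487548 40.243758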

Additional commands

-dl Downloads the images used for forced photometry from IPAC. Needs a valid IPAC account.

-fit Performs the PSF-photometry fit and generates plots of the lightcurve(s).

-plot Plots the lightcurve(s).

-plotflux Plots the lightcurve(s), but with flux instead of magnitude.

-sciimg Experimental: Also downloads the science images from IPAC (needed if thumbnails should be created, see -thumbnails).

-thumbnails Experimental: Generates thumbnails for all science images. The science images have to be downloaded first (see -sciimg).

Options

--nprocess [int] Specifies the number of processes spawned for parallel computing. Default is 4. Note: the download is always performed with 32 processes in parallel, since IPAC's upload speed is the bottleneck there.

--daysago [int] Determines how old the photometric data should be. Default: all.

--daysuntil [int] Determines how new the photometric data should be. Default: all.

--snt [float] Specifies the signal-to-noise threshold for plotting and SALT fitting.

--magrange [float float] Defines the upper and lower magnitude bounds for plotting the lightcurves; the order of the two values is irrelevant.

--fluxrange [float float] Defines the lower and upper flux bounds for plotting the flux lightcurves; the order of the two values is irrelevant.

Examples

fp ZTF19aatubsj downloads the images for this ZTF object, performs forced photometry, plots the lightcurve and saves the results to the default "forcephotometry" directory (under $ZTFDATA, set in your .bashrc/.zshrc/...; see the ztfquery documentation).

fp ZTF19abimkwn -dl -fit --nprocess 16 downloads all images for ZTF19abimkwn found on IPAC, performs PSF-fitting and plots the lightcurve with 16 processes in parallel.

fp supernovae.txt -dl -fit Downloads all difference images for the ZTF transients listed in supernovae.txt (one ZTF name per line). These are then fitted, but not plotted. To get a nice example of ZTF lightcurves, issue: fp example_download.txt -dl -fit -plot.

fp this_looks_interesting -radec 143.3123 66.42342 -dl -fit -plot --daysago 10 --magrange 18 20 Downloads all images from the last ten days for the location given by RA and Dec, performs the PSF fit and plots the lightcurve within the 18-20 magnitude range.

By systemwide bulk command (fpbulk file.txt -operations --options)

file.txt must be an ASCII file containing one ZTF-ID per line. The usual options apply (e.g. -dl, -fit).
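For example, the input file can be generated and the bulk run started from Python as well (a sketch, assuming fpbot's command-line scripts are on your PATH):

# Minimal sketch: write an input file and run fpbulk on it.
import subprocess
from pathlib import Path

# ZTF names taken from the examples above
Path("supernovae.txt").write_text("ZTF19aatubsj\nZTF19abimkwn\n")
subprocess.run(["fpbulk", "supernovae.txt", "-dl", "-fit"], check=True)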

Requirements

  • ztfquery is used to download the image files from IPAC.
  • ztflc is used for PSF-fitting.
  • AMPEL credentials are necessary for the pipeline to work.

Notes

Slackbot

A Slack bot is included, based on the SlackRTM API. You have to create a classic Slack app for this, because the newer app type depends on the Events API, which in turn seems to require a web server. Classic Slack apps can be created here. Make sure not to convert the app to the new permission/privilege system in the process (Slack tries to push you towards it, be careful). After setting up the app/bot and granting it permissions, change the bot username in start_slackbot.py to that of your bot and it should basically work (the first start asks for the bot and bot-user credentials, which are also provided by Slack).

Resulting dataframe

The dataframes produced by the plot step (located at $ZTFDATA/forcephotometry/plot/dataframes) consist of the following columns (a sketch for loading them follows after the list):

  • sigma(.err): The intrinsic error
  • ampl(.err): The flux amplitude (error)
  • fval: Total minimized value
  • chi2(dof): PSF-fit chi square (per degrees of freedom)
  • Columns 9-39: The science image header
  • target_x/y: pixel position of target
  • data_hasnan: Data contains NaN-values (should always be False)
  • F0: Zero point magnitude from header converted to flux
  • Fratio(.err): Flux to flux zero point ratio (error)
  • upper_limit: For forced-photometry results below the signal-to-noise threshold, this is the limiting magnitude from the Marshal (see maglim column)
  • mag(_err): Flux amplitude (error) converted to magnitude. For detections below the signal-to-noise threshold, this value is set to 99.
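A minimal sketch for loading such a dataframe with pandas and keeping only the detections (the filename is hypothetical; adapt it to your object):

# Minimal sketch, not part of fpbot: load a result dataframe and drop non-detections.
import os
import pandas as pd

dataframe_dir = os.path.join(os.environ["ZTFDATA"], "forcephotometry", "plot", "dataframes")
df = pd.read_csv(os.path.join(dataframe_dir, "ZTF19aatubsj.csv"))  # hypothetical filename

detections = df.query("mag < 99")  # mag == 99 marks datapoints below the S/N threshold
print(detections[["mag", "mag_err", "Fratio"]].describe())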

fpbot's People

Contributors

dependabot[bot], github-actions[bot], simeonreusch


fpbot's Issues

BUG

Describe the bug
When running fp.psffit(), fpbot apparently tries to use a directory that does not necessarily exist.

To Reproduce
Follow the steps as described in the README.

Expected behavior
Works.

Screenshots

----------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
Cell In[18], line 1
----> 1 pl.psffit()

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/fpbot/pipeline.py:664, in ForcedPhotometryPipeline.psffit(self, nprocess, force_refit)
    661 if number_of_fitted_datapoints_expected > fitted_datapoints or force_refit:
    662     self.logger.info(f"{name} ({i+1} of {objects_total}): Fitting PSF.")
--> 664     fp.run_forcefit(
    665         nprocess=nprocess,
    666         store=True,
    667         force_refit=force_refit,
    668         no_badsub=False,
    669     )
    671     fp.store()
    673     lastfit = Time(time.time(), format="unix", scale="utc").jd

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/ztflc/forcephotometry.py:166, in ForcePhotometry.run_forcefit(self, indexes, update_diffdata, store, no_badsub, force_refit, nprocess)
    164     self._data_forcefit = pd.DataFrame(dataout_final)
    165     if store:
--> 166         self.store()
    168 else:
    169     for i in tqdm(indexes):

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/ztflc/forcephotometry.py:93, in ForcePhotometry.store(self, filename, mode)
     91 oldmask = os.umask(0o002)
     92 if len(self._data_forcefit) > 0:
---> 93     self._data_forcefit.to_csv(filename, index=False, mode=mode)

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/util/_decorators.py:211, in deprecate_kwarg.<locals>._deprecate_kwarg.<locals>.wrapper(*args, **kwargs)
    209     else:
    210         kwargs[new_arg_name] = new_arg_value
--> 211 return func(*args, **kwargs)

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/core/generic.py:3720, in NDFrame.to_csv(self, path_or_buf, sep, na_rep, float_format, columns, header, index, index_label, mode, encoding, compression, quoting, quotechar, lineterminator, chunksize, date_format, doublequote, escapechar, decimal, errors, storage_options)
   3709 df = self if isinstance(self, ABCDataFrame) else self.to_frame()
   3711 formatter = DataFrameFormatter(
   3712     frame=df,
   3713     header=header,
   (...)
   3717     decimal=decimal,
   3718 )
-> 3720 return DataFrameRenderer(formatter).to_csv(
   3721     path_or_buf,
   3722     lineterminator=lineterminator,
   3723     sep=sep,
   3724     encoding=encoding,
   3725     errors=errors,
   3726     compression=compression,
   3727     quoting=quoting,
   3728     columns=columns,
   3729     index_label=index_label,
   3730     mode=mode,
   3731     chunksize=chunksize,
   3732     quotechar=quotechar,
   3733     date_format=date_format,
   3734     doublequote=doublequote,
   3735     escapechar=escapechar,
   3736     storage_options=storage_options,
   3737 )

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/util/_decorators.py:211, in deprecate_kwarg.<locals>._deprecate_kwarg.<locals>.wrapper(*args, **kwargs)
    209     else:
    210         kwargs[new_arg_name] = new_arg_value
--> 211 return func(*args, **kwargs)

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/io/formats/format.py:1189, in DataFrameRenderer.to_csv(self, path_or_buf, encoding, sep, columns, index_label, mode, compression, quoting, quotechar, lineterminator, chunksize, date_format, doublequote, escapechar, errors, storage_options)
   1168     created_buffer = False
   1170 csv_formatter = CSVFormatter(
   1171     path_or_buf=path_or_buf,
   1172     lineterminator=lineterminator,
   (...)
   1187     formatter=self.fmt,
   1188 )
-> 1189 csv_formatter.save()
   1191 if created_buffer:
   1192     assert isinstance(path_or_buf, StringIO)

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/io/formats/csvs.py:241, in CSVFormatter.save(self)
    237 """
    238 Create the writer & save.
    239 """
    240 # apply compression and byte/text conversion
--> 241 with get_handle(
    242     self.filepath_or_buffer,
    243     self.mode,
    244     encoding=self.encoding,
    245     errors=self.errors,
    246     compression=self.compression,
    247     storage_options=self.storage_options,
    248 ) as handles:
    249 
    250     # Note: self.encoding is irrelevant here
    251     self.writer = csvlib.writer(
    252         handles.handle,
    253         lineterminator=self.lineterminator,
   (...)
    258         quotechar=self.quotechar,
    259     )
    261     self._save()

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/io/common.py:734, in get_handle(path_or_buf, mode, encoding, compression, memory_map, is_text, errors, storage_options)
    732 # Only for write methods
    733 if "r" not in mode and is_path:
--> 734     check_parent_directory(str(handle))
    736 if compression:
    737     if compression != "zstd":
    738         # compression libraries do not like an explicit text-mode

File ~/opt/miniconda3/envs/fpbot/lib/python3.10/site-packages/pandas/io/common.py:597, in check_parent_directory(path)
    595 parent = Path(path).parent
    596 if not parent.is_dir():
--> 597     raise OSError(rf"Cannot save file into a non-existent directory: '{parent}'")

OSError: Cannot save file into a non-existent directory: '/Users/jannisnecker/ztf_dataforcephotometry'

Additional context
pl = ForcedPhotometryPipeline(file_or_name="ZTF19adgzidh", daysago=3000, nprocess=6, ra=225.4345633, dec=41.2829577)
