
impact-common's Introduction

impact-common

Common tools for estimating impact functions.

Installation

Run the following:

python setup.py develop --user

You can drop --user on your local machine.

impact-common's People

Contributors

brews, jrising, trinetta

Forkers

emileten

impact-common's Issues

Tag commit/release for mortality?

Mortality projection is done with commit fcaf244. It would be great if we could tag this commit so that it's easier for researchers to access.

Maybe tag that commit as "v0.1.0" and then bump master to v0.2.0. To be clear, I don't mean tagging a commit "v0.2.0"; just update the package version on master.

If we do bump master, remember to update the version in setup.py so that package installs update properly.
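
For the setup.py part, the change is just the version argument. An illustrative excerpt (not the repo's actual setup.py, whose contents are assumed here):

from setuptools import setup, find_packages

setup(
    name="impactcommon",        # assumed package name
    version="0.2.0",            # bump past the v0.1.0 tag so installs update
    packages=find_packages(),
)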

Refactor to test the used gdppc.py interface without the "imperics_shareddir" requirement

Folks are saying that others should use impactcommon.exogenous_economy.gdppc, but the interface isn't really tested. We should put it in a testing harness so we get some kind of coverage, regardless of whether others end up using it.

The tests should run without needing a shareddir directory tree. We need to see whether we can safely break the hard coupling to a static file directory and external data files.

Refactoring is likely needed, but we can't break downstream packages.
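
A rough sketch of what such a harness could look like, assuming the provider can be pointed at a directory other than the hard-coded shareddir; the shareddir keyword, constructor arguments, and CSV layout below are illustrative assumptions, not the current gdppc.py API:

import pytest


@pytest.fixture
def fake_shareddir(tmp_path):
    # Minimal stand-in for the shared directory tree, so the test needs no
    # "imperics_shareddir" and no external data files.
    baselines = tmp_path / "social" / "baselines"
    baselines.mkdir(parents=True)
    (baselines / "gdppc-merged-nohier.csv").write_text(
        "iso,year,value\nUSA,2010,48500\nUSA,2011,49200\n"
    )
    return tmp_path


def test_get_iso_timeseries_runs_without_real_shareddir(fake_shareddir):
    from impactcommon.exogenous_economy import gdppc

    # Hypothetical constructor: assumes a refactor that lets the provider take
    # an explicit directory instead of relying on the global shareddir.
    provider = gdppc.GDPpcProvider("low", "SSP3", shareddir=str(fake_shareddir))
    series = provider.get_iso_timeseries("USA")
    assert len(series) > 0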

Sum of global IR-level GDP != sum of ISO-level GDP

Hi @jrising,

In mortality, we have two different ways of valuing damages: in the paper we scale a region's VSL by its IR-level income, whereas for integration we scale it by the region's country-level income. Theoretically, the global population-average VSL should be the same under the two methods, but we found they differ by a tiny amount.

We use the IR-level SSP{ssp}.nc4 and ISO-level {var}_{iam}_{ssp}.csv GDPpc files, both of which are generated using the GDPpcProvider function in this repo.

I did a simple check of where the discrepancies were by comparing each country-year's sum(ir_pop * ir_gdppc) to sum(ir_pop) * iso_gdppc, and they are equal in all countries other than SJM, SSD, GRL, ESH, and KO-. Each of these receives the global-average GDPpc in the ISO file (~7065 in 2010), and each has a flat population in every year because no Landscan data is available for them. SSD, GRL, KO-, and ESH have more than one IR within the ISO, but even the single-IR SJM and GRL have different GDPpc values in the IR- and ISO-level GDPpc files. (I checked the other countries that use the flat population and found that the rest of them are single-IR countries, but for all of them IR GDPpc == ISO GDPpc.)
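
For reference, a minimal sketch of that check in pandas; the data frames and column names here are assumptions for illustration, not the actual file formats:

import pandas as pd


def compare_country_year(ir, iso):
    """Compare sum(ir_pop * ir_gdppc) with sum(ir_pop) * iso_gdppc per country-year.

    ir:  IR-level frame with columns iso, region, year, pop, gdppc
    iso: ISO-level frame with columns iso, year, gdppc
    """
    ir = ir.assign(weighted=ir["pop"] * ir["gdppc"])
    totals = ir.groupby(["iso", "year"], as_index=False).agg(
        ir_total=("weighted", "sum"), pop_total=("pop", "sum")
    )
    merged = totals.merge(iso, on=["iso", "year"])
    merged["iso_total"] = merged["pop_total"] * merged["gdppc"]
    merged["diff"] = merged["ir_total"] - merged["iso_total"]
    # Largest absolute discrepancies first.
    return merged.sort_values("diff", key=abs, ascending=False)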

While it looks like the ISO GDPpc values are from the global average of /social/baselines/gdppc-merged-nohier.csv, I believe the IR values go through the HierarchicalGDPpcProvider process.

Tagging @tcarleton so she can follow.

Thanks!

FutureWarning in gdppc.py

We have this warning coming in:

impactcommon/exogenous_economy/gdppc.py:164: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError.  Select only valid columns before calling the reduction.

I think all we need is the value column.
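
The warning itself suggests the fix: select the needed column before calling the reduction. A minimal sketch, assuming the reduction at line 164 is something like a groupby mean over a frame that also carries non-numeric columns (the exact call there is an assumption):

import pandas as pd

df = pd.DataFrame({
    "model": ["low", "low", "high"],      # non-numeric "nuisance" column
    "year": [2010, 2011, 2010],
    "value": [48500.0, 49200.0, 40300.0],
})

# df.groupby("year").mean() triggers the FutureWarning because the non-numeric
# column is silently dropped. Selecting only the needed column avoids it:
means = df.groupby("year")["value"].mean()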

Generalize `impactcommon/math/minspline.py` so that it finds local extrema, not just minima, of spline curves

This is needed to advance on prototype labor impact projections in https://gitlab.com/ClimateImpactLab/Impacts/impact-calculations/-/issues/126.

Ideas:

  • Refactor minspline.py into extremaspline.py but keep it backwards compatible.
  • Write a unit test for the behavior we want from the new findsplineextrema() and findsplinemax().
  • Copy impactcommon.math.extremaspline.findsplinemin() to a new function findsplineextrema(), generalizing it to find minima and maxima.
  • Refactor impactcommon.math.extremaspline.findsplinemin() to just call the new findsplineextrema().
  • Write the new findsplinemax(); like findsplinemin(), it's just syntactic sugar around findsplineextrema(). (A rough sketch follows below.)
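
A rough, untested sketch of the generalized function; the grid-search approach and signatures are assumptions, not the existing minspline.py implementation:

import numpy as np


def findsplineextrema(spline, lower, upper, kind="min", num=1001):
    """Return the x in [lower, upper] where the callable spline attains its min or max."""
    # Evaluate on a dense grid; interior extrema and the endpoints are all candidates.
    xs = np.linspace(lower, upper, num)
    ys = np.asarray(spline(xs))
    idx = np.argmin(ys) if kind == "min" else np.argmax(ys)
    return xs[idx]


def findsplinemin(spline, lower, upper):
    # Backwards-compatible wrapper, per the refactor idea above.
    return findsplineextrema(spline, lower, upper, kind="min")


def findsplinemax(spline, lower, upper):
    # Syntactic sugar for the maximum case.
    return findsplineextrema(spline, lower, upper, kind="max")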

Building array of single-element arrays in `GDPpcProvider.get_iso_timeseries()` with numpy>=1.14

With newer versions of numpy (>= 1.14) GDPpcProvider.get_iso_timeseries() returns a numpy array filled with single-element numpy arrays.

The problem originates in this line and in the preceding for-loop:

self.cached_iso_gdppcs[iso] = np.array(gdppcs)

The problem causes an error downstream in impact-calculations when code tries to take the log of this object-array. Here is an example traceback from an energy diagnostic run:

Traceback (most recent call last):
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/bin/impactprojection", line 10, in <module>
    sys.exit(impactcalculations_cli())
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/cli/__init__.py", line 57, in diagnostic
    gg.main(file_configs)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/generate/generate.py", line 200, in main
    mod.produce(targetdir, weatherbundle, economicmodel, pvals, config, push_callback=lambda *args: genericpush_callback(*args, weatherbundle=weatherbundle, economicmodel=economicmodel), diagnosefile=os.path.join(targetdir, shortmodule + "-allcalcs.csv"))
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/interpret/container.py", line 77, in produce
    produce_csvv(basename, csvvpath, module, specconf, targetdir, weatherbundle, economicmodel, pvals, configs.merge(config, model), push_callback, suffix, profile, diagnosefile)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/interpret/container.py", line 104, in produce_csvv
    calculation, dependencies, baseline_get_predictors = caller.call_prepare_interp(csvv, module, weatherbundle, economicmodel, pvals[basename], specconf=specconf, config=config, standard=False)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/generate/caller.py", line 83, in call_prepare_interp
    calculation, dependencies, baseline_get_predictors = mod.prepare_interp_raw(csvv, weatherbundle, economicmodel, pvals, farmer, **kwargs)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/interpret/calcspec.py", line 14, in prepare_interp_raw
    covariator = specification.create_covariator(specconf, weatherbundle, economicmodel, config, quiet=config.get('quiet', False))
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/interpret/specification.py", line 47, in create_covariator
    covariators.append(get_covariator(covar, None, weatherbundle, economicmodel, config=fullconfig, quiet=quiet))
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/interpret/specification.py", line 20, in get_covariator
    return get_covariator(covar.keys()[0], covar.values()[0], weatherbundle, economicmodel, config=config, quiet=quiet)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/interpret/specification.py", line 24, in get_covariator
    return covariates.BinnedEconomicCovariator(economicmodel, 2015, args, config=configs.merge(config, 'econcovar'))
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/adaptation/covariates.py", line 105, in __init__
    super(BinnedEconomicCovariator, self).__init__(economicmodel, maxbaseline, config=config)
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/adaptation/covariates.py", line 46, in __init__
    self.econ_predictors = economicmodel.baseline_prepared(maxbaseline, self.numeconyears, lambda values: averages.interpret(config, standard_economic_config, values))
  File "/home/bmalevich/miniconda3/envs/impactprojection-py27-TEST/lib/python2.7/site-packages/adaptation/econmodel.py", line 71, in baseline_prepared
    econ_predictors[region] = dict(loggdppc=func(np.log(baseline_gdppcs)), popop=func([popop]))
AttributeError: 'numpy.float64' object has no attribute 'log'

A current workaround is to pin the working environment to numpy=1.14.
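
One possible fix (an untested sketch, not a confirmed patch) is to coerce each cached entry to a plain float before building the array, so that downstream np.log sees a float64 array rather than an object array:

import numpy as np


def to_float_array(gdppcs):
    """Flatten a list of scalars and/or single-element arrays into a 1-D float64 array."""
    return np.array([np.asarray(g).item() for g in gdppcs], dtype=float)

# e.g., in GDPpcProvider.get_iso_timeseries():
#     self.cached_iso_gdppcs[iso] = to_float_array(gdppcs)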
