kujaku11 / mt_metadata

Tools for standardizing metadata, geared toward magnetotelluric (MT) data but general enough to accommodate "any" type of metadata.

Home Page: https://mt-metadata.readthedocs.io/en/latest/

License: MIT License

Makefile 0.13% Python 99.04% Objective-J 0.83%
magnetotellurics metadata-management

mt_metadata's People

Contributors: kkappler, kujaku11, lheagy, xandrd

mt_metadata's Issues

Create a `to_obspy` function for filters

Need to create a function for each filter object that outputs an ObsPy stage.

Also need logic to put the stages in order, with the ability to input the normalization frequency, gain, and gain frequency.

And we need a way to compute the total response to get the instrument sensitivity.
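A minimal sketch of what a per-filter converter could look like for a pole-zero filter, using ObsPy's PolesZerosResponseStage; the function name and the attributes assumed on pz_filter (gain, zeros, poles, normalization_factor, units_in, units_out) are illustrative, not the settled mt_metadata API:

from obspy.core.inventory.response import PolesZerosResponseStage

def pole_zero_to_obspy_stage(pz_filter, stage_number, normalization_frequency=1.0):
    # Map a hypothetical mt_metadata pole-zero filter onto one ObsPy stage;
    # ordering the stages and computing total sensitivity happens elsewhere.
    return PolesZerosResponseStage(
        stage_sequence_number=stage_number,
        stage_gain=pz_filter.gain,                        # assumed attribute
        stage_gain_frequency=normalization_frequency,
        input_units=pz_filter.units_in,
        output_units=pz_filter.units_out,
        pz_transfer_function_type="LAPLACE (RADIANS/SECOND)",
        normalization_frequency=normalization_frequency,
        zeros=list(pz_filter.zeros),                      # assumed attribute
        poles=list(pz_filter.poles),                      # assumed attribute
        normalization_factor=pz_filter.normalization_factor,
    )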

Bug In Properties assignment in FIR filter

In the StationXML it says 2000.0, but when I read the filter the sample rate (sps) is 200.0.

This took a bit of debugging, but what is happening is that inside the data-acquisition box the following scheme is applied:
Native data is sampled at 32000 Hz.
This is downsampled to 2000 Hz by FIR "AD32M".
This is downsampled to 400 Hz by FIR "fs2ds".
This is downsampled to 200 Hz by FIR "F96C".
This is downsampled to 40 Hz by FIR "fs2ds".

Note that the filter "fs2ds" occurs twice in the sequence, so the 2000 Hz one is being overwritten by the 200 Hz one later on ... sigh ... OK, so how do we get around this?

First thought: if the filter has decimation_active True, then we name it usualname_inputsamplerate.
This will make the decimation filter names unique (see the sketch below).
Alternatively, we could track how many times a given AAF is applied and index the names by that count, but the first approach seems better.
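A minimal sketch of the first scheme; the attribute names on the FIR filter object (decimation_active, decimation_input_sample_rate) are assumptions:

def unique_decimation_name(fir_filter):
    # Append the input sample rate so a reused AAF like "fs2ds" gets a
    # distinct key per stage, e.g. fs2ds_2000 and fs2ds_200.
    if fir_filter.decimation_active:  # assumed attribute
        return f"{fir_filter.name}_{int(fir_filter.decimation_input_sample_rate)}"
    return fir_filter.name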

plot_response function for PoleZero filter not showing pole-zero diagram

Here is a snippet that makes the plot I attach in a screengrab. Not sure why there aren't x's and o's being plotted for the poles and zeros in the subplot on the far right -- could be an axis scaling issue.
Screenshot from 2021-08-30 17-23-23

import numpy as np
from mth5.mth5 import MTH5

# h5_path, station_id, run_id are user-supplied
m = MTH5()
m.open_mth5(h5_path, mode="r")
run_obj = m.get_run(station_id, run_id)
experiment = m.to_experiment()
run_ts_obj = run_obj.to_runts()
channel = "ex"
ch_ts = run_ts_obj.dataset[channel]
ch_ts.data = 1.0 * ch_ts.data      # cast to float
ch_ts.data -= np.mean(ch_ts.data)  # remove the mean
response = run_obj.get_channel(channel).channel_response_filter
frequencies = np.logspace(-4, 2, 100)
f1_index = 0
fN_index = 99
print(f"Active frequency range from {frequencies[f1_index]:.3} to {frequencies[fN_index]:.3} Hz")
print(f"Total number of filters = {len(response.filters_list)}")
for i_filter, fltr in enumerate(response.filters_list):
    print(f"\n\n FILTER# {i_filter}")
    print(f"{fltr.name} {fltr.units_in}-->{fltr.units_out} \n Type: {type(fltr)} \n {fltr}")
    fltr.plot_response(frequencies[1:])

Translation of MT metadata to StationXML is not quite right yet.

  • The order of the runs should be alphabetical or numeric; currently it is random. Probably has to do with how the runs are collected.
  • Survey:
    • Missing: survey.id, operator.contact
  • Channel:
    • Missing latitude, longitude, elevation values, which seem to be filled from the station metadata.
    • calibration_units should be physical units

@akelbert Can you find any other issues?

Filters keys no longer have expected values

We had talked about adding the station label as part of the filter key, but it looks like the string representation of the station object wound up getting associated with the filter key, not the label.

i.e. instead of prepending

"PKD"

this was prepended instead:

"station pkd (bear valley ranch, parkfield, ca, usa)\n\tstation code: pkd\n\tchannel count: 4 per 373 (selected per total)\n\t1996-09-06t16:28:00.000000z - \n\taccess: open \n\tlatitude: 35.95, longitude: -120.54, elevation: 583.0 m\n\tavailable channels:\n\t\tpkd..bf1, pkd..bf2, pkd..bq1, pkd..bq2_bq1_0'

Screenshot from 2021-09-14 16-26-48

Anti-alias FIR Filter Discussion

We are able to load these FIR filters. These tend to be anti-alias filters applied during digitizer decimation processes.

  1. The filters applied to E and H will in general be the same, so ratios and TFs will not be affected, assuming the survey uses the same acquisition systems -- which is common but not true in general.
  2. If we do back these filters out, we need to sort out the phase shifts and whether they are applied zero-phase, from the right, or from the left. It is not clear how the filter application is referenced in StationXML; there is an element in the XML for this, but I cannot find documentation on what it means.

This issue relates to #34

Store transfer functions as xarray

It would be convenient if the transfer function and covariances were stored in an xarray.Dataset. The coordinate could be period or frequency, and the dims or variables could be zxx, zxy, zyy, zyx, tzx, tzy, ...
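A minimal sketch of what that container could look like, using plain xarray with random placeholder values; none of the names here are a settled API:

import numpy as np
import xarray as xr

periods = np.logspace(-3, 3, 30)  # example period axis, seconds
n = periods.size

def _z():
    # placeholder complex values standing in for real TF estimates
    return np.random.randn(n) + 1j * np.random.randn(n)

tf = xr.Dataset(
    {name: ("period", _z()) for name in ["zxx", "zxy", "zyx", "zyy", "tzx", "tzy"]},
    coords={"period": periods},
)
print(tf["zxy"].sel(period=1.0, method="nearest"))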

strip dependencies of mtpy from fix_2 branch

I was struggling to get tests running on the new branch due to the following import:
from mtpy.utils.mtpy_logger import get_mtpy_logger

FWIW, here are all the mtpy calls I see in the currently pulled version:

mt_metadata/transfer_functions/io/edi.py:20:import mtpy.utils.gis_tools as gis_tools
mt_metadata/transfer_functions/io/edi.py:21:import mtpy.utils.exceptions as MTex
mt_metadata/transfer_functions/io/edi.py:22:import mtpy.utils.filehandling as MTfh
mt_metadata/transfer_functions/io/edi.py:23:import mtpy.core.z as MTz
mt_metadata/transfer_functions/io/edi.py:24:from mtpy import __version__
mt_metadata/transfer_functions/io/edi.py:25:#from mtpy.utils.mtpy_logger import get_mtpy_logger
mt_metadata/transfer_functions/io/edi.py:86:    Tipper                mtpy.core.z.Tipper class, contains the
mt_metadata/transfer_functions/io/edi.py:88:    Z                     mtpy.core.z.Z class, contains the
mt_metadata/transfer_functions/io/edi.py:100:        >>> import mtpy.core.edi as mtedi
mt_metadata/transfer_functions/io/edi.py:108:        self.logger = get_mtpy_logger(f"{__name__}.{self.__class__.__name__}")
mt_metadata/transfer_functions/io/edi.py:198:            >>> import mtpy.core.Edi as mtedi
mt_metadata/transfer_functions/io/edi.py:672:            >>> import mtpy.core.edi as mtedi
mt_metadata/transfer_functions/io/edi.py:1254:        >>> import mtpy.core.edi as mtedi
mt_metadata/transfer_functions/io/edi.py:1486:            >>> import mtpy.core.edi as mtedi
mt_metadata/transfer_functions/io/edi.py:2230:        >>> import mtpy.core.edi as mtedi
mt_metadata/transfer_functions/io/edi.py:2297:        >>> import mtpy.core.edi as mtedi
mt_metadata/transfer_functions/io/edi.py:2680:    Read an edi file and return a :class:`mtpy.core.mt.MT` object
mt_metadata/transfer_functions/io/edi.py:2691:    from mtpy.core import mt
mt_metadata/transfer_functions/io/edi.py:2723:    Write an edi file from an :class:`mtpy.core.mt.MT` object
mt_metadata/transfer_functions/io/edi.py:2731:    from mtpy.core import mt
mt_metadata/transfer_functions/io/edi.py:2734:        raise ValueError("Input must be an mtpy.core.mt.MT object")
mt_metadata/transfer_functions/io/jfile.py:13:import mtpy.core.z as mtz
mt_metadata/transfer_functions/io/jfile.py:14:from mtpy.utils.mtpy_logger import get_mtpy_logger
mt_metadata/transfer_functions/io/jfile.py:28:        self.logger = get_mtpy_logger(f"{__name__}.{self.__class__.__name__}")
mt_metadata/transfer_functions/io/jfile.py:428:        # put the results into mtpy objects
mt_metadata/transfer_functions/io/jfile.py:487:    from mtpy.core import mt
mt_metadata/transfer_functions/io/zmm.py:15:import mtpy.core.z as mtz
mt_metadata/transfer_functions/io/zmm.py:16:from mtpy.utils import gis_tools
mt_metadata/transfer_functions/io/zmm.py:17:#from mtpy.utils.mtpy_logger import get_mtpy_logger
mt_metadata/transfer_functions/io/zmm.py:81:        self.logger = get_mtpy_logger(f"{__name__}.{self.__class__.__name__}")
mt_metadata/transfer_functions/io/zmm.py:670:    from mtpy.core import mt
mt_metadata/transfer_functions/io/mt_xml.py:23:import mtpy.core.z as mtz
mt_metadata/transfer_functions/io/mt_xml.py:24:from mtpy.utils.mttime import get_now_utc
mt_metadata/transfer_functions/io/mt_xml.py:25:from mtpy.core.standards.helpers import element_to_dict, flatten_dict
mt_metadata/transfer_functions/io/mt_xml.py:726:            >>> import mtpy.core.mtxml as mtxml
mt_metadata/transfer_functions/io/mt_xml.py:1033:    .. seealso:: mtpy.core.mt_xml.XMLConfig
mt_metadata/transfer_functions/io/mt_xml.py:1038:    Z               object of type mtpy.core.z.Z
mt_metadata/transfer_functions/io/mt_xml.py:1039:    Tipper          object of type mtpy.core.z.Tipper
mt_metadata/transfer_functions/io/mt_xml.py:1044:              tag information.  This is left this way so that mtpy.core.mt.MT
mt_metadata/transfer_functions/io/mt_xml.py:1045:              can read in the information.  **Use mtpy.core.mt.MT for
mt_metadata/transfer_functions/io/mt_xml.py:1057:        >>> import mtpy.core.mt_xml as mtxml
mt_metadata/transfer_functions/io/mt_xml.py:1669:            raise EMFTXMLError("To set Z, input needs to be an mtpy.core.z.Z object")
mt_metadata/transfer_functions/io/mt_xml.py:1687:            raise EMFTXMLError("To set Z, input needs to be an mtpy.core.z.Z object")
mt_metadata/transfer_functions/io/readwrite.py:13:    * the return value is a :class:`mtpy.core.mt.MT`
mt_metadata/transfer_functions/emtf_xml/xml_schema.py:24:from mtpy.core.io.emtf_xml import XML_CSV_FN_PATHS
mt_metadata/transfer_functions/emtf_xml/xml_schema.py:25:from mtpy.core.metadata.standards import schema
mt_metadata/transfer_functions/emtf_xml/xml_metadata.py:8:from mtpy.core.metadata.metadata import (
mt_metadata/transfer_functions/emtf_xml/xml_metadata.py:18:from mtpy.core.io.emtf_xml.xml_schema import XMLStandards
mt_metadata/transfer_functions/emtf_xml/xml_metadata.py:19:from mtpy.utils.mttime import MTime
mt_metadata/transfer_functions/core.py:650:    #     Returns a mtpy.imaging.plotresponse.PlotResponse object
mt_metadata/transfer_functions/core.py:661:    #     from mtpy.imaging import plot_mt_response
mt_metadata/transfer_functions/core.py:762:            >>> import mtpy.core.mt as mt
tests/__init__.py:4:# assume tests is on the root level of mtpy

Review handling the case of empty filters_list

mt_metadata/mt_metadata/timeseries/filters/channel_response_filter.py
currently does:
_validate_filters_list()
with the following logic:

if filters_list in [[], None]:
    return None

Because so many things that operate on filters_list assume an iterable, the code will probably be more readable if we use an empty list rather than None. That said, the filters list will in practice almost never be None, though it could be if a user had already-calibrated time series data.
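A minimal sketch of the suggested change, keeping the original check but returning an empty list so callers can iterate unconditionally:

def _validate_filters_list(filters_list):
    # An empty list means "no filters"; downstream loops then simply
    # do nothing instead of having to special-case None.
    if filters_list in [[], None]:
        return []
    return filters_list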

Reconcile StageGain and NormalizationFactor in PoleZeroFilter

  1. Stage gain in PoleZeroFilter is getting lost after reading in from StationXML. Track down where the stage gain is going. We will want to merge this with the normalization_factor in a PoleZeroFilter() instance so that all gain is in one place.

  2. Be careful of the "multiplicative sense" of the stage gain when incorporating it into the complex response;
    i.e. backing out a 10x gain means dividing the raw data by 10. Ditto for the normalization_factor.

  3. Put together a test that confirms correct units. With a fluxgate this should be easy because we can compare against IGRF, for example.

decide if scale factors module is to be included

@kujaku11
Should we create a module with things like
degrees_to_radians
radians_to_degrees
mV_per_km_to_V_per_meter
V_per_meter_to_mV_per_km
etc?
It's a bit hokey, but I think it makes the code more readable and less error prone by centralizing all the scale factors in one place from which they are imported.

maybe put in mt_metadata.util ?
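A minimal sketch of such a module; the location (e.g. mt_metadata/utils/scale_factors.py) is just a suggestion:

import numpy as np

# 1 mV/km = 1e-3 V / 1e3 m = 1e-6 V/m
MV_PER_KM_TO_V_PER_M = 1.0e-6

def degrees_to_radians(degrees):
    return np.deg2rad(degrees)

def radians_to_degrees(radians):
    return np.rad2deg(radians)

def mV_per_km_to_V_per_meter(value):
    return value * MV_PER_KM_TO_V_PER_M

def V_per_meter_to_mV_per_km(value):
    return value / MV_PER_KM_TO_V_PER_M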

two readmes

It seems we have a README in both md and rst. It can be a bit annoying to maintain two. Do you have a preference for either? In the SimPEG repo we stuck with the rst file, as it was the same as the rest of the docs, but it doesn't make much difference; Sphinx can handle both. So just suggesting we have one rather than two :-)

Experiment should have attributes

timeseries.Experiment should have attributes like:

  • time_period
  • location
  • name
  • description

These quantities should be derived from the survey list.
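A minimal sketch of how one such derived attribute could work; the time_period layout on Survey is an assumption:

class Experiment:
    def __init__(self, surveys):
        self.surveys = surveys

    @property
    def time_period(self):
        # Earliest start to latest end across all surveys;
        # survey.time_period.start/end are assumed attributes.
        starts = [s.time_period.start for s in self.surveys]
        ends = [s.time_period.end for s in self.surveys]
        return min(starts), max(ends)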

Logger Error when importing mt_metadata

We get a LoggerError when importing mt_metadata if another process is running. This sends the logger into a runaway loop. Probably has something to do with the rotating file handler.

potential circular import

mth5 imports from mt_metadata often, but there is a one-off case where mt_metadata imports from mth5. Should that example be migrated to mth5?

mt_metadata/examples/mth5_from_stationxml.py
has on line 12:
from mth5 import mth5
This is the only import from mth5 in the entire main branch.

Discussing the ecosystem dependencies of all the repositories in play, it looks like
mt_metadata > mth5 > aurora > mtpy

Undesirable Mutability in Base() (of Filter) Class

Instances of Filter are changing live when other filters are instantiated.

Here is a six-line snippet to demonstrate:

from mt_metadata.timeseries.filters.filter import Filter
qq = Filter()
print(qq)
from mt_metadata.timeseries.filters.time_delay_filter import TimeDelayFilter
tdf = TimeDelayFilter()
print(qq)

We instantiate a Filter() base class and it has no delay attribute. But after instantiating a TimeDelayFilter(), qq has a delay attribute.


Output from the two print statements:
# Out[4]:
# {
#     "filter": {
#         "calibration_date": "1980-01-01",
#         "name": null,
#         "type": null,
#         "units_in": null,
#         "units_out": null
#     }
# }




# Out[7]:
# {
#     "filter": {
#         "calibration_date": "1980-01-01",
#         "delay": null,
#         "name": null,
#         "type": null,
#         "units_in": null,
#         "units_out": null
#     }
# }
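For reference, a minimal sketch of the classic Python pitfall that can produce exactly this behavior; whether this is the actual cause in Base() is an assumption:

class Base:
    # One dict object shared by every subclass and every instance.
    _attr_dict = {"calibration_date": None, "name": None}

class TimeDelay(Base):
    def __init__(self):
        # Mutates the shared dict in place, so existing Base
        # instances suddenly "grow" a delay attribute too.
        self._attr_dict["delay"] = None

qq = Base()
TimeDelay()
print(qq._attr_dict)  # now includes 'delay'

If this is the cause, the fix is to give each subclass its own copy, e.g. self._attr_dict = dict(Base._attr_dict) in __init__.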

Unconventional StationXML

Observed issues:

  1. Sometimes the magnetic field is identified by channel code "T" rather than "F".
  2. At least some historical StationXML files are returning units of Amperes and Tesla. We need code to handle these cases and correct for them. In particular, I think the Amperes reference is incorrect, and Tesla is contrary to our convention.
    We need to both identify/log/report these issues and add compensatory scaling factors to the calibration.
  3. Some dipoles are Q2, Q3 for horizontal, but mt_metadata expects Q1, Q2. How to handle this?

The most likely solution would seem to be updating the StationXML rather than the code ... let's agree whether that is the path and then make a plan for executing it.
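If we do end up compensating in code, a minimal sketch of a unit fix at ingest; the Tesla-to-nT factor is physics, but the hook and names are hypothetical:

UNIT_FIXES = {"T": ("nT", 1.0e9)}  # 1 T = 1e9 nT

def normalize_units(units, data):
    # Rewrite a nonconforming unit string and scale the data to match.
    if units in UNIT_FIXES:
        new_units, factor = UNIT_FIXES[units]
        return new_units, data * factor
    return units, data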

No warning of multiple epochs given when translating inventory

When multiple instances of the same channel inhabit a StationXML file, only one instance (the first one, it seems) is returned by the translator from inventory to experiment. This is not necessarily wrong, but one might expect a warning that, of the three distinct channel epochs in the file, two are being ignored by the translator.

Attached to this issue is a .txt file. This file is actually a StationXML file, with the extension changed from .xml to .txt so that GitHub would allow me to upload it. It has three separate descriptions of channel LQ2, spanning the following three epochs:
2003-03-13T22:35:00 - 2003-09-12T18:54:00
2003-09-12T18:54:00 - 2005-03-15T16:45:00
2005-03-15T16:45:00 - 2005-05-18T22:15:00

fdsn-station_pkd_LQ2_20030912-20050315.txt

When I execute the following code:

from obspy.core.inventory.inventory import read_inventory
from mt_metadata.timeseries.stationxml import XMLInventoryMTExperiment

xml_filename = "fdsn-station_pkd_LQ2_20030912-20050315.xml"
inventory = read_inventory(xml_filename)
translator = XMLInventoryMTExperiment()
experiment = translator.xml_to_mt(inventory_object=inventory)
print(experiment)

The result is:

Experiment Contents

Number of Surveys: 1
Survey ID: None
Number of Stations: 1
--------------------
Station ID: PKD
Number of Runs: 1
--------------------
Run ID: 001
Number of Channels: 1
Recorded Channels: ey
Start: 2003-03-13T22:35:00+00:00
End: 2003-09-12T18:54:00+00:00
--------------------

There was no warning about multiple instances of the same channel, and it looks like only the first of the three was populated. Not 100% sure what we want the behaviour to be, but there should probably be some warning that the experiment has fewer channels than the inventory.
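A minimal sketch of the kind of count check that could emit such a warning; the inventory side uses real ObsPy attributes, while the experiment side (surveys/stations/runs/channels) is an assumed layout:

n_inventory = sum(
    len(sta.channels) for net in inventory.networks for sta in net.stations
)
n_experiment = sum(
    len(run.channels)                     # assumed attribute
    for survey in experiment.surveys      # assumed attribute
    for station in survey.stations
    for run in station.runs
)
if n_experiment < n_inventory:
    print(
        f"Warning: translated {n_experiment} channel(s) from an inventory "
        f"containing {n_inventory}; some epochs were ignored."
    )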

Fate of test_filters

@kujaku11
Just updated stationxml_filters branch and it looks like test_filters.py is gone:

delete mode 100644 mt_metadata/timeseries/filters/test_filters.py
delete mode 100644 tests/timeseries/filters/test_filters.py

Was it dropped intentionally? Or shall I restore it?

Get up on PIP and Conda

Build the infrastructure to publish to PyPI and conda so that the package can be installed easily.

Create Filters XML Reader

We need one.
Migrate mth5/examples/filter_prototyping.py into the metadata repository and make it read filters.xml.

Create metadata for EMTF XML

Create the metadata standards for EMTF XML data and translation into a more standard version similar to the time series metadata.

Failing to write emtfxml

This is a little tricky to debug. Trying to write emtfxml, I encountered some issues.

  1. In emtfxml.py I hit an AttributeError because the station_metadata did not have a valid data type. I looked at standards/station.json and saw that, for example, LPMT should be a valid code. However, manually overwriting with that did not work; I wound up having to change it to "MT" to get around this issue:
self.site.acquired_by = sm.acquired_by.author
try:
    self.sub_type = f"{sm.data_type.upper()}_TF"
except AttributeError:
    # self.sub_type = "LPMT_TF"
    # self.sub_type = "LPMT"
    # mt_metadata.utils.exceptions.MTSchemaError: LPMT_TF not found in
    # options list ['MT_TF']
    self.sub_type = "MT_TF"
  2. When working through the run_list in emtfxml.py I had to work around some more AttributeError instances:
for r in sm.run_list:
    fn = emtf_xml.FieldNotes()
    fn.dipole = []
    fn.magnetometer = []
    try:
        fn.instrument.id = r.data_logger.id
        fn.instrument.name = r.data_logger.type
        fn.instrument.manufacturer = r.data_logger.manufacturer
        fn.sampling_rate = r.sample_rate
        fn.start = r.time_period.start
        fn.end = r.time_period.end
    except AttributeError:
        fn.instrument.id = "GMD"
        fn.instrument.name = "gus the magic datalogger"
        fn.instrument.manufacturer = "Louis XIV"
        fn.sampling_rate = 1.0
        fn.start = "2020-05-03T12:12:12+00:00"
        # "1980-01-01T00:00:00+00:00"
        fn.end = "2020-05-04T12:12:12+00:00"
  3. Moving forward, metadata.Base was throwing an interesting error that may be related to the workarounds I applied above. Here is the output of a terminal after I put the following print statements around line 600 of metadata.py:
if required:
    if (
        value not in [None, "1980-01-01T00:00:00+00:00"]
        or self._attr_dict[name]["required"]
    ):
        print(f"name {name}")
        if name == "value":
            print("??")
        print(f"value {value}")
        meta_dict[name] = value
else:
    meta_dict[name] = value

TERMINAL OUTPUT:

...
name location.z2
value 0.0
name location.declination.model
value None
name location.declination.epoch
value None
name location.declination.value
value None
name orientation.method
value None
name orientation.reference_frame
value geographic
name orientation.angle_to_geographic_north
value 0.0
name value
??
value 0
Traceback (most recent call last):
  File "test_process_cas04.py", line 77, in <module>
    main()
  File "test_process_cas04.py", line 71, in main
    process_cas04()
  File "test_process_cas04.py", line 14, in process_cas04
    tf_cls.write_tf_file(fn="test_emtf.xml", file_type="emtfxml")
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/transfer_functions/core.py", line 962, in write_tf_file
    return write_file(self, fn, file_type=file_type)
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/transfer_functions/io/readwrite.py", line 195, in write_file
    return file_writer(mt_object, fn)
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/transfer_functions/io/emtfxml.py", line 1066, in write_emtfxml
    emtf.write(fn=fn)
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/transfer_functions/io/emtfxml.py", line 272, in write
    emtf_element, value,
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/transfer_functions/io/emtfxml.py", line 401, in _write_element
    parent.append(self._convert_tag_to_capwords(value.to_xml()))
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/base/metadata.py", line 744, in to_xml
    self.to_dict(nested=True, required=required), self._attr_dict
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/base/metadata.py", line 597, in to_dict
    value not in [None, "1980-01-01T00:00:00+00:00"]
  File "/home/kkappler/software/irismt/mt_metadata/mt_metadata/base/metadata.py", line 76, in __eq__
    sorted(json.loads(other).items(), key=itemgetter(0))
  File "/home/kkappler/anaconda2/envs/py37/lib/python3.7/json/__init__.py", line 348, in loads
    return _default_decoder.decode(s)
  File "/home/kkappler/anaconda2/envs/py37/lib/python3.7/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 1 column 5 (char 4)

Look at the output: normally it shows key-value pairs, but right before it fails the name of the key is "value", which is weird.
In any case, what seems to be happening is that some metadata element is failing when it calls

__repr__()

but I cannot seem to track down the offending piece of metadata. FWIW, I got the same result on IAK34 and CAS04.

Creating FAP Filter

I cloned your main branch, installed mt-metadata using the setup.py file, and tried to create a frequency_response_table_filter and convert my filter to an ObsPy stage. I ran into the following error:

---------------------------------------------------------------------------
	IndexError                                Traceback (most recent call last)
	/var/folders/pn/k5blhlvd7vzdfy85t2z3_h940000gn/T/ipykernel_1237/2779005987.py in <module>
	----> 1 bx_filter.to_obspy()
	
	~/Downloads/mt_metadata-main/mt_metadata/timeseries/filters/channel_response_filter.py in to_obspy(self, sample_rate)
	    291 
	    292         """
	--> 293         total_sensitivity = self.compute_instrument_sensitivity()
	    294 
	    295         total_response = inventory.Response()
	
	~/Downloads/mt_metadata-main/mt_metadata/timeseries/filters/channel_response_filter.py in compute_instrument_sensitivity(self, normalization_frequency)
	    245             sensitivity *= complex_response
	    246         try:
	--> 247             return np.round(np.abs(sensitivity[0]), 3)
	    248         except TypeError:
	    249             return np.round(np.abs(sensitivity), 3)	
IndexError: invalid index to scalar variable.

I updated my local copy of your branch to also catch the IndexError at line 248 and was able to convert my filter to an ObsPy stage. Not a big deal, but I wanted to bring it up in case it is helpful.
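For reference, a sketch of the change in compute_instrument_sensitivity (the try/except is quoted from the traceback above; only the except clause changes):

try:
    return np.round(np.abs(sensitivity[0]), 3)
except (TypeError, IndexError):
    # a numpy scalar raises IndexError ("invalid index to scalar
    # variable") rather than TypeError when indexed, so catch both
    return np.round(np.abs(sensitivity), 3)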

Stop Loading Example Data from mth5_test_data

We need to remove any dependencies on MTH5, and we can probably also remove the dependency on mth5_test_data; the XML files are small enough to pack with mt_metadata.

Suggest putting any example XML files into the mt_metadata/data folder and loading all files from there. This should make the tests easier to run and remove dependencies on too many external packages.

Merged Filters Complex Response should Ignore Time Delay type filters.

The complex response for the time delay filter should in general be avoided:

  1. It can suffer some pretty serious phase wrapping as frequency grows, making the plots hard to interpret.
  2. When applied to a finite time series it rolls the time series (in the np.roll() sense), so that data from late time shows up at early time -- i.e. non-physical data, which will result in incorrect processing results.

In general, delay corrections should be applied in the time domain before spectral processing, so these need to be treated differently.

This raises the question: should we group any other filters into time-domain application? For example, it is easy to apply coefficient filters there, but there doesn't seem to be any compelling reason to. In general we will not have flat responses in the frequency domain, and the default notion right now is to apply all filters merged in the frequency domain, with the lone exception of delay, which will be applied by pre-aligning all multivariate time series and interpolating them onto the same time axis for a given "Run".
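A minimal sketch of excluding the delay filters when merging the frequency-domain response; the import path appears earlier on this page, but the helper itself is hypothetical:

from mt_metadata.timeseries.filters.time_delay_filter import TimeDelayFilter

def frequency_domain_filters(filters_list):
    # Everything except time delays gets merged in the frequency domain;
    # the delays are handled separately by time-domain alignment.
    return [f for f in filters_list if not isinstance(f, TimeDelayFilter)]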

Two Filter Plotting Functions should be merged into One

At a minimum, put all filter plotters into a single module. Remove redundancies.

<1>
mt_metadata/timeseries/filters/filter_base.py:140:from mt_metadata.timeseries.filters.plotting_helpers import plot_response
</1>

<2>
mt_metadata/timeseries/filters/filter_base.py:303: def plot_complex_response(self, frequency_axis, **kwargs):
mt_metadata/timeseries/filters/filter_base.py:304: from iris_mt_scratch.sandbox.plot_helpers import plot_complex_response
</2>

MTPy MT() Object Review

Pull MTPy and take a look at the MT() object; see how the transfer functions from Aurora will fit into this container, and then review.

Focus on
mtpy/core/mt.py

You will need to look at Tipper and Impedance Tensor.
Variances are stored right now, but the covariances are not.
Some say full covariances are needed to do the rotations for the errors; this is arguable, however.

This issue blocks issue #2

Create Frequency Response Lookup Table Filter class

While we are not receiving these from IRIS today, we do anticipate this sort of filter being available in the future. Let's create a class for this and make the tests ready for IRIS. Also, these filters will be useful for the ongoing pole-zero fitting exercises.

Sanity checks for metadata parameters

Working on CAS04 test dataset for aurora I encountered the following filter:

{
    "coefficient_filter": {
        "calibration_date": "1980-01-01",
        "comments": "analog to digital conversion",
        "gain": 484733700000000.0,
        "name": "v to counts (electric)",
        "type": "coefficient",
        "units_in": "V",
        "units_out": "count"
    }
},

The ADC gain is impossibly large. Maybe some checks at ingest can be added; a sketch follows.
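A minimal sketch of an ingest-time plausibility check; the threshold and the logging hook are assumptions:

import logging

logger = logging.getLogger(__name__)
MAX_PLAUSIBLE_ADC_GAIN = 1.0e12  # assumed threshold, counts per volt

def check_gain(coefficient_filter):
    # Flag gains orders of magnitude beyond any realistic ADC.
    if abs(coefficient_filter.gain) > MAX_PLAUSIBLE_ADC_GAIN:
        logger.warning(
            "filter %s has implausible gain %.3e",
            coefficient_filter.name,
            coefficient_filter.gain,
        )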
