cirdles / et_redux

EARTH-TIME.org's flagship data-processing and workflow automation software.

Home Page: http://cirdles.org/projects/et_redux/

License: Apache License 2.0

Languages: Java 99.76%, HTML 0.24%, Shell 0.01%
Topics: earthtime, geochron, geochronology, igsn, java

et_redux's Introduction

CIRDLES

A repository for general issues for CIRDLES projects.

et_redux's People

Contributors

bowring, dependabot[bot], gitter-badger, johnzeringue, luskjh


et_redux's Issues

Long loading times for live data reduction

We have been running our live data reduction for every sample for the past few weeks now and have found some things that could improve the output data. I will be creating a few issue posts, as some are longer than others.

Our first issue goes as follows. Once we reach ~170 analyses, the program begins an infinite loop of loading/number crunching. Redux performs so much math that we are not able to reject fractions or change filters. It behaves as if the program is frozen but will still refresh with time. We see the grey bar progress to 99%, add that fraction, and refresh all the views, but then Redux immediately starts to reduce the next fraction. This doesn't allow us to see what has just been reduced, let alone change sliders or zoom in on the concordia or the PDP. We also are unable to remove any unwanted fractions because of this incredibly long loading time. As a result, Redux falls significantly behind the mass spec. We need the data to import more smoothly or in a more reasonable amount of time. Possible solutions I have are to break long runs into distinct chunks, or to run the reduction in the background and have a manual button to import or show the data that has been reduced there.

As an example of breaking the run into chunks or segments: for a run of n=300, would it be possible to break the whole run into thirds? This would allow us to use the reduced data from the first 100 while the next 100 is being reduced. Another possibility would be similar to Gehrels' AgeCalc program, which uses a continuously running average of six analyses to reduce the data and does not recalculate data outside that window (a rough sketch of this idea follows below).
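To make the six-analysis idea concrete, here is a minimal sketch of a sliding-window average in Java, assuming a window of six and using made-up class and method names rather than anything in ET_Redux or AgeCalc:

  import java.util.ArrayDeque;
  import java.util.Deque;

  // Minimal sketch of a six-analysis sliding window: only the newest six values are averaged,
  // and analyses that drop out of the window are never recalculated.
  public class SlidingWindowAverage {
      private static final int WINDOW_SIZE = 6;
      private final Deque<Double> window = new ArrayDeque<>();

      public double addAndAverage(double newValue) {
          window.addLast(newValue);
          if (window.size() > WINDOW_SIZE) {
              window.removeFirst(); // older analyses keep their previously reduced values
          }
          return window.stream().mapToDouble(Double::doubleValue).average().orElse(Double.NaN);
      }
  }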

My second idea would be to pause or hide the automatic refresh. For example, if we are 100 analyses in, looking at a population, applying discordance or reverse-discordance filters, and selecting our best age, we need time to do this, but right now the constant loading and refreshing of views is impossible to work around. If we could pause the refreshing of new data while the data continues to be compiled and reduced in the background, we would be able to use the plots more efficiently and look more closely at the data. Once we are done looking at what we wanted, we would refresh the live view, insert the data that had been reducing in the background into the live data view, and continue running the program as normal.

I understand this is a lot for one issue, but this has to be fixed because we aren't able to use the live data, and it obstructs the usability of Redux, especially for users with large-n detrital samples. I hope the ideas help; let me know what you all are thinking.

Unable to open PROBABILITY-DENSITY.svg files in AI

I am getting an error message when I try to open PROBABILITY-DENSITY.svg files created by Redux in Illustrator. The error reads "The operation cannot be completed because of an unknown error. [CANT]" To test whether this was a general problem, I generated an SVG file of a concordia plot, and I was able to open it in Illustrator without any problem. It seems to be a problem with the files being created for the probability density plots. I am running Redux 3.6.6 and Illustrator CS5.1.

Remember settings

I think it would be useful to remember the settings when loading a sample. This would save literally milliseconds but would be very useful once there are more labs involved. It could get messy and error-prone if we have to search for a specific lab each time. I already noticed the first pull-down menu on the loading page is saved. Nice touch. If all three could be remembered, that would be swell.

Post GSA suggestions

Since GSA has come and gone, I have a few suggestions to make the workflow a bit easier. These ideas are cosmetic and aimed at overall ease of use.

  1. Be able to add and remove unknown spots in groups rather than individually, e.g. by highlighting a bunch to add to the removed-fractions area. This feature would be very useful when creating weighted means, especially if we could use it in the sample dates window to make it easier to select the population of grains we want for the weighted mean.
  2. Adjust the size of windows when they appear. At GSA our live data window was so huge that it was difficult to maneuver; make the windows a more reasonable size.

Secondary RM designation not 'sticky'

Open the 'Project Manager' or process new raw data to bring up the 'Project Manager' window, then change the role of one or more of the 'Unknowns' to 'Secondary RMs'. The dropdown menu reads 'Secondary RM' until you hit 'Save Project', when the roles spontaneously switch back to 'Unknown'. The role should stay as 'Secondary RM'.

ADD: Most recent acquisition age window in live data reduction

This relates to the loading-time issue we have been dealing with. When a new fraction is added to the live data table it goes to the bottom of its designated column. This should continue, but we would like a window that shows us the name of the last acquisition along with the 6/8, 7/5, and 7/6 ages and uncertainties. This would be helpful when Redux begins its infinite loop and we can't even scroll down to the data table. If the ages and uncertainties were added to a box under the concordia, so we can see them without scrolling down, that would be especially useful.

Legacy data

When creating a new legacy data entry and trying to upload to SESAR/Geochron, there is a problem with getting an IGSN.

It would help if we had the same option here as we do when putting data up to Geochron after we reduce it; this would make the process much smoother.

So, if we could get a field to validate our GeoPass credentials and then have SESAR mint an IGSN for our legacy data, that would solve it.

Pulling Standards out for only 6/8 or 6/7

Hi,

As I have been working with the data table and dealing with poorly behaved standards, I haven't figured out how to remove a standard from just one type of fractionation, whether it's 6/8 or 6/7. When I remove that fraction, it takes it out of both even when there is only an issue with one (6/7 or 6/8). Is there any way to select or choose which one I would like to take a standard out of?

Problem opening 3.6.9 on Windows

I have tried multiple times to open Redux version 3.6.9. I get to the home screen, but when I select LA-ICPMS nothing happens. It doesn't crash; it just doesn't open.

Common Pb correction for 202 spiked samples

When Pbc is greater than blank Pbc, one has the option of partitioning the Pbc subtraction among reservoirs with different compositions (e.g., blank + Stacey and Kramers). However, if the sample is spiked with ET2535, this function does not work. While it is uncommon to need this (given that if Pbc is an issue, mass fractionation is not likely a big source of uncertainty), in working on a subset of very rare and very old zircons, it made sense to spike and analyze some chemical-abrasion leachates derived from step-leaching single zircons. These leachates have a couple of pg of Pbc instead of the usual amount, and it would be nice to be able to test the sensitivity of those dates to the Pbc correction.

More cosmetic changes: Fractionation plots on Live view

On the live data reduction we would like a button to show or hide the fractionation plots. The fractionation plots are excellent for the minders and people familiar with U/Pb geochronology, but for visitors and people learning for the first time they can get rather confusing. So we would like to be able to hide those plots when necessary and have the concordia and PDF plots take up the space instead.

Once again, the plots in the live reduction are very important for seeing what's going on, so all we request is the ability to hide them when we don't need to be monitoring them.

-Dan

Measured data top panel?

Hi Jim.

We have just upgraded to the new version of ET Redux. In the new version the top panel that contained the measured ratios is gone. Some of us find that panel useful: we can look at the measurement uncertainty and the sample/tracer ratio, and practically it helps us check reduced data against measurement files (sometimes handy). Would it be possible to reinstate it? Noah said people found it distracting, but it has some use for some of us, and it could be hidden now that you have this draggable divider?

Cheers,

Dan, Blair, Diana, and Simon.

Filters resetting in Live view

When the filters for discordance, reverse discordance, and uncertainties are set, they are reset to the default values after every new grain. It would be great if they locked after you set them, so we don't have to keep adjusting them. It would also help if we could set default filters for these before we begin the run.

Rebase `v3.0.4` release code into `master`

Since v3.0.4 has been released to the public and is considered stable, it should be merged into master.

I would recommend the following (perhaps in a fresh clone):

  • git checkout master
  • git pull [upstream master], updating master
  • git fetch --tags [upstream], ensuring that the tag v3.0.4 is known locally
  • git rebase v3.0.4, merging the changes without creating a new commit
  • git push [upstream master]

Note that the parameters in brackets are only used when working from a clone of a fork (and assume that upstream is a remote pointing to CIRDLES/ET_Redux).

Having trouble generating weighted means

Hi,
I'm having trouble generating weighted means. I haven't been able to do this for any of my data sets; I will send an email with a Redux file.
Thanks!
Clay

Incorporate SL and R33 standards

All is going well with the newest version of Redux for the E2 lab here at LaserChron! Recent visitors have loved all the features of the live data reduction. Right now Redux is only using FC to reduce the data. However, we use SL and R33 in AgeCalc as well. We are wondering if it would be possible to have these standards included in the fractionation plots too?

Storing accepted parameter models as resources.

Release 3.2.1 takes an initial pass at storing accepted parameter models as resources, starting with the GJ1 mineral standard model. This enhancement requires designing and implementing a more general and robust solution.
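As a rough illustration only (the resource path and class name below are hypothetical and do not reflect ET_Redux's actual layout), parameter models bundled as resources are typically placed on the classpath and read at startup:

  import java.io.IOException;
  import java.io.InputStream;
  import java.nio.charset.StandardCharsets;

  // Hypothetical sketch of reading a bundled parameter model from the classpath.
  // A path such as "models/GJ1.xml" is an assumption, not ET_Redux's real resource layout.
  public class ParameterModelLoader {
      public static String loadModelXML(String resourcePath) throws IOException {
          try (InputStream in = ParameterModelLoader.class
                  .getClassLoader().getResourceAsStream(resourcePath)) {
              if (in == null) {
                  throw new IOException("Missing bundled model: " + resourcePath);
              }
              // InputStream.readAllBytes requires Java 9 or newer.
              return new String(in.readAllBytes(), StandardCharsets.UTF_8);
          }
      }
  }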

Aliquot window in live workflow

I am running Redux 3.6.6. Aliquot manager windows will not open while Redux is in live workflow. This may be intentional, but it took me a while to figure out why the windows were not opening. If it is necessary to stop live workflow when the aliquot manager window is open, I think it would be better to have this happen automatically rather than the aliquot manager window simply not opening. When a user tries to open an aliquot manager window, this action could automatically stop live workflow.

Gray text in results table

New samples in the results table remain gray even after the initial data is input. Noah confirmed this behavior on his computer and e-mailed Jim an example Redux file a couple of weeks ago. I have since noticed that if I open and close the sample dates window, this fixes the problem and all fractions that contain data turn black.

Pb204 background issues

When we run Redux, the Pb204 baseline has a value below zero. I have looked at what our actual backgrounds are, and I am seeing that we start around ~300 cps. This is quite a bit different and will show more reverse discordance in older grains, which is a trend we have recently been seeing. Hopefully we can get a fix for this as soon as possible; I know the math may make that complicated, but I can hope.

Regressions on TW plots

It would be nice to be able to do regressions on TW plots for LA-ICP-MS data, with the option to peg the upper end of the regression to a given 207Pb/206Pb.

Trouble with refitting functions

While refitting individual functions for FC reference materials, I ran into issues when I tried to manually adjust the y-intercept. It would move and adjust how I wanted, but as soon as I went to the next FC and opened the local fit functions, the y-intercept for the very last FC I worked on reverted to its original value. If the y-intercept, once changed, could lock to that new value instead of changing every time a different analysis is updated, that would make fitting functions much less frustrating.

-DA

Live mode crash

Live mode went to a blank screen around spot 360 on a large-n run. This is the first time I've seen it crash, but this was with 3.6.12. I will check whether it was just a user error.

Archiving 2nd aliquot to geochron.org

I just archived two aliquots from the same sample to geochron.org. In the 'Archive Aliquot to Database' window, the first time I went to upload the second aliquot, the following text was displayed in red: "Note: This aliquot exists in Geochron. You may update it by choosing 'overwrite'." I think Redux noticed that I had already archived the first aliquot with the same IGSN, but this message wasn't true since I hadn't uploaded the second aliquot yet. I chose 'Overwrite' and the upload was successful, but the message was confusing.

Thermo Finnigan Neptune data

Dear Mr. Bowring, I'd like to know if ET_Redux is currently able to handle data from the Thermo Finnigan Neptune?

Felipe

Error importing legacy TIMS data

When using the csv import for legacy data, I am getting an error message. I am using the MIT legacy csv template. I then go to new sample from legacy analysis>ID TIMS legacy analysis (MIT) and select the csv. When I do this, I get an error saying "Document error: Premature end of file". If I press okay, it then takes me to the usual csv import screen and still seems to work otherwise. I am running ET_Redux 3.6.6. I was not able to attach a csv to this report, but can e-mail one of the files.

Mac keyboard shortcuts

This is pretty minor, but as I was pasting fraction weights into the 'manage sample' window, I noticed that the default Mac keyboard shortcut for pasting (command-V) doesn't work; instead it requires control-V. It looks like other keyboard shortcuts (select all, copy, cut) also require 'control' instead of 'command'. I'm not sure if it is possible to have separate keyboard shortcuts for Mac and Windows in the same Java program, but I wanted to check (a sketch of one possible approach follows below). Thanks!

Ryan Frazer
UNC-Chapel Hill
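For what it's worth, Swing can pick the platform's menu shortcut key at runtime, so a single binding covers Command on macOS and Ctrl on Windows/Linux. The sketch below is illustrative and not taken from the ET_Redux source; everything other than the standard AWT/Swing calls is made up.

  import java.awt.Toolkit;
  import java.awt.event.KeyEvent;
  import javax.swing.JMenuItem;
  import javax.swing.KeyStroke;

  // Sketch: bind "paste" to the platform menu shortcut (Command on macOS, Ctrl elsewhere).
  public class PlatformShortcuts {
      public static JMenuItem pasteMenuItem() {
          // getMenuShortcutKeyMask() resolves to the Command mask on macOS and Ctrl elsewhere.
          int shortcutMask = Toolkit.getDefaultToolkit().getMenuShortcutKeyMask();
          JMenuItem paste = new JMenuItem("Paste");
          paste.setAccelerator(KeyStroke.getKeyStroke(KeyEvent.VK_V, shortcutMask));
          return paste;
      }
  }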

upper intercept regression

There is a bug with the upper intercept regressions when using ET_redux 3.0.4 to regress legacy data entered as a UCSB LASS project. The regressions work, but when I select "truncate regression curve below concordia" in the concordia plot display settings, the regression line disappears completely.

Scroll bars within Raw data manager

I was working in the raw data window today, repeatedly switching between the fit-functions tabs for individual reference materials and the measured-intensities tabs. When I did this, the horizontal scroll used to navigate the individual fractions stayed wherever I was on the previous tab. This is frustrating because the tabs aren't perfectly lined up with each other, so when I'm trying to fit functions for FC-21 and I want to check the measured intensities or see the whole session plot, the desired tab appears at the same scroll position where I initially was. In short, it was confusing; if the scroll bars could be independent for each separate tab, that would make flipping between tabs in the raw data manager much easier.

-DA

Resetting of filters in live data reduction

When a new fraction is added to the data table, both the concordia and the PDP are refreshed to add that new fraction. When the plots reset, they automatically revert the positive and negative discordance filters back to 100%. This doesn't make sense, as you only have a short amount of time to view the changes before the next acquisition is added. We would like the filters to stay at the values we give them until we change them again or turn them off; a new fraction would then only be plotted if it passes the set filter values. This would give us a much better visualization of the data, with the filters no longer being constantly reset.

Sample-by-sample views of unknowns

The sample-by-sample viewing of unknowns, controlled by the dropdown list of samples at the top left of the Raw Data Manager window is pretty cool, and a great way to look through the data for secondary reference materials and unknowns that are all (roughly) the same age. However, locking the y-axes together for all of the unknowns, especially for a detrital sample, forces analyses with very different scales onto the same y-axis. The result is that it's difficult to see the data for any one of the analyses.

  • One solution would be a three-way radio-button choice: { 'lock y-axis', 'scale to all data', 'scale to included data' } (see the sketch after this list). The first would plot (all ratios of) all fractions on the same y-axis, which is nice for primary and secondary reference materials and for unknowns that are repeat measurements of the same expected age. The second would re-scale each fraction's plots to the min and max observed values, just like we did before the sample-specific views arrived. The third option would re-scale to the max and min observed values that have not been rejected, which is useful if there is an extreme outlier that forces the scale to be much larger than the relatively smaller scatter in the other data. If you wanted to get fancy, you could even default the position of this radio button depending on the 'analysis purpose': for unknowns with a 'SingleAge' analysis purpose, default to 'lock y-axis'; for unknowns with a 'DetritalSpectrum' analysis purpose, default to 'scale to all data'.
  • And one last thing. At present, you need to hit the 'ReFit Functions' button to recalculate any fits after rejecting data, or else the changes don't affect any subsequent calculations. The 'ReFit Functions' button also causes the axes to rescale back to their default scaling. For the newly inter-locked unknowns, this means that you not only lose any scaling you've done for that fraction with a local y-axis, but also for any other fraction that you've rescaled. The 'ReFit Functions' button should never rescale the axes.
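A minimal sketch of how the three proposed scaling modes could compute y-axis bounds; the types and names here are illustrative, not ET_Redux view classes:

  // Sketch of the three proposed y-axis modes; all names are hypothetical.
  public final class YAxisScaler {
      public enum Mode { LOCK_Y_AXIS, SCALE_TO_ALL_DATA, SCALE_TO_INCLUDED_DATA }

      // values: the plotted ratios for one fraction; included[i] is false for rejected points;
      // globalMin/globalMax: bounds over every fraction, used only by LOCK_Y_AXIS.
      public static double[] bounds(Mode mode, double[] values, boolean[] included,
                                    double globalMin, double globalMax) {
          if (mode == Mode.LOCK_Y_AXIS) {
              return new double[]{globalMin, globalMax};
          }
          double min = Double.POSITIVE_INFINITY;
          double max = Double.NEGATIVE_INFINITY;
          for (int i = 0; i < values.length; i++) {
              if (mode == Mode.SCALE_TO_INCLUDED_DATA && !included[i]) {
                  continue; // rejected points do not influence the scale in this mode
              }
              min = Math.min(min, values[i]);
              max = Math.max(max, values[i]);
          }
          return new double[]{min, max};
      }
  }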

Add: Tab to change background time

As of now, the background index Redux uses for calculating the baseline is correct for our U/Pb analyses. We would like to be able to change the index over which Redux calculates this baseline in the parameters window before we start a run. This would let us manually change which indices are selected for the baseline without having to get a patch from Jim. Right now our U/Pb method on the Element 2 is excellent and we don't expect any changes in the near future. Having a default setting (what it is set to now) and then being able to edit and save new settings would be excellent. Thanks

-DA

Redux plotting histogram incorrectly

Redux appears to be plotting a histogram incorrectly. The sample has 22 data points, but the histogram bars only account for 21 data points.

I cannot attach the redux file here, but will e-mail it (sample 06-12).

The problem occurs when I set the histogram bin width to 50. The plot appears to miss the data point with a date of 1059 Ma.
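One common cause of this kind of off-by-one is a value landing exactly on the upper edge of the last bin when the bin index is computed by truncation. The sketch below is not the ET_Redux plotting code, just an illustration of binning that keeps the maximum value in range; it assumes all dates lie within [min, max]:

  // Sketch: count dates (Ma) into bins of fixed width, clamping so the maximum stays in the last bin.
  public class HistogramBinning {
      public static int[] binCounts(double[] dates, double min, double max, double binWidth) {
          int nBins = (int) Math.ceil((max - min) / binWidth);
          int[] counts = new int[nBins];
          for (double d : dates) {
              // Truncation alone would drop a date equal to 'max' into bin nBins, off the end.
              int bin = Math.min((int) ((d - min) / binWidth), nBins - 1);
              counts[bin]++;
          }
          return counts;
      }
  }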

Some cosmetic upgrades for live reduction

These are just a few simple cosmetic adjustments George has requested for the live data reduction window.

  • Adding a full-screen button to easily maximize and minimize the window.
  • Removing the R33 and SL analyses from the PDP plot so it only displays the unknowns.
    This will make the view easier for visiting researchers who are unsure about using the software.
  • Reducing the size of the report table if we choose not to have all the other columns active, and having the slider bar used to move up and down the table correspond to the resizing.
    This will make our scrolling through the data table easier and more efficient.
  • Last, having a most-recent-age window that displays the latest analysis with the following information: sample name, 6/8 age and uncertainty, 7/5 age and uncertainty, 6/7 age and uncertainty.
    This would be great for visiting researchers, so they don't have to search for the most recent grain, have the views refresh on them, and start their search over. When the views refresh, the report table defaults back to the top, starting with the FC grains. Having this window would make viewing much easier and less stressful for visitors.

Other than those cosmetic issues, both the fast uncertainties and the full propagation are working incredibly well. The speed at which the data comes in is incredible, and over these past two weeks the fractionation plots have been looking excellent. George has also noticed that the uncertainties in the live views are almost 4 times the uncertainties we get in AgeCalc, for all three different ages. I still have to work more with the actual data manager mode to see if I can bring those uncertainties closer to AgeCalc.

Thanks;
-Dan

202-205-235 Tracer and Common Pb correction

Is there a possibility of allowing a 202-205-235 spike in ET_Redux (the spike we use in Oslo)?

Redux does not seem able to perform common Pb corrections on my data. I only register the 206/204 ratio in Tripoli; could that be the problem? Could you please advise me on which ratios to read?

Thanks in advance for your help.

Make test suite operational

As of 175db2f, there's a large test suite that's disabled out of the box in pom.xml.

After some prodding, it seems that there are a few issues preventing the test suite from being a useful verification tool:

  • Test classes do not conform to the naming convention <source class>Test, hampering IDE features (such as NetBeans's Test File).
  • ETException's constructor opens a dialog, forcing the developer to manually close several of them during automated testing.
  • Some tests create files in the developer's project folder instead of using temporary files or virtual filesystems.
  • After correcting ETException and reenabling the test suite, three tests fail.
  • The tests are not readable enough to clearly communicate their purposes and methods.

While improving readability is a big task, the first four points are not (a small sketch addressing the naming and temporary-file points follows below). I'll try to get things working in the coming week.
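As a hedged example of the naming-convention and temporary-file points (the class under test and the file name are placeholders, not an actual ET_Redux test):

  import static org.junit.Assert.assertTrue;

  import java.io.File;
  import org.junit.Rule;
  import org.junit.Test;
  import org.junit.rules.TemporaryFolder;

  // Named <source class>Test (here for ValueModel) so IDE test navigation can find it.
  public class ValueModelTest {

      // Files created here are deleted after each test instead of littering the project folder.
      @Rule
      public TemporaryFolder tempFolder = new TemporaryFolder();

      @Test
      public void serializesWithoutTouchingTheProjectFolder() throws Exception {
          File target = tempFolder.newFile("valueModel.ser");
          // ... serialize a ValueModel instance to 'target' here ...
          assertTrue(target.exists());
      }
  }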


Update: While messing with the tests on another machine, I was unable to replicate the test failures that I originally reported.

Weighted mean calculations in live workflow

There appears to be one or more bugs when calculating the weighted mean by sample in the Sample Data Interpretations window when live workflow is running. I have seen two different behaviors. The first is that when I try to calculate a weighted mean by sample for a sample with two different aliquots, Redux kept un-checking and excluding the final fraction from the second aliquot each time live workflow updated. Similarly, when I just tested this issue on my laptop, when I went to the weighted mean window, then clicked the "by sample tab", the aliquot fraction folder kept closing. I opened the folder to select what aliquots to include in the weighted mean calculation, but each time live workflow updated, it would close the folder.

More info on pdf output

When outputting a weighted mean to PDF, please place the fraction information from the left-hand column in the file as well. That way all the information is available to the user.

Extract extraneous main methods into JUnit tests

A quick grep reveals that there are currently 63 classes with main methods, only one of which is being used as an entry point for the program.

At a glance, many of these mains seem to be integration tests for object serializations. It would be straightforward and beneficial to extract these methods into JUnit tests, so that they could be run automatically.
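For instance, a serialization smoke test that currently lives in a main method can usually move into a JUnit test almost verbatim; the sketch below uses placeholder helpers rather than any real ET_Redux class:

  import static org.junit.Assert.assertNotNull;

  import org.junit.Test;

  // Sketch only: buildObjectUnderTest() and roundTrip() stand in for whatever the old main did.
  public class SerializationRoundTripTest {

      @Test
      public void objectSurvivesSerializationRoundTrip() throws Exception {
          Object original = buildObjectUnderTest();  // was: constructed inside main(String[])
          Object restored = roundTrip(original);     // was: results inspected by hand
          assertNotNull(restored);                   // was: implicit "it didn't crash"
      }

      private Object buildObjectUnderTest() { return new Object(); }

      private Object roundTrip(Object o) { return o; } // placeholder for serialize + deserialize
  }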

For reference, below is the output of grep -rl 'void main' ., showing all files with a main method.

./src/main/java/org/earthtime/dataDictionaries/RatioNamePrettyPrinter.java
./src/main/java/org/earthtime/ETRedux.java
./src/main/java/org/earthtime/matrices/matrixModels/CorrelationMatrixModel.java
./src/main/java/org/earthtime/matrices/matrixModels/CovarianceMatrixModel.java
./src/main/java/org/earthtime/matrices/matrixModels/CovarianceMatrixWithSubMatricesModel.java
./src/main/java/org/earthtime/matrices/matrixModels/JacobianMatrixModel.java
./src/main/java/org/earthtime/physicalConstants/PhysicalConstants.java
./src/main/java/org/earthtime/ratioDataModels/initialPbModelsET/commonLeadLossCorrectionSchemes/CommonLeadLossCorrectionSchemeA1.java
./src/main/java/org/earthtime/ratioDataModels/initialPbModelsET/commonLeadLossCorrectionSchemes/CommonLeadLossCorrectionSchemeA2.java
./src/main/java/org/earthtime/ratioDataModels/initialPbModelsET/InitialPbModelET.java
./src/main/java/org/earthtime/ratioDataModels/mineralStandardModels/MineralStandardUPbModel.java
./src/main/java/org/earthtime/ratioDataModels/pbBlankICModels/PbBlankICModel.java
./src/main/java/org/earthtime/ratioDataModels/physicalConstantsModels/PhysicalConstantsModel.java
./src/main/java/org/earthtime/ratioDataModels/rareEarthElementsModels/RareEarthElementsModel.java
./src/main/java/org/earthtime/ratioDataModels/tracers/TracerUPbModel.java
./src/main/java/org/earthtime/ratioDataViews/MineralStandardUPbRatiosDataViewEditable.java
./src/main/java/org/earthtime/ratioDataViews/MineralStandardUPbRatiosDataViewNotEditable.java
./src/main/java/org/earthtime/ratioDataViews/PhysicalConstantsDataViewEditable.java
./src/main/java/org/earthtime/ratioDataViews/PhysicalConstantsDataViewNotEditable.java
./src/main/java/org/earthtime/ratioDataViews/RatiosDataViewEditable.java
./src/main/java/org/earthtime/ratioDataViews/RatiosDataViewNotEditable.java
./src/main/java/org/earthtime/ratioDataViews/TracerUPbRatiosDataViewEditable.java
./src/main/java/org/earthtime/ratioDataViews/TracerUPbRatiosDataViewNotEditable.java
./src/main/java/org/earthtime/UPb_Redux/aliquots/UPbReduxAliquot.java
./src/main/java/org/earthtime/UPb_Redux/customJTrees/CheckBoxNodeTreeSample.java
./src/main/java/org/earthtime/UPb_Redux/dateInterpretation/vermeeschKDE/FFT.java
./src/main/java/org/earthtime/UPb_Redux/dateInterpretation/vermeeschKDE/Tester.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/AboutBox.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/DialogEditor.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/fractionManagers/FractionNotesDialog.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/graphManagers/GraphAxesDialog.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/projectManagers/BasicDnD.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/projectManagers/ProjectManagerFor_LAICPMS_FromRawData.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/sampleManagers/GeochronSampleCustomMetadataDialog.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/sampleManagers/heatMapManagers/HeatMapManager.java
./src/main/java/org/earthtime/UPb_Redux/dialogs/sampleManagers/sampleDateInterpretationManagers/SampleDateInterpretationAny2AxesChooser.java
./src/main/java/org/earthtime/UPb_Redux/fractions/AnalysisFraction.java
./src/main/java/org/earthtime/UPb_Redux/fractions/Fraction.java
./src/main/java/org/earthtime/UPb_Redux/fractions/UPbReduxFractions/UPbFraction.java
./src/main/java/org/earthtime/UPb_Redux/initialPbModels/InitialPbModel.java
./src/main/java/org/earthtime/UPb_Redux/mineralStandardModels/MineralStandardModel.java
./src/main/java/org/earthtime/UPb_Redux/pbBlanks/PbBlank.java
./src/main/java/org/earthtime/UPb_Redux/reports/ReportSettings.java
./src/main/java/org/earthtime/UPb_Redux/samples/SampleMetaData.java
./src/main/java/org/earthtime/UPb_Redux/samples/SESARSampleMetadata.java
./src/main/java/org/earthtime/UPb_Redux/tracers/Tracer.java
./src/main/java/org/earthtime/UPb_Redux/utilities/BrowserControl.java
./src/main/java/org/earthtime/UPb_Redux/utilities/comparators/IntuitiveStringComparator.java
./src/main/java/org/earthtime/UPb_Redux/valueModelPanelViews/MineralStandardUPbRatiosPanelViewNotEditable.java
./src/main/java/org/earthtime/UPb_Redux/valueModelPanelViews/ValueModelsPanelViewEditable.java
./src/main/java/org/earthtime/UPb_Redux/valueModelPanelViews/ValueModelsPanelViewNotEditable.java
./src/main/java/org/earthtime/UPb_Redux/valueModels/MeasuredRatioModel.java
./src/main/java/org/earthtime/UPb_Redux/valueModels/MineralStandardUPbRatioModel.java
./src/main/java/org/earthtime/UPb_Redux/valueModels/SampleDateModel.java
./src/main/java/org/earthtime/UPb_Redux/valueModels/ValueModel.java
./src/main/java/org/earthtime/UPb_Redux/valueModels/ValueModelReferenced.java
./src/main/java/org/earthtime/UPb_Redux/valueModelViews/MineralStandardUPbRatioViewNotEditable.java
./src/main/java/org/earthtime/UPb_Redux/valueModelViews/ValueModelViewNotEditable.java
./src/main/java/org/earthtime/utilities/jamaHelpers/MatrixRemover.java
./src/main/java/org/earthtime/utilities/TimeToString.java
./src/main/java/org/earthtime/visualizationUtilities/agePicker/AgePickDemo.java
./src/main/java/org/earthtime/visualizationUtilities/ProgressBarViewer.java
./src/main/java/org/earthtime/xmlUtilities/SimpleTransform.java

Labeling for standard numbers in live mode

I have noticed that standards are mislabeled in live mode.
FC goes as follows: after FC-4, numbers start to be skipped and only even numbers appear:
FC-1
FC-2
FC-3
FC-4
FC-6
FC-8
FC-10
FC-12
FC-14
etc.

R33 and SL follow a similar trend but start at index two and then continue to skip odd numbers.

This seems to be just a minor labeling issue, but to visitors it may appear to be a problem with the standards.

error ellipses

Hello!
The issue: as data is being pulled into Redux, the uncertainties of 206/207 seem fine, but the 206/238 ages and errors fly through the roof the longer the acquisition continues. We witnessed an unknown of 180 Ma, and the very next aliquot, also an unknown, came in at 3.9 Ga. Subsequently, the prior aliquot (180 Ma) increased to 227 Ma, and its uncertainties rose. It looks like the unknowns are 'correcting' each other.

Default 'fractionation technique'

Whenever a saved LA-ICPMS project is opened in Redux, the first time the user clicks 'Project Raw Data', the 'fractionation technique' defaults to 'Downhole', even if the sample was interpreted and saved with the 'Intercept' method. This choice is important and should be preserved through opening and closing the file. When opened for the first time, the 'Project Raw Data' window should default to the last saved choice for fractionation method and display that output.

Trouble loading an LA-ICPMS raw data file

Hi all,

I have some raw LA-ICPMS data I am trying to reduce, but for some reason ET_Redux will load it to 6% and then stop. I tried loading the dataset multiple times on different versions of Redux and every attempt failed at 6%. I've attached a .zip file containing the raw data in a separate email. I think it probably has something to do with the raw data file; however, I can't find any obvious differences between this dataset and other functional ones.

Thanks,
Clay
