We have been running our live data reduction on every sample for the past few weeks and have found some things that could improve the output data. I will be creating a few issue posts, as some are longer than others.
Our first issue is as follows. Once we reach roughly 170 analyses, the program enters what feels like an endless cycle of loading and number crunching. Redux performs so much math that we are unable to reject fractions or change filters; it behaves as if frozen, though it does eventually refresh. We watch the grey progress bar reach 99%, the fraction is added and all views refresh, but Redux then immediately starts reducing the next fraction. This never gives us a chance to inspect what was just reduced, let alone adjust sliders or zoom in on the concordia or the PDP, and the extremely long loading time also prevents us from removing unwanted fractions. As a result, Redux falls significantly behind the mass spec. We need the data to import more smoothly, or at least in a reasonable amount of time. Two possible solutions: break long runs into distinct chunks, or run the reduction in the background and provide a manual button to import or display the data that was reduced there.
As an example of breaking a run into chunks or segments: for a run of n=300, could the whole run be split into thirds? That would let us work with the reduced data from the first 100 analyses while the next 100 are being reduced. Another possibility would be something similar to Gehrels’ AgeCalc program, which uses a continuously running six-analysis moving average to reduce the data and leaves data outside that window untouched rather than recalculating it.
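To make the chunking idea concrete, here is a minimal sketch (not the Redux implementation; the function and data names are hypothetical) of yielding reduced results one chunk at a time, so the first 100 analyses are usable while the next 100 are still being crunched:

```python
from typing import Callable, Iterator, List

def reduce_in_chunks(analyses: List[dict],
                     reduce_fn: Callable[[List[dict]], List[dict]],
                     chunk_size: int = 100) -> Iterator[List[dict]]:
    """Yield reduced results one chunk at a time, so earlier chunks
    can be plotted and filtered while later ones are still reducing."""
    for start in range(0, len(analyses), chunk_size):
        chunk = analyses[start:start + chunk_size]
        # Caller can update concordia/PDP views with this chunk now,
        # instead of waiting for the whole run to finish.
        yield reduce_fn(chunk)

# Toy usage: an n=300 run split into thirds (identity "reduction").
run = [{"id": i} for i in range(300)]
chunks = list(reduce_in_chunks(run, lambda c: c, chunk_size=100))
```

The same loop structure would also accommodate the AgeCalc-style approach, by making the window slide one analysis at a time instead of advancing a full chunk.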
My second idea is to pause or hide the automatic refresh. For example, when we are 100 analyses in, examining a population, applying discordance or reverse-discordance filters, and selecting our best age, we need time to do this, but the constant loading and refreshing of views makes that impossible right now. If we could pause the refreshing of new data while the data continues to be compiled and reduced in the background, we could use the plots more efficiently and look more closely at the data. Once we are done, we would refresh the live view, pulling the background-reduced data into the live data view, and continue running the program as normal.
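A hedged sketch of this pause-and-buffer idea (all class and method names here are hypothetical, not Redux API): a worker thread keeps reducing incoming fractions into a buffer, and the UI only drains that buffer when the user has not paused the refresh.

```python
import queue
import threading

class BackgroundReducer:
    """Reduce incoming fractions on a worker thread and buffer the
    results, so the UI refreshes only when the user is not paused."""

    def __init__(self, reduce_fn):
        self._reduce_fn = reduce_fn
        self._incoming = queue.Queue()
        self._reduced = []             # finished results, not yet shown
        self._lock = threading.Lock()
        self.paused = False            # toggled by a UI pause button
        worker = threading.Thread(target=self._run, daemon=True)
        worker.start()

    def submit(self, fraction):
        """Called as each new fraction arrives from the mass spec."""
        self._incoming.put(fraction)

    def _run(self):
        while True:
            fraction = self._incoming.get()
            result = self._reduce_fn(fraction)  # crunching off the UI thread
            with self._lock:
                self._reduced.append(result)
            self._incoming.task_done()

    def drain(self):
        """Called by the UI refresh timer; yields nothing while paused,
        so plots stay still while reduction continues underneath."""
        if self.paused:
            return []
        with self._lock:
            batch, self._reduced = self._reduced, []
        return batch

# Toy usage: pause, feed five fractions, confirm nothing surfaces
# until the user unpauses.
r = BackgroundReducer(lambda f: f * 2)
r.paused = True
for f in range(5):
    r.submit(f)
r._incoming.join()        # wait until the worker has reduced everything
while_paused = r.drain()  # empty: views would not refresh
r.paused = False
after_unpause = sorted(r.drain())
```

The manual "import now" button from the first paragraph maps naturally onto the same `drain()` call, invoked on demand instead of on a timer.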
I understand this is a lot for one issue, but it has to be fixed: as it stands we cannot use the live data, and it undermines the usability of Redux, especially for users with large-n detrital samples. I hope these ideas help; let me know what you all are thinking.