nsls-ii / ptycho_gui

NSLS-II Ptychography Software - frontend

License: MIT License
This is important info that is currently missing. The GUI needs to detect it automatically. Perhaps these values were not stored in the databroker? Need to check.
Related: #8
For better flexibility, the GPU buttons (0, 1, 2, 3) should be replaced by a text field so that an arbitrary selection of GPUs can be made (ex: 0, 2-4, 6, 8; similar to how batch processing is specified). This also removes the limitation of 4 GPUs and avoids the need for auto-detection (by giving users full control).
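A minimal sketch of parsing such a text field (the function name and exact grammar are illustrative, not existing code):

```python
def parse_gpu_spec(spec):
    """Parse a GPU text-field entry like '0, 2-4, 6, 8' into a sorted
    list of device indices, mirroring the batch-processing syntax."""
    gpus = set()
    for token in spec.split(','):
        token = token.strip()
        if not token:
            continue
        if '-' in token:
            # a range like '2-4' expands to 2, 3, 4
            lo, hi = token.split('-')
            gpus.update(range(int(lo), int(hi) + 1))
        else:
            gpus.add(int(token))
    return sorted(gpus)

# parse_gpu_spec('0, 2-4, 6, 8') -> [0, 2, 3, 4, 6, 8]
```

Duplicates and ordering are handled by the set, so '4, 2-4' and '2-4' select the same devices.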
Using shared memory for only two integers is clearly overkill...
This can be done by including the Cython-generated C code in the backend.
Some of this was planned with @xiaojinghuang, but we didn't have time to implement it...
Observation from #20: matplotlib v3.0.3 doesn't seem to behave properly on line plots. The errors aren't plotted at all.
This is due to an unexpected code path introduced in 1d873d4. After initializing the mmap, the string "init_mmap" was mistakenly passed to the error windows, so subsequent data were (silently) rejected.
This does no harm:
Not sure if this is related: every time the start button is clicked, the following warning is always shown:
'c' argument looks like a single numeric RGB or RGBA sequence, which should be avoided as value-mapping will have precedence in case its length matches with 'x' & 'y'. Please use a 2-D array with a single row if you really want to specify the same RGB or RGBA value for all points.
For the Fermat scan (and other non-mesh scans), the x/y step size is technically ill-defined. Perhaps change the window elements dynamically based on the scan type? That's probably too much work, though...
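One possible way to report a meaningful number for non-mesh scans is to quote the median nearest-neighbor distance as an "effective step size". A dependency-light sketch of that idea (a suggestion, not existing code; the random points stand in for real Fermat positions):

```python
import numpy as np

# Hypothetical non-mesh positions; a real Fermat spiral would come
# from the scan record instead
rng = np.random.default_rng(0)
pts = rng.random((200, 2))

# Pairwise squared distances; mask out self-distances on the diagonal
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
np.fill_diagonal(d2, np.inf)

# Median nearest-neighbor distance as the "effective step size"
effective_step = float(np.median(np.sqrt(d2.min(axis=1))))
```

The GUI could display this single number in place of separate x/y step sizes whenever the scan type is not a mesh.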
...such as NSLS-II/PyXRF#191.
A number of examples:
Although the Python mmap doc says
class mmap.mmap(fileno, length, flags=MAP_SHARED, prot=PROT_WRITE|PROT_READ, access=ACCESS_DEFAULT[, offset])
(Unix version) Maps length bytes from the file specified by the file descriptor fileno, and returns a mmap object. If length is 0, the maximum length of the map will be the current size of the file when mmap is called.
On Mac OS X, however, the length=0 rule seems not to be respected and causes an OSError: [Errno 22] Invalid argument. We should instead set length to the size of the allocated shared memory, available through shm.size. See, e.g., https://github.com/osvenskan/posix_ipc/blob/46eae821f56f364a3afd6d68b3562adeced3794a/tests/test_memory.py#L135-L165.
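The fix boils down to always passing an explicit length. A self-contained sketch using a temporary file as a stand-in for the POSIX shared-memory segment (with posix_ipc one would use shm.fd and shm.size instead):

```python
import mmap
import os
import tempfile

# A temp file stands in for the shared-memory segment here
fd, path = tempfile.mkstemp()
try:
    os.ftruncate(fd, 4096)            # allocate the "segment"
    size = os.fstat(fd).st_size       # analogous to shm.size
    with mmap.mmap(fd, size) as buf:  # explicit length: safe on macOS too
        buf[:5] = b'hello'
        data = bytes(buf[:5])
finally:
    os.close(fd)
    os.unlink(path)
```

Relying on length=0 works on Linux but, as noted above, is exactly the code path that raises Errno 22 on macOS.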
If the GUI crashes violently, it is possible that the shared memory is not cleaned up. Try doing a search (based on ownership) and cleaning up at GUI start time?
Challenges: where to search for leftover segments (e.g., /dev/shm)?

Related: NSLS-II-HXN/profile_collection/pull/15, NSLS-II/playbooks/pull/188
HXN_databroker.py needs to be changed, and so does ptycho_gui.py.
See leofang/ptycho#31. This is mainly for avoiding supplying an MPI machine profile. Any thoughts on this are welcome.
Currently, when we plot the scan points, we assume the scanned area is a square. But this may not always be valid...
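One way to drop the square assumption is to derive the plot limits and aspect ratio from the positions themselves. A sketch with a hypothetical rectangular grid (the numbers are illustrative):

```python
import numpy as np

# Hypothetical rectangular scan grid: 2 units wide, 1 unit tall
x = np.linspace(0.0, 2.0, 50)
y = np.linspace(0.0, 1.0, 25)
xx, yy = np.meshgrid(x, y)

# Derive limits and aspect from the data, not from a fixed square
x_span = float(xx.max() - xx.min())
y_span = float(yy.max() - yy.min())
aspect = y_span / x_span  # 0.5 for this grid, not 1.0
```

The spans and aspect would then be handed to the plotting axes (limits plus an equal-scale aspect) instead of hard-coded square bounds.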
Currently we maintain, effectively, 4 distinct dicts:

- the Param() class here: https://github.com/leofang/ptycho_gui/blob/b64326d7d0db5f05bda684a55f295bf1fdeb57e2/nsls2ptycho/ptycho_gui.py#L104-L108
- configparser: https://github.com/leofang/ptycho/blob/b0b39bd849ad65842659e29dc446f9629fa1433e/utils.py#L8
- the ptycho_trans class: https://github.com/leofang/ptycho/blob/b0b39bd849ad65842659e29dc446f9629fa1433e/ptycho_trans_ml.py#L79

This has caused a significant headache every time we want to pass new attributes from the frontend to the backend: we have to update all the files above plus https://github.com/leofang/ptycho/blob/master/recon_ptycho_gui.py to get it to work, which is too tedious from a maintenance point of view.
The goal is to maintain a single dictionary across both frontend and backend. Steps to resolve this are:

- use the ptycho_trans class directly in the frontend, and dump its attributes to disk as before
- retire the Param class
- retire configparser
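A sketch of the single-container idea: one parameter class shared by frontend and backend, serialized to disk in one place. The field names below are illustrative, not the real ptycho_trans attributes:

```python
import json
import tempfile
from dataclasses import dataclass, asdict

@dataclass
class ReconParams:
    """One shared parameter container (illustrative fields)."""
    scan_num: int = 0
    n_iterations: int = 50
    algorithm: str = 'DM'

    def dump(self, path):
        # serialize for the backend, replacing the four separate dicts
        with open(path, 'w') as f:
            json.dump(asdict(self), f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            return cls(**json.load(f))

# round-trip check
with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as tmp:
    path = tmp.name
p = ReconParams(scan_num=34784)
p.dump(path)
restored = ReconParams.load(path)
```

Adding a new attribute then means touching exactly one class definition, and both sides pick it up through dump/load.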
...because it's currently one of the slowest parts of the whole process. This is connected to the save_recon() function in the backend.
Need to work with @xiaojinghuang on this.
This is needed until cupy/cupy/pull/1942 is merged to master and released on PyPI.
See leofang/ptycho/issues/23.
Accompanying issue of leofang/ptycho/issues/12.
As requested by @mrakitin.
On Mac OS X the default maximum size for each shared memory segment is 4MB:
leofang@Leos-MacBook-Pro:~/$ sysctl -A | grep shm
kern.sysv.shmmax: 4194304
kern.sysv.shmmin: 1
kern.sysv.shmmni: 32
kern.sysv.shmseg: 8
kern.sysv.shmall: 1024
machdep.pmap.hashmax: 21
security.mac.posixshm_enforce: 1
security.mac.sysvshm_enforce: 1
leofang@Leos-MacBook-Pro:~/$ ipcs -M
IPC status from <running system> as of Wed May 15 11:09:08 EDT 2019
shminfo:
shmmax: 4194304 (max shared memory segment size)
shmmin: 1 (min shared memory segment size)
shmmni: 32 (max number of shared memory identifiers)
shmseg: 8 (max shared memory segments per process)
shmall: 1024 (max amount of shared memory in pages)
This will have to be considered if the reconstructed arrays are too large.
That being said, so far I haven't found this to cause any problems:
import posix_ipc
s1 = posix_ipc.SharedMemory('/xxx', flags=posix_ipc.O_CREAT, size=4194304+1) # 1 byte larger than maximum
Refs:
http://www.spy-hill.com/help/apple/SharedMemory.html
https://stackoverflow.com/questions/36595754/shared-memory-folder-in-mac-error-shm-invalid-argument
@xiaojinghuang reported that the cropped images do not seem right starting this Tuesday or Wednesday. While the production code was not changed this week, it might be related to the recent fix 96d5586.
Currently the following tasks are done manually but should be automated:

- recompiling the Cython extension when a .pyx source file is updated

Currently the GUI is tied to @NSLS-II-HXN, but other beamlines (such as @NSLS-II-CSX) may also want to use it. We should define an API for flexible coupling to beamline-specific databroker processing functions.
Part of the work will involve creating a switch of some kind to select beamline profiles. Also, imports of beamline-specific packages should be deferred to the corresponding databroker wrappers (e.g., don't import hxntools when the main window starts).
Currently only intensity frames and scan positions are read from H5 in the backend, and the rest is read from the GUI. Therefore, it is possible that we are asking for more info than we actually need for completing the reconstruction.
TODO:

TODO: if alpha is really nothing, don't keep it there

@xiaojinghuang requested this feature so that once we identify the ROI and hot pixels for the head of a series of scans, we can use batch mode to extract data from databroker (using the same ROI and hot pixels) and run the reconstruction.
TODO:
A special method should be added such that when clicking the X button on the main window, all windows are closed (preferably with a confirmation dialogue). This would eliminate the nuisance of clicking multiple times.
Take screenshots, list the steps for reconstructions, explain the meaning of each button and field, etc.
This is a follow-up of #26. The following code snippet was commented out because it would freeze the whole window:

ptycho_gui/nsls2ptycho/ptycho_gui.py, lines 973 to 975 in 211041e:

    thread.terminate()

As a result, during batch processing, if one clicks "Stop", the H5 worker will still finish up its work in the background; only the whole processing queue is killed.
If a more recent Qt/PyQt5 is available (at least 5.10) we should revisit this problem. I suspect it's the underlying implementation causing the problem.
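Until then, a cooperative-cancellation pattern avoids terminate() altogether: the worker polls a stop flag between work items and exits at the next safe point. Sketched here with the stdlib threading module as a stand-in for QThread (the worker and its workload are illustrative):

```python
import threading
import time

class H5Worker(threading.Thread):
    """Illustrative stand-in for the H5 worker: checks a stop flag
    between work items instead of being terminated from outside."""
    def __init__(self):
        super().__init__()
        self._stop_event = threading.Event()
        self.done = 0

    def run(self):
        for _ in range(1000):
            if self._stop_event.is_set():
                return              # safe exit between work items
            time.sleep(0.001)       # one unit of "work"
            self.done += 1

    def stop(self):
        self._stop_event.set()

w = H5Worker()
w.start()
time.sleep(0.05)  # let it process a few items
w.stop()
w.join()          # returns promptly; no freeze, no terminate()
```

With QThread the same shape is available via requestInterruption()/isInterruptionRequested(), so the "Stop" button could cancel the current H5 job too, not just the queue.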
In 770b7d2 the Windows menu is disabled: https://github.com/leofang/ptycho_gui/blob/770b7d27860083d7033a3457e971676fc9158026/nsls2ptycho/ptycho_gui.py#L129-L133
This should be worked out so that even if the user accidentally closes those windows, they can still be brought back by clicking the menu items.
In setup.py (and/or accompanying code) we should:

- deal with ~/.ptycho_gui/ (can be avoided if we use RawKernel.compile() in the backend, see cupy/cupy/pull/1889); this would also involve moving configure.sh to the backend

Which is better, more robust, and perhaps backward compatible?
save_data_h5_mll_new.py
    if np.abs(angle) <= 45.:
        x = np.array(df['dssx'])
    else:
        x = np.array(df['dssz'])
    y = np.array(df['dssy'])

    points = np.zeros((2, num_frame))
    points[0, :] = x  # [:500]
    points[1, :] = y  # [:500]
Attn: @xiaojinghuang
Starting from v2.9.0, h5py supports in-memory H5 files, which means one can allocate a segment of shared memory within which an H5 object can live. If the frontend can do this, it will help avoid reading large data from disk in the backend.
Thoughts:
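A dependency-free sketch of the idea: build a file image in memory and copy it into a shared-memory segment for the backend to map. The payload below is fake (only the real HDF5 signature bytes); with h5py >= 2.9 the BytesIO could hold an actual H5 file, since h5py.File accepts file-like objects:

```python
import io
from multiprocessing import shared_memory

# Stand-in for an in-memory H5 image
image = io.BytesIO()
image.write(b'\x89HDF\r\n\x1a\n')   # the real HDF5 signature
image.write(b'\x00' * 100)          # stand-in for datasets
data = image.getvalue()

# Copy the image into shared memory so the backend can map it
# instead of re-reading large arrays from disk
shm = shared_memory.SharedMemory(create=True, size=len(data))
shm.buf[:len(data)] = data
header = bytes(shm.buf[:8])

shm.close()
shm.unlink()
```

Note that multiprocessing.shared_memory needs Python 3.8+; with posix_ipc the same copy-in/map-out pattern applies.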
This is to work with line_profiler, which is currently activated manually in the backend.
Currently we pass num_processes to ScanWindow:
https://github.com/leofang/ptycho_gui/blob/6e9ea6d5aea1a5f3642fd1acc20a8ba2c2d62718/nsls2ptycho/ptycho_gui.py#L550
However, if the MPI machine file is used, the actual MPI size is counted during initialization:
https://github.com/leofang/ptycho_gui/blob/9e62e62f409b495e83c8a24640f221704990eb58/nsls2ptycho/core/ptycho_recon.py#L99-L125
This workflow needs to be changed/refactored so that the correct size is passed to ScanWindow.
May be relevant: #21
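One option is to compute the size from the machine file up front, before the worker is launched. A sketch assuming an OpenMPI-style file, where each line is 'host' (1 slot) or 'host slots=N' (the format and helper name are illustrative; the real counting lives in ptycho_recon.py):

```python
def count_mpi_slots(lines):
    """Count total MPI processes from machine-file lines, treating a
    bare hostname as one slot and honoring 'slots=N' when present."""
    total = 0
    for line in lines:
        line = line.split('#', 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        slots = 1
        for field in line.split()[1:]:
            if field.startswith('slots='):
                slots = int(field.split('=', 1)[1])
        total += slots
    return total

# count_mpi_slots(['node1 slots=4', 'node2', '# comment']) -> 5
```

With the count known before launch, the correct size can be handed to ScanWindow directly instead of being discovered during MPI initialization.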
Mainly for testing on DGX-2, but it could also be useful for general users?