lhcopt / lhcmask
Mask modules for LHC and HL-LHC
Home Page: http://cern.ch/lhcmask
The normalized emittance in the BEAM EXPERT SixTrack input is not used.
We suggest setting it to 1000 to avoid possible confusion.
When enable_imperfections or enable_knob_synthesis is activated in the run3 pymask, submodule_04e_s1_synthesize_knobs.madx calls errors/HL-LHC/corr_MB_ats_v4. This routine redefines the knobs used for matching the tunes and controlling the coupling. The new expressions of the knobs can be found in temp/MB_corr_setting.mad (see the following png).
while the initial expressions of the knobs from the optics are:
As a result, the following knobs are disconnected:
'qknob_1': {'lhcb1': 'dQx.b1_sq', 'lhcb2':'dQx.b2_sq'},
'qknob_2': {'lhcb1': 'dQy.b1_sq', 'lhcb2':'dQy.b2_sq'},
'cmrknob': {'lhcb1': 'CMRS.b1_sq', 'lhcb2':'CMRS.b2_sq'},
'cmiknob': {'lhcb1': 'CMIS.b1_sq', 'lhcb2':'CMIS.b2_sq'},
and the matching of the tunes and the coupling correction fail.
Hello,
When the user selects only the HO BB lenses in the bb_dfs dataframe and removes all LR lenses, an error occurs when generating the BEAM expert block in fc.3 (generate_sixtrack_input). Although there are workarounds, the possibility that sxt_df_4d is empty should also be handled.
Thanks :)
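A minimal sketch of such a guard (the helper name and record fields are assumptions, not the actual pymask code):

```python
def sixtrack_4d_block(sxt_df_4d_rows):
    """Build the 4D (long-range) part of the SixTrack BB input from a list
    of lens records; return an empty block instead of crashing when the
    user kept only HO lenses and the 4D table is empty."""
    if not sxt_df_4d_rows:  # empty 4D table: nothing to write
        return []
    return [row['name'] for row in sxt_df_4d_rows]
```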
Discuss with @freddieknets the on_disp OFF/ON between the BB module and the error module.
This could be introduced to keep the file size small enough (especially in the case of extensive studies).
In light of Massimo's recent comments that the tools could be made applicable to several machines, two steps should be taken:
move the central repositories from /afs/cern.ch/eng/lhc/optics/simulation_tools to /afs/cern.ch/eng/tracking-tools (Massimo is obtaining the adequate rights for this)
rename the repositories on GitHub to show their non-dependence on the LHC:
lhcmask -> trackingmask
lhcerrors -> trackingerrors
lhctoolkit -> trackingtools
lhcmachines -> trackingmachines
(alternative names are welcome)
This is not trivial since, as far as I understood, the forks need to be adapted as well.
But exactly for this reason we shouldn't postpone it too long.
In SixDesk, %EMIT_BEAM is given in µm.
This means that in the mask this needs to be accounted for, either in the main mask file or in module_01.
I propose to do it in module_01, such that the user specifies the emittance in micrometers (which is also how we quote it colloquially).
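A one-line conversion sketch for module_01 (the function name is illustrative, not an existing mask variable):

```python
def normalized_emittance_m(emit_beam_um):
    """Convert the normalized emittance from micrometers (the SixDesk
    %EMIT_BEAM convention) to meters for internal use in the mask."""
    return emit_beam_um * 1e-6
```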
Hi,
there seems to be a bug if one uses the BB lenses installed in MAD-X with the python BB modules.
NOTA BENE
The offending line is
Line 273 in 8505216
The point is that XMA/YMA is wrongly referred to the weak-beam orbit instead of the reference orbit (aka the ideal orbit in the MAD-X manual).
Investigations are ongoing for the fix:
For now we use the pysixtrack line generated from the sixtrack input.
To be sorted out...
Ilias points out that in
the ';' is missing.
Update the knobs of the config.yaml to be lower case (e.g., dQx.b1_sq -> dqx.b1_sq)
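A sketch of the renaming, assuming the nested knob mapping shown earlier on this page (the helper name is ours):

```python
def lowercase_knob_names(knob_config):
    """Return a copy of the knob mapping with all MAD-X knob names
    lower-cased, since cpymad exposes MAD-X globals in lower case."""
    return {knob: {beam: name.lower() for beam, name in beams.items()}
            for knob, beams in knob_config.items()}
```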
In general it is not sufficient to do checks only when the optics is produced because:
Ideally I would do the tests at the very end of pymask, but:
The final orbit rematch breaks the knobs (this should be modified)
We have only one sequence at that point (difficult to compute separations or luminosities). Maybe we could always set up both beams (two MAD instances)? The overhead would be significant (but we could use two cores)
In the presence of machine imperfections, we might have to open the tolerances quite a bit
If needed, we could have two sets of tests:
As much as possible we should make automatic tests, not relying on visual inspection by the user.
Automatic checks for knob behaviour
When doing parametric scans it is good to produce some "color plots" à la Sofia with final values from twiss (with bb on/off)
Errors should not pass silently:
All the above focuses on linear optics:
The beam-beam of B4 looks reasonable but has never been quantitatively validated.
In fact, also the bb in B1 could profit from some more systematic checks.
Should we also use tracking in the tests (pysixtrack/sixtracklib)?
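One way to make the knob checks automatic rather than visual is to compare each measured response against its expected value within a tolerance; a machine-agnostic sketch (all names and numbers are placeholders):

```python
def check_knob_response(measured, expected, rel_tol=0.05):
    """True if the measured response (e.g. tune shift per unit knob)
    agrees with the expected one within a relative tolerance."""
    if expected == 0:
        return abs(measured) < rel_tol
    return abs(measured - expected) / abs(expected) < rel_tol

def failing_knobs(responses, expectations, rel_tol=0.05):
    """responses/expectations: dicts knob_name -> value.
    Return the list of knobs whose response is off."""
    return [kk for kk in expectations
            if not check_knob_response(responses.get(kk, 0.0),
                                       expectations[kk], rel_tol)]
```

With imperfections switched on, rel_tol could be opened up as discussed above.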
A few points that should be addressed soon:
It would be useful to have a 'b2_with_bb' and 'b2_without_bb' mode in order to have the MAD-X Survey and Twiss in the same reference frame as for b1. Whilst it is not a physical mode, it is useful for some calculations, for example computing the beam-beam interaction geometry.
Now that the tracking tools no longer have to be found in /afs/cern.ch/eng/tracking-tools, it would be useful to write in the README that a Pymask installation is required inside the lhcmask directory to find the tracking tools.
I would check that the integer and fractional tunes are indeed integer and fractional (just to be a bit more robust, since otherwise wrong coupling may appear).
Example:
def coupling_measurement(mad, qx_integer, qy_integer,
                         qx_fractional, qy_fractional,
                         tune_knob1_name, tune_knob2_name,
                         sequence_name, skip_use):
    import numpy as np  # This could be omitted in the package

    assert isinstance(qx_integer, int), f"ERROR: qx_integer={qx_integer} should be an integer."
    assert isinstance(qy_integer, int), f"ERROR: qy_integer={qy_integer} should be an integer."
    assert 0 <= qx_fractional <= 1, f"ERROR: qx_fractional={qx_fractional} should be >=0 and <=1."
    assert 0 <= qy_fractional <= 1, f"ERROR: qy_fractional={qy_fractional} should be >=0 and <=1."

    print('\nStart coupling measurement...')

    if not skip_use:
        mad.use(sequence_name)

    # Store present values of tune knobs
    init_value = {}
    for kk in [tune_knob1_name, tune_knob2_name]:
        try:
            init_value[kk] = mad.globals[kk]
        except KeyError:
            print(f'{kk} not initialized, setting 0.0!')
            init_value[kk] = 0.

    # Try to push tunes onto the diagonal
    qmid = (qx_fractional + qy_fractional) * 0.5
    qx_diagonal = qx_integer + qmid
    qy_diagonal = qy_integer + qmid
    mad.input(f'''
        match;
        global, q1={qx_diagonal}, q2={qy_diagonal};
        vary, name={tune_knob1_name}, step=1.E-9;
        vary, name={tune_knob2_name}, step=1.E-9;
        lmdif, calls=50, tolerance=1.E-10;
        endmatch;
        ''')

    # Measure closest tune approach
    mad.twiss()
    qx_tw = mad.table.summ.q1
    qy_tw = mad.table.summ.q2
    cta = float(np.abs(2*(qx_tw - qy_tw) - np.round(2*(qx_tw - qy_tw)))/2)

    # Restore initial values of tune knobs
    for kk in [tune_knob1_name, tune_knob2_name]:
        mad.globals[kk] = init_value[kk]

    print('\nDone coupling measurement.')
    return float(cta)
I noted that if one compares the globals before and after the measurement, the variable tar (corresponding to the final penalty function of the matching, the target) changes.
from copy import deepcopy

mad.globals['tar'] = 0  # to give an initial state
globals_before = deepcopy(dict(mad.globals))
coupling_measurement(mad, qx_integer=62, qy_integer=60,
                     qx_fractional=.313, qy_fractional=.318,
                     tune_knob1_name='dqx.b1_sq',
                     tune_knob2_name='dqy.b1_sq',
                     sequence_name='lhcb1', skip_use=False)
globals_after = deepcopy(dict(mad.globals))
print('The following variables are created:')
print(list(set(globals_after) - set(globals_before)))
print('The following variables are modified:')
print([kk for kk in globals_after if globals_after[kk] != globals_before[kk]])
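One way to keep such measurements side-effect free is to snapshot the globals of interest and restore them afterwards; a sketch using a plain mapping as a stand-in for mad.globals:

```python
from contextlib import contextmanager

@contextmanager
def restored_globals(globals_like, names):
    """Save the listed globals before the enclosed block and restore them
    afterwards, so side effects such as a modified 'tar' do not leak out."""
    saved = {kk: globals_like[kk] for kk in names if kk in globals_like}
    try:
        yield globals_like
    finally:
        for kk, vv in saved.items():
            globals_like[kk] = vv
```

Variables created (rather than modified) inside the block would still need explicit handling.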
linear_normal_form.py should be moved to xpart.
The methods find_closed_orbit_from_tracker and compute_R_matrix_finite_differences could go to xtrack.
The det(R) == 1 check and the symplectification should go in compute_linear_normal_form.
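The det(R) check can be folded into the full symplecticity condition R^T S R = S; a numpy sketch (the function name is ours):

```python
import numpy as np

def symplectic_error(RR):
    """Maximum deviation of a (2n x 2n) matrix from symplecticity,
    max |R^T S R - S|; for a valid one-turn matrix this is ~0 and
    implies det(R) == 1."""
    n = RR.shape[0] // 2
    S = np.zeros_like(RR, dtype=float)
    for ii in range(n):
        S[2 * ii, 2 * ii + 1] = 1.0
        S[2 * ii + 1, 2 * ii] = -1.0
    return float(np.max(np.abs(RR.T @ S @ RR - S)))
```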
_set_orbit_dependent_parameters_for_bb has been ported to xfields and should be removed from pymask. NB: the function from xsuite does not re-enable the bb lenses; this still needs to be done by pymask explicitly.
I would like to remove the check against sixtrack from the xline example; we can use the pymask test for this. It needs to be extended with element-by-element (ebe) functionalities.
Error running 000_pymask.py, which returns the following:
File ~/Apps/lhcmask/pymask/pymasktools.py:456, in generate_xsuite_line(mad, seq_name, bb_df, optics_and_co_at_start_ring_from_madx, folder_name, skip_mad_use, prepare_line_for_xtrack, steps_for_finite_diffs)
452 xf.configure_orbit_dependent_parameters_for_bb(tracker,
453 particle_on_co=particle_on_tracker_co)
455 _disable_beam_beam(tracker.line)
--> 456 RR_finite_diffs = tracker.compute_one_turn_matrix_finite_differences(
457 particle_on_tracker_co,
458 **steps_for_finite_diffs)
459 _restore_beam_beam(tracker.line)
462 (WW_finite_diffs, WWInv_finite_diffs, RotMat_finite_diffs
463 ) = xp.compute_linear_normal_form(RR_finite_diffs)
TypeError: compute_one_turn_matrix_finite_differences() got an unexpected keyword argument 'dx'
The error is tracked down to
Lines 456 to 458 in b4b6d2a
where **steps_for_finite_diffs should be replaced by steps_for_finite_diffs, i.e. the function expects a dictionary, not an unpacked version of it. See:
https://github.com/xsuite/xtrack/blob/b090594d7564fec044361aeee29a661308ea6803/xtrack/twiss_from_tracker.py#L54-L68
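The mismatch can be reproduced in isolation: the function takes the steps as a single dict argument, so unpacking the dict turns each key into an unexpected keyword argument (the stand-in below only mimics the calling convention):

```python
def compute_matrix(particle_on_co, steps_for_finite_diffs):
    """Stand-in with the same calling convention as
    compute_one_turn_matrix_finite_differences: one dict argument."""
    return steps_for_finite_diffs['dx']

steps = {'dx': 1e-7}
compute_matrix(None, steps)        # correct: pass the dict itself
try:
    compute_matrix(None, **steps)  # wrong: TypeError on keyword 'dx'
except TypeError as exc:
    print(exc)
```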
Hi,
In the "optics_specific_tools.py" file, it is possible to skip the makethin of the sequence by setting:
sliceFactor=0
This will trigger the following:
else:
warnings.warn('The sequences are not thin!')
However, this will lead to an error since the warnings Python package is not imported anywhere (at least to my knowledge).
Simply adding:
import warnings
to the "optics_specific_tools.py" file solves this issue. But I think it would be worth having it there by default in order to avoid this small error.
Thanks a lot.
Hi,
following the discussion at the last BB meeting, I think we could provide a simple way to switch between AFS and EOS with two modifications:
system, "ln -fns /afs/cern.ch/eng/lhc/optics/simulation_tools simulation_tools";
! system, "ln -fns /eos/project/a/abpdata/lhc/optics/simulation_tools simulation_tools";
G.
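On the Python side the same switch could be automated by probing which mount is reachable (paths copied from the lines above; the helper is a sketch):

```python
import os

OPTICS_BASES = [
    '/afs/cern.ch/eng/lhc/optics/simulation_tools',
    '/eos/project/a/abpdata/lhc/optics/simulation_tools',
]

def pick_optics_base(candidates=OPTICS_BASES):
    """Return the first existing directory; fall back to the last candidate
    so the resulting symlink at least points somewhere explicit."""
    for path in candidates:
        if os.path.isdir(path):
            return path
    return candidates[-1]
```

The selected path could then be linked with the same "ln -fns" call as in the mask.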
Otherwise they cannot be used properly in cpymad
@giadarol, @freddieknets, @rdemaria
in case the clipping of the corrector strengths impacts Q/Q', should we stop the execution by raising an error?
For some reason it needs to be repeated in module_02_lumilevel.madx and in module_03_beambeam.madx to get the same result as the reference mask.
For ions, the charge of the strong beam (BB lenses) should be multiplied by the Z of the strong-beam species (for protons this is irrelevant since Z=1).
This can be done with on_bb_charge, but we would like to propose adding the relevant parameters explicitly in the config.py (for example, the rest mass and charge of the weak beam and the charge of the strong beam).
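A sketch of the proposed scaling (the function and parameter names are our suggestion, not existing config.py entries):

```python
def strong_beam_lens_charge(bunch_population, z_strong=1):
    """Effective charge (in units of e) of a strong-beam BB lens: the bunch
    population multiplied by the charge number Z of the strong-beam species
    (Z=1 for protons, so the factor is irrelevant there; Z=82 for Pb)."""
    return bunch_population * z_strong
```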
To ensure modularity, we need at least the following:
Other requirements that you think of can be added to this issue as comments.
Hi,
one could consider adding to the sequence and to the optics the variables
Hi,
while testing the examples (examples/hl_lhc_collision) I ran it
All 3 outputs are different (by a "small" but "unexpected" amount). And, by the way, they are significantly different from the output of the repository in /afs/cern.ch/eng/lhc/optics/simulation_tools/modules/examples/hl_lhc_collision/
(obtained on Gianni's machine using /afs/cern.ch/user/m/mad/bin/mad, but one needs to verify with which version it was obtained, probably not the latest one).
So I suggest clarifying the numerical dependence of the MAD-X results on the platform.
Cheers,
G.
In module_01:
!Avoid crabbing more than the crossing angle
if ( abs(on_crab1)>abs(on_x1) && on_crab1 <> 0) {on_crab1 = abs(on_crab1)/on_crab1 * abs(on_x1);}
if ( abs(on_crab5)>abs(on_x5) && on_crab5 <> 0) {on_crab5 = abs(on_crab5)/on_crab5 * abs(on_x5);}
I discussed with Gianni, and we believe that this limiting is unneeded. The user sets both the crabbing angle and the crossing angle, so they are aware of what they are doing (they might even want to simulate over-crabbing on purpose, e.g. if they do not know the value of the crossing angle very well).
As an alternative, we might output a warning when over-crabbing.
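A pure-Python sketch of the warning variant, mirroring the condition of the MAD-X if above (the helper name is ours):

```python
import warnings

def warn_if_overcrabbing(on_crab, on_x, ip_label=''):
    """Warn, instead of silently clipping, when the crabbing angle exceeds
    the crossing angle (same condition as in module_01)."""
    if on_crab != 0 and abs(on_crab) > abs(on_x):
        warnings.warn(f'Over-crabbing{ip_label}: |on_crab|={abs(on_crab)} '
                      f'> |on_x|={abs(on_x)}')
        return True
    return False
```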
Hey!
just a minor note, but I would like to suggest not calling (mask-external) files from within external files, as this can quickly lead down a rabbit hole.
In the LHC job_tracking.mask one could run the mask "offline" by just going through the call and readtable commands in the main mask and copying the files locally.
In the HL-LHC mask this was broken by the file slhc/errors/Efcomp_MQXFbody.madx, which calls two other files at the end. Not only could these calls be placed directly in the main mask, but also just copying the three files locally would not work, as the calls also relied on slhc being properly linked.
This still seems to be true in this new mask version.
Any thoughts?
Hi,
I would also add
def knobs_df(my_df):
    '''
    Extract the knob list of a pandas DF (it assumes that the DF has a
    column called "knobs" containing lists).

    Args:
        my_df: a pandas DF (it assumes that the DF has a column called "knobs").

    Returns:
        A data frame of knobs.
    '''
    import itertools
    import numpy as np
    import pandas as pd

    aux = list(my_df['knobs'].values)
    aux = list(np.unique(list(itertools.chain.from_iterable(aux))))
    my_dict = {}
    for ii in aux:
        my_dict[ii] = {}
        filter_df = knob_df(ii, my_df)  # knob_df: helper introduced earlier in the thread
        my_dict[ii]['multiplicity'] = len(filter_df)
        my_dict[ii]['dependences'] = list(filter_df.index)
    return pd.DataFrame(my_dict).transpose().sort_values('multiplicity',
                                                         ascending=False)
to extract and sort "by-use" the knobs of a sequence or of the global space of MAD-X.
Cheers,
on_disp is deactivated:
https://github.com/lhcopt/lhcmask/blob/master/python_examples/run3_collisions_python/000_pymask.py#L377
if enable_imperfections is False and enable_knob_synthesis is also False, exec, crossing_restore; is never called and on_disp stays 0 even though on_disp=1 in config.py.
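A defensive sketch of a fix (the flag and knob names come from the issue; the helper itself is hypothetical):

```python
def finalize_on_disp(mad_globals, configuration,
                     enable_imperfections, enable_knob_synthesis):
    """Ensure on_disp ends up at its configured value even when neither
    imperfections nor knob synthesis triggers 'exec, crossing_restore;'."""
    if not enable_imperfections and not enable_knob_synthesis:
        mad_globals['on_disp'] = configuration['on_disp']
    return mad_globals['on_disp']
```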
We propose to move the modules to the optics repository.
Define a better strategy for the CC installation (presently done in module of the beam-beam).