HYM

1. Code Name: HYM
2. Code Category: Equilibrium, MHD, Hybrid
3. Primary Developer: Elena Belova
4. Other Developers and Users: Clayton Myers
5. Short description (one line if possible): 3D nonlinear, parallel, global stability code with flexible geometry, using an arbitrary orthogonal coordinate system
6. Computer Language (Fortran77, Fortran90, C, C++, etc) and approx # of lines: F90, MPPL preprocessor, about 10K lines
7. Type of input required (including files and/or output from other codes). Is there any special input preparation system (eg, GUI): An unformatted data file generated by the equilibrium solver or saved by HYM for restart, and a small formatted data file with run parameters
8. Type of output produced (including files that are read by other codes and sizes of large files and synthetic diagnostics): Unformatted dump file (restart), unformatted files for vector and scalar field data and for particle data; largest file size about 1.5 GB
9. Describe any postprocessors which read the output files: A few small Fortran 90 codes extract data from the unformatted output files and convert them to IDL input files. An IDL GUI code and VisIt are used for visualization
10. Status and location of code input/output documentation: Input/output parameters are described in the code. A description of the code normalization and related details is available as a PDF
11. Code web site? No dedicated website; code information is available at http://w3.pppl.gov/~ebelova/frc.html
12. Is code under version control? What system? Is automated regression testing performed? No; the code is not under version control, and no automated regression testing is performed
13. One to two paragraph description of equations solved and functionality including what discretizations are used in space and time: Resistive MHD; Hall MHD; a hybrid model with fluid electrons and full-orbit kinetic ions; a newer version adds fluid ions and drift-kinetic electrons. Delta-f or full-f options are available. Spatial discretization uses 4th-order finite differences; the time advance uses a 2nd-order trapezoidal leap-frog scheme
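As a purely illustrative aid (not HYM source code), the following minimal Python sketch shows the kind of discretization named in item 13: a 4th-order central finite difference in space combined with a 2nd-order trapezoidal leap-frog time step, applied here to a simple 1D periodic advection equation. The test problem, grid size, time step, and variable names are assumptions chosen for brevity.

    import numpy as np

    def ddx4(u, dx):
        # 4th-order central difference with periodic boundaries
        return (-np.roll(u, -2) + 8 * np.roll(u, -1)
                - 8 * np.roll(u, 1) + np.roll(u, 2)) / (12.0 * dx)

    def rhs(u, c, dx):
        # du/dt = -c du/dx, a stand-in for the full MHD right-hand side
        return -c * ddx4(u, dx)

    nx, c = 256, 1.0
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    dx = x[1] - x[0]
    dt = 0.4 * dx / c
    u_old = np.exp(-100.0 * (x - 0.5) ** 2)   # u at the previous step
    u = u_old + dt * rhs(u_old, c, dx)        # one Euler step to start the leap-frog

    for _ in range(500):
        # leap-frog predictor: u*(n+1) = u(n-1) + 2 dt F(u(n))
        u_star = u_old + 2.0 * dt * rhs(u, c, dx)
        # trapezoidal corrector: u(n+1) = u(n) + (dt/2) [F(u(n)) + F(u*(n+1))]
        u_new = u + 0.5 * dt * (rhs(u, c, dx) + rhs(u_star, c, dx))
        u_old, u = u, u_new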
14. What modes of operation of code are there (eg: linear, nonlinear, reduced models, etc ): Linear or nonlinear, 3D or 2D, arbitrary geometry
15. Journal references describing code: Belova, E. V., et al., Phys. Plasmas 7, 4996 (2000); Belova, E. V., et al., Phys. Plasmas 11, 2523 (2004)
16. Codes it is similar to and differences (public version): M3D-K has a similar MHD/energetic-ion model, but HYM uses a full-orbit description for the ions rather than a gyrokinetic one.
17. Results of code verification and convergence studies (with references):
Conservation laws, particle orbits: Belova, E. V., et al., Phys. Plasmas 7, 4996 (2000); Belova, E. V., et al., Phys. Plasmas 10, 3240 (2003).
Stability studies benchmarked against published results: Belova, E. V., et al., Phys. Plasmas 7, 4996 (2000); Phys. Plasmas 8, 1267 (2001); Phys. Plasmas 10, 2361 (2003).
18. Present and recent applications and validation exercises (with references as available):
NSTX:
  1. Belova, E. V., invited talk, APS Division of Plasma Physics Meeting (2010).
  2. Belova E.V., N.N. Gorelenkov, C.Z. Cheng, and E.D. Fredrickson, Numerical Study of Instabilities Driven by Energetic Neutral Beam Ions in NSTX, in Proceedings of 30th European Physical Society (EPS) Conference on Controlled Fusion and Plasma Physics, St. Petersburg, Russia, July 2003; PPPL-3832.
  3. N. N. Gorelenkov, E. V. Belova, H. L. Berk, C. Z. Cheng, E. Fredrickson, W. Heidbrink, S. Kaye, G. Kramer, Phys. Plasmas 11, 2586 (2004).
SSX (Swarthmore College):
  1. Belova, E. V., R. C. Davidson, H. Ji, M. Yamada, C. D. Cothran, M. R. Brown, M. J. Schaffer, Nuclear Fusion 46, 162 (2006).
  2. C. Myers, E. V. Belova, M. R. Brown, T. Grey, C. D. Cothran, J. Fung, M. Schaffer, "Numerical simulations of the doublet compact torus configuration", manuscript in preparation for submission to Physics of Plasmas (2011).
  3. Cothran, C. D., J. Fung, M. R. Brown, M.J. Schaffer, E. Belova, "Spectroscopic flow and ion temperature studies of a large-s FRC", J. Fusion Energy 26, 37 (2007).
  4. Brown, M. R., C. D. Cothran, J. Fung, M. J. Schaffer, E. Belova, "Novel dipole trapped spheromak configuration", J. Fusion Energy 26, 37 (2007).
MRX:
  1. S. P. Gerhardt, E. V. Belova, M. Yamada, et al., "New inductive field-reversed configuration formation scheme utilizing a spheromak and solenoid induction", Physics of Plasmas 15, 032503 (2008).
  2. S. P. Gerhardt, E. V. Belova, M. Yamada, et al., "Inductive sustainment of oblate FRCs with the assistance of magnetic diffusion, shaping, and finite-Larmor radius stabilization", Physics of Plasmas 15, 022503 (2008).
  3. S. P. Gerhardt, E. V. Belova, M. Yamada, H. Ji, M. Inomoto, Y. Ren, and B. McGeeham, "New method for inductively forming an oblate field reversed configuration from a spheromak", Nucl. Fusion 48, 032001 (2008).
  4. S. P. Gerhardt, E. V. Belova, M. Yamada, H. Ji, M. Inomoto et al., "Inductive sustainment of a field-reversed configuration stabilized by shaping, magnetic diffusion, and finite-Larmor radius effects", Phys. Rev. Lett. 99, 245003 (2007).
  5. H. Ji, E. Belova, S. P. Gerhardt, and M. Yamada, "Recent advances in the SPIRIT (Self-organized Plasma with Induction, Reconnection, and Injection Techniques) concept", J. Fusion Energy 26, 93 (2007).
19. Limitations of code parameter regime (dimensionless parameters accessible): Lundquist numbers up to roughly S = 10^5 to 10^6 are accessible
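(For reference, and assuming the standard definition, which the original answer does not spell out: the Lundquist number is S = mu_0 L v_A / eta, where L is a characteristic length, v_A the Alfven speed, and eta the plasma resistivity; larger S means weaker resistive dissipation and correspondingly finer resolution requirements.)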
20. What third party software is used? (eg. Meshing software, PETSc, ...): None
21. Description of scalability: Scales up to 1000 processors for a 500 x 200 x 36 grid in MHD runs; scalability of hybrid runs still needs improvement
22. Major serial and parallel bottlenecks: Load balancing in particle runs
23. Are there smaller codes contained in the larger code? Describe: There are two separate equilibrium solvers, FRCIN for FRC simulations and TKIN for tokamak/ST simulations; both are Fortran 90, and TKIN is parallelized with MPI.
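The questionnaire does not describe the equilibrium models used in FRCIN or TKIN. Purely as a hedged illustration of what a fixed-boundary axisymmetric equilibrium solve of this kind typically involves, the Python sketch below relaxes a Grad-Shafranov-type equation, Delta* psi = -R^2 * j0, with psi = 0 on a rectangular (R, Z) boundary and a constant current profile j0. The grid sizes, profile choice, and variable names are assumptions, not taken from FRCIN or TKIN.

    import numpy as np

    # Illustrative fixed-boundary Grad-Shafranov-type relaxation (not FRCIN/TKIN):
    #   d2psi/dR2 - (1/R) dpsi/dR + d2psi/dZ2 = -R**2 * j0,  psi = 0 on the boundary
    nR, nZ = 65, 65
    R = np.linspace(0.2, 1.0, nR)          # keep R > 0 to stay off the axis
    Z = np.linspace(-0.5, 0.5, nZ)
    dR, dZ = R[1] - R[0], Z[1] - Z[0]
    RR = np.repeat(R[:, None], nZ, axis=1)
    j0 = 1.0

    psi = np.zeros((nR, nZ))
    source = -RR[1:-1, 1:-1] ** 2 * j0     # constant "current" source term
    diag = 2.0 / dR**2 + 2.0 / dZ**2
    for it in range(20000):
        psi_new = psi.copy()
        # Jacobi update of the interior points
        psi_new[1:-1, 1:-1] = (
            (psi[2:, 1:-1] + psi[:-2, 1:-1]) / dR**2
            + (psi[1:-1, 2:] + psi[1:-1, :-2]) / dZ**2
            - (psi[2:, 1:-1] - psi[:-2, 1:-1]) / (2.0 * dR * RR[1:-1, 1:-1])
            - source
        ) / diag
        if np.max(np.abs(psi_new - psi)) < 1e-10:
            psi = psi_new
            break
        psi = psi_new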
24. Supported platforms and portability: Distributed-memory computers using MPI; HYM has been ported to the PPPL cluster and various NERSC machines
25. Illustrations of time-to-solution on different platforms and for different complexity of physics, if applicable: Linear hybrid simulations of GAE modes in NSTX run for about 3-8 hours on 32-100 processors, while nonlinear simulations take about 20-40 hours; a high-resolution nonlinear MHD simulation of spheromak merging runs for about 3-4 hours on 1000 processors at NERSC.