TSC

1. Code Name: Tokamak Simulation Code (TSC)
2. Code Category: Free Boundary 2D Transport
3. Primary Developer: S. C. Jardin
4. Other Developers and Users: J. Chen, S. Kaye, L. P. Ku, C. Kessel, F. Poli, N. Pomphrey, R. Sayer, G. Bateman (Lehigh), Ding, Liu (EAST), Jang, Na (KSTAR), Nakamura (JT-60), Bandyopadhyay (India), Wu, Pan (HT-7U), Pautasso (ASDEX), Nassi, Sugiyama (IGNITOR)
5. Short description: Simulates equilibrium and profile evolution in a tokamak and solves the poloidal field circuit equations in the presence of conductors and plasma
6. Computer Language and approx # of lines: F90 ~ 40,000 lines
7. Type of input required. Is there any special input preparation system (eg, GUI): TEXT input file. Also can read coil current and magnetic files from NSTX or DIII-D. Can also import profiles from an existing TRANSP run using TRXPL.
8. Type of output produced: Writes the files JAYPHA and WIREFA for disruption studies, DIVHISA for divertor studies, LHCDOUA for lower hybrid studies, and EQDSKA for stability studies. Also produces a movie.cdf file.
9. Describe any postprocessors which read the output files: Code produces a tsc.cgm file that the graphics postprocessor ictrans uses to produce plots. JSOLVER reads the EQDSKA file. TWIR reads the JAYPHA and WIREFA files.
10. Status and location of code input/output documentation: The input is documented on the SVN site.
11. Code web site? http://w3.pppl.gov/topdac/tsc.htm
12. Is code under version control? What system? Is automated regression testing performed? The code is under SVN. No automated regression testing.
13. One to two paragraph description of equations solved and functionality including what discretizations are used in space and time: TSC is the Tokamak Simulation Code developed at PPPL and used extensively at Princeton and throughout the world. It can model the evolution of a free-boundary axisymmetric tokamak plasma on several different time scales. The plasma equilibrium and field evolution equations are solved on a two-dimensional Cartesian grid, while the surface-averaged transport equations for the pressures and densities are solved in magnetic flux coordinates. An arbitrary transport model can be used, including Coppi-Tang, GLF23, and MMM95. Neoclassical-resistivity, bootstrap-current, auxiliary-heating, current-drive, alpha-heating, radiation, pellet-injection, sawtooth, and ballooning-mode transport models are all included. As an option, circuit equations are solved for all the poloidal field coil systems, including the effects of induced currents in passive conductors. Realistic feedback systems can be defined to control the time evolution of the plasma current, position, and shape. The required voltages for each coil system can be output as part of a calculation. Vertical stability and control can be studied, and a disrupting plasma can be modeled. TSC can also be run in a "data comparison" mode, in which it reads specially prepared data files for NSTX or DIII-D experiments; for each of these, a special postprocessor directly compares TSC predictions with both magnetics and kinetics data for particular shots. In all modes, TSC calculates the ballooning-mode stability criterion internally, and it also writes files that are read by the PEST code to calculate ideal and resistive stability for low-n modes.
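For orientation, the two central field equations referred to above can be written schematically in LaTeX as follows. These are standard textbook forms; they omit plasma flow, source terms, and the surface averaging used in the actual TSC formulation, which is documented in reference [1] of item 15.

    % Axisymmetric free-boundary equilibrium (Grad-Shafranov equation) on the (R,Z) grid
    \Delta^{*}\psi \equiv R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\frac{\partial\psi}{\partial R}\right) + \frac{\partial^{2}\psi}{\partial Z^{2}}
      = -\mu_{0} R^{2}\frac{dp}{d\psi} - F\frac{dF}{d\psi}

    % Poloidal field circuit equations for coils and passive conductors
    \sum_{j} M_{ij}\,\frac{dI_{j}}{dt} + R_{i} I_{i} = V_{i}(t)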
14. What modes of operation of code are there (e.g., linear, nonlinear, reduced models, etc.): The code is normally run in a "transport" mode or in a "shape and position control" mode. It can be run without a plasma to test the structure model. The user can select from many different transport models and heating system options, and the code can also be run in the SWIM IPS framework for detailed heating and current-drive options. There is an option to specify or import density and/or temperature profiles.
15. Journal references describing code:
[1] Jardin, S. C., DeLucia, J. L., and Pomphrey, N., "Dynamic modeling of transport and positional control of tokamaks", J. Comput. Phys. 66, 481 (1986).
[2] Jardin, S. C., et al., "Modeling of post-disruptive plasma loss in PBX", Nuclear Fusion 27, 569 (1987).
[3] Jardin, Bell, and Pomphrey, "TSC Simulation of Ohmic Discharges in TFTR", Nuclear Fusion 33, 371 (1993).
[4] Jardin, Kessel, and Pomphrey, "Poloidal flux linkage requirements for ITER", Nuclear Fusion 34, 1145 (1994).
[5] Sayer, R. O., Peng, Y.-K. M., and Jardin, S. C., "TSC plasma halo simulation of a DIII-D vertical displacement episode", Nuclear Fusion 33, 969-978 (1993).
[6] Jardin, S. C., Schmidt, G. L., Fredrickson, et al., "A fast shutdown technique for large tokamaks", Nuclear Fusion 40, 923-933 (2000).
16. Codes it is similar to and differences: Similar to DINA, CORSICA, and free-boundary PTRANSP in what it is used for, but with some technical differences. Of these, TSC is the only code that solves the magnetic diffusion equation on a 2D grid, which gives it some advantages in calculating volt-second consumption and in modeling disruption halo-current physics (see the note below).
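As background for the volt-second remark above, poloidal flux consumption during current ramp-up is commonly split into inductive and resistive parts, with the resistive part often parameterized by the empirical Ejima coefficient C_E; this decomposition is standard and not specific to TSC:

    \Delta\Psi = \Delta\Psi_{\mathrm{ind}} + \Delta\Psi_{\mathrm{res}},
    \qquad \Delta\Psi_{\mathrm{res}} \approx C_{E}\,\mu_{0} R_{0} I_{p}

where R_0 is the plasma major radius and I_p the plasma current. A code that evolves the field on a 2D grid can evaluate the resistive contribution directly rather than relying on such a fit.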
17. Results of code verification and convergence studies: Basic verification and convergence studies are presented in reference [1] of item 15.
18. Present and recent applications and validation exercises:
[1] Liu, C. Y., Xiao, B. J., et al., "EAST plasma current and position prediction by TSC and experimental verification", Plasma Science and Technology 12, 156-160 (2010).
[2] Kessel, C. E., Campbell, D., Gribov, Y., et al., "Development of ITER 15 MA ELMy H-mode inductive scenario", Nuclear Fusion 49, 085034 (2009).
[3] Kessel, C. E., Giruzzi, G., Sips, A. C. C., et al., "Simulation of the hybrid and steady state advanced operating modes in ITER", Nuclear Fusion 47, 1274-1284 (2007).
[4] Takei, N., Nakamura, Y., et al., "Intermittent beta collapse after NBCD turn-off in JT-60U fully non-inductive reversed shear discharges", Plasma Physics and Controlled Fusion 49, 335-345 (2007).
19. Limitations of code parameter regime: None apparent
20. What third party software is used? NCAR Graphics (used, via the ictrans postprocessor, to display the tsc.cgm plot file).
21. Description of scalability: The code is presently used in serial mode, although work on parallelization has begun.
22. Major serial and parallel bottlenecks: The main bottleneck is the equilibrium calculation, which is done by dynamic relaxation. This could be sped up through parallelization, as illustrated in the sketch below.
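To illustrate why this step is a natural target for parallelization, here is a minimal Fortran 90 sketch of a single Jacobi-style relaxation sweep for an elliptic equation on a 2D grid. It is an illustration only, not code from TSC; the array names, uniform grid, and five-point stencil are assumptions, and TSC's dynamic-relaxation scheme differs in detail while sharing the same point-local update structure.

    program relax_demo
      implicit none
      integer, parameter :: nx = 128, nz = 128, nsweep = 1000
      real(8) :: psi(nx,nz), psi_new(nx,nz), src(nx,nz)
      real(8) :: h
      integer :: i, j, n

      h   = 1.0d0 / real(nx-1, 8)   ! uniform grid spacing (illustrative)
      psi = 0.0d0                   ! initial guess for the flux function
      src = 1.0d0                   ! placeholder right-hand side

      do n = 1, nsweep
         ! Each interior point is updated from the previous iterate only,
         ! so the nested loops are independent and could be distributed
         ! across threads or MPI ranks (e.g. an OpenMP directive on the j loop).
         do j = 2, nz - 1
            do i = 2, nx - 1
               psi_new(i,j) = 0.25d0 * ( psi(i+1,j) + psi(i-1,j) &
                                       + psi(i,j+1) + psi(i,j-1) &
                                       - h*h*src(i,j) )
            end do
         end do
         psi(2:nx-1, 2:nz-1) = psi_new(2:nx-1, 2:nz-1)
      end do

      print *, 'relaxation sweeps completed:', nsweep
    end program relax_demo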
23. Are there smaller codes contained in the larger code? Describe: No.
24. Supported platforms and portability: Runs at PPPL on the cluster and on STIX. Runs at NERSC and ORNL. Also, many foreign collaborators have installed it.
25. Illustrations of time-to-solution on different platforms and for different complexity of physics, if applicable: Typical runs take 20 minutes to several hours.