1. Code Name: TRANSP
2. Code Category: Transport code, whole device model, experimental data analysis system.
3. Primary Developer: D. McCune
4. Other Developers and Users: In CPPG: Rob Andre (MHD equilibrium solvers & free boundary model), Eliot Feibush (graphics & post-processing tools), K. Indireshkumar (past development on RF modules), Marina Gorelenkova (NUBEAM), C. Ludescher-Furth (production system), Xingqiu Yuan (predictive transport models; recently started). In addition, many physicists have made important contributions to the code's development over the years, especially Rich Hawryluk, Rob Goldston, and Mike Zarnstorff. As for users: there are 116 authorized users of the PPPL Fusion Grid TRANSP/PTRANSP service (with varying levels of activity), including 37 PPPL account holders (students and staff), 53 non-PPPL US users, and 26 international users. In the six years through FY-2010, about 25,000 runs were served in the production system, including 10,000 by overseas users. At PPPL, R. Budny has been the most consistent user, having run the code for a wide variety of simulation studies with results used in dozens of publications.
5. Short description (one line if possible): legacy core plasma oriented tokamak whole device model and transport code with detailed heating and current drive simulation.
6. Computer Language (Fortran77, Fortran90, C, C++, etc) and approx # of lines: The entire TRANSP system is about 2,400,000 lines of code counting comments (1,600,000 lines without comments), 95% Fortran, with some C and C++ libraries. In addition, Python and csh are used extensively for development and operational tools. The code has been converted to real*8 precision Fortran 90, although older sections still use fixed-form Fortran 77 source format and programming style. Much of the code (~50%) is auto-generated from specification files by Python scripts. These specifications define the code's inputs and outputs; the scripts allow inputs and outputs to be updated easily.
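The spec-driven code generation step can be illustrated with a minimal sketch. The specification format, variable names, and emitted module below are invented for illustration only; the real TRANSP specification files and generator scripts differ.

```python
# Minimal illustration of spec-driven code generation: a Python script
# reads a declarative description of code inputs and emits Fortran-90
# declarations plus a namelist group. All names here are illustrative,
# not the actual TRANSP specification format.

SPEC = [
    # (name, fortran type, default, description)
    ("nzones", "integer",      "20",    "number of radial zones"),
    ("tinit",  "real(kind=8)", "0.0d0", "simulation start time (s)"),
    ("ftime",  "real(kind=8)", "1.0d0", "simulation end time (s)"),
]

def generate_module(spec):
    """Emit a Fortran-90 module declaring each input with its default."""
    lines = ["module run_inputs", "  implicit none"]
    for name, ftype, default, desc in spec:
        lines.append(f"  {ftype} :: {name} = {default}  ! {desc}")
    names = ", ".join(name for name, *_ in spec)
    lines += [f"  namelist /inputs/ {names}", "end module run_inputs"]
    return "\n".join(lines)

generated = generate_module(SPEC)
print(generated)
```

Because the Fortran declarations, defaults, and namelist group are all derived from one specification, adding or renaming an input only requires editing the specification, not the generated source.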
7. Type of input required (including files and/or output from other codes). Is there any special input preparation system (eg, GUI)? Every TRANSP run takes a control namelist as input. This namelist in turn names time-dependent physics input data, as "Ufiles" or MDS+ signals, typically representing measured data from tokamak experiments (e.g. electron density and temperature vs. time and a radial flux coordinate). Sometimes the time-dependent input data comes from a model or another data source rather than from measurement. The code has been modified over the years to accept a wide variety of input signals. There is a GUI, preferred by perhaps 10% of current users, for namelist preparation. Tokamak sites often have GUIs for run data preparation and run launching; run data preparation has to be customized to each experiment due to variation of diagnostics and site-specific data handling.
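A control namelist of the kind described above might look like the following sketch. The group and variable names are illustrative placeholders, not actual TRANSP namelist variables:

```fortran
! Illustrative control-namelist fragment (placeholder names, not the
! real TRANSP namelist). It identifies the run and points at the
! time-dependent input signals the run will consume.
 &run_control
   shot_number = 12345        ! experiment shot to analyze (illustrative)
   time_start  = 0.10         ! analysis start time (s)
   time_end    = 1.50         ! analysis end time (s)
   ne_source   = 'NE.UFILE'   ! electron density vs. time and radius
   te_source   = 'TE.UFILE'   ! electron temperature vs. time and radius
 /
```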
8. Type of output produced (including files that are read by other codes and sizes of large files and synthetic diagnostics): The most widely used TRANSP output is a NetCDF file (with adaptation for MDS+ service) of ~1000 time-dependent scalar and profile signals representing the evolution of the plasma flux surface geometry and fields, temperatures and densities, particle, momentum, and heating sources, current drive, and transport, with a high level of detail. In addition, a record of the TRANSP namelist and time-dependent input data is preserved. At the user's discretion, additional outputs may be saved at selected time slices. Important examples of such additional outputs are the fast ion (neutral beam or fusion product) distribution functions from TRANSP's embedded NUBEAM fast ion simulation.
9. Describe any postprocessors which read the output files: rplot allows visualization of the main TRANSP time-dependent output; trxpl & trxpl2ps can extract Plasma States and other time-slice datasets from the archived time-dependent results data.
10. Status and location of code input/output documentation: The main description document for namelist and code use is actively maintained by TRANSP developers. The code web site also links to numerous related documents and publication lists.
11. Code web site: -- actively maintained, with links to all available user and input/output documentation.
12. Is code under version control? What system? Is automated regression testing performed? The source code is under svn control, hosted on, with on- and off-site developer access allowed. Regression testing is not automatic. There is a small number (4) of "standard" regression tests covering a range of features, which developers are asked to run and examine prior to major svn commit operations. There is some need for a more comprehensive regression testing capability, but it would be expensive to provide.
13. One to two paragraph descriptions of equations solved and functionality including what discretizations are used in space and time: The core plasma fluid transport and poloidal field diffusion equations are solved within a time-evolving flux surface geometry constructed from a series of axisymmetric MHD equilibrium solutions (usually with a prescribed boundary, although an option for free-boundary solutions is under construction). Numerical models and/or input data are provided for the heating, momentum, particle, and current sources affecting the transport equations. A time step hierarchy is provided so that slowly evolving (and expensive to evaluate) sources are updated as needed, i.e. less frequently than every transport time step. The transport equations are formulated over a 1d grid with a user-chosen, time-invariant number of radial zones evenly spaced in the square root of the normalized enclosed toroidal magnetic flux, with time derivative transformation terms introduced to deal with grid motion (relating normalized flux to actual flux). Source terms are computed over grids optimized for each source model and then interpolated to the transport solver's flux grid.
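The radial grid construction and source interpolation described above can be sketched as follows; the grid sizes and the Gaussian source profile are illustrative, not TRANSP's actual models:

```python
import numpy as np

# Transport-solver radial grid: zone boundaries evenly spaced in
# x = sqrt(Phi / Phi_boundary), the square root of the normalized
# enclosed toroidal flux. Sizes are illustrative.
nzones = 20
xb = np.linspace(0.0, 1.0, nzones + 1)   # zone boundaries in x
xc = 0.5 * (xb[:-1] + xb[1:])            # zone centers

# A source model computes its profile on its own (finer, nonuniform)
# grid, here an invented centrally peaked heating profile (a.u.) ...
x_src = np.linspace(0.0, 1.0, 101) ** 0.8
s_src = np.exp(-((x_src / 0.3) ** 2))

# ... which is then interpolated onto the solver's zone centers.
s_transp = np.interp(xc, x_src, s_src)
```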
14. What modes of operation of code are there (eg: linear, nonlinear, reduced models, etc): The code's predictive options (PTRANSP) use transport equations to predict evolution of fluid plasma quantities (density, temperature, toroidal angular momentum). The code's analysis options (traditional TRANSP) invert the equations to infer transport necessary to match observed evolution of fluid plasma profiles. There are many hybrid runs with analysis of some quantities and prediction of others.
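The "analysis" (inverse) mode can be illustrated by a one-dimensional cartoon: given a measured steady-state temperature profile and a known heat source, integrating the source gives the conducted heat flux, which is then inverted for an effective diffusivity. The slab geometry, profiles, and units below are illustrative, not TRANSP's actual flux-surface-averaged equations:

```python
import numpy as np

# Cartoon of transport analysis in slab geometry: from a "measured"
# steady temperature profile T(r) and known heat source S(r), compute
# the conducted flux q(r) = integral_0^r S dr', then invert
# q = -n * chi * dT/dr for the effective thermal diffusivity chi(r).
r = np.linspace(0.0, 1.0, 101)
n = np.ones_like(r)            # density (a.u.)
T = 1.0 - r**2                 # "measured" temperature (a.u.)
S = np.full_like(r, 2.0)       # heat source density (a.u.)

# cumulative trapezoidal integration of the source
q = np.concatenate(([0.0], np.cumsum(0.5 * (S[1:] + S[:-1]) * np.diff(r))))
dTdr = np.gradient(T, r)
chi = -q[1:] / (n[1:] * dTdr[1:])   # skip r=0, where q and dT/dr both vanish
```

For this constructed example the inferred chi is uniform (close to 1 in these units); with real data the inferred transport is the primary analysis result. Predictive mode runs the same relation forward: given chi, evolve T.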
15. Journal references describing code: Standard references: R.J. Hawryluk, "An Empirical Approach to Tokamak Transport", in Physics of Plasmas Close to Thermonuclear Conditions, ed. by B. Coppi, et al., (CEC, Brussels, 1980), Vol. 1, pp. 19-46; J. Ongena, M. Evrard, D. McCune, "Numerical Transport Codes", in the Proceedings of the Third Carolus Magnus Summer School on Plasma Physics, (Spa, Belgium, Sept 1997), as published in Transactions of Fusion Technology, March, 1998, Vol. 33, No. 2T, pp. 181-191.
16. Codes it is similar to and differences (public version): There are a number of similar codes in the community, with free boundary MHD equilibrium (TSC, Corsica) and prescribed boundary equilibrium (ONETWO, Baldur, JETTO). All these codes have broadly similar capabilities and have been in wide use since the 1970s. What has distinguished TRANSP historically has been a strong coupling of the code to workflows for analysis of experimental data. This broadened the TRANSP user community and led to wide availability of TRANSP datasets, readable by common software independent of originating tokamak. These have served as a de facto standard, facilitating collaboration across many experimental programs and with many applications (e.g. providing input to first principles 3d theory and computation studies).
17. Results of code verification and convergence studies (with references): There are no known publications specifically focused on this topic. Since TRANSP is a multi-physics whole device model, it is a very broad subject. Usually, verification studies have focused on an individual physics component (e.g. NUBEAM or an RF model, not TRANSP as a whole). However, there are numerous experimental papers that have examined sensitivity of TRANSP results to various assumptions. (DMC: Not sure how to choose among them)
18. Present and recent applications and validation exercises (with references as available): There are many experimental papers that make use of TRANSP analysis results with comparisons to various measurements. (DMC: Not sure how to choose among them; this is a query that should probably go to users).
19. Limitations of code parameter regime (dimensionless parameters accessible): TRANSP is constructed from numerous physics components, each with its own limitations and range of validity. Therefore, no simple answer is possible. It is probably more informative to note that TRANSP is an axisymmetric core-only tokamak code which assumes 1d nested flux surfaces and treats plasma parameters (temperature, density, and toroidal angular velocity) as flux surface invariants. It does not have explicit models for SOL plasmas or plasma material interfaces. It does not have the 3d geometry that would be needed for detailed modeling of alternative configurations such as stellarators, although TRANSP runs with special constraints have been used for a variety of stellarator and RFP studies.
20. What third party software is used? (eg. Meshing software, PETSc, ...): NetCDF, MDS+, lapack/blas, fftw, scalapack, superLU, pgplot.
21. Description of scalability: It is mainly a serial code with access to MPI scalable components (e.g. NUBEAM and the TORIC ICRF model).
22. Major serial and parallel bottlenecks: Serial bottlenecks: the running time of TRANSP analysis simulations is dominated by the heating and current drive source models. This is often still the case in predictive simulations, although there are predictive runs where the transport model and transport equation solvers become competitive with, or dominant over, the calculation of source terms. Parallel bottleneck: Amdahl's Law, i.e. the unparallelized portion of the multi-physics model will always dominate runtime in the limit of large numbers of processors. Many parts of TRANSP (1d models) appear to be too small to parallelize efficiently.
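The Amdahl's Law bound noted above can be made concrete with a one-line model (the 90% parallel fraction below is an illustrative number, not a measured TRANSP profile):

```python
# Amdahl's Law: if a fraction p of the runtime is parallelizable
# (e.g. the MPI source models) and the remaining 1 - p is serial,
# the overall speedup on nproc processors is bounded by 1/(1 - p)
# no matter how large nproc becomes.
def amdahl_speedup(p, nproc):
    """Overall speedup for parallel fraction p on nproc processors."""
    return 1.0 / ((1.0 - p) + p / nproc)
```

For example, with 90% of the work parallelized, 96 processors yield a speedup of just over 9x, and no processor count can push it past 10x.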
23. Are there smaller codes contained in the larger code? Describe: heating and current drive sources (NUBEAM, TORIC, LSC, TORAY, GENRAY, FRANTIC); transport (NCLASS, GLF23, MMM95, MMM07); mathematical libraries (FFTW, SuperLU, Scalapack) as used by various components; I/O libraries: NetCDF, MDS+. 135 libraries are referenced in the TRANSP build.
24. Supported platforms and portability: this is mainly a question of fortran compilers. The code generally runs well on systems using Pathscale fortran, Intel fortran, Lahey-Fujitsu fortran, gfortran. There have been problems running over PGI fortran and IBM fortran; these could be overcome if there is ever sufficient reason to do so.
25. Illustrations of time-to-solution on different platforms and for different complexity of physics, if applicable: Typical NSTX analysis TRANSP runs take of order one hour on a serial machine. "Time slice" analysis runs can finish in five minutes or less. On the other hand, many ITER simulations require 1-4 weeks. A recent JET simulation with high-resolution ICRF (TORIC) required 6 weeks on a single processor; this was reduced to 1.5 days using an MPI version of TORIC running over InfiniBand on 96 processors. Suitable equipment for production of such MPI-TORIC runs is being acquired.
26. Any additional comments: A major component of TRANSP's success has come not from the physics model itself, which is a reasonable model but certainly not at the cutting edge of research. Rather, the TRANSP effort has long benefited from careful attention to operational engineering issues: data management and data handling associated with the tokamak experimental data analysis workflow, with many practical benefits to users and much facilitation of collaboration. Also, the provision of a TRANSP production system, the "Fusion Grid", has been popular because it is well supported, i.e. user problems are seriously addressed.