1. Code Name: DEGAS 2
2. Code Category: Neutral particle transport
3. Primary Developer: D. P. Stotler
4. Other Developers and Users: C. F. F. Karney, M. E. Rensink (LLNL), EPSi Project, H. Matsuura (Osaka Pref. U.), A. Necas (Tri Alpha Energy), S. Banerjee (IPR, India), G. Fujun (Sichuan University, China), K. Tahiliani (IPR, India), S. Eilerman (U. Wisc.)
5. Short description (one line if possible): DEGAS 2 simulates the transport and behavior of neutral species generated by plasma wall interaction (as well as neutrals sourced externally or by volumetric electron-ion recombination).
6. Type of input required (including files and/or output from other codes): Input for the main simulation code is generated via a staged series of preprocessors, with multiple options for some of these. Almost all of the inputs to these preprocessors are plain, human readable text files. The products of the preprocessors are platform independent, binary netCDF files. More generally, the input required for the code consists of:

  • Geometry - The geometry is specified in terms of "cells", with each cell defined as the intersection of the volumes on the positive side of a set of quadratic surfaces. The "zones" are then made of one or more of these cells, with all input & output quantities assumed constant over "zones". In addition, the code requires connectivity information to allow tracking from one cell to the next. In practice, the geometry is built upon a mesh from a plasma transport code or a magnetic equilibrium, plus a specification of the plasma-facing surfaces of the device being modeled. The most widely used preprocessor can utilize input from the DG / Carre codes (originally part of SOLPS, now also distributed with DEGAS 2), read EQDSK files, and read DEGAS 2-specific files created by UEDGE, XGC0, and OSM / DIVIMP. Other preprocessors have been developed to generate input to this preprocessor directly from an EQDSK file or from VMEC output.
  • Plasma background - This primarily consists of the density, temperature, and flow velocity for all zones and for all plasma species in the problem. Options exist for obtaining these data from UEDGE, XGC0, OSM / DIVIMP, and the original DEGAS code. One of the preprocessors accepts a user written subroutine for determining these data via an analytic or semi-analytic prescription based on the geometry of the problem.
  • Problem physics - The identity of the "test" (usually neutral) and "background" (usually consisting of ions and electrons) species to be simulated is first specified. Both are selected from a larger, reference set of "species" (composed of "elements"). Then, the set of "reactions" (e.g., electron impact ionization of hydrogen atoms, hydrogen ion-atom charge exchange) between them is given; these are selected from the full list of reactions packaged with the code. Similarly, the "materials" used to represent the solid boundaries of the problem are provided. A list of "particle-material interactions" describes how collisions of the various particles with these material surfaces are to be simulated.
  • The requisite data for the "reactions" as a function of the test and local background properties are required, as well as a prescription for specifying the outcome of each interaction. The data (e.g., reaction rate, cross section, kinetic description of reaction products) for each reaction are contained in separate netCDF files. These files are generated as needed by one of a number of different preprocessors (and, in some cases, by hand). The most sophisticated of these codes reads data in the Aladdin format and computes the required Maxwellian-averaged rates.
  • The probabilities and outcomes of interactions between the test species and the material boundaries are specified in a manner analogous to that used for "reactions". That is, the particular data for each interaction are contained in a separate netCDF file that can be generated by any of a number of different means.
  • Neutral source - This typically consists of a plasma flux to a surface (e.g., from a plasma transport code), in which case the neutral source is that due to recycling. Sources due to gas puffs or volumetric ion-electron recombination can also be simulated. For the recycling and gas puff sources, the spatial distribution and strength of the sources must be input. A puff temperature governs the kinetic characterization of the gas puff source.
  • Tallies - The set of quantities to be computed by the code and their dependencies (on spatial coordinate, species) is controlled by a separate input file.
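As a concrete illustration of the "cell" definition above, a point lies inside a cell when it is on the positive side of every quadratic surface in the cell's defining set. The sketch below is hypothetical (the coefficient layout and function names are invented, not the actual DEGAS 2 data structures):

```python
# Hypothetical sketch of "cell = intersection of the positive sides of a
# set of quadratic surfaces"; the representation is illustrative only.

def quadric(coeffs, x, y, z):
    """Evaluate a general quadratic surface
    a*x^2 + b*y^2 + c*z^2 + d*x*y + e*y*z + f*z*x + g*x + h*y + i*z + j."""
    a, b, c, d, e, f, g, h, i, j = coeffs
    return (a*x*x + b*y*y + c*z*z + d*x*y + e*y*z + f*z*x
            + g*x + h*y + i*z + j)

def in_cell(surfaces, point):
    """A point is inside the cell if it lies on the positive side
    (taken here as >= 0) of every defining surface."""
    x, y, z = point
    return all(quadric(s, x, y, z) >= 0.0 for s in surfaces)

# Example cell: the unit ball, written as 1 - x^2 - y^2 - z^2 >= 0,
# intersected with the half-space z >= 0 (a plane is a degenerate quadric).
ball = (-1.0, -1.0, -1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0)
upper = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0)
cell = [ball, upper]
```

The actual code additionally stores the connectivity information described above, so that a particle exiting through one surface is handed directly to the neighboring cell rather than located by a global search.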

7. Type of output produced (including files that are read by other codes and size of files if large and synthetic diagnostics): The primary output file (up to a few hundred MB) from the main code is a set of unstructured arrays; the mapping of these data to the desired output quantities is contained in the tally netCDF file. These tallies can include the output from synthetic diagnostics; the most common of these are reconstructions of visible camera images.
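The camera-image synthetic diagnostic mentioned above amounts to integrating a simulated emissivity field along each pixel's line of sight. A minimal, generic sketch (the emissivity field and chord geometry are invented for illustration):

```python
import math

def line_integral(emissivity, start, direction, length, n_steps=200):
    """Integrate a scalar emissivity field along a straight chord with
    the midpoint rule; a synthetic camera image is one such integral
    per pixel."""
    ds = length / n_steps
    total = 0.0
    for k in range(n_steps):
        s = (k + 0.5) * ds
        x = start[0] + s * direction[0]
        y = start[1] + s * direction[1]
        z = start[2] + s * direction[2]
        total += emissivity(x, y, z) * ds
    return total

# Illustrative emissivity: a Gaussian blob of unit peak brightness.
def blob(x, y, z):
    return math.exp(-(x*x + y*y + z*z))

brightness = line_integral(blob, start=(-5.0, 0.0, 0.0),
                           direction=(1.0, 0.0, 0.0), length=10.0)
# For a chord through the blob's center the exact answer is sqrt(pi).
```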
8. Describe any postprocessors: The primary postprocessors are:

  • An interactive and / or script driven interface to the output data file that produces tabular text files that can be read by commercial graphing tools.
  • A "plotting" code that reads in the output file and problem geometry and then maps the output tallies onto a uniform 2-D or 3-D mesh, resulting in HDF or Silo format files that can be visualized by third party packages (e.g., VisIt).
  • A simple code for performing a statistical comparison of two output files.
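A statistical comparison of the kind the last postprocessor performs can be sketched as follows (a generic illustration, not the actual tool): two Monte Carlo estimates of the same tally agree, within statistics, when their difference is small compared with the combined standard error.

```python
import math

def tally_z_scores(means_a, errs_a, means_b, errs_b):
    """For each tally element, return the difference between two runs in
    units of the combined standard error.  |z| values mostly below ~2-3
    indicate statistical agreement."""
    zs = []
    for ma, ea, mb, eb in zip(means_a, errs_a, means_b, errs_b):
        sigma = math.hypot(ea, eb)      # combined standard error
        zs.append((ma - mb) / sigma if sigma > 0.0 else 0.0)
    return zs

# Example: two runs of the same problem with independent random seeds.
run1 = ([1.00, 2.50, 0.30], [0.02, 0.05, 0.01])
run2 = ([1.03, 2.48, 0.30], [0.02, 0.05, 0.01])
z = tally_z_scores(*run1, *run2)
```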

9. Status and location of code input/output documentation: The documentation consists of the User's Manual ( ) and the preambles to the various preprocessors, postprocessors, and other utility codes. Links to the latter are contained in the User's Manual.
10. Code web site?
11. Is code under version control? What system? Is automated regression analysis performed? The DEGAS 2 source code is maintained in a CVS repository on the PPPL Linux cluster. No automated regression testing is performed, although a set of examples exercising various aspects of code functionality are used for manual regression testing.
12. One to two paragraph description of equations solved and functionality including what discretizations are used in space and time:

DEGAS 2 in effect solves the Boltzmann equation for the neutral particle distribution function. The present version of the code is steady state. Were it to be time dependent, the minimum time step would be set by a typical transit time for a neutral particle across a "zone". In principle, these zones can be made arbitrarily small. In practice, their size is determined either by the spatial resolution of the input plasma data or by the computational time required to attain a particular level of statistical precision in the results. Most commonly DEGAS 2 is run in an axisymmetric mode in which the neutral particles are tracked in 3-D, but all output quantities are averaged over toroidal angle. The bulk of the fully 3-D simulations carried out to date have been set up as variants of symmetric (linearly or toroidally) geometries. The most extreme example to date has been the simulation of gas flow through the entire Alcator C-Mod vacuum vessel, including the divertor plenum regions and vertical ports.
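Schematically, the steady-state kinetic equation solved for each neutral test-species distribution function can be written as (the specific source and collision terms depend on the reactions included in a given problem):

```latex
\mathbf{v} \cdot \nabla_{\mathbf{x}} f(\mathbf{x},\mathbf{v})
  = S(\mathbf{x},\mathbf{v})
  - \nu_{\mathrm{ion}}(\mathbf{x})\, f(\mathbf{x},\mathbf{v})
  + C_{\mathrm{cx}}[f] + C_{nn}[f],
```

where S is the neutral source (recycling, gas puff, or volumetric recombination), \nu_{\mathrm{ion}} is the local ionization frequency, and C_{\mathrm{cx}}, C_{nn} represent the charge-exchange and (optional) neutral-neutral collision terms.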

13. What modes of operation of code are there (e.g., linear, nonlinear, reduced models, etc.):

The most common mode of operation is a direct, linear calculation with fixed input. Nonlinear problems (through the inclusion of neutral-neutral collisions) are solved approximately via an iterative technique; to conserve energy and momentum in this case, the neutral-neutral collisions must be described by a Krook operator. An iterative plasma-neutral solution capability is available through coupling to the UEDGE and XGC0 plasma transport codes.
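The Krook (BGK) operator referred to above has the schematic form

```latex
C_{nn}[f] = \nu \left( f_M[n, \mathbf{u}, T](\mathbf{v}) - f(\mathbf{v}) \right),
```

where the Maxwellian f_M is constructed from the density n, flow velocity \mathbf{u}, and temperature T obtained as moments of f itself; relaxing toward a Maxwellian with the distribution's own moments is what allows particle number, momentum, and energy to be conserved.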

14. Journal references describing code: The primary and original reference is: D. P. Stotler and C. F. F. Karney, Contrib. Plasma Phys. 34, 392 (1994). Additional descriptive information of newer capabilities is contained in application related articles listed below.
15. Codes it is similar to and differences (public version): EIRENE - Monte Carlo code widely used in conjunction with the B2 fluid plasma transport code, in particular for ITER divertor simulations. Represents the state-of-the-art in terms of physics content, including radiation transport and trapping. NIMBUS - JET based Monte Carlo code used in conjunction with the EDGE2D fluid plasma transport code. Has limited manpower for support and development.
16. Results of code verification and convergence studies (with references):

First, DEGAS 2 is distributed with a set of test routines used to check the code, run environment, and input data:
  • System dependent routines,
  • Random number generator,
  • Atomic physics reactions,
  • Plasma-material interactions,
  • Source specification.
The geometry is rigorously tested as part of the setup phase. Numerous assertions are spread throughout the code to catch unanticipated situations and errors in input.
Integrated verification tests of DEGAS 2 that have been performed include:
• Against analytic solutions,
  • Escape probability,
  • 1-D analytic fluid model (User's Manual),
  • Couette flow (User's Manual),
  • Nonlinear relaxation of distribution function,
  • Gas flow through a pipe in molecular flow and transition regimes [D. P. Stotler and B. LaBombard, J. Nucl. Mater. 337-339, 510 (2005)],
  • Temporal resolution provided by the single-state collisional radiative model used for helium [D. P. Stotler, J. Boedo, B. LeBlanc, R. J. Maqueda, and S. J. Zweben, J. Nucl. Mater. 363-365, 686 (2007)].
• Against other codes,
  • Original DEGAS,
  • EIRENE (User's Manual),
  • KN1D.
A large number of less involved and not publicly documented tests (e.g., of conservation) have been carried out over the years.
17. Present and recent applications and validation exercises (with references as available):
The principal validation tests of DEGAS 2 include:
  • Simulation of the Balmer-alpha spectrum in TFTR - This was actually performed with the original DEGAS code, but all of the requisite models and capabilities were carried over to DEGAS 2. The level of agreement was limited by our imperfect knowledge of the experimental details and of the relevant atomic physics processes. [D. P. Stotler, C. H. Skinner, R. V. Budny, A. T. Ramsey, D. N. Ruzic, and R. B. Turkot, Phys. Plasmas 3, 4084 (1996)]
  • Alcator C-Mod divertor baffling experiments - Model used to specify the input plasma data was found to be woefully inadequate. [D. P. Stotler, C. S. Pitcher, C. J. Boswell, T. K. Chung, B. LaBombard, B. Lipschultz, J. L. Terry, and R. J. Kanzleiter, J. Nucl. Mater. 290-293, 967 (2001); D. P. Stotler, C. S. Pitcher, C. J. Boswell, B. LaBombard, J. L. Terry, J. D. Elder and S. Lisgo, in Atomic Processes in Plasmas, 13th APS Topical Conference on Atomic Processes in Plasmas, (Gatlinburg, Tennessee, April 22-25, 2002) (American Institute of Physics, Melville, New York, 2002), p. 251-260]
  • Simulation of gas conductance through Alcator C-Mod divertor - Two experimental situations were simulated, but the resulting code-experiment deviations were of opposite sign. Potential explanations have not been pursued. [D. P. Stotler and B. LaBombard, J. Nucl. Mater. 337-339, 510 (2005)]
  • Steady-state modeling of the gas puff imaging diagnostic - Primary limitation is in specifying a suitable time average of the input plasma data. [D. P. Stotler, J. Boedo, B. LeBlanc, R. J. Maqueda, and S. J. Zweben, J. Nucl. Mater. 363-365, 686 (2007)]

Other recent applications include:
  • 3-D simulations of gas puff imaging experiments on Alcator C-Mod [J. L. Terry, S. J. Zweben, M. V. Umansky, I. Cziegler, O. Grulke, B. LaBombard, and D. P. Stotler, J. Nucl. Mater. 390-391, 339-342 (June 2009); S. J. Zweben, B. D. Scott, J. L. Terry, B. LaBombard, J. W. Hughes, and D. P. Stotler, Phys. Plasmas 16, 082505 (August 2009)]
  • Simulations of molecular hydrogen in the Princeton FRC [D. R. Farley, D. P. Stotler, D. P. Lundberg, and S. A. Cohen, J. Quant. Spec. Rad. Trans. 112, 800 (2011)]
  • 3-D simulations of diffusive lithium evaporation in NSTX [D. P. Stotler, C. H. Skinner, W. R. Blanchard, P. S. Krstic, H. W. Kugel, H. Schneider, and L. E. Zakharov, J. Nucl. Mater. (in press)]
  • Coupled XGC0 - DEGAS 2 simulations of cold ions in plasma edge due to recycling [W. Wan, S. E. Park, Y. Chen, G.-Y. Park, C. S. Chang, and D. Stotler, Phys. Plasmas (in press)]
18. Limitations of code parameter regime (dimensionless parameters accessible): A primary limitation is the assumption that the plasma background can be described by a drifting Maxwellian distribution (requires collision mean free path / parallel connection length << 1). This is not an intrinsic limitation of the algorithm; in principle, an arbitrary plasma distribution function can be specified at startup or as part of the plasma specification. Second, the set of plasma-material interactions presently available is limited by our understanding and the lack of first-principles models. Again, this is not an intrinsic limitation. Third, the treatment of neutral-neutral collisions uses a Krook collision operator with constant cross section to permit particle number, momentum, and energy to be conserved. A consequence of this assumption is that heat transport by these neutral-neutral collisions will not be accurate. More generally, the Monte Carlo algorithm is inefficient in high density situations in which the mean free path for charge exchange / elastic scattering is much smaller than the zone size. At present, the code computes the steady state solution. However, full time dependence is currently being added; limited time dependent simulations have been performed in the past. Sputtering is not presently available as an option within the set of plasma-material interactions. Since sputtering effectively represents a separate source of neutral species, additional code development is needed to incorporate it into the code.
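The high-density inefficiency noted above can be quantified with a simple mean-free-path estimate. The numbers below are illustrative placeholders (the rate coefficient in particular is not taken from the DEGAS 2 database):

```python
def mean_free_path(n_i, rate_coeff, v_neutral):
    """Neutral mean free path against charge exchange:
    lambda = v / (n_i * <sigma*v>)."""
    return v_neutral / (n_i * rate_coeff)

# Illustrative edge-plasma numbers (SI units):
n_i = 1.0e19            # ion density, m^-3
sigv_cx = 3.0e-14       # placeholder CX rate coefficient, m^3/s
v0 = 2.0e4              # few-eV hydrogen atom speed, m/s
lam = mean_free_path(n_i, sigv_cx, v0)   # ~ 0.07 m here

# The Monte Carlo cost grows roughly with the number of collisions per
# zone crossing, i.e. with zone_size / lam once lam << zone_size.
zone = 0.01             # zone size, m
collisions_per_zone = zone / lam
```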
19. What third party software is used ? (eg. Meshing software, PETSc, ...):
  • Gnu Make - for compilation.
  • netCDF - used as the primary device independent binary file format for input and output files.
  • FWEB - for code documentation and preprocessor macros. Note that the DEGAS 2 source code can be "tangled" into F77 or F90 compilable code.
  • Triangle - Jonathan Shewchuck's Triangle code is used by one of the geometry setup routines.
  • Silo / HDF - Output files for use with third-party visualization packages (e.g., VisIt) can be generated in either of these formats.
  • NCAR - The Carre mesh generation code (used primarily with SOLPS, but now also distributed with DEGAS 2) can generate NCAR-based output, if desired.
  • Motif - Needed by the DG graphical user interface geometry setup code (borrowed from SOLPS).
20. Description of scalability: In principle, the Monte Carlo algorithm used by DEGAS 2 should scale very effectively. Good scaling has been demonstrated on tens of processors. Scaling to many more processors with the existing code appears to be limited, presumably by inter-processor communication or data consolidation at the end of the run.
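The scaling behavior described here is consistent with a simple model in which the flight-tracking work divides perfectly across processors but the end-of-run consolidation does not. The model and its numbers are illustrative, not measured DEGAS 2 data:

```python
def speedup(p, serial_frac):
    """Amdahl's law: speedup on p processors when a fraction serial_frac
    of the work (e.g., end-of-run data consolidation) cannot be
    parallelized."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / p)

# Even a 1% serial fraction caps the achievable speedup at 100x, which
# is consistent with near-ideal scaling on tens of processors and
# degraded scaling well beyond that.
s30 = speedup(30, 0.01)       # ~23x on 30 processors
s1000 = speedup(1000, 0.01)   # ~91x on 1000 processors
```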
21. Major serial and parallel bottlenecks: See the previous question.
22. Are there smaller codes contained in the larger code? Describe. DEGAS 2 is not a single code, but a complex package of setup, test, simulation, and postprocessing tools. Which ones are needed depends on the user's objective and familiarity with DEGAS 2. A comprehensive list is contained in the User's Manual ( ).
23. Supported platforms and portability: Historically, DEGAS 2 has been run on a wide variety of platforms (Sun, DEC Alpha, SGI, IBM AIX, HP). However, the ready availability of Linux clusters has resulted in almost exclusive use of the Linux OS (both 32- and 64-bit) over the last several years. Compilers recently tested include Portland Group (F77), Pathscale (F90), and gfortran (F90). Virtually all system- and compiler-dependent code is confined to two files, although the F90 version may require no modifications whatsoever. Execution on systems without FWEB is facilitated by a remote compilation capability.
24. Illustrations of time-to-solution on different platforms and for different complexity of physics, if applicable: Linear problems with simple geometry (symmetric) and relatively long mean free paths (compared with mesh cell sizes) can be run on a single Linux processor in a matter of minutes. Linear, 3-D problems with high resolution meshes (e.g., gas puff imaging) require a few hours using 20-30 processors. The most demanding simulations done to date were nonlinear, 3-D, with short mean free paths; one documented case used 41 hours on 30 1.7 GHz AMD Athlon processors.