NBODY2023 simulation lecture

Online Hands-On Training workshop

Gravitational N-body simulations (gravothermal, dense clusters) at Univ. of Heidelberg, Germany (ARI/ZAH)

March 20-23, 2023; Zoom online. Access data will be sent by email a few days before the workshop begins; for questions, ask
spurzem@ari.uni-heidelberg.de

Some new, hopefully helpful advice on ssh key generation can be found at the end of this page.

The lecture and exercises are targeted at current or future users of Nbody6++GPU or similar codes. There will be some general background introduction, but the main aim is not a review; rather, it is to cover everything one needs to know to run the codes - theoretical knowledge as well as practical information.

Begin: Monday, March 20, 10:00 a.m.
Lecture Time: 10:00 - 13:00 (with short coffee break)
Lunch Break: 13:00 - 14:00
Tutorial Time: 14:00 - 17:00 (with short coffee break)
End: Thursday, March 23, 13:00
A homework task can be completed until Sunday, April 23, 2023. The Student_GPU queue will be terminated by Friday, March 31, 2023; thereafter you can use the general GPU queue (set via a parameter in the batch job file).
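
For orientation, here is a minimal batch job sketch. It assumes a SLURM-style scheduler; the actual scheduler, queue/partition names, resource options, and executable/input file names on kepler2 may differ, so please follow the practical instructions (nbody-kepler.pdf) below.

#!/bin/bash
#SBATCH --job-name=nbody6-test      # job name shown in the queue
#SBATCH --partition=Student_GPU     # workshop queue (placeholder name); switch to the general GPU queue after March 31
#SBATCH --gres=gpu:1                # request one GPU
#SBATCH --ntasks=1                  # number of MPI tasks
#SBATCH --time=02:00:00             # wall-clock limit

./nbody6++ < input.inp > output.log     # placeholder executable and file names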


List of Participants
Lecture Slides Monday, March 20, Introduction, History, Physics of N-Body Systems, Fast Tour of Nbody6++GPU code
Whiteboard Monday Afternoon (Kepler Architecture)
Lecture Slides Beginning of Tuesday, March 21, Hardware, Accelerators, Supercomputers
Whiteboard Tuesday Morning (Hermite Scheme, Hierarchical Block Steps)
Whiteboard Tuesday Afternoon (Time Symmetry)
Whiteboard Wednesday Morning (Regularization)
How to get canonical eqs. from the Poincaré Hamiltonian
Whiteboard Wednesday Afternoon (Regularization)
How to do a "Three-Body" Regularization
Lecture Slides Thursday Morning, March 23, Stellar Evolution, Relativistic Binaries, Other Codes
Whiteboard Thursday Morning (Data Structure, Code Structure)

Zoom Recording March 20 Morning, ~270 MB
Zoom Recording March 20 Afternoon before crash, ~250 MB
Zoom Recording March 20 Afternoon after crash, ~75 MB
Zoom Recording March 21 Morning 1: still trouble with another Zoom crash, not yet recovered.
Zoom Recording March 21 Morning 2, ~105 MB
Zoom Recording March 21 Afternoon, ~270 MB
Zoom Recording March 22 Morning, ~220 MB
Zoom Recording March 22 Afternoon, ~190 MB
Zoom Recording March 23 Morning, ~220 MB

Hands-On Tutorials (last update March 21, 9:30 a.m.): Practical Instructions (nbody-kepler.pdf)
Download of tarball, beta version (differs from GitHub by the new Fortran Namelist input; routines nbody6.F, input.F, data.F, scale.F, xtrnl0.F, binpop.F, modify.F)


Download links to NBODY6++GPU (GitHub),
the NBODY6++ Manual,
and Sverre Aarseth's NBODY webpage


Scientific literature (yet to be updated)

  • A comprehensive literature list (to be updated for 2023), books and papers for download, partly overlapping with the following ones
  • Sverre J. Aarseth, "Gravitational N-body simulations", Cambridge University Press, 2003
  • Sverre J. Aarseth, Christopher Tout, and Rosemary Mardling, "The Cambridge N-body lectures", Springer Publishing, 2008
  • Douglas Heggie and Piet Hut, "The Gravitational Million-Body Problem", Cambridge University Press, 2003
  • C.D. Murray and S.F. Dermott, "Solar System Dynamics", Cambridge University Press, 2009
  • James Binney and Scott Tremaine, "Galactic Dynamics", Princeton University Press, 1994
  • Heggie & Mathieu (1986), "Standardised units and time scales" (download)
  • Introduction to Smoothed Particle Hydrodynamics (SPH) by Ralph Klessen (download)

Links and software downloads:

Organizer: Rainer Spurzem (ARI/ZAH Heidelberg)

Prerequisites (1) The N-body simulation packages are written in Fortran77 (still a lot), C++, and CUDA C (for GPU), and contain instructions for MPI and OpenMP parallelization. You should have basic experience with a higher-level programming language (Python, C, Fortran, ...). All the extensions for GPU and parallelization will be briefly explained, but they are not the main topic.
(2) For the hands-on tutorials you should have ssh software running on your computer, to make a secure connection to our kepler2 system in Heidelberg. (3) It is easiest to run ssh from the terminal in Linux or Mac. It may be possible with Windows as well, but this may be more difficult. (4) Those who already have a kepler2 account: please use it for the tutorials.
ssh key generation:
(5) Those who do not yet have a kepler2 account: please send your public ssh key to me at spurzem@ari.uni-heidelberg.de. Instructions follow:
ssh-keygen -t rsa [passphrase should NOT be empty]

You need to do this on the computer from which you want to log in to kepler2. It produces a private key file named id_rsa and a public key file named id_rsa.pub.

Send the public key file by email to me. DO NOT SEND THE PRIVATE KEY!
For questions on Linux, please look here:
https://www.digitalocean.com/community/tutorials/how-to-configure-ssh-key-based-authentication-on-a-linux-server ; look especially further down the page at "Copying Your Public Key Manually", where you can see what your public ssh key should look like (one line starting with ssh-rsa).
For questions on Windows with the PuTTY client, please look here:
https://devops.ionos.com/tutorials/use-ssh-keys-with-putty-on-windows/
Look for "Create New Public and Private Keys" - it shows how to save the public key; then send it!
NOTE: The public key file should be one long line starting with ssh-rsa ...
If you have problems, run ssh -v lecturenn@kepler2.zah.uni-heidelberg.de and send me the output; for Windows/PuTTY, send the PuTTY Session Log!
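
Putting the Linux/Mac steps together, here is a minimal sketch; it assumes the default ~/.ssh location for the key files, and lecturenn stands for the account name you will receive.

ssh-keygen -t rsa                                 # choose a non-empty passphrase when prompted
cat ~/.ssh/id_rsa.pub                             # shows the public key: one long line starting with ssh-rsa; email this file
                                                  # NEVER send or show ~/.ssh/id_rsa (the private key)
ssh lecturenn@kepler2.zah.uni-heidelberg.de       # test the connection once your account has been set up
ssh -v lecturenn@kepler2.zah.uni-heidelberg.de    # verbose output for troubleshooting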