Large scale galaxy simulation
How are large scale N body cosmological simulations handled?
Simple N^2 method.
The force on each particle is computed by explicitly iterating over every other particle and summing the gravitational contributions. This is simple, but the O(N^2) cost makes it impractical for large simulations.
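A minimal sketch of the direct-summation approach in Python. The function name and the Plummer softening parameter `eps` (which keeps the force finite for close encounters) are illustrative choices, not taken from any particular code:

```python
import numpy as np

def direct_forces(pos, mass, G=1.0, eps=1e-3):
    """O(N^2) pairwise gravitational forces with Plummer-style softening."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = pos[j] - pos[i]                 # vector from i to j
            d2 = np.dot(r, r) + eps**2         # softened squared distance
            forces[i] += G * mass[i] * mass[j] * r / d2**1.5
    return forces
```

The double loop makes the O(N^2) scaling explicit: doubling the particle count quadruples the work, which is exactly why the tree and mesh methods below exist.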
Tree method: Barnes-Hut
The particles may be sorted into an oct-tree based on their locations. An entire subtree that is far from a given point can be approximated by a single center of mass (or a more elaborate multipole expansion), while closer portions of the tree are opened and evaluated in detail. This gives decreasing resolution, and decreasing cost, with increasing distance.
Particle Mesh Ewald
The particles can be distributed onto a uniform grid filling the space, so that when calculating the force on a particle only the total mass of each cell is used, rather than a detailed view of its contents. This approximation fails at short distances, so the force is typically split into a long-range portion computed from the cells and a short-range portion computed directly from nearby particles (the particle-particle/particle-mesh, or P³M, scheme). The FFT (fast Fourier transform) makes it possible to compute the long-range forces to and from every grid cell simultaneously, since the gravitational potential is a convolution of the density field with a Green's function. Each cell may also be treated as more than a simple point: density functions can be defined over the volume of the cell to increase accuracy.
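The mesh half of this scheme can be sketched in a few lines of NumPy: deposit the particles onto a grid, then solve Poisson's equation in Fourier space. This is a 2D toy with nearest-grid-point (NGP) mass assignment; real codes use 3D grids, smoother assignment schemes such as cloud-in-cell, and then difference the potential to get forces. All names and parameter choices here are illustrative:

```python
import numpy as np

def pm_potential(pos, mass, ngrid=32, box=1.0, G=1.0):
    """Particle-mesh potential on a periodic 2D grid.

    Solves  lap(phi) = 4 pi G rho  in Fourier space, where the
    Poisson equation becomes algebraic:  phi_k = -4 pi G rho_k / k^2.
    """
    # 1) deposit particle masses onto the grid (nearest-grid-point)
    cell = box / ngrid
    rho = np.zeros((ngrid, ngrid))
    ij = (pos // cell).astype(int) % ngrid
    np.add.at(rho, (ij[:, 0], ij[:, 1]), mass / cell**2)
    # 2) FFT, apply the Green's function, inverse FFT
    k = 2 * np.pi * np.fft.fftfreq(ngrid, d=cell)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # avoid divide-by-zero
    phi_k = -4 * np.pi * G * np.fft.fft2(rho) / k2
    phi_k[0, 0] = 0.0                    # zero mode: mean potential is arbitrary
    return np.real(np.fft.ifft2(phi_k))
```

Because the FFT costs O(M log M) in the number of grid cells, the long-range force for every particle comes out of one pair of transforms, regardless of how many particles share the box; only the short-range correction is computed particle-by-particle.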
There are many ways to render the data once it is generated, including coloring by temperature, velocity, density, density contrast, dark matter vs. baryonic matter, or simply by luminosity (they are stars, so color them as such).
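As one hypothetical example of the density option, a particle snapshot can be projected onto a 2D surface-density map suitable for an image; the function name and the log scaling are illustrative choices:

```python
import numpy as np

def density_image(pos, mass, ngrid=64, box=1.0):
    """Project particles along the line of sight into a 2D density map."""
    img, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=ngrid,
                               range=[[0, box], [0, box]], weights=mass)
    # log scaling compresses the large dynamic range of cosmic densities
    return np.log10(1.0 + img)
```

The resulting array can be fed directly to any image-plotting routine, with the colormap choice standing in for the temperature/velocity/luminosity options above.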
Movies of some simulations can be found here
Millennium Run from Virgo Consortium
GADGET-2 software, actually very similar to the version used by the Millennium Run
A simulation which introduces relativistic effects
Simulation on a small number of galaxies, large number of stars