Eric D. Carlson
Associate Professor of Physics


Numerical General Relativity

General Relativity (G.R.) is Einstein's very successful theory of gravity. On a macroscopic scale, it is, as far as we know, a well-defined theory that describes the gravitational interaction of matter.

According to G.R., when two objects orbit each other, they bend space-time in a predictable way. It is fairly straightforward to show that for large orbits (such as those in our solar system), the orbital motion causes waves of distortion, called gravitational waves, to emanate outward from the system. Such gravitational waves should, in principle, be detectable at large distances, provided we have sufficiently sensitive detectors.
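
For orientation, the standard weak-field (quadrupole) result for a circular binary of masses m_1 and m_2 at separation a gives the radiated power

    P = \frac{32}{5}\,\frac{G^4}{c^5}\,\frac{(m_1 m_2)^2\,(m_1 + m_2)}{a^5}.

Note the steep 1/a^5 dependence: halving the separation boosts the radiated power thirty-two-fold, which is why the close orbits discussed below matter so much.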

Unfortunately, systems like our solar system produce such weak gravitational waves that they are undetectable. To produce strong gravitational waves, it takes massive objects in very close orbits. Specifically, tight orbits of black hole pairs, neutron star pairs, or black hole / neutron star combinations could produce detectable gravitational waves, as the rough numbers below illustrate.
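
A short Python sketch evaluating the quadrupole formula above makes the contrast concrete; the neutron-star masses and separation chosen here are merely illustrative.

    # Quadrupole-formula luminosity of a circular binary.
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_SUN = 1.989e30   # solar mass, kg

    def gw_power(m1, m2, a):
        """Leading-order gravitational-wave power (watts) radiated by a
        circular binary of masses m1, m2 (kg) at separation a (m)."""
        return (32.0 / 5.0) * G**4 / c**5 * (m1 * m2)**2 * (m1 + m2) / a**5

    # Earth orbiting the Sun: about 200 W -- hopelessly undetectable.
    print(gw_power(M_SUN, 5.972e24, 1.496e11))

    # Two 1.4 solar-mass neutron stars 100 km apart (illustrative numbers):
    # around 1e45 W, vastly exceeding the Sun's light output (~4e26 W).
    print(gw_power(1.4 * M_SUN, 1.4 * M_SUN, 1.0e5))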

As the two objects come very close together, the gravitational radiation can become quite substantial, and the orbit rapidly decays. Eventually, the two objects should merge, forming a single larger black hole. However, as they become very close, the objects are moving at relativistic speeds, and approximation methods that worked when they were far apart break down. The goal of Numerical General Relativity is to write computer simulations that can realistically compute how the final stages of inspiral and coalescence occur. I hope to work on this problem in collaboration with Dr. Greg Cook here at Wake Forest.
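
While the orbit is still wide and slow, the decay can be followed analytically; the breakdown described above sets in only near the end. A minimal sketch, using the classic Peters (1964) merger-time formula for a circular orbit (the starting separation is again an illustrative assumption):

    # Peters (1964): gravitational radiation shrinks a circular orbit as
    #   da/dt = -beta / a^3,  beta = (64/5) G^3 m1 m2 (m1 + m2) / c^5,
    # so a binary starting at separation a0 merges after T = a0^4 / (4 beta).
    G = 6.674e-11
    c = 2.998e8
    M_SUN = 1.989e30

    def merger_time(m1, m2, a0):
        """Seconds until coalescence from separation a0 (m); trustworthy
        only while the motion is non-relativistic, well before merger."""
        beta = (64.0 / 5.0) * G**3 * m1 * m2 * (m1 + m2) / c**5
        return a0**4 / (4.0 * beta)

    # Two 1.4 solar-mass neutron stars starting 1000 km apart: about an hour.
    m = 1.4 * M_SUN
    print(merger_time(m, m, 1.0e6))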

In principle, it should be a simple matter. Einstein's equations describe how space-time warps in the presence of matter. For black hole coalescence, that is all you need. For neutron stars, you also need the equation of state, which we should be able to approximate. Ideally, one could take various models of the equation of state, predict the gravitational waves each would produce in a neutron star coalescence, and then use measured gravitational waves to discriminate between the models.
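
Concretely, the equations to be solved are the Einstein field equations, supplemented for neutron-star matter by an equation of state; a simple polytrope is a common stand-in (the polytropic form below is one illustrative choice, not a claim about which model is right):

    G_{\mu\nu} = \frac{8 \pi G}{c^4}\, T_{\mu\nu}, \qquad P = K \rho^{\Gamma}.

Here G_{\mu\nu} is the Einstein tensor built from the metric, T_{\mu\nu} is the stress-energy of the matter, and K and \Gamma parametrize the stiffness of the nuclear matter.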

In practice, it is difficult to get numerical G.R. programs to behave even qualitatively right, let alone quantitatively. The problems are numerous. In general relativity, one is allowed to use arbitrary coordinate systems. This seeming advantage means that, without some specific prescription, grid points in space tend to drift arbitrarily. If adjacent grid points drift too far apart, subsequent calculations of numerical derivatives quickly become meaningless, and numerical instability sets in.

Consider the graph below, which shows the difference between the exact metric of a single spherically symmetric static black hole and a numerical simulation. The event horizon is at 2 on the horizontal axis. The black curve (which would be a straight line at 0 if the simulation were perfect) has been evolved for about a hundred times the time it would take light to cross a region the size of the black hole. It has small errors near the event horizon, but these are not serious. However, just a few more cycles along, instability has crept in at large radii. These instabilities quickly infect the whole graph, growing exponentially and rendering the simulation effectively useless.
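
That kind of exponential blow-up is easy to reproduce in a toy setting. The sketch below (a deliberately simplified stand-in for the full G.R. problem, not the actual simulation code) evolves the ordinary one-dimensional wave equation with a standard finite-difference scheme; pushing the time step just past the stability (Courant) limit lets round-off-level noise grow exponentially, exactly the behavior described above. The grid size and step count are arbitrary toy parameters.

    import numpy as np

    def evolve_wave(courant, n=200, steps=400):
        """Evolve u_tt = u_xx on a periodic grid with the standard
        second-order leapfrog scheme; returns max|u| after the run.
        The scheme is stable only for courant <= 1."""
        dx = 1.0 / n
        x = np.arange(n) * dx
        u_old = np.exp(-100.0 * (x - 0.5) ** 2)  # Gaussian initial pulse
        u = u_old.copy()                         # zero initial velocity (simple start)
        for _ in range(steps):
            # Discrete second spatial derivative, times dx^2.
            lap = np.roll(u, 1) - 2.0 * u + np.roll(u, -1)
            u, u_old = 2.0 * u - u_old + courant ** 2 * lap, u
        return np.abs(u).max()

    print(evolve_wave(0.90))  # stable: result stays of order one
    print(evolve_wave(1.01))  # unstable: noise grows exponentially, huge result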

By simulating simple systems where we know the correct answer, we can learn how to prevent such instabilities, and ultimately develop code that stays stable long enough to give reliable, quantitative predictions for the production of gravitational waves.
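
The simplest such system is the one used in the graph above: a single static, spherically symmetric black hole, whose exact metric is the Schwarzschild solution,

    ds^2 = -\left(1 - \frac{2M}{r}\right) dt^2
           + \left(1 - \frac{2M}{r}\right)^{-1} dr^2
           + r^2 \left( d\theta^2 + \sin^2\theta \, d\phi^2 \right).

In geometrized units (G = c = 1) with M = 1, the event horizon sits at r = 2, consistent with the horizontal axis of the graph above. Any candidate evolution scheme can be started from this known solution and judged by how long it keeps the errors down.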


For questions or comments, contact ecarlson@wfu.edu