On the 10th of April, 2019, this image – the first direct image of the supermassive black hole at the centre of the galaxy M87 – was released to the public.
I am not going to write much about this image, as there are already hundreds of articles describing it, and the story behind it. Regardless, it is an impressive achievement, and it inspired me to begin work on a VFX black hole of my own.
I wanted to make my black hole as true to reality as I could manage, and in particular I wanted to render accurate gravitational lensing that works from any angle and at any distance from the event horizon. I began by collecting as much information as I could about gravitational lensing, and built a few prototype systems to test the different approaches I came up with. These methods all revolved around a “lens” mesh with an Arnold standard surface shader applied to it. The shader had the weights of all its components set to 0, except for transmission, which was set to 1. I then set the specular roughness to a very low, non-zero value (0.013) and gave it a high index of refraction (IOR).
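Scripted with maya.cmds, that shader setup looks roughly like the sketch below. This assumes Arnold's aiStandardSurface node; the node name and the IOR value here are placeholders, not the exact settings I settled on.

```python
# Rough sketch of the lens shader setup, assuming Arnold's aiStandardSurface.
import maya.cmds as cmds

shader = cmds.shadingNode('aiStandardSurface', asShader=True, name='blackHoleLens')
cmds.setAttr(shader + '.base', 0.0)                 # all other weights off
cmds.setAttr(shader + '.specular', 0.0)
cmds.setAttr(shader + '.emission', 0.0)
cmds.setAttr(shader + '.transmission', 1.0)         # fully transmissive
cmds.setAttr(shader + '.specularRoughness', 0.013)  # very low, non-zero
cmds.setAttr(shader + '.specularIOR', 5.0)          # high IOR (placeholder value)
```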
I came up with this solution after reading this paper and this webpage, where an experiment was conducted using a glass lens whose geometry was mathematically calculated to replicate the gravitational lensing of a black hole (or other massive object). Particular attention was paid to this diagram:
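The bending that such a glass lens has to reproduce is, in the weak-field limit, the standard deflection angle α = 4GM/(c²b) for light passing a mass M at impact parameter b. A quick sanity check in Python (function name is mine) recovers the famous 1.75 arcseconds for light grazing the Sun:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0    # speed of light, m/s

def deflection_angle(mass_kg, impact_parameter_m):
    """Weak-field light deflection by a point mass: alpha = 4GM / (c^2 b), in radians."""
    return 4.0 * G * mass_kg / (C ** 2 * impact_parameter_m)

# Sanity check: light grazing the Sun is deflected by about 1.75 arcseconds.
M_SUN = 1.989e30     # kg
R_SUN = 6.957e8      # m
arcsec = math.degrees(deflection_angle(M_SUN, R_SUN)) * 3600
```

The 1/b falloff is what gives the lens its characteristic profile: thin near the edge, steep near the centre.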
However, this method only works if the camera always stays at the same distance from the black hole, and I was not satisfied. The gravitational lensing of a black hole is not created by a physical lens but by spacetime itself bending around the black hole, so as the camera approaches and enters this region of curved spacetime, the lens geometry should change with it.
This becomes especially important as the camera nears the event horizon. At a distance of 1.5 times the Schwarzschild radius (the photon sphere), although the event horizon should appear to be some distance below, it actually takes up a full 180 degrees of the field of view, as if the observer were right on top of it. As the camera gets even closer, the horizon appears to bend around and take up more and more of the field of view, until it completely surrounds the observer, looming in from every direction. Just before the viewer crosses the event horizon, the outside universe is visible only as a pinprick of light, exactly 180 degrees away from the surface.
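This progression can be checked against Synge's formula for the apparent angular radius of a Schwarzschild black hole's shadow as seen by a static observer: sin θ = (3√3/2)(r_s/r)√(1 − r_s/r), with the angle becoming obtuse inside the photon sphere. A quick sketch (function name is mine):

```python
import math

def shadow_half_angle_deg(r_over_rs):
    """Apparent angular radius (in degrees) of a Schwarzschild black hole's
    shadow, for a static observer at r = r_over_rs * Schwarzschild radii."""
    r = r_over_rs
    x = (3.0 * math.sqrt(3.0) / 2.0) * (1.0 / r) * math.sqrt(1.0 - 1.0 / r)
    theta = math.asin(min(x, 1.0))
    if r < 1.5:                    # inside the photon sphere the angle is obtuse
        theta = math.pi - theta
    return math.degrees(theta)
```

At the photon sphere this returns exactly 90 degrees (the horizon spans 180 degrees of view), it climbs towards 180 degrees as the observer approaches the horizon (the pinprick of sky), and it shrinks towards zero far away.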
This video by Scott Manley shows this really nicely in SpaceEngine. (It is a spherical panoramic video, so drag with the mouse to look around.)
To achieve all of this, I used a sequence of nodes and deformers in Maya to create a dynamic lens geometry that works at any distance or viewing angle. I started with a NURBS circle to build a cross section for the axially symmetric lens. This circle has deformations applied to it to create the correct cross section, and a revolved surface is then created from it.
A bend deformer was then applied to the circle and scaled up according to the distance to the camera. The goal was to centre the camera in the resulting shape, by scaling the bend by pi*d, where d is the distance from the singularity to the camera.
The x scale of the circle must also be scaled up by this same value, so that at any scale it still fully encircles the camera.
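In Maya this relationship would typically live in an expression node or driven keys; as a plain sketch of the values described above (the function name is mine, and this is only the driving relationship, not the Maya wiring):

```python
import math

def lens_scales(distance_to_camera):
    """Per-frame scales for the lens cross section: both the bend deformer
    and the circle's x scale grow as pi * d, so the bent cross section
    always fully encircles the camera."""
    bend_scale = math.pi * distance_to_camera
    x_scale = math.pi * distance_to_camera
    return bend_scale, x_scale
```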
With the cross section complete, I revolved it into a new surface and aimed that surface at the camera with a constraint. It was also important to reverse the normals of this surface, to reverse the direction of the light bending.
Scaling, however, introduces a problem: the z scale appears to change close to the black hole (it is not actually changing, just shifting to a thinner part of the circle). This causes the amount of distortion to vary with distance, and can create a ring of light inside the event horizon, as demonstrated here:
I managed to fix this by scaling the cross section’s z scale along with the x scale, using the formula:
where D is the distance between the camera and the singularity.
This resulted in much better deformation, and no more sparkly ring inside the event horizon. In fact, I can push both extremes of distance, and the model remains generally accurate: I can fly far away and observe the Einstein ring, or go right up to the Schwarzschild radius and watch the event horizon rise up around the viewpoint. After adding a dynamics system for the accretion disc, I created this demonstration video. The blue shift at the end of the video was added in post; I don't know of any way to get proper Doppler shifting in Maya.
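For anyone wanting to match a post-applied blue shift to physics, the quantity to approximate is the longitudinal relativistic Doppler factor. A sketch (not part of the Maya setup, and ignoring gravitational redshift):

```python
import math

def doppler_factor(beta):
    """Relativistic longitudinal Doppler factor f_obs / f_src for a source
    approaching at speed beta = v / c (beta > 0 gives a blueshift, > 1)."""
    return math.sqrt((1.0 + beta) / (1.0 - beta))
```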
While the maths is not 100% perfect, I am very happy with how this turned out. I had to fudge some of the calculations to get the result I wanted from the accretion disc (these aren't the curved light paths you would get with true gravitational lensing, but a close approximation in which the light is refracted twice, travelling along three straight segments).
I am most proud of how the model manages to behave correctly even at extremes of distance, where the distortion is very pronounced.