Depth of field rendering improvement in Blender
This additional code adds realism to renders produced by the Blender Cycles engine. It makes it possible to simulate some of the optical aberrations that occur when taking a picture with a real camera and lens.
In a real lens, spherical aberration contributes most to the appearance of out-of-focus areas in an image. Rays of light entering the lens close to the optical axis (paraxial rays) converge at a different focal point than rays entering the outer areas of the lens (marginal rays). This addition to the code in kernel_camera.h recreates this behaviour; it is still a technical test.
Code for altering the direction of a sample ray depending on its distance to the center of the entrance pupil. The focal plane is then different for edge rays than for center rays:
...
	float3 P = make_float3(0.0f, 0.0f, 0.0f);
	float3 D = Pcamera;

	/* modify ray for depth of field */
	float aperturesize = kernel_data.cam.aperturesize;

	if(aperturesize > 0.0f) {
		/* sample point on aperture */
		float2 l_uv = camera_sample_aperture(&kernel_data.cam, lens_u, lens_v);
		float2 lensuv = l_uv * aperturesize;

		/* compute point on plane of focus */
		float ft = kernel_data.cam.focaldistance/D.z;
		/* distance from center of the entrance pupil */
+		float dis = sqrtf(l_uv.x*l_uv.x + l_uv.y*l_uv.y);
+		float3 Pfocus = make_float3(D.x, D.y, D.z*(1.0f + (1.0f - dis)*0.1f))*ft;
-		// float3 Pfocus = D*ft;

		/* update ray for effect of lens */
		P = make_float3(lensuv.x, lensuv.y, 0.0f);
		D = normalize(Pfocus - P);
	}
...
Scene without depth of field
Scene with depth of field, as implemented in Blender 2.8
Scene with depth of field and spherical aberration. A coma effect is also visible as asymmetrical distortion of out-of-focus areas in the image.
The next steps will be to implement some parameters for the other non-chromatic aberrations:
- Spherical aberration
- Coma
- Astigmatism
- Field curvature
- Distortion
Some of the aberrations depend on each other in certain cases; this has to be taken into account. The artist should nevertheless be able to adjust the parameters independently.
The additional render time necessary to achieve comparable image quality might be in the two-digit percentage range; some tests still have to be done.
The display, manipulation, and storage of the parameters should also follow the established GUI conventions in Blender.
Chromatic aberrations will require deeper changes to the render engine; a naive approach would at least triple the render times. More research on algorithms is necessary.
This approach tries to implement a solution close to the physical and optical reality; it is not a fast software hack. The tradeoff of longer render times might be unacceptable for some artists and desirable for others.