Below is a discussion of how occlusion can be
generated directionally from a custom LightSource shader.
Also discussed are methods for baking Ambient
Occlusion into a point cloud that can be referenced each frame, rather
than raytracing and calculating AO per frame. This technique applies to
scenes in which the camera moves but the geometry does not.
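As a rough illustration of the directional LightSource idea, here is a minimal RSL sketch: an occlusion "light" that gathers occlusion in a cone around a user-supplied axis instead of the surface normal. The shader name, parameter names, and defaults are all hypothetical, not the actual shader from this project.

```
/* Illustrative sketch only: directional occlusion as a light shader.
 * The names dirOcclusion, axis, etc. are made up for this example. */
light dirOcclusion(
    float  samples   = 64;      /* rays gathered per shading point */
    float  maxdist   = 100;     /* how far away occluders are seen */
    float  coneangle = PI / 3;  /* half-angle of the gather cone */
    vector axis      = vector "world" (0, 1, 0); ) /* gather direction */
{
    normal dir = normalize(normal axis);
    /* illuminate() makes this behave as a light source; the
     * occlusion() shadeop does the actual ray gathering. */
    illuminate (Ps + dir) {
        float occ = occlusion(Ps, dir, samples,
                              "maxdist", maxdist,
                              "coneangle", coneangle);
        Cl = 1 - occ;  /* bright where unoccluded */
    }
}
```

Because the gather axis is a parameter rather than the surface normal, the occlusion darkens faces according to a chosen direction, which is what "directional" means here.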
This process was developed by programming
shaders first and rendering small test RIBs with prman, then by
integrating those shaders and methods into RenderMan for Maya using
built-in passes and secondary outputs. This integration turned out to be the most
significant challenge of the project. I will discuss my experiments with
workflow.
Additionally, I mention how moving objects can
be added to the occlusion by creating object Sets (like groups) and
raytracing only the moving components.
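One way this can be expressed in a RIB, sketched here with a hypothetical group name, is prman's trace-set grouping: tag only the animated geometry as members of a named set, then restrict the shader's occlusion() gather to that subset so static occlusion still comes from the point cloud.

```
# Illustrative RIB fragment; the set name "movers" is made up.
AttributeBegin
    Attribute "grouping" "membership" ["movers"]  # put this geometry in a trace set
    Attribute "visibility" "trace" [1]            # make it visible to rays
    # ... moving geometry declared here ...
AttributeEnd
```

The occlusion() call in the shader can then pass a matching "subset" parameter (e.g. "subset", "movers") so rays are only traced against the moving objects.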

Pixar's AppNotes describe a shader to bake
occlusion to a point cloud (.ptc). This is fast, efficient, adjustable,
and does not incur raytracing
artifacts (splotchiness on open surfaces):
1st Frame
- uses the bakePTC shader to create a point cloud.
2nd Frame
- reads the point cloud in the surface shader to reuse the occlusion.
No raytracing.
We can
optionally change the RIB back to raytracing very easily. Notes on
raytracing are commented in the RIB file.
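The two passes above might look roughly like this in the RIB. The cull and dice attributes are the usual prman baking setup; the read-pass shader name and the filename are placeholders, not the actual names from this project.

```
# Frame 1: bake occlusion into a point cloud.
Attribute "cull" "hidden" [0]        # shade points even when hidden
Attribute "cull" "backfacing" [0]    # shade back-facing points too
Attribute "dice" "rasterorient" [0]  # view-independent dicing
Surface "bakePTC" "string filename" ["occlusion.ptc"]

# Frame 2 and later: read the point cloud, no raytracing.
Surface "readPTC" "string filename" ["occlusion.ptc"]  # shader name is a placeholder
```

The cull and dice overrides matter because the baked cloud must cover surfaces the first frame's camera cannot see; otherwise later frames will read gaps in the occlusion.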
1. If all
goes well, this technique saves a lot of time. Including baking the
point cloud, I experienced render times about 3x faster than with
raytracing. That is not always the case, and the visual
quality may be affected greatly by the quality of the .ptc.
2. There
are visual artifacts occasionally. This is most likely due to a render
setting in Maya as well as the quality of the baked point cloud. I did
not thoroughly troubleshoot these artifacts.
3. Moving
objects cannot be baked. Well, they can be baked, but the result is
occlusion ghosting: artifacts where the object once was, or missing
occlusion.