Reyes rendering pipeline

Reyes rendering is a method used in 3D computer graphics to render an image. It was developed in the mid-1980s by Loren Carpenter and Robert L. Cook at Lucasfilm's Computer Graphics Research Group, which is now Pixar. It was first used in 1982 to render images for a film, the Genesis effect sequence in Star Trek II: The Wrath of Khan. Pixar's PhotoRealistic RenderMan is one implementation of the Reyes algorithm.

Reyes stands for Renders Everything You Ever Saw (the name is also a pun on Point Reyes, California, near which Lucasfilm was located). The algorithm was designed to overcome the speed and memory limitations of photorealistic algorithms, such as ray tracing, in use at the time.

Reyes efficiently achieves several effects that were deemed necessary for film-quality rendering: smooth curved surfaces, motion blur, and depth of field.

Reyes renders curved surfaces, such as those represented by parametric patches, by dividing them into micropolygons, small quadrilaterals each about one pixel in size. Although many micropolygons are necessary to approximate curved surfaces accurately, they can be processed with simple, parallelizable operations. A Reyes renderer tessellates high-level primitives into micropolygons on demand, dividing each primitive only as finely as necessary to appear smooth in the final image.
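
On-demand dicing can be sketched as follows. This is a minimal illustration, not any particular renderer's code: the `patch_eval` callback, the `shading_rate` parameter, and the pre-computed screen-size estimate are all assumptions made for the example.

```python
import math

def dice(patch_eval, screen_size_px, shading_rate=1.0):
    """Dice a parametric patch into a grid of micropolygon vertices.

    patch_eval(u, v) -> (x, y, z) evaluates the surface at parameters in [0, 1].
    screen_size_px is the patch's estimated on-screen size; the dice rate is
    chosen so each micropolygon covers roughly `shading_rate` pixels, i.e. the
    patch is divided only as finely as its screen footprint requires.
    """
    n = max(1, math.ceil(screen_size_px / shading_rate))
    return [[patch_eval(i / n, j / n) for i in range(n + 1)]
            for j in range(n + 1)]

# Example: a flat 4x4-unit patch estimated to span ~8 pixels on screen,
# producing a 9x9 vertex grid (an 8x8 grid of micropolygons).
grid = dice(lambda u, v: (4 * u, 4 * v, 0.0), screen_size_px=8)
```

A patch that projects to a smaller screen area would get a coarser grid from the same call, which is the sense in which dicing adapts to the final image.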

Next, a shader system assigns a color and opacity to each micropolygon. Most Reyes renderers allow users to supply arbitrary lighting and texturing functions written in a shading language. Micropolygons are processed in large grids which allow computations to be vectorized.
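
The benefit of shading whole grids can be illustrated with a simple fixed Lambertian shader evaluated over every grid vertex at once (real Reyes renderers instead run user-supplied shading-language functions; the array shapes and NumPy vectorization here are just one way to show the idea):

```python
import numpy as np

def shade_grid(normals, light_dir, base_color):
    """Lambertian shading evaluated over a whole micropolygon grid at once.

    normals: (h, w, 3) array of unit surface normals at the grid vertices.
    Running the shader on the full grid in one call lets the arithmetic be
    vectorized, mirroring how Reyes shades grids rather than one
    micropolygon at a time.
    """
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    ndotl = np.clip(normals @ l, 0.0, 1.0)            # (h, w) cosine term
    return ndotl[..., None] * np.asarray(base_color)  # (h, w, 3) colors

# A 4x4 grid of upward-facing normals, lit from directly above:
normals = np.tile([0.0, 0.0, 1.0], (4, 4, 1))
colors = shade_grid(normals, light_dir=[0, 0, 1], base_color=[1.0, 0.5, 0.2])
```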

Shaded micropolygons are sampled in screen space to produce the output image. Reyes employs an innovative hidden-surface algorithm, or "hider," which performs the necessary integrations for motion blur and depth of field without requiring more geometry or shading samples than an unblurred render would need. The hider accumulates micropolygon colors at each pixel across time and lens position using a Monte Carlo method called stochastic sampling.
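
The time dimension of stochastic sampling can be sketched by estimating one pixel's coverage by a moving micropolygon: each sample gets a jittered position inside the pixel and its own random shutter time, so motion blur falls out of the averaging with no extra geometry. The `mp_at` interface and sample count are assumptions for this sketch, and lens sampling for depth of field (handled analogously) is omitted.

```python
import random

def pixel_coverage(mp_at, px, py, n_samples=64, seed=1):
    """Estimate a pixel's coverage by a moving micropolygon.

    mp_at(t) -> (x0, y0, x1, y1): the micropolygon's screen-space bounds
    at shutter time t in [0, 1]. Each sample pairs a jittered position
    inside the pixel with an independent random time.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        sx, sy = px + rng.random(), py + rng.random()  # jittered position
        x0, y0, x1, y1 = mp_at(rng.random())           # random shutter time
        if x0 <= sx < x1 and y0 <= sy < y1:
            hits += 1
    return hits / n_samples

# A unit-square micropolygon sweeping from x=0 to x=4 over the shutter
# covers pixel (0, 0) only early in the interval, so its estimated
# coverage there is a small fraction rather than 0 or 1.
moving = lambda t: (4 * t, 0.0, 4 * t + 1.0, 1.0)
cov = pixel_coverage(moving, 0, 0)
```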

The basic Reyes pipeline has the following steps:

  1. Bound. Calculate the bounding volume of each geometric primitive.
  2. Split. Split large primitives into smaller, diceable primitives.
  3. Dice. Convert the primitive into a grid of micropolygons, each approximately the size of a pixel.
  4. Shade. Calculate lighting and shading at each vertex of the micropolygon grid.
  5. Bust. Break the grid into individual micropolygons, each of which is bounded and checked for visibility.
  6. Hide. Sample the micropolygons, producing the final 2D image.
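
The six steps above can be strung together in a toy form. Everything below is illustrative: a real renderer works with curved primitives, depth-tested stochastic sampling, and user shaders, whereas this sketch uses flat screen-space rectangles, a constant shader, and direct pixel writes, just to show the bound/split/dice/shade/bust/hide control flow.

```python
from dataclasses import dataclass

@dataclass
class Patch:
    # Axis-aligned screen-space rectangle standing in for a surface patch.
    x0: float; y0: float; x1: float; y1: float

    def bound(self):
        return (self.x0, self.y0, self.x1, self.y1)

    def diceable(self, max_px):
        return (self.x1 - self.x0) <= max_px and (self.y1 - self.y0) <= max_px

    def split(self):
        # Halve along the longer axis.
        if self.x1 - self.x0 >= self.y1 - self.y0:
            xm = (self.x0 + self.x1) / 2
            return [Patch(self.x0, self.y0, xm, self.y1),
                    Patch(xm, self.y0, self.x1, self.y1)]
        ym = (self.y0 + self.y1) / 2
        return [Patch(self.x0, self.y0, self.x1, ym),
                Patch(self.x0, ym, self.x1, self.y1)]

    def dice(self):
        # One micropolygon per pixel cell covered by the patch.
        return [(x, y) for x in range(int(self.x0), int(self.x1))
                       for y in range(int(self.y0), int(self.y1))]

def render(primitives, width, height, max_grid_px=4):
    fb = [[0.0] * width for _ in range(height)]
    work = list(primitives)
    while work:
        prim = work.pop()
        x0, y0, x1, y1 = prim.bound()              # 1. Bound (cull offscreen)
        if x1 <= 0 or y1 <= 0 or x0 >= width or y0 >= height:
            continue
        if not prim.diceable(max_grid_px):         # 2. Split oversized prims
            work.extend(prim.split())
            continue
        for mp in prim.dice():                     # 3. Dice / 5. Bust
            shaded = 1.0                           # 4. Shade (constant here)
            x, y = mp
            if 0 <= x < width and 0 <= y < height: # 6. Hide (no depth test)
                fb[y][x] = shaded
    return fb

fb = render([Patch(2, 2, 10, 6)], width=8, height=8)
```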

In this design, the renderer must store the entire frame buffer in memory since the final image cannot be output until all primitives have been processed. A common memory optimization introduces a step called bucketing prior to the dicing step. The output image is divided into a coarse grid of "buckets," each typically 16 by 16 pixels in size. The objects are then split roughly along the bucket boundaries and placed into buckets based on their location. Each bucket is diced and drawn individually, and the data from the previous bucket is discarded before the next bucket is processed. In this way only a frame buffer for the current bucket and the high-level descriptions of all geometric primitives must be maintained in memory. For typical scenes, this leads to a significant reduction in memory usage compared to the unmodified Reyes algorithm.
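
The bucket-assignment step might look like the following sketch, where each primitive is recorded in every bucket its screen bound overlaps; a renderer would then process the buckets one at a time, keeping only the current bucket's tile of the frame buffer in memory. The `(prim, bound)` pairing and dictionary layout are assumptions of this example.

```python
def assign_buckets(primitives, width, height, bucket=16):
    """Sort primitives into 16x16-pixel buckets by their screen bounds.

    primitives: iterable of (prim, (x0, y0, x1, y1)) pairs, where the
    second element is the primitive's screen-space bounding rectangle.
    Returns a dict mapping (bx, by) bucket coordinates to primitive lists.
    """
    nbx = (width + bucket - 1) // bucket
    nby = (height + bucket - 1) // bucket
    buckets = {(bx, by): [] for by in range(nby) for bx in range(nbx)}
    for prim, (x0, y0, x1, y1) in primitives:
        bx0, by0 = max(0, int(x0) // bucket), max(0, int(y0) // bucket)
        bx1 = min(nbx - 1, int(x1) // bucket)
        by1 = min(nby - 1, int(y1) // bucket)
        for by in range(by0, by1 + 1):
            for bx in range(bx0, bx1 + 1):
                buckets[(bx, by)].append(prim)
    return buckets

# A 40x12-pixel quad on a 64x32 image overlaps buckets (0,0) through (2,0):
buckets = assign_buckets([("quad", (4, 2, 44, 14))], width=64, height=32)
```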

Reyes renderers

The following renderers use the Reyes algorithm in one way or another, or at least allow users to select it to produce their images:


This page uses content from Wikipedia. The original article was at Reyes rendering. The list of authors can be seen in the page history. As with the Graphics Wikia, the text of Wikipedia is available under the GNU Free Documentation License.