Glossary of computer graphics
This is a glossary of terms relating to computer graphics.
For more general computer hardware terms, see glossary of computer hardware terms.
0–9
- 2D convolution
- Operation that applies linear filtering to an image with a given two-dimensional kernel, able to achieve e.g. edge detection, blurring, etc.
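For illustration, a minimal sketch of a 2D convolution over a single-channel image in C++, assuming a row-major float image and a 3×3 kernel with clamped borders (names are illustrative, not from any particular library):

    #include <vector>
    #include <algorithm>

    // Applies a 3x3 kernel to a single-channel, row-major image.
    // Border pixels are handled by clamping coordinates to the image edge.
    std::vector<float> convolve2D(const std::vector<float>& src, int width, int height,
                                  const float kernel[3][3]) {
        std::vector<float> dst(src.size());
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                float sum = 0.0f;
                for (int ky = -1; ky <= 1; ++ky) {
                    for (int kx = -1; kx <= 1; ++kx) {
                        int sx = std::clamp(x + kx, 0, width - 1);
                        int sy = std::clamp(y + ky, 0, height - 1);
                        sum += src[sy * width + sx] * kernel[ky + 1][kx + 1];
                    }
                }
                dst[y * width + x] = sum;
            }
        }
        return dst;
    }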
- 2D image
- 2D texture map
- A texture map with two dimensions, typically indexed by UV coordinates.
- 2D vector
- A two-dimensional vector, a common data type in rasterization algorithms, 2D computer graphics, and graphical user interface libraries.
- 2.5D
- Also pseudo 3D. Rendering whose result looks 3D while not actually being 3D, or that has significant limitations, e.g. in camera degrees of freedom.
- 3D graphics pipeline
- A graphics pipeline taking 3D models and producing a 2D bitmap image result.
- 3D paint tool
- A 3D graphics application for digital painting of multiple texture map image channels directly onto a rotated 3D model, such as ZBrush or Mudbox, also sometimes able to modify vertex attributes.
- 3D scene
- A collection of 3D models and lightsources in world space, into which a camera may be placed, describing a scene for 3D rendering.
- 3D unit vector
- A unit vector in 3D space.
- 4D vector
- A common datatype in graphics code, holding homogeneous coordinates or RGBA data, or simply a 3D vector with unused W to benefit from alignment, naturally handled by machines with 4-element SIMD registers.
- 4×4 matrix
- A matrix commonly used as a transformation of homogeneous coordinates in 3D graphics pipelines.[1]
- 7e3 format
- A packed pixel format supported by some graphics processing units (GPUs) where a single 32-bit word encodes three 10-bit floating-point color channels, each with seven bits of mantissa and three bits of exponent.[2]
A
- AABB
- Axis-aligned bounding box (sometimes called "axis oriented"), a bounding box stored in world coordinates; one of the simplest bounding volumes.
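A minimal sketch of an AABB and an overlap test between two boxes (illustrative C++, not tied to any particular engine):

    // Axis-aligned bounding box in world coordinates.
    struct AABB {
        float min[3];
        float max[3];
    };

    // Two AABBs overlap only if their extents overlap on every axis.
    bool overlaps(const AABB& a, const AABB& b) {
        for (int i = 0; i < 3; ++i) {
            if (a.max[i] < b.min[i] || b.max[i] < a.min[i])
                return false;  // separated along this axis
        }
        return true;
    }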
- Additive blending
- A compositing operation that adds source pixel values to destination pixel values without the use of an alpha channel, used for various effects. Also known as linear dodge in some applications.
- Affine texture mapping
- Linear interpolation of texture coordinates in screen space without taking perspective into account, causing texture distortion.
- Aliasing
- Unwanted effect arising when sampling high-frequency signals, in computer graphics appearing e.g. when downscaling images. Antialiasing methods can prevent it.
- Alpha channel
- An additional image channel (e.g. extending an RGB image) or standalone channel controlling alpha blending.
- Ambient lighting
- An approximation to the light entering a region from a wide range of directions, used to avoid needing an exact solution to the rendering equation.
- Ambient occlusion (AO)
- Effect approximating, in an inexpensive way, one aspect of global illumination by taking into account how much ambient light is blocked by nearby geometry, adding visual clues about the shape.[3]: 446
- Analytic model
- A mathematical model for a phenomenon to be simulated, e.g. some approximation to surface shading. Contrasts with Empirical models based purely on recorded data.
- Anisotropic filtering
- Advanced texture filtering improving on mipmapping, preventing aliasing while reducing blur in textured polygons at oblique angles to the camera.
- Anti-aliasing
- Methods for filtering and sampling to avoid visual artifacts associated with the uniform pixel grid in 3D rendering.
- Array texture
- A form of texture map containing an array of 2D texture slices selectable by a 3rd 'W' texture coordinate; used to reduce state changes in 3D rendering.[4]
- Augmented reality
- Computer-rendered content inserted into the user's view of the real world.[3]: 917
- AZDO
- Approaching zero driver overhead, a set of techniques aimed at reducing the CPU overhead in preparing and submitting rendering commands in the OpenGL pipeline. A compromise between the traditional GL API and other high-performance low-level rendering APIs.[5]
B
- Back-face culling
- Culling (discarding) of polygons that face away from the camera.
- Baking
- Performing an expensive calculation offline, and caching the results in a texture map or vertex attributes. Typically used for generating lightmaps, normal maps, or low level of detail models.[6]
- Barycentric coordinates
- Three-element coordinates of a point inside a triangle.
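For example, a sketch of computing barycentric coordinates of a 2D point from signed sub-triangle areas (one common approach; the helper names are illustrative):

    struct Vec2 { float x, y; };

    // Twice the signed area of triangle (a, b, c); positive if counter-clockwise.
    static float cross2(Vec2 a, Vec2 b, Vec2 c) {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    // Barycentric coordinates (u, v, w) of point p in triangle (a, b, c):
    // p = u*a + v*b + w*c with u + v + w = 1. All three lie in [0,1]
    // exactly when p is inside the triangle.
    void barycentric(Vec2 p, Vec2 a, Vec2 b, Vec2 c, float& u, float& v, float& w) {
        float area = cross2(a, b, c);   // doubled area of the full triangle
        u = cross2(p, b, c) / area;     // weight of vertex a
        v = cross2(p, c, a) / area;     // weight of vertex b
        w = 1.0f - u - v;               // weight of vertex c
    }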
- Beam tracing
- Modification of ray tracing which instead of lines uses pyramid-shaped beams to address some of the shortcomings of traditional ray tracing, such as aliasing.[7]
- Bicubic interpolation
- Extension of cubic interpolation to 2D, commonly used when scaling textures.
- Bilinear interpolation
- Linear interpolation extended to 2D, commonly used when scaling textures.
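A sketch of bilinear sampling from a single-channel image, assuming texel centres at integer coordinates and clamped addressing (illustrative, not any particular API):

    #include <vector>
    #include <cmath>
    #include <algorithm>

    // Samples a single-channel, row-major image at fractional position (x, y)
    // using bilinear interpolation: two horizontal lerps, then one vertical lerp.
    float sampleBilinear(const std::vector<float>& img, int width, int height,
                         float x, float y) {
        int x0 = std::clamp(static_cast<int>(std::floor(x)), 0, width - 1);
        int y0 = std::clamp(static_cast<int>(std::floor(y)), 0, height - 1);
        int x1 = std::min(x0 + 1, width - 1);
        int y1 = std::min(y0 + 1, height - 1);
        float fx = x - std::floor(x);   // horizontal blend factor
        float fy = y - std::floor(y);   // vertical blend factor
        float top    = img[y0 * width + x0] * (1 - fx) + img[y0 * width + x1] * fx;
        float bottom = img[y1 * width + x0] * (1 - fx) + img[y1 * width + x1] * fx;
        return top * (1 - fy) + bottom * fy;
    }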
- Binding
- Selecting a resource (texture, buffer, etc.) to be referenced by future commands.
- Billboard
- A textured rectangle that keeps itself oriented towards the camera, typically used e.g. for vegetation or particle effects.[3]: 551
- Binary space partitioning (BSP)
- A data structure that can be used to accelerate visibility determination, used e.g. in Doom engine.
- Bit depth
- The number of bits per pixel, sample, or texel in a bitmap image (holding one or more image channels, typical values being 4, 8, 16, 24, 32)
- Bitmap
- Image stored by pixels.
- Bit plane
- A format for bitmap images storing 1 bit per pixel in a contiguous 2D array; several such parallel arrays combine to produce a higher-bit-depth image. Opposite of packed-pixel format.
- Blend operation
- A render state controlling alpha blending, describing a formula for combining source and destination pixels.
- Bone
- Coordinate systems used to control surface deformation (via Weight maps) during skeletal animation. Typically stored in a hierarchy and controlled by key frames and other procedural constraints.
- Bounding box
- One of the simplest types of bounding volume, consisting of axis-aligned or object-aligned extents.
- Bounding volume
- A mathematically simple volume, such as a sphere or a box, containing 3D objects, used to simplify and accelerate spatial tests (e.g. for visibility or collisions).[3]: 819
- BRDF
- Bidirectional reflectance distribution functions (BRDFs), empirical models defining 4D functions for surface shading indexed by a view vector and light vector relative to a surface.[8]
- Bump mapping
- Technique similar to normal mapping that uses so-called bump maps (height maps) instead of normal maps.
- BVH
- Bounding volume hierarchy is a tree structure on a set of geometric objects.
C
- Camera
- A virtual camera from which rendering is performed, also sometimes referred to as 'eye'.
- Camera space
- A space with the camera at the origin, aligned with the viewer's direction, after the application of the world transformation and view transformation.
- Cel shading
- Cartoon-like shading effect.
- Clipping
- Limiting specific operations to a specific region, usually the view frustum.
- Clipping plane
- A plane used to clip rendering primitives in a graphics pipeline. These may define the view frustum or be used for other effects.
- Clip space
- Coordinate space in which clipping is performed.
- Clip window
- A rectangular region in screen space, used during clipping. A clip window may be used to enclose a region around a portal in portal rendering.
- CLUT
- Color look-up table; a table of RGB color values to be indexed by a lower-bit-depth image (typically 4–8 bits), a form of vector quantization.
- Color bleeding
- Unwanted effect in texture mapping. A color from the border of an unmapped region of the texture may appear (bleed) in the mapped result due to interpolation.
- Color channels
- The set of channels in a bitmap image representing the visible color components, i.e. distinct from the alpha channel or other information.
- Color resolution
- Command buffer
- A region of memory holding a set of instructions for a graphics processing unit for rendering a scene or portion of a scene. These may be generated manually in bare metal programming, or managed by low level rendering APIs, or handled internally by high level rendering APIs.
- Command list
- A group of rendering commands ready for submission to a graphics processing unit, see also Command buffer.
- Compute API
- An API for efficiently processing large amounts of data.[9]
- Compute shader
- A compute kernel managed by a rendering API, with easy access to rendering resources.
- Cone tracing
- Modification of ray tracing which instead of lines uses cones as rays in order to achieve e.g. antialiasing or soft shadows.[10]
- Connectivity information
- Indices defining rendering primitives between vertices, possibly held in index buffers; describes geometry as a graph or hypergraph.
- CSG
- Constructive solid geometry, a method for generating complex solid models from boolean operations combining simpler modelling primitives.
- Cube mapping
- A form of environment reflection mapping in which the environment is captured on a surface of a cube (cube map).
- Culling
- Before rendering begins, culling removes objects that don't significantly contribute to the rendered result (e.g. being obscured or outside camera view).[3]: 830
D
- Decal
- A "sticker" picture applied onto a surface (e.g. a crack on the wall).[3]: 888
- Detail texture
- Texture maps repeated at high frequency combined with a main texture on a surface to prevent a blurred appearance close to the camera.
- Deferred shading
- A technique by which computation of shading is deferred to later stage by rendering in two passes, potentially increasing performance by not discarding expensively shaded pixels. The first pass only captures surface parameters (such as depth, normals and material parameters), the second one performs the actual shading and computes the final colors.[3]: 884
- Deformation lattice
- A means of controlling free-form deformation via a regular 3D grid of control points moved to arbitrary positions, with polynomial interpolation of the space between them.
- Degenerate triangles
- Zero area triangle primitives placed in a triangle strip between actual primitives, to allow many parts of a triangle mesh to be rendered in a single drawcall. These are rejected by the triangle setup unit.[11]
- Delaunay triangulation
- A method for generating an efficient triangulation of a set of vertices in a plane.
- Depth buffer
- A bitmap image holding depth values (either a Z buffer or a W buffer), used for visible surface determination during rasterization of 3D scenes.
- Depth map
- A bitmap image or texture map holding depth values. Similar to a height map or displacement map, but usually associated with a projection.
- Depth value
- A value in a depth map representing a distance perpendicular to the space of an image.
- Diffuse lighting
- In shading, the diffuse component of light is the light reflected from the surface uniformly in all directions. This component depends on the surface normal and direction to the light source but not on the viewer's position.
- Direct3D
- Microsoft's 3D graphics API for Windows, with an architecture similar to OpenGL.
- Displacement mapping
- A method for adding detail to surfaces by subdivision, displacing the resulting vertices according to a height map.[12]
- Distributed ray tracing
- Modification of ray tracing that casts multiple rays through each pixel in order to model soft phenomena such as soft shadows, depth of field etc.
- Double buffering
- Using a dedicated buffer for rendering and copying the result to the screen buffer when finished. This prevents stutter on the screen and the user seeing rendering in progress.
- Drawcall
- A single rendering command submitted to a rendering API, referring to a single set of render states.
E
- Edge vector
- A vector between 2 position vertices in a polygon or polygon mesh, along an edge.
- Environment mapping
- Also reflection mapping, a technique of approximating reflections of the environment on complex surfaces of 3D models in real time. A complete 360 degree view of the environment needs to be prerendered and stored in a texture using a specific mapping (e.g. cube mapping, sphere mapping etc.)
- Extents
- The minimum and maximum values of an object or primitive along a coordinate axis or set of axes.
F
- Fixed-function pipeline
- A hardware rendering pipeline without shaders, composed entirely of fixed-function units. A limited number of functions may be controlled by render states.
- Fixed-function unit
- A piece of hardware in a graphics processing unit implementing a specific function (such as triangle setup or texture sampling), without programmable control by shaders.
- Flat shading
- Shading that assigns a uniform color to each face of a 3D model, giving it a "sharp-edge" look.
- Forward rendering
- A term for traditional 3D rendering pipelines which sort lightsources applicable to 3D models in world space prior to rasterization. Contrasts with Deferred shading.
- Forward-plus rendering
- An extension of forward rendering using compute shaders to place lightsources into screen space tiles, to accelerate the use of many lightsources, bypassing some of the disadvantages of deferred shading.[13]
- Fractal
- A complex, self-similar shape described by a simple equation. Fractals can be used e.g. in procedural terrain generation.[citation needed]
- Fragment (pixel) shader
- Shader processing individual pixels or fragments (values that may potentially become pixels).
- Frustum culling
- A stage in a rendering pipeline, filtering out 3D models whose bounding volumes fail an intersection test with the view frustum, allowing trivial rejection.
- Fresnel
- According to Fresnel equations, surfaces show more specular reflections when viewed at near-grazing incidence. This effect is often simulated in computer graphics.
- FXAA
- An approximate antialiasing method performed in a post-processing step which smooths the image in screen space, guided by edge detection (contrasting with the usual supersampling approaches that require larger frame-buffers).
G
- Geometry
- Typically used to refer to vertex & rendering primitive connectivity information (distinct from materials and textures).[3]: 47
- Geometry shader
- In APIs such as OpenGL and Direct3D, the geometry shader is an optional stage able to process 3D model geometry in more advanced ways than vertex or tessellation shaders (e.g. turning primitives into other primitives).
- G-buffer
- A screen space representation of geometry and material information, generated by an intermediate rendering pass in deferred shading rendering pipelines.[14]
- Global illumination
- Computing the global interactions of light within the scene, e.g. reflections of light from one object to another, by which realism is added.
- Gouraud shading
- Shading technique that computes values at triangle vertices and interpolates them across the surface. This is more realistic and more computationally expensive than flat shading, but less so than Phong shading.
- Graphics processing unit
- Hardware used to accelerate graphical computations.
- Graphical shader
- A shader associated with the rendering pipeline; not a compute shader.
- Grid cell index
- Integer coordinates in a multidimensional array.
H
- HDR
- High-dynamic-range imaging, an image format using floating-point values. Allows additional realism with post processing.
- Heightmap
- A 2D array or texture map holding height values; typically used for defining landscapes, or for displacement mapping
- Homogeneous coordinates
- Coordinates of form (x,y,z,w) used during matrix transforms of vertices, allowing non-linear transforms such as the perspective transform to be expressed as matrix operations.
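A sketch of transforming a homogeneous point by a 4×4 matrix followed by the perspective divide, assuming row-major matrix storage (illustrative types, not a specific library):

    struct Vec4 { float x, y, z, w; };
    struct Mat4 { float m[4][4]; };   // row-major: m[row][column]

    // Multiplies a homogeneous point by a 4x4 matrix.
    Vec4 transform(const Mat4& M, Vec4 p) {
        return {
            M.m[0][0]*p.x + M.m[0][1]*p.y + M.m[0][2]*p.z + M.m[0][3]*p.w,
            M.m[1][0]*p.x + M.m[1][1]*p.y + M.m[1][2]*p.z + M.m[1][3]*p.w,
            M.m[2][0]*p.x + M.m[2][1]*p.y + M.m[2][2]*p.z + M.m[2][3]*p.w,
            M.m[3][0]*p.x + M.m[3][1]*p.y + M.m[3][2]*p.z + M.m[3][3]*p.w,
        };
    }

    // Perspective divide: maps a clip-space position to normalized device coordinates.
    Vec4 perspectiveDivide(Vec4 p) {
        return { p.x / p.w, p.y / p.w, p.z / p.w, 1.0f };
    }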
I
- Image channel
- A single component (referred to as a channel) of a bitmap image; one of multiple components per pixel, e.g. for RGB or YUV color space, or additional channels for alpha blending
- Image format
- A specific way of representing a bitmap image in memory, also refers to image file formats.
- Image generation
- Synonymous with rendering; taking a 3D scene (or other form of encoded data) and producing a bitmap image result.
- Image generator
- A hardware accelerator for image generation, almost synonymous with a graphics processing unit, but historically used to refer to devices aimed at realtime rendering for simulation (e.g. Evans & Sutherland ESIG line).
- Image order rendering
- Rendering methods that iterate over pixels of the screen in order to draw the image (e.g. raytracing).
- Image plane
- The plane in the world which is identified with the plane of the display monitor used to view the image that is being rendered.
- Immediate mode rendering
- The submission of rendering commands and rendering primitive data without the extensive use of managed resources; rendering primitive vertex attribute data may be embedded directly into a command list, rather than referenced indirectly from resources.
- Impostor
- A dynamically rendered Billboard texture map used to stand in for geometry in the distance. A form of level of detail optimization.[15]
- Incremental error algorithm
- A set of rasterization algorithms which use simple integer arithmetic to update an error term that determines if another quantity is incremented, avoiding the need for expensive division or multiplication operations; e.g. Bresenham's line algorithm, or rasterizing heightmap landscapes.[16]
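A sketch of Bresenham's line algorithm as an incremental error algorithm: an integer error term decides when to step on each axis, with pixel output left to a caller-supplied callback (an assumption for illustration):

    #include <cstdlib>
    #include <functional>

    // Rasterizes a line from (x0, y0) to (x1, y1) using only integer arithmetic.
    // 'plot' is whatever writes a pixel into the target image.
    void drawLine(int x0, int y0, int x1, int y1,
                  const std::function<void(int, int)>& plot) {
        int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
        int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
        int err = dx + dy;                          // running error term
        while (true) {
            plot(x0, y0);
            if (x0 == x1 && y0 == y1) break;
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }  // step along x
            if (e2 <= dx) { err += dx; y0 += sy; }  // step along y
        }
    }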
- Index buffer
- A rendering resource used to define rendering primitive connectivity information between vertices.
- Indirect illumination
- Another term for global illumination.
- Instancing
- Rendering multiple objects (instances) using the same geometry data.
- Intersection test
- Determining if two pieces of geometry intersect, commonly required in simulation, rendering pipelines, and 3D modelling applications.
K
- K-DOP
- A type of bounding volume used for fast intersection tests; a discrete oriented polytope (DOP). These generalise bounding boxes with extents along additional discrete plane directions (e.g. the diagonals formed by each pair of coordinate axes).
L
- Level of detail (LOD)
- If an object contributes less to the rendered result, e.g. by being far away from the camera, LOD chooses to use a simpler version of the object (e.g. with fewer polygons or textures).[3]: 852
- Light probe
- Object used to capture light parameters at a specific point in space in order to help compute scene lighting.[17]
- Low level rendering API
- A library providing a minimal abstraction layer over a graphics processing unit's raw command lists, such as Vulkan, LibGCM, or Metal (API). The user typically has more control over (and responsibility for) resource management, command buffers, synchronization issues.
- Lighting
- Computations simulating the behavior of light.
- Light vector
- In shading calculations, a 3D unit vector representing the direction of incident light onto a model's surface.
- Light field
- A data structure approximating the 4D flux of light rays through a space (or in the general case, 5D); it may be captured using multiple cameras (e.g. light stage), or rendered from a 3D model by ray tracing.
- Line primitive
- A rendering primitive or modelling primitive representing a line segment, used for wireframes.
- Lumels
- A term for texels in the texture map representing a lightmap.
M
- Manhattan distance
- Measure of distance between two points, different from Euclidean distance, that sums the distances along principal axes.
- Marching cubes
- A method for triangulating implicit surfaces.
- MegaTexturing
- Texturing technique that works with extremely large textures which are not loaded into memory all at once, but rather streamed from the hard disk depending on the camera view.[18]: 176
- Microtexture
- An alternative term sometimes used for Detail textures.
- Mipmap
- Method of preventing aliasing by storing differently scaled versions of the same image and using the correct one during rendering.
- Modelling primitive
- Basic elements from which 3D models and 3D scenes are composed. Also known as a Geometric primitive.
- Model space
- Coordinate system in which a 3D model is created and stored.
- Model transformation matrix
- A transformation matrix producing world coordinates from a 3D model's local coordinates.
- Multiply blend
- A blending operation that multiplies source and destination values, used e.g. for applying lightmaps.
N
- Near clipping
- The clipping of 3D rendering primitives against the near clip plane. Necessary to correctly display rendering primitives that partially pass behind the camera.
- Nearest-neighbor interpolation
- Simplest form of interpolation that for a given position outputs the color of the nearest sample.
- Noise
- In real-world data, noise is an unwanted distortion of the captured signal, e.g. in photography. In rendering, artificial noise, such as white noise or Perlin noise, is often generated and added on purpose to add realism.
- Normal mapping
- Method of adding detail to the surface of 3D models, without increasing geometry complexity, by using a texture with precomputed normals that are used during shading.
O
- OBJ format
- A common 3D file format.
- Object order rendering
- Rendering methods that iterate over objects in the scene and draw them one by one (e.g. rasterization).
- Occlusion culling
- Culling (discarding) of objects before rendering that are completely obscured by other objects.
- Occlusion query
- A command passed to a graphics processing unit requesting the testing of bounding volume geometry against the depth buffer to determine whether any of its contents are in the potentially visible set; used for hardware accelerated occlusion culling.
- Offline rendering
- Non-real-time rendering.
- OOBB
- An object-oriented bounding box (sometimes called object-aligned); a bounding box stored in some object's local coordinate system.
- OpenGL
- Commonly used 2D and 3D graphics rendering API.
- Outcode
- A small integer holding a bit for the result of every plane test (or clip window edge test) failed in clipping. Primitives may be trivially rejected if the bitwise AND of all their vertices' outcodes is non-zero.
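A sketch of 2D outcodes against a rectangular clip window in the style of Cohen–Sutherland clipping; the constants and function names are illustrative:

    // One bit per clip-window edge that a point lies outside of.
    enum : unsigned { INSIDE = 0, LEFT = 1, RIGHT = 2, BOTTOM = 4, TOP = 8 };

    unsigned computeOutcode(float x, float y,
                            float xmin, float ymin, float xmax, float ymax) {
        unsigned code = INSIDE;
        if (x < xmin) code |= LEFT;   else if (x > xmax) code |= RIGHT;
        if (y < ymin) code |= BOTTOM; else if (y > ymax) code |= TOP;
        return code;
    }

    // A segment is trivially rejected if both endpoints lie outside the
    // same edge, i.e. the bitwise AND of their outcodes is non-zero.
    bool triviallyRejected(unsigned outcodeA, unsigned outcodeB) {
        return (outcodeA & outcodeB) != 0;
    }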
P
- Packed pixel format
- An image format where the image channels are interleaved contiguously in memory, possibly containing multiple channels within single machine words, equivalent to an array of structures for bitmap data. Contrasts with planar image formats.
- Parallax mapping
- Shader effect that adds detail with a sense of depth to a 3D surface, in a more realistic way than normal mapping.
- Parameter gradient
- The derivative of a vertex attribute with respect to screen space coordinates during rasterization, used for interpolation across a rendering primitive surface.
- Particle effect
- Effects consisting of a number of particles that behave by certain rules, typically used to simulate fire, smoke etc.[3]: 567
- Path tracing
- Photorealistic iterative rendering method based on tracing light paths.
- Perspective correct texturing
- Non-linear texture coordinate interpolation that takes into account perspective, eliminating distortion seen in affine texture mapping.
- Phong lighting
- A commonly used model of local illumination that computes the result as a sum of ambient, diffuse and specular elements of light.
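A sketch of the Phong model for a single light, summing ambient, diffuse and specular terms; the small vector helpers are assumptions for illustration, and all direction vectors are unit length:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static Vec3 scale(Vec3 v, float s) { return {v.x*s, v.y*s, v.z*s}; }
    static Vec3 sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

    // Phong lighting: ambient + diffuse + specular, for unit vectors
    // n (surface normal), l (direction to light) and v (direction to viewer).
    float phong(Vec3 n, Vec3 l, Vec3 v,
                float ambient, float diffuse, float specular, float shininess) {
        float nDotL = std::max(dot(n, l), 0.0f);
        // Reflect the light direction about the normal: r = 2(n.l)n - l
        Vec3 r = sub(scale(n, 2.0f * dot(n, l)), l);
        float rDotV = std::max(dot(r, v), 0.0f);
        return ambient
             + diffuse * nDotL
             + specular * std::pow(rDotV, shininess);
    }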
- Phong shading
- Shading technique that interpolates surface normals across a primitive and evaluates lighting per pixel.
- Photogrammetry
- Science and technology of making measurements from photographs, e.g. automatically creating 3D models of an environment.
- Photometry
- Science of measuring light in terms of human perception.
- Photon mapping
- Photorealistic rendering algorithm based on tracing rays from the camera as well as light sources, able to simulate effects such as caustics.
- Physically based rendering (PBR)
- Rendering algorithms based on physics simulation of light, including conservation of energy and empirical models of surfaces.[19]
- Pixel
- Smallest element of a raster image.
- Planar image format
- An image format where the image channels (or even bits) for a single pixel are separated into several parallel arrays, equivalent to a structure of arrays for bitmap data.
- Point cloud
- A surface defined by a collection of vertices without connectivity information.[20]
- Point sprite
- A rendering primitive in 3D graphics pipelines, allowing one vertex plus radius to define a billboard; corner vertices are automatically generated. Typically used for particle systems
- Polygon mesh
- A 3D model consisting of vertices connected by polygon primitives.
- Polygon primitive
- A rendering or modelling primitive defining a flat surface connecting 3 or more vertices.
- Portal
- A means of occlusion culling, defining a visible window between adjacent bounding volumes, used in portal rendering.
- Post processing
- Effects applied to a bitmap image in screen space after the 3D rendering pipeline, for example tone mapping, some approximations to motion blur, and bloom.[21]
- Predicated rendering
- A feature facilitating occlusion culling within a graphics pipeline, performed by a command list asynchronously from the CPU, where a group of rendering commands are flagged to be conditional on the result of an earlier occlusion query.
- Premultiplied alpha
- A variation of a bitmap image or alpha blending calculation in which the RGB color values are assumed to be already multiplied by the alpha channel, reducing computations during alpha blending; uses the blend operation dst = src + dst × (1 − src.alpha), which is capable of mixing alpha blending with additive blending effects.
- Primitive
- A basic unit of geometry for rendering or modelling.
- Procedural generation
- Generating data, such as textures, 3D geometry or whole scenes by algorithms (as opposed to manually).
- Procedural texture
- A texture (very often a volume texture) generated procedurally by a mathematical function and with the use of noise functions.[3]: 198
Q
- Quaternion
- A means of representing rotations in a 4D vector, useful for skeletal animation, with advantages for interpolation compared to euler angles (i.e. not suffering from gimbal lock).[22]
R
- Radiometry
- Measurement of electromagnetic radiation such as light, defining measures such as flux or radiance.[23]: 469
- Raster graphics
- Graphics represented as a rectangular grid of pixels.
- Rasterization
- Converting vector graphics to raster graphics. This term also denotes a common method of rendering 3D models in real time.
- Ray casting
- Rendering by casting non-recursive rays from the camera into the scene. 2D ray casting is a 2.5D rendering method.
- Ray marching
- Sampling 3D space at multiple points along a ray, typically used when analytical methods cannot be used.[24]: 157
- Ray tracing
- Recursively tracing paths of light rays through a 3D scene, may be used for 3D rendering (more commonly for offline rendering), or other tests.
- Recursive subdivision
- The process of subdividing an object (either geometric object, or a data structure) recursively until some criterion is met.
- Render mapping
- The baking of a rendering of a 3D model surface into a texture map to capture surface properties. Also known as 'render surface map'.[25][26]
- Render pass
- A stage in a rendering pipeline generating some (possibly incomplete) representation of the scene.
- Render states
- Information controlling a graphics pipeline, composed of modes and parameters, including resource identifiers, and shader bindings.
- Render target
- A graphics resource into which rendering primitives are rasterized by a graphics pipeline. Render targets may be frame buffers or texture maps.
- Render to texture
- The process of rasterizing into a texture map (or texture buffer) for further use as a resource in subsequent render passes. Used for environment mapping, impostor rendering, shadow mapping and post-processing filters. Requires the ability to use a texture map as a render target.
- Rendering API
- A software library for submitting rendering commands, and managing render states and rendering resources. Examples include OpenGL, Vulkan, Direct3D, and Metal. Provides an abstraction layer for a graphics processing unit.
- Rendering command
- An instruction for rasterizing geometry in a 3D graphics pipeline, typically held in a command buffer, or submitted programmatically through a rendering API.
- Rendering equation
- Mathematical equation used as a model of light behavior in photorealistic rendering.
- Rendering primitive
- Geometry that can be drawn by a rasterizer or graphics processing unit, connecting vertices, e.g. points, lines, triangles, quadrilaterals
- Rendering resources
- Data managed by a graphics API, typically held in device memory, including vertex buffers, index buffers, texture maps and framebuffers
- Repeating texture
- A texture map applied with wrap-around UV coordinates extending beyond the 0–1 range (representing one unit of the image), exhibiting periodicity. Contrasts with clamped, mirrored modes or unique mappings.
- Resource
- Data (often held in a buffer managed by a rendering API) read by a graphics pipeline, e.g. texture maps, vertex buffers, shaders, index buffers, or other pieces of 3D model data.
- RGB888
- An RGB color value encoded as 8 bits per channel.
- RGBA
- An RGB color value together with an alpha channel, typically held in bitmap images or intermediates in shading calculations.
- RGBA888
- An RGBA color value encoded as 8 bits per channel.
- RGB color value
- A 3D vector describing a color using the RGB color model; may use fixed-point or floating-point representations.
- RGB image
- A bitmap image holding RGB color values in 3 image channels
- Rounding radius
- A value used in smoothing the corners of a geometric figure such as a 2D polygon or 3D polygon mesh.[27]
S
- Scene graph
- Data structure commonly used to represent a 3D scene to be rendered as a directed acyclic graph.
- Screen space
- The coordinate space of the resulting 2D image during 3D rendering. The result of 3D projection on geometry in camera space.
- Screen space ambient occlusion (SSAO)
- Technique of approximating ambient occlusion in screen space.
- Screen space directional occlusion
- An enhancement of screen space ambient occlusion (SSAO) taking direction into account to sample the ambient light, to better approximate global illumination.[28]
- Shader
- A subroutine written in a shading language describing: vertex transformations, skinning, and possibly vertex lighting (in vertex shaders); shading calculations (in pixel shaders); control over tessellation (tessellation shaders); or general purpose computation.
- Shading calculation
- Surface lighting and texture blending operations, e.g. specularity, bump mapping, etc.
- Shadow buffer
- A synonym for shadow map.
- Shadow map
- A texture buffer holding depth values rendered in a separate render pass from the perspective of a lightsource, used in Shadow mapping; it is typically projected onto other geometry in the main rendering pass to determine which surface points are in shadow.[29]
- Shadow volume
- One of the techniques of adding shadows to 3D scenes.
- Signed triangle area
- Found using half the Z component of the cross product of a pair of screen-space triangle edge vectors, useful for backface culling and computing parameter gradients in triangle rasterization.
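A sketch of the signed-area computation from two screen-space edge vectors (illustrative):

    struct Vec2 { float x, y; };

    // Signed area of a screen-space triangle (a, b, c); half the Z component
    // of the cross product of edge vectors (b - a) and (c - a).
    // Positive for counter-clockwise winding, negative for clockwise.
    float signedArea(Vec2 a, Vec2 b, Vec2 c) {
        float ex0 = b.x - a.x, ey0 = b.y - a.y;   // first edge vector
        float ex1 = c.x - a.x, ey1 = c.y - a.y;   // second edge vector
        return 0.5f * (ex0 * ey1 - ey0 * ex1);
    }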
- Skybox
- Method of creating background for a 3D scene by enclosing it in a textured cuboid (or another environment map).[3]: 547
- Sliverous triangle
- Sliver triangle
- A triangle with one or two extremely acute angles, hence a long/thin shape, which has undesirable properties during some interpolation or rasterization processes.[30]
- Software renderer
- Rendering software that doesn't use specialized hardware (a GPU) for its computations, i.e. only uses CPU for rendering.
- Sparse texture
- A texture that can partially reside in the video memory to reduce video memory usage and loading time.
- Spatial hashing
- A form of hashing to accelerate spatial testing e.g. for AI, collision detection, typically using a grid cell index as a key.
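A sketch of spatial hashing: a world position is quantised to a grid cell index, which is hashed into a bucket key; the cell size and hash constants are illustrative choices, not requirements:

    #include <cstdint>
    #include <cmath>

    // Quantises a world-space position to integer grid cell coordinates.
    struct CellIndex { int x, y, z; };

    CellIndex toCell(float px, float py, float pz, float cellSize) {
        return { static_cast<int>(std::floor(px / cellSize)),
                 static_cast<int>(std::floor(py / cellSize)),
                 static_cast<int>(std::floor(pz / cellSize)) };
    }

    // Hashes a cell index into a bucket key; the large primes are a common
    // choice for spatial hashes, not a requirement.
    std::uint32_t hashCell(CellIndex c, std::uint32_t numBuckets) {
        std::uint32_t h = static_cast<std::uint32_t>(c.x) * 73856093u
                        ^ static_cast<std::uint32_t>(c.y) * 19349663u
                        ^ static_cast<std::uint32_t>(c.z) * 83492791u;
        return h % numBuckets;
    }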
- Specular exponent
- Controls the glossiness in the Phong shading model.
- Specular highlights
- In shading, specular highlight is a bright highlight caused by specular reflections, more prominent on metallic surfaces. These highlights depend on the viewer's position as well as the position of the light source and surface normal.
- Spline
- A curve defined by polynomial interpolation through control points.
- Sprite
- 2D image moving on the screen, with potential partial transparency and/or animation.[3]: 550
- State changes
- The passing of changes in render states in a graphics pipeline, incurring a performance overhead. This overhead is typically minimised by scene sorting.
- Stencil buffer
- A buffer storing an integer value for each screen pixel, used e.g. to mask out specific operations and achieve specific effects.
- Stereo rendering
- Rendering the view twice separately for each eye in order to present depth.
- Surface normal vector
- In shading calculations, the normal to a 3D model surface, typically compared with the light and view vectors to compute the resulting visible colour. Also used for displacement mapping.
- Swizzled texture
- A texture map stored out of the natural pixel order; see swizzling (computer graphics). For example, it may be stored in Morton order, giving improved cache coherency for 2D memory access patterns.[31]
T
- Terrain rendering
- Rendering of landscapes, typically using heightmaps or voxels.
- Tessellation
- Converting a general 3D surface into a polygonal representation, important because graphics hardware is optimized for rendering polygons.[3]: 683
- Texel
- Texture element, a pixel of a texture.
- Texture cache
- A specialised read-only cache in a graphics processing unit for buffering texture map reads, accelerating texture sampling operations.
- Texture sampling
- The process of texture lookup with texture filtering. Performed by a texture sampling unit in a graphics processing unit
- Texture sampling unit
- A fixed-function unit performing texture sampling; also known as a texture mapping unit.
- Texture buffer
- A region of memory (or resource) used as both a render target and a texture map.
- Texture map
- A bitmap image/rendering resource used in texture mapping, applied to 3D models and indexed by UV mapping for 3D rendering.
- Texture space
- The coordinate space of a texture map, usually corresponding to UV coordinates in a 3D model. Used for some rendering algorithms such as texture space diffusion
- Transform feedback
- A feature of a rendering pipeline where transformed vertices may be written back to a buffer for later use (e.g. for re-use in additional render passes or subsequent rendering commands), e.g. caching the result of skeletal animation for use in shadow rendering.[32]
- Triangulation
- The process of turning arbitrary geometric models into triangle primitives, suitable for algorithms requiring triangle meshes
- Triangle primitive
- The most common rendering primitive defining triangle meshes, rendered by graphics processing units
- Triangle setup
- The process of ordering triangle primitive vertices, calculating signed triangle area and parameter gradients between vertex attributes as a prerequisite for rasterization.[33]
- Triangle setup unit
- A fixed-function unit in a GPU that performs triangle setup (and may perform backface culling), prior to actual rasterization.[33]
- Trilinear filtering
- Extension of bilinear filtering that additionally linearly interpolates between different mipmap levels of the texture, eliminating sharp transitions.
- Triple buffering
- Improvement of double buffering for extra performance by adding another back buffer.
- Tristrip
- A common rendering primitive defining a sequence of adjacent triangle primitives, where each triangle re-uses 2 vertices from the previous one.
- Trivial accept
- The process of accepting an entire rendering primitive, 3D model, or bounding volume contents without further tests for clipping or occlusion culling. The opposite of trivial rejection.
- Trivial rejection
- Rejecting a rendering primitive or 3D model based on a cheap calculation performed early in a graphics pipeline, (e.g. using outcodes in clipping). The opposite of trivial accept.
U
- Unified memory
- A memory architecture where the CPU and GPU share the same address space, and often the same physical memory. It is common in Intel[34][35] and AMD[36][37] processors with integrated graphics, SoCs and video game consoles. Supported on some discrete GPUs with the use of an MMU.
- UV coordinates
- Coordinates in texture space, assigned as vertex attributes and/or calculated in vertex shaders, used for texture lookup, defining the mapping from texture space to a 3D model surface or any rendering primitive.
- UV unwrapping
- The process of flattening a 3D model's surface into a flat 2D plane in a contiguous, spatially coherent manner for texture mapping.
V
- Vector graphics
- Graphics represented as a set of geometrical primitives.
- Vector maths library
- A library defining mathematical operations on vector spaces used in 3D graphics, concentrating on 3D and 4D vectors, and 4x4 matrices, often with optimised SIMD implementations.[38]
- Vertex buffer
- A rendering resource managed by a rendering API holding vertex data. May be connected by primitive indices to assemble rendering primitives such as triangle strips. Also known as a Vertex buffer object in OpenGL.
- Vertex cache
- A specialised read-only cache in a graphics processing unit for buffering indexed vertex buffer reads.
- Vertex shader
- Shader processing vertices of a 3D model.
- View transformation
- A matrix transforming world space coordinates into camera space.
- View vector
- In shading calculations, a 3D unit vector between the camera and the point of interest on a surface.
- View frustum
- A truncated pyramid enclosing the subset of 3D space that projects onto a 'viewport' (a rectangular region in screen space, usually the entire screen).
- Virtual reality
- Computer-rendered content that (unlike augmented reality) completely replaces the user's view of the real world.[3]: 915
- Volume texture
- A type of texture map with 3 dimensions.
- Voxel
- An extension of pixels into 3 dimensions.
- VSync
- Vertical synchronization, synchronizes the rendering rate with the monitor refresh rate in order to prevent displaying a partially updated frame buffer (tearing), which is disturbing especially with horizontal camera movement.
- Vulkan
- High-performance, low-level graphics API by Khronos Group.
W
- W buffering
- A depth buffer storing eye-space W (linear depth) values rather than post-projection Z, which has some advantages for interpolation and precision distribution.
- Weight map
- A set of Vertex attributes controlling deformation of a 3D model during skeletal animation. Per-vertex weights are assigned to control the influence of multiple bones (achieved by interpolating the transformations from each).[39]
- Window
- A rectangular region of a screen or bitmap image.
- Wireframe
- May refer to wireframe models or wireframe rendering.
- Wireframe rendering
- A rendering of a 3D model displaying only edge connectivity; used in 3D modelling applications for greater interactive speed, and clarity for mesh editing.
- World space
- The global coordinate system in a 3D scene, reached by applying a model transformation matrix from the objects' local coordinates.
Z
- Z buffer
- A 2D array holding depth values in screen space; a component of a framebuffer; used for hidden surface determination.
- Z order
- A Morton order space filling curve, useful for increasing cache coherency of spatial traversals.
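A sketch of computing a 2D Morton (Z-order) index by interleaving coordinate bits, so spatially nearby cells tend to be nearby in memory (illustrative):

    #include <cstdint>

    // Spreads the lower 16 bits of v so one zero bit separates each original bit.
    static std::uint32_t part1By1(std::uint32_t v) {
        v &= 0x0000FFFFu;
        v = (v | (v << 8)) & 0x00FF00FFu;
        v = (v | (v << 4)) & 0x0F0F0F0Fu;
        v = (v | (v << 2)) & 0x33333333u;
        v = (v | (v << 1)) & 0x55555555u;
        return v;
    }

    // Interleaves the bits of (x, y) into a single Morton code:
    // bit i of x goes to bit 2i, bit i of y goes to bit 2i+1.
    std::uint32_t mortonEncode2D(std::uint32_t x, std::uint32_t y) {
        return part1By1(x) | (part1By1(y) << 1);
    }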
- Z test culling
- A form of occlusion culling by testing bounding volumes against a Z buffer; may be performed by a graphics processing unit using occlusion queries.
References
[edit]- ^ "matrices for computer graphics" (PDF). Retrieved 6 August 2023.
- ^ "xbox360" (PDF).
- ^ a b c d e f g h i j k l m n o Akenine-Möller, Tomas; Haines, Eric; Hoffman, Naty (2018). Real-Time Rendering (Fourth ed.). CRC Press, Taylor & Francis. ISBN 978-1-1386-2700-0.
- ^ "Introduction To Textures in Direct3D 11 - Win32 apps". learn.microsoft.com. 23 August 2019.
- ^ "AZDO" (PDF). Retrieved 6 August 2023.
- ^ "blender manual – baking". Archived from the original on 27 November 2015. Retrieved 6 August 2023.
- ^ Heckbert, Paul; Hanrahan, Pat (1984). "Beam Tracing Polygonal Objects". Proceedings of the 11th annual conference on Computer graphics and interactive techniques - SIGGRAPH '84. pp. 119–127. doi:10.1145/800031.808588. ISBN 0897911385. S2CID 2845686.
- ^ Nicodemus, Fred (1965). "Directional reflectance and emissivity of an opaque surface". Applied Optics. 4 (7): 767–775. Bibcode:1965ApOpt...4..767N. doi:10.1364/AO.4.000767.
- ^ Cozzi, Patrick; Riccio, Christophe (2012). OpenGL Insights. United States: CRC Press. p. 133. ISBN 9781439893760. Retrieved 27 August 2016.
- ^ Amanatides, John (January 1984). "Ray tracing with cones". ACM SIGGRAPH Computer Graphics. 18 (3): 129–135. doi:10.1145/964965.808589.
- ^ "sgi tristrip joining with degenerates".[permanent dead link ]
- ^ "gpu gems displacement mapping". Archived from the original on 25 May 2016. Retrieved 7 June 2016.
- ^ "Forward+ Rendering". 4 September 2015.
- ^ "Learn OpenGL, extensive tutorial resource for learning Modern OpenGL". learnopengl.com.
- ^ "Chapter 21. True Impostors". NVIDIA Developer.
- ^ "Incremental error landscape rendering - "voxel space"". 30 April 2020.
- ^ "Light Probes: Introduction". Blender Manual. Retrieved 25 April 2020.
- ^ Ahearn, Luke (2017). 3D Game Environments: Create Professional 3D Game Worlds (second ed.). CRC Press, Taylor & Francis. ISBN 978-1-138-92002-6.
- ^ "Physically Based Rendering: From Theory to Implementation". www.pbrt.org.
- ^ "Geometric Modeling: Digital Representation and Analysis of Shapes" (PDF). Archived (PDF) from the original on 19 August 2016. Retrieved 7 June 2016.
- ^ "Post Process Effects in Unreal Engine". docs.unrealengine.com. Epic Games.
- ^ "quaternions for rotations" (PDF). Archived from the original (PDF) on 7 October 2016. Retrieved 10 June 2016.
- ^ Gomes, Jonas; Velho, Luiz; Mario, Costa Sousa (2012). Computer Graphics, Theory and Practice. CRC Press. ISBN 978-1-4398-6557-6.
- ^ Driemeyer, Thomas (2001). Rendering with mental ray.
- ^ "Render Surface Map". help.autodesk.com.
- ^ "Softimage User Guide". download.autodesk.com.
- ^ "RoundingRadius—Wolfram Language Documentation". reference.wolfram.com.
- ^ "Max-Planck-Institut für Informatik: Data Protection" (PDF). people.mpi-inf.mpg.de.
- ^ "shadow mapping sigraph - Google Search". www.google.co.uk.
- ^ "cleanup of sliver triangles" (PDF). Retrieved 6 August 2023.
- ^ "Optimizing Memory Access on GPUs using Morton Order Indexing" (PDF). Archived from the original (PDF) on 15 August 2020. Retrieved 10 June 2016.
- ^ "OpenGL - Transform Feedback". open.gl.
- ^ a b "triangle setup". Retrieved 6 August 2023.
- ^ "Frequently Asked Questions for Intel® Graphics Memory on".
- ^ "The Compute Architecture of Intel® Processor Graphics Gen9" (PDF). Intel®. Retrieved 6 August 2023.
- ^ "Configuring UMA Frame Buffer Size on Desktop Systems with Integrated Graphics". AMD. 31 March 2021. Retrieved 6 August 2023.
- ^ "AMD Kaveri Review: A8-7600 and A10-7850K Tested".
- ^ "Sony open sources Vector Math and SIMD math libraries (Cell PPU/SPU/other platforms)". Beyond3D Forum. Archived from the original on 24 June 2016. Retrieved 3 November 2018.
- ^ "weight maps". Archived from the original on 31 May 2016. Retrieved 10 June 2016.