Truecolor (known as Millions by Apple Computer) graphics is a method of storing image information in a computer's memory such that each pixel is represented by three or more bytes.
Generally one byte is used for each channel, with the fourth byte (if present) used either for alpha channel data or simply ignored. Byte order is usually either RGB or BGR. However, systems do exist with more than 8 bits per channel, and these are often also referred to as truecolor (for example, a 48-bit truecolor scanner).
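The byte layout described above can be sketched as follows. This is an illustrative helper (the function name and parameters are assumptions, not a real API), showing how one pixel's 8-bit channels might be packed into bytes in either RGB or BGR order, with an optional fourth alpha byte:

```python
# Sketch of truecolor pixel packing: one byte per channel,
# in a configurable byte order, with an optional alpha byte.
def pack_pixel(r, g, b, a=None, order="RGB"):
    channels = {"R": r, "G": g, "B": b}
    data = bytes(channels[c] for c in order)
    if a is not None:
        data += bytes([a])  # fourth byte: alpha channel (or padding)
    return data

pack_pixel(255, 128, 0)               # b'\xff\x80\x00'  (RGB order)
pack_pixel(255, 128, 0, order="BGR")  # b'\x00\x80\xff'  (BGR order)
```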
One byte (eight bits) per channel gives 256 (2⁸) intensities for each channel, which yields 16,777,216 colors per pixel (often approximated as 16 million, despite being closer to 17 million). The human eye is popularly believed to be capable of discriminating between as many as 10 million colors.
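The arithmetic behind that figure is simply the per-channel level count raised to the number of color channels:

```python
# 8 bits per channel gives 2**8 = 256 intensity levels;
# three independent channels give 256**3 distinct colors.
intensities = 2 ** 8
colors = intensities ** 3
print(colors)  # 16777216
```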
While an alpha channel is meaningless in a display buffer, 32-bit truecolor has become popular on the computer desktop because it simplifies drawing of translucent images on the screen (and is often a requirement for hardware acceleration of such drawing), allowing desktop environments to more easily provide effects such as translucent windows, fading menus, and shadows.
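The translucency effects mentioned above rest on alpha compositing. A minimal sketch of a per-channel "source over" blend with a non-premultiplied 8-bit alpha value (the function name is an assumption for illustration):

```python
def blend_over(src, dst, alpha):
    """Source-over blend of one 8-bit channel; alpha in [0, 255].

    alpha=255 gives the source value, alpha=0 the destination.
    """
    return (src * alpha + dst * (255 - alpha)) // 255

blend_over(255, 0, 128)  # white at ~50% opacity over black -> 128
```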
While the above explanation is more or less from a Microsoft point of view (as Windows is the most common OS and it refers to the 24-bit-per-pixel color mode as truecolor), truecolor can also refer to a display mode that does not need a Color Look-Up Table (CLUT). In this sense, truecolor can be used with any color depth (e.g. 8-bit, 16-bit, 24-bit), as long as no CLUT is involved.
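The distinction can be illustrated with 8-bit pixels: in an indexed mode the pixel value is a CLUT index, whereas in an 8-bit direct-color mode the bits themselves encode the channels (a common layout is 3-3-2 RRRGGGBB, assumed here for illustration; the palette values are made up):

```python
# Indexed color: the pixel value selects an entry in a palette (CLUT).
clut = {0: (0, 0, 0), 1: (255, 0, 0), 2: (0, 0, 255)}

def indexed_to_rgb(pixel):
    return clut[pixel]

# 8-bit direct color (truecolor sense): bits RRRGGGBB, scaled to 8 bits
# per channel, so no lookup table is needed.
def rgb332_to_rgb888(pixel):
    r = (pixel >> 5) & 0x7
    g = (pixel >> 2) & 0x7
    b = pixel & 0x3
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

rgb332_to_rgb888(0xFF)  # (255, 255, 255)
rgb332_to_rgb888(0x00)  # (0, 0, 0)
```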