What is Gamma and How is it Used in Photography?

Gamma is a nonlinear operation used to code and decode brightness values in still and moving imagery. It is used to define how the numerical value of a pixel relates to its actual brightness.
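The relationship is typically a simple power law. Below is a minimal sketch, assuming a plain power-law transfer curve with gamma = 2.2 (a common approximation; real standards such as sRGB add a small linear segment near black). The function names and the 0.18 "mid-grey" sample value are illustrative, not from the article.

```python
def gamma_encode(linear, gamma=2.2):
    """Map a linear light value in [0, 1] to an encoded pixel value."""
    return linear ** (1.0 / gamma)

def gamma_decode(encoded, gamma=2.2):
    """Map an encoded pixel value in [0, 1] back to linear light."""
    return encoded ** gamma

# Mid-grey (about 18% of full brightness in linear light) receives a much
# larger encoded value, which is why gamma encoding preserves shadow detail
# when an image is stored in an 8-bit file.
mid_encoded = gamma_encode(0.18)
mid_roundtrip = gamma_decode(mid_encoded)
```

Note how encoding and decoding are inverses: applying both in sequence returns the original linear value.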

While gamma is a deep topic that is hard to master in its entirety, digital photographers should understand the basics of how it applies to their images, because gamma has a large effect on how a digital image looks on a computer screen.

Understanding Gamma in Photography

In photography, gamma matters most when we view images on computer monitors. The concept is important to grasp (even just on the surface) because the goal is to make a digital image that looks as good as possible on calibrated and uncalibrated monitors alike.

There are three types of gamma involved in digital images:

  • Image Gamma – Used by the camera or RAW conversion software to encode the image into a compressed file (JPEG or TIFF).
  • Display Gamma – Used by computer monitors and video cards to adjust the output of an image. A higher display gamma produces images that appear darker and higher in contrast.
  • System Gamma – Also called ‘viewing gamma,’ this is the net effect of all gamma values applied to the image: essentially, the image and display gammas combined. For example, the same image viewed on a monitor with a different display gamma will not look the same because the resulting ‘viewing gamma’ is different.
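The way these gammas combine can be sketched in a few lines. Because each stage is a power law, the exponents multiply, so the system (viewing) gamma is simply the image gamma times the display gamma. The stage values below (1/2.2 and 2.2) are typical illustrative numbers, not figures from the article.

```python
def apply_stage(value, gamma):
    """Apply one power-law gamma stage to a normalized value in [0, 1]."""
    return value ** gamma

image_gamma = 1 / 2.2    # camera/RAW converter encodes with ~1/2.2
display_gamma = 2.2      # monitor decodes with ~2.2

# Push a mid-tone through both stages in sequence.
encoded = apply_stage(0.5, image_gamma)
shown = apply_stage(encoded, display_gamma)

# The exponents multiply: 1/2.2 * 2.2 = 1.0, so tones come out as they went in.
system_gamma = image_gamma * display_gamma
```

A system gamma of 1.0 means the displayed brightness tracks the scene linearly; on a monitor with a different display gamma, the product changes and the same file looks lighter or darker.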
