Australasian Science: Australia's authority on science since 1938

I’ve always wondered: why is a green screen green?

By Lincoln Turner, Researcher, Atomic, Molecular and Optical Physics, Monash University

Green screen technology has become a common feature of film and TV production. Vancouver Film School/Flickr, CC BY-SA

This is an article from I’ve Always Wondered, a series where readers send in questions they’d like an expert to answer.

I’ve always wondered why is a green screen green in TV and film making, as opposed to blue or white or beige? – Misha from Brunswick East (The Conversation’s Editor)

If you’ve ever watched a modern blockbuster film, then you’ve almost certainly seen the magic of green screen compositing – or chroma keying – in action. The technique enables film and TV producers to record actors in front of a plain green backdrop, then replace the backdrop with special effects.

Chroma key screens were originally blue, dating from 1940 when Larry Butler first used the technique on The Thief of Bagdad – which won him the Academy Award for special effects. Since then, green has become more common.

Why? The really short answer is that green screens are green because people are not green. In order for the effect to work, the background must use a colour that isn’t used elsewhere in the shot – and green is nothing like human skin tone. Of course, people wear green clothes, green jewellery and occasionally have green hair or green makeup, but all those things can be changed in a way that skin colour can’t be.

If you are lit by white light, from the sun or a bulb, the light hitting you contains the full visible spectrum of wavelengths. And human skin reflects broadly similar ratios of each colour of the spectrum. If we reflected one colour much more than the others, we’d appear to be a saturated colour.

We’re used to describing skin colour with colour-words, such as brown, pink, white, black or even yellow, but from a colour science perspective, we’re all orange.


The elements of colour

Colour is defined by our perception, not by physics. Humans have three types of colour-sensitive cells in the retinas of our eyes, which have different colour sensitivities. We can think of them as being “red”, “green” and “blue” sensors, although their sensitivities overlap considerably and are closer to yellow, blue-ish green and blue.

To fully describe a colour, it’s helpful to use three numbers. These could be the red, green and blue intensities (RGB), or the hue, saturation and value coordinates (HSV) shown below. “Hue” (H) corresponds closely to what we loosely call colour, “saturation” (S) to how rich a colour is, and “value” (V) loosely to its brightness. These three colour coordinates explain how we might describe a colour as a “dark grey green” or a “light rich blue”.

Figure 1: Representing colour as a hue, saturation and value (brightness) is closer to how we perceive colour, describe it and remember it.
Wikimedia, CC BY-SA
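The “we’re all orange” claim can be checked with Python’s standard-library colorsys module, which converts between RGB and HSV. The RGB triples below are illustrative stand-ins for skin tones, not measured data:

```python
import colorsys

# Illustrative skin-tone RGB values (0-1 range); these are rough examples,
# not measurements from any real dataset.
skin_tones = {
    "light": (0.95, 0.80, 0.68),
    "medium": (0.78, 0.57, 0.44),
    "dark": (0.45, 0.31, 0.24),
}

for name, (r, g, b) in skin_tones.items():
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # colorsys reports hue as a fraction of the colour circle;
    # multiply by 360 to get degrees (orange sits around 20-40 degrees).
    print(f"{name:>6}: hue={h * 360:5.1f} deg  sat={s:.2f}  value={v:.2f}")
```

Run this and the three tones span a wide range of value (brightness), yet every hue lands in the orange band of the circle – exactly the pattern described above.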


Human skin ranges in brightness (or “value” as it’s shown in the diagram above), but the hue and saturation don’t vary much at all. There are some good physiological reasons for this. In essence, our outer skin layer (epidermis) behaves optically as a neutrally coloured filter over our dermis, which is red largely due to the colour of blood that perfuses it.

Cameras mimic the human eye

Most still and video cameras work a little like our eyes, with a grid of sensors – or pixels – which detect red, green or blue.

But just as we perceive things as having a brightness and a colour, most video electronics and video recorders convert these inputs into separate brightness and colour information, called luminance (or luma) and chrominance (or chroma) in video jargon.

Figure 2. A full-colour image (right) can be decomposed into a luminance (brightness) component (left), which has no colour information, and a chrominance (colour) component (centre) which has no brightness information. The luminance image is what a black-and-white camera records.
Wikimedia, CC BY-SA

The luminance is basically the brightness, while the chrominance is the location in the hue/saturation colour circle.

When colour TV was introduced, sending the chroma component on a separate sub-channel allowed existing black-and-white TVs to receive the luma channel only and work with the new colour signal. Analogue TV is extinct, but digital TV and internet video still encode luma and chroma separately. This is partly for data compression reasons, but also because it is a more natural representation for correcting colour, and for playing video tricks with green screens.
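One common version of this luma/chroma split is the full-range BT.601 conversion (the one JPEG uses), sketched below. Luma is a weighted sum of the three primaries, and the two chroma components measure how far the colour sits from neutral grey:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert full-range 8-bit R'G'B' to Y'CbCr using BT.601 coefficients."""
    y = 0.299 * r + 0.587 * g + 0.114 * b             # luma: brightness only
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b  # blue-difference chroma
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return y, cb, cr

# A neutral grey has both chroma components at the 128 midpoint: no colour.
print(tuple(round(c, 3) for c in rgb_to_ycbcr(128, 128, 128)))  # (128.0, 128.0, 128.0)

# A pure green pixel is bright (high luma) but sits far from the chroma midpoint.
print(tuple(round(c, 1) for c in rgb_to_ycbcr(0, 255, 0)))
```

The black-and-white TVs mentioned above simply displayed the Y component and ignored the rest.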


How green screens work

The other name for a green screen – chroma key – gives away how it works. Video production equipment called a chroma keyer looks at the chrominance data.

Pixels that fall in a narrow pie-slice of the hue-saturation circle, centred on the green hue, are deemed to be the green screen. A video switch replaces them with pixels from the background video channel – for example, a weather map. Pixels with all other hues – orange (skin tones), red, yellow, magenta and blue – coming from the camera are let through.

The resulting video output is the weatherperson superposed in front of the weather map. It doesn’t matter at all if the background video has green in it, but if the person on camera is wearing any green, the background will be keyed through that area, and they will appear transparent!
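The pie-slice test described above can be sketched per pixel. The key hue of 120° (pure green) and the tolerance and saturation thresholds here are illustrative guesses, not values from any real keyer:

```python
import colorsys

def chroma_key_pixel(fg, bg, key_hue=120.0, tolerance=40.0, min_sat=0.3):
    """Replace a foreground pixel with the background pixel if it falls in
    the green 'pie slice' of the hue/saturation circle.

    fg and bg are (r, g, b) tuples with 0-255 components. The threshold
    values are illustrative, not taken from any production keyer.
    """
    r, g, b = (c / 255 for c in fg)
    h, s, _ = colorsys.rgb_to_hsv(r, g, b)
    hue_deg = h * 360
    # Key only saturated pixels near green. Green sits mid-circle, so this
    # sketch can ignore hue wrap-around at 0/360 degrees.
    if s >= min_sat and abs(hue_deg - key_hue) <= tolerance:
        return bg
    return fg

weather_map = (10, 60, 200)                            # stand-in background pixel
print(chroma_key_pixel((30, 200, 40), weather_map))    # green screen: replaced
print(chroma_key_pixel((230, 170, 120), weather_map))  # skin tone: kept
```

The saturation check matters: a pale, washed-out green (low saturation) stays in the shot, which is why green screens are lit to look as rich and even as possible.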

Blue screens work almost as well. Because green and blue are both well away from orange-red on the hue circle, both are suitable for chroma-keying people. If the green-skinned Kermit needed to be keyed on top of a background, a blue screen would be essential, whereas blue-suited Superman needs a green screen.

Film-based compositing methods preferred blue screens, due to the availability of blue-sensitive film stocks. Green screen works slightly better for video because common camera sensor designs have more green-sensitive pixels than red or blue ones. And blue-coloured clothes are harder to avoid than green ones.

All sorts of other colours have been used, including magenta, and even white screens lit with bright yellow sodium lamps used to superpose Mary Poppins over London. But as digital cameras take over feature film production, it’s increasingly easy being green.

The author does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Originally published in The Conversation.