See the colours.

It’s a dull old day, drizzle, heavy cloud cover. Getting late in the year too, so the sun isn’t very strong even when it’s shining. On a positive note, my Camellia sasanqua ‘Navajo’ is starting to flower; I’d better get some pictures of it; I only have 171 already, I must need some more.

In very short order, I add 47 more images to the mix. I take pictures using my iPhone, both with the native app and with an app called Camera+ 2, then for good measure a few more with my Canon DSLR. Here are the images:

The iPhone native camera app allows a few tweaks but they come under the heading “creative” rather than “control”. Basically it’s a point-and-shoot. Having pointed and shot, you can immediately check whether the image is a good likeness of the subject or not, but if it isn’t, there’s not a lot you can do about it prior to taking the picture. You can of course edit it to your heart’s content afterwards.
As with all digital cameras, there are two parts to getting the picture. First the hardware: the lens focuses the light reflected off the subject onto a sensor, which records it as digital data. That data is then processed by software to produce an image that represents, as nearly as possible, what you were seeing as your subject.
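For the technically minded, here is a toy sketch of that software stage. Everything in it is invented for illustration, the gain numbers especially, but the shape of the job is roughly right: scale each colour channel to correct the cast, then apply a gamma curve so the result looks right on a screen.

```python
import numpy as np

def develop(sensor_data, wb_gains=(2.0, 1.0, 1.5), gamma=2.2):
    """Toy 'develop' step: raw sensor values in, viewable image out.

    sensor_data: float array of shape (height, width, 3), values 0-1.
    wb_gains are illustrative white-balance multipliers for R, G and B;
    a real camera derives them from the scene or a preset like 'cloudy'.
    """
    balanced = sensor_data * np.array(wb_gains)   # scale the colour channels
    balanced = np.clip(balanced, 0.0, 1.0)        # keep values in range
    return balanced ** (1.0 / gamma)              # gamma curve lifts midtones

# One 'pale blue' pixel of the sort shade might produce (R, G, B in 0-1):
pixel = np.array([[[0.30, 0.38, 0.45]]])
print(develop(pixel))
```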

Your eye and brain do something similar. Your eye collects the data and sends it to your brain, which attempts to make sense of it. Ask me to describe what I see when I look at Camellia ‘Navajo’ and I would say I see a white flower grading to a pink edge.

What makes it all a bit less simple than that is the light itself. I can look at the flower first thing in the morning, at noon or at sunset. I can look at it in sunlight, cloud or in shade. I can look at it when light levels are high or when it is going dark. ‘Navajo’ is still a white flower with a pink edge. That’s partly because I already know that’s what colour it is and partly because my brain puts great store by how colours look in relation to each other, rather than in how they look in absolute terms.
We don’t register changes in the colour of light until it gets extreme, such as at sunrise or sunset, or when the light turns weird under unusual cloud conditions.

If you take a picture of a white wall in sunshine, part of which is in shade, you might expect that the part in shade simply reflected less light and looked less bright. Level up the luminosity, though, and you would find that the shaded area was in fact much more blue. The same is true of Camellia ‘Navajo’. In sun it is white with a pink edge; in shade it is pale blue with a pinky-mauve edge. My brain makes an automatic adjustment, comparing it to everything around it, and concludes it is white with a pink edge. The hardware end of the camera records pale blue and mauve; then it is a case of how well the software does its job of pretending to be a human brain.

I took a picture which included part of a neighbour’s white painted wall. In the top right is a shadow from the roof. Square 2 is sampled from the area in sun, square 3 from the area in shadow. Square 4 is square 3 adjusted to the same luminosity as square 2. It is not just darker, it is bluer. The red car has hugely varying colours in the picture but your brain makes sense of it and tells you what colour the car is. You don’t think it has a light bonnet and dark sides.
Colours sampled from the car. None is an accurate reflection of the colour your brain tells you it is.
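If you fancy checking that wall arithmetic for yourself, it is simple enough to do. The RGB numbers below are invented rather than sampled from my picture, but the pattern is the real one: level up the luminosity of the shadow sample and its blue channel shoots up.

```python
# Invented RGB samples from a white wall, not measured from my photo (0-255).
sun   = (235, 230, 222)   # square 2: wall in sun
shade = (120, 128, 148)   # square 3: wall in shadow

def luminance(rgb):
    # Standard Rec. 709 luma weights for R, G and B.
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Square 4: the shadow sample scaled up to the sunlit sample's luminance.
scale = luminance(sun) / luminance(shade)
print(tuple(round(c * scale) for c in shade))
# (216, 231, 267): blue ends up higher than in the sunlit white.
# The shadow isn't just darker, it is genuinely bluer.
```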

And so to ‘Navajo’. The auto setting on the phone makes for an acceptable picture. Compare the first two images and you will see that one is bluer than the other. The auto setting has recorded an accurate representation of what the camera had in front of it. The day was dull, the light cool and slightly bluish. By using the cloudy setting, the software has “corrected” the colour of the image to what it would have looked like in “better” light, which is to say on a sunny day, or a cloudy but bright one. Each image, in isolation, would be an acceptable pictorial record of the variety.

The Canon is a different kettle of fish. It can be set to auto but mine never is. You can choose “white balance” by selecting daylight, shade, cloudy, tungsten light, fluorescent light or flash. The software in the camera then adjusts the colour to what it would have looked like if taken in normal daylight.
I have it set to record a JPEG image and a RAW image. In the JPEG image whatever pre-set choices I have made are applied and the image is compressed to make a smaller file for storing. The RAW file is what the sensor recorded before any processing took place. Once the image is copied to my computer I can tweak it to my heart’s content, changing colour temperature, exposure, contrast, sharpness and much more. The Camera+ app for the iPhone gives me much of the same control on the phone.
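Should you want to do that sort of tweaking programmatically, there are libraries for it. Here is a minimal sketch using the Python rawpy library; the filename is made up, and treat the multiplier values as guesses for illustration, not a recipe.

```python
import rawpy              # Python wrapper around LibRaw: pip install rawpy
import imageio.v3 as iio

# 'navajo.CR2' is an invented filename standing in for a Canon RAW file.
with rawpy.imread("navajo.CR2") as raw:
    # Develop using the white balance the camera chose at the time...
    as_shot = raw.postprocess(use_camera_wb=True)
    # ...or override it: user_wb is four channel multipliers (R, G, B, G).
    # Raising red relative to blue warms the image, which is all the
    # colour temperature slider is really doing.
    warmed = raw.postprocess(user_wb=[2.2, 1.0, 1.4, 1.0])

iio.imwrite("navajo_as_shot.jpg", as_shot)
iio.imwrite("navajo_warmed.jpg", warmed)
```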

Here is the original image, then the same image brightened a bit, contrast increased, auto gamma adjust applied, and colour temperature raised from 5200K to 5800K.
Here is the image as taken, then adjusted to how it might have looked in shade or on an even duller day, and finally adjusted to a warmish sunny day, late afternoon perhaps. As well as the overall effect, compare the hue of the pink parts of the flower.
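At heart, those warm and cool versions are just the red and blue channels nudged in opposite directions. A crude sketch, with made-up gain values chosen only to mimic that shade/as-shot/sunny trio:

```python
import numpy as np
from PIL import Image

def shift_temperature(img, red_gain, blue_gain):
    """Warm the image (red_gain > 1, blue_gain < 1) or cool it (reverse)."""
    arr = np.asarray(img, dtype=np.float32)
    arr[..., 0] *= red_gain                    # red channel
    arr[..., 2] *= blue_gain                   # blue channel
    return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))

img = Image.open("navajo.jpg").convert("RGB")  # another invented filename
cooler = shift_temperature(img, 0.92, 1.10)    # towards 'in shade'
warmer = shift_temperature(img, 1.10, 0.90)    # towards 'late afternoon sun'
```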
Sampling the same place from the middle and right hand images really brings home the difference the temperature of the light makes. You would be reluctant to accept that both could be right. Let’s have another look at the three versions.

Put the flower back into context amongst its foliage and look at each version in isolation, and all three are acceptable in the sense that they are believable. They’re believable because depending on the light, each of them could be absolutely accurate. In the particular case of Camellia sasanqua ‘Navajo’, it is winter-flowering and quite likely to be growing in shade, so most people’s experience of it would be as the colder, bluer version. That doesn’t make it right, or for that matter, wrong.

It does mean that if you want to show what a flower colour is accurately, it is probably best to include at least one shot taken from a bit further back. Then the brain can get to work and judge it against everything else in the picture, plus the automatic processing that the camera does is less likely to have gone awry because of being confronted with a big block of colour that it can’t accurately evaluate. Compared to us humans, cameras really do have very, very small brains.

8 thoughts on “See the colours.”

  1. Accurate colour reproduction is a challenge. The colours of gardens and plants when shown on television are so very often inaccurate, over-saturated etc.

      1. I imagine that something of that order runs through the producer’s/director’s head as they make everything so very bright and colourful but not true to reality at all. Grass, especially, can be very glaring at times.

  2. ‘Navajo’ was one of the more popular Camellia sasanqua that we grew, but I never put so much effort into photographing it for the catalogue. I thought that it was relatively easy. Some of the rhododendrons were much more difficult, and of course, some are variable from year to year, and in different regions. It must be very difficult to represent hydrangeas accurately.
