It’s a dull old day: drizzle, heavy cloud cover. Getting late in the year too, so the sun isn’t very strong even when it is shining. On a positive note, my Camellia sasanqua ‘Navajo’ is starting to flower; I’d better get some pictures of it. I only have 171 already; I must need some more.
In very short order, I add 47 more images to the mix. I take pictures using my iPhone, both with the native app and with an app called Camera+ 2, then for good measure a few more with my Canon DSLR. Here are the images:
The iPhone’s native camera app allows a few tweaks, but they come under the heading “creative” rather than “control”. Basically it’s a point and shoot. Having pointed and shot, you can immediately check whether the image is a good likeness of the subject, but if it isn’t, there’s not a lot you could have done about it before taking the picture. You can of course edit it to your heart’s content afterwards.
As with all digital cameras, there are two parts to getting the picture. First the hardware: the lens focuses the light reflected off the subject onto a sensor, which records it as digital data. This data is then processed by software to produce an image that represents, as nearly as possible, what you were seeing as your subject.
Your eye and brain do something similar. Your eye collects the data and sends it to your brain, which attempts to make sense of it. Ask me to describe what I see when I look at Camellia ‘Navajo’ and I would say I see a white flower grading to a pink edge.
What makes it all a bit less simple than that is the light itself. I can look at the flower first thing in the morning, at noon or at sunset. I can look at it in sunlight, cloud or shade. I can look at it when light levels are high or when it is going dark. ‘Navajo’ is still a white flower with a pink edge. That’s partly because I already know that’s what colour it is, and partly because my brain sets great store by how colours look in relation to each other, rather than how they look in absolute terms.
We don’t register changes in the colour of light until it gets extreme, such as at sunrise or sunset, or when the light turns weird under unusual cloud conditions.
If you take a picture of a white wall in sunshine, part of which is in shade, you might expect that the part in shade simply reflected less light and looked less bright. Equalise the luminosity, though, and you would find that the shaded area was in fact much more blue. The same is true of Camellia ‘Navajo’. In sun it is white with a pink edge; in shade it is pale blue with a pinky-mauve edge. My brain makes an automatic adjustment, comparing it to everything around it, and concludes it is white with a pink edge. The hardware end of the camera records pale blue and mauve; then it is a case of how well the software does its job of pretending to be a human brain.
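To make that concrete, here is a toy numeric sketch. The RGB triples are invented for illustration, not measured from any real wall: lift the shaded patch to the same brightness as the sunlit one and the blue cast becomes obvious.

```python
# Toy illustration of the blue cast in shade. The RGB values below are
# invented for the sake of the example, not measured.

def scale_to_luminance(rgb, target):
    """Scale an RGB triple so its channel average (a crude luminance) hits target."""
    lum = sum(rgb) / 3
    return tuple(c * target / lum for c in rgb)

sunlit = (240, 238, 235)   # near-neutral white: channels roughly equal
shaded = (95, 105, 130)    # dimmer, and the blue channel dominates

# Brighten the shaded patch to match the sunlit patch's luminance...
lifted = scale_to_luminance(shaded, sum(sunlit) / 3)
print([round(c) for c in lifted])   # roughly [205, 227, 281]: still distinctly blue
```

Equal brightness, but the blue channel stays well ahead of the red: the shade hasn’t just dimmed the wall, it has changed its colour.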


And so to ‘Navajo’. The auto setting on the phone makes for an acceptable picture. Compare the first two images and you will see that one is bluer than the other. The auto setting has recorded an accurate representation of what the camera had in front of it. The day was dull, the light cool and slightly bluish. By using the cloudy setting, the software has “corrected” the colour of the image to what it would have looked like in “better” light; it’s what it would have looked like on a sunny or cloudy but bright day. Each image, in isolation, would be an acceptable pictorial record of the variety.
The Canon is a different kettle of fish. It can be set to auto, but mine never is. You can choose the “white balance” by selecting daylight, shade, cloudy, tungsten light, fluorescent light or flash. The software in the camera then adjusts the colour to what it would have looked like in normal daylight.
I have it set to record a jpeg image and a RAW image. In the jpeg image, whatever pre-set choices I have made are applied and the image is compressed to make a smaller file for storing. The RAW file is what the sensor recorded before any processing took place. Once the image is copied to my computer I can tweak it to my heart’s content, changing colour temperature, exposure, contrast, sharpness and much more. The Camera+ 2 app for the iPhone gives me much of the same control on the phone.
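As a sketch of the kind of correction the camera’s software performs, here is a minimal “gray-world” white balance. The gray-world assumption (that the whole scene should average out to neutral grey) is a classic textbook heuristic I am using as a stand-in; real camera pipelines are far more sophisticated, and the pixel values are invented.

```python
# Minimal gray-world white balance: scale each channel so the scene's
# average colour comes out neutral grey, pulling a blue cast back toward
# white. This heuristic is a stand-in for illustration, not what any
# particular camera actually runs.

def gray_world_balance(pixels):
    """pixels: a list of (r, g, b) tuples. Returns colour-balanced pixels."""
    n = len(pixels)
    avg = [sum(p[i] for p in pixels) / n for i in range(3)]   # per-channel mean
    grey = sum(avg) / 3                                       # target neutral level
    gains = [grey / a for a in avg]                           # per-channel correction
    return [tuple(c * g for c, g in zip(p, gains)) for p in pixels]

# A bluish "taken in the shade" scene: blue elevated in every pixel.
scene = [(90, 100, 130), (180, 190, 230), (60, 70, 95)]
balanced = gray_world_balance(scene)
```

After balancing, the three channel averages come out equal, so the overall blue cast is gone, much as the “cloudy” or “shade” preset nudges the whole image warmer.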



Put the flower back into context amongst its foliage and look at each version in isolation, and all three are acceptable in the sense that they are believable. They’re believable because depending on the light, each of them could be absolutely accurate. In the particular case of Camellia sasanqua ‘Navajo’, it is winter flowering and quite likely to be growing in shade, so most people’s experience of it would be as the colder, bluer version. That doesn’t make it right, or for that matter, wrong.
It does mean that if you want to show a flower’s colour accurately, it is probably best to include at least one shot taken from a bit further back. Then the brain can get to work and judge it against everything else in the picture, and the camera’s automatic processing is less likely to go awry when confronted with a big block of colour it can’t accurately evaluate. Compared to us humans, cameras really do have very, very small brains.
Accurate colour reproduction is a challenge. The colours of gardens and plants shown on television are so very often inaccurate, over-saturated and so on.
I don’t watch much television and no gardening programs. Is the problem that accuracy would make the gardening program seem dull compared to what came before and after?
I imagine that something of that order runs through the producer’s/director’s head as they make everything so very bright and colourful but not true to reality at all. Grass, especially, can be very glaring at times.
‘Navajo’ was one of the more popular Camellia sasanqua that we grew, but I never put so much effort into photographing it for the catalogue. I thought that it was relatively easy. Some of the rhododendrons were much more difficult, and of course, some are variable from year to year, and in different regions. It must be very difficult to represent hydrangeas accurately.
Customers are not inclined to accept an explanation that the picture is of the right plant but that the plant is variable according to growing conditions, season etc.
Oh, I SO know that one. What is annoying is that I see others marketing flowers with obviously fake colour or ‘enhanced’ profusion of bloom, and getting away with it. And then there is this:
https://tonytomeo.com/2018/07/11/horridculture-rose-colored-glasses/
Wow! Those Pampas beat anything I’ve seen here. As for the rose seeds, my pet niggle is seeds of variegated plants like Phormiums.
Oh of course! They are quite common on eBay.