Cloud iridescence

[Photo: IMG_1110_Iridescence_600.jpg]

Cloud iridescence. The brightest example of cloud iridescence I have ever seen – visible without polarizer or sunglasses for around an hour. The photograph is not enhanced in any way. The sky, alas, is underexposed and appears black; enough to make a person long for Kodachrome.

30 Comments

the demise of Kodachrome is lamented, but with any digital camera you can bracket the exposure and composite four or five exposures in Photoshop. Some cameras have an automatic bracket setting.

It works for any scene that isn’t moving.
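
(For the curious, the compositing doesn’t have to happen in Photoshop. Below is a minimal sketch using OpenCV’s Mertens exposure-fusion routine; the three file names are placeholders for a bracketed set.)

```python
# Fuse a bracketed exposure series into one image, no Photoshop required.
import cv2
import numpy as np

# Placeholders for an under-, normally, and overexposed frame of the same scene.
frames = [cv2.imread(f) for f in ("under.jpg", "normal.jpg", "over.jpg")]

merge = cv2.createMergeMertens()      # weights pixels by contrast, saturation, exposedness
fused = merge.process(frames)         # float32 output, roughly in [0, 1]

cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```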

… with any digital camera you can bracket the exposure and composite four or five exposures in Photoshop. Some cameras have an automatic bracket setting.

Yes, thanks, I certainly could have bracketed if I had wanted to - my wife just made a nice composite of one landscape I took at different exposures - but this was just a snapshot, and I didn’t give it a second thought. Next time!

Or if you have a digital SLR or high-end P&S, shoot in RAW format. Digital sensors have about twice the dynamic range of Kodachrome or any other slide film. But going straight to JPEG in-camera will compress that, often down to the chrome range if you select one of the “saturated” or “contrasty” modes. (Such settings only affect the in-camera RAW-to-JPEG conversion; if you shoot in RAW and postprocess on your computer, you’ll have the full range of information captured by the sensor to play with.)

I had no idea that JPEG reduced the dynamic range so.

Hewlett-Packard used to market a camera that did some kind of in-camera processing and had a remarkable dynamic range and also, I think, very good noise reduction. I do not know whether they still sell such a camera. I attended a seminar, quite a few years ago, and everyone was astonished (a) by the camera and (b) that they had never heard of it before.

I had no idea that JPEG reduced the dynamic range so.

Well, standard JPEG is only 8-bits per RGB channel, for starters. Then on top of that, typically (higher-end, at least) cameras provide a choice of conversion strategies. You want punch like Velvia? Use a “contrasty saturated” mode … and possibly lose dynamic range even beyond the inherent limitations of JPEG format. “possibly” because different manufacturers use different algorithms … anyone seriously curious should shoot an image of something with a lot of dynamic range in RAW, and then in various JPEG modes, and compare what you can tease out of the RAW image with the various JPEG representations.
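
For anyone who wants to run that experiment, here is a rough sketch of the comparison using the rawpy library (a LibRaw wrapper) and Pillow. The file names and the two-stops-below-clipping cutoff are my assumptions, not anything standard.

```python
# Compare how many distinct tonal levels survive in the shadows of a RAW
# render versus the corresponding in-camera JPEG.
import numpy as np
import rawpy
from PIL import Image

with rawpy.imread("scene.cr2") as raw:
    # Linear 16-bit render with no auto-brightening, so shadow codes are untouched.
    rgb16 = raw.postprocess(gamma=(1, 1), no_auto_bright=True, output_bps=16)

g16 = rgb16[:, :, 1]                                # green channel, 16-bit
g8 = np.asarray(Image.open("scene.jpg"))[:, :, 1]   # green channel, 8-bit

for name, img, full in (("RAW", g16, 65535), ("JPEG", g8, 255)):
    shadows = img[img < full // 4]                  # more than two stops below clipping
    print(name, len(np.unique(shadows)), "distinct levels in the shadows")
```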

Hewlett-Packard used to market a camera that did some kind of in-camera processing and had a remarkable dynamic range

Now it’s my turn to thank you, because your comment led me to Google, and Google led me to this page:

http://www.hpl.hp.com/techreports/2[…]2007-96.html

JPEG supports an extended mode that allows for 12 bits rather than 8 bits per RGB channel, which hugely increases the dynamic range that can be captured!

I did not know that; I’m only familiar with baseline (8 bits per channel) JPEG.

Interesting! That makes me wonder if the camera you describe was creating extended-mode JPEGs …

JPEG supports an extended mode that allows for 12 bits rather than 8 bits per RGB channel, which hugely increases the dynamic range that can be captured!

I hope I’m not the only one, but I do not understand why 12 bits results in more dynamic range than 8 bits. In one case, you take 0 to 1 V (say) and divide it into 2^12 levels, whereas in the other case you divide it into 2^8 levels. I thought the dynamic range was the same in both cases, but one merely has more levels.

Similarly, I wonder why JPEG necessarily reduces the dynamic range. Or does it?

I think the seminar I heard was essentially this. The camera must have been the HP Photosmart 945. I am afraid the paper is mostly Greek to me, but you can get what he is talking about by reading the figures and captions. I just checked the HP website, and I saw only a handful of small point-and-shoot cameras, so I assume they do not make the 945 any more.

Matt Young said:

JPEG supports an extended mode that allows for 12 bits rather than 8 bits per RGB channel, which hugely increases the dynamic range that can be captured!

I hope I’m not the only one, but I do not understand why 12 bits results in more dynamic range than 8 bits. In one case, you take 0 to 1 V (say) and divide it into 2^12 levels, whereas in the other case you divide it into 2^8 levels. I thought the dynamic range was the same in both cases, but one merely has more levels.

Similarly, I wonder why JPEG necessarily reduces the dynamic range. Or does it?

Back in the 1980s when I worked on CCD imagers at Kodak (this was before Kodak shot all its researchers and then tanked), we made a CCD imager with built-in dynamic range enhancement.

The procedure was to simply dump excess charge in any given overexposed pixel into the sensor substrate so that the lower light levels in other pixels could be raised to levels that would allow for conversion to more bits in the A-to-D converter.

This greatly boosted the signal-to-noise throughout the entire image while retaining the brightest parts of the image.

It worked like the eye does in effectively reducing the sensitivity of overexposed pixels while maintaining the sensitivity of pixels receiving low levels of illumination. It also eliminated the bright line that used to occur as a result of an overexposed pixel flooding all shift registers along its clock-out path.
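
(If it helps to see the idea in numbers, here is a toy simulation of that charge-dump scheme. Every number in it is invented for illustration and has nothing to do with Kodak’s actual design.)

```python
# Toy model: clip overexposed pixels at a "charge dump" threshold, then
# raise the gain so shadow detail lands on many more A/D codes.
import numpy as np

rng = np.random.default_rng(0)
scene = rng.uniform(0, 1, 10_000) ** 4 * 100_000     # electrons; a few very hot pixels
read_noise = rng.normal(0, 50, scene.size)

FULL_WELL = 20_000                                   # excess charge above this is dumped
signal = np.minimum(scene, FULL_WELL) + read_noise   # hot pixels clip instead of blooming

gain = 65_535 / FULL_WELL                            # ~3.3x, safe now that nothing exceeds the well
adc = np.clip(signal * gain, 0, 65_535).astype(np.uint16)

# Without the dump, gain would have to be 65_535 / scene.max() (~0.66x) to keep
# the hottest pixel on scale, wasting most A/D codes on empty headroom.
print("shadow pixels span", int(np.ptp(adc[scene < 1_000])), "A/D codes")
```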

Almost all scenes have a dynamic range that exceeds the range of the sensor. If you have 8 bits, and you set the highest intensity to the top level, the lowest level you can resolve is 1/256th of that. If you have 12 bits you can resolve 1/4096th, reaching deeper into the shadows: an improvement of 12 dB.

The kicker is that the 8 JPEG bits are *not* a linear transform of the 12 RAW bits, because of the gamma factor.
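
Here is a back-of-the-envelope version of both points as I understand them (the plain 1/2.2 gamma exponent is a simplification; real sRGB has a linear toe):

```python
# Linear case: the deepest resolvable shadow is one code above black, so going
# from 8 to 12 bits buys 10 * log10(4096 / 256) ≈ 12 dB of dynamic range.
import numpy as np

print(10 * np.log10(4096 / 256))   # ~12.04 dB, i.e. four more stops

# Gamma case: JPEG's 8 bits are spent on a nonlinear curve, so the darkest
# nonzero code sits far deeper in the shadows than 1/256 of full scale.
darkest_code = 1 / 255
linear_level = darkest_code ** 2.2
print(1 / linear_level)            # ~196,000:1 rather than 255:1
```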

Of course, if you think you can do better than the internal JPEG image, you can usually go RAW if you have a DSLR or high end P&S.

“Of course, if you think you can do better than the internal JPEG image, you can usually go RAW if you have a DSLR or high end P&S.”

As a DSLR user who routinely shoots RAW, in my experience the practical dynamic range difference between RAW and jpg is about two stops in the highlights. That is, in a RAW developer module, one can recover around two stops, maybe 2.5 stops of highlights, which in the jpg would simply be unrecoverable blown-out white.

The human eye can distinguish about 24 stops of dynamic range, the best digital cameras about 11-12 stops, slide film around 5 stops, print film around 7 stops. Some claim certain B&W film can get 15 stops.

The theoretical discussion about dynamic range, bit depth, shadow noise, etc., can get really esoteric (and is eminently avoidable), but the good news is that sensors, A/D converters, and software keep getting better and better.

Exactly, Gingerbaker.

But with the RAW developing you can often decide which is more important, the shadows or the highlights, and develop accordingly. The JPEG has that decision baked in.

And yes, we have more than enough controversy discussing creationism and intelligent design. No sense adding a RAW vs JPEG war in the discussion here! 8^)

Incredible image; like being inside a soap bubble. So sorry I couldn’t see it myself, but happy you were there to save it for others.

I’ve learned a few things here, but I’d like to add a quick discussion of bits and stops.

Back when Ansel Adams was the quintessential technical photographer, he created the concept of zones. As best I can understand, zones corresponded to f-stops. A one-zone difference meant the incoming light was more or less intense by a factor of two.

I believe black and white film could capture detail across nine zones, but you could reduce or extend this during development. Color film had less range, and little ability to modify it during development.

Daylight scenes, as Gingerbaker notes, can have 20 or more zones. That’s an intensity difference of 2^20. Or more. Eyes can alter their sensitivity from one part of a scene to another, allowing you to see detail in highlights and shadows at the same time.

You can emulate this in a digital camera by taking a number of exposures and manipulating the way you combine them. It’s a bit of witchcraft, but it’s not actually hard to learn.

Some cameras capture up to 14 bits or zones in a single RAW image. RAW image editors allow you easily to manipulate highlight and shadow detail without affecting the mid-tones.
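
To make that concrete, here is a minimal sketch of the kind of midtone-preserving curve such editors apply. The pivot points and blend widths are arbitrary choices of mine, not any particular editor’s algorithm.

```python
# Lift shadows and roll off highlights while pinning the mid-tones.
import numpy as np

def shadow_highlight(x, shadow_lift=0.3, highlight_cut=0.3):
    """Adjust normalized tones below 0.25 and above 0.75; leave mid-tones alone."""
    shadow_w = np.clip(1 - x / 0.25, 0, 1)        # 1 at black, fades to 0 by the mid-tones
    high_w = np.clip((x - 0.75) / 0.25, 0, 1)     # 0 until 0.75, rises to 1 at white
    lifted = x + shadow_lift * shadow_w * (0.25 - x)
    return lifted - highlight_cut * high_w * (lifted - 0.75)

tones = np.linspace(0, 1, 11)
print(np.round(shadow_highlight(tones), 3))       # 0.3 through 0.7 pass through unchanged
```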

Gingerbaker:

Some claim certain B&W film can get 15 stops.

midwifetoad:

Back when Ansel Adams was the quintessential technical photographer, he created the concept of zones

B&W print paper has about 7 stops of dynamic range; Ansel Adams’s Zone System is all about optimally mapping a scene’s dynamic range to the negative and then the print. Different developers can expand/shrink the dynamic range of the negative. Paper comes in various contrasts - low contrast paper will map a negative of higher dynamic range to the 7 or so stops available on the paper. Likewise, high contrast paper can expand (say) a negative showing fewer stops of dynamic range to the full range available on the paper. The Zone System involves the entire process, from metering (understanding the dynamic range of the scene you’re photographing) to exposure to the negative development process to the print process (including choice of contrast grade of paper and/or control via developer/developing time and temp choices).

Of course he also manipulated images via burning and dodging, but that’s separate from the Zone System process.

The idea is to utilize the entire dynamic range of the paper by mapping tones of the scene through the entire image making process in a way that matches your artistic vision of the scene.
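
(Reduced to arithmetic, the mapping works something like this sketch; the 10-stop scene and the resulting 0.7 slope are numbers I made up for illustration.)

```python
# A contrast grade is essentially a slope in log-exposure space: it decides
# how many stops of negative get squeezed into the paper's ~7 stops.
import numpy as np

scene_stops = 10.0                         # metered range of the scene (zones)
paper_stops = 7.0                          # roughly what B&W print paper can hold

grade_slope = paper_stops / scene_stops    # ~0.7: a "soft" grade compresses the range
zones = np.arange(0, 11)                   # Zone 0 (black) through Zone X (white)
print(np.round(zones * grade_slope, 1))    # where each scene zone lands on the paper
```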

It’s so complex that only a chemist concert pianist could’ve invented it!

Works a lot better for something static like Adams’s famed landscapes than (say) frenetic high-speed sports.

dhogaza said:

“B&W print paper has about 7 stops of dynamic range; Ansel Adams’s Zone System is all about optimally mapping a scene’s dynamic range to the negative and then the print. Different developers can expand/shrink the dynamic range of the negative. Paper comes in various contrasts - low contrast paper will map a negative of higher dynamic range to the 7 or so stops available on the paper. Likewise, high contrast paper can expand (say) a negative showing fewer stops of dynamic range to the full range available on the paper. The Zone System involves the entire process, from metering (understanding the dynamic range of the scene you’re photographing) to exposure to the negative development process to the print process (including choice of contrast grade of paper and/or control via developer/developing time and temp choices).”

Just read this ( http://theonlinephotographer.typepa[…]tml#comments ), which was so germane to your comment and our discussion that I felt compelled to post it. :)

See also http://en.wikipedia.org/wiki/Dodging_and_burning

I recall a discussion many years ago in Sky and Telescope magazine about using various exposures of negatives as dodging masks to increase the dynamic range of astronomical photos.

Some cameras capture up to 14 bits or zones in a single RAW image. RAW image editors allow you easily to manipulate highlight and shadow detail without affecting the mid-tones.

yup, that’s the way to go. If you let your camera do the processing and store the image as a jpg, you’ve tossed away almost half the information that was in the raw image of even a standard, entry level, modern digital camera.

The way a digital camera processes image data from the sensor to store as a jpg is pretty much exactly the same as the way any image editing software processes the raw data to save as a jpg.
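
For instance, here is roughly what that conversion looks like done by hand with the rawpy library (a LibRaw wrapper) and Pillow; the file name and the parameter choices are placeholders, not any camera’s actual settings.

```python
# Do the camera's RAW-to-JPEG conversion on the computer, with every choice exposed.
import rawpy
from PIL import Image

with rawpy.imread("IMG_0001.CR2") as raw:
    rgb = raw.postprocess(
        use_camera_wb=True,   # honor the white balance the camera metered
        bright=1.0,           # overall brightness, yours to choose
        output_bps=8,         # 8 bits per channel, same as the in-camera JPEG
    )

Image.fromarray(rgb).save("IMG_0001.jpg", quality=95)
```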

I look at it this way:

If you spent more on your computer than your camera, it’s a good bet your computer can do a much better job of processing the raw data than the camera can.

bottom line:

unless you’re shooting with a top-end digital camera (you spent at least 3K on it), you’re probably better off using good photo-editing software, like Photoshop, to edit the raw data from the camera.

If your camera indeed stores raw data, and given that data storage cards are really cheap, I would recommend setting your camera to take the largest possible size of image, and store BOTH the raw (for final processing) and the jpg version (for quick looks and reference).

the jpgs let you get a good look at what you got, while the raw will let you process highly contrasting images very easily. Moreover, the jpgs will typically be fine for shots with low contrast, or when you’re just in a hurry.

there are also some good independent raw processing proggies out there, but if you go that route, you have the possibility of your particular brand being bought up by the black hole that is Adobe (makers of Photoshop). Still, if you really don’t want to give Adobe your cash (don’t blame ya), GIMP is another option, which is entirely open source and runs on any platform.

GIMP will do most things photoshop can do, but is considerably slower at some of the more complex operations, and lags behind a bit in the raw processing area.

so how hard is it?

well, they’ve streamlined the process in Photoshop so much that it is little different from editing any normal photo now.

a bit more of a learning curve in GIMP, but that only amounts to about an extra hour or so to figure out.

trust me; unless massive speed batch processing is important to you (you don’t take hundreds of pics a day, do you?), then raw processing is the way to go.

plus, you will always have the original raw data to play with, should raw processors change and allow you to do new things.

probably tldr, but I’ve been doing pro photography for 10 years now, figured out how much more powerful raw processing was in my first year, and I simply would never go back to non-raw processing at this point.

…oh, and as to the pic:

Sweet!

I rather like the darker patches of sky; adds depth to the image.

I rather like the darker patches of sky; adds depth to the image.

I thought so too but decided it was not my place to say so.

My Son the Computer Guru has been trying to get me to shoot in Raw for a while now, and now My Wife and Harshest Critic has chimed in, but I have never before seen a need – the resolution (edge response, I suppose) in JPEG, for example, always seems to be about a pixel, and stuff I photograph always seems to come out fine, so I have never worried very much about using or not using JPEG. Did not realize that the dynamic range was compressed so. Digital film, so to speak, is indeed very inexpensive.

But would using Raw slow “motor drive” at all? I have the feeling that a slower card slows motor drive enough for me to (think I) notice it.

Depends on the camera; my low-end DSLR is rated thusly:

JPEG: 3 shots per second
RAW: 1.5 shots per second

Maximum burst [buffer size]: 514 [!] for JPEG, 4 for RAW

Card speed doesn’t enter into it until after you reach the “maximum burst”.

I look at it this way:

If you spent more on your computer than your camera, it’s a good bet your computer can do a much better job of processing the raw data than the camera can.

Well, I look at it a bit differently:

If I convert to JPEG in-camera, I’ve thrown away the RAW image.

If I process on my computer, I can fiddle-faddle to my heart’s content.

This view’s screwed by my Canon 50D which allows me to store RAW+JPEG :( Now I can have the best of both worlds!

Seriously, though, I just prefer postprocessing because I got used to it before storing both RAW and JPEG was available on my camera system of choice.

But would using Raw slow “motor drive” at all? I have the feeling that a slower card slows motor drive enough for me to (think I) notice it.

To add to KeithB’s comment above, the basic issue is file size. RAW images, though compressed, are lossless … every pixel is preserved with its dynamic-range info. JPEG first throws away information (much of which isn’t detectable by eye), but the algorithms also map the color/contrast space into a smaller bitspace, which means the final compressed result is typically much smaller than the RAW file.
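
As a rough illustration of the size difference, here is a synthetic stand-in (a real RAW file is bigger still, since it carries 12-14 bits per photosite):

```python
# Save the same synthetic frame losslessly (PNG) and as JPEG and compare sizes.
import io
import numpy as np
from PIL import Image

rng = np.random.default_rng(1)
ramp = np.linspace(0, 255, 3000, dtype=np.float32)
frame = np.broadcast_to(ramp, (2000, 3000)) + rng.normal(0, 2, (2000, 3000))
img = Image.fromarray(np.clip(frame, 0, 255).astype(np.uint8))

for fmt, kwargs in (("PNG", {}), ("JPEG", {"quality": 90})):
    buf = io.BytesIO()
    img.save(buf, fmt, **kwargs)
    print(fmt, buf.tell() // 1024, "KiB")
```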

So, essentially you can shoot full-speed with motor drive until the internal buffer is full, which means you can burst more JPEGs than RAWs before the camera pauses to write to flash memory.

Write times are much faster than a few years ago. I’ve never been a “blast off 50 frames in 5 seconds” kinda guy, anyway. I grew up on slide film, and blasting off $10 every few seconds just wasn’t covered by my budget, that being based on fairly limited media sales :)

Really, there’s still something to be said for “the decisive moment”, rather than “the decisive blind-assed machine-gun motor drive” :)

And one can certainly use JPEG for “Motor Drive” and RAW for images like this one when post-processing is likely more important.

I like the way the chimney acts as a shield to the light, emphasising the lovely rainbow-color effect and iridescence, which has an air of the northern lights about it.

JPEG first throws away information (much of which isn’t detectable by eye), but the algorithms also map the color/contrast space into a smaller bitspace, which means the final compressed result is typically much smaller than the RAW file.

I think of brands of camera the way people used to think of brands of film. Each manufacturer favors a type of image sensor and a style of image processing.

They all take good snapshots, but before I’d drop more than $500 on a camera I’d want to see what they do with color and contrast.

ISO and tripods make more difference than most people suspect. I had to do some catalog photography with an old Canon Rebel. Just 6.5 MP, but when I put it on a tripod, used the self-timer, and set ISO 200, I was able to get good, large images of bare metal surfaces without artifacts.

And lighting! Don’t forget lighting!

http://theonlinephotographer.typepa[…]e-long-.html

You have the coolest pictures, Matt.

You have the coolest pictures, Matt.

Thanks, but the very coolest pictures come from the photography contest.

About this Entry

This page contains a single entry by Matt Young, published on January 17, 2011 12:00 PM.
