To quote from Wikipedia, “In photography, exposure value (EV) denotes all combinations of a camera's shutter speed and relative aperture that give the same exposure. In an attempt to simplify choosing among combinations of equivalent camera settings, the concept was developed by the German shutter manufacturer Friedrich Deckel in the 1950s. Exposure value also is used to indicate an interval on the photographic exposure scale, with 1 EV corresponding to a standard power-of-2 exposure step, commonly referred to as a stop.”
EV charts usually assumed an ASA or ISO value of 100, but EV can also be used to describe specific ISO settings on digital cameras. This was not as much of an issue with film, because you had to change film to change ISO. Now, with digital sensors, the sensitivity setting is yet another variable for the camera to set automatically, or for a knowledgeable user to adjust. My Nikon D7000 has an ISO range that starts at 100 and is adjustable in 1/3 EV steps up to a top value unheard of with color film. Although the staggeringly large values above ISO 6400 are achieved with digital tricks, it can extend the sensitivity out to 25,600. You won't even need a moonlit night with those high numbers; a few stars in the sky will make the dark seem as bright as day. Every advance in camera technology seems to push the top ISO number upward while suffering less from noise.
In the last installment of the "Science of Photography," I introduced the myriad of available digital light sensor sizes. What is really amazing to me is how many individual sensors, called pixels, they've crammed onto a small piece of silicon. So let's dig into sensitivity and resolution. We will discuss this very physics-based topic for both film and digital sensors, and I'll try to keep the water from getting too deep. So far, we've just been wading in the shallow end of the pool. The topics to come are pretty deep, and those who are not experienced swimmers in math, physics, and optics may gulp down a few mouthfuls of water. If it gets too deep, you can always glaze over and just look at the pictures.
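To make that power-of-2 step concrete, here is a minimal Python sketch of the standard definition, EV = log2(N^2 / t) at ISO 100, where N is the f-number and t is the shutter time in seconds; the function name and the ISO adjustment term are just my illustration, not anything from a particular camera maker.

```python
# Sketch: exposure value from aperture, shutter speed, and ISO.
# At ISO 100, EV = log2(N^2 / t); each doubling of ISO adds one EV (one stop).
import math

def exposure_value(f_number, shutter_s, iso=100):
    ev_100 = math.log2(f_number ** 2 / shutter_s)
    return ev_100 + math.log2(iso / 100)

print(exposure_value(16, 1/125))        # ~15 EV: the classic "Sunny 16" exposure
print(exposure_value(16, 1/125, 6400))  # same settings at ISO 6400: 6 EV higher
```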
Sadly, this medium of Facebook Notes does not allow for good pictures and illustrations, but I’ll do my best to create some word pictures. You can also do some searching on the Internet for good articles on these various topics. I’ll drop enough names to make some good Google search phrases.
Kodak
Before I wade into the pool, I have two brief asides. First is this news update from Kodak. The Kodak brand is an iconic name in photography and cameras. Back in the days before digital technology, amateurs and professionals alike used its critically acclaimed 35mm film. Now it has been announced that the Kodak EasyShare range of cameras is to cease production as the company looks to further cut costs. It's the end of an era.
Last month the company filed for Chapter 11 bankruptcy protection in the US as it attempts to restructure and survive. But now the company has announced that it is leaving the camera business.
Kodak currently has over 1,000 digital imaging patents that could prove invaluable as it looks to secure its long-term future. Not only am I very sad to hear of the end of Kodak cameras because of my experience with its film, but Kodak was also a pioneer in the development of digital cameras. That is one reason it has such a large portfolio of patents.
Digital camera technology is directly related to and evolved from the same technology that recorded television images. In 1951, the first video tape recorder (VTR) captured live images from television cameras by converting the information into electrical impulses and saving it onto magnetic tape. Bing Crosby Laboratories (the research team funded by Crosby) created the first early VTR, and by 1956, VTR technology was perfected (the VR-1000, invented by the Ampex Corporation) and in common use by the television industry. Both television/video cameras and digital cameras use a CCD (Charge-Coupled Device) to sense light, color, and intensity.
Texas Instruments patented a film-less electronic camera in 1972, the first company to do so. In August 1981, Sony released the Mavica electronic still camera, the first commercial electronic camera. Images were recorded onto a mini disc and then put into a video reader that was connected to a television monitor or color printer. However, the early Mavica cannot be considered a true digital camera even though it started the digital camera revolution. It was a video camera that took video freeze-frames.
Since the mid-1970s, Kodak has invented several solid-state image sensors that "converted light to digital pictures" for professional and home consumer use. In 1986, Kodak scientists invented the world's first megapixel sensor, capable of recording 1.4 million pixels that could produce a 5x7-inch digital photo-quality print. In 1987, Kodak released seven products for recording, storing, manipulating, transmitting and printing electronic still video images. In 1990, Kodak developed the Photo CD system and proposed "the first worldwide standard for defining color in the digital environment of computers and computer peripherals." In 1991, Kodak released the first professional digital camera system (DCS), aimed at photojournalists. It was a Nikon F-3 camera equipped by Kodak with a 1.3 megapixel sensor.
The first digital cameras for the consumer-level market that worked with a home computer via a serial cable were the Apple QuickTake 100 camera in 1994, the Kodak DC40 camera in 1995, followed by the Casio QV-11 with LCD monitor, late in 1995, and Sony's Cyber-Shot Digital Still Camera in 1996.
Kinko's and Microsoft both collaborated with Kodak to create digital image-making software workstations and kiosks which allowed customers to produce Photo CD discs and photographs and to add digital images to documents. IBM collaborated with Kodak in making an internet-based network image exchange. Hewlett-Packard was the first company to make color inkjet printers that complemented the new digital camera images. All of this was in support of the Kodak DC40.
The rest, as they like to say, is history! So now you know the rest of the story.
Crop Factor
In notes with family in Alaska, we've discussed using specific lenses on a wide variety of camera sensor sizes. In the case of the Lincolns, they have Canon cameras with sensors in a variety of sizes, from full frame (23.9mm x 35.5mm), to APS-C (16.7 x 25.1mm), to a smaller 14.9mm x 22.3mm format, and even the still smaller 4/3" size. I've already discussed the differences in normal view, wide angle, and telephoto when you swap lenses around between cameras with different size sensors. So how do you allow or compensate for these variations of lens performance on different cameras? One ratio has been designed to deal with the match of lens to sensor size, and it is called "crop factor."
"In digital photography, a crop factor is related to the ratio of the dimensions of a camera's imaging area compared to a reference format; most often, this term is applied to digital cameras, relative to 35 mm film format as a reference. The most commonly used definition of crop factor is the ratio of a 35 mm frame's diagonal (43.3 mm) to the diagonal of the image sensor in question; that is, CF=diag(35mm) / diag(sensor)." Crop Factor can be used to determine zoom lens equivalent focal lengths.
Rather than spending more time describing crop factor, I suggest the Wikipedia article quoted above; it will explain it better than I can. Check out:
http://en.wikipedia.org/wiki/Crop_factor
I think this explanation is about the best I’ve read and doesn’t require my poor attempt to explain. And that leaves me more time to talk about resolution. Let’s start with film.
Film Resolution
So, what is the resolution of film? With digital sensors, we often quote the total megapixels as if it were a pedigree. However, as we've stated several times, digital sensors come in different sizes. So a more meaningful value would be resolution per unit area, called pixel density: the total number of pixels in a given square area.
But I said I wanted to start with film. So what is the resolution of typical 35mm color film? There is no one answer, because film doesn't have to bother with pixels. With film, the image is continuous in all three dimensions: x, y, and z (intensity). With film, you get the same resolution at color transitions (green/magenta, for instance) as you get for light/dark transitions. With film, you have complete Red, Green, and Blue resolution at every point. (Color photography, either film or sensor, is done with three primary colors that are then combined to produce all the colors of the rainbow.)
Film's sharpness decreases gradually as the spatial frequency, the fineness of detail, increases. (Compare that to a whistle that is so high pitched only a dog can hear it. Sharpness decreases on film for "high frequency" detail.) Film's response gradually becomes weaker as the details get finer. This is best measured using a technique called the Modulation Transfer Function, or MTF curve. MTF is the most widely used scientific method of describing lens performance.
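In case you are curious, the underlying definition is simple; this is the standard textbook form, not anything unique to film. The modulation of a pattern of light and dark lines is M = (Imax - Imin) / (Imax + Imin), and the MTF at a given spatial frequency is just M(image) / M(target): the fraction of the original contrast that survives. It starts near 1 for coarse detail and falls toward 0 as the lines get finer.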
MTF applies to every imaging system, film or digital. Photographers see these charts in tech data sheets for film and for lenses. Film can resolve insanely fine details, but not with as much contrast as coarser features. This natural response is similar to our eyes, and another reason film looks so good. Digital does not have this gradual transition and has a fine structure made up of discrete pixels. When you increase the size of film photos, the detail gradually disappears -- it just becomes blurry or fuzzy. With digital pictures, as you blow up the size of the picture, you start to see the fine structure. This is called "pixelation."
I had an interesting thought that this is much like vacuum tube amplifiers vs. solid-state. Vacuum tubes have a much more gradual transition when they are over-driven into saturation when compared to transistors which shut off abruptly. That is one of the reasons that most musicians prefer vacuum tube amplifiers. It is a similar difference between film and digital sensors. Film makes the transition to fine detail a gradual function compared with pixelation.
Digital Film Scans
When you scan film, good scanners resolve right up to their DPI (dots or pixels per inch) rating. Film scans also have complete RGB color information and resolution at each pixel. Film scans resolve detail about as well as the original film, up to the resolution of the scanner. There is no response to details finer than the resolution of the scanner, even if it's on the film and visible in optical prints. When people compare film to digital, they are usually only comparing scans of film to digital.
With digital cameras, you get full contrast up to the very highest limit of the sensor's resolution. Finer details simply disappear or turn into aliasing artifacts. This is one reason film and digital look so different. Film records fine and coarse details naturally, while digital (and video) tend to record medium details more strongly than film, but have no response to the extremely fine details which film can record.
Often the finest medium details to which the digital camera is sensitive are boosted in contrast. This is called sharpening, and is how we get digital images to fool the eye into thinking they're sharp.
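If you are curious what that contrast boost looks like, here is a minimal sketch of the classic unsharp-mask idea, assuming an 8-bit grayscale image stored as a NumPy array; real cameras use their own proprietary variations, and the function and parameter names here are only for illustration.

```python
# Sketch of unsharp-mask sharpening: exaggerate the difference between the
# image and a blurred copy of itself, which boosts medium-fine detail.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=2.0, amount=1.0):
    blurred = gaussian_filter(image.astype(float), sigma=radius)
    detail = image - blurred                     # the medium-fine detail layer
    return np.clip(image + amount * detail, 0, 255)
```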
Digital cameras almost never resolve their rated resolution. The only digital cameras that did were those with Foveon sensors, but then Sigma started lying, too. Let me explain:
The Foveon sensor is a CMOS image sensor for digital cameras, designed by Foveon, Inc. (now part of Sigma Corporation) and manufactured by National Semiconductor and Dongbu Electronics. It uses an array of photosites, each of which consists of three vertically stacked photodiodes, organized in a two-dimensional grid. Each of the three stacked photodiodes responds to different wavelengths of light, that is, colors. This is due to the fact that different wavelengths of light penetrate silicon to different depths. The signals from the three photodiodes are then processed, resulting in data that provides the three additive primary colors: red, green, and blue. The key point is that each pixel location produces full color.
So the Foveon sensor produces the colors without the use of filters. Other (non-Foveon) digital cameras use a black-and-white sensor on which red, green, and blue dots have been painted. The painted dots act as color filters, which means it takes several individual sensor sites to fully resolve the color of one pixel.
Since each sensor site is painted with only one of the three colors, firmware in the camera (or in raw conversion software) takes the pixels of each color and interpolates (smoothes) values in between the pixel locations of that color to create a brightness value for each color at every other color's location.
Actually, there are two green sensors for each red or blue one. A Bayer array consists of alternating rows of red-green and green-blue filters, so it contains twice as many green sensors as red or blue ones. The primary colors do not receive equal fractions of the total area because the human eye is more sensitive to green light than to red or blue light.
Therefore, at each pixel location in a digital camera's image, we don't have full RGB data; we only really measure about half of it, and the rest is interpolated. That is why digital camera images viewed at 100% won't look as good as good film scans viewed at 100%, or as good as images shot at a lower resolution setting of your camera and viewed at 100%.
This is all called Bayer Interpolation. With this, most digital cameras really only resolve about half their rated megapixel rating. For instance, a 10MP camera really only sees about as well as a theoretically perfect 5MP digital camera, or 5MP film scan.
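To make the interpolation idea concrete, here is a rough sketch of the simplest possible scheme, a bilinear demosaic of an RGGB Bayer mosaic; real cameras use much smarter algorithms, and the layout and function names here are only illustrative.

```python
# Sketch of a bilinear demosaic of an RGGB Bayer mosaic: each output pixel's
# missing colors are averaged from the nearest sites that actually saw them.
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    raw = raw.astype(float)
    h, w = raw.shape
    # Masks marking which color filter sits over each photosite (RGGB layout).
    r = np.zeros((h, w)); r[0::2, 0::2] = 1
    b = np.zeros((h, w)); b[1::2, 1::2] = 1
    g = 1 - r - b
    # Averaging kernels: green sites are twice as dense, so they need fewer neighbors.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    return np.dstack([convolve(raw * r, k_rb, mode='mirror'),
                      convolve(raw * g, k_g,  mode='mirror'),
                      convolve(raw * b, k_rb, mode='mirror')])
```

Notice that every output pixel ends up with three numbers, but two of the three at each location are estimates averaged from neighbors, which is exactly why the effective resolution is roughly half the advertised pixel count.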
Foveon chips see at full resolution, but the makers of those cameras lie about the resolution to keep up with other cameras. Most Foveon-chipped cameras multiply the real resolution by three! What Sigma sells as 14MP cameras are really only 5MP. So, who yah gonna believe? There are lies, damn lies, and sales literature!!
So how many pixels does it take to describe all the detail we can get from film?
With all my positive talk about Kodak earlier -- and I was always a Kodak film user in my day -- Fuji Velvia film has been the choice of professional photographers since the '90s. Fuji Velvia 50 is rated to resolve 160 lines per millimeter. This is the finest level of detail it can resolve.
In order to convert the film resolution to digital terms, assume each line will require one light and one dark pixel, or two pixels. Think of it like a checkerboard with black and white squares. Thus it will take about 320 pixels per millimeter to represent what's on Velvia 50.
320 pixels x 320 pixels is 0.1MP per square millimeter.
35mm film is 24 x 36mm, or 864 square millimeters.
Therefore, to scan most of the detail on a 35mm photo, you'll need about 864 x 0.1, or 87 Megapixels. But wait: each film pixel represents true R, G and B data, not the softer Bayer interpolated data from digital camera sensors. A single-chip 87 MP digital camera still couldn't see details as fine as a piece of 35mm film.
Since the manufacturers count every sensor, even though it takes more than one to get a full color pixel, you'd need a digital camera rated at about 87 x 2, or roughly 175 MP, to see every last detail that makes it onto film.
That's just 35mm film. Pros don't shoot 35mm; they usually shoot 2-1/4" or 4x5". At the same rates, 2-1/4" (56mm square) would be 313 MP, and 4x5" (95 x 120mm) would be 95 x 120 = 11,400 square millimeters, or 1,140 MP, with no Bayer Interpolation. A digital camera with Bayer Interpolation would need to be rated at better than 2 gigapixels to see everything that can be seen on a sheet of 4x5" film!
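If you would like to check my arithmetic, a few lines cover all of the cases above. The exact results run a touch higher than my rounded numbers because I rounded 0.1024 MP per square millimeter down to 0.1 in the text; the helper name is just for illustration.

```python
# Sketch: convert a film resolving power into an equivalent pixel count.
def film_megapixels(lines_per_mm, width_mm, height_mm, bayer=False):
    px_per_mm = 2 * lines_per_mm          # one light + one dark pixel per line
    mp = (px_per_mm ** 2) * width_mm * height_mm / 1e6
    return 2 * mp if bayer else mp        # double it for Bayer-counted sensors

print(round(film_megapixels(160, 36, 24)))              # 35mm Velvia 50: ~88 MP
print(round(film_megapixels(160, 36, 24, bayer=True)))  # Bayer-rated equivalent: ~177 MP
print(round(film_megapixels(160, 56, 56)))              # 2-1/4" square: ~321 MP
print(round(film_megapixels(160, 120, 95)))             # 4x5": ~1,167 MP
```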
As we've seen, film can store far more detail than any digital capture system. The gotchas with any of these systems are that:
1.) It takes one heck of a lens to be able to resolve this well.
2.) It takes even more of a photographer to be able to get that much detail on the film, and
3.) If you want to scan the film and retain this detail, you need quite a scanner (320 pixels per mm is about 8,000 DPI).
As digital film scanners improved over time and increased in resolution, we saw more detail. You can argue that no digital scanner has ever equaled the resolution of film. Consumer 35mm scanners have topped out at about 5,400 DPI, and we still saw more detail in our scans than we did at 4,800 DPI -- a value I've sometimes seen given as the resolution of 35mm film. Obviously, it is not correct.
Film never stopped amazing us as we scanned it at higher and higher resolutions, and this is why.
5,400 DPI is equal to 212 pixels per mm, or 0.045 MP/mm^2. Thus a 35mm slide, scanned on that 5,400 DPI scanner, yielded 39 MP images, without Bayer Interpolation. Open one of these in Photoshop, and 39 MP x 3 bytes per pixel gives roughly 120 MB files, again sharper than the Bayer-interpolated images from digital cameras.
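Here is the same back-of-the-envelope arithmetic, this time starting from a scanner's DPI rating; again, just a sketch with an illustrative function name.

```python
# Sketch: megapixels captured by scanning a 24 x 36 mm frame at a given DPI.
def scan_megapixels(dpi, width_mm=36, height_mm=24):
    px_per_mm = dpi / 25.4
    return (px_per_mm ** 2) * width_mm * height_mm / 1e6

print(round(scan_megapixels(5400)))  # ~39 MP per 35mm frame
print(round(scan_megapixels(4800)))  # ~31 MP per 35mm frame
```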
By the way, even though I keep calling the sensors in digital cameras "digital sensors," they are actually analog sensors; the analog output is digitized and then processed into a picture using Bayer Interpolation implemented as a software algorithm.
And that is a good place to stop for the day. I covered a lot more and wrote many pages more than I expected. It is fun to get into the film and lens discussion and MTF. I'll explain that more, but -- if I'm not careful -- the next thing you know I'll be talking about Fourier Transforms and soon that will lead to Euler's Equation (what I consider the most beautiful equation of all time). I'll save that for next time.