
Resolution and Color Depth

(C) Mustek, 1996

All rights reserved. This document may not be reprinted, reproduced or distributed in any way without the express written permission of Mustek.

Is More Really Better?

Every day we are bombarded with ads telling us more is better. More RAM is better. More CPU speed is better. Faster modems are better. In most cases it's true, more is better. But is it true of scanners? Is a 1200 DPI scanner better than a 600 DPI scanner? In an absolute sense, yes. Particularly if you are scanning small items that you want to enlarge before printing. But the question you should be asking yourself is, "How often will I actually need the higher resolution capability?"

Prior to scanning any image, the first thing you should do is determine what resolution to scan at. Since modern advertising has conditioned us to think that more is always better, new scanner owners frequently scan at higher resolutions than they need to. In actuality, there is rarely a need to scan higher than 240 DPI. The resolution should always be determined by the capability of the output device. I will try to put the resolution issue in perspective by discussing scanning for glossy magazines.

Magazines are printed at 133 Lines Per Inch. Magazine printing technology is not quite equivalent to Laser and Inkjet printer technology, so the layout artists scan at 1.5 times the printing resolution. This means that pictures in magazines are scanned at 200 DPI. That's right, those beautiful pictures in the pile next to your easy chair were scanned at a mere 200 DPI! Clearly more is not always better.

When scanning for output on an inkjet printer, you should scan at 1/3 of the resolution you are going to print at. Modern inkjet printers have a maximum resolution of 720 DPI. Dividing 720 by 3 is how I got the 240 DPI number mentioned above. But the 720 DPI inkjet printers can't print at 720 on plain paper. When using plain paper on such a printer, you should set the printer driver to 360 DPI. Dividing 360 by 3 means you should scan at 120 DPI for printing on plain paper with your 720 DPI inkjet printer.

NOTE: Most HP inkjet printers have a maximum resolution of 600 DPI, so you would use 200 DPI instead of 240 DPI for most HP printers. See your printer's manual for details.
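The rules of thumb above (1.5 times the line screen for magazine work, one third of the printing resolution for inkjets) are simple arithmetic. Here is a short sketch of both; the function names are my own inventions for illustration, not part of any scanner software:

```python
# Rule-of-thumb scan resolutions, as described above.
# Function names are illustrative only.

def magazine_scan_dpi(lines_per_inch):
    """Print shops scan at 1.5 times the printing line screen."""
    return lines_per_inch * 1.5

def inkjet_scan_dpi(printer_dpi):
    """For inkjet output, scan at one third of the printing resolution."""
    return printer_dpi / 3

print(magazine_scan_dpi(133))  # about 200 DPI for a 133 LPI magazine
print(inkjet_scan_dpi(720))    # 240 DPI on coated paper
print(inkjet_scan_dpi(360))    # 120 DPI on plain paper
print(inkjet_scan_dpi(600))    # 200 DPI for most HP inkjets
```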

Does that mean if you scan at 720 DPI and print at 720 DPI on special coated paper that the picture is not going to look good? No. The picture will look fine, but it will create a larger file on your hard drive. To understand all of this, I recommend that you perform the following test which I recently did.

I scanned a 4" x 6" photograph at three different resolutions: 240 DPI, 360 DPI and 720 DPI. I then printed all three images at 720 DPI using the special coated paper recommended by my printer manufacturer. I wrote the scanned resolution on the back of each print. When I showed all three to colleagues here at Mustek, none of them could tell which was which. There was absolutely no difference in the quality of the printed images!

However, there was a considerable difference in the size of the three files I had saved on my hard disk! Here are the file sizes using the *.TIF file format:

240 DPI = 4.032 MB

360 DPI = 9.074 MB

720 DPI = 36.295 MB

If I had done what most people do and scanned the image at the maximum resolution of my printer (720 DPI), I would have wasted about 32 megabytes of precious hard disk space storing data that I don't really need.

It's okay to try this experiment at home!
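The growth in file size is easy to predict before you scan: an uncompressed 24-bit image needs 3 bytes per pixel, and the pixel count grows with the square of the resolution. Here is a rough sketch; the exact figures I reported include TIFF file overhead, so this estimate will come out slightly different:

```python
def scan_size_mb(width_in, height_in, dpi):
    """Approximate uncompressed 24-bit scan size in megabytes."""
    pixels = (width_in * dpi) * (height_in * dpi)
    return pixels * 3 / (1024 * 1024)  # 3 bytes (24 bits) per pixel

for dpi in (240, 360, 720):
    print(dpi, "DPI:", round(scan_size_mb(4, 6, dpi), 2), "MB")

# Doubling the DPI quadruples the file size; tripling it (240 to 720)
# makes the file nine times larger.
```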

My 720 DPI inkjet printer holds three different colors of ink. It can print only 240 dots per inch of each color. The dots are printed close together and they blend with each other to produce the variety of colors in my scanned picture. This is why I can get the results I need by scanning at 1/3 of my printer's resolution.

So why do we make 1200 DPI scanners? Because some people need to scan postage stamps and turn them into billboards. Well, not really. But some users do have the need to enlarge their scanned images by several hundred percent. It takes a higher resolution scanner to do this. Unfortunately, it is beyond the scope of this article to teach you how to scale images. In the next few months, we will have a tutorial section on this Web Site that will cover this type of task.

Color Depth Issues

If you have not read the article on Understanding Scanner Mechanics, you should read it carefully prior to reading this section.

Computers use a Binary number system to store information. We are used to using a Decimal number system, which has 10 possible digits (0-9). The Binary number system only has 2 digits (0 and 1). These correspond to switches being either on or off. The chips in your computer are nothing more than a complex system of switches that can be turned on and off by program codes.

When you only have 2 possible digits to work with, it takes more of them to represent the 256 possible integers (0-255) than it does using the Decimal number system. Here is what 255 looks like in Binary:

11111111

Here is what 0 looks like in binary when stored as 8-bits:

00000000

All numbers between 0 and 255 will be various combinations of 1s and 0s. (If you desire to understand the Binary number system further, talk to a mathematician or a computer programmer. This is about as far as I care to understand it! You can play with the Windows calculator which is capable of converting between Decimal and Binary. Open the calculator and change it to Scientific Mode from the View Menu. Enter a number between 0 and 255 then click the Binary button and see what happens. Enter 256 and see what happens when converting to binary - you get 9-bits!)

It takes 8 digits (8-bits) to store the Decimal number 255 in computer code.
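If you don't have the Windows calculator handy, the same conversions can be seen in a few lines of Python. This is purely illustrative; it just repeats the calculator experiment described above:

```python
# Decimal-to-binary conversions for 8-bit values.
print(format(255, '08b'))  # 11111111 -- the largest 8-bit value
print(format(0, '08b'))    # 00000000 -- the smallest
print(format(256, 'b'))    # 256 spills over into a 9th bit
print(int('11111111', 2))  # and back from binary to 255
```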

24-bit scanners can detect up to 256 different levels of intensity for the three colors used by a monitor (Red, green and blue). The monitor uses dots of each of these three colors placed very closely together to trick your eyes into seeing a broad spectrum of colors (16.77 million to be precise).

To store a single dot of color, the computer needs a value between 0-255 for each of the three colors the monitor blends together to produce the perceived color. That means it takes 24-bits to store one dot of color in a file on your hard disk. Below are the 24-bits for a dot of white:

11111111,11111111,11111111

A dot of black would be stored as 24 0s. There are 16.77 million unique combinations of 1s and 0s possible when you have 24 digits to work with. This is why 24-bit color scanners can scan up to 16.77 million colors.
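You can verify that 16.77 million figure yourself: each of the three channels has 256 possible values, and 256 multiplied by itself three times is the same as 2 raised to the 24th power.

```python
# Each of red, green and blue gets 8 bits, i.e. 256 levels (0-255).
white = (255, 255, 255)  # the 24 1s shown above
black = (0, 0, 0)        # the 24 0s

combinations = 256 ** 3  # 256 levels per channel, 3 channels
print(combinations)      # 16,777,216 -- the "16.77 million" colors
print(combinations == 2 ** 24)
```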

Monitor Color Depth

All computer monitors use dots of red, green and blue to represent the colors of an image. Current video card technology only allows 256 intensity levels for each of these dots. Therefore, current computer monitors can only display 24-bit color, or 16.77 million colors. I doubt that this will ever change because the human eye cannot distinguish more colors than this. File sizes would also double, which would clog the "arteries" of your computer, if video card technology advanced to 48-bit technology (the next logical step). Keep this 24-bit limitation of your video card in mind, as it is quite important when we get to the discussion of 30-bit and 36-bit scanners.

Video cards allow you to run your monitor in several different modes. You can set your monitor to display 16 colors, 256 colors, 16-bit color (65,536 colors) and 24-bit color (16.77 million colors). Some video cards have a 32-bit mode, but this is merely a speed trick. Computers move information in 16-bit or 32-bit chunks. When a video card allows 32-bit mode, it is storing 24 bits of color information in 32 digits merely to make data transfers faster. The extra 8 bits contain null information.

The number of colors your computer monitor can display depends entirely on the video card you have installed. If your computer is more than 4 years old, you probably don't have 24-bit capability. Most machines purchased within the last few years are capable of displaying 24-bit color.

The amount of RAM on your video card limits the color depth for each resolution of your monitor. Most current video cards can display resolutions up to 1,024 x 768. That is 1,024 pixels horizontally by 768 pixels vertically. (A pixel is one set of three dots, one each of red, green and blue. A pixel can display one dot produced by the scanner.) If your video card has 1MB of RAM, you can display 24-bit color at 640 x 480 resolution. If you set a 1MB video card to 800 x 600 pixels, you will only be able to display 16-bit color. If your video card has 2MB of RAM, you can display 24-bit color at 800 x 600 resolution. It takes 4MB of RAM on your video card to display 24-bit color at 1,024 x 768 resolution.
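The RAM figures above all follow from one multiplication: pixels across, times pixels down, times bytes per pixel. Here is a sketch of that arithmetic (ignoring any overhead the card itself may reserve):

```python
def vram_bytes(width, height, bits_per_pixel):
    """Video RAM needed to hold one full screen, in bytes."""
    return width * height * bits_per_pixel // 8

MB = 1024 * 1024
print(vram_bytes(640, 480, 24) / MB)   # ~0.88 MB -- fits on a 1MB card
print(vram_bytes(800, 600, 24) / MB)   # ~1.37 MB -- needs a 2MB card
print(vram_bytes(800, 600, 16) / MB)   # ~0.92 MB -- 16-bit fits in 1MB
print(vram_bytes(1024, 768, 24) / MB)  # 2.25 MB -- needs a 4MB card
```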

So how do I know what resolution and color depth my monitor is using, you ask? Well, in Windows 95 it is easy to adjust. When you click the Display icon in your control panel, a screen will come up which allows you to set the resolution and color palette for your monitor. Click on the Settings Tab to get to the adjustments for your video card.

30-Bit and 36-Bit Scanners

Those of you who have purchased a 30-bit or 36-bit Mustek scanner may have noticed by now that it only produces 24-bit files. If you understand what you have read so far, then you already know why it does this. But you must be wondering why we even make scanners that go higher than 24-bits.

The main reason for going higher than 24 bits is that 24-bit scanners typically produce slightly dark images. It is possible to make them scan lighter, but then you lose detail in the shadow areas of your image. 30-bit and 36-bit scanners produce lighter images by default. Additionally, they give the operator control over shadow and highlight detail that 24-bit scanners don't offer.

A 30-bit scanner collects 10 bits of data each for the red, green and blue color components, while a 36-bit scanner collects 12 bits for each. The scanner driver allows the operator to control which 24 of those 30 or 36 bits are kept and which ones are discarded. This adjustment is made by changing the Gamma Curve. Gamma settings are accessed through the Tonal Adjustment Icon on the Twain Interface. Mustek's 36-bit scanner, the Paragon 1200 SP Pro, comes with calibration software which interfaces with the Twain driver to select the 24 bits of data that best represent the original image.
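Conceptually, the gamma curve maps the scanner's wider sample range down to the 8 bits per channel that are kept. Here is a sketch of that mapping for a 30-bit scanner (10 bits per channel), using the standard gamma formula; this is an illustration of the idea, not Mustek's actual driver code:

```python
def gamma_map(sample, in_bits=10, gamma=1.8):
    """Map a 10-bit scanner sample (0-1023) to an 8-bit value (0-255)."""
    max_in = 2 ** in_bits - 1
    return round(255 * (sample / max_in) ** (1.0 / gamma))

# A mid-level sample comes out brighter at gamma 1.8 than at gamma 1.0,
# which is why raising the gamma lightens the image.
print(gamma_map(512, gamma=1.0))
print(gamma_map(512, gamma=1.8))
```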

The images below were scanned at 72 DPI on a Mustek Paragon 800 II SP at various gamma settings. The first row of images is a portion of an Agfa IT8.7/2 Calibration Target. The second row of images is my boss's son.

Scanned at Gamma 1.0        Scanned at Gamma 1.8          Scanned at Gamma 2.4

Best detail in highlights   Best detail overall           Best detail in shadows
Shadows sacrificed          That's why it's the default*  Highlights sacrificed

*Some of our older drivers default to a Gamma of 1.0. You can easily set the gamma to 1.8 where it will remain unless you change it or unless you re-install the Twain driver.

 Understanding the mechanics of your scanner, video card, monitor and various file formats will enable you to get good results from your Mustek scanner. With a little practice you will be producing the best possible images while keeping file sizes to a minimum. That is the test of a good scanner operator. If you are putting the resulting images on a Web Page, those who access your page will appreciate your keeping image file sizes as small as possible.


Copyright 1997, Mustek Systems Inc. All rights reserved.