Much confusion is caused by the over-use of the generic unit of image resolution, 'dpi'. It is always better to use the correct unit for each specific use. This paper looks at the definitions of spi, lpi and ppi, as well as the generic dpi.
What do 'dpi', 'ppi', 'spi' and 'lpi' mean?
We recognise these as some of the units we use to quantify an image's resolution, but we now know that in themselves they do not indicate the image's size or quality (see Do Digital Images Exist in the Real World?).
We use these resolution units to map the pixels of our 'virtual' digital image into the 'real world' images that we can see and print.
What are the differences?
The units look similar and are in some cases used interchangeably, but they do not all describe the same thing. Unfortunately these units are often used incorrectly within Digital Imaging, which can be more than a little confusing. If we are going to communicate with accuracy we need a full understanding of the units we use, and we need to use them correctly.
What do they really mean?
spi - samples per inch
We scan an image by rasterising (or sampling) it. The number of separate 'samples per inch' taken is described by the unit 'spi'. In practice 'spi' is not widely used because in most cases it can be considered interchangeable with 'ppi'.
ppi - pixels per inch
'ppi' is used to describe the resolution of the image when it is in its 'virtual' state (see Do Digital Images Exist in the Real World?). Once the digital image is in the computer it has entered the 'virtual world' and has no 'real world' size until rendered by an output device that translates the 'virtual' digital image into a 'real world' image.
In the 'virtual world' any given resolution is only a guide to how the image might be output; it has no effect on the size or quality of the image itself. For example, an image with 800 pixels along its length could be printed 2in long at 400ppi or 8in long at 100ppi, yet the digital image would be identical in both cases.
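The pixel-to-inch arithmetic above can be sketched in a couple of lines of Python (the function name is ours, purely for illustration):

```python
def print_length_inches(pixels: int, ppi: int) -> float:
    """Map a 'virtual' pixel count to a 'real world' length at a given ppi."""
    return pixels / ppi

# The same 800-pixel image rendered at two different resolutions:
print(print_length_inches(800, 400))  # 2.0 inches
print(print_length_inches(800, 100))  # 8.0 inches
```

Note that changing the ppi changes only the output size; the 800 pixels themselves are untouched.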
When a 'virtual' digital image is output onto a monitor there is a one-to-one mapping from pixel to 'monitor dot', so we also measure this in 'ppi'. Typically this used to be 72ppi on old monitors, but these days it is more likely to be 92-96ppi or higher.
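A monitor's actual ppi can be worked out from its pixel dimensions and its physical diagonal. The figures below (a 17in monitor running at 1280x1024) are illustrative assumptions, not taken from the text:

```python
import math

def monitor_ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal of a display."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diagonal_px / diagonal_in

# e.g. a 17in monitor running at 1280x1024:
print(round(monitor_ppi(1280, 1024, 17)))  # roughly 96ppi
```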
lpi - lines per inch
When we print an image, its 'real world' size will depend on the resolution of the printer. This resolution is described by the number of printed half-tone lines per inch (lpi) that the printer uses. Typical values range from 85 lpi for newsprint to 200 lpi for an art magazine.
dpi - dots per inch
'dpi' or dots per inch is by far the most confused and mis-used unit of them all.
It has become known and accepted as a generic 'catch-all' unit that the printing industry uses to incorrectly encompass all the units mentioned above. This is unfortunate, as its correct meaning is quite specific. The 'dot' of dots per inch is simply a description of the smallest dot that a printer is capable of producing. This is very different to the lines of the half-tone pattern which we measure in 'lpi'.
For a printer to create a half-tone effect it has to vary the amount of ink it puts on the paper. Lots of ink will make a saturated colour and a little ink will leave a lighter colour and more of the paper colour. The printer can either do this by varying the size of the halftone dot (Amplitude Modulation) or by varying the number of dots (Frequency Modulation) within a half-tone grid measured in 'lpi'. A normal 'half-tone' dot is made up from a framework of these smaller 'printer' dots. Thus 'dpi' measures the smallest size of these 'printer' dots and is the final limit to the quality of that printing device.
We can therefore correctly use this unit to quantify the quality of a printer. The higher the 'dpi' of a printer, the finer the detail it can print and the more lines per inch it can handle.
Typically this varies from 600dpi for an office laser printer to 3200dpi for a commercial printing workflow.
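A commonly quoted rule of thumb links the two units for AM screening: the number of tone levels a half-tone cell can reproduce is roughly (dpi / lpi)^2 + 1. The sketch below assumes that rule; the function name is ours:

```python
def tone_levels(dpi: int, lpi: int) -> int:
    """Approximate tone levels an AM half-tone cell can render (rule of thumb)."""
    return (dpi // lpi) ** 2 + 1

# An office laser at 600dpi printing an 85lpi screen:
print(tone_levels(600, 85))    # 50 levels of tone
# A 2400dpi imagesetter printing a 150lpi screen:
print(tone_levels(2400, 150))  # 257 (in practice capped at 256)
```

This is why a high-dpi device is needed to print a fine screen: at too low a dpi the half-tone cells cannot hold enough printer dots to render smooth tones.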
Using the Units Appropriately
- When we scan a print, we use 'spi' (or 'ppi')
- When we open a digital image in Photoshop, we use 'ppi' as a guide to how it might be output
- When we view a digital image on a monitor, we use 'ppi', although the overall pixel dimensions will be of far more use
- When we print out an image we need to know the printing resolution of the output device, which is measured in 'lpi'
- When we compare the quality or sharpness of a printer we use 'dpi'
The more precise and accurate we are in our communication, the less room there is for mistakes, and that can only benefit all of us. However, we still need to remember that there is much general confusion about resolution units: most people will simply use 'dpi' for any description of resolution, and it will be up to us to clarify what they actually mean.
Do vector images have a resolution?
Vector images are resolution independent: instead of pixels they use a series of co-ordinates to describe the shape, position and fill of an image. While a raster image requires a whole grid of pixels to describe even a simple shape, a vector image needs only a few co-ordinates and is therefore much smaller. The resolution-independent nature of the vector image means that it can be scaled up or down with no loss of detail.
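The size difference can be illustrated with a toy comparison (the data layout here is a made-up sketch, not a real vector format):

```python
# Raster: a 100x100px solid square needs one value per pixel.
raster_samples = 100 * 100  # 10,000 samples for one flat shape

# Vector: the same square is a handful of co-ordinates plus a size.
square = {"x": 0, "y": 0, "w": 100, "h": 100}

# Scaling the vector just multiplies its co-ordinates -- no detail is lost.
scaled = {key: value * 4 for key, value in square.items()}
print(scaled)  # {'x': 0, 'y': 0, 'w': 400, 'h': 400}
```

Scaling the raster version by the same factor would mean inventing sixteen times as many pixels, which is why enlarged raster images lose sharpness while vectors do not.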