Thanks Phos.
From <http://firesign3.com/glossary.html>, Pixel Dimension is defined as: "The size of a computer display screen expressed in horizontal pixels by vertical pixels (H x V, or 800 x 600, for example)".
From <http://www.scantips.com/basics1b.html>: "If the image size were say 1000x750 pixels (written as width x height by convention), then there would be 1000 columns and 750 rows of data values, or 1000x750 = 750,000 pixels total. For 24 bit color, each pixel's data contains three 8-bit RGB byte values, or 750,000 x 3 = 2,250,000 bytes".
In my example I have a 2304x1728 image. Multiplying those and taking three times the product gives 11,943,936, or about 11.9M, while Photoshop gives the "Pixel Dimensions" size as 11.4M.
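Here is the arithmetic sketched in Python (the division by 2**20 is just my guess at what Photoshop might be doing; the other numbers follow the scantips formula above):

```python
# Per the scantips.com formula: width x height pixels, 3 bytes (R, G, B) each.
width, height = 2304, 1728
pixels = width * height           # 3,981,312 pixels (~4.0 Mpx)
bytes_24bit = pixels * 3          # 11,943,936 bytes for 24-bit color

print(bytes_24bit / 1_000_000)    # ~11.9 (decimal megabytes, 10**6)
print(bytes_24bit / 1_048_576)    # ~11.4 (binary megabytes, 2**20) -- my guess
```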
So now I have three questions:
1. Why the difference between 11.9 and 11.4?
2. Are the three colors included in the definition of pixel dimensions? Are there actually 3 pixels (i.e., red, green, and blue) for every point in the 2304x1728 array of an image?
If you blow up an image on the screen, you see each pixel showing one of 16.777M colors, based on the relative weights of the three color channels; but it is still one pixel.
3. What am I missing?