Adam Megacz
2006-02-20 03:11:47 UTC
After reading through the DNG specification, it seems like nearly all
of the new metadata falls into two broad categories:
1) The camera's mosaic pattern: the (x,y,color) coordinates of
sensors and their filters.
2) Parameters indicating how to map from the camera's [linear]
colorspace to a standard linear colorspace. This includes data from
the fully-masked sensors (used for black-level estimation).
It seems to me that (2) isn't necessary with something like EXR, nor
would you really have a good reason to keep the camera-encoded-linear
values around for archival purposes (assuming you have enough EXR
precision; probably 32bpp).
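To make that concrete, here is roughly what the category-(2) metadata
boils down to once you actually apply it: subtract a black level
(estimated from the fully-masked sensors), normalize, and multiply by a
camera-to-XYZ matrix. This is only a sketch -- the function name and
every number below are made up for illustration, not taken from any
real camera or from the DNG spec itself.

import numpy as np

def camera_linear_to_xyz(raw_rgb, black_level, white_level, cam_to_xyz):
    # raw_rgb: H x W x 3 array of demosaiced camera-linear values.
    linear = (raw_rgb - black_level) / (white_level - black_level)
    linear = np.clip(linear, 0.0, None)
    # Apply the 3x3 camera-to-XYZ matrix to each pixel's (R, G, B).
    return linear @ cam_to_xyz.T

# Hypothetical metadata for a 12-bit sensor:
black_level = 128.0
white_level = 4095.0
cam_to_xyz = np.array([[0.61, 0.29, 0.10],
                       [0.25, 0.72, 0.03],
                       [0.02, 0.12, 0.86]])

Once a transform like that has been baked into a high-precision EXR,
the metadata describing it has arguably done its job.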
I'm not quite sure about (1), but I can sort of see the argument for
it. That said, it seems to me that you could demosaic the image and
keep around a tiny amount of information that would let you *reverse*
the demosaicing process with near-zero loss, in case you ever found a
need for the mosaiced data.
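Something like the following is what I have in mind: keep the 2x2 CFA
pattern plus the per-pixel residual between the original sensor sample
and the matching channel of the demosaiced result. If the demosaicer
(nearly) preserves the measured samples, the residuals are (nearly)
zero and compress to almost nothing. Again, just a sketch -- it assumes
an RGGB Bayer layout, even image dimensions, and no particular
demosaic algorithm.

import numpy as np

# Channel measured at each position of the 2x2 Bayer tile (R=0, G=1, B=2).
BAYER_RGGB = np.array([[0, 1],
                       [1, 2]])

def make_sidecar(mosaic, demosaiced, cfa=BAYER_RGGB):
    h, w = mosaic.shape
    chan = np.tile(cfa, (h // 2, w // 2))   # which channel was measured where
    predicted = np.take_along_axis(demosaiced, chan[..., None], axis=2)[..., 0]
    return mosaic - predicted               # residuals, ~0 everywhere

def recover_mosaic(demosaiced, residual, cfa=BAYER_RGGB):
    h, w = residual.shape
    chan = np.tile(cfa, (h // 2, w // 2))
    predicted = np.take_along_axis(demosaiced, chan[..., None], axis=2)[..., 0]
    return predicted + residual             # original sensor samples back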
The main argument for RAW/DNG is that the camera is the wrong place to
be doing postprocessing... that's a fair point, and I see why these
formats make sense for getting bits out of the camera. But for
long-term archival, I'm a bit skeptical, especially since I can see
that DNG is going to need to be extended every few years. I'm
wondering if something like OpenEXR wouldn't be a better archival
format. DNG/RAW strike me more as "bits on the wire" formats than
"bits on the disk" formats.
Also, is it reasonable to assume that imaging devices will continue to
have linear colorspaces? I can see two outcomes here -- either a new
generation of CCDs comes out with a log/float-ish colorspace, or else
the CCDs get so incredibly sensitive that they can capture hundreds of
bits of linear colorspace data, which would be impractical to store
without first translating to some alternatively-mapped colorspace.
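A quick back-of-the-envelope comparison shows why: with a plain linear
integer code, the bit count grows one-for-one with the number of
stops, whereas a float-style code (sign/exponent/mantissa, like
OpenEXR's 16-bit half) pays for the stops with a handful of exponent
bits. The numbers below are purely illustrative.

import math

def linear_bits(stops, rel_err):
    # Linear code: the quantization step is set by the darkest stop,
    # so bits ~= stops + bits of relative precision.
    return math.ceil(stops + math.log2(1.0 / rel_err))

def float_bits(stops, rel_err):
    # Float code: exponent bits cover the stops, mantissa bits the
    # precision, plus a sign bit.
    return math.ceil(math.log2(stops)) + math.ceil(math.log2(1.0 / rel_err)) + 1

print(linear_bits(20, 0.001))   # 30 -- 20 stops at ~0.1% precision, linear
print(float_bits(20, 0.001))    # 16 -- same coverage, half-float-shaped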
Just some thoughts.
- a
--
PGP/GPG: 5C9F F366 C9CF 2145 E770 B1B8 EFB1 462D A146 C380