The On-The-Fly calibration system (OTF), originally implemented by the CADC and ST-ECF staff, is becoming a very important aspect of the HST archive. The main goals behind the implementation of the OTF system were to take advantage of:
As shown by Lubow & Pollizzi (1999), more than 90% of the datasets located within the HST archive need to be recalibrated! Although sometimes the gain is minimal, the potential of extracting better science from the HST data is clear.
This is even more evident for data acquired at the beginning of the life of an instrument. In fact, during the first months of NICMOS and STIS operations, both the pipeline and the calibration files evolved rapidly as the instrument scientists' understanding of the instrumental response and environmental conditions improved.
In order to perform this recalibration for STIS and NICMOS, major changes to the existing OTF system were required because of the complexity of these new instruments.
The STSDAS calibration system is ``header driven'': the calibration switches and files to be used by the calibration pipeline are listed in the header of the raw file. This is a clever design, since it allows the system to cope easily with changes (e.g., the availability of new calibration steps or of new reference files) by changing the relevant keywords in the raw file header before running the calibration software.
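As an illustration, a raw-file header might carry calibration switches and reference-file keywords along these lines (a hypothetical excerpt: DARKCORR, DARKFILE and SHADFILE follow the examples used later in this paper, while the other values are invented):

```
DARKCORR= 'PERFORM'            / perform dark subtraction
FLATCORR= 'PERFORM'            / perform flat-field correction (invented)
DARKFILE= 'oref$ibk1356ko_drk.fits' / dark current reference file
SHADFILE= 'N/A     '           / shutter shading file (not applicable)
```

Editing these keywords before running the pipeline is all that is needed to pick up new calibration steps or reference files.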
The first step in the OTF is to identify the correct list of raw files required for the calibration in order to retrieve them.
After retrieving the files, we query the Calibration Database System (CDBS, which is replicated to ECF and CADC from STScI) to get the best reference files for a given dataset. This is a complex query since, because of timing differences, sometimes a calibration file is needed but not yet available at our respective centres. For this case we have developed a fall-back mechanism allowing us to get the previous best calibration file. Here is an example of the query result:
    CDBS: shadfile, N/A
    CDBS: lamptab,  otab$i9u1550oo_lmp.fits
    CDBS: ccdtab,   otab$h1v1158do_ccd.fits
    CDBS: darkfile, oref$ibk1356ko_drk.fits
    CDBS: wbiafile, oref$iag16560o_bia.fits
    CDBS: sptrctab, N/A
    CDBS: inangtab, N/A
    CDBS: apdestab, otab$hbp11488o_apd.fits
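The fall-back mechanism can be sketched as follows (a minimal illustration in Python; the function and file names are hypothetical, not the actual CDBS interface):

```python
import os

def best_reference_file(candidates, local_dir):
    """Return the newest reference file that is actually present locally.

    `candidates` is the list of file names CDBS considers applicable,
    ordered from newest to oldest.  Because of replication timing the
    newest file may not have reached ECF/CADC yet, in which case we
    fall back to the previous best file.
    """
    for name in candidates:
        if os.path.exists(os.path.join(local_dir, name)):
            return name
    # No file available at all: the related switch will be set to OMIT.
    return None
```

When `None` is returned, the corresponding calibration switch is forced to OMIT regardless of the rules, as described below.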
The result of this query is stored within the proper ckwxxx pset file, which will then be used to edit the raw file header. If a reference file is not available at all, the related calibration switch will be set to OMIT, independently of the rule system (see later).
The list of reference files is built at CADC and checked every day for consistency and completeness. The changes are then replicated to ST-ECF using database and file-system synchronisation tools.
Often, wrong keyword values are found within the header. Once an HST file has been archived it is impossible to correct its header; instead, STScI updates a database which stores the keyword values. The next step is therefore to extract the value of each critical keyword from that database.
The mapping between header keywords and the HST DADS database is kept in a configuration file; SQL queries are built on the fly from it.
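A sketch of how such on-the-fly query construction might look (the mapping entries, table and column names are invented for illustration; the real configuration file and DADS schema differ):

```python
def build_keyword_queries(mapping, dataset):
    """Build one SELECT per database table from a keyword mapping.

    `mapping` maps a header keyword to the (table, column) holding its
    authoritative value in the archive database.  Real code would use
    parameterised queries rather than string interpolation.
    """
    by_table = {}
    for keyword, (table, column) in sorted(mapping.items()):
        by_table.setdefault(table, []).append(column)
    return [
        f"SELECT {', '.join(cols)} FROM {table} "
        f"WHERE dataset_name = '{dataset}'"
        for table, cols in sorted(by_table.items())
    ]

# Hypothetical mapping: header keyword -> (table, column)
mapping = {
    "TARGNAME": ("science_table", "targ_name"),
    "FILTER":   ("science_table", "filter_name"),
    "OPMODE":   ("exposure_table", "op_mode"),
}
```

Keeping the mapping in a configuration file means that new critical keywords can be added without touching the pipeline code.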
This model will be used for the new OTF calibration system that STScI is putting in place for WFPC2.
The rule file (Table 1) is the key to the STIS and NICMOS OTF calibration pipeline. It is part of the OPUS system used at STScI to calibrate the observations as soon as they are received from HST. The rule file assembles the knowledge of the instrument scientists about the calibration steps to be performed according to the instrument mode in use.
Although quite cryptic, the file is parsed dynamically when the OTF is started, and a Perl script is built from it. Here is an example of the Perl code created:
    if ( ( $nicmos_cl{'TARGNAME'} eq "DARK" ) ) {
        $nicmos_cl{'DARKCORR'} = "OMIT";
        $reason = 19;
    } elsif ( ( $nicmos_cl{'FILTER'} =~ /BLANK$/ ) ) {
        $nicmos_cl{'DARKCORR'} = "OMIT";
        $reason = 20;
    } elsif ( ( $nicmos_cl{'OPMODE'} eq "BRIGHTOBJ" ) ) {
        $nicmos_cl{'DARKCORR'} = "OMIT";
        $reason = 21;
    } else {
        $nicmos_cl{'DARKCORR'} = "PERFORM";
        $reason = 22;
    }
We have now acquired all the information needed to perform the switch evaluation. For the example case we have used so far, the decision algorithm is summarised here:
    For each switch in the rules:
        Does the switch need a reference file?
            no:  the switch is set by evaluating the rule
            yes: did CDBS return a file for this (switch, dataset) key?
                no:  missing reference file, the switch is set to OMIT
                yes: the switch is set by evaluating the rule, e.g.:
                         Evaluation for DARKCORR is PERFORM
                         Reason being 22
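The decision algorithm above, combined with the rule logic shown earlier for DARKCORR, can be sketched in Python (a simplified re-expression of the generated Perl code; the keyword names mirror the rule example, the rest is illustrative):

```python
def evaluate_darkcorr(header, have_darkfile):
    """Decide the DARKCORR switch for a NICMOS dataset.

    Mirrors the generated rule code: the switch is forced to OMIT when
    the reference file is missing, regardless of the rules; otherwise
    the rule chain decides PERFORM/OMIT and records a reason code.
    """
    if not have_darkfile:
        return "OMIT", "missing reference file"
    if header.get("TARGNAME") == "DARK":
        return "OMIT", 19
    if header.get("FILTER", "").endswith("BLANK"):  # Perl: /BLANK$/
        return "OMIT", 20
    if header.get("OPMODE") == "BRIGHTOBJ":
        return "OMIT", 21
    return "PERFORM", 22
```

For an ordinary science exposure with the dark reference file in hand, this yields PERFORM with reason 22, matching the evaluation shown above.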
The result is put into a CL script which will then be executed to alter the raw file header.
Each pset is finally executed via the putcal IRAF task, which modifies the header keywords in the specified raw files. The datasets are then ready to be calibrated with the most recent version of the STSDAS calibration S/W.
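The net effect of putcal can be sketched as a simple keyword update on the raw header (an illustrative Python stand-in, not the actual IRAF task, which edits the FITS files in place):

```python
def apply_pset(header, pset):
    """Apply the evaluated switches and reference files to a raw header.

    `pset` holds keyword/value pairs such as DARKCORR='PERFORM' or
    DARKFILE='oref$ibk1356ko_drk.fits'; existing values are overwritten
    so that the calibration software sees the up-to-date configuration.
    """
    updated = dict(header)
    updated.update(pset)
    return updated
```

After this step the header, rather than any external state, drives the calibration, which is exactly what the STSDAS ``header driven'' design requires.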
The OTF system contributed (and still contributes) to the reliability of the calibration software: we found and reported problems to the STScI/STSDAS group, which quickly fixed them. As soon as a new version of the calibration software is released we install it in our pipeline. Our archive users, with their archival requests, also contribute to testing the software. The OTF pipeline is used at CADC to produce NICMOS and STIS preview images/spectra of all the available datasets (15 minutes after release date!), further contributing to testing the pipeline.
In other words, the OTF pipeline, being in a never-ending development phase and continuously receiving new reference files, is a dynamic system. An observation calibrated two months ago is different from the same observation calibrated today. Only at the end of the life of an instrument, when the ``final archive'' is produced (i.e., no further development is foreseen), will this process stop and the best(?) calibration pipeline be available to the community.
Since the pipeline is entirely modular, it is quite easy for us to extend it. One example is the preview pipeline. All the HST preview images and spectra used at STScI, ST-ECF and CADC are generated at CADC, as an extension of the on-the-fly calibration system. Their production, which takes place just after the data become public, contributes to:
At the time of writing, the HST archive comprises 338 CDs of raw data and 18 GBytes of calibration files.
The HST OTF service is available at CADC and ST-ECF. Please contact these respective sites through catalog@eso.org or cadc@hia.nrc.ca.
Lubow, S. & Pollizzi, J. 1999, this volume, 187