HST Science Instruments (SIs) are regularly decommissioned and replaced during servicing missions. The scientific and calibration/engineering data archives filled during routine operations are therefore quite inhomogeneous, both in the quality of the routine calibration applied and in the degree of insight into instrumental effects that found its way into the calibration pipeline software and the associated calibration database. Typically this resulted in step-function increments in the quality of the products.
Scientific use of the near-real-time calibrated data and close examination of the global calibration database often reveal discrepancies, some of them so severe that the scientific value of the data is compromised. The goal of the POA project is to correct these deficiencies using new software and a new approach to calibration.
Traditional calibration in astronomy has been based on the comparison of measured values with known standards. The correlations found in this manner are usually approximated by polynomial fits, which tend to blend many different effects and do not reflect the physics of the instrument at all. Severe problems occur when measurements are taken outside the range covered by standard sources (e.g. going faint), because of the extrapolation then required (Rosa 1995).
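The extrapolation hazard can be illustrated with a toy example: an empirical polynomial is fitted to a physically exponential instrument response over the bright range covered by standards, and then evaluated at the faint end. All quantities here are invented for illustration; none come from an actual HST instrument.

```python
import numpy as np

# Toy instrument response: the true physics is an exponential falloff,
# but calibration standards only cover the bright end (x in [1, 3]).
def true_response(x):
    return np.exp(-x)

x_cal = np.linspace(1.0, 3.0, 20)
coeffs = np.polyfit(x_cal, true_response(x_cal), deg=2)  # empirical fit

# Inside the calibrated range the polynomial does well...
x_in = 2.0
err_in = abs(np.polyval(coeffs, x_in) - true_response(x_in))

# ...but extrapolated to the faint end (x = 6) it fails badly:
# the quadratic turns upward while the true response keeps falling.
x_out = 6.0
err_out = abs(np.polyval(coeffs, x_out) - true_response(x_out))
```

Within the fitted range the error stays below a percent of the signal, while the extrapolated value is off by orders of magnitude relative to the true faint-end response; a model built from the physical law would not suffer this failure.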
To solve this problem we are constructing instrument software models, which follow photons through the components of the instrument using first principles, plus a Monte Carlo-type simulation to account for statistics and uncertainties (Ballester & Rosa 1997). Inserting this a priori physical information reduces the number of unknowns and therefore the amount of calibration data required to monitor the instrument (Rosa 1997).
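The structure of such a simulation can be sketched very compactly: each photon is followed component by component, with survival at each surface decided from its first-principles throughput, so the Monte Carlo draw supplies the counting statistics on top of the deterministic physics. The light path and all throughput numbers below are hypothetical stand-ins, not values from the actual FOS model.

```python
import random

# Hypothetical throughput terms for a toy three-component light path
# (none of these numbers describe a real HST instrument).
MIRROR_REFLECTIVITY = 0.85
GRATING_EFFICIENCY = 0.60
DETECTOR_QE = 0.25

def propagate(n_photons, seed=42):
    """Follow photons through the instrument components one by one;
    a photon is lost at a component with probability (1 - throughput)."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_photons):
        for throughput in (MIRROR_REFLECTIVITY, GRATING_EFFICIENCY, DETECTOR_QE):
            if rng.random() > throughput:
                break  # photon absorbed or scattered at this component
        else:
            detected += 1  # photon survived every component
    return detected

counts = propagate(100_000)
# The mean detected fraction is the product of the throughputs (0.1275);
# the Monte Carlo scatter around it mimics photon-counting noise.
```

A real instrument model would of course trace wavelength-dependent optics, detector geometry and the onboard electronics, but the principle is the same: the physics fixes the mean behaviour, and only the residual stochastic terms remain to be constrained by calibration data.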
By removing the known (physical) effects from the calibration data, this predictive calibration dramatically enhances the visibility of signatures not yet accounted for. Examples are camera distortions or thermal zero-point drifts in the dispersion relations of spectrographs (Figure 1).
Flight operations of this low-resolution HST spectrograph ran from 7/90 to 12/96, and the data were processed using different versions of the ground data processing pipeline. Although a scientific archive with high potential value was accumulated, the calibration is neither consistent nor coherent. While many of the research goals of the original prime investigators could be achieved with the limited quality of the particular calibrated data sets in question, a posteriori archival research, which usually aims at exploiting the ``unintended'' science, suffers considerably more from the heterogeneous quality levels of a large archival database. In addition, several effects that were not properly accounted for, or were even unknown at the time of data taking and ground processing, severely compromise the scientific value. Examples are scattered light and the improper compensation of the geomagnetically induced image wandering that affects all wavelength-scale zero points and photometric accuracies (Rosa et al. 1998).
The ST-ECF has been contributing to the FOS calibration effort since 1985.
The ST-ECF has also been operating the European Science Data Archive since 1985. Data re-calibration during retrieval (``on-the-fly calibration'') was implemented by the ST-ECF as early as 1996. This offers the opportunity to re-calibrate all FOS data using more appropriate calibrations based on software models of the HST/FOS components and their environment.
Even though the detectors were carefully shielded, the Digicons in the FOS turned out to be sensitive to the geomagnetic field. The effect originally recognized is a distortion and/or broadening of the spectral lines due to the motion of HST through the geomagnetic field. In addition, the wandering of the image produces absolute and color-term photometric errors in small apertures. An attempt was made in 1993 to solve this geomagnetic image motion problem (GIMP) through patches in the flight software. However, residual shifts remain, introducing errors of up to 450 km/s (Figure 2).
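The scale of such residual shifts follows from a back-of-the-envelope Doppler conversion, dv = c dλ/λ: a small uncorrected displacement of the spectrum on the detector translates directly into an apparent radial-velocity error. The dispersion and shift values below are illustrative only, not actual FOS numbers.

```python
C_KM_S = 299792.458  # speed of light in km/s

def velocity_error(shift_diodes, dispersion_A_per_diode, wavelength_A):
    """Express a residual zero-point shift (in detector diodes) as an
    apparent radial-velocity error via dv = c * dlambda / lambda."""
    dlambda = shift_diodes * dispersion_A_per_diode
    return C_KM_S * dlambda / wavelength_A

# Illustrative numbers only: a shift of 1.5 diodes at 1.0 A/diode
# observed near 1000 A already corresponds to ~450 km/s.
err = velocity_error(1.5, 1.0, 1000.0)
```

The steep scaling with 1/λ is why the far-UV spectra are the most sensitive to residual image motion.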
Our FOS instrument model uses orbital locations based on NORAD ephemerides (accurate to better than 100 m at 600 km altitude) and geomagnetic field software from GSFC, scaled with actual HST magnetometer readings. The displacement of the spectra was determined from an accurate opto-electronic model of the FOS. The flight software was reverse engineered to determine, and undo whenever possible, the error introduced by the onboard GIMP correction.
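The chain position → field → image displacement → residual can be sketched as follows. The dipole approximation stands in for the GSFC field model, and the linear displacement law and its sensitivity constant are pure inventions for illustration; the real opto-electronic model is far more detailed.

```python
import math

def dipole_field(lat_rad, alt_km):
    """Crude dipole approximation of the geomagnetic field magnitude (nT)
    at orbital altitude; a stand-in for the GSFC field model."""
    B0 = 31000.0   # approximate equatorial surface field, nT
    RE = 6371.0    # Earth radius, km
    r = (RE + alt_km) / RE
    return B0 / r**3 * math.sqrt(1.0 + 3.0 * math.sin(lat_rad) ** 2)

def image_shift(b_transverse_nT, sensitivity_diodes_per_nT=1.0e-4):
    """Displacement of the spectrum on the photocathode, taken here as
    linear in the transverse field; the sensitivity constant is invented."""
    return b_transverse_nT * sensitivity_diodes_per_nT

def residual_shift(b_now_nT, b_at_last_update_nT):
    """If an onboard correction used the field sampled at discrete
    updates, the residual is the shift from the field change since the
    last update -- the quantity a ground recalibration must remove."""
    return image_shift(b_now_nT) - image_shift(b_at_last_update_nT)
```

Evaluating the field along the NORAD-derived orbit at the exact exposure times, rather than at coarse onboard update intervals, is what lets the ground recalibration undo the residual error.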
The goal of the POA project is a thorough and comprehensive review of the combined effect of calibration reference data and calibration pipeline software, taking the post-facto opportunity to view an instrument's entire lifetime dataset. A revision of the wavelength calibration of the Faint Object Spectrograph (FOS, decommissioned 1997) is currently being finalized and will become available to the community in early 2000. Close-out for the major sources of reduced scientific quality in the current FOS archive is expected for mid-2000. The GHRS archive will follow suit in late 2000 to mid-2001. Foreseen deliverables are copies of the recalibrated data for inclusion in the data archive at STScI, as well as on-line availability through the on-the-fly recalibration mechanism offered by the ST-ECF/CADC archive sites. Documentation will be updated and made available through the Web, with the support pages at STScI as the primary portal.
Ballester, P., & Rosa, M. R. 1997, A&A Suppl., 126, 563
Rosa, M. R. 1995, Predictive Calibration Strategies, in ``Calibrating and Understanding HST and VLT Instruments'', ed. P. Benvenuti, ESO/ST-ECF Workshop, 43
Rosa, M. R. 1997, Predictive Calibration using Instrument Models, ST-ECF Newsletter, 24, 15
Rosa, M. R., Kerber, F., & Keyes, Ch. T. 1998, Zero Points of FOS Wavelength Scales, FOS Instrument Science Report CAL/FOS-149, STScI