
Sahnow, D. J., & Dixon, W. V. 2003, in ASP Conf. Ser., Vol. 295, Astronomical Data Analysis Software and Systems XII, eds. H. E. Payne, R. I. Jedrzejewski, & R. N. Hook (San Francisco: ASP), 245

The Next Step for the FUSE Calibration Pipeline

David J. Sahnow, W. Van Dyke Dixon, and the FUSE Science Data Processing Group
Department of Physics and Astronomy, The Johns Hopkins University, Baltimore, MD 21218, Email: sahnow@pha.jhu.edu

Abstract:

The calibration pipeline for the Far Ultraviolet Spectroscopic Explorer (FUSE) satellite was designed years before it was launched. Since then, a number of unexpected instrumental features have been discovered, and the pipeline has been modified accordingly. Eventually, these changes made the design so cumbersome that the pipeline became difficult to maintain. In 2002, we began to develop a new pipeline concept that takes into account the actual instrument characteristics. We present our plans for this improved calibration pipeline.

1. Introduction

The design of the CalFUSE pipeline dates to well before the launch of FUSE. As the primary FUSE mission draws to a close and an extended mission begins, the resources available for maintaining the existing pipeline will diminish. Thus, it is prudent to rethink the design, consider ways to make it easier to maintain, and investigate changes that may improve the data quality. This process began in the summer of 2002, when we proposed a new method of calibrating the data for version 3 of the pipeline. These changes, described in the following sections, are intended to improve the data quality while ensuring flexibility for future modifications. The ideas behind them have been prompted by our three years of experience with FUSE data, along with information obtained during the design of the pipeline for the Cosmic Origins Spectrograph, which will use a similar detector (Beland et al., this conference).

The present FUSE pipeline (Dixon et al. 2003) is less flexible than desired when dealing with a number of instrument properties that were discovered (or appreciated more clearly) after launch. These include the thermally-induced motions of the mirrors and gratings, changes in the detector y scale as a function of count rate, event bursts, the "worm," and the decrease in pointing stability due to the failure of reaction wheels. Some of these effects are due to unexpected performance of the instrument hardware, while others are a consequence of the analog nature of the double delay line detectors. A principal shortcoming of the original design is that time-tag data were converted into a two-dimensional image in an early step. This approach would work well if there were no time-varying effects on the data, but that is not the case for FUSE.

In addition to being developed with the instrument anomalies in mind, the new design is more flexible, so that any new effects discovered as the instrument ages can be dealt with more gracefully. The modular design should allow new modules to be added with little or no effect on the existing ones. Although the current design also permits modules to be added, doing so is difficult because each module creates its own output file and expects a unique format for its input.

2. The Life and Death of a Photon

Figure 1 shows the path of a photon through the instrument. The following list describes each effect. Items marked with an asterisk were not considered in the original pipeline design.

1.1 Doppler Shift due to motion of satellite.
1.2 Wavelength shift due to heliocentric motion.
2. *Satellite pointing jitter.
3. Four Barrel design -- divides incoming light among channels.

Figure 1: A schematic view of the path of a photon through the FUSE instrument. The steps which affect the data (and consequently, the pipeline) are numbered; each of these must be compensated for in the pipeline process.

4.1 *Mirror motions due to thermal effects, which cause motion of the spots at the FPAs.
4.2 Mirror reflectivity.
5. Focal Plane Assembly (FPA) position, which shifts the location of the spectra on the detectors.
6.1 Grating efficiency.
6.2 Dispersion & astigmatism due to grating design & alignment.
6.3 *Grating motions due to thermal effects, which cause motion of the spectra on the detector.
7. *The "worm," caused by an interaction of the optical design and the detector grid wires.
8.1 Detector quantum efficiency.
8.2 Detector flat field.
8.3 Detector bad pixels.
8.4 Detector background.
9.1 *Detector "walk" -- the position assigned to a photon depends on its pulse height (see the sketch after this list).
9.2 Detector geometric distortion effects.
9.3 *Detector change in Y scale as a function of count rate.
9.4 Detector shift and stretch as a function of temperature.
9.5 Detector electronics dead time.
10. Instrument Data System (IDS) computer dead time.
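
As an illustration of how an effect such as the walk (9.1) can be removed photon by photon, the Python sketch below applies a generic linear walk model. This is not the actual FUSE calibration; the function name, coefficient, and reference pulse height are hypothetical stand-ins for whatever correction the calibration data dictate.

    # Hypothetical walk correction: shift each photon's x position by an
    # amount proportional to how far its pulse height lies from a reference
    # value.  The linear model and its coefficients are illustrative only.
    import numpy as np

    def correct_walk(x_raw, pulse_height, coeff=0.05, pha_ref=16):
        """Return walk-corrected x positions for an array of photon events."""
        x_raw = np.asarray(x_raw, dtype=float)
        pulse_height = np.asarray(pulse_height, dtype=float)
        return x_raw - coeff * (pulse_height - pha_ref)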

3. Processing Steps

A major improvement in version 3 is the use of a single Intermediate Data File (IDF) for the entire pipeline. The IDF is a FITS file containing a binary table in the first extension. This extension contains one row per photon, and has columns for time, x, y, and pulse height from the raw data; x and y in the geometrically undistorted detector frame; a weighting factor for each photon; x and y after all motions are removed; channel; and wavelength. Nearly all of the pipeline modules operate on this one file, by reading and writing particular columns. A simplified outline of the processing steps is presented below. The numbers in parentheses refer to the steps in the previous section.
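
For illustration, the table layout described above might look like the following Python sketch using astropy; the column names, data types, and extension name are assumptions made for this example, not the actual IDF keywords.

    # Sketch of an Intermediate Data File: a FITS binary table with one row
    # per photon event.  Column names and types here are illustrative only.
    import numpy as np
    from astropy.io import fits

    n_photons = 1000
    cols = fits.ColDefs([
        fits.Column(name='TIME',    format='D', array=np.zeros(n_photons)),                    # event time (s)
        fits.Column(name='XRAW',    format='I', array=np.zeros(n_photons, dtype=np.int16)),    # raw detector x
        fits.Column(name='YRAW',    format='I', array=np.zeros(n_photons, dtype=np.int16)),    # raw detector y
        fits.Column(name='PHA',     format='B', array=np.zeros(n_photons, dtype=np.uint8)),    # pulse height
        fits.Column(name='X',       format='E', array=np.zeros(n_photons, dtype=np.float32)),  # undistorted x
        fits.Column(name='Y',       format='E', array=np.zeros(n_photons, dtype=np.float32)),  # undistorted y
        fits.Column(name='WEIGHT',  format='E', array=np.ones(n_photons, dtype=np.float32)),   # per-photon weight
        fits.Column(name='XFINAL',  format='E', array=np.zeros(n_photons, dtype=np.float32)),  # x, motions removed
        fits.Column(name='YFINAL',  format='E', array=np.zeros(n_photons, dtype=np.float32)),  # y, motions removed
        fits.Column(name='CHANNEL', format='B', array=np.zeros(n_photons, dtype=np.uint8)),    # channel id
        fits.Column(name='LAMBDA',  format='E', array=np.zeros(n_photons, dtype=np.float32)),  # wavelength (A)
    ])
    hdul = fits.HDUList([fits.PrimaryHDU(),
                         fits.BinTableHDU.from_columns(cols, name='EVENTS')])
    hdul.writeto('idf_example.fits', overwrite=True)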

Put all photons in a rectified image frame:

Remove Motions:

Assign Wavelengths:

Screen the Data:

Calibration:

4. Some Advantages of Version 3

The single Intermediate Data File means that the I/O is the same for all pipeline modules, and thus the order of modules can be changed, or new ones added, with a minimum of complication. Because the flow of the pipeline processing steps follows the inverse of the "life of the photon" more closely than previous versions did, the steps are easier for users to understand and the pipeline is easier to maintain.
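
The interface this implies could be sketched in Python as follows; the module names, column names, and the corrections they apply are hypothetical, chosen only to show how a common IDF makes the steps interchangeable.

    # Illustrative module pattern: every step reads and writes columns of the
    # same IDF events table, so steps can be reordered or new ones inserted
    # without changing the surrounding code.  All names are hypothetical.
    from astropy.io import fits

    def remove_grating_motion(events):
        """Shift YFINAL to take out thermally induced grating motion (step 6.3)."""
        events['YFINAL'] = events['Y']  # placeholder for the measured shift
        return events

    def assign_wavelengths(events):
        """Fill the LAMBDA column from the motion-corrected positions."""
        events['LAMBDA'] = 1000.0 + 0.01 * events['XFINAL']  # placeholder dispersion
        return events

    PIPELINE = [remove_grating_motion, assign_wavelengths]  # order is easy to change

    def run_pipeline(idf_name):
        with fits.open(idf_name, mode='update') as hdul:
            events = hdul['EVENTS'].data
            for step in PIPELINE:
                events = step(events)
            hdul.flush()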

Housekeeping (pointing stability, count rates, and high voltage values) data are used where appropriate to improve the quality of the data.
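
As one example of how a housekeeping time series can feed back into the photon list, the sketch below zeroes the weight of photons that arrive while the detector count rate exceeds a threshold; the column names and the threshold value are assumptions made for this illustration.

    # Hypothetical screening step: photons recorded while the housekeeping
    # count rate exceeds a burst threshold keep their row in the IDF but
    # receive zero weight, so later steps simply ignore them.
    import numpy as np

    def flag_high_count_rate(events, hk_time, hk_rate, max_rate=5000.0):
        """Zero the WEIGHT of photons taken while hk_rate > max_rate (counts/s)."""
        bad = np.interp(events['TIME'], hk_time, hk_rate) > max_rate
        events['WEIGHT'][bad] = 0.0
        return events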

Since every pixel is assigned a floating point wavelength -- rather than having every photon put in a wavelength bin, as happens now -- the final one-dimensional spectrum can be binned to any convenient wavelength scale. This permits straightforward addition of data from multiple segments. Because the analog photon positions (with times attached) are maintained for as long as possible, the roundoff problems that currently exist will be minimized.
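
A minimal sketch of this rebinning, assuming the photon wavelengths and weights have already been read from the IDF, is:

    # Rebin weighted photon events onto an arbitrary wavelength grid.  Since
    # each event carries a floating-point wavelength, the same events can be
    # summed onto any grid (and across segments) after the fact.
    import numpy as np

    def extract_spectrum(wavelength, weight, wave_min, wave_max, binsize):
        """Weighted histogram of photon wavelengths; binsize is in Angstroms."""
        edges = np.arange(wave_min, wave_max + binsize, binsize)
        counts, _ = np.histogram(wavelength, bins=edges, weights=weight)
        centers = 0.5 * (edges[:-1] + edges[1:])
        return centers, counts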

Acknowledgments

The NASA-CNES-CSA FUSE mission is operated by the Johns Hopkins University under NASA contract NAS5-32985.

References

Dixon, W. V., & Sahnow, D. J. 2003, this volume, 241

