

Astronomical Data Analysis Software and Systems VII
ASP Conference Series, Vol. 145, 1998
Editors: R. Albrecht, R. N. Hook and H. A. Bushouse

VLT Data Quality Control

P. Ballester, V. Kalicharan, K. Banse, P. Grosbøl, M. Peron, and M. Wiedmer
European Southern Observatory, Karl-Schwarzschild Strasse 2, D-85748 Garching, Germany

 

Abstract:

The ESO Very Large Telescope (VLT) will provide astronomers with the capability to acquire large amounts of high signal-to-noise, multi-dimensional observational data. In order to exploit these data fully, the entire chain of the observation process must be optimized by means of operational procedures and control tools. As modern observatories evolve into production units delivering data products and calibrated scientific data to end users, new challenges arise in assessing the quality of those data products.

1. Introduction

Quality control is concerned with the quantitative assessment of calibration and science data. Controlling a quantity involves a measurement procedure and the comparison of the measurement to a pre-determined target value. By analogy with speed control on a motorway, the system can be described as a device (a radar) collecting measurements of system performance (the speed of passing cars); each measurement is compared to a target value (the speed limit), and if a discrepancy is identified, an operator (a police officer) takes corrective action.

In an astronomical context, values are measured by the pipeline on raw and reduced exposures, as well as by the Astronomical Site Monitor, which tracks ambient-condition parameters. Target values include user-requested parameters, initial performance solutions, and modeled performance.
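To make this control loop concrete, the following minimal sketch (illustrative only, not part of the VLT system; names, values, and tolerances are invented) compares a measured quantity against a pre-determined target and reports a discrepancy for corrective action:

    from dataclasses import dataclass

    @dataclass
    class QCCheck:
        # One controlled quantity: a target value and an allowed tolerance.
        name: str
        target: float
        tolerance: float

        def evaluate(self, measured: float) -> bool:
            # True if the measurement lies within tolerance of the target.
            return abs(measured - self.target) <= self.tolerance

    # Hypothetical example: read-out noise measured on a bias frame (electrons).
    check = QCCheck(name="read-out noise", target=5.0, tolerance=0.5)
    measured = 5.8
    if not check.evaluate(measured):
        # Corrective action: here, simply report the discrepancy to an operator.
        print(f"QC failure: {check.name} = {measured}, "
              f"expected {check.target} +/- {check.tolerance}")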

Quality control is essentially a distributed activity. It takes place at different locations and at different moments during the life cycle of an observation. Target values are produced off-line and must be transferred to the on-line control system. Results are recorded and summarized for evaluation and trend analysis.

2. On-line Quality Control

2.1. Verification of Science Data

Astronomers preparing their observations with the Phase II Proposal Preparation system (P2PP) can request a range of ambient conditions, including airmass, seeing, and moon phase. The observation scheduler takes into account the ambient conditions prevailing before starting the exposure. The on-line quality control system verifies these conditions after the exposure has been completed, and the observation is flagged if the ambient conditions do not match the user-requested values. Additional verification is performed on the raw frames, for example for pixel saturation or read-out noise.
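A post-exposure verification of this kind could be sketched as follows, assuming hypothetical key names for the requested constraints and the measured conditions:

    def verify_ambient_conditions(requested, measured):
        # Flag an exposure whose ambient conditions violate the user request.
        # `requested` holds limits from Phase II preparation (hypothetical keys),
        # `measured` the values recorded during the exposure.
        # Returns a list of QC flags; an empty list means no violation.
        flags = []
        if measured["seeing"] > requested["max_seeing"]:
            flags.append(f"seeing {measured['seeing']:.2f} arcsec exceeds "
                         f"requested {requested['max_seeing']:.2f} arcsec")
        if measured["airmass"] > requested["max_airmass"]:
            flags.append(f"airmass {measured['airmass']:.2f} exceeds "
                         f"requested {requested['max_airmass']:.2f}")
        if measured["moon_illumination"] > requested["max_moon_illumination"]:
            flags.append("moon illumination above requested limit")
        return flags

    # Example: an exposure taken in slightly worse seeing than requested.
    flags = verify_ambient_conditions(
        requested={"max_seeing": 0.8, "max_airmass": 1.5,
                   "max_moon_illumination": 0.3},
        measured={"seeing": 0.95, "airmass": 1.4, "moon_illumination": 0.1})
    for f in flags:
        print("FLAG:", f)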

2.2. Ambient Conditions

The Astronomical Site Monitor (ASM) provides information about weather conditions and other environment parameters (Sarazin & Roddier 1990). The site monitor data are based on cyclic readings of a variety of sensors and measuring equipment, together with the calculation of derived quantities. The measurements include seeing, scintillation, atmospheric extinction, cloud coverage, meteorological data, and all-sky images. These measurements are compared with the user-requested values by the quality control system. Independent seeing measurements are made on raw images during the off-line quality control stage.

The image quality measured at the focal plane of the instrument is usually larger than the ASM value because of the internal seeing resulting from dome, telescope and instrument thermal effects. The QC system will therefore accumulate data and allow the correlation of external ASM seeing measurements with instrumental image quality.
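Such accumulated data could be exploited as in the sketch below, which correlates invented ASM seeing values with focal-plane image quality using only the standard library:

    import math

    # Paired measurements accumulated over many exposures (invented values):
    # (ASM seeing, image quality at the instrument focal plane), in arcsec.
    pairs = [(0.6, 0.75), (0.8, 0.95), (0.7, 0.85), (1.0, 1.20), (0.9, 1.05)]

    def pearson(xs, ys):
        # Pearson correlation coefficient of two equal-length sequences.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    asm, iq = zip(*pairs)
    print(f"correlation r = {pearson(asm, iq):.3f}, "
          f"mean internal-seeing excess = "
          f"{sum(q - s for s, q in pairs) / len(pairs):.2f} arcsec")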

2.3. Verification of Calibration Data

One direct application of quality control is the verification of instrument performance based on the analysis of calibration exposures. In this case a reference source of known characteristics is observed, the performance parameters can be measured accurately, and the images are repeatable exposures taken under standard conditions, which makes them well suited to automatic processing. It is therefore possible to check that the observation equipment is working normally by analyzing exposures of calibration or reference targets.

Observations taken at ground-based observatories are affected by diverse sources of variability. The changing characteristics of optical and electronic systems, together with atmospheric effects, make it necessary to re-calibrate the equipment frequently. By regularly monitoring the characteristics of the calibration solutions, it is possible to discriminate between the stable, the slowly varying, and the unstable components of the solutions, and therefore to learn about the characteristics of the instrument. The stable part of a calibration solution can usually be explained by physical models (Ballester & Rosa 1997).
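One simple way to separate the slowly varying component of a monitored calibration parameter from its unstable component is a running median with an outlier threshold, as in this sketch (the data and the threshold are invented):

    import statistics

    # A monitored calibration parameter sampled once per night
    # (e.g., a wavelength-calibration zero point); values are invented.
    series = [10.00, 10.01, 10.02, 10.02, 10.40, 10.03, 10.04, 10.05, 10.05]

    def flag_outliers(values, window=5, threshold=0.1):
        # A running median tracks the stable / slowly varying component;
        # large residuals mark the unstable component of the solution.
        flagged = []
        half = window // 2
        for i, v in enumerate(values):
            lo, hi = max(0, i - half), min(len(values), i + half + 1)
            trend = statistics.median(values[lo:hi])
            if abs(v - trend) > threshold:
                flagged.append((i, v, trend))
        return flagged

    for night, value, trend in flag_outliers(series):
        print(f"night {night}: value {value:.2f} deviates from trend {trend:.2f}")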

In the on-line system, calibration data are verified against the reference solutions. A graphical window is regularly updated to display the measurements (Figure 1). The system was tested as a prototype at the New Technology Telescope (NTT).


 
Figure 1: The prototype on-line quality control system at the New Technology Telescope

Performance measurement is an essential step toward resolving the inherent conflict between the need for calibration data and the time available for scientific observations. Regular monitoring makes it possible to decide which calibration data are actually required and on which timescale they need to be updated.

3. Off-Line Quality Control

Technical programs are scheduled by applying the instrument calibration plan, which describes the type and frequency of the calibration exposures required to monitor the characteristics of the observation equipment. The calibration data resulting from technical programs are delivered to the users in addition to their scientific data, and are used by the Data Flow Instrument Responsibles (DFIR) to prepare master calibration solutions and to monitor instrument performance.
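A calibration plan of this kind is essentially declarative; the sketch below shows one possible representation (the exposure types, frequencies, and monitored parameters are invented, not taken from an actual VLT calibration plan):

    # A hypothetical representation of an instrument calibration plan: each
    # entry names a calibration exposure type, how often it must be taken,
    # and which performance parameters are monitored from it.
    CALIBRATION_PLAN = {
        "bias":          {"frequency_days": 1,
                          "monitors": ["read-out noise", "bias level"]},
        "flat_field":    {"frequency_days": 1,
                          "monitors": ["throughput", "pixel response"]},
        "arc_lamp":      {"frequency_days": 7,
                          "monitors": ["dispersion relation", "flexure"]},
        "standard_star": {"frequency_days": 30,
                          "monitors": ["atmospheric extinction"]},
    }

    def due_calibrations(days_since_last):
        # Return the exposure types whose last execution is older than allowed.
        return [name for name, spec in CALIBRATION_PLAN.items()
                if days_since_last.get(name, float("inf"))
                   >= spec["frequency_days"]]

    print(due_calibrations({"bias": 0.5, "flat_field": 2, "arc_lamp": 3}))
    # -> ['flat_field', 'standard_star']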

The calibration observation blocks are pre-processed by the instrument pipeline in order to generate a preliminary calibration solution as well as quality measurements. The Data Flow Instrument Responsible is notified after the execution of a calibration observation block. The pre-processed data are retrieved from the temporary area and reduced to produce the master calibration data. After certification, the master data are included in the central archive and distributed to all local copies of the calibration database.

Exposure time calculators and other models are used as references for the instrument performance. They are also used as observation preparation tools, and made available on the Internet.
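At the core of such an exposure time calculator is the standard CCD signal-to-noise relation; a bare-bones version, neglecting dark current, is sketched below (the rates and noise figures in the example are invented):

    import math

    def snr(source_rate, sky_rate, read_noise, npix, exptime):
        # Point-source signal-to-noise ratio for a CCD exposure.
        #   source_rate : detected electrons/s from the object (whole aperture)
        #   sky_rate    : detected electrons/s from the sky, per pixel
        #   read_noise  : electrons RMS per pixel per read
        #   npix        : number of pixels in the measurement aperture
        #   exptime     : exposure time in seconds
        signal = source_rate * exptime
        noise = math.sqrt(signal + npix * (sky_rate * exptime + read_noise ** 2))
        return signal / noise

    # Invented example: 50 e-/s object, 2 e-/s/pixel sky, RON 5 e-,
    # 20-pixel aperture.
    for t in (60, 300, 1200):
        print(f"t = {t:5d} s  ->  S/N = {snr(50, 2, 5, 20, t):6.1f}")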

Among the parameters monitored with the off-line system are instrument variability (e.g., flexure, drifts, throughput) and peculiarities (e.g., non-linearities, scattering, and reflections). The accuracy of the calibration solutions is verified by the DFIR and documented in the quality control report database.

For the validation of science user data, a more complete verification of the user-requested parameters is made off-line, and the science data are associated with calibration solutions. The information about calibration accuracy collected during on-line verification is added to the data.
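Associating science frames with calibration solutions essentially amounts to selecting, for each frame and each calibration type, the certified solution closest in time, as in this sketch (the record layout is hypothetical):

    from dataclasses import dataclass

    @dataclass
    class CalibSolution:
        kind: str        # e.g., "flat_field", "dispersion"
        mjd: float       # epoch of validity (Modified Julian Date)
        certified: bool  # set once verified by the instrument responsible

    def associate(science_mjd, kind, solutions):
        # Pick the certified solution of the requested kind closest in time.
        candidates = [s for s in solutions if s.kind == kind and s.certified]
        if not candidates:
            raise LookupError(f"no certified {kind} solution available")
        return min(candidates, key=lambda s: abs(s.mjd - science_mjd))

    solutions = [CalibSolution("flat_field", 50820.1, True),
                 CalibSolution("flat_field", 50822.1, False),
                 CalibSolution("flat_field", 50823.0, True)]
    print(associate(50822.5, "flat_field", solutions))  # -> solution at 50823.0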

References:

Ballester, P., & Rosa, M. 1997, ESO Preprint 1220, in press

Sarazin, M., & Roddier, F. 1990, "The E.S.O. Differential Image Motion Monitor", Astron. Astrophys., 227, 294


© Copyright 1998 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA

