
Ballester, P., Dorigo, D., Disarò, A., Pizarro De La Iglesia, J. A., & Modigliani, A. 2000, in ASP Conf. Ser., Vol. 216, Astronomical Data Analysis Software and Systems IX, eds. N. Manset, C. Veillet, D. Crabtree (San Francisco: ASP), 461

Data Quality Control at the Very Large Telescope

P. Ballester
European Southern Observatory, Data Management Division, Garching, Germany

D. Dorigo
Terma Elektronik GmbH, Weiterstadt, Germany

A. Disarò
Terma Elektronik GmbH, Weiterstadt, Germany

J. A. Pizarro De La Iglesia
Serco Facilities Management GmbH, Munich, Germany

A. Modigliani
Terma Elektronik GmbH, Weiterstadt, Germany


Service observing at large ground-based observatories like the Very Large Telescope benefits from the delivery of standard data products to the users. The operational applications needed to quantitatively assess VLT calibration and science data are provided by the VLT Quality Control system. The Quality Control system must cope with the large data volumes produced by VLT observations, the geographical distribution of data handling, and the parallelism of observations performed with the different VLT unit telescopes and instruments. It includes three main components. The calibration database is a distributed system for collecting and processing pipeline calibration data before storage in the VLT Archive. The McCreator application allows the operation teams to prepare calibration reference data and control the instrument performance. Finally, quality control parameters measured at levels QC0 and QC1 support the assessment and trend analysis of observation data.

1. Service Observing at the Very Large Telescope

The operations model of the VLT allows PIs to apply for visitor-mode or service-mode observing programs. Visitor mode corresponds to the mode of operations that has prevailed until now at most ground-based observatories: the visiting astronomer is physically present at the telescope and can adapt the observation program to specific target properties, changing observing conditions, or calibration needs. Service mode is inspired by the operations of space-borne observing facilities such as the Hubble Space Telescope. The Data Flow system (Grosbøl & Peron 1997) binds the components involved in the observation life-cycle, from observation preparation and execution to processing and archiving (Silva & Quinn 1997).

The procedure for proposal preparation in the Data Flow system involves a Phase I and a Phase II. In Phase I, proposals are submitted electronically to ESO and evaluated by the Observing Program Committee (OPC). After the OPC selection has taken place, Phase II preparation is based on template forms describing standard instrument modes and configurations. Observation Blocks are created by specifying the template parameters, target information, and user-defined scheduling constraints. The user is assisted in these phases by an Instrument Scientist and by observation preparation tools, which include generic systems such as finding-chart generators and guide-star selection systems, as well as instrument-related tools such as exposure time calculators (ETCs). Feasibility checks of the proposals are performed by the observatory and cover technical feasibility and exposure time control.

The Observation Blocks are queued for service observing and organized in a schedule managed on a long-, medium-, and short-term basis. The Short Term Scheduler (STS) is a decision support system that assists the operations manager in producing the observing time-line for one or a few nights. Its main purpose is to maximize the utilization of telescope time while respecting a number of heterogeneous, user-defined constraints: target visibility, lunar illumination, weather, and absolute and relative timing. The STS is a flexible, interactive system that can be run automatically or semi-automatically, so that the operator can always override its suggestions.
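The constraint-respecting selection step can be illustrated with a small sketch. The field names and threshold semantics below are hypothetical stand-ins for the actual Observation Block constraints, not the real STS data model:

```python
from dataclasses import dataclass

@dataclass
class ObservationBlock:
    """Hypothetical OB record with user-defined scheduling constraints."""
    name: str
    max_airmass: float             # target visibility constraint
    max_seeing: float              # arcsec
    max_lunar_illumination: float  # fraction, 0..1

def schedulable(ob, airmass, seeing, lunar_illumination):
    """Return True if the current conditions satisfy all OB constraints."""
    return (airmass <= ob.max_airmass
            and seeing <= ob.max_seeing
            and lunar_illumination <= ob.max_lunar_illumination)

# Filter the queue down to OBs that can run under the current conditions
queue = [
    ObservationBlock("OB-001", max_airmass=1.5, max_seeing=0.8,
                     max_lunar_illumination=0.4),
    ObservationBlock("OB-002", max_airmass=2.0, max_seeing=1.2,
                     max_lunar_illumination=1.0),
]
now = dict(airmass=1.6, seeing=1.0, lunar_illumination=0.3)
candidates = [ob for ob in queue if schedulable(ob, **now)]
print([ob.name for ob in candidates])  # ['OB-002']: OB-001 fails on airmass and seeing
```

A real scheduler would go on to rank the surviving candidates (e.g. by scientific priority or timing criticality) rather than simply filtering them.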

The Science Archive stores all raw frames produced by the instruments, as well as reference calibration data and log files, including maintenance and ambient-conditions logs. It is available to archive researchers and astronomers for catalog access and for the retrieval of scientific data as they become available after the end of the proprietary period, and of calibration data, instrument data, and logs as soon as they have been processed and verified by Data Flow Operations.

The VLT pipeline is designed on a number of general principles, including independence from the data reduction system, event- and context-driven actions, and support of calibration strategies and reduction levels.

2. Calibration Database

The Calibration Database (Ballester et al. 1999) includes a collection of observation data and reduction procedures representing as completely as possible the observing modes and configurations of the different ESO instruments. Within the Data Flow system, the Calibration Database is the interface between Pipeline and Quality Control and the Science Archive. It allows files to be requested from and submitted to the archive, and provides for the needs of maintaining a local database facility.

The calibration database gives access to calibration frames, instrument-independent data, and instrument configuration and component data. A Reference Name, unique across the system, is associated with each data item. Data products can be queried and stored through an SQL-enabled relational database engine.
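A toy illustration of such an SQL-backed lookup is sketched below, using SQLite in place of the actual database engine; the table schema, column names, and reference-name convention are invented for the example:

```python
import sqlite3

# Toy schema for a calibration-product table, assuming (hypothetically) that
# each product carries a system-wide unique Reference Name.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE calib_products (
    ref_name   TEXT PRIMARY KEY,   -- unique Reference Name
    instrument TEXT,
    category   TEXT,               -- e.g. MASTER_BIAS, MASTER_FLAT
    valid_from TEXT                -- start of validity period
)""")
con.executemany(
    "INSERT INTO calib_products VALUES (?, ?, ?, ?)",
    [("ISAAC_MBIAS_2000-01-10", "ISAAC", "MASTER_BIAS", "2000-01-10"),
     ("ISAAC_MFLAT_2000-01-12", "ISAAC", "MASTER_FLAT", "2000-01-12")])

# Query the most recent master bias for a given instrument
row = con.execute(
    "SELECT ref_name FROM calib_products "
    "WHERE instrument=? AND category=? ORDER BY valid_from DESC LIMIT 1",
    ("ISAAC", "MASTER_BIAS")).fetchone()
print(row[0])  # ISAAC_MBIAS_2000-01-10
```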

Local database functionality is provided, allowing data stored in the central archive to be cached in a Local Calibration Database. Alignment procedures update a local database whenever a requested data product is not present locally. Besides serving as a cache, the Local Calibration Database is the first point of data product ingestion, holding products until their certification, just before they flow to the Science Archive for general availability.

The Calibration Database Manager is the standard application for browsing, editing, populating, and aligning the local calibration database contents. This application, written in Java, is the interface between the pipeline local calibration database and the ESO archive system. It allows the ESO Quality Control Scientists to request and submit files to the archive, and to structure calibration information into the formats supported by the pipeline and needed for long-term archival and trend analysis.

Figure 1: The VLT Data Quality Control System.

3. Preparing Pipeline Calibration Frames with McCreator

The reduction pipelines associate reduction recipes and calibration data with the incoming observation data. While the reduction recipes usually remain stable over extended periods of time, the calibration data must be regularly updated to cope with constantly changing instrument and atmospheric properties. The nature and periodicity of the calibration exposures to be taken are defined in a calibration plan, which lists, for each mode of the instrument, the observation blocks to be executed. Technical programs are scheduled and, on execution, produce calibration data that are pre-processed by the pipeline and stored in a central repository. The pre-processed frames are then used to create reference calibration solutions. After certification, the calibration solutions are submitted to the VLT Science Archive.
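A common way to turn several pre-processed exposures into one reference solution is a per-pixel median combination, which rejects outliers such as cosmic-ray hits. The sketch below is a generic illustration of that idea with tiny invented frames, not the actual VLT recipe:

```python
from statistics import median

def master_frame(frames):
    """Combine calibration exposures into a reference frame by taking the
    per-pixel median, which rejects single-frame outliers."""
    height, width = len(frames[0]), len(frames[0][0])
    return [[median(f[y][x] for f in frames) for x in range(width)]
            for y in range(height)]

# Three toy 2x2 bias exposures; the 999 is an outlier (e.g. a cosmic-ray
# hit) that the median rejects
biases = [
    [[200, 201], [199, 200]],
    [[201, 200], [200, 999]],
    [[200, 202], [198, 201]],
]
print(master_frame(biases))  # [[200, 201], [199, 201]]
```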

McCreator is a graphical interface application written in Tcl/Tk that provides a front-end to pipeline and quality control modules such as the Data Organizer, the Reduction Block Scheduler, and the Calibration Database. The tool is used internally by the Quality Control Scientists for the preparation of the pipeline calibration data: calibration data can be processed in pipeline mode through an interface that lets the QC Scientist adjust every step of the preparation of the reference calibration data.

4. Assessing Observation Data

The verification of QC parameters allows the observing astronomer to assess observation data by comparing values measured on the raw or reduced data with target values specified by service-observing users or by the observatory. Two levels of control are foreseen, called QC0 and QC1 (Ballester et al. 1998).

4.1. Quality Control Level 0

Level 0 control verifies that the user-requested parameters, evaluated by the scheduler to trigger an observation, have all been respected during the observation. The parameters presently supported by the QC0 application in place for UT1 include airmass, moon distance, fractional lunar illumination, and seeing, and correspond to the observation constraints available in the Phase II Preparation System. This post-observation verification is performed independently and catches variations of the meteorological conditions that might have occurred after the schedule evaluation. QC0 involves only access to the raw frames or the FITS header, and can therefore be applied to all frames produced by the instrument.
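A minimal sketch of such a post-observation check is given below. A plain dictionary stands in for the FITS header, and the keyword names and upper-limit semantics are illustrative only, not the actual VLT header keywords or constraint definitions:

```python
def qc0_check(header, constraints):
    """Compare values recorded in a raw-frame header against the upper
    limits requested in the Observation Block.

    Returns a dict mapping each failed parameter to (measured, limit)."""
    failures = {}
    for key, limit in constraints.items():
        measured = header.get(key)
        if measured is not None and measured > limit:
            failures[key] = (measured, limit)
    return failures

# Invented header values for one raw frame, and the OB's requested limits
header = {"AIRMASS": 1.42, "SEEING": 0.95, "MOON_FLI": 0.30}
constraints = {"AIRMASS": 1.5, "SEEING": 0.8, "MOON_FLI": 0.5}
print(qc0_check(header, constraints))  # {'SEEING': (0.95, 0.8)}
```

Note that a parameter like moon distance would need the opposite comparison (a lower limit); a real QC0 implementation would carry the comparison direction with each constraint.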

4.2. QC1 and Trend Analysis

Level 1 control verifies the data by measuring parameters on reduced frames. It involves the association of calibration data and pipeline processing. Depending on the parameter controlled, the processing is performed either directly by the instrument pipeline after the observation or prior to the release of the user data package. Trend analysis tracks the evolution of QC1 parameters over time.


References

Ballester, P., Rosa, M. R., & Grosbøl, P. 1998, Data Quality Control and Instrument Modeling, in Proc. SPIE, Vol. 3349, Observatory Operations to Optimize Scientific Return, Kona, Hawaii, 20-21 March 1998

Ballester, P., Dorigo, D., Disarò, A., Pizarro De La Iglesia, J. A., & Modigliani, A. 1999, The VLT Data Quality Control System, The ESO Messenger, 96, 19

Grosbøl, P. & Peron, M. 1997, The VLT Data Flow Concept, in ASP Conf. Ser., Vol. 125, Astronomical Data Analysis Software and Systems VI, eds. G. Hunt & H. E. Payne (San Francisco: ASP)

Silva, D. & Quinn, P. J. 1997, VLT Data Flow Operations News, The ESO Messenger, 90

© Copyright 2000 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA