
Türler, M., Rohlfs, R., Morisset, N., Meharga, M. T., Courvoisier, T. J.-L., & Walter, R. 2003, in ASP Conf. Ser., Vol. 314, Astronomical Data Analysis Software and Systems XIII, eds. F. Ochsenbein, M. Allen, & D. Egret (San Francisco: ASP), 440

From INTEGRAL to Planck: Geneva's contribution to the LFI data processing

Marc Türler1, Reiner Rohlfs, Nicolas Morisset, Mohamed T. Meharga, Thierry J.-L. Courvoisier2 and Roland Walter3
INTEGRAL Science Data Centre, ch. d'Ecogia 16, CH-1290 Versoix, Switzerland

Abstract:

The know-how and the tools developed at the INTEGRAL Science Data Centre (ISDC) can also serve other scientific space missions. This is clearly illustrated by Geneva's contribution to the level 1 data processing for the Low-Frequency Instrument (LFI) of ESA's Planck mission. We present here a general overview of the main tasks of this first step of the data processing with the implementation chosen for INTEGRAL and Planck. A similar scheme could be very efficiently applied to other space missions and would help to match tight budget constraints by reusing existing software and knowledge.

1. Introduction

The INTEGRAL gamma-ray mission of the European Space Agency (ESA) was launched on October 17, 2002. Since then, the INTEGRAL Science Data Centre (ISDC) has been continuously receiving, processing, analyzing, distributing and archiving the data of its four instruments (Courvoisier et al. 2003). The ISDC is located in Versoix, near Geneva, Switzerland, and is staffed by about 35 scientists and engineers. Starting in 1996, this team defined and developed, in collaboration with the Instrument Teams, the complete data processing system, including visualization tools as well as alert generation tools for gamma-ray bursts and for new or flaring sources.

The effort invested in this development can be efficiently reused for other space missions. This is currently being done at the ISDC for ESA's Planck mission, to be launched in 2007. Since the end of 2001, a small sub-team of the ISDC, comprising only three software engineers, has been developing the main software components for the level 1 data processing of the Planck Low Frequency Instrument (LFI). This work is done in collaboration with the LFI Data Processing Centre (DPC) in Trieste, Italy, where the software will be integrated and run. We present below a general level 1 data processing architecture and show how this scheme was implemented for the INTEGRAL and Planck missions.

2. Level 1 Data Processing

Level 1 data processing usually refers to all the steps needed to convert the raw telemetry into well-sorted datasets in physical units, containing all the information required to perform the subsequent scientific analysis. In the case of Planck, the aim of level 1 is to produce long series of time-ordered flux measurements for each detector, together with spacecraft attitude information, so that full-sky maps at different frequencies can be constructed as part of level 2. The basic steps of level 1 data processing are quite general and do not depend much on mission characteristics. They can be summarized as shown in Figure 1.
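The overall chain described above (unframe the raw telemetry, sort the packets, convert to physical units) can be sketched as follows. This is a minimal toy illustration, not actual ISDC code: the 8-byte packet format, the header convention (first byte distinguishing science from housekeeping) and the linear calibration are all invented for the example.

```python
def unframe(telemetry):
    """Split a raw telemetry stream into fixed-size packets (toy format)."""
    PACKET_SIZE = 8
    return [telemetry[i:i + PACKET_SIZE]
            for i in range(0, len(telemetry), PACKET_SIZE)]

def sort_packets(packets):
    """Sort packets into science and housekeeping streams by a header byte."""
    streams = {"science": [], "housekeeping": []}
    for packet in packets:
        kind = "science" if packet[0] == 0 else "housekeeping"
        streams[kind].append(packet[1:])  # keep the payload only
    return streams

def to_physical_units(raw_values, gain=0.5, offset=-2.0):
    """Apply a linear calibration to raw counts (placeholder coefficients)."""
    return [gain * v + offset for v in raw_values]
```

The three functions mirror the three generic stages of Figure 1; a real pipeline would of course read the packet layout and calibration coefficients from mission-specific definitions rather than hard-coding them.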

Figure 1: Overview of the main tasks generally referred to as Level 1 data processing with the actual implementation for INTEGRAL (on the left) and for the Planck LFI (on the right).
\begin{figure}
\plotone{P4-20_f1.eps}
\end{figure}

2.1 Data Receipt and Pre-Processing

The first step is to unframe the spacecraft telemetry packets and to store the packet data either temporarily or permanently. In the case of INTEGRAL, the raw telemetry is archived in FITS files. This first step is done by the Data Receiver, which must be a very robust component to avoid any loss of data. As it is a very generic tool, we could reuse it basically unchanged for Planck. The second step is to decode, sort and decompress the data. This is done for INTEGRAL by a complex tool called PreProcessing (Morisset et al. 2004). PreProcessing reads the header of each packet and, based on this information, sorts the packets by instrument, operation mode and data type. It handles both science and housekeeping data, and its architecture is independent of the telemetry format: a new type of telemetry packet simply requires a new dedicated parser. This object-oriented design makes it easy to adapt the software for other missions. In the case of Planck, only limited changes to PreProcessing were needed to develop the LFI Telemetry Unscrambler (TMU). The output of this program is stored for the time being in separate FITS files. As for INTEGRAL, the data structure format is defined in ASCII templates, which are used by the software to generate empty FITS files ready to be filled with the data. For Planck, it is foreseen to store the data in a Versant database to allow more flexible usage. The interfaces between the TMU and the database are currently being defined.
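The object-oriented design mentioned above, in which each telemetry packet type gets its own dedicated parser, can be sketched with a small registry-and-dispatch pattern. The class names, the APID codes and the one-byte header are invented for this illustration; they do not reflect the actual PreProcessing or TMU internals.

```python
class PacketParser:
    """Base class: one subclass is registered per telemetry packet type."""
    registry = {}

    @classmethod
    def register(cls, apid):
        """Class decorator registering a parser for a given packet type id."""
        def wrap(subcls):
            cls.registry[apid] = subcls()
            return subcls
        return wrap

    def parse(self, payload):
        raise NotImplementedError

@PacketParser.register(apid=0x10)
class ScienceParser(PacketParser):
    def parse(self, payload):
        return {"type": "science", "samples": list(payload)}

@PacketParser.register(apid=0x20)
class HousekeepingParser(PacketParser):
    def parse(self, payload):
        return {"type": "housekeeping", "values": list(payload)}

def dispatch(packet):
    """Route a packet (first byte = type id) to its registered parser."""
    apid, payload = packet[0], packet[1:]
    return PacketParser.registry[apid].parse(payload)
```

The point of the pattern is the one stated in the text: supporting a new packet type means adding and registering one new parser class, with no change to the dispatching core.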

2.2 Data Visualization

At this point, a data centre has to be able to visualize the data that have been stored. As an illustration, typical displays of INTEGRAL and Planck data are shown in Figure 1. This is important to check the completeness and quality of the scientific data and to monitor the health of the instruments by inspecting the housekeeping data. This task is quite complex for the four instruments aboard INTEGRAL because of the great variety of data to be displayed, sometimes even in 3-dimensional graphics. ROOT, a C++ framework developed for particle physics at CERN, was found to be a powerful tool for this task (Rohlfs 2004). For Planck, the task is simplified by the fact that the data at this level consist of one-dimensional series of measurements equally spaced in time. Nevertheless, ROOT was also chosen for the Planck LFI Quick-Look Analysis (QLA), in particular to easily include a ``ToolBox'' that applies mathematical functions to the data. It is, for instance, possible to display time-averaged data or the ratio of two data sets, or to perform and display a Fourier transform of the data. The Real Time Assessment (RTA) of the Planck LFI health will be performed by a tool similar to the QLA, allowing, as for INTEGRAL, any housekeeping parameter to be selected and its temporal evolution displayed.
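The three ToolBox operations named above (time averaging, ratio of two data sets, Fourier transform) are simple to state precisely. The sketch below is a hypothetical stand-in for the ROOT-based ToolBox, written in Python for brevity; the naive O(n^2) discrete Fourier transform is fine for a quick look at short series, where a real tool would use an FFT.

```python
import cmath

def time_average(samples, n):
    """Average consecutive groups of n samples (rebinning in time)."""
    return [sum(samples[i:i + n]) / n
            for i in range(0, len(samples) - n + 1, n)]

def ratio(a, b):
    """Element-wise ratio of two equally long data sets."""
    return [x / y for x, y in zip(a, b)]

def dft(samples):
    """Naive discrete Fourier transform of an equally spaced time series."""
    n = len(samples)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(samples))
            for k in range(n)]
```

Because the level 1 Planck data are one-dimensional and equally spaced in time, each operation reduces to a simple transformation of a single list of samples, which is what makes such a generic ToolBox practical.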

2.3 Data Preparation for Level 2

The final step of level 1 data processing is to convert the data into physical units and to store them in a format that can be directly used for level 2. This includes instrument calibration and time correlation, i.e. converting the time given by the on-board clock to Terrestrial Time (TT). The treatment of auxiliary data is also part of this step. Auxiliary data are not sent by the spacecraft, but received from the ground segment; they include orbit parameters, spacecraft attitude and observation scheduling information. Our experience with INTEGRAL is that this step is far from trivial, and unexpected problems can arise quite late in the software development process. For Planck, the task of the TeleMetry-to-Time Ordered Information (TM2TOI) component, as identified now, consists simply of producing long sequences of time-ordered measurements spanning six months of data, in order to construct full-sky maps as part of level 2. The auxiliary information needed to reconstruct the maps will be appended to the data. This simple scheme will certainly become a much more complicated task once all the details of the instruments and the auxiliary information become available.
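The time correlation mentioned above can be illustrated with a minimal sketch: assuming the ground segment provides pairs of (on-board clock reading, ground reference time), a least-squares straight line captures the clock offset and drift, and the fitted line then converts any on-board time to TT. Function names and numbers here are illustrative only; a real time correlation handles clock resets, leap seconds and signal propagation delays.

```python
def fit_time_correlation(obt, tt):
    """Least-squares line tt = a * obt + b from correlation pairs
    (a = clock rate including drift, b = offset)."""
    n = len(obt)
    mean_x = sum(obt) / n
    mean_y = sum(tt) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(obt, tt))
         / sum((x - mean_x) ** 2 for x in obt))
    b = mean_y - a * mean_x
    return a, b

def obt_to_tt(obt_value, a, b):
    """Convert a single on-board clock reading to Terrestrial Time."""
    return a * obt_value + b
```

Even this toy version shows why the step is delicate: any error in the fitted rate grows linearly with elapsed on-board time, so long time-ordered sequences are particularly sensitive to it.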

3. Summary and Conclusion

The tasks to be performed as part of level 1 data processing are quite general and do not differ much from one space mission to another. We illustrate this with the architecture designed for INTEGRAL at the ISDC and efficiently adapted to the Planck mission. That two such different missions, one observing in the gamma-ray range and the other in the millimeter range, can share the same software architecture at the first level of data processing suggests that this scheme can also meet the needs of other space missions. Such an efficient reuse of knowledge and software is of prime importance for significantly reducing the development costs of future missions.

References

Courvoisier T. J.-L., Walter R., Beckmann V., et al. 2003, A&A, 411, L53

Morisset N., Contessi T., Meharga M. T., et al. 2004, this volume, 388

Rohlfs R. 2004, this volume, 384



Footnotes

1. Türler: Geneva Observatory, ch. des Maillettes 51, CH-1290 Sauverny, Switzerland
2. Courvoisier: Geneva Observatory, ch. des Maillettes 51, CH-1290 Sauverny, Switzerland
3. Walter: Geneva Observatory, ch. des Maillettes 51, CH-1290 Sauverny, Switzerland

© Copyright 2004 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA