
Zacchei, A., Vuerli, C., Lama, N., & Pasian, F. 2003, in ASP Conf. Ser., Vol. 314, Astronomical Data Analysis Software and Systems XIII, eds. F. Ochsenbein, M. Allen, & D. Egret (San Francisco: ASP), 396

Planck/LFI DPC Software Integration plan

A. Zacchei, C. Vuerli, N. Lama, F. Pasian
INAF - Osservatorio Astronomico di Trieste, Via Tiepolo 11, 34131 Trieste, Italy. Email: zacchei@ts.astro.it

Abstract:

A widely distributed software project needs a well-defined software integration and development plan to avoid extra work during the pipeline creation phase. Here we describe the rationale adopted for the PLANCK/LFI DPC project and what was designed and developed to build the integration environment.

1. Introduction

LFI is one of the two instruments installed on board PLANCK, the M3 mission of ESA's Horizon 2000+ programme. Data reduction and analysis will be performed in pipeline mode at the Data Processing Center (DPC). The DPC software is being developed collaboratively by a consortium spread over more than 20 institutes in a dozen countries. Individual scientists belonging to a Software Prototyping Team develop prototype code, which is then delivered to the LFI DPC team. The latter is responsible for integrating the code so as to produce the pipeline software to be used during operations. Integrated source code is fed back to the originators. This development takes advantage of tools defined within the PLANCK IDIS^1 collaboration. A software policy has been defined with the aim of allowing the DPC to run the best possible algorithms within its pipeline, while fostering collaboration inside the LFI Consortium and across PLANCK, and preserving at the same time the intellectual property of the code authors on the processing algorithms they devise.

2. The Data Processing Center (DPC)

The PLANCK DPCs are responsible for the archiving and delivery of the following scientific data products:

DPC processing can be logically divided into five levels: simulation (Level S), telemetry processing and interfacing with the MOC^2 (Level 1), data reduction and calibration (Level 2), component separation and optimization (Level 3), and generation of final products (Level 4). DPC processing is arranged so that a data pipeline, driven by a process coordinator, is set up. The LFI pipeline is operated at OAT for Levels S, 1, 2 and 3; Level 4 is hosted at MPI (Garching). To harmonize inter-consortia and intra-consortium activities, a common set of tools called IDIS is being designed and built for the sharing of information, documents, common software and data.
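
As an illustration of this arrangement, the sketch below models the process coordinator as a driver that runs the levels in sequence, each level consuming the products of the previous one. This is a minimal, hypothetical Python sketch; the names and data structures are assumptions, not the actual DPC implementation.

    # Hypothetical sketch of a level-based pipeline driven by a process
    # coordinator (illustrative only; not the actual DPC code).
    from typing import Callable, List

    Level = Callable[[dict], dict]  # a level maps upstream products to its own

    def level_1(products: dict) -> dict:
        """Telemetry processing and MOC interface (placeholder)."""
        return {**products, "timelines": "calibrated time-ordered data"}

    def level_2(products: dict) -> dict:
        """Data reduction and calibration (placeholder)."""
        return {**products, "maps": "frequency maps"}

    def level_3(products: dict) -> dict:
        """Component separation and optimization (placeholder)."""
        return {**products, "components": "component maps"}

    class ProcessCoordinator:
        """Drives the pipeline by running the configured levels in order."""
        def __init__(self, levels: List[Level]):
            self.levels = levels

        def run(self, initial: dict) -> dict:
            products = initial
            for level in self.levels:
                products = level(products)  # each level consumes upstream output
            return products

    pipeline = ProcessCoordinator([level_1, level_2, level_3])
    final_products = pipeline.run({"telemetry": "raw packets"})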

3. Development/Integration/Release General Scheme

Figure 1 shows the general structure of the PLANCK LFI development cycle.

Scientists' contributions to algorithm development stem from the data processing requirements. When they produce a frozen version (the prototype), this code is engineered, optimized, and integrated at the DPC as a module to be harmonized into the pipeline; a first release is then issued. Scientific tests on the final product (the release) may in turn modify the main requirements, so the resulting pipeline changes must be taken into account, leading to successive issues of the pipeline.
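
The cycle above can be read as a small state machine: a module moves from requirements to prototype to integrated module to release, and scientific tests on a release can feed back into the requirements. The following Python sketch encodes these states and transitions; the names are hypothetical and purely illustrative.

    # Hypothetical encoding of the module life cycle shown in Figure 1.
    from enum import Enum, auto

    class ModuleState(Enum):
        REQUIREMENTS = auto()  # data processing requirements drive development
        PROTOTYPE = auto()     # frozen version delivered by the scientists
        INTEGRATED = auto()    # engineered and harmonized into the pipeline
        RELEASED = auto()      # a pipeline release has been issued
        UNDER_TEST = auto()    # scientific tests run on the release

    # Allowed transitions; failed scientific tests feed back into requirements.
    TRANSITIONS = {
        ModuleState.REQUIREMENTS: {ModuleState.PROTOTYPE},
        ModuleState.PROTOTYPE: {ModuleState.INTEGRATED},
        ModuleState.INTEGRATED: {ModuleState.RELEASED},
        ModuleState.RELEASED: {ModuleState.UNDER_TEST},
        ModuleState.UNDER_TEST: {ModuleState.REQUIREMENTS, ModuleState.RELEASED},
    }

    def advance(state: ModuleState, target: ModuleState) -> ModuleState:
        """Move a module to the next state, rejecting illegal transitions."""
        if target not in TRANSITIONS[state]:
            raise ValueError(f"illegal transition {state.name} -> {target.name}")
        return target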

Figure 1: LFI Development Cycle

4. Software Integration Cycle

Prototype software developed by scientists is not expected to have the robustness, documentation and maintainability that the DPC pipeline software must guarantee. For this purpose, an integration team (the DPC team) engineers the prototype and pipeline elements into robust, operational DPC software. Efforts are also specifically concentrated on correct recovery in the case of DPC hardware failures, and on providing ``warm'' or ``cold'' backup capabilities at operations time. The initial delivery of prototype software (source code, scripts, makefiles, etc.) from both the software prototyping team and the modelling and simulations team to the DPC is accompanied by documentation forming the basis of the User Requirements Documents (URDs) for the scientific pipeline (one for each level). The LFI DPC integration team is responsible for integrating the code so as to produce pipeline software satisfying all requirements defined in the URDs. The pipeline integration phase follows the ESA PSS-05 software development standards, including Product and Quality Assurance. The performance and testing results of the integrated software are provided to the software prototyping team and the simulations and modelling team at the time of the official release. Integrated source code is provided back to the originators and is not intended for general distribution.
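
The recovery requirement mentioned above lends itself to a checkpoint-based design: the pipeline records each completed step so that, after a hardware failure, a restart skips work already done. The sketch below illustrates one such scheme; the file name, format and step structure are assumptions, not the DPC's actual mechanism.

    # Hypothetical checkpoint-based recovery for pipeline steps.
    import json
    import os

    CHECKPOINT = "pipeline_state.json"  # assumed checkpoint file name

    def load_checkpoint() -> dict:
        """Return the list of completed steps, or an empty state."""
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT) as f:
                return json.load(f)
        return {"completed": []}

    def save_checkpoint(state: dict) -> None:
        # Write atomically so a crash cannot leave a truncated checkpoint.
        tmp = CHECKPOINT + ".tmp"
        with open(tmp, "w") as f:
            json.dump(state, f)
        os.replace(tmp, CHECKPOINT)

    def run_with_recovery(steps) -> None:
        """Run (name, callable) steps, skipping those completed before a crash."""
        state = load_checkpoint()
        for name, step in steps:
            if name in state["completed"]:
                continue  # already done before the failure
            step()
            state["completed"].append(name)
            save_checkpoint(state)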

Figure 2: a) LFI Software Integration Cycle, b) LFI Software Verification and Validation Cycle

5. Software Verification and Validation

Each module produced by scientists must pass the validation plan before the integration phase. The validation scheme applied at the acceptance point (Fig. 2b) corresponds to the following list of operations:
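
While the individual operations are defined in the validation plan, a minimal sketch of an acceptance check of this kind might build the delivered module, run its tests, and compare the output against an agreed reference. Everything in this Python sketch (the module layout, the run_tests.sh entry point) is a hypothetical assumption.

    # Hypothetical acceptance check at the validation point (illustrative only).
    import subprocess

    def accept_module(src_dir: str, reference_output: str) -> bool:
        """Build the delivered module, run its tests, compare with a reference."""
        build = subprocess.run(["make", "-C", src_dir], capture_output=True)
        if build.returncode != 0:
            return False  # the module must at least build cleanly
        test = subprocess.run([f"{src_dir}/run_tests.sh"],
                              capture_output=True, text=True)
        if test.returncode != 0:
            return False  # the module's own tests must pass
        # Compare the module output against the agreed reference data set.
        return test.stdout.strip() == reference_output.strip()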

6. Software Repository

The DPC software developed within LFI, whether in prototype or integrated status, is subject to an access policy. This policy aims to allow the DPC to run the best possible algorithms within its pipeline, while fostering collaboration inside the LFI Consortium and across PLANCK, and preserving at the same time the intellectual property of the code authors on the processing algorithms they devise. The access policy can be summarized as follows:
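
One way to make such a policy mechanical is to encode it as a table mapping the status of the code and the requester's role to the allowed operations. The roles and rights in this Python sketch are illustrative assumptions; the actual rules are those summarized in the policy itself.

    # Hypothetical encoding of an access policy table (roles and rights assumed).
    POLICY = {
        # (code status, requester role) -> allowed operations
        ("prototype",  "author"):         {"read", "write"},
        ("prototype",  "dpc_team"):       {"read"},
        ("integrated", "author"):         {"read"},
        ("integrated", "dpc_team"):       {"read", "write"},
        ("integrated", "lfi_consortium"): {"read"},
    }

    def can(role: str, operation: str, status: str) -> bool:
        """Check whether a role may perform an operation on code of a status."""
        return operation in POLICY.get((status, role), set())

    assert can("dpc_team", "write", "integrated")
    assert not can("lfi_consortium", "write", "integrated")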

7. Conclusions

The entire software development and integration cycle has already been tested and applied to the production of the first LFI pipeline (the Bread-Board Model). The principles used to develop, integrate and engineer the code needed for the PLANCK LFI pipeline can easily be applied to any large software development project.

Acknowledgments

The LFI is funded by the national space agencies of the Institutes of the Consortium (ASI for the Italian participation).




Footnotes

^1 IDIS: Integrated Data and Information System
^2 MOC: Mission Operation Center
