

Conceptual Design for the Square Kilometer Array

R. D. Ekers
CSIRO, Australia Telescope National Facility, Sydney, NSW 2121, Australia

Abstract:

New technologies have made it possible to construct an affordable radio telescope with a collecting area of one square kilometre: the SKA. Such a telescope would be so powerful that we could expand our knowledge of the universe from the earliest stages of its formation through to planetary exploration with greatly enhanced spacecraft communications. The SKA will join the new generation of telescopes at other wavebands with the sensitivity and resolution to image the earliest phases of galaxy formation, as well as greatly extending the range of unique science accessible at radio wavelengths. We already know how to build an SKA; the issue is how to build the most cost-effective SKA, and how to maximize the science we can do with it. This project was born international, and a number of countries are now comparing conceptual designs. All implementations call on Moore's law to satisfy their computationally demanding requirements; some are more demanding than others and involve technologies and operational procedures never previously implemented in an astronomical facility.

1. The Development of Radio Astronomy

1.1 Exponential Growth in Science

It is well known that most scientific advances follow technical innovation. For astronomy this is well documented by Harwit (1981). De Solla Price (1963) had also reached this conclusion from his application of quantitative measurement to the progress of science in general. His analysis also showed that the normal mode of growth of science is exponential, with examples from many areas. Moore's law, describing the 18-month doubling of transistor density on semiconductor chips, is a more recent re-discovery of this effect.

A famous example of exponential growth is the rate of increase of the operating beam energy in particle accelerators, as illustrated by Livingston and Blewett (1962) and updated by Sessler (1988). Starting in 1930, each particle accelerator technology initially provided exponential growth up to a ceiling where the growth rate leveled off. At that point, a new technology was introduced. The envelope of the set of curves is itself an exponential curve, with an increase in energy of $10^{10}$ in 60 years. This example, originally presented by Fermi, has become known as the Livingston Curve and is shown in Figure 1a.

1.2 Radio Telescope Sensitivity

A plot of the continuum sensitivity of telescopes used for radio astronomy since the discovery of extra-terrestrial radio emission in 1933 shows this exponential character (Figure 1b), with an increase in sensitivity of $10^5$ since 1940, doubling every three years. As with the previous example, particular radio telescope technologies reach ceilings and new technologies are introduced. In particular, there was a transition (about 1980) from large single dishes to arrays of smaller dishes. To maintain the extraordinary momentum of discovery of the last few decades, a very large new radio telescope will be needed in the next decade.

Figure 1: (a) Livingston curves        (b) Radio telescope sensitivity
\begin{figure}
\plottwo{O3.1_1.eps}{O3.1_2.eps}
\end{figure}

1.3 The Square Kilometre Array (SKA)

The increase in sensitivity needed to maintain this exponential growth until 2010 cannot be achieved by improving the electronics or receiver systems of existing telescopes, but only by increasing the total effective collecting area of radio telescopes to about a million square metres. The project has therefore acquired its name: the Square Kilometre Array (http://www.skatelescope.org).

1.4 The Epoch of Re-Ionization

By the end of this decade, one of the biggest questions remaining in astronomy will be the state of the Universe during the epoch when the neutral hydrogen that formed at recombination after the Big Bang was re-ionized. This epoch of the Universe is totally opaque to optical radiation but can be probed by the 21-cm H I line. The first structures will appear as inhomogeneities in the primordial hydrogen, heated by infalling gas or the first generation of stars and quasars. A patchwork of 21-cm emission or absorption against the cosmic background radiation will result. This structure and its evolution with $z$ will depend on the nature of the re-ionization sources: a large population of low-mass stars has a completely different effect from a small number of QSOs. From $z \sim 6$ we expect to see a growing `cosmic web' of neutral hydrogen and galaxy halos forming and evolving (e.g., Tozzi et al. 2000). A radio telescope with a square kilometre of collecting area operating in the 100-200 MHz frequency range will have the sensitivity to detect and study this web in HI emission!
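
As a rough consistency check on this frequency range, the following is a minimal sketch using only the rest frequency of the H I line and the standard redshift relation:

\begin{verbatim}
# Sketch: observed frequency of the redshifted 21-cm H I line.
# Assumes only the rest frequency (1420.4 MHz) and nu_obs = nu_rest / (1 + z).

NU_REST_MHZ = 1420.4  # rest frequency of the 21-cm hyperfine line

def observed_freq_mhz(z):
    """Observed frequency of the H I line emitted at redshift z."""
    return NU_REST_MHZ / (1.0 + z)

for z in (6, 10, 13):
    print(f"z = {z:2d}: 21-cm line observed at {observed_freq_mhz(z):6.1f} MHz")
# z ~ 6 gives ~203 MHz and z ~ 13 gives ~101 MHz, i.e. the 100-200 MHz band
# quoted above probes roughly z ~ 6-13.
\end{verbatim}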

2. The Square Kilometre Array

2.1 The Concept

The SKA is a unique radio telescope now being planned by an international consortium. Extensive discussion of the science drivers and of the evolving technical possibilities led to a set of design goals for the Square Kilometre Array (Taylor & Braun 1999). Some of the basic system parameters required to meet these goals are summarized in Table 1.


Table 1: Instrumental Design Goals

Parameter                               Design Goal
Sensitivity                             100 times the VLA
Total Frequency Range                   0.03 - 20 GHz
Imaging Field of View                   1 square deg. @ 1.4 GHz
Angular Resolution                      0.1 arcsec @ 1.4 GHz
Surface Brightness Sensitivity          1 K @ 0.1 arcsec (continuum)
Instantaneous Bandwidth                 0.5 + $\nu$/5 GHz
Number of Spectral Channels             $10^4$
Number of Instantaneous Pencil Beams    100 (at lower frequencies)

3. Building the Square Kilometre Array

Costs of major astronomy facilities have now reached the US$1 billion level. International funding is unlikely to exceed this value, implying that the SKA must be built for less than US$1000 per square metre over its $10^6$ square metres of collecting area. A comparison of the costs per square metre of existing radio telescopes (Table 2) shows that innovative design will be needed to reduce the cost.

Table 2: Cost per square metre

Telescope    US$/sq m    $\nu_{max}$
GBT            10,000    100 GHz
VLA            10,000     50 GHz
ATA             3,000     11 GHz
GMRT            1,000      1 GHz

One new technology that helps is the combination of transistor amplifiers and their large-scale integration into systems which can be duplicated inexpensively. Another essential technology is our recently acquired ability to apply digital processing at high bandwidth. This enables us to realize processes, such as multiple adaptive beam formation and active interference rejection, in ways not previously conceivable.

Some aspects of the technology needed are still in the development stage. Institutions participating in the SKA are designing and building prototype systems and the key technologies will be determined from these. The time frame during which a new radio facility is needed to complement other planned instruments will be in the years around 2010.

4. How to Build the SKA?

We have the technology to build the SKA now: we have decades of experience with diffraction-limited interferometry and self-calibration (the radio analogue of adaptive optics). The issue for the SKA is not whether we can build it, but how to find the most cost-effective solution. Options under consideration include arrays of small dishes, planar phased arrays, a single adaptive reflector, multiple Arecibos, and arrays of Luneburg lenses.

4.1 Focal and Aperture Plane Arrays

There is an equivalence between focal plane arrays and aperture plane arrays. For a given number N of receiver elements, the two approaches are exactly equivalent for a contiguous aperture. However, achieving the maximum compactness without either shadowing or geometric projection losses is only possible if the aperture plane array is on a tilting platform. For unfilled aperture arrays the synthesis approach trades resolution for brightness sensitivity.

The single dish forms its image with real time delays and is inherently wideband, while an array with only electronic phasing (or Fourier transformation of the complex coherence function) will be monochromatic. The aperture plane array can be made achromatic by dividing the band into sufficiently narrow spectral channels ($\Delta \nu / \nu <$ element size/aperture size), and with the rapidly decreasing cost of digital electronics this becomes increasingly affordable.
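
As an illustrative sketch of what this channelisation condition implies (the station and element sizes below are arbitrary examples, not SKA parameters):

\begin{verbatim}
# Sketch: how many spectral channels make an aperture-plane array
# effectively achromatic, using the condition quoted in the text:
#   delta_nu / nu  <  (element size) / (aperture size).
# The dimensions below are illustrative, not an SKA design.

def channels_needed(total_bw_hz, centre_freq_hz, element_m, aperture_m):
    """Minimum number of channels so each channel satisfies the condition."""
    max_fractional_bw = element_m / aperture_m      # allowed delta_nu / nu
    max_channel_bw = max_fractional_bw * centre_freq_hz
    return int(total_bw_hz / max_channel_bw) + 1

# Example: a 100 m station of 1 m elements, 300 MHz of bandwidth at 1.4 GHz.
print(channels_needed(300e6, 1.4e9, element_m=1.0, aperture_m=100.0))
# -> ~22 channels; the channelisation is done in digital filterbanks,
#    which is why falling digital costs make it affordable.
\end{verbatim}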

There are major differences in implementation between the two approaches. A single dish uses optics to combine the analog signal (wavefront) at the focus, whereas a modern aperture synthesis telescope uses digital signal processing. This difference leads to a very big shift in cost between mechanical structures for a big dish and computers for an aperture plane array. These two cost drivers have a very different time dependence, with the decreasing cost of digital processing shifting the most cost-effective designs from big dishes to arrays. At higher frequencies the increased cost contribution of the lowest-noise receivers and the cost of the backend signal processing for the larger bandwidth shift the balance back to arrays with larger dish sizes. A recent analysis by Weinreb & D'Addario (2001) shows that the optimum centimetre-wave telescope in 2010 will be an array of 8-m dishes.
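
A toy version of this kind of cost trade-off is sketched below, assuming only that the structural cost of a dish scales roughly as $D^{2.7}$ while each dish carries a fixed electronics cost; the coefficients are illustrative and are not taken from Weinreb & D'Addario:

\begin{verbatim}
# Sketch: dish-size trade-off for a fixed total collecting area.
# Assumptions (illustrative only): each dish costs  c_ant * D**2.7,
# each dish carries a fixed electronics/receiver cost  c_elec,
# and the total collecting area is held at one square kilometre.
import math

AREA_M2 = 1.0e6          # total collecting area (1 km^2)
C_ANT = 1.0e3            # $ per dish per (metre**2.7)  -- illustrative
C_ELEC = 2.0e5           # $ of electronics per dish    -- illustrative

def total_cost(diameter_m):
    n_dishes = AREA_M2 / (math.pi * diameter_m**2 / 4.0)
    return n_dishes * (C_ANT * diameter_m**2.7 + C_ELEC)

best = min(range(2, 40), key=total_cost)
print(f"cheapest dish diameter ~ {best} m, "
      f"{AREA_M2 / (math.pi * best**2 / 4):.0f} dishes")
# Larger dishes save on electronics (fewer dishes), but their structural
# cost grows faster than their area; the optimum sits at modest diameters.
\end{verbatim}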

4.2 Mass-produced Parabolas: The Allen Telescope Array

The Allen Telescope Array (ATA), being built by the SETI Institute and UC Berkeley, is a modern example of an aperture plane array: 350 $\times$ 6.1-m parabolic antennas give aperture synthesis capability with a very large primary beam (2.5 degree field of view at 1.4 GHz) and the equivalent of a 100-m aperture. Taking advantage of modern electronics and wideband optical communication, it will cover 0.5-11 GHz and generate four simultaneous beams. The planned completion date is 2005.

4.3 Phased Array

In the extreme aperture plane array with element size comparable to a wavelength it is possible, with no moving parts, to electronically steer beams to any part of the sky. It is also possible to generate simultaneous independent beams anywhere in the sky.

Figure 2: NFRA phased array with director Harvey Butcher and the Dwingeloo 25m dish in the background.
\begin{figure}
\epsscale{0.65}
\plotone{O3.1_3.eps}
\end{figure}

The Netherlands has now produced a pure phased array with significant collecting area and no moving parts (Figure 2). The juxtaposition of the 25-m dish and the phased array nicely illustrates 50 years of technology development.

4.4 Software and Computing Power

Much more computing capacity is needed for these telescopes with large numbers of elements, but with computing power doubling every 18 months (Moore's Law) the required capacity looks achievable. However, software development time scales are now much longer than hardware development time scales, so software should be treated as a capital cost, while hardware, which needs to be upgraded continually to obtain maximum performance, becomes an operating cost.

5. Sensitivity

The most obvious impact of the SKA will be its sensitivity, almost 100 times that of any existing radio telescope. For example, the current deepest VLA integration in the HDF detects about a dozen sources. In the same region the SKA will detect many hundreds of galaxies and AGNs (Hopkins et al. 2000).

5.1 Computing Demand : Sensitivity

The increased sensitivity makes indirect demands on computational capacity and software systems. Interference rejection of up to $10^4$ will be needed to reach sensitivity limits, and the suppression of sidelobes from stronger sources will require a dynamic range of $10^6$. Achieving full sensitivity will also require reliable operation of a great many elements, putting pressure on the monitoring and debugging software needed to maintain the system.

6. Interference

Ironically, the very developments in communications that drive Moore's Law and make these radio telescopes possible also generate radio interference at levels far in excess of the weak signals detectable with an SKA. The future of radio observations at this high sensitivity will depend on our ability to mitigate interference. A combination of adaptive cancellation, regulation and geographic protection will be required to let us access the faint signals from the early universe (Ekers & Bell 2001). These techniques place critical and complex demands on computational resources and are discussed further in the next section.

6.1 Mitigation Strategies and Issues

Undesired interfering signals and astronomy signals can differ (be orthogonal) in a range of parameters, including frequency, time, position, polarization, distance, coding, positivity, and multipath. It is extremely rare that interfering and astronomy signals do not possess some level of orthogonality in this $\geq 8$ dimensional parameter space.

Figure 3: `ATA' written in nulls (Bower 2002).
\begin{figure}
\epsscale{0.4}
\plotone{O3.1_4.eps}
\end{figure}

We are developing signal processing systems to take advantage of the orthogonality and separate the astronomy and interfering signals. Antenna arrays and focal plane arrays are particularly powerful because they can take advantage of the position, and even distance (curvature of wavefront) phase space (Ekers & Bell 2002).

The adaptive filter is one of the most promising areas of interference mitigation. The characteristics of the interfering signal in the astronomical data are used to derive the parameters of a filter which removes or reduces the interference. Two implementations of the adaptive filter are currently being studied (Kesteven 2003): pre-detection filters, which are well known in the signal processing field, and a post-correlation filter, which is well adapted to radio astronomy needs.
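
A minimal sketch of the pre-detection idea is given below, using a simple LMS filter and a reference channel that sees mostly interference; the signals and parameters are toy values, and this is not the implementation studied by Kesteven:

\begin{verbatim}
# Sketch of a pre-detection adaptive (LMS) filter, assuming a reference
# channel that sees the interference much more strongly than the sky signal.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
sky = 0.01 * rng.standard_normal(n)              # weak noise-like sky signal
rfi = np.sin(2 * np.pi * 0.05 * np.arange(n))    # narrow-band interferer
astro_chan = sky + 0.5 * rfi                     # telescope output: sky + RFI
ref_chan = rfi + 0.05 * rng.standard_normal(n)   # reference antenna: mostly RFI

taps, mu = 16, 2e-3
w = np.zeros(taps)
clean = np.empty(n)
for i in range(taps, n):
    x = ref_chan[i - taps:i][::-1]               # most recent reference samples
    est = w @ x                                  # filter's estimate of the RFI
    err = astro_chan[i] - est                    # residual = cleaned output
    w += 2 * mu * err * x                        # LMS weight update
    clean[i] = err

print("power before:", np.var(astro_chan), " after:", np.var(clean[taps:]))
\end{verbatim}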

When these filter concepts are applied to arrays they are equivalent to the generation of spatial nulls in the directions of the interfering signals. A dramatic illustration of the potential to generate very complex patterns of nulls has been provided by Geoff Bower, who wrote `ATA' in nulls with the 350 elements of the Allen Telescope Array (Figure 3).
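
The sketch below illustrates the underlying idea for a toy one-dimensional array: the weights for a chosen beam are projected orthogonal to the steering vector of an interferer, placing a null in that direction (illustrative geometry only, not the ATA beamformer):

\begin{verbatim}
# Sketch of spatial null steering: beam weights for a target direction are
# projected orthogonal to the interferer's steering vector, so the array
# response toward the interferer is (ideally) zero.
import numpy as np

freq = 1.4e9
lam = 3.0e8 / freq
positions = np.arange(32) * lam / 2          # 32 elements, half-wavelength spacing

def steering(theta_deg):
    """Narrow-band steering vector for a plane wave from angle theta."""
    phase = 2 * np.pi * positions * np.sin(np.radians(theta_deg)) / lam
    return np.exp(1j * phase)

a_target = steering(10.0)                    # direction we want to observe
a_rfi = steering(37.0)                       # direction of the interferer

w = a_target.copy()
w -= (np.vdot(a_rfi, w) / np.vdot(a_rfi, a_rfi)) * a_rfi   # remove RFI component

for name, a in [("target", a_target), ("RFI", a_rfi)]:
    print(name, "response:", abs(np.vdot(w, a)) / len(positions))
# The target response stays close to 1 while the RFI response drops to ~0;
# with many elements, many such nulls can be placed simultaneously.
\end{verbatim}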

7. Dynamic Range

Achieving the $10^6$ dynamic range needed to realize full sensitivity will be a very demanding and computationally intensive requirement. The SKA is unusual because in many implementations the `primary beam' is itself generated by aperture synthesis, hence it will be necessary to calibrate a synthesized, time-variable primary beam with precision. The measurement equation formalism (Sault & Cornwell 1999) implemented in AIPS++ allows correction for such image-plane calibration effects. Self calibration (adaptive optics) will be needed.

The radio interferometry group at MIT/Haystack is studying the design of arrays made up of a very large number of small telescopes. These designs might have several thousand elements, implying millions of baselines, with antenna elements less than a few metres in diameter. Such large-N configurations have extremely dense u-v coverage and, because of the small elements, very large primary beams. This results in a massive correlation problem, but from these characteristics spring a startling number of benefits: the sidelobes due to the (u,v) coverage are naturally very low, the achievable dynamic range is very high, and there is great flexibility to generate nulls in order to remove interference. Of course the computational requirements are also correspondingly larger.

8. Resolution and Field of View (FOV)

Both focal and aperture plane arrays dramatically increase the throughput for surveys. The Square Kilometre Array will be the world's premier instrument for astronomical imaging. No other instrument, existing or currently planned, on the ground or in space, at any wavelength, will simultaneously provide a wide instantaneous field of view (1 square degree), exquisite and well defined angular resolution (0.1-0.001 arcsec), and wide instantaneous bandwidth $(\Delta \nu / \nu > 50\%)$ coupled with high spectral resolution $(\nu /d\nu >10^4 )$ for detecting small variations in velocity.

8.1 Computing Demand : Resolution

For observations with the full field of view, the maximum practical baselines would be limited to about 300 km, corresponding to 0.1 arcsec resolution at 1.4 GHz. Full-resolution observations of subfields which contain structure could use baselines of up to 5000 km, providing milliarcsecond resolution.
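
These numbers can be checked to order of magnitude with the textbook estimate $\theta \approx \lambda/B$ (the exact values depend on the weighting of the u-v coverage):

\begin{verbatim}
# Rough check of the baseline / resolution numbers, assuming the standard
# diffraction estimate theta ~ lambda / B (order of magnitude only).
import math

def resolution_arcsec(baseline_m, freq_hz):
    lam = 3.0e8 / freq_hz
    return math.degrees(lam / baseline_m) * 3600.0

print(resolution_arcsec(300e3, 1.4e9))   # ~0.15 arcsec for 300 km at 1.4 GHz
print(resolution_arcsec(5000e3, 1.4e9))  # ~0.009 arcsec (~9 mas) for 5000 km
\end{verbatim}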

Signal distribution will involve transport of GHz bandwidth signals over 1000 km to hundreds of stations. Achieving this will be expensive and will be one of the key factors limiting ultimate performance.

8.2 Computing Demand : FOV

The image size for the highest resolution likely to be used over the full FOV is about $10^5\times10^5$ pixels. This is about 400 times the size of a VLA image and should become achievable in 10-15 years. For the higher resolutions available with the SKA it is neither practical nor sensible to image the full FOV, so hierarchical beamforming will be used to image only those regions containing signals of interest.
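
The image size follows directly from the field of view and resolution, assuming a few pixels across the synthesized beam:

\begin{verbatim}
# Sketch of the image-size arithmetic: a 1 square degree field imaged at
# 0.1 arcsec resolution, with ~3 pixels across the synthesized beam.
fov_arcsec = 3600.0        # 1 degree on a side
resolution_arcsec = 0.1
pixels_per_beam = 3
npix = int(fov_arcsec / resolution_arcsec * pixels_per_beam)
print(f"{npix} x {npix} pixels")   # ~1e5 x 1e5, as quoted in the text
\end{verbatim}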

Wide-field synthesis will require corrections for non-planar effects and chromatic aberration. These are discussed in Cotton (1999).
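
A rough indication of why these corrections are unavoidable here, using the common rule of thumb that the simple two-dimensional Fourier inversion is valid only over a field $\theta \lesssim \sqrt{\lambda/B}$:

\begin{verbatim}
# Sketch of why non-planar (w-term) corrections are needed: the 2-D Fourier
# inversion is only valid over a field of roughly theta_max ~ sqrt(lambda/B)
# (a standard rule of thumb, ignoring factors of order unity).
import math

lam = 3.0e8 / 1.4e9        # wavelength at 1.4 GHz (~0.21 m)
baseline = 300e3           # 300 km baseline
theta_max_deg = math.degrees(math.sqrt(lam / baseline))
print(f"2-D approximation valid over ~{theta_max_deg*60:.1f} arcmin")
# ~3 arcmin, far smaller than the 1 degree field of view, so wide-field
# (faceting / w-projection style) corrections are essential.
\end{verbatim}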

8.3 Computing Demand : Spectral Line Imaging

An ambitious SKA spectral-line imaging correlator could require the correlation of 8000 antennas (3.2$\times 10^7$ baselines), each with 1 GHz bandwidth and 1000 spectral channels.
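
The scale of this requirement can be sketched as follows, assuming an XF-style correlator, Nyquist sampling, and roughly one multiply-add per lag per sample (factors of a few are ignored):

\begin{verbatim}
# Rough scale of the spectral-line correlator described above.
n_ant = 8000
bandwidth_hz = 1.0e9
n_chan = 1000  # spectral channels (sets the number of lags per baseline)

n_baselines = n_ant * (n_ant - 1) // 2
sample_rate = 2 * bandwidth_hz                         # Nyquist rate
ops_per_second = n_baselines * sample_rate * n_chan    # ~1 multiply-add per lag

print(f"{n_baselines:.2e} baselines")                  # ~3.2e7, as in the text
print(f"~{ops_per_second:.1e} multiply-adds per second")  # ~6e19
\end{verbatim}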

Fortunately, extrapolation of the historical rate of correlator development (Figure 4; Wright 2002) shows that this is not an unreasonable projection.

Figure 4: Correlator Development
\begin{figure}
\epsscale{0.87}
\plotone{O3.1_5.eps}
\end{figure}

9. Multiple Beams

The reduction in the cost and size of the electronics in telescopes of the future will allow radio astronomers to take increasing advantage of multibeaming through either focal plane or aperture plane arrays. In the extreme aperture plane array, with element size comparable to a wavelength, it is even possible to generate simultaneous independent beams anywhere in the sky, changing the whole sociology of big-telescope astronomy (Figure 5).

Figure 5: SKA Multibeaming
\begin{figure}
\epsscale{0.6}
\plotone{O3.1_6.eps}
\end{figure}

9.1 Computing Demand : Multiple Beams

Many simultaneous beams can be generated by signal processing from the output of an array of small dishes. For example, an array of 500 dishes connected with 2 GHz bandwidth requires about 4,000 Gops to form each beam by direct summation. In early 1999 that was quite expensive: at US$250 per digital signal processing Gop, it amounted to US$10M per beam. However, processing costs are dropping rapidly. Assuming that Moore's law continues to hold, in about 2008 the processing cost will be only US$2 per Gop, corresponding to less than US$0.1M per beam.
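
The Moore's-law part of this argument is sketched below; only the cost per Gop is extrapolated, under a strict 18-month doubling from the 1999 figure:

\begin{verbatim}
# Sketch of the Moore's-law cost extrapolation, assuming the cost per
# digital-signal-processing Gop halves every 18 months from US$250 in 1999.
# (Only the per-Gop trend is computed; absolute per-beam costs depend on the
#  exact beamforming operation count.)
def cost_per_gop(year, base_year=1999.0, base_cost=250.0, doubling_years=1.5):
    return base_cost / 2.0 ** ((year - base_year) / doubling_years)

for year in (1999, 2004, 2009):
    print(f"{year}: ~US${cost_per_gop(year):.0f} per Gop")
# Under this strict doubling the cost falls by roughly two orders of
# magnitude in a decade, which is what brings multibeaming within reach.
\end{verbatim}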

One of the exciting advantages of multiple-beam operation is the diversity of possible backend configurations. Different beams could be fed to spectral line imaging correlators, pulsar timing devices, pulse detectors, or SETI processors. Correspondingly diverse software support would be needed.

We can have remotely configurable systems for making parallel simultaneous observations with multiple backend configurations and multiple users. Control software for such a facility will present challenging opportunities.

9.2 Observing Transients Before They Happen

Entirely new ways of doing astronomy may be possible with the SKA. With an array that is pointed electronically, the raw, `undetected' signals can be recorded in memory. These stored signals could be used to construct virtual beams pointing anywhere in the sky. Using such beams astronomers could literally go back in time and use the full collecting area to study pulsar glitches, supernovae and gamma-ray bursts or SETI candidate signals, following a trigger from a subarray of the SKA or from other wavelength domains.
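
A toy sketch of this retrospective beamforming idea, using a rolling buffer of raw element streams and integer-sample delay-and-sum (illustrative only, not a real SKA design):

\begin{verbatim}
# Sketch of the "go back in time" idea: raw (undetected) element voltages are
# kept in a buffer; after an external trigger, a beam is formed retrospectively
# toward the reported position by delay-and-sum over the buffered data.
import numpy as np

rng = np.random.default_rng(1)
n_elem, n_samp = 64, 50_000
delays = rng.integers(0, 40, n_elem)       # geometric delays (samples) toward the source

burst = np.zeros(n_samp)
burst[25_000:25_050] = 1.0                 # a short transient
# what each element actually recorded into its buffer: delayed burst + noise
buffer = np.stack([np.roll(burst, d) + rng.standard_normal(n_samp) for d in delays])

def retrospective_beam(buf, d):
    """Re-align each buffered stream by its delay toward the trigger and average."""
    return np.mean([np.roll(row, -di) for row, di in zip(buf, d)], axis=0)

beam = retrospective_beam(buffer, delays)  # formed *after* the event, from the buffer
on, off = slice(25_000, 25_050), slice(0, 20_000)
single = np.roll(buffer[0], -delays[0])
print("single element S/N   :", single[on].mean() / single[off].std())
print("retrospective beam S/N:", beam[on].mean() / beam[off].std())
# The coherent sum recovers the burst at roughly sqrt(N) ~ 8x the single-element
# S/N, even though the "pointing" happened only after the event was recorded.
\end{verbatim}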

10. International Collaboration

Even with the dramatic reduction in the cost of unit aperture, future telescopes such as the SKA will be expensive. One path to achieving this vision is through international collaboration. While the additional overhead of a collaborative project is a penalty, the advantages are also great: it can avoid wasteful duplication and competition, provide access to a broader knowledge base, generate innovation through cross-fertilization, and create wealth for the nations involved.

References

Bower, G. 2002, ATA Memo Series, UC Berkeley

Cotton, W. D. 1999, in ASP Conf. Ser. 180, Synthesis Imaging in Radio Astronomy II, eds. G. B. Taylor, C. L. Carilli, & R. A. Perley (San Francisco: ASP), 357

de Solla Price, D. J. 1963, Little Science, Big Science (New York: Columbia University Press)

Ekers, R. D., & Bell, J. F. 2001, in IAU Symp. 196, Preserving the Astronomical Sky, Vienna, eds. R. J. Cohen & W. T. Sullivan (San Francisco: ASP)

Ekers, R. D., & Bell, J. F. 2002, in IAU Symp. 199, The Universe at Low Radio Frequencies, ed. P. Rao (San Francisco: ASP)

Harwit, M. 1981, Cosmic Discovery (New York: Basic Books, Inc)

Hopkins, A., Windhorst, R., Cram, L., & Ekers, R. 2000, Experimental Astronomy, 10, 419

Kesteven, M. 2003, in New Technologies in VLBI, Gyeongju, Korea, Nov 5-8, 2002 (San Francisco: ASP)

Livingston, M. S., & Blewett, J. P. 1962, Particle Accelerators (New York: McGraw-Hill)

Sault, R. J. & Cornwell, T. J. 1999, ASP Conf. Ser. 180, Synthesis Imaging in Radio Astronomy II, eds. G. B. Taylor, C. L. Carilli, and R. A. Perley (San Francisco: ASP) 657

Sessler, A. M. 1988, Physics Today, 41, 26

Taylor, A. R., & Braun, R. eds. 1999, Science with the Square Kilometre Array

Tozzi, P., Madau, P., Meiksin, A. & Rees, M. J. 2000, ApJ, 528, 597

Weinreb, S. & D'Addario, L. 2001, SKA Memo Series, 1

Wright, M. 2002, SKA Memo Series, 21


