Cambridge University Press, 2004, 555 pp.
As processing technology continues its rapid growth, it occasionally causes us to take a new point of view toward many long-established technological disciplines. It is now possible to record precisely such signals as ultrasonic, X-ray, or electromagnetic signals in the radio and radar bands and, using advanced digital or optical processors, to process these records to extract information deeply buried within the signal. Such processing requires the development of algorithms of great precision and sophistication. Until recently, such algorithms were often beyond the capabilities of most processing technologies, and so there was no real impetus to develop a general, unified theory of these algorithms. Consequently, it was scarcely noticed that a general theory might be developed, although some special problems were well studied. Now the time is ripe for a general theory of these algorithms, which are called algorithms for remote image formation, or algorithms for remote surveillance. This topic of image formation is a branch of the broad field of informatics.
Systems for remote image formation and surveillance have developed independently in diverse fields over the years. They are often very much driven by the kind of hardware that is used to sense the raw data or to do the processing. The principles of operation, then, are described in a way that is completely intertwined with a description of the hardware. Recently, there has been interest in abstracting the common signal processing and information-theoretic methods that are used by these sensors to extract the useful information from the raw data. This unification is one way in which to make new advances because then the underlying principles can be more clearly understood, and ideas that have already been developed in one area may prove to be useful in others. A unified formulation is also an efficient way to teach the many topics as one integrated subject.
Surveillance theory, which we regard as comprising the various information-theoretic and computational methods underlying remote image formation systems, is only now starting to emerge as an integrated field of study. This is in marked contrast to the development of the subject of communication theory in which radio, telephone, television, and telegraphy were always seen as parts of a general theory of communication. For a long time, communication theory has been treated as an integrated field of study while surveillance theory has not.
This book is devoted to a unified treatment of the mathematical methods that underlie the various sensors for remote surveillance. It is not a book that describes the hardware used to build such systems. Rather, it is a book that describes the mathematics used to design the illumination waveforms and to develop the algorithms that will form the images or extract the desired information.
Because my goal is a unified presentation of the mathematical principles underlying the development of remote surveillance, the book is constructed so that the core is the ubiquitous two-dimensional Fourier transform. In addition, the mathematical topics of coherence, correlation functions, the ambiguity function, the Radon transform, and the projection-slice theorem play important roles. The applications, therefore, are introduced as consequences or examples of the mathematical principles, whereas the more traditional treatment of these subjects begins with a description of a sensing apparatus and lets the mathematical principles arise only as needed. While many would prefer that approach – and certainly it is closer to the history of these subjects – I maintain that it is a less transparent approach and does not advance the goal of unification.
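To give one concrete instance of this machinery, the projection-slice theorem (stated here in generic notation of my own, not necessarily the book's) says that the one-dimensional Fourier transform of a projection of an image is a central slice of the image's two-dimensional Fourier transform:

    p_\theta(t) = \int_{-\infty}^{\infty} s(t\cos\theta - u\sin\theta,\; t\sin\theta + u\cos\theta)\, du,
    \qquad
    P_\theta(f) = S(f\cos\theta,\; f\sin\theta),

where S(f_x, f_y) is the two-dimensional Fourier transform of the image s(x, y) and P_\theta(f) is the one-dimensional Fourier transform of the projection p_\theta(t). This single relation is the bridge between tomographic measurements and Fourier-domain image reconstruction.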
Many of the physical aspects of a surveillance system – such as a radar or sonar system or a microwave antenna – are complicated and difficult to model exactly. Some may conclude from this that it is pointless to develop a mathematical theory that is honed to a sharpness beyond the sharpness of the physical situation. I take exactly the opposite view: If a rich mathematical theory can be developed, it should be developed as far as is possible. It is in the application of the theory that care must be exercised. For example, the interaction of electromagnetic waves with reflectors or with antennas can be quite complex and incompletely understood. Nevertheless, we can model the interaction in some simpler way and then carry the study of that model as far as it will go. To object to this would be analogous to objecting to the study of linear differential equations on the grounds that every physical system has some degree of nonlinearity. Thus our primary emphasis is on an axiomatic formulation of the mathematical principles of surveillance theory rather than a phenomenological development from the underlying physical laws. The book treats surveillance theory in a formal way as a topic in applied mathematics. The engineering task, then, is to choose judiciously and combine the elements developed here with due regard to the capricious behavior of physical devices.
The two-dimensional Fourier transform (or the three-dimensional Fourier transform) is central to most of the systems developed, and we shall always take pains to bring the role of the two-dimensional Fourier transform to the forefront. Huygens’ principle, which is at the core of much of optics and antenna theory, but often is not treated rigorously, will be presented simply as a mathematical consequence of the convolution theorem of two-dimensional Fourier transform theory. In turn, the study of the far-field diffraction pattern of antennas will be largely reformulated as the study of the two-dimensional Fourier transform. Even the near-field diffraction pattern can be studied with the aid of the Fourier transform.
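As a sketch of that reformulation (in generic notation of my own, with sign and scale conventions left aside), the far-field (Fraunhofer) pattern of an aperture with illumination a(x, y) is proportional to the two-dimensional Fourier transform of the illumination:

    E(f_x, f_y) \propto \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} a(x, y)\, e^{-j 2\pi (f_x x + f_y y)}\, dx\, dy,
    \qquad f_x = \frac{\sin\theta\cos\phi}{\lambda}, \quad f_y = \frac{\sin\theta\sin\phi}{\lambda},

so that statements about beamwidth and sidelobes become statements about the Fourier transform of the aperture function. Huygens' principle enters in the same spirit: the field on a plane some distance away can be written as the convolution of the aperture field with a free-space point response, which is where the convolution theorem does its work.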
The book also describes the behavior of an imaging or search radar system from the vantage point of the two-dimensional Fourier transform. Specifically, it views the output of a radar’s front-end signal processing as the two-dimensional convolution of a radar ambiguity function and the reflectivity density function of the illuminated scene. This is an elegant formulation, and it is very powerful in that it submerges all the myriad details of image formation while leaving exposed the underlying limitations on resolution and estimation accuracy, as well as the nature of ambiguities and clutter.
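In symbols (again in generic notation, not necessarily the book's), if \chi(\tau, \nu) is the ambiguity function of the transmitted waveform and \rho(\tau, \nu) is the reflectivity density of the scene expressed in delay-Doppler coordinates, the processed output is

    r(\tau, \nu) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \rho(\tau', \nu')\, \chi(\tau - \tau', \nu - \nu')\, d\tau'\, d\nu',

a two-dimensional convolution in which \chi plays the role of a point-spread function: its mainlobe sets the resolution, and its sidelobes and recurrent peaks account for ambiguities and clutter.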
Signals in one dimension
Signals in two dimensions
Optical imaging systems
Antenna systems
The ambiguity function
Radar imaging systems
Diffraction imaging systems
Construction and reconstruction of images
Tomography
Likelihood and information methods
Radar search systems
Passive and baseband surveillance systems
Data combination and tracking
Phase noise and phase distortion