Time Multiplexing Super Resolving Technique for Imaging from a Moving Platform

Published: February 12, 2014
doi: 10.3791/51148

Summary

A method for overcoming the optical diffraction limit is presented. The method comprises a two-step process: optical phase retrieval using an iterative Gerchberg-Saxton algorithm, followed by shifting of the imaging system and repetition of the first step. A synthetically increased lens aperture is generated along the direction of movement, yielding higher imaging resolution.

Abstract

We propose a method for increasing the imaging resolution and overcoming the diffraction limit of an optical system mounted on a moving platform, such as an airborne platform or a satellite. The resolution improvement is obtained in a two-step process. First, three low resolution, differently defocused images are captured and the optical phase is retrieved using an improved iterative Gerchberg-Saxton based algorithm. The retrieved phase allows the field to be numerically back-propagated to the aperture plane. Second, the imaging system is shifted and the first step is repeated. The optical fields obtained at the aperture plane are combined, and a synthetically increased lens aperture is generated along the direction of movement, yielding higher imaging resolution. The method resembles a well-known approach from the microwave regime called Synthetic Aperture Radar (SAR), in which the antenna size is synthetically increased along the platform propagation direction. The proposed method is demonstrated through a laboratory experiment.

Introduction

In radar imaging, a narrow-angle beam of pulsed radio frequency (RF) radiation is transmitted by an antenna mounted on a platform. The radar signal is transmitted in a side-looking direction towards the surface1,2. The reflected signal is backscattered from the surface and received by the same antenna2. The received signals are converted into a radar image. In Real Aperture Radar (RAR), the resolution in the azimuth direction is proportional to the wavelength and inversely proportional to the aperture dimension3. Thus, a larger antenna is required for higher azimuth resolution. However, it is difficult to attach a large antenna to moving platforms such as airplanes and satellites. In 1951, Wiley4 suggested a new radar technique called Synthetic Aperture Radar (SAR), which uses the Doppler effect created by the movement of the imaging platform. In SAR, the amplitude as well as the phase of the received signal is recorded5. This is possible because the SAR carrier frequency is about 1-100 GHz6, and the phase is recorded using a reference local oscillator installed on the platform. Optical imaging uses much shorter wavelengths, in the visible and near infrared (NIR) range of about 1 μm, i.e. frequencies of about 10^14 Hz. There, the field intensity, rather than the field itself, is detected, since the optical phase changes too fast to be followed by standard silicon based detectors.
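
For orientation, the scaling described above can be written explicitly. In the standard textbook forms below (the symbols R for slant range and D for the physical antenna length are our notation, not taken from the references), the azimuth resolution of a real aperture and of an ideal focused synthetic aperture are approximately

```latex
\delta_{\mathrm{RAR}} \approx \frac{\lambda R}{D},
\qquad
\delta_{\mathrm{SAR}} \approx \frac{D}{2},
```

so enlarging the aperture, physically or synthetically, directly sharpens the azimuth resolution.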

When imaging an object through an optical system, the aperture of the optics acts as a low-pass filter, so the high-spatial-frequency information of the object is lost7. In this paper we aim to address each of the above mentioned issues separately, i.e. the loss of the phase and the diffraction limit.
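
As a rough, hedged worked example based on the components listed in the Materials table (635 nm laser, f = 100 mm lens, 2.5 mm square aperture), and taking the image distance to be approximately the focal length, the diffraction-limited spot size at the camera is on the order of

```latex
\Delta x \approx \frac{\lambda f}{D}
        = \frac{0.635\,\mu\mathrm{m} \times 100\,\mathrm{mm}}{2.5\,\mathrm{mm}}
        \approx 25\,\mu\mathrm{m} .
```

Tripling the aperture along the scan direction, as done in the protocol below, reduces this figure to roughly 8.5 μm in that direction only.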

Gerchberg and Saxton (G-S)8 suggested that the optical phase can be retrieved using an iterative process. Misell9-11 extended the algorithm to any two input and output planes. These approaches were proven to converge to a phase distribution with a minimal mean square error (MSE)12,13. Gur and Zalevsky14 presented a three-plane method that improves on the Misell algorithm.

We propose, and demonstrate experimentally, that retrieving the phase while shifting the imaging lens, as is done with the antenna in SAR, allows us to synthetically increase the effective size of the aperture along the scanning axis and eventually improve the resulting imaging resolution.

The application of SAR concepts to optical imaging using interferometry and holography is well known16,17. However, the suggested method is aimed at mimicking a scanning imaging platform, making it suitable for noncoherent imaging (such as a side-looking airborne platform). Thus, the concept of holography, which uses a reference beam, is not suitable for such an application. Instead, a revised Gerchberg-Saxton algorithm is used to retrieve the phase.

Protocol

1. Setup Alignment

  1. Start by roughly aligning the laser, the beam expander, the lens, and the camera on the same axis; this will be the optical axis.
  2. Turn on the laser (without the USAF target), and make sure that the light passes through the center of the lens. Use an iris aperture to verify.
  3. Turn on the camera, and make sure that the light focuses on the center of the camera.
  4. Shift the camera back using the linear z stage. As the system goes out of focus, the spot of light will grow. Make sure that the center of the spot remains in the same lateral position. If not, carefully adjust the position of the imaging system and repeat this step until the spot remains at the same lateral position, to within a pixel.

2. Imaging at Three Defocus Planes

  1. Insert the test target in front of the beam expander. Place the target so that the light passing through it also passes through the center of the lens.
  2. Capture an image. This image will serve as the anchor point, and its location will be (z0, x0); all other images will be referenced to this location. This image will be I1,b.
  3. Shift back the camera (using the linear z stage) a distance of dz = 5.08 mm (or 0.2 in) and capture an image. This image will be I2,b.
  4. Shift back the camera another distance of dz = 5.08 mm (10.16 mm relative to z0) and capture an image. This image will be I3,b.
  5. Go back to z0.

3. Scanning the Aperture

  1. Shift the entire imaging system laterally (using the linear x stage) a distance of dx = 2.5 mm and capture an image. This image will be I1,a.
  2. Repeat the process in Protocol 2. Shift back the camera (using the linear z stage) a distance of dz = 5.08 mm, and capture an image (I2,a). Shift back the camera another distance of dz = 5.08 mm, and capture an image (I3,a).
  3. Now, repeat the procedure for the other side. Return to z0, shift the imaging system to x = x0 - 2.5 mm (i.e. dx = -2.5 mm relative to x0), and capture a set of three images at the three z positions (I1-3,c).
  4. Go back to z0,x0.

4. Phase Retrieval (Numerical Calculation)

  1. Using the three-plane method14 and images I1-3,b, retrieve the optical phase of image I1,b. Using the retrieved phase, define the complex field q1,b (a minimal numerical sketch is given after this list).
  2. Monitor the correlation coefficient between I1,b and |q1,b|2, in order to verify that the iterative process does converge. To do so, use the 'corr2' function in MATLAB.
  3. Repeat the phase retrieval process for I1-3,a, and I1-3,c.
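
The following MATLAB sketch illustrates steps 4.1-4.2. It is an illustration written for this article, not the authors' code: the pixel pitch dp, the transfer-function form of the Fresnel propagator (used here in place of the Fresnel integral), and the exact ordering of the three intensity constraints are assumptions, and the variable names (I1, I2, I3 for the three defocused intensity images, q1b for the retrieved field) are ours.

```matlab
% Assumed sampling and geometry (not specified numerically in the protocol text)
lambda = 635e-9;                 % wavelength of the red laser module [m]
dz     = 5.08e-3;                % axial step between the three capture planes [m]
dp     = 5.2e-6;                 % camera pixel pitch [m] (assumed value)
nIter  = 10000;                  % iterations (Figure 4 suggests ~10,000 suffice)

I1 = double(I1); I2 = double(I2); I3 = double(I3);   % three defocused intensity images
[Ny, Nx] = size(I1);
fx = (-Nx/2:Nx/2-1)/(Nx*dp);     % spatial-frequency axes [1/m]
fy = (-Ny/2:Ny/2-1)/(Ny*dp);
[FX, FY] = meshgrid(fx, fy);

% Centered FFT pair and paraxial (Fresnel) transfer-function propagator
fftc  = @(u) fftshift(fft2(ifftshift(u)));
ifftc = @(u) fftshift(ifft2(ifftshift(u)));
prop  = @(u, z) ifftc(fftc(u) .* exp(-1i*pi*lambda*z*(FX.^2 + FY.^2)));

% Three-plane Gerchberg-Saxton style iteration
q  = sqrt(I1);                   % initial guess: measured amplitude, zero phase
cc = zeros(nIter, 1);
for k = 1:nIter
    q = prop(q,  dz);  q = sqrt(I2) .* exp(1i*angle(q));   % enforce intensity at plane 2
    q = prop(q,  dz);  q = sqrt(I3) .* exp(1i*angle(q));   % enforce intensity at plane 3
    q = prop(q, -2*dz);                                    % propagate back to plane 1
    cc(k) = corr2(I1, abs(q).^2);                          % convergence monitor (step 4.2)
    q = sqrt(I1) .* exp(1i*angle(q));                      % enforce intensity at plane 1
end
q1b = q;                         % retrieved complex field at the reference plane
figure; plot(cc); xlabel('iteration'); ylabel('correlation coefficient');
% Repeating the same loop with images I1-3,a and I1-3,c (step 4.3) yields q1a and q1c.
```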

5. Super Resolved Image (Numerical Calculation)

  1. Using the Fresnel free space propagation (FSP) integral15, back-propagate the fields q1,a-c to the lens plane. These fields will be Êlens,a-c+ (a numerical sketch of steps 5.1-5.6 is given after this list).
  2. Multiply the resulting fields Êlens,a-c+ by exp(+iπx0²/(λf)), in order to pass back through the lens. These fields will be Êlens,a-c.
  3. In order to place the field Êlens,a in its original position, shift it laterally a distance of dx = 2.5 mm.
  4. In order to place the field Êlens,c in its original position, shift it laterally a distance of dx = -2.5 mm.
  5. Sum the three fields Êlens,a-c, in order to combine them, and synthetically increase the aperture size.
  6. Multiply the resulting field by exp(-iπx0²/(λf)), and free-space propagate it to the image plane.
  7. A resolution improvement by a factor of 3 in the scanning direction should be observed.
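
The following MATLAB sketch illustrates steps 5.1-5.6 under the same assumptions as the sketch in section 4. The lens-to-camera distance zi, the full 2-D quadratic lens phase (the protocol writes the factor in terms of the lens-plane coordinate x0), and the pixel-wise circshift used for the lateral re-positioning are simplifications introduced here for illustration; q1a, q1b, and q1c denote the fields retrieved in section 4 at the three lateral positions.

```matlab
lambda  = 635e-9;   dp = 5.2e-6;               % wavelength and assumed pixel pitch [m]
f       = 100e-3;                              % focal length of the imaging lens [m]
zi      = 200e-3;                              % lens-to-camera distance [m] (assumed)
dxShift = 2.5e-3;                              % lateral scan step of the system [m]

[Ny, Nx] = size(q1b);                          % q1a, q1b, q1c: fields retrieved in section 4
fx = (-Nx/2:Nx/2-1)/(Nx*dp);   fy = (-Ny/2:Ny/2-1)/(Ny*dp);
[FX, FY] = meshgrid(fx, fy);
x  = (-Nx/2:Nx/2-1)*dp;        y  = (-Ny/2:Ny/2-1)*dp;
[X, Y] = meshgrid(x, y);

fftc  = @(u) fftshift(fft2(ifftshift(u)));
ifftc = @(u) fftshift(ifft2(ifftshift(u)));
prop  = @(u, z) ifftc(fftc(u) .* exp(-1i*pi*lambda*z*(FX.^2 + FY.^2)));
lensPhase = exp(1i*pi*(X.^2 + Y.^2)/(lambda*f));   % conjugate of the thin-lens factor

% 5.1-5.2: back-propagate each field to the lens plane and pass back through the lens
Ea = prop(q1a, -zi) .* lensPhase;
Eb = prop(q1b, -zi) .* lensPhase;
Ec = prop(q1c, -zi) .* lensPhase;

% 5.3-5.4: return the side fields to their true lateral positions
nShift = round(dxShift/dp);                    % lateral shift in pixels
Ea = circshift(Ea, [0,  nShift]);              % +dx (wrap-around assumed negligible)
Ec = circshift(Ec, [0, -nShift]);              % -dx

% 5.5-5.6: sum into a synthetically enlarged aperture, re-apply the lens phase,
% and propagate forward to the image plane
Esynth = (Ea + Eb + Ec) .* conj(lensPhase);
imgSR  = abs(prop(Esynth, zi)).^2;             % super resolved intensity image
figure; imagesc(imgSR); axis image; colormap gray;
```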

Representative Results

An example of the nine captured images (three defocused images at each of three lateral positions) is shown in Figure 3.

An example of the G-S convergence is shown in Figure 4. The correlation coefficient for the central image I1,b is above 0.95, and the correlation coefficients for the side images I1,a and I1,c are above 0.85 (in the full numerical simulation they all exceeded 0.99).

A representative result for the SR image is presented in Figure 5. In the LR image none of the resolution bars are visible. However, in the SR image the horizontal bars are visible, up to the third element on the right. Notice that since our method synthetically increases the aperture only in the x direction (the movement direction), there is no improvement in the vertical bars.

Figure 1
Figure 1. Full experimental laboratory setup. The experimental laboratory setup contains a laser and beam expander, a USAF test target, a lens and an aperture, a camera, and two linear stages.

Figure 2
Figure 2. The imaging system. The imaging system is positioned on top of two linear translation stages, allowing precise movement in the x and z directions.

Figure 3
Figure 3. Laboratory acquired low resolution images. Nine laboratory acquired low resolution images from which the optical phase was retrieved and the super resolved image was generated. Images I1-3,a were acquired at different z positions at x = x0 + dx. Similarly, images I1-3,b were acquired at x = x0, and images I1-3,c were acquired at x = x0 - dx.

Figure 4
Figure 4. Correlation coefficient results. Laboratory results of the correlation coefficient between the obtained intensities |q1,a-c|2 and the original images I1,a-c.

Figure 5
Figure 5. SR results. Laboratory results after 100,000 G-S iterations. Left, the original high resolution object. Middle, blurred low resolution image. Right, the obtained super resolved image.

Discussion

The optical synthetic aperture radar (OSAR) concept presented in this paper is a new super resolving approach that uses the G-S algorithm and a scanning technique in order to improve the spatial resolution of an object in the direction of the scan. The movement of the imaging platform can be self-generated when using an airborne or satellite platform. Unlike many time multiplexing SR techniques, our method does not require any a priori information about the object, other than the fact that it is stationary during the imaging process. The demonstrated technique provides a resolution improvement by a factor of 3 in the scanning direction. The factor of 3 is just an example, and larger improvement factors are also feasible. However, the synthetic aperture improvement is limited and cannot yield a synthetic F-number of less than 1. In order to extend the SR into 2-D, the scanning process should be repeated in the y direction. The proposed optical concept resembles the SAR resolution improvement technique applied in the microwave regime.
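
As a hedged worked example using the numbers from this protocol (2.5 mm physical aperture D, 2.5 mm scan step Δx, M = 3 lateral positions, f = 100 mm; the symbols are our notation), the synthetic aperture and the corresponding synthetic F-number along the scan direction are approximately

```latex
D_{\mathrm{synth}} = D + (M-1)\,\Delta x = 2.5\,\mathrm{mm} + 2 \times 2.5\,\mathrm{mm} = 7.5\,\mathrm{mm},
\qquad
F\#_{\mathrm{synth}} = \frac{f}{D_{\mathrm{synth}}} = \frac{100\,\mathrm{mm}}{7.5\,\mathrm{mm}} \approx 13.3 ,
```

which is still far above the synthetic F-number limit of 1 mentioned above.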

Several improvements can be made to the setup in order to make it more practical. For example, using beam splitters, three cameras can be introduced into the setup to capture the three defocused images simultaneously.

The total run time for the presented results, which consisted of 100,000 iterations at each of the three lateral positions, was ~30 hr; each G-S iteration took about 0.3 sec (3 × 100,000 × 0.3 sec ≈ 25 hr). Implementing the algorithm as an optimized real-time program can reduce the processing time by a factor of about 100,000; thus, the total processing time can be only a few seconds. Also note that, as can be seen from Figure 4, 100,000 iterations are not needed, since convergence already occurs after about 10,000 iterations.

Disclosures

The authors have nothing to disclose.

Acknowledgements

None

Materials

Name | Company | Catalog Number
Red Laser Module | Thorlabs | LDM635
10X Galilean Beam Expander | Thorlabs | BE10M-A
Negative 1951 USAF Test Target | Thorlabs | R3L3S1N
Filter Holder for 2" Square Filters | Thorlabs | FH2
1" Linear Translation Stage (x2) | Thorlabs | PT1
Lens Mount for Ø1" Optics | Thorlabs | LMR1
Lens, f = 100.0 mm | Thorlabs | AC254-100-A
Graduated Ring-Activated Iris Diaphragm | Thorlabs | SM1D12C
2.5 × 2.5 mm Aperture, Ø1" | In-house production
High Resolution CMOS Camera | Thorlabs | DCC1545M

References

  1. De Loor, G. P. Possibilities and uses of radar and thermal infrared systems. Photogrammetria. 24, 43-58 (1969).
  2. Simonett, D. S. Remote sensing with imaging radar: A review. Geoforum, 61-74 (1970).
  3. Born, M., Wolf, E. Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light (1999).
  4. Wiley, C. A. Synthetic aperture radars-a paradigm for technology evolution. IEEE Trans. Aerospace Elec. Sys. 21, 440-443 (1985).
  5. Brown, W., Porcello, L. An introduction to synthetic-aperture radar. IEEE Spectrum 6, 52-62 (1969).
  6. Cheney, M., Borden, B. Fundamentals of Radar Imaging. SIAM (2008).
  7. Lummer, O., Reiche, F. Die Lehre von der Bildentstehung im Mikroskop von Ernst Abbe. Vieweg, Braunschweig (1910).
  8. Gerchberg, W. R., Saxton, W. O. A practical algorithm for the determination of phase from image and diffraction plane pictures. Optik. 35, 237-246 (1972).
  9. Misell, D. L. A method for the solution of the phase problem in electron microscopy. J. Phys. D Appl. Phys. 6, (1973).
  10. Misell, D. L. An examination of an iterative method for the solution of the phase problem in optics and electron optics: I. Test calculations. J. Phys. D Appl. Phys. 6, 2200-2216 (1973).
  11. Misell, D. L. An examination of an iterative method for the solution of the phase problem in optics and electron optics. II. Sources of error. J. Phys. D Appl. Phys. 6, 2217-2225 (1973).
  12. Fienup, J. R. Reconstruction of an object from the modulus of its Fourier transform. Optics Lett. 3, 27-29 (1978).
  13. Fienup, J. R. Phase retrieval algorithms: a comparison. Appl. Optics. 21, 2758-2769 (1982).
  14. Gur, E., Zalevsky, Z. Image deblurring through static or time-varying random perturbation medium. J. Electron. Imaging. 18, 033016 (2009).
  15. Goodman, J. W. Introduction to Fourier Optics. Roberts & Company. , (2005).
  16. Tippie, A. E., Kumar, A., Fienup, J. R. High-resolution synthetic-aperture digital holography with digital phase and pupil correction. Optics Express. 19, 12027-12038 (2011).
  17. Lim, S., Choi, K., Hahn, J., Marks, D. L., Brady, J. Image-based registration for synthetic aperture holography. Optics Express. 19, 11716-11731 (2011).

Cite This Article
Ilovitsh, A., Zach, S., Zalevsky, Z. Time Multiplexing Super Resolving Technique for Imaging from a Moving Platform. J. Vis. Exp. (84), e51148, doi:10.3791/51148 (2014).
