The quantitative effects of dislocations on the electrical and optical properties of long‐wavelength infrared (LWIR) HgCdTe photovoltaic detectors were determined by deliberately introducing dislocations into localized regions of two high‐performance arrays having cutoff wavelengths of 9.5 and 10.3 μm at T=78 K. Results show that dislocations can have a dramatic effect on the detector R0A product, particularly at temperatures below 78 K. For large dislocation densities, R0A decreases as the square of the dislocation density; the onset of the square dependence occurs at progressively lower dislocation densities as the temperature decreases. A phenomenological model was developed which describes the dependence of the detector R0A product on dislocation density, based on the conductances of individual and interacting dislocations which shunt the pn junction. Spectral response and quantum efficiency are only weakly affected, as is the diffusion component of the leakage current. The 1/f noise current was found to increase approximately linearly with dislocation density and also tracks the magnitude of the leakage current, following a trend line established for undamaged HgCdTe detectors. These results can be used to understand the performance limitations of LWIR HgCdTe arrays fabricated on heteroepitaxial substrates.
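The dependence described above can be illustrated with a minimal sketch of a shunt-conductance model: the zero-bias junction conductance per unit area is taken as the sum of a bulk (diffusion) term, a term linear in dislocation density from isolated shunting dislocations, and a quadratic term from interacting dislocation pairs. This is an illustrative form only, not the paper's fitted model, and all coefficient values below are hypothetical.

```python
def r0a(n_dis, g_bulk=1e-3, g_single=1e-10, g_pair=1e-17):
    """Illustrative R0A (ohm cm^2) versus dislocation density n_dis (cm^-2).

    Assumed form (hypothetical coefficients, not the paper's fit):
        1/(R0A) = g_bulk + g_single * n_dis + g_pair * n_dis**2
    g_bulk   : bulk diffusion conductance per unit area
    g_single : conductance contribution per isolated dislocation
    g_pair   : contribution from interacting dislocation pairs
    """
    return 1.0 / (g_bulk + g_single * n_dis + g_pair * n_dis ** 2)

# At low density R0A is nearly flat (bulk-limited); at high density the
# quadratic term dominates, so R0A falls roughly as 1/n_dis**2, matching
# the onset behavior described in the abstract.
low_density = r0a(1e4)    # close to the bulk-limited value 1/g_bulk
high_ratio = r0a(1e8) / r0a(2e8)   # approaches 4 as the n^2 term dominates
```

Lowering the temperature in the real devices corresponds here to reducing `g_bulk` relative to the dislocation terms, which moves the onset of the square-law dependence to lower dislocation densities.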
