Digital pre-distortion (DPD) is a linearization technique used to improve power amplifier (PA) linearity and efficiency. The DPD technique applies an expanding non-linearity to the baseband signal that complements the compressing characteristic of the RF PA. Ideally, the cascade of the pre-distorter and the PA becomes linear, and the original input is amplified by a constant gain. With the pre-distorter, you can drive a PA up to its saturation point while still maintaining good linearity, which increases efficiency.
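The following is a minimal numerical sketch (not taken from the source) of the complementary-nonlinearity idea: a hypothetical memoryless PA with cubic compression is preceded by an expanding pre-distorter, and the cascade stays approximately linear over the drive range.

```python
import numpy as np

def pa_compress(x, a3=-0.15):
    """Hypothetical memoryless PA model: linear gain with cubic compression."""
    return x + a3 * x * np.abs(x) ** 2

def predistort(x, a3=-0.15):
    """Expanding pre-distorter that approximately cancels the cubic compression."""
    return x - a3 * x * np.abs(x) ** 2

x = np.linspace(0, 1.0, 11)                 # normalized input amplitude
direct = pa_compress(x)                      # PA alone: gain compresses at high drive
linearized = pa_compress(predistort(x))      # pre-distorter + PA cascade

print("input  PA-only  DPD+PA")
for xi, d, l in zip(x, direct, linearized):
    print(f"{xi:5.2f}  {d:7.3f}  {l:7.3f}")
```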
The DPD method uses a signal generator to produce pre-distorted waveforms and a signal analyzer to extract the DPD model. The signal analyzer transfers the model parameters, such as a lookup table (LUT) or polynomial coefficients, to the signal generator, which applies the pre-distortion to the waveform. The pre-distorted waveform is then played back to the device under test (DUT) PA. This approach improves adjacent channel power ratio (ACPR) and error vector magnitude (EVM), bringing the PA closer to linear operation.
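Below is a sketch of one common way this extract-and-apply loop can be implemented, assuming a memoryless odd-order polynomial model and an indirect-learning fit; the PA model, stimulus, and coefficient values are illustrative, not from the source.

```python
import numpy as np

def basis(x, order=5):
    """Odd-order polynomial basis terms x, x|x|^2, x|x|^4, ..."""
    return np.column_stack([x * np.abs(x) ** (k - 1) for k in range(1, order + 1, 2)])

rng = np.random.default_rng(0)
tx = (rng.standard_normal(4096) + 1j * rng.standard_normal(4096)) * 0.2  # baseband stimulus

# Hypothetical PA: linear gain of 10 with third-order compression.
pa_out = 10 * (tx - 0.3 * tx * np.abs(tx) ** 2)

# Indirect learning: fit a post-inverse from the gain-normalized PA output
# back to the PA input, then reuse the coefficients as the pre-distorter.
y = pa_out / 10
coeffs, *_ = np.linalg.lstsq(basis(y), tx, rcond=None)

# The signal generator applies the extracted coefficients to the waveform,
# which is then played back through the PA.
predistorted = basis(tx) @ coeffs
linearized = 10 * (predistorted - 0.3 * predistorted * np.abs(predistorted) ** 2)

err = np.linalg.norm(linearized / 10 - tx) / np.linalg.norm(tx)
print(f"relative residual error after DPD: {err:.4f}")
```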