Microwaves are widely used in fusion-relevant experiments for heating and diagnostic purposes. They suffer, however, from the fact that they have to traverse the plasma boundary, a region where substantial plasma density fluctuations are known to occur. These fluctuations can significantly distort the beam and thus reduce the efficiency of heating schemes that rely on good localization of the injected microwaves (e.g. stabilization of neoclassical tearing modes).
Geometrical-optics tools cannot describe the interaction between small-scale plasma density turbulence and the microwave. Instead, full-wave codes can be used, which do not rely on any simplifying assumptions about the density variations.
Here, we present a full-wave simulation (using the FDTD code IPF-FDMC) of a microwave beam with a frequency of 50 GHz injected into a plasma with a linearly increasing background density (colour-coded in the video). A layer of turbulent density fluctuations has been added on top of this background, as can be seen in the video. Also plotted is the time evolution of the absolute value of the wave electric field (the microwave is injected from the right-hand side).
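To illustrate the full-wave (FDTD) approach in its simplest form, the sketch below is a minimal 1-D cold-plasma FDTD scheme, not the IPF-FDMC code itself: the function name `fdtd_plasma_1d`, the normalized units (c = ε₀ = μ₀ = 1), and all parameter values are illustrative assumptions. The plasma enters through a current density J obeying dJ/dt = ωp²E, which reproduces the O-mode dispersion relation ω² = k² + ωp²; with a linearly increasing background density, the launched wave propagates up to the cutoff layer (ωp = ω) and becomes evanescent beyond it, with no geometrical-optics assumptions made about the density profile.

```python
import numpy as np

def fdtd_plasma_1d(nx=400, nt=2400, omega=0.3, dt=0.5):
    """Minimal 1-D FDTD sketch: wave in a cold unmagnetized plasma.

    Normalized units (c = eps0 = mu0 = 1, dx = 1). The plasma response
    enters via a current density J with dJ/dt = wp^2 * E, giving the
    O-mode dispersion relation w^2 = k^2 + wp^2.
    """
    x = np.arange(nx, dtype=float)
    # Linearly increasing background density: wp^2 rises from 0 to
    # 4*omega^2, so the cutoff layer (wp = omega) sits at x = nx/4.
    wp2 = 4.0 * omega**2 * x / nx
    E = np.zeros(nx)
    H = np.zeros(nx)
    J = np.zeros(nx)
    for n in range(nt):
        # Yee-staggered leapfrog updates of the two curl equations.
        H[:-1] += dt * (E[1:] - E[:-1])
        E[1:] += dt * (H[1:] - H[:-1]) - dt * J[1:]
        # Soft source near one edge, ramped smoothly to avoid
        # broadband turn-on transients.
        ramp = 0.5 * (1.0 - np.cos(np.pi * min(1.0, n * dt / 200.0)))
        E[5] += dt * ramp * np.sin(omega * n * dt)
        # Cold-plasma current response: dJ/dt = wp^2 * E.
        J += dt * wp2 * E
    return x, E

x, E = fdtd_plasma_1d()
# |E| is large in the propagating region before the cutoff layer
# and negligible deep inside the evanescent region behind it.
```

A turbulent layer like the one in the video could be modelled by superposing density fluctuations on `wp2` before the time loop; the update equations are unchanged, which is exactly why full-wave codes handle arbitrary small-scale density variations.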