Description
In the solar atmosphere, radiation plays an important role in the energy balance. Absorption and emission of photons in transitions between atomic energy levels can either heat or cool the local atmosphere, and their net contribution is expressed as the radiative flux divergence, referred to as the radiative losses. Detailed calculations can be computationally expensive, especially in the chromosphere, where the assumption of local thermodynamic equilibrium breaks down.
Based on the recipe of approximate radiative losses for the quiet Sun, we construct a new recipe for solar flares, in which the chromosphere undergoes drastic changes. We tabulate the optically thin radiative loss, the escape probability, and the ionization fraction using a grid of flare models from radiative hydrodynamic simulations as our dataset.
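For illustration, a minimal sketch of how such a tabulated recipe can be evaluated is given below. The table and variable names are hypothetical, and the expression assumes the same product form as the quiet-Sun recipe (optically thin loss times escape probability times ionization fraction times electron and hydrogen number densities); it is not the exact implementation used in this work.

import numpy as np

def approximate_loss(T, col_mass, n_e, n_H,
                     T_grid, loss_tab, m_grid, esc_tab, ion_tab, abundance):
    """Approximate radiative loss rate (erg cm^-3 s^-1) for a single element.

    T_grid/loss_tab, T_grid/ion_tab and m_grid/esc_tab hold the tabulated
    optically thin loss, ionization fraction and escape probability
    (hypothetical names; tabulated here against log temperature and
    log column mass, with the grids assumed to be monotonically increasing).
    """
    L = np.interp(np.log10(T), np.log10(T_grid), loss_tab)        # optically thin loss
    E = np.interp(np.log10(col_mass), np.log10(m_grid), esc_tab)  # escape probability
    f = np.interp(np.log10(T), np.log10(T_grid), ion_tab)         # ionization fraction
    return -L * E * f * abundance * n_e * n_H                     # negative = cooling

The same call works elementwise on whole atmospheric columns, so the recipe can replace the detailed solver inside a hydrodynamic time step at negligible cost.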
We also evaluate the performance of different recipes for chromospheric radiative losses in flare simulations and find that our recipe provides a better approximation to the detailed radiative losses, especially for large flares.
The height-integrated radiative losses indicate how much energy escapes from the deep atmosphere as free photons. Previous studies found a good correlation between the height-integrated radiative losses and the wavelength-integrated emergent intensity of certain spectral lines, such as Ca II K. We therefore propose to use the height-integrated radiative losses as a proxy to synthesize Ly𝛼 images from MHD simulations. We apply this method to a Bifrost simulation and find that the synthesized image closely resembles the one obtained from detailed radiative transfer calculations.
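As a sketch of this proxy (array names are hypothetical, and any scaling to absolute intensity is omitted), the image is obtained by integrating the radiative losses over height in each column of the simulation cube:

import numpy as np

def proxy_lya_image(Q, z):
    """Height-integrated radiative losses as a proxy Ly-alpha image.

    Q : ndarray (nx, ny, nz), approximate Ly-alpha loss rate per unit
        volume (erg cm^-3 s^-1), e.g. from a recipe like the one above
    z : ndarray (nz,), height grid (cm)

    Returns a 2D map (erg cm^-2 s^-1) used as a proxy for the
    wavelength-integrated emergent intensity.
    """
    # Losses are taken as positive (cooling) before integrating along height.
    return np.trapz(np.abs(Q), x=z, axis=-1)

The resulting map can then be compared, column by column, with the intensity image from a detailed radiative transfer calculation.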