Currently I have a gas with a density that follows an inverse square law in distance, $\rho_{\mathrm{gas}}(r) \propto 1/r^2$. Given that I know the mass attenuation coefficient of this gas, I wish to calculate an effective optical depth using a modified version of the Beer-Lambert Law that uses mass attenuation coefficients:

$$\tau = \kappa_s\,\rho_s\,l$$
where $\kappa_s$ is the mass attenuation coefficient for the solid phase of the gas [cm$^2$ g$^{-1}$], $\rho_s$ is the mass density of the solid phase of the gas, $l$ is the path length, $M$ is the molar mass of the gas, $P(T)$ is the pressure of the gas as a function of temperature, $R$ is the ideal gas constant and $T$ is the temperature of the gas. $\rho_{\mathrm{gas}}$ is the mass density of the gas itself and can be extracted from the ideal gas law:

$$\rho_{\mathrm{gas}} = \frac{P(T)\,M}{R\,T}$$
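Putting these together for a uniform gas (here I am simply substituting $\rho_{\mathrm{gas}}$ for $\rho_s$ in the first equation, which is my assumption of how the mass attenuation coefficient carries over to the gas phase), the optical depth would be:

$$\tau = \kappa_s\,\rho_{\mathrm{gas}}\,l = \frac{\kappa_s\,P(T)\,M\,l}{R\,T}$$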
The integral

$$\tau = \kappa_s \int_0^l \rho_{\mathrm{gas}}(r)\,\mathrm{d}r$$

emerges from my attempt at rewriting the first equation for a non-uniform attenuation, which I have here because the inverse square law affects the density of the gas along the path.
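For concreteness, if I write the inverse square law with an explicit normalisation, $\rho_{\mathrm{gas}}(r) = \rho_0\,(r_0/r)^2$, where $\rho_0$ and $r_0$ are a reference density and reference distance I have introduced just for illustration, then the integral I am trying to evaluate looks like:

$$\tau = \kappa_s\,\rho_0\,r_0^2 \int_{r_0}^{r_0+l} \frac{\mathrm{d}r}{r^2} = \kappa_s\,\rho_0\,r_0^2\left(\frac{1}{r_0} - \frac{1}{r_0+l}\right)$$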
However, I am now concerned that the units no longer balance, since $\tau$ should be unitless. Can anyone help guide me here?
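In case it is clearer, here is a rough Python sketch of the calculation I am attempting; the values of kappa_s, rho_0, r_0 and l are placeholders rather than my real numbers:

```python
from scipy.integrate import quad

# Placeholder values, purely to illustrate the calculation (not my real numbers)
kappa_s = 0.5    # mass attenuation coefficient of the solid phase [cm^2 g^-1]
rho_0   = 1e-9   # gas density at the reference distance r_0 [g cm^-3]
r_0     = 1e5    # reference distance from the source [cm]
l       = 1e7    # path length along the line of sight [cm]

def rho_gas(r):
    """Inverse square density profile, rho(r) = rho_0 * (r_0 / r)**2  [g cm^-3]."""
    return rho_0 * (r_0 / r) ** 2

# Integrate the density along the path to get a column density [g cm^-2],
# then multiply by the mass attenuation coefficient [cm^2 g^-1].
column, _ = quad(rho_gas, r_0, r_0 + l)
tau = kappa_s * column
print(tau)
```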