Abstract
The effect of free-surface deformation on the onset of surface-tension-driven instability in a horizontal thin liquid layer subjected to a vertical temperature gradient is examined using linear stability theory. Assuming that the neutral state is stationary, the conditions under which instability sets in are determined in detail. It is shown that when the upper side of the liquid layer is a free surface, the deformation of that surface is important only for unusually thin layers of very viscous liquids. It is also shown that when the underside of the liquid layer is a free surface, the surface deformation plays an essential role, and the presence of a vertical temperature gradient can stabilize the layer, which is always unstable in the absence of a temperature gradient.