Many statistical models and learning machines are not regular but singular, hence the conventional theory based on asymptotic normality cannot be employed in their analysis. This paper explains a new mathematical theory, founded on algebraic geometry, which clarifies the asymptotic behaviors of the generalization loss and the free energy in singular models. A short history of how this theory has been applied to real-world problems is also given.
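For orientation, the asymptotic results alluded to above can be sketched in the standard notation of singular learning theory (this is an outline under common assumptions, not a statement of this paper's theorems): with $n$ samples, $\lambda$ the real log canonical threshold of the model-prior pair, and $m$ its multiplicity,

```latex
\mathbb{E}[G_n] \;=\; \frac{\lambda}{n} + o\!\left(\frac{1}{n}\right),
\qquad
F_n \;=\; n S_n + \lambda \log n - (m-1)\log\log n + O_p(1),
```

where $G_n$ is the generalization loss, $F_n$ the free energy, and $S_n$ the empirical entropy term. In regular models $\lambda = d/2$ (half the parameter dimension) and $m = 1$, recovering the classical asymptotics; in singular models $\lambda$ is generally smaller, which is what the algebraic-geometric analysis quantifies.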