When an unmanned aircraft flies autonomously in an extreme environment such as a disaster site, it must cope with environmental changes such as disturbances and the transition from outdoors to indoors. In this research, we aim to develop a supervisor-type navigation and control system that allows unmanned aircraft to adapt to various surrounding environments. The objective of this paper is to develop a system that recognizes the surrounding environment, which is one of the functions of the supervisor. By knowing the state of the environment in which the unmanned aircraft is flying through the environment recognition system, instructions to correct the sensor configuration and the control model can be issued to the navigation and control systems. In this paper, we first design deep learning models, such as a feed-forward neural network (FFNN) and a long short-term memory (LSTM) network, that use LIDAR and GNSS sensor data for environment recognition. The designed system is then verified in a real-time environment recognition experiment. The experiments confirmed that the surrounding environment can be recognized with an F1-measure of about 0.94 using LIDAR and GNSS sensor data. Moreover, the performance of the surrounding environment recognition system could be further improved depending on the structure of the learning model and the training data used for deep learning.
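As an illustration of the kind of recognition model described above, the following is a minimal sketch (not the authors' implementation) of an LSTM classifier operating on short time windows of LIDAR/GNSS-derived features; the feature dimension, window length, and class set (e.g. outdoor / indoor / transition) are assumptions introduced here for illustration only.

```python
import torch
import torch.nn as nn

class EnvRecognizer(nn.Module):
    """Illustrative LSTM classifier over windows of LIDAR/GNSS features."""
    def __init__(self, n_features=8, hidden=64, n_classes=3):
        super().__init__()
        # The LSTM consumes a short time window of sensor-derived features.
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        # Assumed class set, e.g. outdoor / indoor / transition.
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, n_features)
        _, (h, _) = self.lstm(x)          # final hidden state summarizes the window
        return self.head(h[-1])           # class logits

# Example: a batch of 4 windows, each 20 time steps of 8 features.
model = EnvRecognizer()
logits = model(torch.randn(4, 20, 8))
pred = logits.argmax(dim=1)               # predicted environment class per window
```

A feed-forward (FFNN) variant would simply replace the recurrent layer with fully connected layers applied to a flattened feature window; the recurrent model is shown here because the abstract mentions time-series sensor data.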