The feasibility of angles-only navigation (AON)-based close-in proximity operations is studied for application to upcoming small-satellite uncooperative rendezvous missions. For the relative motion described by the Yamanaka-Ankersen equations, the Square-Root Unscented Kalman Filter (SRUKF) is adapted for relative position estimation, considering low-volume/mass, low-power, simple optical/infrared instruments together with Lidar measurements. For its simplicity and engineering heritage, the multi-pulse glideslope guidance law is adopted. Using a linear covariance technique, a complete set of analytical functions is derived for closed-loop true-dispersion and estimated-dispersion analysis. Monte Carlo simulation shows that the proposed offset observation model provides a good solution to the range-observability dilemma: the range estimation error decreases from an initial decameter level to a final decimeter level. Two proximity operation mission trajectories are designed: a direct v-bar quasi-linear glide approach for robotic-arm or net capture, and a circumnavigate-glide approach for capture with an attitude requirement. With a well-designed relative approach guidance profile, the camera operational range can be extended, so that the Lidar operation time can be shortened, or the Lidar even replaced by an optical/infrared camera, saving mass and power on the chaser. The proposed variable-structure SRUKF yields a more robust trajectory than the standard SRUKF: the true dispersion is improved by two orders of magnitude at the 100 m range, and the covariance prediction is more accurate (i.e., all sampled trajectories lie inside the predicted ellipse). The proposed analysis method, which yields analytical closed-loop linear covariances, is applicable to onboard maneuver planning and real-time closed-loop control error estimation.
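The multi-pulse glideslope idea above can be sketched in a few lines: at each pulse, solve the relative-motion state transition matrix for the velocity that carries the chaser to the next waypoint along the line of sight. The sketch below is a minimal illustration that substitutes the circular-orbit Clohessy-Wiltshire transition matrix for the paper's Yamanaka-Ankersen (elliptical-orbit) formulation; the geometric range-to-go ratio, function names, and all numerical values are illustrative assumptions, not values from the paper.

```python
import numpy as np

def cw_stm(n, t):
    """Clohessy-Wiltshire state transition matrix (6x6) for mean motion n
    and coast time t. Axes: x radial, y along-track (v-bar), z cross-track."""
    s, c = np.sin(n * t), np.cos(n * t)
    Prr = np.array([[4 - 3*c,        0, 0],
                    [6*(s - n*t),    1, 0],
                    [0,              0, c]])
    Prv = np.array([[s/n,            2*(1 - c)/n,       0],
                    [2*(c - 1)/n,    (4*s - 3*n*t)/n,   0],
                    [0,              0,                 s/n]])
    Pvr = np.array([[3*n*s,          0, 0],
                    [6*n*(c - 1),    0, 0],
                    [0,              0, -n*s]])
    Pvv = np.array([[c,              2*s,       0],
                    [-2*s,           4*c - 3,   0],
                    [0,              0,         c]])
    return np.block([[Prr, Prv], [Pvr, Pvv]])

def glideslope_pulses(r0, v0, r_target, n, n_pulses, dt):
    """Multi-pulse glideslope: waypoints along the line of sight with a
    geometrically shrinking range-to-go (a stand-in for the exponential
    glideslope profile). Returns the pulse delta-v list and final state."""
    u = (r_target - r0) / np.linalg.norm(r_target - r0)  # approach direction
    rho0 = np.linalg.norm(r_target - r0)
    rhos = rho0 * 0.5 ** np.arange(1, n_pulses + 1)      # assumed ratio 0.5
    state = np.hstack([r0, v0])
    dvs = []
    for rho in rhos:
        wp = r_target - rho * u                          # next waypoint
        Phi = cw_stm(n, dt)
        Prr, Prv = Phi[:3, :3], Phi[:3, 3:]
        # velocity needed so that the coast ends exactly at the waypoint
        v_req = np.linalg.solve(Prv, wp - Prr @ state[:3])
        dvs.append(v_req - state[3:])                    # impulsive pulse
        state = Phi @ np.hstack([state[:3], v_req])      # coast to waypoint
    return dvs, state
```

In a closed-loop setting each `v_req` would be computed from the SRUKF state estimate rather than the true state, which is what couples the navigation dispersions into the guidance dispersions analyzed by the linear covariance technique.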
Three-dimensional (3D) image sensors have many applications, including enabling autonomous vehicles to avoid obstacles and providing guidance, navigation, and control for spacecraft immediately before landing on a celestial body. Flash LIDAR is a system that can acquire a 3D image by emitting a diffuse pulsed laser beam, and hence is suitable for both obstacle detection and terrain measurement. In the 3D image sensors used for Flash LIDAR, a photosensor array and a time-measurement integrated circuit are vertically bonded. Here, we report the results of a detailed evaluation of the principles, functions, sensitivity, and time-measurement accuracy of a prototype 1-k pixel (32 × 32 pixels) 3D image sensor based on a multi-pixel photon-counting avalanche photodiode. By counting photons, the sensor achieves both high sensitivity and the ability to measure light intensity.
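The photon-counting ranging principle described above can be illustrated per pixel: photon arrival times from many laser pulses are histogrammed into TDC bins, the peak bin gives the round-trip time (and hence range), and the photon count in that bin carries the intensity information. This is a generic sketch of the principle, not the prototype's actual processing chain; the function name, bin width, and numbers are assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_distance(arrival_bins, bin_width_s):
    """Estimate range for one pixel from per-pulse TDC time-bin indices.
    The peak of the photon-arrival histogram gives the round-trip time;
    the peak count itself serves as an intensity measurement."""
    counts = np.bincount(arrival_bins)
    peak_bin = int(np.argmax(counts))
    t_round_trip = (peak_bin + 0.5) * bin_width_s  # bin-center time
    return 0.5 * C * t_round_trip, counts[peak_bin]
```

With a 1 ns bin width, a target at 150 m produces a histogram peak near bin 1000 (round trip of about 1 microsecond), and stray background photons land in scattered bins that never outvote the peak.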
This study clarifies the flow field and flame structure of a cavity flameholder with pylons in a Mach 2.8 airflow. A fuel-rich burned hydrogen/air gas mixture was injected in the supersonic combustion experiments because self-ignition of the fuel is difficult in a low-enthalpy mainstream. Experimental data were collected using the shadowgraph method, direct photography of the flame, wall pressure measurement, and OH Planar Laser-Induced Fluorescence (OH-PLIF) measurement. In addition, three-dimensional numerical simulation was conducted. When one pylon was installed upstream of the jet flow, the penetration height of the jet increased, and a flame was formed in the mainstream center. When two pylons were installed at the front edge of the cavity, the flow field inside the cavity differed depending on the distance between the pylon and the jet flow: ignition and combustion of the burned gas were suppressed when this distance was small, and enhanced when it was large. When hydrogen was injected as the main fuel from upstream of the cavity, ignition of the main fuel succeeded only for a large distance between the pylon and the jet flow.