Attitude Estimation of Space Objects Using Imaging Observations and Deep Learning
Abstract
In recent years, as the number of space debris objects in orbit has increased, so has the risk of collisions between satellites and debris. To ensure that space development can continue in the future, Active Debris Removal (ADR), which deorbits large debris into the atmosphere, is considered necessary. Estimating the shape, surface properties, and attitude motion of a space object is useful both in preparing for ADR and in monitoring the health of satellites. Optical measurements such as photometric observations and imaging observations have been widely applied to these estimation problems. Photometric methods use light curves, the time history of a space object's brightness, and were originally developed to estimate the attitude motion of asteroids. Imaging methods use images captured with adaptive optics. Kyushu University has conducted studies on the state estimation of space objects in GEO from photometric observations using the Unscented Kalman Filter (UKF) and Multiple-Model Adaptive Estimation (MMAE). However, estimation by this method requires the initial attitude and initial angular velocity of the space object, and determining these is itself a problem. A method of determining them from imaging observations was therefore proposed, and its feasibility was examined by simulation.
The previous study assumed that the shape and surface properties of the space object were known. A set of captured images corresponding to various attitude angles was generated from a three-dimensional model created in 3DCG software, on the assumption that the attitude of a space object can be estimated by finding the image in the set most similar to the target image. However, this similarity-comparison technique does not work if the target object is not perfectly centered in the image. Moreover, improving the accuracy requires enlarging the image set, so the computational cost of a single estimation also grows. The purpose of this paper is to overcome these problems by applying a convolutional neural network, and to clarify the system requirements for imaging observation of GEO space objects. To simulate the effects of atmospheric fluctuations and the optical system on the images, a Point Spread Function (PSF) was generated using SOAPY. Sensor noise was reproduced by adding Gaussian noise to the images. Because the object is not always at the center of an actual captured image, the images were randomly cropped to bring them closer to real observations. In addition, to evaluate the attitude error correctly and to improve learning efficiency, the attitude was expressed by a quaternion instead of the Euler angles used in the previous study. As the deep learning model, a convolutional neural network, widely used for image recognition, was adopted. The network has two convolution-pooling layers and three fully connected layers, taking a 64-by-64-pixel image as input and outputting a quaternion representing the estimated attitude. First, a training data set is given to the network, and its parameters are adjusted to minimize the error between the estimated quaternion and the true quaternion.
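The network described above (two convolution-pooling stages, three fully connected layers, 64x64 input, quaternion output) can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the channel counts, kernel sizes, and hidden-layer widths are assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

class AttitudeNet(nn.Module):
    """Illustrative CNN: two convolution-pooling stages followed by three
    fully connected layers, mapping a 64x64 grayscale image to a unit
    quaternion.  Layer sizes are assumed, not taken from the paper."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, 4),                     # quaternion components
        )

    def forward(self, x):
        q = self.head(self.features(x))
        # Normalize so the output is a valid unit (attitude) quaternion.
        return q / q.norm(dim=1, keepdim=True)

net = AttitudeNet()
out = net(torch.zeros(1, 1, 64, 64))
print(out.shape)  # torch.Size([1, 4])
```

Training would then minimize a loss between the predicted and true quaternions, as the abstract describes, taking care that q and -q represent the same attitude.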
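Two of the ingredients above, the noisy off-center image preparation and the quaternion-based error evaluation, can be illustrated with a short NumPy sketch. The array sizes, noise level, and function names are assumptions for illustration only; the paper's actual parameters are not given in the abstract.

```python
import numpy as np

def make_observation(image, out_size=64, noise_sigma=0.02, rng=None):
    """Simulate a sensor frame: add Gaussian noise, then crop a random
    out_size x out_size window so the object is not perfectly centered.
    Values are illustrative, not the paper's actual noise model."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = image + rng.normal(0.0, noise_sigma, size=image.shape)
    h, w = noisy.shape
    top = int(rng.integers(0, h - out_size + 1))
    left = int(rng.integers(0, w - out_size + 1))
    return noisy[top:top + out_size, left:left + out_size]

def attitude_angle_error(q_est, q_true):
    """Rotation angle (rad) between two unit quaternions.  The absolute
    value of the dot product handles the q / -q ambiguity (both describe
    the same attitude), which is one reason a quaternion gives a
    well-defined attitude error where Euler angles can be ambiguous."""
    q_est = q_est / np.linalg.norm(q_est)
    q_true = q_true / np.linalg.norm(q_true)
    dot = np.clip(abs(np.dot(q_est, q_true)), -1.0, 1.0)
    return 2.0 * np.arccos(dot)

# A quaternion and its negation describe the same attitude:
q = np.array([1.0, 0.0, 0.0, 0.0])
print(attitude_angle_error(q, -q))  # 0.0
```

For example, a rotation of 0.2 rad about the x-axis corresponds to the quaternion [cos 0.1, sin 0.1, 0, 0], and `attitude_angle_error` recovers 0.2 rad against the identity quaternion.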
After that, using a test data set, it was investigated whether the network can accurately estimate the attitude from images different from those used for training. Convergence of the estimation error on the training data was confirmed, and the estimation accuracy on the test data was evaluated. To clarify the system requirements for imaging observation, the attitude angle estimation error was investigated for observation devices with different resolutions. In addition, the accuracy was compared with that of the conventional image-comparison method, in cases with and without sensor noise.
- Publication:
- Advanced Maui Optical and Space Surveillance Technologies Conference
- Pub Date:
- September 2019
- Bibcode:
- 2019amos.confE..21A
- Keywords:
- Imaging Observations;
- Adaptive Optics;
- CNN