Ettinger discusses how to overcome aerodynamic obstacles when flying small reconnaissance planes (micro air vehicles, or MAVs) using a vision-guided flight stability and autonomy system. This is an interesting problem because the autonomous reconnaissance plane has no context for where it is in flight: it moves freely in 3D space (roll, pitch, and yaw). The authors approached the problem by first looking at how pilots directed the planes and the issues those pilots ran into. I found this approach interesting because manual control of a vehicle is typically very different from autonomous control, for the simple reason that humans can currently respond to changes faster and have a better sense of direction than autonomous vehicles. The solution is elegant in its relative simplicity: it is modeled after the behavior of birds in flight. As the article puts it, “Birds rely heavily on sharp eyes and vision to guide almost every aspect of their behavior.”

One of the main issues they came across with these planes is wind gusts, which can quickly knock the plane off its current flight path. To stabilize the planes, they developed an algorithm that essentially classifies sky pixels and ground pixels and extrapolates the horizon line from that separation. This solution works extremely well in most environments and solves the stability issue. The other issue arises when the plane has no view of either the sky or the ground. The solution here is a little more involved: they build a statistical model of the appearance of the sky and ground over the recent time history of the MAV’s flight, and use it to determine whether the plane is headed up or down. Based on the conditions, they can classify each frame as a valid horizon detection, all ground, all sky, or upside down.
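The horizon-extraction idea can be sketched roughly as follows. This is a simplified illustration, not the paper’s exact criterion: it scores each candidate horizon line (a bank angle plus a pitch fraction of the image height) by how uniformly colored the regions above and below it are, and keeps the line giving the cleanest sky/ground split. The function names and the exhaustive grid search are my own assumptions for the sketch.

```python
import numpy as np

def horizon_score(img, bank_deg, pitch_frac):
    """Score a candidate horizon line by total color variance on
    each side of it (lower = cleaner sky/ground separation).
    Simplified stand-in for the paper's covariance-based criterion."""
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Line through pitch_frac of the image height, tilted by bank_deg.
    slope = np.tan(np.radians(bank_deg))
    line_y = pitch_frac * h + slope * (xs - w / 2)
    above = ys < line_y                      # candidate "sky" mask
    sky, ground = img[above], img[~above]
    if len(sky) == 0 or len(ground) == 0:
        return np.inf                        # degenerate split
    return sky.var(axis=0).sum() + ground.var(axis=0).sum()

def detect_horizon(img, banks=np.linspace(-45, 45, 31),
                   pitches=np.linspace(0.1, 0.9, 17)):
    """Exhaustively search (bank, pitch) for the best-scoring line."""
    scores = [(horizon_score(img, b, p), b, p)
              for b in banks for p in pitches]
    _, bank, pitch = min(scores)
    return bank, pitch

# Synthetic frame: blue sky over green ground, flat horizon mid-frame.
img = np.zeros((40, 60, 3))
img[:20] = [0.2, 0.4, 0.9]
img[20:] = [0.1, 0.5, 0.2]
bank, pitch = detect_horizon(img)
```

On this toy frame the search recovers a level horizon (bank near 0°) at half the image height. A real implementation would refine this coarse grid and work in a color space chosen to maximize sky/ground contrast, but the sketch shows why the method is cheap enough to run onboard a small MAV.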

The only conditions in which I can see this algorithm failing would be whiteout, overcast, and nighttime reconnaissance missions. The reason is that the algorithm works on the color distinction between the sky and the ground. In reading the article, the main assumption the algorithm makes is that the horizon will appear as a straight line in the image. Under the conditions I mentioned, especially at night and in whiteout, there is no visible distinction at the horizon itself. One solution may be to keep a clear daytime model saved on the server; then, with statistical inference, the plane could estimate where it was in its flight pattern. This still does not solve the stability problem under these conditions. Another possible solution would be to include ground reference points in the models, because even at night, with night-vision imaging, such landmarks would stand out on the landscape. These could be used as an in-flight calibration to estimate where the horizon is located.
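Before falling back to any such model, the system would first need to notice that the current frame has no usable horizon. One illustrative way to do that (my own sketch, not a test from the paper) is a Fisher-style separability score: the distance between the mean sky and ground colors divided by their combined spread. A clear day gives a large ratio; a whiteout or night frame, where both regions look alike, gives a ratio near zero.

```python
import numpy as np

def horizon_confidence(sky, ground):
    """Separability of two (N, 3) RGB pixel sets: gap between mean
    colors relative to within-class spread. Low values flag frames
    (whiteout, overcast, night) with no reliable horizon."""
    mean_gap = np.linalg.norm(sky.mean(axis=0) - ground.mean(axis=0))
    spread = np.sqrt(sky.var(axis=0).sum() + ground.var(axis=0).sum())
    return mean_gap / (spread + 1e-9)

rng = np.random.default_rng(0)
# Clear day: distinctly blue sky vs. green ground, slight noise.
clear = horizon_confidence(
    rng.normal([0.3, 0.5, 0.9], 0.02, (500, 3)),
    rng.normal([0.2, 0.5, 0.2], 0.02, (500, 3)))
# Whiteout: both regions are near-uniform gray-white.
fog = horizon_confidence(
    rng.normal([0.85, 0.85, 0.85], 0.02, (500, 3)),
    rng.normal([0.85, 0.85, 0.85], 0.02, (500, 3)))
```

A threshold on this score could decide when to trust the detected horizon and when to switch to the statistical time-history model (or to landmark-based calibration) instead.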

Reference:

  1. Ettinger, Scott M. “Vision-Guided Flight Stability and Control for Micro Air Vehicles.” <http://www.mil.ufl.edu/publications/fcrar02/flight_stability_frcar.pdf>.