A key task for the 2018 challenge is detecting the landing site without any human intervention. We are given its approximate location to within ±100 m and, for full points, have to locate an A2-sized target to better than 10 m accuracy, entirely autonomously.
This means using computer vision, which, while advancing rapidly, is still in its infancy. The web comic XKCD put it well: making computers understand pictures is hard!
Our approach is to use the Raspberry Pi camera mounted in team member Steve's trusty Optera aircraft to look for a red and blue target.
The software on board the Optera sequentially takes photos and processes each one using a Python routine developed by team member Mike that attempts to locate the target in the image. Our target consists of a red square next to a blue square. The image processing looks for all red and all blue blobs in the image that are roughly the right shape and size, then looks for a red blob of the right size next to a blue blob of the right size. The method works very well, giving a 100% detection rate and 0% false positive rate on our sample of approximately 300 images. Once it detects a target, it sends a thumbnail to the ground station via MAVLink so that we can see whether it was successful.
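Mike's actual routine isn't shown here, but the idea can be sketched in plain Python/NumPy: threshold the image for red and blue, find connected blobs, then accept a red blob only if a similar-sized blue blob sits right next to it. The colour thresholds, minimum area, and pairing distance below are illustrative values, not the ones used on the aircraft.

```python
import numpy as np

def find_blobs(mask, min_area=4):
    """Connected-component search on a boolean mask.
    Returns a list of ((centroid_y, centroid_x), area) tuples."""
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    blobs = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not visited[sy, sx]:
                # Flood-fill (4-connected) to collect one blob
                stack = [(sy, sx)]
                visited[sy, sx] = True
                pixels = []
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:
                    ys, xs = zip(*pixels)
                    blobs.append(((sum(ys) / len(ys), sum(xs) / len(xs)),
                                  len(pixels)))
    return blobs

def detect_target(rgb, max_pair_dist=20.0):
    """Return the (y, x) midpoint between a red blob and an adjacent,
    similar-sized blue blob, or None if no such pair exists.
    Thresholds here are illustrative, not the flight values."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    red_mask = (r > 150) & (g < 100) & (b < 100)
    blue_mask = (b > 150) & (r < 100) & (g < 100)
    for (rc, ra) in find_blobs(red_mask):
        for (bc, ba) in find_blobs(blue_mask):
            dist = ((rc[0] - bc[0]) ** 2 + (rc[1] - bc[1]) ** 2) ** 0.5
            # Accept only close pairs with comparable areas
            if dist < max_pair_dist and 0.5 < ra / ba < 2.0:
                return ((rc[0] + bc[0]) / 2, (rc[1] + bc[1]) / 2)
    return None
```

The pairing step is what keeps the false positive rate down: lone red or blue blobs (clothing, cars, tarpaulins) are common in the field, but a correctly sized red square right beside a correctly sized blue one is very unlikely to occur by accident.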
Once the software has located the target's position in the image, it uses the GPS co-ordinates, altitude, and pitch, roll and yaw angles of the aircraft from when the photo was taken to geo-reference the image and calculate the latitude and longitude of the target.
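To give a feel for the geo-referencing step, here is a deliberately simplified sketch for a nadir-pointing camera in level flight: it converts the pixel offset from the image centre into metres on the ground using altitude and field of view, rotates that offset by the aircraft's yaw, and converts metres to degrees. This is not the team's actual routine, which would also account for the pitch and roll angles mentioned above; the field-of-view figure is just an example.

```python
import math

def georeference(px, py, img_w, img_h, hfov_deg, lat, lon, alt_agl, yaw_deg):
    """Simplified pixel -> (lat, lon) for a nadir camera in level flight.
    Assumes pitch and roll are zero; a real routine rotates the camera
    ray by the full attitude before intersecting the ground plane."""
    # Ground width covered by the image, from altitude and horizontal FOV
    half_width_m = alt_agl * math.tan(math.radians(hfov_deg / 2))
    m_per_px = 2 * half_width_m / img_w
    # Pixel offset from image centre, in metres (x right, y forward)
    dx = (px - img_w / 2) * m_per_px
    dy = (img_h / 2 - py) * m_per_px
    # Rotate the body-frame offset into north/east by yaw (0 = north)
    yaw = math.radians(yaw_deg)
    north = dy * math.cos(yaw) - dx * math.sin(yaw)
    east = dy * math.sin(yaw) + dx * math.cos(yaw)
    # Metres to degrees, spherical-earth approximation
    dlat = north / 111320.0
    dlon = east / (111320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

Even in this toy form you can see why attitude calibration matters: at 50 m altitude, a 1 degree error in a mounting angle moves the computed target position by nearly a metre on the ground.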
For our first full weekend of testing with the latest software, we were excited to see only a 15 m spread between the extremities of all the target locations (the aircraft made many passes). This implies roughly ±7 m accuracy before we have even started detailed calibration of the camera against the aircraft frame, which is a great starting point.
Naturally the imaging system will need fine tuning to narrow the accuracy down further, but this is a great start for us.