Optical Navigation 101
Today's digital cameras are more compact, lighter, and offer excellent resolution, making them a great choice for fully and semi-autonomous navigation systems, especially in space.
Starting with the basics: hold your index finger vertically in front of your face, over your nose, and close one eye. Now swap, opening the closed eye and closing the one that was open. Notice the lateral movement of your finger when seen from the two eyes. That is "binocular vision", and by measuring this shift the distance of the "object" from the vantage point can be estimated to reasonable accuracy.
"Disparity" is the shift in an object's apparent position when viewed from different "eyes". It allows computer vision systems to estimate an object's distance from the "baseline", the line joining the two cameras, which are the equivalent of the two eyes in the experiment above. The distance between the cameras plays an important role in how accurately range can be estimated: try moving your finger further away from your face, and beyond a certain point the shift is no longer large enough to measure.
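For a rectified stereo pair, the relationship above reduces to a simple formula: depth Z = f * B / d, where f is the focal length in pixels, B the baseline, and d the disparity. The sketch below uses made-up camera numbers (the focal length and baseline are illustrative assumptions, not values from any real system) to show why range accuracy degrades as the target moves further away: the depth error per pixel of disparity error grows with the square of the distance.

```python
# Minimal sketch of depth-from-disparity for a rectified pinhole stereo pair.
# All camera parameters below are illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

f = 1000.0  # focal length in pixels (assumed)
B = 0.5     # baseline in metres (assumed)

for d in (50.0, 5.0, 0.5):
    z = depth_from_disparity(f, B, d)
    # A one-pixel disparity error maps to roughly Z^2 / (f * B) metres of depth error,
    # so distant targets (small disparity) are ranged far less accurately.
    err = z * z / (f * B)
    print(f"disparity {d:5.1f} px -> depth {z:7.1f} m (~{err:.2f} m error per px)")
```

Running this, a 50 px disparity places the target at 10 m with centimetre-level sensitivity, while a 0.5 px disparity places it at 1 km where a single pixel of error shifts the estimate by kilometres. This is the same effect as your finger: a longer baseline buys you usable disparity at longer range.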
Managing the entire EDL (Entry, Descent, Landing) phase of a mission using optical navigation techniques is the holy grail for spacecraft designers. Despite the obvious advantages of OpNav, it is challenging to range the target location accurately, to do so quickly enough to feed back into the propulsion system, and to do all of that with modest onboard computational capability.
More on possible workarounds in our next post!