Lessons Learned: Garbage In, Garbage Out

Isn’t it funny how you can talk about a certain principle or idea and then shortly thereafter you get bitten by that very same principle?

There has been much discussion lately about automation in flying and how it may be producing pilots who are less capable with their actual “stick and rudder” skills.  Tools like the autopilot have done wonders to reduce pilot workload, and in some cases have made certain missions possible within duty-day and other similar restrictions.  These systems can be great assets to those who use them, but they have their limitations.

In most cases these systems require input from a person at some point in the process.  That may be in the form of entering waypoints, changing elevations, or supplying other data needed to accomplish the mission.  In some cases there are even inputs from the plane itself that can affect mission performance, which is what happened to me this week.

On this particular sortie I noticed we were having issues with the GPS.  In short, it was randomly going in and out throughout the flight.  It’s really not the end of the world, because that is why I am there as the navigator.  It’s also called a visual low level for a reason.  The route itself presented nothing unusual, but as I am sure you will come to realize as I continue to write these posts, the airdrop is where things became a little more of an issue.

Me at the only desk I enjoy working behind.

Before I explain what happened, I must first admit that it never should have been an issue.  I was being a little complacent that day, and that is what led to a poor drop score on my part.  Other members of the crew could have “saved” me, but I was the one who didn’t perform, so I have to settle for the crappy score I got.

As we were going in for the run-in, the GPS was completely gone, so the computer was using our INS to determine where it thought we were.  I will spare you the boring description of how all of that works, and honestly I don’t even understand all of it, as I am no engineer.  The short version is that an INS drifts over time.  There are a lot of variables involved, as well as the occasional gremlin that randomly makes it drift a lot farther than normal.  I had also noticed that the winds had come from literally every direction during the flight, which could just be swirling winds or a problem with the computer in the plane that generates those numbers.
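If you’re curious why that drift adds up the way it does, here is a rough back-of-the-envelope sketch in Python.  It is not how the jet’s INS actually works (the real system and its error sources are far more complicated), and every number in it is made up for illustration; it just shows how a small, uncorrected heading error turns into a big cross-track error once there is no GPS to rein it in.

```python
import math

# Toy dead-reckoning sketch: NOT an actual INS mechanization, just an
# illustration of how a small, constant error grows into a large position
# error over time. Speeds and error values are made-up example numbers.

def drift_after(minutes, speed_kts=210.0, heading_error_deg=0.25):
    """Approximate cross-track error (in yards) after flying `minutes`
    with a small, uncorrected heading error and no external position fix."""
    ground_speed_yps = speed_kts * 2025.37 / 3600.0   # knots -> yards per second
    distance_yds = ground_speed_yps * minutes * 60.0
    # Cross-track error ~ distance flown * sin(heading error)
    return distance_yds * math.sin(math.radians(heading_error_deg))

for t in (5, 15, 30, 60):
    print(f"{t:>3} min without an update: ~{drift_after(t):,.0f} yd off track")
```

Even with that made-up quarter-degree error, the sketch puts you several hundred yards off track after half an hour, which is why losing the GPS updates on the run-in matters.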

As we came across the dropzone, the pilot was flying right down the black line, according to the computer.  Apparently he was being as complacent as I was, because we both followed that black line down the opposite side of the dropzone from the one we had briefed and that the numbers supported.  The drop went out right on time (about the only thing I did right when it came to the drops) and we awaited our score.  The dropzone called back that they were measuring, which is rarely a good sign, since a good drop lands close to the middle and is quickly measured.

Sure enough my drop was 250 yards off.

Once we were back at ground speed zero and I replayed the drop in my mind, the whole thing made complete sense.  I had briefed that we would drop on the left side of the dropzone, but the plane tracked across the right side, which was easy to see because the desired point of impact was visible out the left window of the cockpit.

Once again there are two lessons to be learned from this.  The first is not to get complacent and rely on a computer to do your job.  Computers can provide valuable insight and guidance, but it is your responsibility to use them as a backup and a support rather than as a crutch for being lazy.

The reason this is important is the second lesson: garbage in results in garbage out.  A computer is only as good as the data it is given.  Whether that data is entered by a human or derived from the aircraft’s own sensors, if it is inaccurate, you will get inaccurate results.  In my case it was an INS that thought we were half a mile away from where we actually were, but it could just as easily be a wrong frequency entered for a navaid or an incorrect latitude and longitude.
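To put the principle in the plainest terms I can, here is one more toy sketch, again in Python and again nothing like the actual airdrop computer: if the position you feed the solution is wrong, the answer is wrong by exactly that much, no matter how perfect the math in between is.  The half-mile figure is borrowed from my own INS error above; everything else is invented for the example.

```python
# Toy "garbage in, garbage out" demo -- NOT the aircraft's ballistics
# computer, just the principle that a perfect calculation on a bad
# position produces a bad answer. Coordinates are flat (east, north) yards.

TARGET = (0.0, 0.0)                 # desired point of impact
ACTUAL_POSITION = (0.0, -3000.0)    # where the aircraft really is on the run-in
INS_POSITION = (880.0, -3000.0)     # where the INS thinks it is (~half a mile off)

def steering_correction(believed_position, target):
    """East/north correction the computer commands, based only on where it
    believes the aircraft is. Bad position in, bad steering out."""
    return (target[0] - believed_position[0], target[1] - believed_position[1])

# The computer dutifully nulls out the error it *thinks* it has...
east_cmd, _ = steering_correction(INS_POSITION, TARGET)

# ...so the real aircraft, which started with no error at all, gets steered
# off by the size of the INS error itself.
impact_east = ACTUAL_POSITION[0] + east_cmd
print(f"Commanded correction (east): {east_cmd:+.0f} yd")
print(f"Actual miss distance:        {abs(impact_east - TARGET[0]):.0f} yd")
```

The math inside the box never had a chance; the error was baked into what it was told about the world.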

Regardless of where the bad data comes from, there is really no replacement for the good old Mk1 eyeball and the brain behind it to ensure that you are taking your aircraft where it needs to be.

Automation and technology are valuable resources and we would be stupid not to use them, but we must never forget how to use our brains and our other resources so that we fly as precisely and safely as possible.  In this case my complacency got me a bad drop score, but there are countless examples of complacency being a killer.

Where have you seen technology be a crutch that actually did more harm than good?