I just did a Google search on this, and I would invite you to do the same. The definition I learned is that the coherence length accounts for the loss of coherence due to a finite bandwidth (a finite energy/frequency spread). Basically, it characterizes the length over which the wave becomes sufficiently different from the original wave that coherence is lost. You can start from the Heisenberg uncertainty principle, $\Delta E \, \Delta t \sim \hbar$, and obtain the coherence time from it: given the bandwidth $\Delta \nu = \Delta E / h$, the coherence time is $\Delta t \sim \hbar / \Delta E \sim 1 / \Delta \nu$ (up to factors of $2\pi$). Then, multiplying the coherence time $\Delta t$ by the speed of light gives the coherence length.
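As an order-of-magnitude sketch of that estimate, here is a small Python snippet that computes $\Delta t \sim 1/\Delta\nu$ and then the coherence length $\ell = c\,\Delta t$. The 1.5 GHz bandwidth used below is just a hypothetical example value, not from the question:

```python
# Order-of-magnitude estimate: coherence time from bandwidth,
# then coherence length from the speed of light.
c = 299_792_458.0  # speed of light in vacuum, m/s

def coherence_length(delta_nu_hz: float) -> float:
    """Estimate coherence length (m) from spectral bandwidth (Hz)."""
    coherence_time = 1.0 / delta_nu_hz  # s, ignoring factors of 2*pi
    return c * coherence_time

# Hypothetical source with ~1.5 GHz bandwidth:
print(coherence_length(1.5e9))  # ~0.2 m
```

Note that this drops numerical prefactors (the exact factor depends on the spectral line shape), so treat the result as an order of magnitude only.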