I think I get it now: CRCs come from interleaving, and LEFTRS come from G.INP.
Noise Protection is a layered affair, with multiple ways to try to fix, or otherwise cope with, errors in the underlying bitstream.
The first layer comes from FEC. Interleaving just makes FEC more effective, by spreading a burst of errors across several codewords so that each codeword only has to correct a few of them.
Bitstream errors that can't be fixed by FEC are then dealt with by retransmission instead (or, colloquially, G.INP). If a retransmission also fails, the block can be retransmitted again, possibly more than once.
If retransmission of a block fails too many times, the modem gives up trying to send it, and the missing block shows up as a CRC error.
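The layering above can be sketched as a toy model. Everything here is illustrative: the function name, the probability parameter, and the retransmission limit are all made up (the real limit depends on the line's G.INP configuration), but the decision flow matches the description.

```python
import random

MAX_RETX = 3  # hypothetical retransmission limit; the real value is line-dependent

def deliver(block_ok_prob):
    """Toy model of one block passing through the noise-protection layers."""
    # Layer 1: FEC (made more effective by interleaving) fixes most errors.
    if random.random() < block_ok_prob:
        return "ok (FEC or clean)"
    # Layer 2: G.INP retransmission, possibly more than once.
    for attempt in range(MAX_RETX):
        if random.random() < block_ok_prob:
            return "ok (retransmission %d)" % (attempt + 1)
    # All retries exhausted: the missing block becomes a CRC error.
    return "crc error"
```

With `block_ok_prob` near 1.0 almost everything is handled silently by the first two layers; only sustained bad luck reaches the CRC stage, which is why CRC errors point at prolonged noise rather than isolated hits.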
There are counters that tell you how the various layers are performing in raw numbers: RS, RScorr, RSuncorr, rtx-tx, rtx-corr, rtx-uncorr, LEFTR, and CRC all show this kind of information.
There are also counters that show how often errors are happening, which are used to calculate the "mean time between errors" figures: FECS, LEFTRS, ES and SES.
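A rough sketch of how those per-second counters and the "mean time between errors" figure relate, assuming the conventional definitions (an errored second has at least one CRC anomaly; a severely errored second crosses a higher threshold, commonly 18 for ADSL, though treat that exact value as an assumption here):

```python
def classify_second(crc_count, ses_threshold=18):
    """Classify one second of line time as (errored, severely errored)."""
    es = crc_count >= 1            # ES: at least one CRC anomaly in the second
    ses = crc_count >= ses_threshold  # SES: heavy error burst in the second
    return es, ses

def mean_time_between(uptime_seconds, event_seconds):
    """Mean time between errors: uptime divided by the number of error seconds."""
    if event_seconds == 0:
        return float("inf")
    return uptime_seconds / event_seconds
```

For example, a line that has been up for an hour with four errored seconds gives `mean_time_between(3600, 4)`, i.e. one errored second every 900 seconds on average.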
- CRCs come from bitstream errors that were too severe for FEC to fix, and that persisted through every one of the N retransmission attempts, until G.INP gave up.
- LEFTRS come from second-long periods when retransmission was being used *so much* that ordinary throughput was reduced.
Both are a sign of prolonged bursts of noise, but they measure the impact on the line in different ways.
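The two measures can be tallied side by side from per-second samples. This is only a sketch: the samples and the LEFTR threshold are invented here (the real LEFTR defect is defined relative to the line's expected error-free throughput), but it shows how one counter accumulates lost blocks while the other counts degraded seconds.

```python
def tally(samples, leftr_threshold=0.99):
    """Tally the two burst indicators from per-second samples.

    Each sample is (crc_errors, error_free_throughput_fraction) for one second.
    The 0.99 threshold is an assumption for illustration only.
    """
    crc_total = sum(crc for crc, _ in samples)               # impact: blocks lost
    leftr_seconds = sum(1 for _, eftr in samples             # impact: seconds of
                        if eftr < leftr_threshold)           # reduced throughput
    return crc_total, leftr_seconds
```

A second can contribute to LEFTRS without any CRC errors at all (retransmission succeeded, but ate throughput), and a brief severe hit can add CRCs without lasting long enough to dent a whole second's throughput, which is the difference the two bullets above are drawing.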