The BNS range channel sometimes shows a rapid decrease of ~1-2 Mpc (e.g. Fig.1).
In most cases the decreased BNS range lasts for 4-5 min.
I checked whether it comes from an actual noise excess or from the PSD estimation method,
and found that it seems to come from an actual noise excess in the DARM error signal (I haven't found coincident witnesses yet).
This noise excess also seems to affect the estimation of the time-dependent coefficients of the calibration.
-----
At first, I made some sensitivity ASDs around the bad BNS ranges. Because the BNS range channel lags the actual time by ~4 min, the range was re-estimated offline so that the ASD and the range value at the same time could be compared accurately. Fig.2 shows several 64-second ASDs whose time segments do not overlap each other. A decreased BNS range and a noise excess around 100 Hz (together with some resonant peaks) are visible in multiple ASDs, which means the noise excess lasts for at least 64 seconds.

So I looked at the DARM in-loop signals at various test points and found that K1:OMC-TRANS_DC_{A,B}_IN1_DQ (the whitened DARM error signals) show the noise excess most clearly, as shown in Fig.3. Since the exact time of the noise excess is now visible on the DCPD channels, I made three ASDs: 1) no noise excess at the T1 cursor, 2) small excess at the T2 cursor, and 3) large excess at the crosshair, to maximize the effect of this noise excess (Fig.4). The excess seems to come from the enhancement of some resonant peaks (17, 22, 40, 80? Hz), and when these peaks are enhanced too much they seem to also raise the noise floor.
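For reference, the kind of offline comparison described above could look like the sketch below (Python, assuming gwpy is available). The GPS times and the calibrated-strain channel name are placeholders I made up for illustration; only the DCPD channel name comes from this entry.

```python
# Sketch: recompute the BNS range offline from 64 s non-overlapping segments
# (avoiding the ~4 min delay of the online range channel) and make an ASD of
# the whitened DARM error signal over the same span.
from gwpy.timeseries import TimeSeries
from gwpy.astro import inspiral_range

STRAIN_CHANNEL = "K1:CAL-STRAIN"           # placeholder: substitute the calibrated strain channel
DCPD_CHANNEL = "K1:OMC-TRANS_DC_A_IN1_DQ"  # whitened DARM error (DCPD A)

start, end = 1234567890, 1234567890 + 640  # example GPS span covering the range drop

strain = TimeSeries.get(STRAIN_CHANNEL, start, end)
dcpd = TimeSeries.get(DCPD_CHANNEL, start, end)

# One range estimate per 64 s segment, no overlap, to localize the excess in time
for seg_start in range(start, end, 64):
    psd = strain.crop(seg_start, seg_start + 64).psd(fftlength=64, overlap=0)
    rng = inspiral_range(psd, snr=8, mass1=1.4, mass2=1.4, fmin=10)
    print(seg_start, rng)

# ASD of the whitened DARM error in one 64 s segment (e.g. the crosshair epoch of Fig.4)
excess_asd = dcpd.crop(start, start + 64).asd(fftlength=64, overlap=0)
```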
In addition (this is the main topic for me), the noise excess causes glitchy behavior in the estimation of the time-dependent coefficients. For example, the optical gain normally fluctuates by +/-10% (limited by statistical error because of the short integration time), whereas it fluctuates by ~30% around the noise excess in the DARM error signal, as shown in Fig.5. We may need to apply stricter gating or smoothing in the offline estimation of the time-dependent coefficients.
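A minimal sketch of the kind of gating plus smoothing that could be applied to the offline optical-gain time series is shown below; the deviation threshold, window length, and the function name gate_and_smooth are illustrative assumptions, not tuned values or the actual pipeline.

```python
import numpy as np

def gate_and_smooth(kappa, max_dev=0.2, window=9):
    """Gate optical-gain samples deviating more than `max_dev` (fractional)
    from the global median, fill the gaps by interpolation, then apply a
    running median of `window` samples (window assumed odd)."""
    kappa = np.asarray(kappa, dtype=float)
    med = np.median(kappa)
    gated = np.where(np.abs(kappa / med - 1) > max_dev, np.nan, kappa)
    # fill gated samples by interpolating between the surrounding good values
    good = ~np.isnan(gated)
    gated = np.interp(np.arange(len(gated)), np.flatnonzero(good), gated[good])
    # simple running median for smoothing
    half = window // 2
    padded = np.pad(gated, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(gated))])
```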