Abstract
For a more precise evaluation of the current best(?) sensitivity and noise budgets such as klog#33015, I tried to re-calibrate DARM for the data taken on 3rd Mar.
The re-calibration was based on the assumption of a 2 dB underestimation of the optical gain. (To be precise, the OLTF measurement on 5th Mar. was off by 2 dB from the last calibration on 19th Feb., and it has not yet been confirmed whether the situation on 3rd Mar. was equivalent to that on 5th Mar.)
This was a simulation of the front-end calibration, not of the low-latency (LL) or offline calibration. (In other words, it is valid only below 1 kHz; a bias of a few tens of percent remains in the kHz band, coming from super-Nyquist effects of the analogue electronics.)
Detailed plots are available on the DAC Wiki.
Details
I tried to re-calibrate DARM by using the "pre-calibrated" error and feedback signals. "Pre-calibrated" means that the inverse-sensing function (1/C) and the actuation function (A) have already been applied to the error signal and the feedback signal, respectively, in the front-end model (these are the signals used in the low-latency calibration pipeline). The re-calibration can therefore be done by applying the missing factor of -2 dB to the "pre-calibrated" error signal and by aligning the relative timing of the error and feedback signals at the injection point of GWs: h = [(1/C)*V_err]*(-2 dB) + [A*V_ctrl]*exp(-iwT).
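As a rough illustration, here is a minimal Python sketch of this reconstruction, assuming the "pre-calibrated" error and feedback signals are available as NumPy arrays sampled at 16384 Hz (the array names, function name, and I/O are hypothetical; only the -2 dB factor and the time-shift structure come from the text above):

```python
import numpy as np

FS = 16384           # front-end model sample rate [Hz] (assumed)
GAIN_DB = -2.0       # missing optical-gain factor applied to (1/C)*V_err
DELAY_SAMPLES = 2    # relative delay T, derived in the next paragraph

def recalibrate_darm(err_precal, ctrl_precal):
    """h = [(1/C)*V_err]*(-2 dB) + [A*V_ctrl]*exp(-i*w*T) (sketch)."""
    n = len(err_precal)
    # -2 dB in amplitude is a factor of 10**(-2/20) ~ 0.794
    err_corrected = err_precal * 10.0 ** (GAIN_DB / 20.0)
    # apply the relative time shift exp(-i*2*pi*f*T) to the feedback
    # path in the frequency domain
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    T = DELAY_SAMPLES / FS
    ctrl_fd = np.fft.rfft(ctrl_precal) * np.exp(-2j * np.pi * freqs * T)
    return err_corrected + np.fft.irfft(ctrl_fd, n=n)
```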
The relative time difference between the "pre-calibrated" error signal and the "pre-calibrated" feedback signal consists of several contributions: digital delays such as Dolphin, the light travel time in the 3 km arm, delays in the analogue circuits, and so on. In addition, the phase delay coming from super-Nyquist effects is treated as an approximate time delay in the front-end calibration; it is currently set to 8 samples at 16 kHz (i.e. the "pre-calibrated" error signal lags the "pre-calibrated" feedback signal by 8 samples). That would be all we had to consider, but in this analysis we also need to take into account the phase delays of the decimation filters used to down-sample to 4 kHz in the DAQ process and of the interpolation filters used to re-up-sample to 16 kHz (from the viewpoint of the latency in serving low-latency h(t), the DQ-ed "pre-calibrated" feedback signal is down-sampled in the DAQ process). The phase delays of these filters amount to roughly 6 samples at 16 kHz. In total, the time delay needed to align the two signals is therefore 8 - 6 = 2 samples at 16 kHz.
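For bookkeeping, the delay budget above can be written down directly (the numbers are those quoted in the text, expressed at the 16 kHz front-end rate):

```python
FS = 16384                # front-end sample rate [Hz]
ERR_DELAY = 8             # error lags feedback in the front end [samples]
DECIM_INTERP_DELAY = 6    # group delay added to the feedback signal by the
                          # 16k->4k decimation and 4k->16k interpolation
net_delay = ERR_DELAY - DECIM_INTERP_DELAY   # = 2 samples
T = net_delay / FS                           # ~122 us, the T in exp(-i*w*T)
print(f"net delay: {net_delay} samples = {T * 1e6:.1f} us")
```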
The spectra of the original DARM and of the DARM re-calibrated by the above method are shown in Fig. 1, and the ratio of the two spectra is shown in Fig. 2. For f >> UGF, the re-calibrated DARM is 2 dB better than the original one, which is consistent with the factor of -2 dB added to the inverse-sensing function; for f << UGF, the two spectra are identical, which is also consistent with the actuation function being unchanged. Additional plots are available on the DAC Wiki.
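The ratio in Fig. 2 could be reproduced along the following lines, assuming h_orig and h_recal are the original and re-calibrated strain time series; this is an illustrative sketch, not the actual plotting code used for the figures:

```python
import numpy as np
from scipy.signal import welch

def asd_ratio_db(h_orig, h_recal, fs=16384, nperseg=16384 * 8):
    """Frequencies and amplitude ratio (re-calibrated / original) in dB."""
    f, p_orig = welch(h_orig, fs=fs, nperseg=nperseg)
    _, p_recal = welch(h_recal, fs=fs, nperseg=nperseg)
    # amplitude ratio in dB: 20*log10(sqrt(PSD ratio)) = 10*log10(PSD ratio)
    return f, 10 * np.log10(p_recal / p_orig)

# Expect ~ -2 dB for f >> UGF (sensing-dominated) and ~ 0 dB for
# f << UGF (actuation-dominated).
```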