MIF (General)
hirotaka.yuzurihara - 16:48 Tuesday 21 January 2025 (32409)
lockloss investigation: 2024/12/21~12/30

[Yong-Xiang Yang, Chia-Hsuan Hsiung, Gui Chin Liu (from Tamkang university), Chia-Jui Chou, Yuzurihara]

We started the lockloss investigation during the year-end holiday, with strong support from collaborators in Taiwan. We checked the time series using ndscope and the time-frequency maps using Pastavi.

Details

  • This is a continuation of the work in klog. We focused on the locklosses during 2024/12/21~12/30.
  • The generated plots are collected on the wiki.

 

OMC saturation

We found that the OMC saturation occurred only on 2024/12/29. (Fig) (Fig) The times are listed below:

  • 2024-12-29 21:19:16 UTC (lock duration = 151 s, saturation duration = 32 s)
  • 2024-12-29 19:41:26 UTC (lock duration = 193 s, saturation duration = 58 s)
  • 2024-12-29 19:04:37 UTC (lock duration = 702 s, saturation duration = 600 s)
  • 2024-12-29 17:26:56 UTC (lock duration = 278 s, saturation duration = 100 s)

The frequency of the oscillation is ~5 Hz or ~5.2 Hz. (Fig) Even though the OMC saturation was happening, the interferometer kept the lock. Is that healthy?
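As a rough way to quantify the "saturation duration" quoted above, one can count the samples at or beyond a saturation level. The sketch below uses synthetic data only; the sampling rate and the threshold value are assumptions, not the actual OMC DCPD parameters.

```python
import numpy as np

def saturation_duration(x, fs, threshold):
    """Total time (s) the signal spends at or beyond the saturation level."""
    saturated = np.abs(x) >= threshold
    return saturated.sum() / fs

# Toy example: a 5 Hz oscillation whose amplitude grows past the assumed threshold
fs = 2048  # Hz (assumed sampling rate of the OMC DCPD channel)
t = np.arange(0, 10, 1 / fs)
x = np.linspace(0.5, 2.0, t.size) * np.sin(2 * np.pi * 5 * t)
dur = saturation_duration(x, fs, threshold=1.9)
print(f"saturated for {dur:.2f} s out of 10 s")
```

In real data, the threshold would be the ADC full-scale count of the DCPD channel rather than an amplitude guess.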

 

Excess at f<=10 Hz in PRCL and MICH

Sometimes we observed an excess at ~10 Hz in the error or feedback signals of PRCL and MICH, ~30 seconds before the lockloss. (Fig) For example, at 2024-12-30 21:38:25 UTC.
A possible scenario is that this excess is fed back to the PRM(?). We will check more data and discuss with the interferometer experts.
It is difficult to see this excess in the time series, so we started to collect statistics by checking the time-frequency maps. We will collect more statistics about this phenomenon.

Related to this, I prepared a script to pass the lockloss time to the Pastavi web page.

  • Usage
    • /users/yuzu/lockloss/open_Pastavi.sh 481

 

Drift by BPC feedback control

Sometimes we observed a sudden drift caused by the BPC feedback control. (Fig) In this case, after the sudden drift, an ETMX overflow happened.
We will collect more statistics about this phenomenon.

 

ETMY MN oplev glitch

We checked for locklosses coincident with the ETMY MN oplev glitch since 2024-12-28 12:24:50 UTC. After 12/28 13:05, there were no more locklosses coincident with the glitch. The three coincident locklosses are:

  • 2024-12-28 13:05:36 UTC (Fig)
  • 2024-12-28 12:24:50 UTC (Fig)
  • 2024-12-28 10:59:28 UTC (Fig)
Images attached to this report
Comments to this report:
hirotaka.yuzurihara - 17:00 Thursday 23 January 2025 (32431)

This is a continuation of the work above.

Excess at f<=10 Hz in PRCL and MICH

We checked the locklosses at the times when we observed excess power in the PRCL and MICH channels at f < 10 Hz. After the investigation, we found this phenomenon happened only three times, and the situation of each lockloss is different.

  • 2024-12-30 21:38:25 UTC
    • ETMX yaw was oscillating at ~1.7 Hz. (Fig) However, this is not critical for the lockloss. At 21 s before the lockloss, the amplitude of the ETMX yaw started to increase. (Fig) At the same time, the ASC controls started to oscillate. This is likely the cause of the lockloss. (Fig)
  • For the other two times, we are still not sure of the lockloss cause.
    • 2024-12-29 22:28:25 UTC (Fig)
    • 2024-12-19 00:30:11 UTC (Fig)
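The growing oscillation amplitude described above can be tracked offline with a band-pass filter plus a Hilbert envelope. This is a minimal sketch on synthetic data; the sampling rate, filter band, and growth rate are assumptions, not the actual ETMX yaw channel parameters.

```python
import numpy as np
from scipy.signal import hilbert, butter, sosfiltfilt

fs = 256  # Hz (assumed sampling rate of the ETMX yaw channel)
t = np.arange(0, 40, 1 / fs)
rng = np.random.default_rng(1)
# amplitude of a 1.7 Hz oscillation starts to grow 21 s before the end, as in the post
amp_true = np.where(t < 19, 1.0, 1.0 + 0.2 * (t - 19))
x = amp_true * np.sin(2 * np.pi * 1.7 * t) + 0.05 * rng.normal(size=t.size)

# band-pass around 1.7 Hz, then take the instantaneous amplitude
sos = butter(4, [1.2, 2.2], btype="bandpass", fs=fs, output="sos")
env = np.abs(hilbert(sosfiltfilt(sos, x)))
early = env[5 * fs:10 * fs].mean()   # well before the growth
late = env[-3 * fs:-1 * fs].mean()   # just before the "lockloss"
print(f"envelope grew from {early:.2f} to {late:.2f}")
```

Applying this to the recorded yaw signal would give a quantitative onset time for the amplitude growth instead of reading it off the time series by eye.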
Images attached to this comment
hirotaka.yuzurihara - 17:49 Thursday 23 January 2025 (32439)

[Yong-Xiang Yang, Chia-Hsuan Hsiung, Gui Chin Liu (from Tamkang university), Chia-Jui Chou, Yuzurihara]

Transient behavior by BPC feedback control

Sometimes we observed transient behavior in the error and feedback signals of the BPC for ITMX/ETMX/ITMY/ETMY at the same time.
The error signal is computed from K1:LSC-DARM_IN1_DQ by demodulation at 865 Hz for ITMX and ETMX, at 845 Hz for ITMY and ETMY, and at 870 Hz. If some additional noise is induced in K1:LSC-DARM_IN1_DQ, it can mimic a shift of the beam spot; the feedback signal would then be sent to the suspension and drift the mirror somewhere, even though the actual beam spot has not moved. That is the scenario we are considering.
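The demodulation step described above can be sketched as a simple I/Q mix-down followed by a low-pass filter. This is a toy reconstruction on synthetic data, not the actual real-time BPC code; the sampling rate, filter cutoff, and injected line amplitude are assumptions, and only the 865 Hz (ITMX/ETMX) line is simulated.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def demodulate(x, fs, f_line, f_cut=1.0):
    """I/Q demodulation: mix down at f_line, low-pass, return the line amplitude."""
    t = np.arange(x.size) / fs
    sos = butter(4, f_cut, btype="lowpass", fs=fs, output="sos")
    i = sosfiltfilt(sos, x * np.cos(2 * np.pi * f_line * t))
    q = sosfiltfilt(sos, x * np.sin(2 * np.pi * f_line * t))
    return 2 * np.sqrt(i ** 2 + q ** 2)

fs = 2048  # Hz (assumed; enough to cover the 865 Hz line)
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
darm = 1e-3 * np.sin(2 * np.pi * 865 * t) + 1e-4 * rng.normal(size=t.size)
amp = demodulate(darm, fs, 865.0)   # recovered amplitude of the 865 Hz line
print(f"recovered line amplitude: {amp[fs:-fs].mean():.2e}")
```

Extra noise in DARM near the line frequency passes through this mix-down exactly like a real beam-spot shift would, which is the mimicking mechanism proposed above.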

The time series of the error and feedback signals for the BPC are attached. You can see the transient behavior before the lockloss.
The whitened spectrogram is also attached. Before the lockloss, the noise level seems to increase for some reason.
Based on a chat with Ushiba-san, the feedback signal to ITMX/ETMX/ITMY/ETMY looks too small to move the mirror drastically and cause the lockloss. On the other hand, the feedback to the PRM is on the order of urad, which has the potential to move the mirror a lot and cause the lockloss.

  • 2024-12-29 22:58:28 UTC (Fig): Whitened Spectrogram around the BPC frequency (Fig)
  • 2024-12-29 22:39:37 UTC (Fig): Whitened Spectrogram around the BPC frequency (Fig)
  • 2024-12-29 19:41:26 UTC (Fig): Whitened Spectrogram around the BPC frequency (Fig)
  • 2024-12-29 19:04:37 UTC (Fig)
  • 2024-12-29 17:26:56 UTC (Fig)
  • 2024-12-28 20:36:31 UTC (Fig)
  • 2024-12-28 18:29:54 UTC (Fig)

Next step

  • Check the behavior of POP90 before the lockloss, and check whether POP90 decreased due to the PRM feedback.
  • Check the spectrum of K1:LSC-DARM_IN1_DQ over the full frequency range at different times.
  • Check more lockloss times to see whether this phenomenon was happening.
Images attached to this comment
hirotaka.yuzurihara - 13:03 Friday 31 January 2025 (32512)

This is an additional report about the transient behavior caused by the BPC feedback control. These locklosses look related to the 5 Hz oscillation of K1:LSC-OMC_DC_OUT_DQ and K1:ASC-DHARD_{P, Y}_IN1_DQ, but I am not sure about the origin of the 5 Hz oscillation.

Common features of these locklosses

I checked the data around the lockloss times listed in the previous post (klog).

  • K1:LSC-OMC_DC_OUT_DQ was oscillating at 5~5.25 Hz while the LSC_LOCK guardian was in the OBSERVATION_WITHOUT_LINES state, and its amplitude was growing. (Fig) (Fig)
    • K1:ASC-DHARD_P_IN1_DQ and K1:ASC-DHARD_Y_IN1_DQ show the same oscillation, although their amplitude was not growing. (Fig) The timing of the peaks looks different between these ASC channels.
    • I checked the oplev signals related to the X arm and Y arm. No channel showed a clear 5 Hz oscillation.
  • At 10~20 s before the lockloss, the BPC error and feedback signals show the transient behavior reported in the previous post (klog). Among the ~100 locklosses we checked, this phenomenon happened only just before a lockloss.

[homework]Check the behavior of POP90 before the lockloss

We checked POP90 before the lockloss, but we could not find any coincident behavior between POP90 and the BPC feedback. (Fig)

[homework]Check the spectrum of all frequency range of K1:LSC-DARM_IN1_DQ

I compared the spectrum just before the lockloss (Fig) with the spectrum 4 minutes before the lockloss (Fig). Due to the 5 Hz oscillation, the spectral shape looks different. However, this different shape continues for more than 1 minute, so I am not sure why the BPC feedback was affected by the oscillation only just before the lockloss.
We observed similar behavior for the other 6 locklosses.
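This kind of before/after spectrum comparison can be reproduced offline with Welch PSDs of the two segments. The sketch below uses synthetic data: the sampling rate, segment length, and line amplitude are assumptions, with a ~5 Hz line injected only into the "just before lockloss" segment.

```python
import numpy as np
from scipy.signal import welch

fs = 512  # Hz (assumed sampling rate)
rng = np.random.default_rng(0)
t = np.arange(60 * fs) / fs

quiet = rng.normal(size=60 * fs)                      # segment 4 minutes before the lockloss
loud = rng.normal(size=60 * fs) + 5 * np.sin(2 * np.pi * 5.1 * t)  # segment with the ~5 Hz line

# Welch PSDs with 8 s segments -> 0.125 Hz frequency resolution
f, p_quiet = welch(quiet, fs=fs, nperseg=8 * fs)
f, p_loud = welch(loud, fs=fs, nperseg=8 * fs)

band = (f >= 4.5) & (f <= 5.5)
ratio = p_loud[band].max() / p_quiet[band].max()
print(f"peak PSD ratio in the 4.5-5.5 Hz band: {ratio:.0f}x")
```

Tracking this band-limited ratio as a function of time before each lockloss would show when the 5 Hz excess appears relative to the BPC transient.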

Discussion

I discussed the possible cause of 5 Hz oscillation with Kenta-san, Yokozawa-san, and Washimi-san.

  • Although we suspected water release from the dam, the seismometer shows different behavior.
  • I found that the 3~10 Hz BLRMS of the seismometer increased between 18:00 UTC and 22:00 UTC. (12/28) (12/29) These times correspond to 3:00~7:00 JST.
    • Yokozawa-san pointed out the possibility that a snow-removing vehicle was working around the KAGRA site. I checked the snow conditions on 12/28 and 12/29. The information at Takayama is available (12/28) (12/29), but I could not find information at the KAGRA site. Based on these pages, the weather at Takayama was snowy.
    • Or should I ask the people at the snow-removing center?
  • Idea of contamination path
    • [cause] snow-removing center --> [cause/phenomena] 5 Hz oscillation at OMC PD and ASC control --> [phenomena] BPC feedback --> lockloss
Images attached to this comment
hirotaka.yuzurihara - 16:00 Monday 03 February 2025 (32561)

> I found the 3~10Hz BLRMS of the seismometer was increased between 18:00 UTC~22:00 UTC. (12/28) (12/29) These time is 3:00~7:00 JST.
> Yokozawa-san commented the possibility that the snow-removing vehicle was working around the KAGRA site.

I asked Kato-san about the activity of the snow-removing vehicle around the KAGRA site. The snow-removing vehicle operated around the KAGRA site in the early morning (4:30~7:00 JST; the exact working time depends on the date) on 2024/12/23, 24, 28, 29, and 30. These times coincide with the times when K1:PEM-SEIS_IXV_GND_X_BLRMS_3HZ10 increased, as can be seen on the summary pages [12/23] [12/24] [12/28] [12/29] [12/30]. When there was no activity of the snow-removing vehicle, we see no excess on the seismometer. So this is likely the origin of the excess in the seismometer channel.
The additional seismometers are installed at the OMC, BS, and MCF. Unfortunately, k1nds2 was unavailable due to preparation for the planned power outage. I will check them next time.
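For reference, a band-limited RMS like K1:PEM-SEIS_IXV_GND_X_BLRMS_3HZ10 can be reproduced offline by band-passing the raw seismometer data and taking the RMS over a window. This is a sketch on synthetic data; the sampling rate, window length, and filter order are assumptions, not the parameters of the online BLRMS.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def blrms(x, fs, f_lo, f_hi, window_s=60.0):
    """Band-limited RMS: band-pass, then RMS over non-overlapping windows."""
    sos = butter(4, [f_lo, f_hi], btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)
    n = int(window_s * fs)
    m = y.size // n
    return np.sqrt((y[: m * n] ** 2).reshape(m, n).mean(axis=1))

fs = 128  # Hz (assumed seismometer sampling rate)
t = np.arange(0, 600, 1 / fs)                        # 10 minutes of data
x = 0.1 * np.random.default_rng(2).normal(size=t.size)
x[t >= 300] += np.sin(2 * np.pi * 5 * t[t >= 300])   # in-band excess after 5 minutes
r = blrms(x, fs, 3.0, 10.0)
print(r)   # the later 60 s windows show the elevated BLRMS
```

Running this on the newly installed OMC/BS/MCF seismometers during the 4:30~7:00 JST window would directly test the coincidence with the vehicle activity.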

I am not sure about the contamination path from the seismic motion around 5 Hz to the OMC DCPD. In any case, we need to monitor the relationship between the activity of the snow-removing vehicle and the oscillation of the OMC DCPD.

shinji.miyoki - 5:47 Tuesday 04 February 2025 (32566)

How about the sound in the corner station? The sound from a snow remover is mainly at low frequency and loud.
