Reports of 31971
MIF (General)
hirotaka.yuzurihara - 17:19 Friday 11 July 2025 (34527) Print this report
lockloss investigation during O4c (2025-07-10 10:53:19.187500 UTC)

[Yokozawa, Yuzurihara]

We investigated the recent lockloss at 2025-07-10 10:53:19.187500 UTC. The previous lockloss investigation was posted in klog34259. This is the longest lock in O4c so far.

Quick summary 

Although I checked all the lockloss phenomena reported in past klog entries, none of them occurred just before this lockloss, except for the OMC DCPD saturation.
The OMC saturation occurred just before the lockloss, but the speed of the saturation seems different from past saturations. Yokozawa-san and I checked the time series and listed the possible causes of the saturation.

Details

  • There was no excess in seismic motion in the 1~10 Hz band. (Figure 1)
  • Regarding oscillations, we can see an oscillation at 0.83 Hz on PR2 pitch, but its amplitude is not large. (Figure 2)
    • A coincident 0.83 Hz oscillation is visible on ASC PRC2 pitch and ASC MICH pitch. (Figure 3)
  • There was a small excess in the PRM feedback signal, but it seems too small to cause the lockloss. (Figure 4)
  • Regarding drift, there was no drift of the Xarm and Yarm mirrors over 30 minutes. (Xarm, Yarm)
  • Regarding the BPC control, there was no strange behavior just before the lockloss.
  • Regarding control saturation, there was no saturation on BS or ETMX.
  • There was no glitch on the ETMY MN oplev.
  • There was no earthquake around the lockloss time.
    • We can see large seismic motion in the 3~10 Hz and 10~30 Hz bands; this excess ended 20 minutes later. However, its amplitude is not large enough to cause the lockloss. (Figure)

Quick OMC saturation

At this lockloss, we can see the OMC saturation that we have reported many times in past klog entries. This is likely the direct cause of the lockloss. Yokozawa-san and I looked into the original cause of the OMC saturation.
One important difference between this saturation and past ones is how quickly the signal reached saturation. As seen in Figure 9 and Figure 10, the DCPD signal reached saturation within 1 ms, which is remarkably fast!
We think it is difficult for the suspensions to produce such a quick saturation, so we suspect it is associated with an electrical signal or a glitch.

In past saturations, we could see several oscillations just before reaching saturation, as shown in Figure 11 or this. This might be a hint for investigating the cause further.
In any case, this was a new phenomenon for us (or I missed it before...).
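The "within 1 ms" figure can be quantified systematically on a DCPD time series. Below is a minimal sketch on synthetic data; the saturation threshold, sampling rate, and quiet-level fraction are illustrative assumptions, not the actual channel settings.

```python
import numpy as np

def rise_time_to_saturation(x, fs, sat_level, quiet_frac=0.1):
    """Time (s) between the last sample below quiet_frac*sat_level and the
    first sample at or above sat_level; None if the signal never saturates."""
    x = np.abs(np.asarray(x, dtype=float))
    above = np.flatnonzero(x >= sat_level)
    if above.size == 0:
        return None
    i_sat = above[0]
    quiet = np.flatnonzero(x[:i_sat] <= quiet_frac * sat_level)
    i_quiet = quiet[-1] if quiet.size else 0
    return (i_sat - i_quiet) / fs

# Synthetic 16 kHz record that jumps to the rail in one sample.
fs = 16384.0
sig = np.zeros(200)
sig[100:] = 30000.0
print(rise_time_to_saturation(sig, fs, sat_level=25000))  # about 6.1e-5 s
```

Applied to the real DCPD data around each past saturation, this would give a rise-time distribution against which the present <1 ms event could be compared.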

Other findings

  • Voltage monitor channels
    • While K1:PEM-VOLD_OMC_RACK_DC_M18_OUT_DQ is stable, K1:PEM-VOLD_OMC_RACK_DC_P18_OUT_DQ shows a drift and was somehow reset. (Figure 12)
    • K1:PEM-VOLT_AS_TABLE_GND_OUT_DQ is mostly stable but shows unstable behavior at -5 hours and just before the lockloss.
    • It is highly important to check the grounding condition around the AS port and the OMC rack.
  • PMC control
    • K1:PSL-PMC_LO_POWER_MON_OUT16 shows some jumps before the lockloss. It would be very helpful if a PMC expert could comment on this phenomenon.
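Resets and jumps like those seen in the monitor channels above can be flagged automatically with a simple first-difference test. This is a sketch on synthetic data that only imitates the qualitative behavior; the threshold value is an assumption, not a tuned parameter.

```python
import numpy as np

def find_jumps(x, threshold):
    """Indices where the channel changes by more than `threshold` between
    consecutive samples (candidate resets / jumps)."""
    dx = np.diff(np.asarray(x, dtype=float))
    return np.flatnonzero(np.abs(dx) > threshold) + 1

# Synthetic slow drift with one reset, imitating the qualitative behavior of
# the P18 monitor; real data and the threshold value are not reproduced here.
t = np.arange(1000)
drift = 0.001 * t
drift[600:] -= drift[600]                # sudden reset back toward zero
print(find_jumps(drift, threshold=0.1))  # flags the reset at sample 600
```

Running such a scan over the full lock segment would show whether the P18 reset and the PMC LO power jumps coincide in time with the lockloss.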
Images attached to this report
OBS (Summary)
shoichi.oshino - 17:07 Friday 11 July 2025 (34529)
Operation shift summary
Operators name: Tomaru, Oshino
Shift time: 9-17 (JST)
Check Items:

VAC: No issues were found.
CRY cooler: No issues were found.
Compressor: No issues were found.

IFO state (JST):
09:00 The shift was started. First half of the calibration work.
Maintenance work.
17:00 This shift was finished. Last half of the calibration work.
FCL (Electricity)
masakazu.aoumi - 16:30 Friday 11 July 2025 (34528)
Monthly inspection of electric equipment
With Nakai-denki-san and Shinko-denki-san
13:00 In
16:00 Out
DGS (General)
takahiro.yamamoto - 16:23 Friday 11 July 2025 (34526)
Comment to Removing old trend frames (34315)
Jul. 11th

I removed old second frames on the disk storage in the mine.
Removed segment is [1432000000, 1434000000).
VIS (EX)
dan.chen - 16:17 Friday 11 July 2025 (34525)
Comment to ETMX IP Offload (34460)

Date: 2025/7/11

I performed this work as a weekly task.

The attached figure is a screenshot taken just after the work.

Images attached to this comment
VAC (Tube Y)
nobuhiro.kimura - 16:02 Friday 11 July 2025 (34524)
Refill of cooling water for Y-27 pump unit

[Kimura and M. Takahashi]

 At around 10:20 a.m. on July 11, during a routine patrol of the Y-end, a strange noise was noticed near Y-27.
The source of the noise was the cooling water unit for the TMP of the Y-27 vacuum pump, and the cause was a lack of cooling water.
Therefore, we closed the GV of the Y-27 vacuum pump and stopped the TMP.

 At approximately 1:40 p.m., the cooling water unit was refilled in order to return the Y-27 vacuum pump to service, and the unit was put back into operation.
After restarting, there were no problems with the TMP or the cooling water unit, and the unit is in regular operation.
As a precaution, we decided to leave the repair equipment at Y-27 and check the situation during the next scheduled patrol.

 

Images attached to this report
VAC (Valves & Pumps)
nobuhiro.kimura - 15:53 Friday 11 July 2025 (34523)
Delivery of deliverables (TMP)

[Kimura and H.Sawada]

 Two turbomolecular pumps were delivered.
These turbomolecular pumps were moved to the front room of the parking lot and are temporarily stored next to the lift truck.

Images attached to this report
DGS (General)
satoru.ikeda - 15:50 Friday 11 July 2025 (34522)
SRM GAS Stepper Motor Unavailable

[Nakagaki-san, Ikeda]

Summary:
This work is related to K-Log#34519: Offload of F1 GAS.

Due to a communication failure with the network switch located in the SRM VIS mini-rack, communication with the SRM GAS Stepper Motor was lost.
To recover, we restarted the SRM network switch by unplugging and reconnecting the LAN cable on the OMC side, which supplies PoE to the switch.

Details:
We received a report from R. Takahashi-san that the SRM GAS Stepper Motor was not functioning.

Upon investigation, we found the following:

The relay of the Stepper Motor showed slight response.
There was no ping response from k1script1 to the LAN-serial converter.
The SRM VIS mini-rack network switch did respond to pings from k1script1.
However, the web interface of the network switch was unresponsive.
Based on this, we suspected a communication failure within the SRM network switch.

Since the SRM network switch is powered via PoE, its power can be controlled from the OMC side. However, we were unable to locate the port list for remote control.

Therefore, after consulting in the control room, we decided to enter the tunnel for manual intervention.

14:13 - Entered the center and reconnected the LAN cable from the OMC network switch to the SRM.
14:15 - Confirmed at the BS area workstation that the SRM GAS Stepper Motor could be turned ON via BIO and scripts.
14:18 - Handed over control to Takahashi-san.
14:19 - Exited the central area.

OMC Network Switch Port Assignments:
Pico Network
01: PD mini rack switch  
02: OMC mini rack switch  
03: SR3 mini rack switch  
04: SR2 mini rack switch  
05: SRM mini rack switch  
06: BS mini rack switch  
07: PR2 mini rack switch  
08: Green X Table switch  
09: Green Y Table switch  
10: AS Table switch  
11: Precision Air Processor SRX1
12: AS_WFS HWP  
DGS Network
18: OMC Workstation  
 

Non-image files attached to this report
CRY (General)
nobuhiro.kimura - 15:46 Friday 11 July 2025 (34521)
Comment to Transportation of experimental equipments for Kashiwa (34309)

[Kimura, M.Takahashi, H.Sawada and Yamaguchi]

 We transported the experimental equipment from the parking lot in the mine into the prefab house at the Kohguci.

The equipment will be temporarily stored in the prefab house until July 15.

Images attached to this comment
OBS (SDF)
ryutaro.takahashi - 14:58 Friday 11 July 2025 (34520)
Comment to Changes of observation.snap during O4c (34169)

I accepted the following SDF differences (see related klog34517, klog34518, and klog34519). SDF #25-27 were reverted because of a misoperation.

Images attached to this comment
VIS (SRM)
ryutaro.takahashi - 14:49 Friday 11 July 2025 (34519)
Offload of F1 GAS

[Takahashi, Ikeda]

I offloaded the F1 GAS with the FR. The stepper motor initially did not work due to a network-switch problem, which was recovered on site by Ikeda-san.

VIS (PR3)
ryutaro.takahashi - 14:17 Friday 11 July 2025 (34518)
Offload of SF GAS

I offloaded the SF GAS with the FR, which had reached the maximum limit (1360000 steps).

VIS (IY)
ryutaro.takahashi - 14:13 Friday 11 July 2025 (34517)
Offload of F0 GAS

I offloaded the F0 GAS with the FR.

DGS (General)
shoichi.oshino - 13:58 Friday 11 July 2025 (34516)
Exchange k1tw0 SSD
[Nakagaki, Oshino]

We exchanged the k1tw0 SSD for a new one.
The data recorded on the previous SSD is currently being copied to the NFS server.
The data from the last six months is temporarily being served from an external disk. We plan to switch to NAS storage next week.
OBS (SDF)
Shingo Hido - 12:18 Friday 11 July 2025 (34515)
Comment to Changes of observation.snap during O4c (34169)
We accepted the SDFs related to the cal measurement in observation.snap and safe.snap (k1calcs).
K1:CAL-MEAS_{CURRENT, LATEST}
Images attached to this comment
CAL (General)
hirotaka.yuzurihara - 10:20 Friday 11 July 2025 (34514)
TCam photo session 20250711

I took the TCam photos of the four mirrors at 9:30 ~ 9:37 this morning. The previous session is klog34453.

  • ETMY
  • For these mirrors, I updated the reference positions. I also re-drew the yellow line shown on k1mon4. For the other mirrors, we don't need to update the reference positions.
OBS (SDF)
Shingo Hido - 9:43 Friday 11 July 2025 (34513)
Comment to Changes of observation.snap during O4c (34169)
We accepted the SDFs reported on klog#34512.

CALEX, CALEY
K1:CAL-PCAL_{EX,EY}_TCAM_{MAIN,PATH1,PATH2}_{X,Y}
Images attached to this comment
CAL (General)
Shingo Hido - 9:40 Friday 11 July 2025 (34512)
Pcal Parameter Update Report

A CAL TCam session was performed to obtain the beam position information necessary for Pcal. The parameters have already been updated, and the SDF diffs are expected to be accepted.

Operator: Dan Chen, Shingo Hido

Update Time: 2025/07/11 09:16:29

EPICS Key                    Before [mm]  After [mm]  Δ (After - Before) [mm]
K1:CAL-PCAL_EX_TCAM_PATH1_X      3.46849     3.20691                 -0.26158
K1:CAL-PCAL_EX_TCAM_PATH1_Y     62.58533    62.78539                 +0.20006
K1:CAL-PCAL_EX_TCAM_PATH2_X      0.06867    -0.21958                 -0.28825
K1:CAL-PCAL_EX_TCAM_PATH2_Y    -63.66836   -63.36743                 +0.30093

Update Time: 2025/07/11 09:17:49

EPICS Key                   Before [mm]  After [mm]  Δ (After - Before) [mm]
K1:CAL-PCAL_EX_TCAM_MAIN_X      3.73403     3.62390                 -0.11014
K1:CAL-PCAL_EX_TCAM_MAIN_Y     12.20820    11.89945                 -0.30875

Update Time: 2025/07/11 09:18:10

EPICS Key                    Before [mm]  After [mm]  Δ (After - Before) [mm]
K1:CAL-PCAL_EY_TCAM_PATH1_X      0.92272     1.27965                 +0.35693
K1:CAL-PCAL_EY_TCAM_PATH1_Y     63.96127    63.48267                 -0.47860
K1:CAL-PCAL_EY_TCAM_PATH2_X     -0.71031    -0.34771                 +0.36260
K1:CAL-PCAL_EY_TCAM_PATH2_Y    -70.49072   -71.06132                 -0.57060

Update Time: 2025/07/11 09:18:46

EPICS Key                   Before [mm]  After [mm]  Δ (After - Before) [mm]
K1:CAL-PCAL_EY_TCAM_MAIN_X      6.31424     8.78471                 +2.47048
K1:CAL-PCAL_EY_TCAM_MAIN_Y     -3.21271    -3.91830                 -0.70559
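As a quick sanity check (not part of the original update workflow), the Δ column can be recomputed from the Before/After values; the sketch below uses the EX PATH1/PATH2 numbers from the first table above.

```python
# Consistency check of the Δ column against the Before/After values
# (numbers copied from the EX PATH1/PATH2 table above).
rows = {
    "K1:CAL-PCAL_EX_TCAM_PATH1_X": (3.46849, 3.20691, -0.26158),
    "K1:CAL-PCAL_EX_TCAM_PATH1_Y": (62.58533, 62.78539, +0.20006),
    "K1:CAL-PCAL_EX_TCAM_PATH2_X": (0.06867, -0.21958, -0.28825),
    "K1:CAL-PCAL_EX_TCAM_PATH2_Y": (-63.66836, -63.36743, +0.30093),
}
for key, (before, after, delta) in rows.items():
    assert abs((after - before) - delta) < 5e-5, key
print("all deltas consistent with Before/After")
```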
OBS (General)
shinji.miyoki - 6:51 Friday 11 July 2025 (34511)
Comment to 18 hours longest lock (34510)

After this long lock, the BNS range seems to be sometimes degraded during locking. It is better to check the IFO conditions, especially those related to ASC and/or signal saturation. Some excess noise was visible around 60 Hz to 100 Hz.

Images attached to this comment
OBS (General)
shinji.miyoki - 20:17 Thursday 10 July 2025 (34510)
18 hours longest lock

Because of the small number of earthquakes, over 18 hours of continuous locking was achieved.

However, the lockloss does not seem to be related to a seismic disturbance.

Images attached to this report
DetChar (General)
takahiro.yamamoto - 19:19 Thursday 10 July 2025 (34509)
Bug in overflow segments
During a chat, it was noted that the glitches on DCPD reported in klog#34442 and klog#34498 sometimes cause ADC saturation, and that such data may need to be removed from some kinds of analyses.
Because such phenomena should be flagged as overflow segments, I checked them to see the overflow rate due to this issue.

Overflows caused by these glitches occurred 28 times after the start of the observing run, as listed below. Because the overflow segments themselves contain overflows due to locklosses and lock acquisition, I picked out overflows only during science mode by taking the logical AND with the science-mode segments.

Then, while making some plots, I found a 1-second time offset in the segments. The attached figures show the DCPD signals, with T-cursors marking the overflow segments. In both cases, an overflow occurring 1 second before the start time of the overflow segment is missed, and there is no overflow in the last 1 second of the segment. This suggests that the served overflow segments have a 1-second offset from the correct time. It might be better to regenerate the overflow segments.
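The science-mode AND and the suspected 1-second shift can both be expressed with minimal interval arithmetic. This is a sketch; the GPS numbers below are illustrative, not the real flag data.

```python
# Minimal interval arithmetic for segment lists [(start, end), ...] in GPS
# seconds, half-open. The GPS numbers below are illustrative, not real flags.
def intersect(a, b):
    """Logical AND of two segment lists."""
    out = []
    for s1, e1 in a:
        for s2, e2 in b:
            lo, hi = max(s1, s2), min(e1, e2)
            if lo < hi:
                out.append((lo, hi))
    return sorted(out)

def shift(segs, dt):
    """Shift all segments by dt seconds (dt=-1 to test the suspected offset)."""
    return [(s + dt, e + dt) for s, e in segs]

science = [(1433745000, 1433750000)]
overflow = [(1433745982, 1433745987)]
print(intersect(overflow, science))             # overflows during science mode
print(intersect(shift(overflow, -1), science))  # same check with 1 s shift removed
```

Comparing `intersect(overflow, science)` against the shifted version on real data would show directly whether the -1 s correction recovers the missed first-second overflows.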

-----
overflow list
1 1433745982 5 2025-06-12 06:46:04.000000 UTC
2 1433745990 3 2025-06-12 06:46:12.000000 UTC
3 1433851377 2 2025-06-13 12:02:39.000000 UTC
4 1433977698 2 2025-06-14 23:08:00.000000 UTC
5 1433989731 2 2025-06-15 02:28:33.000000 UTC
6 1433989734 2 2025-06-15 02:28:36.000000 UTC
7 1433992139 2 2025-06-15 03:08:41.000000 UTC
8 1433994501 2 2025-06-15 03:48:03.000000 UTC
9 1434030010 2 2025-06-15 13:39:52.000000 UTC
10 1434451083 2 2025-06-20 10:37:45.000000 UTC
11 1434457706 3 2025-06-20 12:28:08.000000 UTC
12 1435119032 7 2025-06-28 04:10:14.000000 UTC
13 1435403687 2 2025-07-01 11:14:29.000000 UTC
14 1435415639 3 2025-07-01 14:33:41.000000 UTC
15 1435443337 5 2025-07-01 22:15:19.000000 UTC
16 1435457242 8 2025-07-02 02:07:04.000000 UTC
17 1435457251 2 2025-07-02 02:07:13.000000 UTC
18 1435542275 7 2025-07-03 01:44:17.000000 UTC
19 1435545969 2 2025-07-03 02:45:51.000000 UTC
20 1435711455 7 2025-07-05 00:43:57.000000 UTC
21 1435801782 2 2025-07-06 01:49:24.000000 UTC
22 1435846619 5 2025-07-06 14:16:41.000000 UTC
23 1435846625 4 2025-07-06 14:16:47.000000 UTC
24 1435848511 2 2025-07-06 14:48:13.000000 UTC
25 1435984572 2 2025-07-08 04:35:54.000000 UTC
26 1435987016 3 2025-07-08 05:16:38.000000 UTC
27 1435989032 5 2025-07-08 05:50:14.000000 UTC
28 1436122577 2 2025-07-09 18:55:59.000000 UTC

Images attached to this report
MIF (General)
takafumi.ushiba - 18:51 Thursday 10 July 2025 (34508)
Coherence measurement between DARM and IMC ASC signals with long data

I measured the coherence between DARM and the IMC ASC signals with 128 averages during today's stable run (fig1).
Large coherence can be seen between some IMC WFS signals and DARM.
To evaluate the noise coming from beam jitter, noise projection with the IMC WFS signals might be helpful.
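A coherence measurement with a fixed number of averages can be sketched with `scipy.signal.coherence`. The example below uses synthetic data; the sample rate, the 0.5 coupling factor, and the segmenting are illustrative assumptions, not the real channels or settings.

```python
import numpy as np
from scipy.signal import coherence

# Synthetic stand-in for the measurement: a "WFS" jitter signal that
# partially couples into "DARM". Channel names, sample rate, and the
# 0.5 coupling are illustrative assumptions, not the real setup.
fs = 2048.0
rng = np.random.default_rng(0)
n = 2**18
wfs = rng.standard_normal(n)
darm = 0.5 * wfs + rng.standard_normal(n)

# 128 averages: choose nperseg so that n / nperseg = 128 (no overlap).
nperseg = n // 128
f, coh = coherence(darm, wfs, fs=fs, nperseg=nperseg, noverlap=0)
print(f"mean coherence: {coh.mean():.2f}")  # roughly 0.2 for this coupling
```

More averages reduce the coherence noise floor (roughly 1/N_avg for incoherent data), which is why long stable stretches make weak couplings like beam jitter visible.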

Images attached to this report
DGS (General)
takahiro.yamamoto - 17:07 Thursday 10 July 2025 (34502)
Accidental(?) modification of foton file
An update of the foton filter file of the k1omc model was detected.
The modification time is 2025-07-10 14:50:13 JST, and it was reported on #observation@slack at 15:01.
$ ls -l --full-time /opt/rtcds/kamioka/k1/chans/K1OMC.txt
-rw-r--r-- 1 controls 92841 2025-07-10 14:50:13.000000000 +0900 /opt/rtcds/kamioka/k1/chans/K1OMC.txt

$ stat /opt/rtcds/kamioka/k1/chans/K1OMC.txt
File: /opt/rtcds/kamioka/k1/chans/K1OMC.txt
Size: 92841 Blocks: 192 IO Block: 1048576 regular file
Device: 0,46 Inode: 47710247 Links: 1
Access: (0644/-rw-r--r--) Uid: ( 1001/controls) Gid: ( 1001/controls)
Access: 2025-07-10 14:50:16.000000000 +0900
Modify: 2025-07-10 14:50:13.000000000 +0900
Change: 2025-07-10 14:50:13.000000000 +0900
Birth: -


No change in the filter modules was found between the current foton file, the daily overwritten full backup at 2025-07-10 0:05 JST, and the weekly differential backup at 2025-07-04 1:05 JST, according to the diff command as follows.
$ diff /opt/rtcds/kamioka/k1/chans/K1OMC.txt /mnt/backup/rtcds/kamioka/k1/chans/K1OMC.txt
$ diff /opt/rtcds/kamioka/k1/chans/K1OMC.txt /mnt/diff_backup/opt-20250704/rtcds/kamioka/k1/chans/K1OMC.txt



The foton GUI makes a backup of the filter file whenever a modification is saved, with a suffix encoding the save time. If this modification had been made through the foton GUI, there should be a backup file with a timestamp around 7/10 14:50. However, the latest backup file made by the foton GUI is from 6/27, so this modification might have been made by some kind of script, editor, etc.
$ ls -lt --full-time /opt/rtcds/kamioka/k1/chans/filter_archive/k1omc/* | head -3
-rw-r--r-- 1 controls 92841 2025-06-27 09:36:42.000000000 +0900 /opt/rtcds/kamioka/k1/chans/filter_archive/k1omc/K1OMC_1435019820.txt
-rw-r--r-- 1 controls 92850 2025-06-27 09:36:41.000000000 +0900 /opt/rtcds/kamioka/k1/chans/filter_archive/k1omc/K1OMC_1435019819.txt
-rw-r--r-- 1 controls 92859 2025-06-27 09:36:35.000000000 +0900 /opt/rtcds/kamioka/k1/chans/filter_archive/k1omc/K1OMC_1435019813.txt

I'm not sure how the foton GUI behaves when SAVE is pressed without any modification, so I cannot conclude that this modification was NOT made through the foton GUI.

Because there is no change in the filter modules:
- there is no impact on the current calibration;
- the CFC bit of k1omc was not raised;
- CFC_LATCH was also not raised by the CAL_PROC guardian;
- the IFO guardian was able to stay in OBSERVING.
MIF (General)
takafumi.ushiba - 17:02 Thursday 10 July 2025 (34498)
Consideration of non-linear coupling of DARM

As reported in klog34442, there is a coincidence between glitches in the OMC DC PD and sensitivity degradation.
So, I tried to understand systematically the relation between kicks of the suspension and the sensitivity degradation.

As shown in Fig. 1, when glitches are very large, the OMC DC PD signals can become saturated.
In this case, the calibrated DARM signal is no longer reliable, so it should be excluded from the data analysis.

In other cases, the amplitude of each glitch is different, so I categorized them into four groups based on the maximum signal in each glitch: more than 25,000 counts (#1), between 20,000 and 25,000 counts (#2), between 15,000 and 20,000 counts (#3), and less than 15,000 counts (#4).
Figures 2-5 show the Q-transform around the glitch categorized as #1, #2, #3, and #4, respectively.
Even in category #4, some excess can be seen around 60–90 Hz.
So, it would be good to evaluate the non-linear effect on the DARM sensitivity.
Figure 6 compares the spectra when glitches happened (as a reference, the spectrum without glitches is also shown with black lines).
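The four-group classification described above can be reproduced mechanically. The sketch below assumes one peak |DCPD| count value per glitch as input; the boundary handling at exactly 15,000/20,000/25,000 counts is a choice, since the text does not specify it.

```python
import numpy as np

# Category edges from the text: #1 > 25000, #2 20000-25000, #3 15000-20000,
# #4 < 15000 counts, applied to the peak |DCPD| signal of each glitch.
def categorize(peak_counts):
    bins = [15000, 20000, 25000]
    # np.digitize maps peaks to 0..3 (below, between, ..., above the edges);
    # larger glitches get smaller category numbers (#1 is the largest).
    return [f"#{4 - i}" for i in np.digitize(peak_counts, bins)]

peaks = [26000, 22000, 17000, 9000]
print(categorize(peaks))  # ['#1', '#2', '#3', '#4']
```

With the per-glitch peaks extracted from the DCPD data, this gives the group label used to select which Q-transform set (Figures 2-5) each glitch belongs to.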

Images attached to this report
OBS (Summary)
shoichi.oshino - 17:00 Thursday 10 July 2025 (34507)
Operation shift summary
Operators name: Tomaru, Oshino
Shift time: 9-17 (JST)
Check Items:

VAC: No issues were found.
CRY cooler: No issues were found.
Compressor: No issues were found.

IFO state (JST):
09:00 The shift was started. The status was OBSERVING.
17:00 This shift was finished. The status was OBSERVING.