Reports of 33972
CAL (XPcal)
dan.chen - 7:45 Wednesday 18 March 2026 (36610) Print this report
Pcal-X beam position check

A CAL Tcam session was performed to obtain beam position information necessary for Pcal. The parameters have already been updated, and SDF has been accepted.

Operator: Shingo Hido, Kohei Mitsuhashi, Dan Chen

Update Time: 2026/03/18 07:41:38

EPICS Key                    Before [mm]  After [mm]  Δ (After - Before) [mm]
K1:CAL-PCAL_EX_TCAM_PATH1_X      0.07325    -0.87474  -0.94799
K1:CAL-PCAL_EX_TCAM_PATH1_Y     66.15693    66.37439  +0.21746
K1:CAL-PCAL_EX_TCAM_PATH2_X      0.56752    -0.37755  -0.94507
K1:CAL-PCAL_EX_TCAM_PATH2_Y    -68.50339   -67.02710  +1.47629
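As a quick consistency check, the Δ column above can be reproduced from the Before/After values (a minimal sketch; the numbers are copied from the table, nothing else is assumed):

```python
# Cross-check the Δ column of the Pcal-X TCam table above.
rows = {
    "K1:CAL-PCAL_EX_TCAM_PATH1_X": (0.07325, -0.87474, -0.94799),
    "K1:CAL-PCAL_EX_TCAM_PATH1_Y": (66.15693, 66.37439, +0.21746),
    "K1:CAL-PCAL_EX_TCAM_PATH2_X": (0.56752, -0.37755, -0.94507),
    "K1:CAL-PCAL_EX_TCAM_PATH2_Y": (-68.50339, -67.02710, +1.47629),
}
for key, (before, after, delta) in rows.items():
    # the report quotes Δ rounded to 5 decimal places
    assert round(after - before, 5) == delta, key
print("all deltas consistent")
```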

 

VIS (SR2)
kenta.tanaka - 0:39 Wednesday 18 March 2026 (36609)
Modification of rolloff filters of SR2 Payload local control

This work is a continuation of klog#36589.

## GAS control modification

According to yesterday's results for the GAS controls, there seemed to be gain peaking at 3-4 Hz. So I reduced the {F0, F1, BF} gains to -6 dB. After that, I measured their OLTFs with SR2 in the LOCK_ACQUISITION state. Figs. 1, 2, and 3 show the results. Unfortunately, the OLTFs could not be measured well, perhaps due to coupling between the GAS controls. According to the loop suppression (IN2/EXC), the controls seem to work well and the peaking around 3-4 Hz became smaller. Fig. 4 shows the error/feedback spectra. The peaking seems to have disappeared, although each residual GAS motion has not changed much.
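For reference, the loop-suppression reading (IN2/EXC) relates to the OLTF in a simple way: for a single loop with open-loop transfer function G, the suppression is S = 1/(1+G), so G can be estimated even when a direct OLTF measurement is noisy. A minimal sketch (the suppression value below is illustrative, not a measured SR2 number):

```python
# Estimate the open-loop TF from a measured loop suppression S = IN2/EXC,
# assuming the standard single-loop relation S = 1/(1 + G).
def oltf_from_suppression(S: complex) -> complex:
    return 1.0 / S - 1.0

# An illustrative suppression of 0.1 (-20 dB) implies |G| ≈ 9 at that frequency.
print(abs(oltf_from_suppression(0.1)))
```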

## IM DAMP roll-off modification

I also modified the roll-off filters for IM DAMP {L, P, R, Y}. As for T and V, the elliptic filter was not used in SR3 either, so I did not touch them this time. (It may be better to change them as well.)

I measured the OLTFs after the modification. Figs. 5, 6, 7, and 8 show the OLTFs of IM DAMP {L, P, R, Y}. Figs. 9 and 10 show the error/feedback spectra. The feedback above 10 Hz was rolled off successfully.

Images attached to this report
CAL (General)
takahiro.yamamoto - 22:33 Tuesday 17 March 2026 (36608)
Comment to Inspection of a network trouble between CAL and DMG for Low-Latency data transfer (36588)
Finally, this issue was solved by restoring the vanished VLAN settings on the DMG network switch (klog#36602).
The data transfer for the output of the low-latency calibration pipeline was also resumed.
For now the output is just PD dark noise with the DARM sensing response applied, but the latest LL frames are also available at Kashiwa.
DGS (General)
takahiro.yamamoto - 22:20 Tuesday 17 March 2026 (36607)
Comment to Reorganizing the Camera Network (36590)
All other camera network switches were also reconfigured.
Now, all switches are reachable from k1cam*.

-----
I couldn't find the camera switch at EY at first. This was because the same IP address was used on the switches at EY and OMC, and the OMC switch responded earlier than the EY one when I accessed that IP address. That IP address was in a different segment from the camera network, so the switches couldn't be accessed from k1cam*; I therefore changed their IP addresses to ones in the camera-network segment. The latest information can be found in the JGW wiki.

I also added camera names to the port descriptions in the switch management interface. This information helps with emergency recovery of each camera by disabling/enabling its PoE power supply from the switch management interface. When a new camera is added, please update the port description of the relevant switch for easy network management.
CRY (Cryostat IX)
nobuhiro.kimura - 21:48 Tuesday 17 March 2026 (36606)
Comment to Replacement of the Q-mass for the IXC (36578)

[Kimura and Yasui]

At around 10:00 a.m. on March 17, we stopped the vacuum pumping of the Q-mass piping, opened the cryostat-side GV, and connected the Q-mass.
The pressure inside the piping before opening the cryostat-side GV was 1.8×10⁻⁴ Pa, and after opening the GV it was 4.4×10⁻⁵ Pa.
We then installed an analysis unit on the Q-mass piping. The final pressure inside the piping was 3.5×10⁻⁵ Pa.
Finally, we removed the vacuum pumping unit from the piping.

Images attached to this comment
CRY (General)
nobuhiro.kimura - 20:43 Tuesday 17 March 2026 (36605)
Maintenance Work on the Duct-Shield Cryo-coolers for IXC and IYC was Completed

[Kimura and Yasui]

 On March 17, we completed maintenance work on the duct-shield cryo-coolers for IXC and IYC.
Maintenance work on the duct-shield cryo-coolers for EXC and EYC is also scheduled to be completed by the end of this week.
We are planning to restart the duct-shield cryo-coolers starting March 23, which will also serve as a test run.

CAL (YPcal)
dan.chen - 15:43 Tuesday 17 March 2026 (36604)
YPcal calibration

Workers: Kohei Mitsuhashi, Dan Chen

We performed monthly Pcal-Y calibration on 2026/03/17.

After the calibration, we updated EPICS parameters related to the Pcal-Y system. No issues were found.

EPICS Key                         Before    After     Δ (After − Before)
K1:CAL-PCAL_EY_1_OE_R_SET         0.98973   0.98977    0.00004
K1:CAL-PCAL_EY_1_OE_T_SET         0.98973   0.98977    0.00004
K1:CAL-PCAL_EY_1_PD_BG_RX_V_SET  -0.00484  -0.00492   -0.00008
K1:CAL-PCAL_EY_1_PD_BG_TX_V_SET   0.02704   0.01218   -0.01486
K1:CAL-PCAL_EY_1_RX_V_R_SET       0.50319   0.50276   -0.00044
K1:CAL-PCAL_EY_2_INJ_V_GAIN       0.51512   0.51836    0.00324
K1:CAL-PCAL_EY_2_OE_R_SET         0.98618   0.98609   -0.00009
K1:CAL-PCAL_EY_2_OE_T_SET         0.98618   0.98609   -0.00009
K1:CAL-PCAL_EY_2_PD_BG_TX_V_SET   0.02955   0.01466   -0.01489
K1:CAL-PCAL_EY_2_RX_V_R_SET       0.49681   0.49724    0.00044
K1:CAL-PCAL_EY_WSK_PER_RX_SET     1.84214   1.84228    0.00015
K1:CAL-PCAL_EY_WSK_PER_TX1_SET    0.33376   0.33354   -0.00023
K1:CAL-PCAL_EY_WSK_PER_TX2_SET    0.90500   0.90266   -0.00234

 

Images attached to this report
CAL (YPcal)
dan.chen - 15:38 Tuesday 17 March 2026 (36603)
Pcal-Y beam position check

A CAL Tcam session was performed to obtain beam position information necessary for Pcal. The parameters have already been updated, and SDF has been accepted.

Operator: Kohei Mitsuhashi, Dan Chen

Update Time: 2026/03/17 15:31:24

EPICS Key                    Before [mm]  After [mm]  Δ (After - Before) [mm]
K1:CAL-PCAL_EY_TCAM_PATH1_X      1.33223     0.76950  -0.56273
K1:CAL-PCAL_EY_TCAM_PATH1_Y     65.85923    65.80570  -0.05353
K1:CAL-PCAL_EY_TCAM_PATH2_X     -0.29757     1.39102  +1.68859
K1:CAL-PCAL_EY_TCAM_PATH2_Y    -69.75503   -69.21742  +0.53761

 

DMG (Data system trouble)
nobuyuki.kanda - 14:48 Tuesday 17 March 2026 (36602)
DMG switch (in KAGRA tunnel) setting
There were network outages after the AC power stop/resume on 3 March: cal-gst1 and cal-gst2 could not reach the DMG VPN network 192.168.30.x.
The reason was that the VLAN setting of DMG's network switch in KAGRA (SR-S352TR1) had been lost.
So we set connection ports #1 and #2 of cal-gst1 and cal-gst2 to the same VLAN as the DMG network ("vlan untag 30"), as in the list below.
Now the network connection is restored.

Below is part of the current setting table of SR-S352TR1.
ether 1 vlan untag 30
ether 1 description cal-gst1
ether 2 vlan untag 30
ether 2 description cal-gst2
...
ether 38 description hyades-0_irmc
ether 39 vlan untag 31
ether 39 description hyades-1_irmc
ether 40 vlan untag 31
ether 40 description kamups01
ether 49 vlan tag 30-31
ether 49 description kaml2sw02
ether 50 vlan untag 30
ether 50 description hyades-0
ether 51 vlan untag 30
ether 51 description hyades-1
ether 52 vlan untag 30
ether 52 description hyades-2

We also changed the password of SR-S352TR1.
VIS (IX)
ryutaro.takahashi - 13:51 Tuesday 17 March 2026 (36601)
Comment to Offload of GAS filters (33170)

I offloaded the F2 and BF GAS filters with the FRs.

PEM (Center)
takaaki.yokozawa - 13:36 Tuesday 17 March 2026 (36600)
Comment to Tapping test at PSL room (36594)
I have made all the plots for the tapping tests: JGWDoc17238.
AOS (Beam Reducing Telescopes)
ryutaro.takahashi - 13:34 Tuesday 17 March 2026 (36599)
Comment to Signal generator for the TMSX LVDT is dead (36501)

[Washimi, Takahashi]

We replaced the broken signal generator (SG), NF1956, with the embedded SG card of the LVDT driver (S1807787) in TMSX. We opened the driver chassis and inserted the SG card. The output of the SG card was connected to the "REF IN" port on the front panel. The signal was set to 10 kHz in frequency and 7 Vp-p in amplitude.

Images attached to this comment
PEM (Center)
takaaki.yokozawa - 12:19 Tuesday 17 March 2026 (36598)
Comment to Tapping test at PSL room (36594)
Very preliminary results, only for the mirrors before the PMC.

Fig. 1: overview (temporary; I tagged the mirrors of the new laser as M01' - M04').
Figs. 2-5: various DQ channels, including the PMC-related channels, IMMT1 trans QPD2, IMC refl QPD1/2 (DC, 14I, 14Q), IP QPD1/2, and accelerometers.

As you can see, a quite large excess appears in the IMMT1 trans QPD when we tapped the M3' and M4' mirrors, which is an interesting issue.
Images attached to this comment
CAL (YPcal)
Misato Onishi - 12:18 Tuesday 17 March 2026 (36597)
Preparation for laser replacement
Dan Chen, Kohei Mitsuhashi, Misato Onishi

We checked the following items for the replacement of the YPcal laser:
・The location for installing the new laser source --> A new rack is required
・The location for installing the periscope
DetChar (General)
hirotaka.yuzurihara - 11:07 Tuesday 17 March 2026 (36596)
Preparation for the upgrade of Pastavi server

The new Pastavi server was purchased with the SEO budget and delivered to the KAGRA site. I performed the initial setup on the server and copied the data viewer system. After completing the preparation at the KAGRA site, I sent it to Kashiwa last Friday. I will replace the server in Kamioka this Thursday. 
Please note that it may take some time to recover the noise budget mode. I expect it to be restored within a few days.

Note

  • Because the OS version jumped significantly from 9 to 12, many modifications to the settings were necessary.
  • The cgi package was removed in Python 3.13. Although it is possible to use legacy-cgi, its behavior was slightly different from what I expected, so I decided to use Python 3.12. In the future, it may be necessary to redevelop Pastavi using another framework or tool.
  • Installation of frameL was not as straightforward as in the previous environment (actually I just wanted to use the FrChannels command). The procedure for installing lal (or framel) using apt is described on the following pages: [wiki] [IGWN debian repo]
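On the cgi removal: for the common case of parsing request parameters, the standard library's urllib.parse can replace the cgi module without depending on legacy-cgi. A minimal sketch (the field names are hypothetical, not Pastavi's actual parameters):

```python
# Parse CGI-style GET parameters using only the stdlib (works on Python 3.13,
# where the cgi module was removed).  Field names here are hypothetical.
import os
from urllib.parse import parse_qs

def get_form(environ=os.environ):
    """Return form fields from QUERY_STRING, similar to cgi.parse() for GET."""
    return parse_qs(environ.get("QUERY_STRING", ""))

# e.g. a request like  pastavi.cgi?channel=K1:PEM-MIC&span=3600
fields = get_form({"QUERY_STRING": "channel=K1:PEM-MIC&span=3600"})
print(fields["channel"][0], fields["span"][0])
```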
DGS (General)
satoru.ikeda - 10:04 Tuesday 17 March 2026 (36595)
Comment to Deployment of V2 IO-chassis and the front-end computer for ITMX (36572)

ITMX has successfully reached the LOCK_ACQUISITION state.
We will leave it in this state for a while to verify that no issues arise with the new V2 I/O chassis.

PEM (Center)
takaaki.yokozawa - 6:24 Tuesday 17 March 2026 (36594)
Tapping test at PSL room
I performed the tapping test again.
Analysis results will appear soon.
CAL (Gcal general)
Kohei Mitsuhashi - 5:26 Tuesday 17 March 2026 (36584)
NCal pylons cutting
[Mitsuhashi, Dan, Sawada, Takahashi H]

We (the cutting was performed by Sawada-san) cut the bottom edges of the NCal pylons.


Images attached to this report
DGS (General)
takahiro.yamamoto - 1:01 Tuesday 17 March 2026 (36593)
Comment to Compatibility check of a2A5328-4gmPRO camera and pylon-camera-server (36390)
As I reported in klog#36590, it is difficult to get a 10G link between the IOO rack and the N1 rack now. So I put the a2A5328-4gmPRO back in the server room and connected it again to Port 1/1/24 of the core switch, on which PoE was enabled. The a2A5328-4gmPRO and k1cam1 have only 1G NICs, but k1cam1 currently operates no other cameras and the core switch has 10G capability, so the connection between k1cam1 and the a2A5328-4gmPRO can utilize the full 1G bandwidth, including the intermediate path.

In this environment, I tested taking pictures and succeeded in capturing full-resolution images in the Mono8, Mono10p, and Mono12p formats. In the limited-bandwidth environment, such high-bit-rate acquisition had failed. This means that candidates for the new TCam require enough network capacity for one TCam to occupy 1G of bandwidth. If the TCam is to be integrated into the current GigE network, at least the following upgrades will be necessary:
- The network switches at both end stations must be replaced with 10G ones.
- The front TCam must be directly connected to the server-room switch, or a 10G switch or a TCam-dedicated switch must be installed around the ITM area.
- If the license issue cannot be solved soon, replacing the Brocade switches with ones from another vendor may also be required.

It may be easier to construct the new TCam system in the same way as the current TCam (a pair of a mini(?) PC and a camera over a short distance) than to upgrade the whole GigE network.
DGS (General)
ryutaro.takahashi - 23:21 Monday 16 March 2026 (36592)
Comment to Deployment of V2 IO-chassis and the front-end computer for ITMX (36572)

I offloaded the IP H2 with the FR to recover from the saturation. The guardian state could then be switched from READY to ISOLATED.

VIS (SRM)
ryutaro.takahashi - 22:48 Monday 16 March 2026 (36591)
Comment to Assembly of new mirror (36579)

[Takahashi, Washimi, Hirata]

We replaced the window mirror (SRM-W) with the 70% mirror (SRM-M). This was a demonstration for replacement training. At the beginning, we confirmed the mirror's wedge direction: the X- side (Picture 1). We removed the black cylinder from the rear side and removed the mirror holder (Picture 2) by pushing it with a Teflon bar from the rear side (Picture 3). We attached the assembled 70% mirror to the test mass (Picture 4), confirming the wedge direction (Picture 5).

We checked the extracted window mirror. The mirror thickness was 9.6 mm at the wedge marker. Two 0.5 mm shims were used as the spacer for the flange.

Images attached to this comment
VIS (SR2)
kenta.tanaka - 22:22 Monday 16 March 2026 (36589)
Modification of rolloff filters of SR2 Tower local control

## Abstract

In order to unify the design philosophy of the Type-B local control filters, I began to modify the control filters in the SR2 tower part (IP_IDAMP_Y, {F0,F1,BF}_DAMP), especially their roll-off filters, so that the same filters as in SR3 are used in SR2.

The feedback spectra above 10 Hz seem to be reduced thanks to this modification. However, there seems to be gain peaking at 3-4 Hz in the GAS signals, so we need to adjust them.

## What we did

In order to unify the design philosophy of the Type-B local control filters, I began to modify the control filters in the SR2 tower part (IP_IDAMP_Y, {F0,F1,BF}_DAMP) so that the same filters as in SR3 are used in SR2.

Basically, the difference between the SR2 and SR3 control filters was in the roll-off filters. SR3 usually used a 4th-order elliptic low-pass filter, whereas SR2 used a 2nd- or 3rd-order pole. Therefore, the gain of the SR2 filter above 100 Hz is typically more than 20 dB larger than that of SR3, which means the SR2 filter transmits 20 dB more sensor noise above 100 Hz to the suspension. So I tried to implement the same roll-off filter in the SR2 controls.
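The roll-off difference described above can be sketched numerically with scipy (the corner frequency, ripple, and stopband values below are assumed for illustration only; they are not the actual SR2/SR3 filter parameters):

```python
# Compare a 4th-order elliptic low-pass (SR3-style roll-off) with a simple
# 2nd-order low-pass (SR2-style) at 100 Hz.  All design numbers are assumed.
import numpy as np
from scipy import signal

fs = 2048.0   # assumed sampling rate [Hz]
fc = 10.0     # assumed roll-off corner [Hz]

b_ell, a_ell = signal.ellip(4, 1, 60, fc, fs=fs)  # 1 dB ripple, 60 dB stopband
b_pol, a_pol = signal.butter(2, fc, fs=fs)        # stand-in for a 2nd-order pole

_, h_ell = signal.freqz(b_ell, a_ell, worN=[100.0], fs=fs)
_, h_pol = signal.freqz(b_pol, a_pol, worN=[100.0], fs=fs)
gain_ell = 20 * np.log10(abs(h_ell[0]))
gain_pol = 20 * np.log10(abs(h_pol[0]))
print(f"elliptic: {gain_ell:.1f} dB, 2nd-order pole: {gain_pol:.1f} dB at 100 Hz")
```

With these assumed numbers the elliptic design suppresses roughly 20 dB more at 100 Hz, the same order as the difference quoted above.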

Fig. 1 shows an example of the modification for SR2_IP_IDAMP_Y. Fig. 2 shows the MEDM screen of SR2_IP_IDAMP at that moment. I modified the roll-off filters for the GAS stages in the same manner.

After the implementation, I confirmed that SR2 could reach the LOCK_ACQUISITION state. Then I measured the spectra of the error/feedback signals of the controls in the LOCK_ACQUISITION state. Figs. 3 and 4 show the results for IP and GAS, respectively. The red curves are the current SR2 error/feedback spectra and the magenta curves are the spectra before the modification. The blue (and cyan) curves are the SR3 spectra as references. Above 10 Hz, the feedback spectra were reduced successfully. On the other hand, there seems to be gain peaking at 3-4 Hz in the GAS signals, so we need to fix it.

## Next

  • IM_DAMP_{L,T,V,R,P,Y}
Images attached to this report
DGS (General)
takahiro.yamamoto - 22:10 Monday 16 March 2026 (36590)
Reorganizing the Camera Network
I started to reorganize the camera network because several problems were found in klog#36575.

Salvaging the core network switch
The core switch of the CAM network at U38 of the N1 rack was unreachable, and I couldn't find its IP/MAC addresses via ping and arp scans. So I accessed it physically via a console cable, and found that no IP address had been set on this switch.
ICX6610-24P Switch>show ip
Switch IP address: None
Subnet mask: None
Default router address: None
TFTP server address: None
Configuration filename: None
Image filename: None
IP MTU: 9216
So I set it as follows, and then the core switch became reachable from k1cam* via telnet.
ICX6610-24P Switch>enable
ICX6610-24P Switch#configure terminal
ICX6610-24P Switch(config)#vlan 1
ICX6610-24P Switch(config-vlan-1)#ip address
ICX6610-24P Switch(config-vlan-1)#exit
ICX6610-24P Switch(config)#write memory
Flash Memory Write (8192 bytes per dot).
Flash to Flash Done.

Reset of IP address for the IOO switch
The camera switch at U19 of the IOO0 rack had an IP address, but it was inconsistent with the CAM network segment, and there was no way to access it without setting a dummy (or virtual?) IP address on k1cam*. So I set it in the same way as above. Now that switch is reachable from k1cam* via the CAM network.

Migration trial to 10GbE for the connection between N1 rack and IOO0 rack
Though the two Brocade switches are connected via SFP+ modules and OS2 fiber, the link speed is 1GbE.
telnet@ICX6610-24P Switch>show media
Port 1/3/1: Type : 10GE LR 10km (SFP +)
telnet@ICX6610-24P Switch>show interface brief
Port Link State Dupl Speed Trunk Tag Pvid Pri MAC Name
1/3/1 Up Forward Full 1G None No 1 0 cc4e.24f9.762b
So I checked various pieces of switch information one by one, and finally found that the license key for enabling the 10G link is not activated on the switch in the IOO0 rack, though it is activated on the one in the N1 rack.
telnet@ICX6610-24P Switch>show license
License record empty
I found license packages in the storage at the cryo-machine room and tried to create a license file using the transaction keys in the license packages, but this cannot be done with my Broadcom account. These transaction keys may be associated with Kokeyama-san's personal account, or the transaction keys might not have been migrated properly when Brocade was merged into Broadcom... Anyway, I'm asking the support desk and waiting for a response.

Enabling PoE on the switch in N1 rack for TCam test
I understood that it would be difficult to enable the 10G link between the N1 and IOO0 racks soon. So I enabled PoE on the switch in the N1 rack for the test of the new TCam with sufficient bandwidth. (Though PoE is enabled on the switch in the IOO0 rack, it was not enabled on the one in the N1 rack, probably because no GigE cameras are connected directly to the switch in the N1 rack.)
telnet@ICX6610-24P Switch>enable
telnet@ICX6610-24P Switch#configure terminal
telnet@ICX6610-24P Switch(config)#interface ethernet 1/1/24
telnet@ICX6610-24P Switch(config-if-e1000-1/1/24)# inline power


Remaining works
- Reset of the IP addresses of other end-point switches
- Solving license issue for enabling 10G connection
- Elimination of excessive daisy-chaining of network switches
CAL (General)
takahiro.yamamoto - 19:15 Monday 16 March 2026 (36588)
Inspection of a network trouble between CAL and DMG for Low-Latency data transfer
The low-latency data transfer has been failing since power was restored following the planned power outage.
So far, I haven't found any issue at the L3 level, and it seems to be an L2 issue.
To check the L2 settings, we need to access the 10GigE Fujitsu switch and ask the company for the login information.

-----
The reason for the data transfer failure is that the CAL servers and the DMG servers are unable to communicate with each other. These servers belong to the DMG LAN and are connected via SR-S352TR1.

At first, we suspected wrong settings on the CAL servers (the latest related works are klog#32886 and klog#35374), but the two CAL servers were reachable from each other, and the IP settings (ip a) and routing (ip r) seemed proper. The DMG servers should also be fine because the bulk data transfer is working well (and the DMG servers are reachable from each other). I also checked ARP: the MAC address hadn't been resolved either. So the CAL servers seemed to be located in a different place from the DMG servers at the L2 level. If so, the L2 settings are done on SR-S352TR1, because both cal-gst* (the CAL servers) and hyades-* (the DMG servers) are directly connected to that network switch.

Neither I nor the DMG folks know the login information for that switch, so we are now asking the company. As something we could do without logging in to the switch, I tried changing the connected network ports for cal-gst*. Currently cal-gst-1 and -2 are connected to Ports #1 and #2. Ports #31 and #33-36 are used for UPS and disk-storage devices. Ports #38-40 are used for iRMC. All of them are connected by UTP cables. Hyades-* and the network switch at Mozumi are connected to Ports #49-52 by optical fibers. Even when I used other ports for cal-gst* (except Ports #32 and #37), the situation didn't change: the CAL servers were reachable from each other but unreachable to the DMG servers. The CAL servers became unreachable from each other when I used Port #32 or #37.

Dedicated VLANs are assigned for the UPS devices and iRMC. So Ports #31-36 and #37-40 seem to be assigned a different VLAN segment from Ports #1-30 and #41-48, at least. I couldn't confirm Ports #49-52 because I didn't have the necessary equipment today. I can check them tomorrow by preparing an SFP+ module and optical fibers and connecting cal-gst* to Ports #49-52.
PEM (General)
takahiro.yamamoto - 18:02 Monday 16 March 2026 (36587)
Weather station PC at Atotsu has CMOS error

When I tried to migrate the copy process of the Weather Station web pages to k1script, I found that k1script couldn't access the Weather Station at Atotsu.
According to the last update on the web page, this has been the case since Feb. 9th.
The Weather Station PC at Atotsu currently cannot boot due to a CMOS error.
Replacing the CMOS battery (or the PC itself in the worst case) seems to be required to recover it.

-----
The copy process of the Weather Station web pages was migrated from the old k1script1 to the new k1script0. This process works fine for the Weather Station at Mozumi. On the other hand, the one for Atotsu couldn't access the Weather Station PC. Because the old process on k1script1 also failed, I concluded that the problem is on the Weather Station PC, not on k1script.

No remote connections were available, so I checked the situation by plugging a display into the Weather Station PC at Atotsu, and found that gathering hardware information before booting the OS failed due to a CMOS trouble, as shown in Fig. 1. This message is also shown when a CMOS battery (often a CR2032 on common servers) becomes empty. Of course, there is some possibility of a motherboard malfunction. Anyway, what we can try first is opening the PC chassis and then replacing the CMOS battery if there is no obvious malfunction on the motherboard.

Images attached to this report