The data acquisition system of the XMASS experiment

The XMASS experiment aims to detect dark matter, pp and 7Be solar neutrinos, and neutrinoless double beta decay using highly purified liquid xenon located underground in the Kamioka Observatory in Japan. Construction of the detector was completed in September 2010 and data-taking started in December 2010. This paper discusses the design of the XMASS data acquisition system and its capabilities.


Introduction
The XMASS experiment is designed to pursue multiple physics goals: the detection of dark matter, pp and 7Be solar neutrinos, and neutrinoless double beta decay, using highly purified liquid xenon in an ultra-low-radioactivity environment [1]. The XMASS detector is located underground (2,700 m water equivalent) in the Kamioka Observatory in Japan. Figure 1 shows a schematic drawing of the detector. It is a single-phase liquid xenon scintillator detector containing 835 kg of liquid xenon in its active region. The volume is viewed by 630 hexagonal and 12 cylindrical Hamamatsu R10789 photomultiplier tubes (PMTs) arranged on an 80 cm diameter pentakis-dodecahedron support structure, whose surface consists of 60 isosceles triangles. A total photocathode coverage of more than 62% is achieved. To shield the liquid xenon detector from external gammas, neutrons, and muon-induced backgrounds, the copper vessel containing it is placed at the center of a ϕ10 m × 11 m cylindrical tank filled with pure water. The water tank is equipped with 72 Hamamatsu R3600 20-inch PMTs to provide both an active muon veto and passive shielding against these backgrounds. The liquid xenon and water Cherenkov detectors are referred to as the Inner Detector (ID) and the Outer Detector (OD), respectively. Detailed descriptions of the XMASS detector can be found elsewhere [2]. Construction of the detector started in April 2007 and was completed in September 2010. Commissioning runs were conducted from December 2010 until June 2012. After a year of detector refurbishment, data-taking resumed in November 2013 and is continuing.
In this paper, the readout electronics, data acquisition, and online data monitoring system of the XMASS experiment are presented.

Readout electronics
Figure 2 shows a schematic diagram of the readout electronics. Signals from the 642 inner detector PMTs are amplified by a factor of 11 using custom-made preamplifier cards and are then fed into analog-timing modules (ATMs) as well as CAEN V1751 Flash analog-to-digital converters (Flash-ADCs). The ATMs were originally built for and used in Super-Kamiokande I-III [3] and are reused in the XMASS detector. They function as ADCs and time-to-digital converters (TDCs), recording the integrated charge and arrival time of each PMT signal. The dynamic range is approximately 450 pC (corresponding to approximately 120 photoelectrons (PEs)) with a resolution of 0.2 pC (0.05 PE) for the ADC, and approximately 1300 ns with a resolution of 0.4 ns for the TDC. Sixty ATM boards serve the ID.
The CAEN V1751 Flash-ADCs sample at 1 GHz with 500 MHz analog bandwidth. Their dynamic range is 1 V with 10-bit (1 mV) resolution, corresponding to a 0.05 PE pulse height. The acquisition window is 10 µs (approximately 1 µs before the trigger and 9 µs after it). An on-board FPGA allows the ADCs to be operated in a mode where only the parts of the waveform around a peak are recorded, discarding data in a band around the baseline (zero-length encoding). We employ 84 V1751 Flash-ADC boards in 6 VME crates.
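The baseline-suppression idea can be sketched in software. The following is a minimal illustration of zero-length encoding, not the FPGA's actual algorithm; the threshold band and the context-window lengths are hypothetical parameters chosen for illustration.

```python
def zle_compress(waveform, baseline, band=3, pre=8, post=8):
    """Return (start_index, samples) chunks of `waveform` that deviate
    from `baseline` by more than `band` ADC counts, padded with
    `pre`/`post` samples of context; the rest is discarded."""
    keep = [False] * len(waveform)
    for i, s in enumerate(waveform):
        if abs(s - baseline) > band:
            lo = max(0, i - pre)
            hi = min(len(waveform), i + post + 1)
            for j in range(lo, hi):
                keep[j] = True
    chunks = []
    i = 0
    while i < len(waveform):
        if keep[i]:
            start = i
            while i < len(waveform) and keep[i]:
                i += 1
            # one contiguous window around a pulse
            chunks.append((start, waveform[start:i]))
        else:
            i += 1
    return chunks
```

For a mostly flat waveform with isolated PMT pulses, only short windows around the pulses survive, which is what makes the 10 µs acquisition window affordable in practice.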
Ten or eleven PMTs mounted on the same triangular holder are connected to one ATM board. Each ATM outputs a PMTSUM signal, an analog sum of the 12 channels on the board, attenuated by a factor of eight relative to the input signals. The 60 PMTSUM signals are fed into CAEN V1721 Flash-ADCs, which sample at 500 MHz. The dynamic range of these ADCs is 1 V with 8-bit (4 mV) resolution, corresponding to a 1.25 PE pulse height. The acquisition window is 4 µs (approximately 1 µs before the trigger and 3 µs after it). Eight of these Flash-ADC boards are housed in one VME crate.
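As a quick consistency check of the quoted resolutions, the least-significant-bit size follows directly from the full-scale range and bit depth; the PE equivalents quoted in the text then reflect the single-PE pulse heights at each digitizer's input.

```python
def lsb_mv(full_scale_mv, bits):
    """Least-significant-bit size of an ADC in millivolts."""
    return full_scale_mv / 2**bits

v1751_lsb = lsb_mv(1000, 10)  # ~1 mV, quoted as 0.05 PE
v1721_lsb = lsb_mv(1000, 8)   # ~4 mV, quoted as 1.25 PE
```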
Signals from the 72 outer detector PMTs are likewise processed and recorded by 6 ATM boards.

Clock synchronization
To synchronize all the Flash-ADC boards, a custom-made master clock module provides a 62.5 MHz clock signal with an accuracy of ±0.5 ppm and a jitter of 13 ps RMS. It has 8 LVDS outputs as well as a NIM output. An LVDS clock signal is distributed to one Flash-ADC board in each crate and passed from board to board within the crate. The sampling clock of each Flash-ADC board is thus locked to the master clock.

Triggering
When an ATM input signal falls below the −5 mV threshold, corresponding to approximately 0.2 PE in the ID and 0.4 PE in the OD, a rectangular signal 200 ns wide and 15 mV high is generated. An output signal on the ATM front panel called HITSUM is formed by summing the rectangular signals of all channels on that ATM module. All the HITSUM signals from the ID ATMs are summed again by a summing amplifier to form the global ID HITSUM signal. When the global ID HITSUM signal exceeds the ID trigger threshold, an ID trigger is issued. A separate OD trigger is generated in the same analog manner from the signals on the OD ATM boards. The ID and OD trigger signals, as well as GPS 1PPS trigger signals, are fed into a custom-made VME trigger module (TRG), which was also used in Super-Kamiokande I-III. For any type of trigger input, the TRG generates a global trigger signal and a 16-bit event number and sends them to both the ATMs and the Flash-ADCs to initiate data collection for the event. The TRG module records the trigger time with a resolution of 20 ns, together with the trigger type and event number.

Data acquisition
Figure 3 shows the data stream from the readout electronics to the offline computing system. The online computing consists of data readout servers, event-building servers, online storage, and a run control server. On each data readout server, three processes run in parallel: a collector, a sorter, and a sender. The collector, triggered by a hardware interrupt, reads out the data and stores it in a ring buffer in shared memory. The ATM, TRG, and GPS data are read out via an SBS Bit 3 PCI-to-VME adapter, while the Flash-ADC data are read out via a CAEN A3818 controller with a maximum transfer speed of 80 MB/s per optical link. Multiple events are accumulated in each electronics board and read out board by board in a single transfer to improve throughput.
The sorter then reads the data from the shared memory, rearranges it into time order, and puts it back into the ring buffer. The sender reads the sorted data and sends it to an event-building server over TCP/IP at the request of the event builder. Two event-building servers run independently. The data from the ATM/TRG, Flash-ADC, and GPS streams are written separately into 60 TB of online storage as binary files. These files are then moved to the offline mass storage, where the ATM/TRG, Flash-ADC, and GPS data are merged using the event number assigned and distributed by the TRG, and converted to the ROOT-based format used in offline analyses. Data acquisition is controlled by a run control program built on GTK+ 2. Figure 4 shows a screenshot of the run controller's graphical user interface. It manages the run configuration, such as the run mode (normal physics run, detector calibration, etc.), shift information, and hardware settings.
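The offline merge step described above joins the separately written streams on the TRG event number. The sketch below illustrates the idea; the record layout and field names are hypothetical placeholders, not the actual XMASS data format.

```python
def merge_streams(atm_trg, flash_adc, gps):
    """Join per-event records from the three streams on their event
    number; events are anchored on the ATM/TRG stream, and missing
    partner records are left as None."""
    flash_by_id = {rec["event"]: rec for rec in flash_adc}
    gps_by_id = {rec["event"]: rec for rec in gps}
    merged = []
    for rec in atm_trg:
        ev = rec["event"]
        out = dict(rec)
        out["flash_adc"] = flash_by_id.get(ev)  # None if absent
        out["gps"] = gps_by_id.get(ev)
        merged.append(out)
    return merged
```

Since the TRG event number is only 16 bits, a real implementation would also have to resolve counter wrap-around, e.g. by combining the event number with the recorded trigger time.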

Online data monitoring
The online data monitoring system consists of a server process and viewer processes. The monitor server process, running on the event-building server, receives data from the event builder through shared memory, decodes it, and fills it into histograms. A viewer process connects to the server and receives histograms over TCP/IP; multiple viewers can be connected to the server at the same time. Since the ATM/TRG and Flash-ADC event-building processes run in parallel, there are two online data monitoring systems as well. Figure 5 shows a screenshot of the online data monitor viewer for the Flash-ADC data stream. The online data monitoring system allows a shifter to identify problems in the electronics and data quality as soon as possible.

Performance and future improvements
During normal physics data-taking, the trigger rate is approximately 5 Hz with a total data rate of 2 MB/s. For detector calibrations, the XMASS data acquisition system achieved stable data-taking at trigger rates up to ∼200 Hz and a maximum data rate of 60 MB/s. If a star as close as Betelgeuse were to explode as a supernova, the XMASS detector would observe on the order of 10⁴ coherent neutrino-nucleus scattering events, most of them arriving within 1 s. The current XMASS data acquisition system cannot keep up with such a burst, so a speed-up of the readout system is planned.
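A rough estimate makes the shortfall concrete. Assuming, for illustration only, that the average event size implied by normal data-taking (2 MB/s at 5 Hz) also applies to burst events:

```python
# Back-of-the-envelope burst-rate estimate; assumes burst events have
# the same average size as normal physics events, which is a
# simplifying assumption for illustration.
normal_rate_hz = 5
normal_data_mb_s = 2.0
event_size_mb = normal_data_mb_s / normal_rate_hz  # ~0.4 MB/event

burst_events = 1e4     # order of magnitude quoted for Betelgeuse
burst_window_s = 1.0
burst_data_mb_s = burst_events / burst_window_s * event_size_mb

max_daq_mb_s = 60      # maximum demonstrated in calibration runs
shortfall = burst_data_mb_s / max_daq_mb_s
```

Under these assumptions the burst would demand on the order of 4 GB/s, several tens of times beyond the demonstrated 60 MB/s, which motivates the planned speed-up.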

Conclusions
The XMASS experiment has designed and implemented readout electronics and a data acquisition system that meet the requirements of low-energy, ultra-low-radioactivity measurements. Although the trigger rate is not high, the event size is large because waveform data from 642 PMTs are acquired with Flash-ADCs. Dedicated data acquisition systems were therefore developed.