
Table of contents

Volume 645

2015


High Energy Particle Physics Workshop (HEPPW2015) 11–13 February 2015, Johannesburg, South Africa

Published online: 15 October 2015

Preface

011001
The following article is Open access


The motivation for this workshop began with the discovery of the Higgs boson three years ago, and the realisation that many problems remain in particle physics, such as why there is more matter than anti-matter, how to better determine the still poorly measured parameters of the strong force, possible sources of dark matter, and naturalness. While the newly discovered Higgs boson seems to be compatible with the Standard Model, current experimental accuracy is far from providing a definitive statement with regard to the nature of this new particle. There is a lot of room for physics beyond the Standard Model to emerge in the exploration of the Higgs boson. Recent measurements in high-energy heavy ion collisions at the LHC have shed light on the complex dynamics that govern high-density quark-gluon interactions. An array of results from the ALICE collaboration has been highlighted in a recent issue of the CERN Courier. The physics program of high-energy heavy ion collisions promises to further unveil the intricacies of high-density quark-gluon plasma physics.

The great topicality of high energy physics research has also seen a rapid increase in the number of researchers in South Africa pursuing such studies, both experimentally, through the ATLAS and ALICE experiments at CERN, and theoretically. These research groups are largely populated by young researchers and graduate students, who have little experience in presenting their work and few support structures (to their knowledge) with which to share experiences. Whilst many schools and workshops have sought to educate these students on the theories and tools they will need to pursue their research, few have provided them with a platform to present their work. As such, this workshop discussed the various projects being pursued by graduate students and young researchers in South Africa, enabling them to develop networks for future collaboration and discussion.

The workshop took place at the iThemba Laboratories North facility in Gauteng, from the 11th to the 13th of February 2015. Excellent conference facilities, with outdoor and indoor tea areas for discussions and interactions, were provided, along with state-of-the-art remote access to the conference venue, so that those who were unable to attend the workshop in person could also be present. The laboratory is located next door to the Wits Professional Development Hub (on the corner of Jan Smuts Avenue and Empire Road), which provided the catering for this workshop. The format across our three days was a morning plenary session followed by 15+10 minute presentations. The topics covered were high-energy theory and phenomenology (heavy ion, pp, ep and ee collisions), ATLAS physics and ALICE physics. The workshop website is http://hep.wits.ac.za/HEPPW2015.php

011002
The following article is Open access

Details of the conference sponsors, organising committee and speakers can be found in the PDF.

011003
The following article is Open access

All papers published in this volume of Journal of Physics: Conference Series have been peer reviewed through processes administered by the proceedings Editors. Reviews were conducted by expert referees to the professional and scientific standards expected of a proceedings journal published by IOP Publishing.

Papers

Theoretical collider

012001
The following article is Open access


The possibility of generating a large trilinear At soft supersymmetry breaking coupling at low energies through renormalisation group evolution in the 5D MSSM is investigated. Using the power-law running in five dimensions and a compactification scale in the 10–10³ TeV range, we show that the gluino mass may drive a large enough At to reproduce the measured Higgs mass and to allow a light stop superpartner below ∼1 TeV, as preferred by the fine-tuning argument for the Higgs mass.

012002
The following article is Open access

A brief review of the physics beyond the Standard Model is given, as presented at the High Energy Particle Physics Workshop on the 12th of February 2015 at the iThemba North Labs. Particular emphasis is given to the Minimal Supersymmetric Standard Model, with extra-dimensional theories also mentioned.

012003
The following article is Open access

With the Large Hadron Collider already able to produce collisions at an energy of 8 TeV, the formation of higher-dimensional black holes may soon be possible. In order to determine whether we are detecting these higher-dimensional black holes, we need a theoretical understanding of what the signatures of such black holes could be. As such, we discuss quasi-normal modes (QNMs) for spin-3/2 fields as they travel through a black hole background. We begin by studying possible QNMs for N-dimensional Schwarzschild black holes and extend this by looking at N-dimensional Kerr black holes. We use the Wentzel-Kramers-Brillouin (WKB) approximation to determine the QNMs for the two types of black holes described above.
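As a flavour of the lowest-order WKB recipe mentioned above, the quasi-normal frequencies follow from the height and curvature of the effective potential at its peak, ω² ≈ V(x₀) − i(n + ½)√(−2V″(x₀)). The sketch below applies this to the Pöschl-Teller potential V(x) = V₀/cosh²(x), a standard toy case with exactly known QNMs, rather than to the Schwarzschild or Kerr potentials treated in the paper.

```python
import cmath

def wkb_qnm(V0, n):
    """Lowest-order WKB estimate of a quasi-normal frequency for the
    Poeschl-Teller potential V(x) = V0 / cosh^2(x), whose peak is at x = 0:
    omega^2 = V(0) - i (n + 1/2) sqrt(-2 V''(0))."""
    V_peak = V0
    Vpp_peak = -2.0 * V0          # d^2/dx^2 [V0 sech^2(x)] at x = 0
    omega2 = V_peak - 1j * (n + 0.5) * cmath.sqrt(-2.0 * Vpp_peak)
    return cmath.sqrt(omega2)

def exact_qnm(V0, n):
    """Exact Poeschl-Teller quasi-normal frequency, for comparison."""
    return cmath.sqrt(V0 - 0.25) - 1j * (n + 0.5)
```

For a tall barrier (V₀ ≫ 1) the estimate tracks the exact value closely; higher-order WKB corrections, as used for realistic black-hole potentials, shrink the residual difference.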

012004
The following article is Open access


An outline for research to be done is given. In the heavy ion experiments at RHIC and the LHC, it is widely believed that a state of matter known as the quark-gluon plasma (QGP) has been produced. The so-called hard particles, or particles with very high momentum that are produced as a consequence of the asymptotic freedom of QCD, can be used as tomographic probes of the QGP. We will study the way in which energy is dissipated in this QGP by calculating, in pQCD, short path length corrections to the well-known energy loss formulae. This calculation is necessary to address the discovery at the LHC that shockingly small systems appear to exhibit collective behaviour.

012005
The following article is Open access

In these proceedings, studies on the prospects of single top production at the Large Hadron Electron Collider (LHeC) and double Higgs production at the Future Circular Hadron-Electron Collider (FCC-he) are presented. In particular, we investigate the tbW couplings via single top quark production with the introduction of possible anomalous Lorentz structures, and measure the sensitivity to the Higgs self-coupling (λ) through double Higgs production. The studies are performed with 60 GeV electrons colliding with 7 (50) TeV protons for the LHeC (FCC-he).

For the single top case, a parton-level study has been performed, and we find the sensitivity to the anomalous coupling at 95% C.L., considering 10–1% systematic errors. Double Higgs production has been studied with assumed detector parameters, and the sensitivity to λ is estimated via a cross-section study around the Standard Model Higgs self-coupling strength (λSM), considering a 5% systematic error in signal and backgrounds. The effects of non-standard CP-even and CP-odd couplings for the hhh, hWW and hhWW vertices have been studied and constrained at 95% C.L.

012006
The following article is Open access


AdS/CFT computations have been used to describe the energy loss of QCD-like particles moving through a strongly coupled plasma, but little is understood regarding the initial conditions of these jets. We use the Schwinger-Keldysh finite-time formalism applied to an interacting scalar field theory to derive perturbative expressions detailing the system which exists during the initial stages of a high energy collision.

In this paper we calculate 〈ϕ〉(x) for a scalar Yukawa model, demonstrate the finiteness of the energy-momentum tensor for λϕ4 to leading order, and derive an expression for the conditional expectation value of operators to aid in the description of jet-like behaviour in interacting theories.

012007
The following article is Open access


Within the expanding fireball formed in a heavy ion collision, jets are produced which probe the QGP. Analysing the energy loss of these energetic partons as they travel through the QGP may reveal extremely valuable information about the dynamics of the plasma and exhibit distinctive properties such as jet quenching. The AdS/CFT correspondence, which posits a duality between gauge theory and gravity, is a novel tool that provides valuable insight into the strongly coupled plasma. One of its most important results is the calculation of the shear viscosity to entropy density ratio, which is in remarkable agreement with hydrodynamic predictions. We study the energy loss rate of light quarks via the AdS/CFT correspondence in both static and expanding plasmas. In the hope of making contact with QGP physics, we propose a novel jet prescription based on the separation of hard and soft modes in the dual theory and test the AdS/CFT approach against the latest light hadron suppression data from CMS.

Experimental collider

012008
The following article is Open access

Accurate jet reconstruction is necessary for understanding the link between the unobserved partons and the jets of observed collimated colourless particles into which the partons hadronise. Understanding this link sheds light on the properties of these partons. A review of various common jet algorithms is presented, namely the kt, anti-kt, Cambridge/Aachen, iterative cone and SISCone algorithms, highlighting their strengths and weaknesses. If one is interested in studying jets, the anti-kt algorithm is the best choice; however, if one's interest is in jet substructure, then the Cambridge/Aachen algorithm would be the best option.
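The sequential-recombination algorithms reviewed above (kt, Cambridge/Aachen, anti-kt) differ only in the exponent p of the generalised-kt distance measure, d_ij = min(pt_i^2p, pt_j^2p) ΔR_ij²/R². The sketch below is a toy clusterer illustrating that measure, not the FastJet implementation used in practice: it uses simple pt-weighted recombination rather than the full four-momentum E-scheme, and ignores azimuthal wrap-around when merging.

```python
import math

# Toy sequential-recombination jet clustering for the generalised-kt family.
# Particles are (pt, rapidity, phi) tuples; p selects the algorithm:
#   p = 1 -> kt, p = 0 -> Cambridge/Aachen, p = -1 -> anti-kt.

def dij(a, b, p, R):
    """Pairwise distance d_ij = min(pt_a^2p, pt_b^2p) * dR^2 / R^2."""
    dphi = abs(a[2] - b[2])
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi
    dR2 = (a[1] - b[1]) ** 2 + dphi ** 2
    return min(a[0] ** (2 * p), b[0] ** (2 * p)) * dR2 / R ** 2

def diB(a, p):
    """Particle-beam distance d_iB = pt^2p."""
    return a[0] ** (2 * p)

def cluster(particles, p=-1, R=0.4):
    """Repeatedly merge the closest pair; promote to a jet when the beam
    distance is smallest. Returns the list of jets as (pt, y, phi)."""
    objs = list(particles)
    jets = []
    while objs:
        best = (diB(objs[0], p), 0, None)
        for i in range(len(objs)):
            d = diB(objs[i], p)
            if d < best[0]:
                best = (d, i, None)
            for j in range(i + 1, len(objs)):
                d = dij(objs[i], objs[j], p, R)
                if d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        if j is None:                 # beam distance smallest: i becomes a jet
            jets.append(objs.pop(i))
        else:                         # merge i and j (simplified pt-weighted scheme)
            a, b = objs[j], objs[i]
            pt = a[0] + b[0]
            y = (a[0] * a[1] + b[0] * b[1]) / pt
            phi = (a[0] * a[2] + b[0] * b[2]) / pt
            objs.pop(j); objs.pop(i)
            objs.append((pt, y, phi))
    return jets
```

With p = -1 (anti-kt), hard particles accrete nearby soft ones first, which is why the resulting jets have the regular, cone-like shapes that make anti-kt the default choice for jet studies.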

012009
The following article is Open access

The Large Hadron Electron Collider (LHeC) at the European Laboratory, CERN, is expected to collide electrons and protons at high energy. Studies pertaining to the feasibility of observing the Higgs boson in this environment were reported in the Conceptual Design Report. Here the effect of decreasing the electron energy in an ep collision is studied, in order to find the optimal, economic electron energy for the study of the Higgs boson at the future LHeC. Two production mechanisms are addressed: a Higgs boson production process in an ep collision, and a background process. The electron energy was varied between 10 GeV and 100 GeV in increments of 10 GeV. The results obtained in this study show that using an electron energy between 40 GeV and 60 GeV would be sufficient to measure the properties of the Higgs boson without compromising the validity of the obtained results.

012010
The following article is Open access

When simulations based on the Standard Model (SM) of particle physics are compared to data obtained by the ATLAS and CMS experiments at the CERN Large Hadron Collider (LHC), an excess is seen in the transverse momentum cross-sectional data above what is predicted by the simulations. In order to make predictions for higher centre-of-mass energies at the LHC, simulations of processes resulting in the production of Higgs bosons were done for different centre-of-mass energies. At the energy scales seen at the LHC, the SM predicts that the main production mechanism for Higgs bosons is gluon fusion. The production of a Higgs boson in this manner must be accompanied by the production of one or more other particles in order for the Higgs boson to acquire transverse momentum (pT): since there is no transverse momentum coming into the collision, conservation of momentum requires two or more particles with opposite pT to be produced in order for each to have non-zero pT. If a heavy scalar boson produced in this interaction decays into a Higgs boson and some other particle, the emission of this other particle would give the Higgs boson extra transverse momentum above what is predicted by the SM.

012011
The following article is Open access

The neutral CP-odd boson A is predicted by many models with an extended Higgs sector. Searching for the A boson in the sensitive A → Zh decay, where h is assumed to be the LHC-discovered Higgs boson, within the mass range of 220–1000 GeV offers a gateway to finding physics beyond the Standard Model. A search for a gluon-fusion-produced A in the decay to Zh, with a final state of two light leptons and two tau leptons, is conducted with 20.3 fb⁻¹ of proton-proton collision data at 8 TeV centre-of-mass energy. The data-driven background estimations, background reduction techniques and systematic uncertainty calculations are presented. Upper limits on the cross section times branching ratio of the A boson decaying to ℓℓττ are set for various 2-Higgs-Doublet-Model (2HDM) scenarios. As no excess is observed, exclusion limits are set on ranges of the 2HDM phase space.

012012
The following article is Open access

The missing transverse momentum (ETmiss) in particle collider experiments is defined as the momentum imbalance in the plane transverse to the beam axis: the negative vector sum of the momenta of all the particles involved in the pp collision of interest. A precise measurement of ETmiss is essential for many physics studies at the LHC, such as Higgs boson searches and measurements, as well as searches beyond the Standard Model. The ETmiss measurement is constructed from the reconstructed and calibrated energy deposits inside the calorimeters, a method that has historically served experiments well, but one which is sensitive to fluctuations from noise and, in particular, to additional unrelated collisions within the same event - an effect that is becoming more critical with the increasing luminosity of the LHC. A complementary method for measuring the missing transverse momentum is presented, in which track momenta are used in place of the calorimeter energy measurements, allowing the calculation to be made from particles originating solely from the collision vertex of interest. The reconstruction of this track-based missing transverse momentum, pTmiss, and its performance in W and Z boson events, is described here.
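The definition above, the negative vector sum of transverse momenta (here of the tracks from the chosen vertex), reduces to a few lines of code. The sketch below is a generic illustration of that definition, not ATLAS reconstruction software; the input is assumed to be a list of (pt, phi) pairs for the selected tracks.

```python
import math

# Toy track-based missing transverse momentum: the negative vector sum of
# the transverse momenta of the selected tracks. Returns the magnitude and
# azimuthal angle of the imbalance.

def missing_pt(tracks):
    px = -sum(pt * math.cos(phi) for pt, phi in tracks)
    py = -sum(pt * math.sin(phi) for pt, phi in tracks)
    return math.hypot(px, py), math.atan2(py, px)
```

A single undetected neutrino recoiling against one 50 GeV track shows up as 50 GeV of missing pT opposite in azimuth, while a balanced back-to-back event yields essentially none.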

012013
The following article is Open access

Using a classical gluon cascade, we study the thermalisation of a gluon plasma in a homogeneous box by considering the time evolution of the entropy, and in particular how the thermalisation time depends on the strong coupling αs. We then partition the volume into cells with a linearly increasing temperature gradient in one direction, homogeneous and isotropic in the other two directions. We allow the gluons to stream in one direction in order to study how they then evolve spatially. We examine cases with and without collisions. We study the entropy as well as the flow velocity in the z-direction and find that the system initially has a flow which dissipates over time as the gluons become distributed homogeneously throughout the box.

012014
The following article is Open access

The missing transverse energy plays a crucial role in reconstructing events produced at hadron colliders. Undetectable particles, such as neutrinos, pass through matter with a negligible probability of interaction. Hence, no direct evidence of them can be measured in a general-purpose detector such as ATLAS. However, the total momentum in the plane transverse to the beam axis has to be conserved, and can be computed. In particular, it is used in searches for Standard Model Higgs boson channels such as H→WW, H→ZZ and H→ττ. The benefit of using this conservation law is that an energy imbalance may signal the presence of such undetectable particles. It therefore also becomes a powerful tool for new physics searches at the Large Hadron Collider, such as Supersymmetry and Extra Dimensions. The performance of the missing transverse momentum reconstruction in the ATLAS detector is evaluated using data collected in 2012 in proton-proton collisions at a centre-of-mass energy of 8 TeV. An optimised reconstruction of missing transverse momentum is used, and the effects arising from additional proton-proton interactions superimposed on the hard physics process are suppressed with various methods. Results are shown for a data sample corresponding to an integrated luminosity of about 20 fb⁻¹ and for events with different topologies, with or without genuine missing transverse momentum due to undetected particles.

012015
The following article is Open access

The Standard Model (SM) of particle physics, completed with the discovery of the Higgs boson, is a model of the known fundamental particles and their interactions. The data taken in the 2012 run were compared to Monte Carlo simulations, and an excess was found in the Higgs transverse momentum in the di-photon and ZZ decay channels. A possible explanation is that a beyond-the-SM scalar boson is being produced, which then decays into a dark matter particle and a Higgs boson that looks like the current SM one. This dark matter particle would provide the Higgs with excess momentum, which may account for the discrepancy observed. A first attempt at modelling the production of this heavier-than-SM-Higgs scalar boson showed that, as the centre-of-mass energy increases, the production cross-section of the scalar boson increases faster than that of the SM Higgs boson. This indicates that if the hypothesis is true, we should expect greater Higgs boson production during the 2015 run at higher centre-of-mass energies. A better understanding of the observed excess is needed before any further conclusions can be made.

012016
The following article is Open access

A Standard Model Higgs-boson-like particle was observed in July 2012 by both the ATLAS and CMS collaborations at the LHC. Since the discovery, measurements of the Higgs boson properties have been performed in the channel in which the Higgs boson decays into two photons. Higgs boson production by gluon-gluon fusion, by vector boson fusion and in association with a W or Z boson or a top-quark pair is measured in this final state. A multivariate analysis method is applied to extract the Higgs boson in the vector-boson-fusion-enriched category to enhance the signal significance. The couplings of each production mode are measured in ATLAS using 20.3 fb⁻¹ of 2012 data taken at a centre-of-mass energy of 8 TeV and 4.5 fb⁻¹ of 2011 data taken at 7 TeV. No significant deviation from the Standard Model prediction is found. The fiducial and differential cross sections of the Higgs boson are also measured in this final state using the 2012 data. The distributions of several kinematic variables of the two photons and jets are studied. The results are compared with several theoretical predictions.

012017
The following article is Open access

Recent analysis of Large Hadron Collider (LHC) Run 1 data has shown an apparent mismodelling in the Higgs pT in the ATLAS γγ and four-lepton decay channels, as well as the CMS four-lepton decay channel. This has been referred to as the pT crisis, and it is postulated that it can be explained by the Higgs being produced in combination with a dark matter particle. A minimal Z' is modelled as a dark matter mediator and shown to have cross sections too low to satisfy current experimental constraints. A two-Higgs-doublet model providing a form of dark matter is then argued to be a viable solution to an excess of double Higgs production events in Run 1 LHC data, and an experimental analysis done by ATLAS is given as motivation. I conclude by noting that these analyses will be enhanced by data from Run 2 of the LHC.

Instrumentation

012018
The following article is Open access

The Large Hadron Collider at CERN generates enormous amounts of raw data, which presents a serious computing challenge. After planned upgrades in 2022, the data output from the ATLAS Tile Calorimeter will increase by 200 times, to over 40 Tb/s. Advanced and characteristically expensive Digital Signal Processors (DSPs) and Field Programmable Gate Arrays (FPGAs) are currently used to process this quantity of data. It is proposed that a cost-effective, high-data-throughput Processing Unit (PU) can be developed by using several ARM Systems on Chip in a cluster configuration, to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. ARM is a cost-effective and energy-efficient alternative CPU architecture to the long-established x86 architecture. This PU could be used for a variety of high-level algorithms on the high-throughput raw data. An Optimal Filtering algorithm has been implemented in C++ and several ARM platforms have been tested. Optimal Filtering is currently used in the ATLAS Tile Calorimeter front-end for basic energy reconstruction and is currently implemented on DSPs.
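As a rough illustration of the Optimal Filtering idea, energy reconstruction amounts to a weighted sum of the digitized samples, A = Σᵢ aᵢsᵢ, with weights chosen to minimise the noise variance subject to unit response to the pulse shape and zero response to a constant pedestal. The sketch below derives such weights for a made-up pulse shape under a white-noise assumption; the real TileCal coefficients also account for timing and noise correlations, so this is illustrative only.

```python
# Toy Optimal Filtering: reconstruct a pulse amplitude as A = sum_i a_i * s_i.
# The weights minimise the noise (assumed white) subject to two constraints:
# unit response to the normalised pulse shape g, and zero response to a
# constant pedestal.

def of_weights(g):
    """Solve the constrained least-squares problem analytically (2x2 system:
    minimise sum a_i^2 subject to a.g = 1 and a.1 = 0)."""
    n = len(g)
    gg = sum(x * x for x in g)       # g . g
    gs = sum(g)                      # g . 1
    det = gg * n - gs * gs
    alpha = n / det
    beta = -gs / det
    return [alpha * x + beta for x in g]

def amplitude(weights, samples):
    """Weighted sum of the digitized samples."""
    return sum(w * s for w, s in zip(weights, samples))
```

For samples sᵢ = pedestal + A·gᵢ the pedestal term drops out by construction and the weighted sum returns A directly, which is what makes the method cheap enough for front-end DSPs.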

012019
The following article is Open access


The radiation damage in the polyvinyl toluene based plastic scintillator EJ200, obtained from Eljen Technology, was investigated. This forms part of a comparative study conducted to aid the upgrade of the Tile Calorimeter of the ATLAS detector, during which the gap scintillators will be replaced. Samples were subjected to 6 MeV proton irradiation using the tandem accelerator of iThemba LABS, with doses of approximately 0.8 MGy, 8 MGy, 25 MGy and 80 MGy. The optical properties were investigated using transmission spectroscopy and light yield analysis, whilst structural damage was assessed using Raman spectroscopy. Findings indicate that for the dose of 0.8 MGy no structural damage occurs, and light loss can be attributed to a breakdown in the light transfer between base and fluor dopants. For doses of 8 MGy to 80 MGy, structural damage leads to possible hydrogen loss in the benzene ring of the PVT base, which forms free radicals. This results in an additional absorptive component, causing increased transmission loss and light yield loss with increasing dose.

012020
The following article is Open access


An upgrade of the Large Hadron Collider (LHC) is scheduled for 2022, in order to increase its instantaneous luminosity. The High Luminosity LHC, also referred to as the Phase-II upgrade, necessitates a complete redesign of the read-out electronics in the Tile Calorimeter (TileCal) of the A Toroidal LHC Apparatus (ATLAS) detector. Here, the new read-out architecture is expected to have the front-end electronics transmit fully digitized detector information to the back-end electronics system. Fully digitized signals will allow more sophisticated reconstruction algorithms, which will contribute to the required improved triggers at high pile-up. In Phase II, the current Mobile Drawer Integrity ChecKing (MobiDICK) test-bench will be replaced by the next-generation test-bench for the TileCal superdrawers: Prometeo (A Portable ReadOut ModulE for Tilecal ElectrOnics). Prometeo is a portable, high-throughput electronic system for the full certification of the front-end electronics of the ATLAS TileCal. It is designed to interface to the fast links and perform a series of tests on the data to certify the electronics. The Prometeo prototype is being assembled by the University of the Witwatersrand and installed at CERN for further development, tuning and tests. This article describes the overall design of the new Prometeo and how it fits into the TileCal electronics upgrade.

012021
The following article is Open access


The influence of radiation on the light transmittance of plastic scintillators was studied experimentally. The high optical transmittance of plastic scintillators makes them essential to the effective functioning of the Tile Calorimeter of the ATLAS detector at CERN. This significant role played by the scintillators makes this research imperative in the movement towards the upgrade of the Tile Calorimeter. The radiation damage of polyvinyl toluene (PVT) based plastic scintillators was studied, namely EJ-200, EJ-208 and EJ-260, all manufactured and provided to us by Eljen Technology. In addition, in order to compare with the scintillator brands currently in use at the ATLAS detector, two polystyrene (PS) based scintillators and an additional PVT-based scintillator were also scrutinized in this study, namely Dubna, Protvino and Bicron, respectively. All the samples were irradiated using a 6 MeV proton beam at different doses at iThemba LABS Gauteng. The irradiation was planned and mimicked with simulations using the SRIM program. In addition, transmission spectra for the irradiated and unirradiated samples of each grade were obtained and analyzed.

012022
The following article is Open access


The ATLAS detector, operated at the Large Hadron Collider (LHC) at CERN, records proton-proton collisions every 50 ns, resulting in a sustained data flow of up to a PB/s. The upgraded Tile Calorimeter of the ATLAS experiment will sustain about 5 PB/s of digital throughput. These massive data rates require extremely fast data capture and processing. Although there has been a steady increase in the processing speed of CPUs/GPGPUs assembled for high-performance computing, the rate of data input and output, even under parallel I/O, has not kept up with the general increase in computing speeds. The problem then is whether one can implement an I/O subsystem infrastructure capable of meeting the computational speeds of advanced computing systems at the petascale and exascale level.

We propose a system architecture that leverages the Partitioned Global Address Space (PGAS) model of computing to maintain an in-memory data store for the Processing Unit (PU) of the upgraded Tile Calorimeter electronics, which is proposed to be used as a high-throughput general-purpose co-processor to the sROD of the upgraded Tile Calorimeter. The physical memory of the PUs is aggregated into a large global logical address space using RDMA-capable interconnects such as PCI-Express to enhance data processing throughput.

012023
The following article is Open access


In this study we look at radiation damage and its adverse effects on the plastic scintillators housed within the Tile Calorimeter (TileCal) of the ATLAS detector. The study focuses on determining how the interaction of ionizing radiation with plastic scintillators affects their efficacy and desired properties, such as high light output and fast decay time. Plastic scintillators form an integral part of the ATLAS trigger system, and their optimal functionality is paramount to the success of ATLAS. Electron paramagnetic resonance (EPR) provides insight into the electronic structure of the plastics and can characterize the damage caused by ionizing radiation. Density functional theory (DFT) calculations will be performed in order to simulate the EPR signal. Preliminary EPR results investigate four different types of plastic scintillators: three polyvinyl-toluene based Eljen Technology samples, EJ200, EJ208 and EJ260, and one polystyrene based Dubna sample. It has been observed that the Dubna sample, identical to the current scintillator used in the ATLAS detector, undergoes more structural damage than the Eljen samples.

012024
The following article is Open access

The Large Hadron Collider at CERN is scheduled to undergo a major upgrade in 2022. The ATLAS collaboration will make major modifications to the detector to account for the increased luminosity. More specifically, a large proportion of the current front-end electronics on the Tile Calorimeter sub-detector will be upgraded and relocated to the back-end. A Demonstrator program has been established as a proof of principle. A new system will be required to house, manage and connect this new hardware. The proposed solution is an Advanced Telecommunications Computing Architecture (ATCA) chassis, which will not only house the hardware but also allow advanced management features and control at a hardware level, by integrating the ATCA chassis into the Detector Control System.

012025
The following article is Open access

After the 2022 upgrades, the Tile Calorimeter (TileCal) detector at ATLAS will be generating raw data at a rate of approximately 41 TB/s. The TileCal triggering system contains a degree of parallelism in its processing algorithms and thus presents an opportunity to explore the use of general-purpose computing on graphics processing units (GPGPU). Currently, research into the viability of an sROD ARM-based co-processing unit (PU) is being conducted at Wits University, with special regard to increasing the I/O throughput of the detector. Integration of GPGPU into this PU could enhance its performance by relieving the ARMs of particularly parallel computations. In addition to the PU, use of GPGPU in the front-end trigger is being investigated, on the basis that the algorithms used are similar to image processing algorithms, for which GPUs are optimally suited. The use of GPUs in assistance to or in place of FPGAs can be justified by GPUs' relative ease of programming: C/C++-like languages as opposed to assembly-like Hardware Description Languages (HDLs). This project will consider how GPUs can best be utilised as a subsystem of TileCal in terms of power and computing efficiency, and therefore cost.

012026
The following article is Open access


The ATLAS software framework (ATHENA) is large and dynamic, comprising around 6.5 million lines of code. It is compiled using the ATLAS nightly build system, NICOS, which uses tools and scripts located on and tuned for the CERN services LXPLUS and AFS. Furthermore, the constraints placed on the hardware on which the software runs limit compilations to the traditional x86 architecture. With the recent interest in ARM processors for large-scale high energy physics computing, a new system needs to be implemented to build ATHENA versions for ARM, on ARM. This letter introduces a building framework called Atlas Nightly on ARM (ANA). This new framework implements patches to suit the ARM architecture, with the goal of a final ATHENA version for ARM.

012027
The following article is Open access


The Large Hadron Collider (LHC) is preparing for a major Phase-II upgrade scheduled for 2022 [1]. The upgrade will require a complete redesign of both the on- and off-detector electronics systems in the ATLAS Tile hadron Calorimeter (TileCal) [2]. The Prometeo (A Portable ReadOut ModulE for Tilecal ElectrOnics) stand-alone test-bench system is currently in development and will be used for the certification and quality checks of the new front-end electronics. The Prometeo is designed to read in digitized samples from 12 channels simultaneously at the bunch crossing frequency while assessing the quality of the information in real time. The main board used in the design is a Xilinx VC707 evaluation board with a dual QSFP+ FMC (FPGA Mezzanine Card) module for read-out and control of the front-end electronics. All other functions are provided by an HV board, an LED board and a 16-channel ADC daughter board. This paper relates to the development and testing of the ADC board that will be used in the new Prometeo system.

012028
The following article is Open access


Today's large-scale science projects encounter serious challenges in processing the large data flows from their experiments: the ATLAS detector records proton-proton collisions provided by the Large Hadron Collider (LHC) at CERN every 50 ns, resulting in a total data flow of 10 Pb/s. These data must be reduced to the science data product for further analysis, so very fast decisions need to be executed to reduce these large amounts of data at high rates. Supporting this scale of data movement requires the development and improvement of high-throughput electronics. By 2022, the upgraded LHC will provide collisions at rates at least 10 times higher than those of today, due to its increased luminosity. This will require a complete redesign of the read-out electronics and Processing Units (PUs) in the Tile Calorimeter (TileCal) of the ATLAS experiment. A general-purpose, high-throughput PU has been developed for the TileCal at CERN, using several ARM processors in a cluster configuration. The PU is capable of handling a large data throughput and of applying advanced operations at high rates. This system has been proposed for the fixed target experiment at the NICA complex to handle the first-level processing and event building. The aim of this work is to examine the architecture of the data acquisition system (DAQ) of the fixed target experiment at the NICA complex at JINR, by compiling the data-flow requirements of all the subcomponents. Furthermore, the characteristics of the VME DAQ modules for control, triggering and data acquisition will be described, in order to define a DAQ with maximum readout efficiency, no dead time, and data selection and compression.