

TURBULENCE DECAY AND CLOUD CORE RELAXATION IN MOLECULAR CLOUDS


Published 2015 January 30 © 2015. The American Astronomical Society. All rights reserved.
Citation: Yang Gao et al. 2015 ApJ 799 227. DOI: 10.1088/0004-637X/799/2/227


ABSTRACT

The turbulent motion within molecular clouds is a key factor controlling star formation. Turbulence supports molecular cloud cores against gravitational collapse and hence sets a lower bound on the size of molecular cloud cores in which star formation can occur. On the other hand, without a continuous external energy source maintaining it, as is the case in molecular clouds, the turbulence decays with an energy dissipation time comparable to the dynamical timescale of the clouds, which can change the size limits obtained from Jeans' criterion under the assumption of constant turbulence intensity. Here we adopt scaling relations of physical variables in decaying turbulence to analyze its specific effects on the formation of stars. We find that the decay of turbulence provides an additional route by which Jeans' criterion can be satisfied, after which gravitational infall governs the motion of the cloud core. This epoch of turbulence decay is defined as cloud core relaxation. The existence of cloud core relaxation provides a more complete understanding of how the competition between turbulence and gravity affects the dynamics of molecular cloud cores and star formation.


1. INTRODUCTION

Turbulent motion, magnetic fields, and gravity are the governing factors in the dynamics of star formation in molecular clouds (Pudritz 2002; Ward-Thompson 2002; McKee & Ostriker 2007). Turbulence and its effects on molecular clouds and star formation have been studied since the pioneering work of Chandrasekhar (1949, 1951a, 1951b); a recent review on this topic is given in Mac Low & Klessen (2004). Broadly speaking, turbulence has two competing effects on star formation. On one hand, large-scale turbulence is the main driving mechanism that creates dense cloud cores (Chandrasekhar 1951a; Larson 1981; Kritsuk et al. 2013), which incubate star formation. On the other hand, turbulent motion within the cloud cores provides additional support against gravitational collapse (Chandrasekhar 1951b; Bonazzola et al. 1987; Léorat et al. 1990), which hinders star formation. A full understanding of the dynamics of star formation therefore requires a complete analysis of the effects of turbulence.

In this work, we examine the resistance to gravitational collapse provided by cloud-core turbulence. Most previous analyses of this effect were based on the theory of compressible or incompressible, fully developed, statistically stationary turbulence, i.e., turbulent flows maintained by continuous external energy supplies (McKee & Ostriker 2007). Most of the energy sources that drive the turbulent motion in a molecular cloud, however, are neither spatially uniform nor steady in time (Mac Low & Klessen 2004), which means that the turbulence in any particular cloud core cannot be continuously maintained: the turbulent flow gradually slows down, i.e., the turbulent energy decays. In addition, numerical simulations have shown that the energy decay time of a typical turbulent flow in molecular clouds is shorter than or comparable to the dynamical timescale of star formation (Stone et al. 1998; Mac Low et al. 1998). These two factors suggest that star formation in cloud cores actually occurs in an environment of decaying turbulence. The effect of turbulent energy decay should therefore be taken into account when analyzing star formation in cloud cores, which is what we address here.

This paper is organized as follows: turbulence driving mechanisms in molecular clouds are first reviewed, followed by presentation of the scaling law of decaying turbulence, with an analysis of its effect on Jeans' criterion. The epoch of cloud-core relaxation is then proposed, and results of our scaling analysis are discussed and concluded.

2. DRIVING MECHANISMS OF TURBULENCE IN MOLECULAR CLOUDS

Currently, the general consensus on star formation is that large-scale (∼10–100 pc) turbulence leads to the clustering of dense regions and subsequently the formation of stars. The energy dissipation time of the large-scale turbulence is of the order of ∼10⁶ yr, which is comparable to the flow-crossing time as well as the free-fall timescale of a cloud (Stone et al. 1998; Ossenkopf & Mac Low 2002; Offner et al. 2008).

Various energy sources that could trigger turbulent flows in molecular clouds have been proposed, and the corresponding driving scales of turbulence have been studied numerically (Genzel et al. 1998; Mac Low & Klessen 2004; Joung & Mac Low 2006; Brunt et al. 2009; Goldbaum et al. 2011; Falceta-Gonçalves et al. 2015). Magneto-rotational instability (MRI) is an efficient mechanism that couples large-scale galactic rotation to turbulent motions in star-forming clouds, but its energy input into turbulent motion is about two orders of magnitude smaller than the observed value. Similarly, turbulent motion due to gravitational instability (GI) is not energetic enough to drive star formation. These two instabilities (MRI and GI) therefore likely serve as basic driving mechanisms that contribute only a small portion of the observed turbulence in molecular clouds. Protostellar jets and outflows are effective local turbulence drivers that can affect their surrounding cloud environment, but they are too small in extent to account for the large-scale turbulence observed. Massive stars can affect the cloud environment significantly through intense radiation, but only a small portion of the radiation energy is converted into turbulent motion. Winds from massive stars can be more energetic, but the population of massive stars is too small to make a major contribution, especially when compared with that of the supernova explosions discussed below.

Mac Low & Klessen (2004) suggested that supernova explosions are the dominant turbulence driving mechanism, with energy input rates sufficient to drive the turbulence observed in molecular clouds. Following their analysis, we take the supernova rate in the Galaxy (star-formation scale height of 100 pc and radius of 15 kpc) to be (50 yr)⁻¹, which gives an estimated supernova rate of ∼(10⁶–10⁷ yr)⁻¹ in a typical star-forming cloud of 100 pc diameter. This means that there is a typical time lag of 10⁶–10⁷ yr between two successive supernova explosions in a molecular cloud, which is comparable to or slightly longer than the star formation time of a few 10⁶ yr. After a supernova explosion, shock waves sweep up the gas and dust in the molecular cloud, cluster them into dense cores through particle collisions, and initiate the turbulent motion in these cores as well as in the entire cloud (Joung & Mac Low 2006). To give a general picture of the molecular clouds and dense cloud cores discussed here: the typical length scale of cloud cores is lcore ∼ 0.1 pc, much smaller than the cloud diameter of l0 ∼ 100 pc, and the core density is of order ρcore ∼ 5 × 10⁴ cm⁻³, much larger than the density of the diffuse regions of a molecular cloud, ρcloud ∼ 10 cm⁻³ (both are number densities of molecules). After the shock waves pass, the decay laws govern the evolution of turbulence in the molecular cloud.
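
As a rough consistency check of this rate estimate, the Galactic supernova rate can be scaled by the volume fraction occupied by a single cloud. The sketch below is ours, not the paper's; the geometric assumptions (a uniform rate over a disk of radius 15 kpc and scale height 100 pc, and a spherical cloud of 100 pc diameter) are simply taken from the numbers quoted above.

import math

# Galactic star-forming volume: disk of radius 15 kpc and scale height 100 pc
R_disk_pc, h_disk_pc = 15e3, 100.0
V_galaxy = math.pi * R_disk_pc**2 * h_disk_pc       # pc^3

# One molecular cloud, taken as a sphere of 100 pc diameter
V_cloud = (4.0 / 3.0) * math.pi * (100.0 / 2.0)**3  # pc^3

galactic_sn_rate = 1.0 / 50.0                       # supernovae per year, (50 yr)^-1
cloud_sn_rate = galactic_sn_rate * V_cloud / V_galaxy

print(f"volume fraction of one cloud : {V_cloud / V_galaxy:.2e}")
print(f"time between SNe in one cloud: {1.0 / cloud_sn_rate:.1e} yr")
# -> roughly 7e6 yr, i.e. within the quoted ~(1e6-1e7 yr)^-1 range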

3. TURBULENCE DECAY IN MOLECULAR CLOUDS AND CLOUD CORES

The decay of turbulent energy, and the associated variation of energy spectra in turbulent flows without an external maintaining force, is a classical problem in fluid turbulence research. It dates back to the classic paper of von Kármán & Howarth (1938) and has since been investigated by Kolmogorov (1941), Batchelor & Townsend (1948a, 1948b), and Heisenberg (1948), among many others. Recent studies based on terrestrial fluid experiments have improved the understanding of turbulence decay properties (Kurian & Fransson 2009; Krogstad & Davidson 2010, 2011), while, on the astrophysical side, numerical simulations of star-forming clouds have confirmed these scaling relations for decaying turbulence and extended the results to flows with magnetic fields (Biskamp & Müller 1999; Mac Low 1999; Cho et al. 2003; Kritsuk et al. 2011).

In this work we adopt the following decay law for incompressible turbulence, given as Equation (A1) in Krogstad & Davidson (2011),

Equation (1): $E(t)=\frac{u^{2}(t)}{2}=E_{0}\left(1+A\,\frac{u_{0}\,t}{l_{0}}\right)^{-n}$

where E(t) = u²(t)/2 is the turbulent kinetic energy at time t after the start of the decay, i.e., after the termination of external forcing, which corresponds to the passage of the shock wave when the energy is supplied by a supernova explosion. Furthermore, E0, u0, and l0 are the initial values (at t = 0) of the turbulent energy, fluctuating velocity, and integral scale, respectively, and A is a dimensionless number, typically between 1/3 and 1/2 as found in observations of isotropic turbulence; A = 1/2 is used in the following calculations. Although the power-law form of decay given by Equation (1) is generally accepted as universal, the exponent n has not been uniquely determined. Currently available data and theories suggest that it lies between 1 and 2 (Biskamp & Müller 1999; Krogstad & Davidson 2011), with much recent experimental evidence supporting a value close to 1.2 (Kurian & Fransson 2009; Krogstad & Davidson 2010; Sinhuber et al. 2014). For the turbulence in molecular clouds, this decay law has the following physical interpretation: without any external energy input maintaining the turbulence, the total turbulent energy decays as $\sim(1+t/t_0)^{-n}$ and the turbulent speed accordingly slows down as $\sim(1+t/t_0)^{-n/2}$, where t0 = l0/u0 is the turbulence decay timescale.
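
For illustration, a minimal numerical sketch of the decay law is given below. It assumes the reconstructed form of Equation (1) with the dimensionless constant A placed inside the parentheses, A = 1/2, and n = 1.2; the exact prefactor is our reading of the scaling relations above, and the helper names are ours, so the output should be taken as indicative only.

import math

def turbulent_energy(t, E0, u0, l0, A=0.5, n=1.2):
    """Decaying turbulent kinetic energy, E(t) = E0 * (1 + A*u0*t/l0)**(-n)."""
    return E0 * (1.0 + A * u0 * t / l0) ** (-n)

def turbulent_speed(t, u0, l0, A=0.5, n=1.2):
    """Turbulent speed u(t) = sqrt(2 E(t)), decaying roughly as (1 + t/t0)^(-n/2)."""
    E0 = 0.5 * u0**2
    return math.sqrt(2.0 * turbulent_energy(t, E0, u0, l0, A, n))

# Example: a 100 pc cloud with u0 = 30 km/s, so t0 = l0/u0 ~ 3e6 yr
pc, yr = 3.086e18, 3.156e7           # cm, s
l0, u0 = 100 * pc, 30e5              # cm, cm/s
t0 = l0 / u0 / yr                    # decay timescale in yr
for t_yr in (0.0, t0, 10 * t0):
    u = turbulent_speed(t_yr * yr, u0, l0) / 1e5
    print(f"t = {t_yr:9.2e} yr  ->  u = {u:5.2f} km/s")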

For a (giant) molecular cloud with typical length scale l0 ∼ 100 pc and mass M0 ∼ 10⁵ M☉, if all the energy released by a typical supernova explosion (ESN ∼ 10⁵¹ erg) is converted to the turbulent energy of the cloud,⁵ the turbulent fluctuating velocity in the molecular cloud is $u_0=\sqrt{2E_{\rm SN}/M_0}=30 \ {\rm km\ s^{-1}}$. The corresponding characteristic energy decay time of the turbulence in the cloud is t0 = l0/u0 = 3 × 10⁶ yr, which is consistent with the simulation results of Stone et al. (1998). Then, for a typical dense cloud core of lcore ∼ 0.1 pc, the fluctuating velocity within the core once it is formed is ucore = u0(lcore/l0)^{1/3} = 3 km s⁻¹, assuming Kolmogorov scaling for incompressible turbulence in the inertial range (see Kolmogorov 1941),⁶

Equation (2): $E(k)\propto \varepsilon^{2/3}k^{-5/3}$, where $\varepsilon$ is the turbulent energy dissipation rate, which implies the inertial-range velocity scaling $u(l)\propto l^{1/3}$.

Accordingly, the decay timescale for turbulence in such a cloud core is tcore = lcore/ucore = 3 × 10⁴ yr, which is much shorter than the decay time of the turbulent motion in the whole cloud. Regarding the above estimates, note that the initial length scale of the cloud core lcore could vary from core to core, resulting in different core masses and turbulent speeds; the initial local density of the cloud core ρcore could also differ.
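
The characteristic numbers quoted in this paragraph can be reproduced directly; the short sketch below assumes only a solar mass of 2 × 10³³ g and the Kolmogorov 1/3-power scaling between scales.

pc, yr, M_sun = 3.086e18, 3.156e7, 1.989e33   # cgs conversions

E_SN = 1e51                  # erg, typical supernova energy
M0 = 1e5 * M_sun             # g, (giant) molecular cloud mass
l0, l_core = 100 * pc, 0.1 * pc

u0 = (2 * E_SN / M0) ** 0.5                 # cm/s, if all E_SN goes to turbulence
t0 = l0 / u0                                # s, cloud-scale decay time
u_core = u0 * (l_core / l0) ** (1.0 / 3.0)  # cm/s, Kolmogorov inertial-range scaling
t_core = l_core / u_core                    # s, core-scale decay time

print(f"u0     = {u0 / 1e5:5.1f} km/s")      # ~30 km/s
print(f"t0     = {t0 / yr:8.1e} yr")         # ~3e6 yr
print(f"u_core = {u_core / 1e5:5.1f} km/s")  # ~3 km/s
print(f"t_core = {t_core / yr:8.1e} yr")     # ~3e4 yr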

Although bound in the gravitational potential of the large cloud, these dense cores form their own local potential fields and can be relatively isolated from other cores (Ward-Thompson et al. 2007). Because of the density contrast between the dense cores and the diffuse regions of the cloud, ρcore/ρcloud ∼ 5 × 10³, turbulent motion in the diffuse regions around a cloud core can hardly generate strong fluctuations inside the core⁷ as a result of mass-flux conservation (ρu = const.). This means that the turbulent motion within the dense cores is essentially unaffected by the turbulence of the diffuse cloud after the initial fluctuation, so the dense cores experience the decay of turbulence in relative isolation. It has already been noted that local star-formation behavior differs for dense cores of different properties (see, e.g., McKee & Ostriker 2007): (1) cores whose internal turbulent energy is sufficiently high compared to their self-gravitational potential energy will re-disperse and cannot form stars, and (2) cores whose turbulent energy is low enough compared to the gravitational energy will collapse under gravity and form star(s). What we will show in the following section is that the decay of turbulent motion in the cores may allow some cores of type (1) to eventually become gravitationally unstable and collapse.
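
The isolation argument is simply mass-flux conservation across the density contrast; a one-line estimate (using the quoted densities and a 30 km s⁻¹ external turbulent speed, as in footnote 7) gives:

rho_core, rho_cloud = 5e4, 10.0      # number densities in cm^-3
u_cloud = 30.0                       # km/s, turbulent speed in the diffuse cloud

# mass-flux conservation rho*u = const. across the core boundary
u_into_core = u_cloud * rho_cloud / rho_core
print(f"induced speed inside the core: {u_into_core:.3f} km/s")  # ~0.006 km/s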

4. JEANS' CRITERION IN CLOUD CORES WITH DECAYING TURBULENCE

Jeans' criterion for a turbulent cloud core to be gravitationally unstable to perturbations of wave number k was derived in Chandrasekhar (1951b) and Bonazzola et al. (1987):

Equation (3): $\left(c^{2}+\tfrac{1}{3}u_{c}^{2}\right)k^{2}<4\pi G\rho_{c}$

where c is the speed of sound, uc the turbulent speed of the cloud core, and G the gravitational constant. Hereafter, variables with subscript c represent properties of cloud cores. Equation (3) shows that the gravitational instability is a long-wave instability. In a cloud core, the core diameter lc limits the longest wave to k = 2π/λ ∼ 2π/lc, which is therefore the first mode to become unstable as conditions in the core change. If we take a typical cloud-core temperature of ∼10 K, corresponding to a sound speed of c = 0.2 km s⁻¹, and adopt the scaling law Equation (1) for decaying turbulence, we can express Jeans' criterion, Equation (3), as
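
The quoted sound speed follows from the isothermal relation c = (kBT/μmH)^{1/2}. The quick check below assumes a mean molecular weight μ ≈ 2.33 for molecular gas; this value is our assumption, not one stated in the text.

k_B = 1.381e-16    # erg/K, Boltzmann constant
m_H = 1.673e-24    # g, hydrogen mass
T, mu = 10.0, 2.33 # K, assumed mean molecular weight

c = (k_B * T / (mu * m_H)) ** 0.5
print(f"isothermal sound speed at {T:.0f} K: {c / 1e5:.2f} km/s")  # ~0.19 km/s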

Equation (4): $\frac{G\rho_{c}\,l_{c}^{2}}{\pi}>c^{2}+\frac{1}{3}u_{c0}^{2}\left(1+A\,\frac{u_{c0}\,t}{l_{c}}\right)^{-n}$

in which uc0 denotes the core turbulent speed at the beginning of turbulence decay. According to the turbulence spectrum of Equation (2), the core turbulent speed is related to the core size by uc0 = u0(lc/l0)^{1/3}, where u0 and l0 denote the initial turbulent speed and size of the cloud; Jeans' criterion (4) can then be further expressed as

Equation (5): $\frac{G\rho_{c}\,l_{c}^{2}}{\pi}>c^{2}+\frac{1}{3}u_{0}^{2}\left(\frac{l_{c}}{l_{0}}\right)^{2/3}\left(1+A\,\frac{u_{0}\,t}{l_{0}^{1/3}\,l_{c}^{2/3}}\right)^{-n}$

Equation (5) shows that the resistance to gravitational collapse in a cloud core has two contributions: the thermal and turbulent parts, which are respectively the first and second terms on the right hand side of Equation (5). For initially gravitationally stable cloud cores, the decay of turbulent motion will diminish the second term and could cause the criterion to be satisfied at a later time, hence providing an additional way for the cloud core to become gravitationally unstable.

Therefore the existence of turbulence decay transforms Jeans' criterion, Equation (5), into two criteria. For a cloud core to be unstable at the beginning of the decaying turbulence (t = 0), the core diameter has to be greater than

Equation (6): $l_{c0}=\left\{\frac{\pi}{G\rho_{c}}\left[c^{2}+\frac{1}{3}u_{0}^{2}\left(\frac{l_{c0}}{l_{0}}\right)^{2/3}\right]\right\}^{1/2}$

On the other hand, if there is enough time (t → +∞) for the turbulent motion to decay, the core diameter need only be greater than

Equation (7): $l_{c0}^{\prime}=c\,\sqrt{\frac{\pi}{G\rho_{c}}}$

for it to be gravitationally unstable and to collapse at some later time. These two criteria can be interpreted as the critical diameter for core collapse with initial turbulence (lc0) and the minimum diameter required for core collapse without turbulence ($l_{c0}^{\prime }$), respectively. Note that the two criteria (6) and (7) do not depend on n, the index of the turbulence decay rate; they are affected only by the sound speed c and the initial turbulent speed uc0 = u0(lc/l0)^{1/3} in cloud cores. Based on these two criteria, three types of cloud-core evolution exist. (1) Cloud cores with diameters lc > lc0 are gravitationally unstable and will collapse to form star(s). (2) Cloud cores with diameters in the range $l_{c0}^{\prime }<l_c<l_{c0}$ will not collapse initially, but can evolve to become gravitationally unstable after a period of turbulence decay (as the turbulent speed decreases) and eventually collapse, leading to subsequent star formation. (3) Cloud cores with even smaller diameters, $l_c<l_{c0}^{\prime }$, remain stable and do not collapse.
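
The two critical diameters can be evaluated numerically. The sketch below implements the reconstructed criteria (6) and (7) with the typical values quoted in this paper (c = 0.2 km s⁻¹, ρc = 5 × 10⁴ cm⁻³, u0 = 30 km s⁻¹, l0 = 100 pc) and an assumed mean molecular weight μ ≈ 2.33; the helper names are ours. It recovers lc0 ≈ 3 pc and lc0′ ≈ 0.1 pc.

import math

pc = 3.086e18
G, m_H, mu = 6.674e-8, 1.673e-24, 2.33        # cgs; mu is an assumed mean mol. weight

c = 0.2e5                                     # cm/s, sound speed (T ~ 10 K)
rho_c = 5e4 * mu * m_H                        # g/cm^3, core mass density
u0, l0 = 30e5, 100 * pc                       # cm/s, cm, cloud-scale turbulence

def residual(lc):
    """Equation (6) residual: G*rho_c*lc^2/pi - [c^2 + u0^2*(lc/l0)**(2/3)/3]."""
    return G * rho_c * lc**2 / math.pi - (c**2 + u0**2 * (lc / l0) ** (2.0 / 3.0) / 3.0)

# critical diameter with initial turbulence, lc0: single sign change, found by bisection
lo, hi = 0.01 * pc, 100 * pc                  # residual is negative at lo, positive at hi
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if residual(mid) > 0.0:
        hi = mid
    else:
        lo = mid
lc0 = 0.5 * (lo + hi)

# thermal-only critical diameter, Equation (7)
lc0_prime = c * math.sqrt(math.pi / (G * rho_c))

print(f"lc0  (with initial turbulence) ~ {lc0 / pc:.1f} pc")        # ~3 pc
print(f"lc0' (thermal support only)    ~ {lc0_prime / pc:.2f} pc")  # ~0.1 pc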

5. CLOUD CORE RELAXATION

When considering the time needed for a turbulent core to decay sufficiently to become gravitationally unstable, it is more informative to rewrite the criterion in the following form:

Equation (8): $t>t_{\rm relax}=\frac{l_{c}}{A\,u_{c0}}\left\{\left[\frac{u_{c0}^{2}/3}{G\rho_{c}\,l_{c}^{2}/\pi-c^{2}}\right]^{1/n}-1\right\}$, with $u_{c0}=u_{0}(l_{c}/l_{0})^{1/3}$.

For typical cloud cores with c = 0.2 km s⁻¹ and ρc = 5 × 10⁴ cm⁻³, if the initial turbulent speed of the (giant) cloud of diameter l0 = 100 pc is u0 = 30 km s⁻¹ as estimated in Section 3, the time needed for a core of length scale lc to evolve to be gravitationally unstable can be obtained from Equation (8) and is illustrated in Figure 1 (solid line), in which the decay index is taken as n = 1.2. Figure 1 shows that for cloud cores with sizes lc > lc0 = 3.0 pc, t = 0 yr, which means that these cores collapse directly under gravity; for those with sizes $l_c<l_{c0}^{\prime }= 0.1$ pc, t = +∞, which means they will never experience gravitational collapse. In between the two, cloud cores of sizes $l_{c0}^{\prime }<l_c<l_{c0}$ become gravitationally unstable after a period of turbulence decay and collapse to form stars. The epoch after the formation of these cores but before the initiation of gravitational collapse can be defined as the relaxation of cloud cores, during which the turbulent intensity within the cloud cores decreases. As inferred from Figure 1 (solid line), the relaxation time for cloud cores with sizes between 0.1 pc and 3 pc is around 10⁵ to 10⁶ yr, which is comparable to the free-fall time for star formation, tff = [3π/(32Gρc)]^{1/2} = 4 × 10⁵ yr (McKee & Ostriker 2007). In addition, smaller cloud cores need longer relaxation times to become gravitationally unstable.


Figure 1. Relaxation time needed for a cloud core embedded in a (giant) molecular cloud of diameter l0 = 100 pc with typical initial turbulent speeds of u0 = 30, 10, and 3 km s⁻¹ to become gravitationally unstable. The abscissa is the scale of the cloud core and the ordinate is the relaxation time needed before gravitational collapse. For small cloud cores with $l_c<l_{c0}^{\prime }$, the relaxation time is infinite: the core relaxes indefinitely and never becomes gravitationally unstable. For large cloud cores with lc > lc0, the relaxation time is zero and the core collapses gravitationally as soon as it forms. For cloud cores with diameters between the two, a time trelax is needed for the turbulence to decay before the core becomes gravitationally unstable. The horizontal dashed-dotted line denotes the free-fall time tff of the cloud core, which is comparable to the relaxation time.
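
A few sample points on the solid curve of Figure 1 can be reproduced from the reconstructed relaxation-time expression, Equation (8). The sketch below inherits the assumed placement of A and the assumed mean molecular weight μ ≈ 2.33 from the earlier sketches, and the function name is ours, so its numbers are order-of-magnitude checks rather than exact reproductions of the figure.

import math

pc, yr = 3.086e18, 3.156e7
G, m_H, mu = 6.674e-8, 1.673e-24, 2.33        # cgs; mu assumed

c, rho_c = 0.2e5, 5e4 * mu * m_H              # sound speed, core mass density
u0, l0 = 30e5, 100 * pc                       # cloud-scale turbulence
A, n = 0.5, 1.2                               # decay parameters used in the text

def t_relax(lc):
    """Relaxation time from Equation (8); inf below the thermal Jeans size, 0 if already unstable."""
    uc0 = u0 * (lc / l0) ** (1.0 / 3.0)
    gravity = G * rho_c * lc**2 / math.pi - c**2
    if gravity <= 0.0:
        return math.inf                        # lc < lc0': never collapses
    ratio = (uc0**2 / 3.0) / gravity
    if ratio <= 1.0:
        return 0.0                             # lc > lc0: unstable at t = 0
    return lc / (A * uc0) * (ratio ** (1.0 / n) - 1.0)

for lc_pc in (0.5, 1.0, 2.0):
    print(f"lc = {lc_pc:3.1f} pc  ->  t_relax ~ {t_relax(lc_pc * pc) / yr:.1e} yr")
# -> values of order 1e5-1e6 yr, consistent with the range quoted in the text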


As the total energy released in a supernova explosion may not be fully converted to the turbulent energy of a star-forming cloud, the initial turbulent speed of the entire cloud could be less than the 30 km s⁻¹ used in the previous calculation. Observations also suggest that the turbulent speeds in clouds of ∼100 pc diameter are typically less than or around 10 km s⁻¹ (e.g., Larson 1981; McKee & Ostriker 2007). Assuming that 10% or 1% of the supernova explosion energy is converted to turbulent energy of the molecular cloud, the corresponding turbulent speed is u0 ∼ 10 km s⁻¹ or u0 ∼ 3 km s⁻¹, respectively. Using Equation (8), the relaxation properties of dense cores in these less turbulent molecular clouds are also shown in Figure 1. The comparison between different cloud turbulent conditions shows that lc0, the minimum diameter of a cloud core that can become gravitationally unstable without experiencing turbulence decay, becomes much smaller when the initial turbulent speed decreases, while cloud cores smaller than lc0 relax in a relatively shorter time (a few 10⁵ yr) than in more turbulent clouds. The plots in Figure 1 clearly indicate that the decay of turbulence leads to a relaxation epoch for cloud cores with diameters $l_{c0}^{\prime }<l_c<l_{c0}$ before they experience gravitational collapse. Even after Jeans' criterion has been satisfied and gravitational collapse begins, the decay of turbulent motion continues and the turbulent speed decreases until another driving process, such as winds from nearby, newly formed massive stars, sets in. It is also to be noted that turbulence can be enhanced by adiabatic heating during the compression of a cloud core (Robertson & Goldreich 2012; Murray & Chang 2014). This process could be considered a self-driven mechanism of turbulence in cores as well, which may consequently delay their gravitational collapse.
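
The quoted speeds for partial energy conversion follow from u0 ∝ (f ESN)^{1/2}; a one-line check with the 10% and 1% fractions stated above:

u0_full = 30.0                      # km/s, all supernova energy into turbulence
for f in (0.1, 0.01):               # fraction of E_SN converted to turbulent energy
    print(f"f = {f:4.2f}  ->  u0 ~ {u0_full * f ** 0.5:4.1f} km/s")
# -> ~9.5 km/s and 3.0 km/s, i.e. the u0 ~ 10 and 3 km/s cases shown in Figure 1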

Supernova-driven turbulence has been presumed in the above analyses, but other sources reviewed in Section 2 will also generate turbulent motions. Although they operate on smaller scales and are not energetic enough to be the main energy source for star formation (Mac Low & Klessen 2004), these mechanisms may provide more frequent local energy inputs. In this sense, the core relaxation discussed above may be interrupted by these local turbulence drivers. Also note that magnetic fields are not included in the analysis; their presence may lead to different turbulent energy spectra and may slow the decay of turbulence (e.g., Biskamp & Müller 1999; McKee & Ostriker 2007). Consideration of magnetic effects in future work may quantitatively change the turbulence decay and core relaxation behaviors discussed here.

6. CONCLUSION AND DISCUSSIONS

Based on the scaling laws of decaying turbulence, Jeans' criterion for the stability of cloud cores specifies two critical core sizes: lc0, if turbulence exists in the core, and $l_{c0}^{\prime }$ (<lc0), when only the thermal effect is considered. Cloud cores with sufficiently large sizes (lc > lc0) are gravitationally unstable once formed. Smaller cores that do not satisfy Jeans' criterion at their formation but have sizes between the two critical values ($l_{c0}^{\prime }<l_c<l_{c0}$) can evolve to become gravitationally unstable through the relaxation of turbulent energy. Cores with even smaller sizes ($l_c<l_{c0}^{\prime }$) can never become unstable to gravity, even with an infinitely long epoch of relaxation. The process of turbulence decay before gravitational collapse is defined as the relaxation of cloud cores, which lasts 10⁵–10⁶ yr for typical conditions in star-forming clouds. The existence of core relaxation provides an additional way for cloud cores to evolve to become gravitationally unstable and thus collapse.

Typical values of the cloud core turbulent speed, length scale, density, and temperature, as well as the supernova rate and (giant) cloud diameter, are used here to give an intuitive picture of core relaxation; these values could vary from one star-forming cloud core to another by as much as one or two orders of magnitude.

It is also noted that the self-similar scaling laws of decay (Krogstad & Davidson 2011) and the Kolmogorov spectrum of incompressible turbulence (Kolmogorov 1941) are adopted here, with the analytical results applying in the "inertial range," where energy transfers from larger to smaller scales with negligible influence from driving or viscosity. Furthermore, although the existence of shock waves and magnetic fields in realistic molecular clouds may alter the turbulence spectra, the energy decay rates of turbulence for compressible and incompressible clouds, with or without magnetic fields, are quite comparable, as found in numerical simulations (see the reviews and discussions in Mac Low & Klessen 2004; McKee & Ostriker 2007). Nevertheless, the effects of compressibility, magnetic fields, and the anisotropy of turbulence on its decay properties, and consequently on cloud-core relaxation, need to be investigated further.

This work was supported by the Center for Combustion Energy at Tsinghua University and by the National Science Foundation of China grant 51206088. Y.G. acknowledges additional support from the Tsinghua–Santander Program for young faculty performing research abroad. H.X. acknowledges support from the Max Planck Society and the German Science Foundation (DFG) through the project A7 of the Collaborative Research Center (CRC) 973 "AstroFIT."

Footnotes

  • 5. The cases in which 10% and 1% of the supernova explosion energy is converted to turbulent energy are considered in Section 5.

  • 6. Turbulence in molecular clouds is actually compressible, so the spectral index could differ from that of the Kolmogorov scaling law. Discussion of the effect of compressible turbulence can be found in the last section of this paper. Note also that the Kolmogorov scale, below which viscosity dominates, is ∼10⁻⁴ pc in molecular clouds (Kritsuk et al. 2011), much smaller than the cloud core diameter.

  • 7. As an estimate, once cloud cores are formed, a turbulent speed of ucloud = 30 km s⁻¹ in the cloud can only induce a flow of speed ucore = 0.006 km s⁻¹ in the dense core, which is much smaller than the inside-core turbulent motion of several km s⁻¹.
