Archive for December, 2009

Double-Parton Scattering is Not Rare

Despite lots of empirical evidence to the contrary, I tend to think of proton-proton interactions as the collision of single partons (quarks and/or gluons, one from each incoming proton) giving rise to all sorts of rich phenomena. A recent paper by Berger, Jackson and Shaughnessy reminded me that this way of thinking is too simplistic, and that the simultaneous scattering of two pairs of partons from the same two protons provides a non-negligible contribution, especially in certain corners of phase space – corners that may be quite relevant for finding physics beyond the standard model. Their paper is available on the physics archive: arXiv:0911.5348.

I grasp the essence of their paper as follows: imagine that you were looking at the production of b-quark pairs, as part of a search for the Higgs boson. You might look at the subset of events in which the hadron jets from the two b quarks are back-to-back, i.e., for which Δφ is nearly 180°. There will always be some extra activity in the event (even forgetting the contribution from other protons interacting) due to the initial-state radiation (ISR) of gluons. Naively, however, one would expect ISR to be small when Δφ is close to 180°. A standard event generator will simulate only those interactions coming from a single pair of partons, and will only very rarely produce four jets – the two back-to-back b-quark jets plus two other jets that also happen to be back-to-back. It would not be difficult to compare such a prediction to real data – in fact, this will surely be done as soon as the LHC delivers more data in a couple of months. According to the studies of Berger, Jackson and Shaughnessy, however, you would see a major discrepancy between the “standard” prediction and the real data…

These studies show that there would be a “surprise” contribution of jets which are themselves back-to-back, very much as if two events were overlaid. The important point is that these two “events” come from the same proton-proton interaction, and hence do not depend on the instantaneous luminosity, in contradistinction to the overlap of two proton collisions in the same beam crossing – which obviously depends on how many protons are in each bunch. Furthermore, collisions from different pairs of protons can be separated to some degree by reconstructing their different z-coordinates (positions along the beam line), but this is not possible for double-parton scattering.

The thrust of the Berger, Jackson and Shaughnessy paper is a study showing that clear evidence for double-parton scattering can be obtained with a few pb⁻¹ of data at 10 TeV. Here is one of the most telling distributions from their study:

[Figure: the Sφ distribution, which peaks toward one for double-parton scattering]

Sφ = (1/√2) ⋅ √( (Δφbb)² + (Δφjj)² )

The quantity Sφ peaks toward one for events in which the b-quark jets are back-to-back, and the other jet pair is also back-to-back. The bulk of the distribution indeed comes from single-parton scattering (“SPS”), in which two ISR gluons accompany the b-quark jets. For the SPS component, there is no special reason why the b-quark jets should be back-to-back, or the two ISR gluons should be back-to-back; the final state populates a four-body phase space which accommodates many other configurations. For the double-parton scattering (“DPS”) component, however, we have the overlay of two two-body final states, and each individual two-body final state is necessarily back-to-back. The DPS component may be small overall, but the plot shows a very tall spike at a corner of phase space. (The SPS component is represented by the red histogram, the DPS by the blue, and the sum of the two by the black histogram.)
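
To make the variable concrete, here is a minimal sketch of my own in Python (not code from the paper). I normalize each Δφ by π, so that two exactly back-to-back pairs give Sφ = 1, consistent with the statement that the distribution peaks toward one for DPS:

```python
import math

def delta_phi(phi1, phi2):
    """Azimuthal separation folded into [0, pi]."""
    dphi = abs(phi1 - phi2) % (2.0 * math.pi)
    return 2.0 * math.pi - dphi if dphi > math.pi else dphi

def s_phi(phi_b1, phi_b2, phi_j1, phi_j2):
    """S_phi = (1/sqrt 2) * sqrt(dphi_bb^2 + dphi_jj^2), with each
    dphi expressed in units of pi so that two exactly back-to-back
    pairs give S_phi = 1 (the DPS spike described in the text)."""
    dphi_bb = delta_phi(phi_b1, phi_b2) / math.pi
    dphi_jj = delta_phi(phi_j1, phi_j2) / math.pi
    return math.sqrt(dphi_bb**2 + dphi_jj**2) / math.sqrt(2.0)

# Two overlaid 2->2 scatters: both pairs back-to-back -> S_phi ~ 1
print(s_phi(0.3, 0.3 + math.pi, 1.2, 1.2 + math.pi))  # ~ 1.0
```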

Double-parton scattering has been investigated empirically in the past, and many papers have been written about it in the context of high-energy hadron-hadron colliders. The paper by Berger, Jackson and Shaughnessy is particularly useful, and I hope that the LHC experiments will follow up once the necessary data have been recorded. At a minimum, an overall factor σeff must be measured in order to make progress. In the longer term, we should try to gain some knowledge of the double parton distribution functions (see arXiv:0911.5348 for details).

December 29, 2009 at 4:10 pm 2 comments

The Physics in the ALICE Paper

The ALICE Collaboration published the very first physics paper on collisions at the LHC. You can see the paper on the archive (arXiv:0911.5430), and soon in the European Physical Journal.

I love to read papers reporting good, basic measurements that I don’t really understand – they give me the opportunity to go learn something new! After all, we don’t measure some arbitrary, peculiar quantity just because we can – every measurement has a point to it, something more than just checking a Monte Carlo event generator. So, what is the physics point behind the ALICE measurement?

The ALICE Collaboration measured the density of charged particles as a function of pseudo-rapidity (η). Here is the main plot:

[Figure: charged-particle density dN/dη as a function of pseudorapidity (η)]

The horizontal axis gives the pseudorapidity, and the vertical axis gives the “density” or number of charged particles per unit of pseudorapidity.

First, let’s recall what pseudorapidity is. It is sometimes described as a measure of the angle with respect to the beam, but this is not really what it means. It’s not just a substitute for cosθ. It is an approximation to the true rapidity (y), which parametrizes relativistic boosts. In old-fashioned books on special relativity, one sees sinh(y) = βγ, so one might say that y is an imaginary angle, but this certainly brings no insight. The main point of rapidity in relativistic kinematics is that a distribution f(y) transforms as f(y+a) under a Lorentz boost along the beam axis – i.e., the distribution is shifted along the y-axis by some constant amount, but the shape does not change. Hence, it is the shape which is of the essence.
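
This additivity is easy to check numerically. Here is a little sketch of my own, using the explicit formula for y written out just below:

```python
import math

def rapidity(E, pz):
    """True rapidity: y = 0.5 * ln((E + pz) / (E - pz))."""
    return 0.5 * math.log((E + pz) / (E - pz))

def boost_z(E, pz, a):
    """Lorentz boost along the z-axis with boost rapidity a."""
    return (E * math.cosh(a) + pz * math.sinh(a),
            pz * math.cosh(a) + E * math.sinh(a))

E, pz = 5.0, 3.0          # any time-like four-momentum (E > |pz|)
a = 0.7                   # boost rapidity
E2, pz2 = boost_z(E, pz, a)
print(rapidity(E, pz))    # 0.693...
print(rapidity(E2, pz2))  # 1.393... = 0.693... + 0.7, a pure shift
```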

Let’s write the explicit formulae:

y = ½ ln [ (E+pz) / (E-pz) ]

η = ½ ln [ (|p|+pz) / (|p|-pz) ] = -ln[ tan(θ/2) ]

So η is numerically close to y when the particle is relativistic, i.e., when its mass is negligible compared to its momentum (so E ≈ |p|). Experimentally speaking, however, η is much easier to measure, since one needs only θ, the angle of a track with respect to the beam axis. For y, one needs a good momentum measurement and knowledge of the mass, which implies particle identification, which is hard.
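
A quick numerical illustration of this point (my own sketch, not from the paper): for a fast pion, y and η nearly coincide; for a slower or heavier particle they do not.

```python
import math

def rapidity(E, pz):
    """True rapidity: y = 0.5 * ln((E + pz) / (E - pz))."""
    return 0.5 * math.log((E + pz) / (E - pz))

def pseudorapidity(px, py, pz):
    """Pseudorapidity: eta = 0.5 * ln((|p| + pz) / (|p| - pz));
    needs only the direction of the track, not the mass."""
    p = math.sqrt(px**2 + py**2 + pz**2)
    return 0.5 * math.log((p + pz) / (p - pz))

def compare(label, m, px, py, pz):
    E = math.sqrt(m**2 + px**2 + py**2 + pz**2)
    print(f"{label}: y = {rapidity(E, pz):.3f}, "
          f"eta = {pseudorapidity(px, py, pz):.3f}")

compare("fast pion  ", 0.140, 1.0, 0.0, 5.0)  # y ~ eta
compare("fast proton", 0.938, 1.0, 0.0, 5.0)  # mass matters
compare("slow proton", 0.938, 1.0, 0.0, 0.5)  # very different
```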

All of this is to say that we use η because we can measure it well, but for puzzling out the physics, one should think in terms of the normal rapidity, y.

(Note that the ALICE magnet was off for these data, so they have no way of measuring momenta. For a nice synopsis of the analysis, see Zoe Matthews’ post on Quantum Diaries.)

What do you notice about the results in the plot above? The charged-particle density hardly varies with η! Why not? Why is the density of charged tracks constant as a function of rapidity? Considering its definition as a boost parameter (see above), this should come as quite a surprise. (The small waviness reflects the small numerical differences between η and y.)

Now, to the physics. I learned what little I understand from a very nice review article by Grosse-Oetringhaus (CERN) and Reygers (Heidelberg), which came out just after the ALICE paper (arXiv:0912.0023). The story begins with Feynman scaling, which pertains to the formation of hadrons in inclusive, inelastic processes.

Let’s begin with this picture: a stream of partons comes in along the z-axis at high energy, and new partons are produced which fly out in many directions. Do not think in terms of simple Feynman diagrams or any sort of one-particle exchange. There is just a big blob with lots of partons coming out. The key concept is that the formation of any particular final-state parton does not depend on the global picture, only the local one. The energy and flavor of the incoming partons don’t figure into the calculation – a parton is produced according to some dynamics, and that parton turns into a hadron according to a universal distribution which might depend on the particle species but not on the other partons or their kinematics. The other key idea is that the distribution functions factorize into a transverse part (pT) and a longitudinal part. The latter is expressed as a scaling variable, “Feynman xF,” which is the longitudinal momentum scaled by its maximum value: xF = 2pz/W, where W = √s is the center-of-mass energy.

Feynman posited that the probability to produce a parton of energy E goes as 1/E; he justified this on very general phenomenological grounds. From this, integrating over pT and making a change of variables, one can show that the mean multiplicity N is proportional to ln(W). Intuitively, since E grows exponentially with y at fixed transverse mass, a 1/E spectrum is flat in rapidity. The total range of rapidity is also proportional to ln(W), so one infers that dN/dy = constant.
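
Here is a small Monte Carlo sketch of that change of variables (my own illustration, with made-up scale parameters): sampling particle energies from a 1/E spectrum and converting to rapidity via E = mT·cosh(y) gives a flat rapidity distribution, while the multiplicity, ∫dE/E = ln(Emax/Emin), grows only logarithmically with the available energy.

```python
import math, random

random.seed(1)
m_T   = 0.35    # hypothetical fixed transverse mass, GeV
E_min = 1.0     # hypothetical low-energy cutoff, GeV
E_max = 450.0   # roughly W/2 for W = 900 GeV

def sample_energy():
    """Inverse-transform sampling of dN/dE ~ 1/E:
    E = E_min * (E_max/E_min)**u for u uniform in [0, 1]."""
    return E_min * (E_max / E_min) ** random.random()

# For a forward-going particle, E = m_T * cosh(y) => y = acosh(E/m_T)
counts = [0] * 10
for _ in range(200000):
    y = math.acosh(sample_energy() / m_T)
    b = int(y)                 # unit-width rapidity bins
    if 0 <= b < len(counts):
        counts[b] += 1

for i, c in enumerate(counts):
    print(f"y in [{i},{i+1}): {c}")
# Apart from edge effects at the ends of the accessible range,
# the bins are equally populated: a 1/E spectrum is flat in y.
```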

Thus, the flatness of the distribution of dN/dη is an important validation of the quark-parton picture, invented in large part by Feynman.

Here is the other important physics plot from the ALICE paper:

[Figure: increase of the mean charged multiplicity as a function of √s]

This compilation of measurements tests the prediction that N increases as ln(W). Clearly, this prediction fails – the increase is quadratic in ln(W), not linear. This has been known for quite some time, and does not invalidate the quark-parton picture (of course). The arguments behind Feynman scaling were extended, leading to KNO scaling. KNO scaling works well but not perfectly, and the UA5 experiment showed violations of KNO scaling. It was eventually shown that the multiplicity distribution within a narrow η range follows a negative binomial distribution (NBD), or better, a sum of two of them. I can’t explain these things here (nor are they addressed in the ALICE paper), but the salient point is that these extensions of Feynman’s original line of thinking lead to the expectation that N increases quadratically with ln(W), as seen in the second plot above.
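
For the record, the negative binomial distribution mentioned above is a standard parametrization that can be written down in a few lines; here is a minimal sketch (the parameter values are placeholders for illustration, not fitted to any data):

```python
import math

def nbd(n, n_mean, k):
    """Negative binomial multiplicity distribution:
    P(n) = Gamma(n+k) / (Gamma(n+1) Gamma(k))
           * (n_mean/k)^n / (1 + n_mean/k)^(n+k).
    n_mean is the mean multiplicity; 1/k controls the width
    beyond Poisson (k -> infinity recovers a Poisson)."""
    log_p = (math.lgamma(n + k) - math.lgamma(n + 1) - math.lgamma(k)
             + n * math.log(n_mean / k)
             - (n + k) * math.log(1.0 + n_mean / k))
    return math.exp(log_p)

# Placeholder parameters, purely for illustration:
n_mean, k = 10.0, 2.0
print(sum(nbd(n, n_mean, k) for n in range(200)))      # ~ 1.0 (normalized)
print(sum(n * nbd(n, n_mean, k) for n in range(200)))  # ~ n_mean
```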

This piece of physics is very rich, even if not new, and it is interesting that the best Monte Carlo generators have some trouble reproducing the data accurately. For a lot more detail, see the review article by Grosse-Oetringhaus and Reygers (arXiv:0912.0023) or a good textbook on hadron-hadron interactions. For more data, just wait awhile – the LHC collaborations are only getting started!

December 20, 2009 at 4:18 am 2 comments

An Excellent Start for LHC Physics (Seminar II)

Today the second public status report on the LHC was held. The presentations from the various experiments are on this public web page, and a recording of the session will be available from the CERN Document Server.

[Photo: Fabiola Gianotti, spokeswoman of ATLAS, delivering the ATLAS report. Several famous physicists can be spotted among the audience.]

The LHC has delivered a useful data sample of pp collisions at a center-of-mass energy of 900 GeV (roughly 20 μb⁻¹). There was also a very small sample at 2360 GeV (of order 1 μb⁻¹), which exceeds the highest energy of the Tevatron at Fermilab.

The experiments used these data to establish the performance of essentially all subdetector systems, and to demonstrate the reconstruction of physics benchmarks. Many of these benchmarks are common, so one can make a direct comparison. Here are the reported widths of reconstructed signals, in MeV:

Collaboration   Ks→ππ   Λ→pπ   φ→KK   π0→γγ   η→γγ
ALICE            5.2     1.9    5.3     7.2     -
ATLAS            8.2     3.2    -      19       -
CMS              7.6     3.1    4.6    16      53
LHCb             4.3     1.4    -      11       -

The collaborations demonstrated their capabilities to identify charged-particle species. The plots from ALICE are particularly impressive, showing very clean dE/dx separation in their TPC and ITS, augmented by e/π separation in the TRD and nice separation by TOF. They are planning a publication on the multiplicities and momentum spectra of pions, kaons and protons. ATLAS and CMS also showed dE/dx separation of e/π/K/p, and CMS used this in constructing their φ→KK signal. ATLAS also has a TRT with very nice e/π separation. LHCb demonstrated that their RICH detector can identify charged kaons well.

All detectors are able to reconstruct leptons (electrons and muons) well. ATLAS, CMS and LHCb showed di-muon candidates, and those from CMS and LHCb are potential J/ψ signals. It will be interesting to see these samples grow from a handful of events to clear signals, hopefully with clear prompt and non-prompt components from B-hadron decays (at higher energies).

Nice examples of di-jet and tri-jet events were shown by ATLAS and CMS, and both collaborations were able to show a jet pT spectrum, with an excellent reproduction by the simulation. More importantly, the missing transverse energy (MET) distributions were shown, as was the canonical relation between MET and ΣET, again reproduced very well by the simulation. The CMS Collaboration showed results from their particle-flow algorithm, with a very good calibration and even the reconstruction of neutral hadrons in the calorimeter.
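
For readers unfamiliar with these quantities, here is a minimal sketch (my own illustration, not any experiment’s reconstruction code) of how MET and ΣET are built from the reconstructed objects; the canonical expectation is that the MET resolution grows roughly as √ΣET:

```python
import math

def met_and_sum_et(particles):
    """particles: list of (pt, phi) pairs for all reconstructed objects.
    MET is the magnitude of the negative vector sum of the transverse
    momenta; sum_ET is the scalar sum of the same quantities."""
    mex = -sum(pt * math.cos(phi) for pt, phi in particles)
    mey = -sum(pt * math.sin(phi) for pt, phi in particles)
    sum_et = sum(pt for pt, _ in particles)
    return math.hypot(mex, mey), sum_et

# Toy event: two roughly back-to-back jets -> small MET, sizeable sum_ET
print(met_and_sum_et([(30.0, 0.1), (28.0, 0.1 + math.pi)]))
```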

There were two presentations by the forward experiments, TOTEM and LHCf, showing that their apparatus is working and that they are able to understand their signals to some degree.

The most exciting note was struck by the ALICE and CMS Collaborations, who gave a glimpse of their fast-track physics analyses. ALICE has already submitted a publication to EPJ on the measurement of the charged particle pseudorapidity (η) density (arXiv:0911.5430). They will also measure the anti-proton/proton ratio, which is sensitive to the physics of hadronization, and the pT spectra of pions, kaons and protons. The CMS Collaboration will update the ALICE measurement of dN/dη, covering a wider range in η with much higher precision.

[Photo: Jim Virdee, spokesman of CMS, delivering the CMS report. People were standing in the back and sitting in the aisles.]

Clearly a tremendous amount of rapid work has been done by all collaborations, and the very brief presentations today showed only a very limited set of highlights. It is amazing how well the reconstruction and analysis of the data have proceeded already, and how well the simulations reproduce the data, especially at the level of the detector simulation. Of course this is only the beginning, but I can confidently predict an explosion of physics results to be published next year!

December 18, 2009 at 11:07 am 1 comment

