Archive for December, 2008
Jack Gunion and Radovan Dermisek have recently published papers on a novel Higgs scenario:
h → aa → 4τ
In this case, the Higgs boson, h, has properties similar to the standard model Higgs boson, and the authors suppose a mass of about 100 GeV. The pseudoscalar Higgs boson, a, is much lighter: in order to escape bounds from LEP, it must decay predominantly to tau pairs. For details, see Many Light Higgs Bosons in the NMSSM (arXiv:0811.3537) and A light CP-odd Higgs boson and the muon anomalous magnetic moment (arXiv:0808.2509) and references therein.
Four-tau final states are certainly very rare at a hadron collider, and the kinematics of this channel should lead to a very unusual topology. If we take Mh = 100 GeV and Ma = 4 GeV, then each a boson would be highly boosted in the lab frame, and the tau pair would be highly collimated. In the special case that both taus decay leptonically, one has a pair of oppositely charged, isolated leptons, possibly with different flavors (e.g., e+μ−). Their invariant mass would be low, since Ma is low (by assumption) and there are four neutrinos in a double-leptonic tau-pair decay. The transverse momentum of the tau pair, reconstructed as a lepton pair (ee or μμ or eμ), would be high, however, compared to standard model processes such as Drell-Yan production of tau pairs. Furthermore, there are two a bosons per event, so in an ideal case there would be four isolated leptons in a highly distinctive topology. The dream topology would be along the lines of two e+μ− pairs, each with high transverse momentum, on opposite sides of the detector, and isolated. I can’t imagine there would be any significant background for such topologies, nor do I see any special reconstruction problems. My intuition says that the high pT of the a boson should render the leptons from the tau decays sufficiently energetic that they can be reconstructed and would provide an adequate trigger – but this requires a simulation to be sure.
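A few lines of arithmetic show just how collimated the tau pairs would be. This is only a sketch under the masses assumed above (Mh = 100 GeV, Ma = 4 GeV), taking each a to carry half the Higgs energy in the h rest frame:

```python
import math

M_h = 100.0  # assumed Higgs boson mass, GeV
M_a = 4.0    # assumed pseudoscalar mass, GeV

# Each a carries E_a ~ M_h/2 in the h rest frame
# (neglecting corrections of order M_a^2/M_h^2).
E_a = M_h / 2.0
gamma = E_a / M_a    # Lorentz boost factor of each a
theta = 2.0 / gamma  # characteristic opening angle of the tau pair, radians

print(f"gamma = {gamma:.1f}, tau-pair opening angle ~ {math.degrees(theta):.1f} deg")
```

A boost factor of about 12 squeezes the tau pair into a cone of roughly 9 degrees, which is what makes the pair look like a single collimated object.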
What is the rate? Let’s consider the production mechanism with the highest cross section – gluon fusion. (For an excellent discussion, see a post by Tommaso Dorigo from May of this year.) At the Tevatron, the cross section is about 1.6 pb, so we might take σ×Br(h→aa) = 1 pb, which is pessimistic according to Gunion and Dermisek. This means that 4000 h→aa events have been produced in 4 fb-1 of CDF or D0 data. The leptonic branching ratio of the τ lepton is about 0.18 per species (i.e., 0.18 each to e and to μ, or 0.36 for both together). The number of produced events in various purely leptonic final states would be:
- 4 events as eeee
- 4 events as μμμμ
- 16 events as eμ on both sides
- 67 events as any allowed combination of e and μ
These numbers are not insubstantial, but are the acceptance and efficiency high enough?
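The counts above follow from simple combinatorics on the branching ratios. A quick check, using the σ×Br = 1 pb and 4 fb-1 assumed above:

```python
sigma_br_pb = 1.0   # assumed σ×Br(h→aa→4τ), in pb
lumi_invfb = 4.0    # integrated luminosity per Tevatron experiment, fb^-1
N_prod = sigma_br_pb * lumi_invfb * 1000.0  # 1 pb = 1000 fb → 4000 events

br = 0.18  # Br(τ → e ν ν) ≈ Br(τ → μ ν ν) ≈ 0.18

n_eeee   = N_prod * br**4           # all four taus → electrons
n_4mu    = N_prod * br**4           # all four taus → muons
n_emu2   = N_prod * (2 * br**2)**2  # each a → exactly one e and one μ
n_anylep = N_prod * (2 * br)**4     # every tau decays to e or μ

# Truncating to whole events reproduces the list above:
print(int(n_eeee), int(n_4mu), int(n_emu2), int(n_anylep))  # → 4 4 16 67
```

The factor of 2 in the eμ line counts the two assignments (which tau gives the electron); the last line is just (2×0.18)⁴ = 0.36⁴ times the produced sample.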
Trigger efficiency: we would want two leptons with pT > 8 GeV or so. The typical lepton pT would be around 8 GeV, I believe: very roughly, pT,lep ≈ ⅓×pT,τ = ⅓×½×pT,a = ⅓×½×½×Mh ≈ 8 GeV. The transverse momentum of the h is not insubstantial and will help push two of the leptons above threshold, so let’s say the trigger efficiency is 50%. It would be easy for at least one of the leptons to fall outside the acceptance; I would guess each lepton has something like a 20% chance of being lost. Finally, one would need to identify all four leptons with at least some loose criteria, including isolation, and maybe that costs another factor of ½. Throwing these factors around in a wild fashion, the acceptance × efficiency seems to be of order 5-10%. This means there is little hope for a large sample of “dream” events (eμ twice), but it is not crazy to imagine gathering a sample of, say, five events in all topologies, given that around 67 have been produced, hypothetically. Since this is a counting experiment, the search would succeed only if the number of expected background events is fewer than one, but I’ll bet that would be the case.
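These wild factors can be multiplied out explicitly. A sketch under my assumptions only: 50% trigger efficiency, a 20% chance per lepton of falling outside the acceptance, and a factor ½ for identifying all four leptons:

```python
M_h = 100.0  # assumed Higgs boson mass, GeV

# Typical lepton pT: one third of the tau pT, which is half the a pT,
# which is half the Higgs mass.
pt_lep = (1.0 / 3.0) * 0.5 * 0.5 * M_h  # ≈ 8.3 GeV

eps_trig   = 0.50     # two leptons above the pT threshold (guess)
eps_accept = 0.80**4  # all four leptons inside the acceptance (20% loss each, guess)
eps_id     = 0.50     # loose identification + isolation of all four (guess)

eps_total = eps_trig * eps_accept * eps_id  # ≈ 0.10
n_obs = 67 * eps_total                      # from the ~67 all-leptonic events produced

print(f"pT,lep ≈ {pt_lep:.1f} GeV, acc×eff ≈ {eps_total:.0%}, N ≈ {n_obs:.1f}")
```

The product lands at the upper end of the 5-10% range and yields a handful of events – consistent with the hope of a sample of around five, but clearly a real simulation is needed.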
So, is this search viable? CDF and D0 have already published very nice papers searching for h → τ+ τ–, so the expertise is at hand… Any takers?
PS. Recent CDF papers using the tau final state to search for new particles include Search for Doubly Charged Higgs Bosons with Lepton-Flavor-Violating Decays involving Tau Leptons (arXiv:0808.2161) and Search for neutral MSSM Higgs bosons decaying to tau pairs in p-pbar collisions at sqrt(s)=1.96 TeV (arXiv:hep-ex/0508051), with more on the way…
I followed a link provided at the Physics and Physicists blog to an article written by the head of the Department of Physics at the University of Tennessee. Tough Times for UT Physics Dept describes the impact of funding cuts on the department. The scenario resembles the one at Northwestern University, where I teach. The number of faculty and lecturers is decreasing steadily, the number of teaching assistants even faster, and the tenured faculty have responded by teaching more than they did in the past.
The posted comments are extremely negative, essentially lambasting the author of that article and academia in general for failing to face reality, and for living a cushy life. The contributors do not appear to be warped or irrational, or to bear special animus toward UT or physics. The basic message is stark, however, for example:
I recommend you tighten your belts and forego raises and take benefit cuts, and share the pain of the private sector.
Those of you who work at universities have long enjoyed a lifestyle most of us private-sector folk envy. Welcome to reality, its about time you joined us.
Or in a more pointed manner:
Who wants to support the Department of Perpetual Grievances? I’m sure such sentiment is unfair to science departments but there it is. Clean your academic house up or get increasingly screwed in the future. Academia is rotting from the core. And for the record I’ve got faculty experience and ten years of post HS degrees.
No one posted a single supportive or sympathetic comment.
To be honest, this scares me. The commentators appear to be well-educated people, yet they hold American universities in very low regard. Many of us at universities feel our research activities are in danger, due to falling federal support and the need to spend more time teaching and supporting our departments. If we took guidance from that segment of the American public, however, we would have to spend even less time on research!
It is often said that American education, especially graduate education, is the best in the world. It is supposed to be one of the great engines of technical innovation. Politicians, journalists and scientists in this country have sounded the alarm about declining support for research, and about American students’ falling interest in, and preparation for, graduate-level studies. There clearly is a gulf between them and the commentators whose sharp words I quoted above.
Who is right? Are we professors pampered, ripe for pay cuts and more coursework? Are American students avoiding physics and other difficult subjects because we do not do a good job teaching them? Is there too much scientific research in the United States?
You can probably guess my opinion. What’s yours?
The winter holidays provide an opportunity to read more than one normally can (and to blog more, too). This year I have come across some excellent papers already, and perhaps the most startling one was written by Bob McElrath, who is currently a research fellow at CERN.
Bob’s latest paper is called Emergent Electroweak Gravity (arXiv:0812.2696) – a catchy title to be sure. The paper is short but very deep, and since I am an experimenter, I am not well qualified to judge his conclusions. He recently presented this work in the Chicago area, and my theorist friends seem to find it very interesting indeed.
The paper begins with basic quantum mechanics and statistical mechanics, and quickly comes to the conclusion that startles me the most:
…today, WIMPs and at least two neutrino mass eigenstates are definitely quantum liquids.
The point is that there should be an attractive contact interaction which induces the phase transition to a super-fluid. Since collisions are very rare, the super-fluid is not disrupted. The implication is that calculations of relic densities and WIMP searches need to be revised so that the WIMPs are treated as a quantum liquid rather than a classical gas. I am no expert, but doesn’t this mean that the conventional wisdom needs serious revision, if Bob is right?
In the last third of this short paper, Bob makes some more exciting claims. He points out that a condensate will break Lorentz invariance. More specifically, Poincare invariance will be broken spontaneously and dynamically, and relic neutrino density will vary in space. This links standard model neutrino interactions with the space-time Lorentz group, and Bob identifies a generator for the broken symmetry and the associated Goldstone bosons. The field operator for the neutrino condensate and the propagating Goldstone bosons together form a 4-tensor that would be identified with the spin-2 graviton. Hence, Bob speculates, gravity arises from the standard model interactions of relic neutrinos which necessarily form a quantum super-fluid!
Again, as an experimenter I won’t judge the validity of these claims, but I will say that I find these ideas exciting. I might even venture a couple of basic questions (though not smart ones…)
- What is the impact on predictions for observing WIMPs in direct and indirect detection experiments? What about PAMELA and ATIC?
- Since Bob links gravity to standard model interactions of relic neutrinos, there will be a numerical connection between Newton’s constant and electroweak constants. Can one specify these connections, based on a model for the neutrinos and their density?
- Can we find a way to confirm the breaking of Lorentz invariance from astrophysical observations?
I am confident that Bob will write more about this in the future.
John Conway expresses similar interest in his latest entry at CosmicVariance. As he says, it would be great to read opinions from the experts…
About three weeks ago, I was privileged to attend the final week in the INT 08-3 workshop on Low Energy Precision Physics in the LHC Era. (INT is the Institute for Nuclear Theory at the University of Washington.) This excellent series should be a good model for other workshops which seek to bring together experimenters and theorists to really work together, exchanging real information and bridging gaps in understanding. Hats off to Jens Erler and the other organizers for an extremely well conceived, well planned and well executed workshop.
A workshop should be focused but not narrow. The material should pull you out of your comfort zone without making you feel totally ignorant and stupid. The setting should be nice, even beautiful, and conducive to fun discussions and nascent friendship. Theorists may enjoy conferences like this, but experimenters generally don’t have the pleasure. A workshop like INT 08-3 which is good for both experimenters and theorists is golden!
Our group for the last week was small but we meshed very well. David Mack knows all about low-energy probes of electroweak physics, and he told us a lot about QWEAK at JLAB. Shufang Su explained the theoretical underpinnings of low-energy precision measurements, as regards physics beyond the standard model. These two talks sparked several discussions on how to combine future precision information with LHC discoveries to infer which theory beyond the standard model is likely the right one. (For example, if the LHC experiments find anomalous events with jets and missing energy, is it SUSY or extra dimensions? Perhaps QWEAK or MOELLER can give us a hint.) I tried to make predictions about when new physics might be observed at the LHC and I will summarize my thoughts about that some other time. At the end of the week, George Hou gave a fascinating talk-and-a-half on heavy quarks, combining theory and experiment in a remarkable way. I wanted to run out and look for them immediately – though he and his group in Taiwan are already well prepared…
If discoveries are made at the LHC which can revolutionize particle physics, then high-quality communication between the theory and experimental communities will play a crucial role. Many people recognize this – witness the spate of invitations to theorists to speak directly to analysis groups in collider collaborations, and to leading experimenters to provide information at traditionally theory-oriented summer retreats. My hope is that these bridges grow rapidly to meet a soaring demand. And perhaps the blogosphere will help sow further seeds of discourse…
Readers of physics blogs may notice that collider physicists seem to want to find the Higgs over and over again… Some time ago, Tommaso Dorigo mounted a very challenging strategy of using the ttH channel in hadronic mode (i.e., the Higgs is produced in association with a top and an anti-top quark). In fact, he even issued an open invitation to come up with useful kinematic quantities. More recently, he discussed looking for Higgs bosons in a variant of the so-called golden mode, in which the Higgs boson decays to four leptons through a pair of intermediate Z bosons. Tommaso and his group plan to look for the Higgs in the more challenging di-lepton and di-b-jet channel.
Why not just focus on one, promising channel? Why pursue the more difficult ones?
The standard model of particle physics predicts unambiguously the properties of the Higgs boson (its decays and production cross section) for any assumed value of its mass. Theories of physics beyond the standard model (supersymmetry, little Higgs models, extra-dimensional theories, etc.) predict deviations from those standard model values. So the pat answer to those questions above is: we want to know whether the Higgs boson that we discover (if we discover it) is exactly like the standard model version, or some other version, and the only way to know is to look for deviations from the standard model predictions. Since we don’t know where those deviations might occur, we need to look everywhere.
But for an experimenter, there is an almost crude reason for wanting to discover the Higgs boson in several channels. We need to know that the signal is real. We do not want to report a statistical fluctuation – or worse, a defective analysis. It is important to be skeptical. But sometimes this translates into a reluctance to report results, or even a suppression of unexpected (or unwelcome) results, and I think this is wrong. If something is a true signal, then repeated, related analyses will prove it more definitively than one single, over-done piece of work.
Inside the internal world of experimental collider physics, I am happy to see people attack the difficult channels, and to try to think of new ones. (I am doing this, too.) For the outside world of interested educated people, please understand that finding the Higgs several different ways is plain good science. Let’s hope that, one or at most two years from now, bloggers like Tommaso, Gordon, colleagues at the US-LHC blog site and at Cosmic Variance among others, will be debating explanations and hypotheses of real Higgs signals, plural.
Nearly one year after my last post (sorry!), I am ready to pick up this effort again.
I had hoped to be writing about signs of new physics from the LHC or the Tevatron – or at least progress in analyzing the first LHC collision data. You all know about the magnificent start-up of the LHC, and the awful catastrophe a short time later. CERN has approached the problem with great professionalism, skill, and urgency, and I am optimistic about seeing collision data in late summer, 2009.
I won’t wait that long before making another blog entry…