Number 600, August 1, 2002
by Phil Schewe, James Riordon, and Ben Stein
High-Precision Tests of the Standard Model
High-precision tests of the Standard Model have been reported this
past week in two areas: CP-violation in B mesons (experiments at the
KEK lab in Japan and the SLAC lab in California) and the magnetic
moment of the muon (an experiment at the Brookhaven lab in New York).
The standard model, which seeks to explain the forces of nature through
the exchange of particles, consists of the electroweak framework (force
exchanged by photons and by the Z and W bosons) plus the quantum chromodynamic
(QCD) framework for quarks (force exchanged by gluons).
The model has been highly successful in accounting for the behavior
of electrons in atoms (in the case of some transition frequencies,
theory and experiment agree at the parts-per-trillion level or better)
and does a good job of predicting other phenomena as well, such as
CP violation. The model does not include, but can accommodate, neutrino mass.
Extensions of the standard model, such as superstring theory--which
pictures all matter as consisting of tiny strings or membranes--can
(unlike the standard model) account for the force of gravity, the
existence of extra spatial dimensions, and the proposition (known
as supersymmetry, or SUSY) that all fermion particles have boson counterparts
and vice versa.
SUSY is by now an acceptable idea to many particle physicists, but
it would necessitate an overhaul of the standard model since the existence
of superparticles would entail a whole new force, one which transforms
fermions into bosons and back again.
The new CP violation tests were reported at the International Conference
on High Energy Physics in Amsterdam. Both the Belle detector group
at KEK and the BaBar detector group at SLAC observed subtleties in
the decays of B mesons and measured a parameter called sine two beta.
The value measured by both groups, with much better precision than
ever before, is approaching the value predicted by the standard model,
thus erasing past discrepancies.
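For readers curious about what "sine two beta" multiplies, the standard textbook expectation for B0 decays to the CP eigenstate J/psi K-short is that the matter-antimatter decay-rate asymmetry oscillates in time with the B0 mixing frequency, with amplitude sin 2 beta. The short sketch below evaluates that relation; the numerical inputs are illustrative assumptions, not the values reported in Amsterdam.

    # Sketch: time-dependent CP asymmetry for B0 -> J/psi K_S (textbook form),
    #     A_CP(t) ~ sin(2*beta) * sin(delta_m_d * t),
    # neglecting the small cosine term.  All numerical inputs below are
    # illustrative assumptions, not the conference values.
    import math

    sin_2beta = 0.73           # assumed illustrative value of sin(2*beta)
    delta_m_d = 0.5e12         # assumed B0-B0bar mixing frequency, in 1/s

    for t_ps in (0.0, 1.0, 2.0, 3.0):          # proper decay time, picoseconds
        t = t_ps * 1e-12
        a_cp = sin_2beta * math.sin(delta_m_d * t)
        print(f"t = {t_ps:.1f} ps   A_CP ~ {a_cp:+.3f}")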
Meanwhile, at Brookhaven the g-2 collaboration seeks to observe a
departure of the muon's g factor (the parameter relating the muon's
magnetic moment to its spin) from 2, the value it would have in the absence
of interactions between the muon and virtual particles in the universal
vacuum, including possible exotica outside the standard model such
as the supersymmetric entities. Although the SUSY particles are rare
and unstable, their mere existence in the vacuum would modify observable
quantities such as the muon magnetic moment.
Thus a measurement of the magnetic moment, by watching muons decay
even as they wobble about in a strong magnetic field, would give indirect
evidence for the extra particles. Moderate evidence in this direction
was previously reported by the g-2 team; the new results, also reported
in Amsterdam (and submitted to Physical Review Letters), follow
suit but with twice the precision of the last report.
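For a sense of scale, the quantity the experiment tracks is the "anomalous" precession: the muon's spin turns relative to its momentum at a rate set by the anomaly a = (g-2)/2 and the storage-ring magnetic field. The sketch below uses that textbook relation with assumed, round numbers for the field and the anomaly; it is not the collaboration's analysis, which involves many corrections.

    # Sketch: relation between the muon anomaly a_mu = (g-2)/2 and the
    # "anomalous" spin-precession frequency omega_a in a storage-ring field B:
    #     omega_a ~ a_mu * e * B / m_mu      (relativistic corrections ignored)
    # All numbers are assumed, round values for illustration only.
    e    = 1.602e-19      # elementary charge, C
    m_mu = 1.883e-28      # muon mass, kg
    B    = 1.45           # assumed storage-ring field, tesla
    a_mu = 1.166e-3       # rough size of the muon anomaly, (g-2)/2

    omega_a = a_mu * e * B / m_mu              # rad/s
    print(f"omega_a ~ {omega_a:.3e} rad/s "
          f"({omega_a / (2 * 3.141592653589793) / 1e3:.0f} kHz)")

    # Inverting the same relation: a measured omega_a plus a known B gives a_mu.
    a_mu_back = omega_a * m_mu / (e * B)
    print(f"recovered a_mu ~ {a_mu_back:.3e}")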
New Cosmological Upper Limit on Neutrino Mass
Neutrino news has been dramatic these past few years: neutrinos have
been shown to oscillate from one type to another (See Update
375) and the solar neutrino problem has been resolved (See Update
586) after puzzling solar physicists for decades.
These results imply that at least one of the neutrino flavors
(electron, mu, tau) has some mass, and this, considering the number
of nu's loose in the universe, means that even lightweight neutrinos
will have had a palpable role in influencing the development of galaxies.
But how much nu mass is there and how big a role did nu's play? Particle
physics experiments so far directly establish only differences between
the squares of neutrino masses. From tritium decay experiments comes an
upper limit of 2.2 eV for the electron neutrino. Upper limits for the
mu or tau neutrinos are up in the MeV range.
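The reason oscillations pin down only mass-squared differences can be read off the standard two-flavor oscillation formula, sketched numerically below with an assumed mixing angle, mass splitting, baseline, and energy.

    # Sketch: two-flavor neutrino oscillation probability,
    #     P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    # with dm2 in eV^2, L in km, E in GeV.  Only dm2 (a difference of squared
    # masses) enters, which is why oscillations alone cannot fix absolute masses.
    # All parameter values below are illustrative assumptions.
    import math

    def p_osc(sin2_2theta, dm2_ev2, L_km, E_GeV):
        return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

    # e.g. an atmospheric-scale splitting observed over a few hundred km
    print(p_osc(sin2_2theta=1.0, dm2_ev2=2.5e-3, L_km=500.0, E_GeV=1.0))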
The new mass limits come from looking at the distribution of galaxies
across the canopy of the sky. The 2dF Galaxy Redshift Survey has scanned
250,000 galaxies (viewed 400 at a time with a telescope at Siding Spring
Mountain, Australia). The galaxy coordinates can be compared two at
a time, providing a plot of the number of galaxy pairs versus inter-galaxy
separation.
Turned into a galactic "power spectrum," this correlation
study can be used to estimate the likely density of the constituent
species of matter in the universe: baryons (such as protons), cold dark
matter (WIMPs), and hot dark matter (neutrinos are the leading candidate).
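The "compared two at a time" step amounts to pair counting: histogram the separations of all galaxy pairs and look for an excess of close pairs relative to an unclustered random catalog. The toy sketch below does this for made-up points in a box; it is not the 2dF pipeline, which works in redshift space and fits a full power spectrum.

    # Toy sketch of pair counting: histogram galaxy-pair separations and
    # compare with a random (unclustered) catalog of the same size.
    # Purely illustrative; not the 2dF analysis.
    import numpy as np

    rng = np.random.default_rng(0)

    def pair_separation_hist(points, bins):
        """Histogram of distances between all distinct pairs of points."""
        diffs = points[:, None, :] - points[None, :, :]
        d = np.sqrt((diffs ** 2).sum(axis=-1))
        iu = np.triu_indices(len(points), k=1)          # count each pair once
        return np.histogram(d[iu], bins=bins)[0]

    n, box = 400, 100.0
    random_cat = rng.uniform(0, box, size=(n, 3))       # unclustered points
    # crude "clustered" catalog: points scattered around a few cluster centers
    centers = rng.uniform(0, box, size=(20, 3))
    clustered = centers[rng.integers(0, 20, n)] + rng.normal(0, 3.0, size=(n, 3))

    bins = np.linspace(0, 50, 26)
    dd = pair_separation_hist(clustered, bins)
    rr = pair_separation_hist(random_cat, bins)
    # An excess of close pairs over random signals clustering (a two-point statistic).
    print((dd[:5] + 1) / (rr[:5] + 1))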
The 2dF work arrives at two big neutrino conclusions. (1) Neutrinos
can account for no more than 13% of the matter in the universe and (2)
the sum of all the nu masses (electron plus mu plus tau) is no more
than 2.2 eV.
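The step from a summed neutrino mass to a share of the cosmic matter budget is, approximately, the standard relic-neutrino relation Omega_nu times h-squared ~ (sum of masses)/94 eV. The sketch below runs that relation with assumed values of the Hubble parameter and matter density, just to show the ballpark; it does not reproduce the survey's actual fit.

    # Sketch: the standard relic-neutrino relation
    #     Omega_nu * h^2 ~ (sum of nu masses) / 94 eV
    # links a cosmic neutrino fraction to a summed mass.
    # The values of h and Omega_m below are assumed, illustrative choices.
    h       = 0.7       # assumed Hubble parameter (H0 = 100*h km/s/Mpc)
    omega_m = 0.3       # assumed total matter density parameter
    frac_nu = 0.13      # neutrino fraction of the matter, from the 2dF bound

    omega_nu = frac_nu * omega_m
    sum_m_nu = omega_nu * h ** 2 * 94.0        # eV
    print(f"Omega_nu ~ {omega_nu:.3f}  ->  sum of nu masses ~ {sum_m_nu:.1f} eV")

With these assumed inputs the 13% bound corresponds to a summed mass of a bit under 2 eV, in the same ballpark as the 2.2 eV figure quoted above.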
Group member Oystein Elgaroy (University of Cambridge) says that this is
the best upper limit for neutrino
mass derived with relatively conservative assumptions on the total matter
density in the universe. (Elgaroy et al., Physical Review Letters,
5 August 2002; also see Physical Review Focus.)
A New Way of Measuring Complexity
A new way of measuring complexity for biological systems has been
proposed by researchers at Harvard Medical School and the University of
Lisbon (contact Madalena Costa, 617-667-2428; Ary L. Goldberger,
617-667-4267; and C.-K. Peng, 617-667-7122). Their method suggests
that disease and aging can be quantified in terms of information loss.
In the researchers' view, a biological organism's complexity is intimately
related to its adaptability (e.g., can it survive hostile environments
on its own?) and its functionality (e.g., can it do higher math?).
In this view, disease and aging reduce an organism's complexity, thereby
making it less adaptive and more vulnerable to catastrophic events.
But traditional yardsticks sometimes contradict this "complexity-loss"
theory of disease and aging. Such conventional metrics, originally developed
for information science, quantify complexity by determining how much
new information a system can generate.
By these traditional measures, a diseased heart with a highly erratic
rhythm like atrial fibrillation is more complex than a healthy one.
That's because a diseased heart can generate completely random variations
("white noise") in its heart rate. These random variations
continually produce "new" information, i.e., information
that cannot be predicted from the heart's past history. On the other
hand, a healthy heart displays a less-random pattern known as 1/f noise.
The problem, according to the researchers, is that conventional measures
of complexity ignore multiple time scales. To address the inherent multi-scale
nature of biological organisms, the researchers developed a new "multi-scale
entropy" (MSE) tool for calculating biological complexity.
Their technique works like this: Take a heart rate time series of about
30,000 beats. Then split it into coarse-grained chunks of 20 heartbeats
each and compute the average heart rate in each chunk. Then measure
the heart rate's unpredictability (its variations from chunk to chunk).
More unpredictability means more new information, and greater complexity.
Repeat this complexity calculation numerous times for different-sized
chunks, from 1 to 19 heartbeats. Such a technique can reveal the complex
arrangement of information over different time scales.
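A bare-bones version of that recipe is sketched below: coarse-grain the series at each scale, then score each coarse-grained series with a standard "sample entropy" estimate of unpredictability. The template length, tolerance, and test signals here are assumed choices for illustration, not the settings used in the study.

    # Sketch of a multiscale-entropy (MSE) style calculation:
    # 1) coarse-grain the series by averaging non-overlapping chunks of length s,
    # 2) estimate the unpredictability of each coarse-grained series with
    #    sample entropy.  Parameter choices here are illustrative assumptions.
    import numpy as np

    def coarse_grain(x, scale):
        """Average consecutive, non-overlapping chunks of length `scale`."""
        n = len(x) // scale
        return x[: n * scale].reshape(n, scale).mean(axis=1)

    def sample_entropy(x, m=2, r_factor=0.15):
        """SampEn: -ln(fraction of length-m matches that remain matches at m+1)."""
        x = np.asarray(x, dtype=float)
        r = r_factor * x.std()
        def count_matches(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            count = 0
            for i in range(len(templates)):
                d = np.abs(templates[i + 1:] - templates[i]).max(axis=1)
                count += int((d <= r).sum())
            return count
        b = count_matches(m)
        a = count_matches(m + 1)
        return np.inf if a == 0 or b == 0 else -np.log(a / b)

    rng = np.random.default_rng(1)
    white = rng.normal(size=3000)                  # uncorrelated, "erratic" signal
    correlated = np.cumsum(rng.normal(size=3000))  # strongly correlated toy signal

    print("scale  SampEn(white)  SampEn(correlated)")
    for scale in (1, 5, 10, 20):
        print(scale,
              round(sample_entropy(coarse_grain(white, scale)), 2),
              round(sample_entropy(coarse_grain(correlated, scale)), 2))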
Applied to heartbeat intervals in healthy young and elderly subjects,
patients with severe congestive heart failure, and patients with atrial
fibrillation, the MSE algorithm consistently gives the fluctuations
of healthy hearts a higher complexity rating than the fluctuations of
diseased or aging hearts. (Costa
et al., Physical Review Letters, 5 August 2002.)
Physics News Update
Physics News Update will presently go on a three-week holiday.