Twenty watts of terahertz
Terahertz radiation, also known as submillimeter radiation,
is the latest frontier in the electromagnetic spectrum. Coherent
sources produce electromagnetic waves extending from wavelengths
of 100,000 km to 0.1 µm, but until recently the
region between around 100 and 300 µm had no good
coherent sources. Such terahertz waves have a range of potential
applications, especially in biomedicine, where they can be
used to analyze surface proteins of living tissues to provide
instant biopsies for diseases such as skin cancers.
Current methods of producing terahertz radiation, however,
yield a limited average power of only about 1 mW. A pulse
of light from a femtosecond laser, for example, can accelerate
electrons to produce high-peak-power pulses, but the total amount
of radiation from each pulse is low, and so is the average power
output. Now, a collaborative effort among three national laboratories,
Brookhaven (Upton, NY), Lawrence Berkeley (Berkeley, CA),
and the Thomas Jefferson National Accelerator Facility (Newport
News, VA), commonly called the Jefferson Lab, has achieved
an average output of 20 W using relativistic electrons from
a linear accelerator, or linac (Nature 2002, 420, 153).
[Figure: Terahertz radiation coming through a window from the free-electron laser is reflected by a mirror onto the detector array of a pyroelectric camera to produce this false-color, real-time video image. (Jefferson Lab)]
The new method begins with the production of a picosecond-long
pulse of electrons generated by exposing gallium arsenide to a femtosecond
laser. The electrons are then fed into the 30-m-long linac at the
Jefferson Lab's free-electron laser (FEL) to boost their energy
to 40 MeV. The accelerated electrons are bent by a magnetic field
into a 1-m arc. As they are accelerated through this curve, they
radiate in the terahertz range.
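The link between a picosecond electron bunch and terahertz output follows from the bunch length: electrons packed into a bunch shorter than the emitted wavelength radiate coherently. The sketch below is a back-of-the-envelope estimate only; the 1-ps figure comes from the text, while the coherence argument is standard accelerator physics not spelled out in the article.

```python
# Rough estimate (not from the paper): a bunch of duration t_bunch is about
# c * t_bunch long, and it radiates coherently at wavelengths comparable to
# or longer than that length.
c = 2.998e8            # speed of light, m/s
t_bunch = 1e-12        # assumed ~1-ps bunch duration
length = c * t_bunch   # ~3e-4 m = 300 um, i.e., the submillimeter range
f_coherent = c / length
print(f"bunch length ~ {length * 1e6:.0f} um, coherent emission up to ~ {f_coherent / 1e12:.1f} THz")
```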
Although the number of electrons in each pulse is comparable to
that produced by existing techniques, the amount of energy each
electron radiates increases as the fourth power of the relativistic
factor, or of the energy. For 40-MeV electrons, this means a 200,000-fold
increase in power over subrelativistic electrons. With a repetition
rate of 37 MHz, the Jefferson Lab's FEL linac produces a 5-mA
current and a 20-W radiation output.
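A rough consistency check on those numbers is possible. The sketch below uses only the figures quoted above; the charge per bunch and the beam power are inferred from the stated current and energy, not given in the article.

```python
# Back-of-the-envelope check of the quoted figures.
E_MeV = 40.0                 # electron energy after the linac (from the article)
m_e_MeV = 0.511              # electron rest energy
gamma = E_MeV / m_e_MeV      # relativistic factor; radiated power scales as gamma**4
print(f"relativistic factor gamma ~ {gamma:.0f}")

f_rep = 37e6                 # bunch repetition rate, Hz (from the article)
I_avg = 5e-3                 # average beam current, A (from the article)
q_bunch = I_avg / f_rep      # implied charge per bunch (inferred, ~135 pC)
P_beam = I_avg * E_MeV * 1e6 # implied average beam power, W (inferred, ~200 kW)
print(f"charge per bunch ~ {q_bunch * 1e12:.0f} pC, beam power ~ {P_beam / 1e3:.0f} kW")
```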
The efficiency of the entire system is greatly enhanced because
after the electrons radiate, they are fed back into the accelerator
to decelerate them to 10 MeV. Thus, about three-quarters
of the energy of the electrons is fed back into the power supply,
reducing total power input by a factor of 4.
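The energy-recovery arithmetic implied by those figures is simple; the sketch below just restates the numbers in the text and is not the laboratory's own accounting.

```python
# Energy recovery: electrons are decelerated from 40 MeV back to 10 MeV,
# returning most of their energy to the accelerating structure.
E_accel = 40.0                              # MeV gained on acceleration
E_dump = 10.0                               # MeV remaining when the beam is dumped
recovered = (E_accel - E_dump) / E_accel    # 0.75 -> "about three-quarters"
net_fraction = 1.0 - recovered              # 0.25 of the energy must still be supplied
print(f"energy recovered: {recovered:.0%}; "
      f"net power input reduced by a factor of {1 / net_fraction:.0f}")
```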
"Of course, a 30-m-long accelerator is not a practical source
for medical diagnostics," says Gwyn P. Williams of the Jefferson
Lab, one of the researchers. "But we are currently looking
at what the minimum parameters needed for adequate power are. We
expect that we will be able to reduce the accelerator length to 2 to
3 m, and perhaps the cost to that of an MRI scanner." The team
is now working with Advanced Energy Systems (Medford, NY) to develop
a compact system for commercialization.
Such a terahertz scanner could detect skin cancer on a patient
without a biopsy and substitute spectral analyses for biopsies in
endoscopic procedures such as colonoscopies. (Terahertz radiation
does not penetrate tissue well, so it cannot probe for diseases
located much below the surface.) Because terahertz radiation does
penetrate cloth and paper easily, other possible applications include
security scanners.
Chaos in the engine
One of the fastest ways to reduce fossil-fuel use is to increase
the energy efficiency of machines, and one of the less-efficient
machines is the internal combustion engine, despite a
century of development. One cause of this inefficiency is
the stroke-to-stroke variation in the power supplied by each
piston. Because the crankshaft can absorb only a set amount
of power at a given rotation rate, variations in the energy
generated by each piston cause wasted power, which goes into
engine noise. Eliminating such variability would increase
engine efficiency by about 10% and save enormous amounts of
fuel each year.
Researchers at the Technical University of Lublin (Poland)
have found clues to how this stroke variability develops, clues
that may aid in reducing it. Using a fiber-optic-based pressure measurement
system, they obtained continuous recordings of pressure in
a single piston with high time resolution. They showed that
in certain types of operation, the variation of pressure in
the cylinder became chaotic. Chaotic variability, although
not random, is strongly nonperiodic and therefore difficult
to control by any conventional feedback mechanism.
[Figure: In an internal combustion engine, the variation of pressure (p) in the cylinder becomes chaotic at a critical value of the ignition advance angle, as shown by this pressure-time series. (Technical University of Lublin, Poland)]
The key parameter in the onset of chaos, the Lublin team discovered,
was the ignition advance angle, defined as the difference in the
angular position of the crankshaft between the time that the spark
ignites combustion and the time of maximum compression. The larger
the advance angle, the higher the torque and the greater the efficiency
of the energy conversion, other factors being equal. But the team
found that larger advance angles also led to increased variability
and eventually to chaotic functioning that cut efficiency.
At an advance angle of 5°, which is less than that generally
used for low-speed operation, compression is wholly repeatable and
variability is small. At 20°, somewhat more than the typical
angle, instabilities set in and periodically change the peak pressure
by 50% or more. True chaotic behavior occurs at 30°, an angle
that can be reached in high-speed operation or acceleration, with
peak pressure as much as triple the average value. Pressure variation
at this point exhibits the strange-attractor behavior characteristic
of chaotic systems, which lacks regular oscillation.
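One standard way to visualize the difference between repeatable and chaotic cycle-to-cycle behavior is a return map of successive peak pressures. The sketch below is purely illustrative: it uses synthetic data (a logistic map standing in for measured peak pressures) and is not the Lublin group's analysis.

```python
# Illustrative only (synthetic data): plot each cycle's peak value against the
# next cycle's. A repeatable engine cycle gives a few tight clusters of points;
# chaotic operation scatters the points over a structured set.
import numpy as np
import matplotlib.pyplot as plt

def logistic_series(r, n=500, x0=0.4):
    """Logistic map as a stand-in for deterministic cycle-to-cycle variability."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

periodic = logistic_series(r=3.2)   # settles into a period-2 cycle (small advance angle)
chaotic = logistic_series(r=3.9)    # non-periodic, chaotic regime (large advance angle)

fig, axes = plt.subplots(1, 2, figsize=(8, 4), sharex=True, sharey=True)
for ax, series, title in zip(axes, (periodic, chaotic), ("near-periodic", "chaotic")):
    ax.plot(series[:-1], series[1:], ".", markersize=3)
    ax.set_xlabel("peak value, cycle n (arb.)")
    ax.set_title(title)
axes[0].set_ylabel("peak value, cycle n+1 (arb.)")
plt.tight_layout()
plt.show()
```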
Improving engine efficiency requires achieving a larger advance
angle without setting off chaotic behavior. "There are a number
of ways of reducing the chaotic behavior," explains Grzegorz
Litak, one of the researchers. "You can use a higher spark
energy, swirl the incoming air faster, or use direct gasoline injection.
Unfortunately, all of these solutions are expensive."
Efforts to optimize engine functioning to avoid the chaotic
regime will be particularly important if hybrid engines, which feed
energy to an electric motor, become popular. Such engines can operate
continuously at an ideal engine speed, thus allowing parameters
such as the advance angle to remain at the most efficient levels.
3-D lithography
Conventional photolithography lays down and etches away two-dimensional
layers of materials. However, there are considerable advantages to
making fully three-dimensional objects and patterns on the microscopic
level, for example, the production of photonic crystals, whose patterns
of holes create desirable optical properties.
In the past few years, researchers have accomplished this
trick by using two-photon polymerization with femtosecond
lasers. This approach uses liquid resins that are transparent
to infrared light but polymerize and become solids when exposed
to ultraviolet (UV) light. When femtosecond laser pulses are
focused on these resins, the intensity at the focal point
is so high that molecules can rapidly absorb two photons,
enough to create radicals that set off polymerization. No
polymerization occurs outside the focal point, so one can
freely generate three-dimensional patterns by moving the laser's
focal point.
[Figure: Scanning electron microscope images of a microcapsule used for drug delivery and a microscale Venus statuette, both fabricated by means of two-photon polymerization with femtosecond lasers. (Laser Zentrum Hannover e.V.)]
Unfortunately, the commercial resins used in these experiments
do not have the optical, mechanical, or thermal properties desired
for many applications, including photonic crystals, because they
are relatively heat-sensitive and not very transparent.
A group of German researchers at Laser Zentrum Hannover
and the Fraunhofer-Institut für Silicatforschung (Würzburg)
have overcome this limitation. They did so by fabricating three-dimensional
submicrometer structures in an inorganic-organic hybrid material
that can be designed to have many of the desirable characteristics
of glass (Optics Lett. 2003, 28, 301).
Inorganic-organic hybrid polymers consist of resins and silicates
intermixed at the molecular level by a sol-gel process. They have
high optical transparency and high chemical resistance, and they
are mechanically and thermally stable in their solid form. The research
team, using a commercial hybrid called Ormocer, inscribed the three-dimensional
pattern with a Ti:sapphire laser operating at a wavelength of 780
nm, a pulse length of 50 fs, and an 80-MHz repetition rate.
Normally in photolithography, the smallest size of a feature is
limited by the wavelength of the light used. However, because the
two-photon process only functions above a sharp threshold in intensity,
it is possible to create structures with resolutions far smaller
than the 780-nm wavelength of the light. Only a small volume of
material around the peak intensity of the focal point is polymerized,
which enabled a resolution of about 200 nm.
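A rough sketch shows why the sharp threshold pushes feature size below the wavelength: the two-photon rate follows the square of the intensity, so only the narrow core of the focal spot exceeds the polymerization threshold. The focal-spot radius and threshold fraction below are assumed for illustration, not taken from the paper.

```python
# Illustrative estimate (assumed parameters): width of the region where the
# two-photon rate, which scales as intensity squared, exceeds a sharp threshold.
import numpy as np

w0 = 0.40          # assumed 1/e^2 focal-spot radius in um (tight focusing of 780-nm light)
threshold = 0.80   # assumed polymerization threshold as a fraction of the peak of I^2

# Gaussian focus: I(r) = I0 * exp(-2 r^2 / w0^2), so I^2 ~ exp(-4 r^2 / w0^2).
# Solving exp(-4 r^2 / w0^2) >= threshold for r gives the polymerized radius:
r_poly = w0 * np.sqrt(np.log(1.0 / threshold)) / 2.0
print(f"polymerized feature diameter ~ {2 * r_poly * 1e3:.0f} nm")
```

With these assumed numbers the polymerized spot comes out near the 200-nm scale the team reports, well below the 780-nm wavelength.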
As a demonstration, the team created a micrometer-scale statuette
and a photonic crystal array. To speed the fabrication process,
which took about 5 min, the researchers polymerized only the outer
surface of the statuette. After that, the surrounding resin was
washed away and the inside polymerized with a burst of UV light.
For practical applications, fabrication rates can be increased by
orders of magnitude by using more sensitive polymerization initiators.
"The great advantage of the hybrid materials is their flexibility,"
says Boris Chichkov of the Laser Zentrum team. "You can change
the index of refraction and maximize transparency for telecommunication
applications, or the biocompatibility for medical applications.
In addition, the materials used are inexpensive, so the range of
applications should be great."
Stock physics
The stock market in recent years has acted to transfer huge
sums of wealth from small investors to certain big investors.
Much of this wealth transfer occurred when the large investors
sold their huge holdings at high prices long before small
investors learned that a company's books were cooked. The
small investors then sold their stocks at vastly lower prices.
Why do the big fish (who own most of the shares) get out
safely, but when the little fish try to flee, the stock price
plummets in their stampede? A statistical study of stock trading
by Fabrizio Lillo and Rosario N. Mantegna from the University
of Palermo, Italy, and J. Doyne Farmer from the Santa Fe Institute
(NM) provides a partial quantitative answer (Nature
2003, 421, 129). The team was examining the more general issue
of how a single stock trade of a given size affects the price
of that stock. To do so, it analyzed trading data from 1995
to 1998 for the 1,000 stocks with the largest market values
on the New York Stock Exchange and found that the impact of
a given trade on the price of a stock increased more slowly
than linearly with the size of the trade. This tendency became
more pronounced for the biggest trades. The largest trades
typically involved 1,000 times as much money as the smallest
trades but moved the stock price only about 8 to 10 times
as far.
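Read as a power law, those ratios pin down the impact exponent. The sketch below is a back-of-the-envelope reading of the quoted numbers, not the authors' published fit.

```python
# If price impact grows as (trade size)**beta, a 1,000-fold larger trade that
# moves the price only 8-10 times as far implies beta ~ log(8..10) / log(1000).
import math

for impact_ratio in (8, 10):
    beta = math.log(impact_ratio) / math.log(1000)
    print(f"impact ratio {impact_ratio}x  ->  exponent beta ~ {beta:.2f}")
```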
This means that across the stock market, if a huge investor
sells $10 million worth of shares in a single transaction
and causes the price to fall by 2%, 1,000 investors each selling
only $10,000 in stock can cause the stock to plummet by 90%,
even though they have withdrawn the same total cash amount.
So it is easy for big investors to get out of a stock without
heavy losses, but impossible for small investors to do so
if many of them sell at the same time. Similarly, in a rising
market, a single large investor can easily buy in early without
making the stock soar, but small investors trying to get on
board cannot do so at the bottom because their buying sends
the stock into the stratosphere.
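The arithmetic behind that example, assuming the small sales hit the market one after another and their impacts compound multiplicatively, is an illustrative reconstruction rather than a calculation from the paper.

```python
# One $10-million sale is taken to move the price 2% (the article's figure).
# A $10,000 sale is 1,000 times smaller, so by the ~10x rule above it moves
# the price about 10 times less, roughly 0.2%.
big_impact = 0.02
small_impact = big_impact / 10
n_sellers = 1000

remaining = (1 - small_impact) ** n_sellers   # compounding 1,000 small hits
print(f"price after 1,000 small sales: {remaining:.0%} of its starting value "
      f"(a drop of about {1 - remaining:.0%})")
```

With these assumptions the price ends near 14% of its starting value, in line with the near-90% plunge described above.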
[Figure: Price shift versus normalized transaction size for buy orders initiated in 1996, where each curve represents a set of stocks homogeneous in market capitalization, increasing from low (A) to high capitalization (T), such as Coca-Cola, General Electric, and Exxon (a). The same data collapse onto a single curve when replotted on axes modified by the mean capitalization, C, raised to a power (b).]
"There is not a one-to-one correspondence of the size of
an order and the size of the transaction because several small orders
can get bundled together," cautions Farmer, "but large
orders tend to have less impact per share than small ones."