Archive for August, 2008

Rot 'n' decay (part 2)

August 31, 2008

More highlights from the arXiv this week:

Quantum Computing with Alkaline Earth Atoms

Graphene Nanodevices: Bridging Nanoelectronics and Subwavelength Optics

Observation of Quantum Capacitance of Individual Single Walled Carbon Nanotubes

Inferring Basic Parameters of the Geodynamo from Sequences of Polarity Reversals

Can Matter Wave Interferometer Detect Translational Speed?

Rot 'n' decay

August 30, 2008

The best of the rest from the physics arXiv this week:

Gravitational Time Advancement and its Possible Detection

How Hard Is Bribery in Elections?

On the Stability of Black Holes at the LHC

The Flavour of Inflation

Dante, Astrology and Astronomy

Do nuclear decay rates depend on our distance from the sun?

August 29, 2008

radioactive-decay.jpg

Here’s an interesting conundrum involving nuclear decay rates.

We think that the decay rates of elements are constant regardless of the ambient conditions (except in a few special cases where beta decay can be influenced by powerful electric fields).

So that makes it hard to explain the curious periodic variations in the decay rates of silicon-32 and radium-226 observed by groups at the Brookhaven National Laboratory in the US and at the Physikalisch-Technische Bundesanstalt in Germany in the 1980s.

Today, the story gets even more puzzling. Jere Jenkins and pals at Purdue University in Indiana have re-analysed the raw data from these experiments and say that the modulations are synchronised with each other and with Earth’s distance from the sun. (Both groups, in acts of selfless dedication,  measured the decay rates of silicon-32 and radium-226 over a period of many years.)

In other words, there appears to be an annual variation in the decay rates of these elements.

Jenkins and co put forward two theories to explain why this might be happening.

First, they point to a theory developed by John Barrow at the University of Cambridge in the UK and Douglas Shaw at the University of London, which suggests that the sun produces a field that changes the value of the fine structure constant on Earth as the planet's distance from the sun varies during each orbit. Such an effect would certainly cause the kind of annual variation in decay rates that Jenkins and co highlight.

Another idea is that the effect is caused by some kind of interaction with the neutrino flux from the sun’s interior, which could be tested by carrying out the measurements close to a nuclear reactor (which would generate its own powerful neutrino flux).
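To get a feel for how big any sun-linked effect could be, here's a quick back-of-envelope sketch (my numbers, not Jenkins's): Earth's orbital eccentricity makes the solar flux, and hence any solar neutrino flux at Earth, vary by a few per cent over the year via the inverse-square law.

```python
import math

# Toy estimate (not from the paper): if decay rates couple to the solar
# neutrino flux, the annual 1/r^2 modulation from Earth's orbital
# eccentricity sets the expected size of the effect.
AU = 1.0            # semi-major axis, in astronomical units
ECC = 0.0167        # Earth's orbital eccentricity

def earth_sun_distance(days_since_perihelion):
    """Distance in AU, small-eccentricity approximation."""
    mean_anomaly = 2 * math.pi * days_since_perihelion / 365.25
    return AU * (1 - ECC * math.cos(mean_anomaly))

def relative_flux(days_since_perihelion):
    """Solar flux relative to its value at 1 AU (inverse-square law)."""
    return (AU / earth_sun_distance(days_since_perihelion)) ** 2

# Peak-to-trough variation over a year: roughly 7 per cent
fluxes = [relative_flux(d) for d in range(366)]
print(max(fluxes) - min(fluxes))
```

That few-per-cent swing is comfortably larger than the modulation the Purdue team reports, so a flux-coupled mechanism is at least not ruled out by size alone.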

It turns out that the notion that nuclear decay rates are constant has been under attack for some time. In 2006, Jenkins says, the decay rate of manganese-54 in his lab decreased dramatically during a solar flare on 13 December.

And numerous groups disagree over the decay rate for elements such as titanium-44, silicon-32 and cesium-137. Perhaps they took their data at different times of the year.

Keep 'em peeled because we could hear more about this. Interesting stuff.

Ref: arxiv.org/abs/0808.3283: Evidence for Correlations Between Nuclear Decay Rates and Earth-Sun Distance

Why Galileo underestimated the distance to the stars

August 28, 2008

airy-disc.jpg

“Galileo argued that with a good telescope one could measure the angular sizes of stars, and that the stars typically measured a few arc-seconds in diameter,” says Chris Graney at Jefferson Community College in Louisville in good ol’ Kentucky.

That doesn’t sound right.  We know today that stars appear as point sources of light, so what was Galileo talking about?

Graney says that it looks increasingly likely that Galileo had developed an ingenious technique for measuring the angular size of distant objects. This allowed him, for example, to see that Jupiter’s apparent diameter became smaller as the distance between Earth and Jupiter increased.

So how did Galileo measure the angular size of stars?

The answer, according to Graney, is that Galileo must have been unknowingly measuring the diffraction pattern his telescope created from starlight, the so-called Airy disc, which makes stars seem to have an apparent diameter.
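A rough sanity check (the aperture and wavelength here are my assumptions, not Graney's figures): for a small objective of the kind Galileo used, the standard diffraction formula gives an Airy disc of just the size he reported.

```python
import math

# Diffraction-limited angular radius of the Airy disc: theta ~ 1.22 * lambda / D.
WAVELENGTH = 550e-9          # metres, middle of the visible band (assumed)
APERTURE = 0.025             # metres; a ~2.5 cm objective is plausible for
                             # Galileo's telescopes (assumed for illustration)
ARCSEC_PER_RAD = 206265.0

theta = 1.22 * WAVELENGTH / APERTURE     # radians
print(theta * ARCSEC_PER_RAD)            # a few arcseconds
```

The answer comes out at around 5 arcseconds, nicely in the "few arc-seconds" range Galileo claimed to measure.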

This was how he arrived at absurdly close estimates of their distance. For example, he thought that the brightest stars were about 360 astronomical units away, several orders of magnitude nearer than the figures we accept today.

Makes sense, I suppose. And since the wave mechanics necessary to understand diffraction wasn’t developed until some two centuries after Galileo’s death, perhaps we should forgive him this one mistake.

Ref: arxiv.org/abs/0808.3411: Objects in the Telescope are Further Than They Appear

The sound of a bouncing basketball

August 27, 2008

basketball.jpg

“A basketball bounced on a stiff surface produces a characteristic loud thump, followed by high pitched ringing,” says Jonathan Katz at Washington University in St Louis.

The question is why and, conveniently, Katz provides the answer on the arXiv today.

He assumes first that a basketball is an inextensible but perfectly flexible hollow sphere.

From this, he calculates that the thump is the result of the change of shape of the ball as it deforms when hitting a hard surface.

This creates a monopole source of sound that goes through only one full cycle of its frequency (of about 82 Hz for a full-sized basketball), hence the dull thump.

The ringing is caused by what Katz calls a dipole emission of sound and is essentially the vibration of the air within the basketball after the impact.
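As a back-of-envelope check (my estimate, not Katz's own calculation), the ringing should sit near the lowest dipole acoustic mode of the air cavity inside the ball, which for a rigid spherical cavity occurs where k·a ≈ 2.08, the first root of the derivative of the spherical Bessel function j1.

```python
import math

# Lowest dipole mode of a spherical air cavity: f = (ka) * c / (2 * pi * a),
# with ka ~ 2.08 for the first l=1 mode of a rigid sphere.
SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
RADIUS = 0.12            # m, roughly a full-sized basketball (assumed)
KA_DIPOLE = 2.08         # first root of j1'(ka) = 0

f_ring = KA_DIPOLE * SPEED_OF_SOUND / (2 * math.pi * RADIUS)
print(f_ring)
```

That lands near 950 Hz, comfortably "high-pitched" next to the 82 Hz thump.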

That’s a neat bit of calculating, and Katz is honest to a fault when it comes to pointing out the weaknesses of his model. He says, for example, that while the assumption of inextensibility is reasonable, perfect flexibility is a vast simplification that is necessary to make the problem tractable.

However, he says, assuming the opposite–that the ball has an extensible membrane and an incompressible filling fluid–produces neither a thump nor a ring.

If you’ve ever dropped a balloon filled with water, you’ll know what he means.

Ref: arxiv.org/abs/0808.3278: Thump, Ring: the Sound of a Bouncing Ball

The ultimate black hole size limit

August 26, 2008

ultramassive-black-holes.jpg

We have a pretty good idea that a supermassive black hole is sitting at the center of our galaxy. By supermassive, astronomers mean millions of times as massive as our sun.

That’s pretty big by any standards, but how big can black holes get? Is there any limit to how big these monsters can become?

According to Priyamvada Natarajan at Yale University and a pal, the answer is yes.  Black holes, they say, cannot be bigger than 10^10 times the mass of the sun. (Or at least, are very unlikely to be bigger than that).

They arrive at this figure by calculating the rate at which a black hole can swallow stuff and how much it could have gorged on since the universe was born, which seem like reasonable limits.

They call these beasts ultramassive black holes and reckon that there should be around 7×10^−7 of them per cubic megaparsec in the nearby universe. That’s not many. Our best chance of finding one is to look in the bright, central cluster galaxies in the local universe.
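A quick bit of arithmetic on that number density shows just how sparse these beasts should be (the 100-megaparsec search radius is my choice, for illustration):

```python
import math

# Expected count of ultramassive black holes within a sphere of given radius,
# using the paper's quoted number density.
DENSITY = 7e-7        # per cubic megaparsec (from the paper)
RADIUS_MPC = 100.0    # survey radius in Mpc (assumed for illustration)

volume = 4.0 / 3.0 * math.pi * RADIUS_MPC ** 3
expected = DENSITY * volume
print(expected)
```

So even out to 100 megaparsecs we'd only expect to bag about three of them.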

Better start scanning.

Ref: arxiv.org/abs/0808.2813: Is there an upper limit to black hole masses?

Why aluminum should replace cesium as the standard of time

August 25, 2008

micromagic-clock

The second is defined as 9,192,631,770 vibrations of a cesium atom and measured in a device known as a fountain clock. These work by cooling a tiny cloud of cesium atoms to a temperature close to absolute zero, tossing it up in the air and zapping it with microwaves as it falls.

Then you watch the cloud to see if it fluoresces. This fluorescence is maximised when the microwave frequency matches a hyperfine transition between two electronic states in the atoms, at exactly 9,192,631,770 Hz.

Various labs around the world use this method to run clocks with an accuracy of around 0.1 nanoseconds per day. That’s impressive but not perfect. Fountain clocks have one drawback: the clouds of cesium tend to disperse quickly and that limits how accurately you can take data.
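That stability is easier to compare across clock technologies as a fractional frequency uncertainty, which a couple of lines of arithmetic recover from the quoted 0.1 nanoseconds per day:

```python
# Convert the quoted drift into the dimensionless figure of merit for clocks.
CS_FREQ = 9_192_631_770    # Hz, the cesium hyperfine transition defining the second
DRIFT = 0.1e-9             # seconds gained or lost per day (from the text)
SECONDS_PER_DAY = 86_400

fractional = DRIFT / SECONDS_PER_DAY
print(fractional)                 # about 1e-15
print(fractional * CS_FREQ)       # the corresponding frequency spread, in Hz
```

In other words, today's fountains pin down the cesium line to a fractional uncertainty of around one part in 10^15.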

Now there’s a new kid on the block which looks as if it’s going to be better at keeping time.

Today some chaps from the University of Nevada in Reno and the University of New South Wales in Sydney outline a new clock that relies on an effect called the Stark shift, in which a spectral line is split by an electric field (this is the electric analogue of the Zeeman effect, in which spectral lines are split by a magnetic field).

This is a complex phenomenon, but the key thing is that the same electric field can influence the split in different ways. In fact, a couple of groups have recently discovered that in certain circumstances these effects can cancel each other out at specific “magic” frequencies of an electric field. When that happens, the line splitting vanishes.

This should be pretty straightforward to measure. The electric field is supplied by trapping the atoms in a standing electromagnetic wave, otherwise known as a standard optical lattice. Then change the laser frequency while looking at the atomic spectra. When the line splitting vanishes, you’ve hit the magic frequency.
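The search procedure boils down to root-finding on the differential Stark shift as a function of lattice frequency. Here's a toy sketch with invented polarizability curves (nothing in it comes from the paper; the shapes and numbers are made up to illustrate the idea):

```python
# Toy model: two levels whose Stark shifts depend differently on the
# lattice frequency f (arbitrary units). The "magic" frequency is where
# the differential shift crosses zero, found here by bisection.

def shift_upper(f):
    return 1.0 / (f - 0.5)      # invented polarizability curve

def shift_lower(f):
    return 1.8 / (f - 0.3)      # invented polarizability curve

def differential_shift(f):
    return shift_upper(f) - shift_lower(f)

def find_magic(lo, hi, tol=1e-9):
    """Bisect for the zero crossing of the differential shift in [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if differential_shift(lo) * differential_shift(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

magic = find_magic(0.6, 10.0)
print(magic)   # the line splitting vanishes at this frequency
```

In the real experiment the "scan" is done with the laser rather than in software, of course, but the logic is the same: sweep the frequency and watch for the splitting to vanish.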

The big advantage of this method is that you can trap millions of atoms easily in an optical lattice and that should make such a clock much more robust than a fountain, while achieving at least the same kind of accuracy.

So what kind of atom should we choose to sit at the heart of these “micromagic clocks”? The Aussie-American group says that, contrary to previous reports, cesium does not have a magic frequency and so can’t be used in this technique. Aluminum, on the other hand, should be perfect.

The second is dead, long live the second.

Ref: arxiv.org/abs/0808.2821: Micromagic Clock: Microwave Clock Based on Atoms in an Engineered Optical Lattice

Rocks 'n' comets (part 2)

August 24, 2008

More highlights from the physics arXiv:

Local Causality and Completeness: Bell vs. Jarrett

Superconducting Atom Chips: Advantages and Challenges

The Atmospheric Signatures of Super-Earths: How to Distinguish Between Hydrogen-Rich and Hydrogen-Poor Atmospheres

Networks of Quantum Nanorings: Programmable Spintronic Devices

The Meccano of Life

Rocks 'n' comets

August 23, 2008

The best of the rest from the physics arXiv this week:

Why it has Become more Difficult to Predict Nobel Prize Winners

A Local Scheme Accounting for EPR Quantum Correlations

Enhanced Heating of Salty Water and Ice Under Microwaves

Phonon-Induced Artificial Magnetic Fields

Ranges of Atmospheric Mass and Composition of Super Earth Exoplanets

How to find another Earth

August 22, 2008

exoplanet.jpg

“We stand on a great divide in the detection and study of exoplanets,” says the Exoplanet Task Force on the arXiv today in describing their plan for finding another Earth orbiting another star.

On one side of this divide are the hundreds of known massive exoplanets, they say. And on the other “lies the possibility, as yet unrealized, of detecting and characterizing a true Earth analog–an ‘Earth-like’ planet”.

And guess what, the Exoplanet Task Force knows how to bridge this divide.

Their idea is to keep looking, using better telescopes. And when you find one, build a better telescope to look even harder.

It’s a strategy that’s guaranteed to succeed, provided you’re wearing a mask and a cape and call yourself the Exoplanet Task Force. Let’s face it, there’s nothing you can’t do while wearing your underpants on the outside.

Good luck fellas. We’re all behind ya.

Ref:  arxiv.org/abs/0808.2754: Worlds Beyond: A Strategy for the Detection and Characterization of Exoplanets