Archive for the ‘Mean machines’ Category

Qutrit breakthrough brings quantum computers closer

April 4, 2008

Toffoli gate

The folks playing with quantum computers have been claiming for years that their gadgets will one day make today’s supercomputers look like quivering lumps of jelly. But so far, their computers have yet to match the calculating prowess of a 10-year-old with ADHD.

The most exciting work so far has been on universal quantum logic gates, the building blocks of any computer. A number of groups have built and demonstrated these and one team even took their gates for the computing equivalent of a run round the block by factorising the number 15.

The trouble is that, to do anything useful with universal quantum gates, you need at least dozens and preferably hundreds of them, all joined together. And because of various errors and problems that creep in, that’s more or less impossible with today’s technology.

Which is why a breakthrough by an Australian group led by Andrew White at the University of Queensland is so exciting. They have built and tested quantum logic gates that are vastly more powerful than those that have gone before by exploiting the higher dimensions available in quantum mechanics. For example, a qubit can be encoded in a photon’s polarisation. But a photon has other dimensions which can also be used to carry information, such as its arrival time, photon number or frequency. By exploiting these, a photon can easily be used as a much more powerful three-level system called a qutrit.

This is how the Ozzie team have exploited the idea: during a computation, their gates convert qubits into qutrits, process the quantum information in this more powerful form and then convert it back into qubits. All using plain old vanilla optics.

That allows a dramatic reduction in the number of gates necessary to perform a specific task. Using only three of the higher-dimension logic gates, the team has built and tested a Toffoli logic gate that would otherwise have required six conventional logic gates. And they say that a computer that would need 50 conventional quantum logic gates could be built using only 9 of theirs.
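For reference, the Toffoli gate the team built is just a controlled-controlled-NOT. A minimal truth-table sketch of its action (this is the textbook definition, not anything specific to the paper):

```python
# Toffoli (controlled-controlled-NOT): flip the target bit
# only when both control bits are 1.
def toffoli(c1, c2, t):
    return (c1, c2, t ^ (c1 & c2))

for bits in [(0, 0, 1), (1, 0, 1), (1, 1, 0), (1, 1, 1)]:
    print(bits, "->", toffoli(*bits))
# Only the states with both controls set have their target flipped,
# e.g. (1, 1, 0) -> (1, 1, 1).
```

Simple as it looks classically, implementing this reversibly on qubits is expensive, which is why the qutrit shortcut matters.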

That’s a significant reduction. What’s more, they reckon that these kinds of numbers are possible with today’s linear optics technology.

That means these guys are right now bent over an optical bench with screwdrivers and lens cloths at the ready, attempting to build the world’s most powerful quantum computer. We may see the results–a decent factorisation perhaps–within months.

Could it be that Australia is about to become the center of the quantum computing world?

Ref: arxiv.org/abs/0804.0272: Quantum Computing using Shortcuts through Higher Dimensions

Buckyballs boost flash memory

April 1, 2008

Buckyball memory

There’s a problem on the horizon for memory chips called voltage scaling. It comes about because of a fundamental asymmetry in the design of nonvolatile charge-based memory.

These chips need to store data for about 10^12 times longer than it takes to program or erase them. That’s why it’s nonvolatile. This asymmetry is usually achieved by applying an external voltage to program and erase the memory. But as more memory gets packed into the same area, it becomes hard to supply the right amount of voltage. What’s needed is a material that has a similar or better ratio of retention to program/erase time but can be accessed with a lower voltage.
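To put that 10^12 figure in perspective, a back-of-the-envelope sketch; the 100-microsecond program time is an illustrative assumption, not a number from the paper:

```python
# Rough scale of the retention-to-program-time ratio.
# The 100-microsecond program/erase time is assumed for illustration;
# the ten-year retention target is typical for flash.
program_time_s = 100e-6
retention_s = 10 * 365.25 * 24 * 3600  # ten years in seconds

ratio = retention_s / program_time_s
print(f"ratio ~ {ratio:.1e}")  # ~3.2e+12, i.e. the 10^12 in the post
```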

The current favourite to improve the next gen of memory chips is a technology called metal nanocrystal memory in which data is stored in tiny crystals of metal. But it looks as if this technology can easily be made even better than anyone was expecting.

Enter C60 buckyballs, the famous carbon footballs named after the architect Buckminster Fuller, who popularised the geodesic dome. Tuo-Hung Hou and a few buddies at Cornell University in New York state have built and tested a new type of nonvolatile memory in which the data is stored in single molecules of C60. The molecules are relatively easy to embed in silicon because they do not clump together. They also have well defined and easily accessible energy levels that have to be overcome by tunnelling to store and erase data.

The team say that buckyballs offer a huge improvement in the ratio of data retention time to program/erase time compared with today’s technology. But perhaps more importantly, buckyballs also improve the retention to program/erase time ratio of metal nanocrystal memory by a whole order of magnitude. That means they could be added to the next gen of memory relatively easily.

Does that mean buckyballs will sit at the heart of future iPods? Could be.

Ref: arxiv.org/abs/0803.4038: Nonvolatile Memory with Molecule-Engineered Tunneling Barriers

Single photons bounced off orbiting satellite

March 17, 2008

Quantum satellite

Quantum physicists have been sending qubits through the atmosphere encoded in individual photons for years now. The work is the foundation of a new type of quantum communication that is perfectly secure from eavesdropping.

But there are challenges in setting up a global system of quantum communication. Not least is the problem of decoherence, in which noise destroys the quantum nature of the information as it travels through the atmosphere. This has limited the distance record for this kind of transmission to 144km (although longer distances are possible through optical fibres).

The obvious way around this is to send the signals through space via a satellite. When sent straight up, the photons need only travel through 8 kilometres of atmosphere and so are much less likely to decohere.
On Friday, Anton Zeilinger’s group in Vienna announced that they had taken the first step in this direction by bouncing single photons off an orbiting satellite some 1400 km above the Earth.

The team used a 1.5 metre telescope called the Matera Laser Ranging Observatory in Italy to bounce single photons off the Ajisai geodetic satellite, an orbiting disco ball that is used for laser ranging measurements.

Quantum communication with entangled photons can only be done by sending and detecting them one at a time so the experiment is a crucial step in making space-based quantum communication possible.

However, the team also tried bouncing photons off several other disco balls, such as Lageos II, without success.

But give them their due. The experiment proves that it is possible to use existing laser ranging equipment to send and receive single photons from orbiting satellites.

“Our findings strongly underline the feasibility of Space-to-Earth quantum communication with available technology,” says the team.

Of course, this isn’t a demonstration of quantum communication itself in space. That will require an orbiting source of entangled photons.

So all they need now is somebody to build and launch a satellite that can produce and transmit entangled photons. Any takers?

Ref: arxiv.org/abs/0803.1871: Experimental Verification of the Feasibility of a Quantum Channel between Space and Earth

Holographic quantum computing

March 5, 2008

Holographic quantum computing

After a decade or so in the lab, holographic data storage is about to burst into the hardware market big time.

Its USP is that holographic data is stored globally rather than at specific sites in the storage medium.

It is written using a pair of lasers to create an interference pattern that is recorded in the storage medium. It can then be viewed by illuminating that area with a laser to recreate the pattern. Crucially, you can add and view more data by changing the angle at which you address the medium, and this gives huge storage potential.

Now Karl Tordrup and colleagues at the University of Aarhus in Denmark have used the idea as inspiration for the design of a quantum computer. Their machine consists of an array of molecules that can each store a qubit. But instead of addressing them individually, Tordrup imagines storing quantum data in them as a group, by zapping them with the right kind of laser-created interference pattern. This is essentially quantum data storage, the holographic way.

What makes the idea interesting is that the group reckons that information can be processed by transferring it to a nearby superconducting box in which the required operations can be performed. The processed data is then sent back again.

The big advantage of this idea is that, while stored in holographic form, the quantum data is incredibly robust. While errors in any single molecule affect all qubits, they do so only very weakly. It also means that the molecules need only be addressed as a group, not as individuals, which does away with a significant challenge that other designs of computer face.

But there are problems too. The qubits will have to be protected from decoherence while in the superconducting box and travelling to and from it. And although the molecules do not need to be addressed individually, they do need to be held almost perfectly still. Those are toughies.

All they need to do now is build the thing.

Ref: arxiv.org/abs/0802.4406: Holographic Quantum Computing

The vibration harvest

February 27, 2008

Vibration harvester

All them turbines, drills and shakers in our modern factories make one almighty din.

We’re talking about a substantial amount of a-jumpin and a-jiggling which generally goes to waste. Couldn’t there be a way of harvesting this energy so that it can be re-used?

Turns out Tom Sterken and pals at IMEC, an independent nanostuff research lab in Belgium, have thought of a way to do it, and built a device that does.

It’s a MEMS gadget that consists of a tiny mass on a spring connected to a capacitor. As the mass bounces around, it generates a voltage and the resulting charge is stored by the capacitor.

When attached to an (unspecified) piece of industrial equipment, Sterken says his device generates 90 nanowatts of power. That doesn’t seem much: it’s not going to recharge your mobile phone or iPod.
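To get a feel for what 90 nanowatts buys you, a quick sketch; the 10-microjoule cost per sensor reading is an assumption for illustration, not a figure from the paper:

```python
# Energy budget for a vibration-powered sensor node.
# The 10 uJ per reading is an illustrative assumption.
harvested_power_w = 90e-9   # from the paper, via the post
reading_cost_j = 10e-6      # assumed cost of one low-power sensor reading

seconds_per_reading = reading_cost_j / harvested_power_w
print(f"one reading every ~{seconds_per_reading:.0f} seconds")
```

A reading every couple of minutes is plenty for condition monitoring, which is why the sensor application makes sense even at nanowatt scale.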

But it might power the sensors needed to monitor this and other devices. And these harvesters can only get better.

Ref: arxiv.org/abs/0802.3060: Characterisation of an Electronic Vibration Harvester

Read it and beep

February 19, 2008

Reading text is a simple enough task for humans. But unless it’s cleaned up and served on a plate, computers just can’t do it.

At least they couldn’t until Mireille Boutin and pals from Purdue University took a shot at the problem.

These guys have built an impressive algorithm that looks for and finds text in real-life cluttered images.

And it works well. In their admittedly limited tests on 65 real-life images, the algorithm correctly identified the text 97 per cent of the time.

Cars that can read signposts, anyone?

Ref: arxiv.org/abs/0801.4807: Automatic Text Area Segmentation in Natural Images

Who's that on the runway?

February 14, 2008

Super dense landings

As an air traffic controller, the last thing you want is the catastrophic failure of your technology infrastructure.

And thankfully it doesn’t usually happen like that. More often, there is a gradual degradation as one part of the system or another collapses. Perhaps the communications die, the radar falls over or the computers crash, or maybe some of the ground structure becomes unavailable as airports have to close.

Obviously, air traffic controllers are trained to deal with these situations, but could the air traffic itself be made more robust? Maxime Gariel and Eric Feron, from the Georgia Institute of Technology in Atlanta, argue that it could and have even come up with a way of evaluating it.

They’ve studied various types of high-density, highly structured aircraft traffic and worked out how difficult it becomes to handle as information about it begins to fail. It turns out, they say, that certain types of traffic structure are easier to handle than others as the infrastructure degrades.

So if you’re at all worried about the introduction of Free Flight (where aircraft essentially manage their own traffic control in most areas), you needn’t be. Gariel and Feron say it ain’t any worse than the current system should the air traffic control system go tits up.

Ref:  arxiv.org/abs/0801.4750: Graceful Degradation of Air Traffic Operations

Listening out for neutrinos

January 16, 2008

Acoustic neutrinos

A lotta neutrino detectors work by looking for the flares ‘n’ flashes of light generated on the rare occasion a neutrino smashes into something solid, like an atom.  Ya need to do a lotta lookin’ though, which is why neutrino detectors sit in vast pools of water or are dropped into the oceans or buried in ancient Antarctic ice.

Now Jonathan Perkin, a fresh-faced physicist at the University of Sheffield, suggests that it might also be possible to listen for the sound that neutrinos make when they whizz through water. The thinkin is that an ultra high energy neutrino smashing into an atom rapidly raises the temperature of the water nearby, producing an acoustic pulse which could be picked up by an off-the-shelf hydrophone.

Neat idea, except that Perkin goes on to say that to have a realistic chance of picking up the sound of neutrinos you’d need a detector that is over a thousand cubic kilometres in size with over 100 hydrophones in each cubic kilometre. That’s big enough to be a nonstarter in most people’s books.
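Just to spell out the scale implied by those figures:

```python
# The array Perkin describes, using the numbers quoted above
# as lower bounds.
volume_km3 = 1000          # over a thousand cubic kilometres
hydrophones_per_km3 = 100  # over 100 per cubic kilometre

total_hydrophones = volume_km3 * hydrophones_per_km3
print(total_hydrophones)  # at least 100000 hydrophones
```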

Where neutrino noise might turn out to be useful, however, is for filtering data from the big neutrino telescopes such as IceCube under the South Pole.

Pick up both the light and sound signatures of a neutrino hit and that’s pretty good evidence that it ain’t no false alarm. So the sound of neutrinos zipping past might turn out to be useful after all.

Ref: arxiv.org/abs/0801.0991: The Acoustic Detection of Ultra High Energy Neutrinos

Cooling with sound

January 14, 2008

The next generation of chips are gonna need some major coolin’, perhaps as much as 1000 watts of cold per square centimetre. We’re talkin’ high-speed microprocessors, optoelectronics, micro- and millimetre-wave power electronics, and power conditioning transistors for electronic motor control in hybrid vehicles, power converters and the like. These are machines that will generate significant heat.

Ordinary coolin’ fans ain’t gonna work for those kindsa fluxes so a lotta engineers have been thinkin’ about phase change systems in which a liquid absorbs heat, boils into a gas and carries the energy away.

The trouble is that a vapor layer tends to form over the surface of a chip and this acts as an insulator, preventing further heat transfer.

So Ari Glezer and pals at the Georgia Institute of Technology in Atlanta have come up with a way of dislodging the vapor by bombarding it with sound. It’s just a pump for pushing the liquid round, with a piezovibrator to create an acoustic field.

They reckon it works well and at relatively low power. In their setup, they achieve a cooling rate of 165 W per square cm. But with acoustic zapping they can raise this to 338 W per square cm. That’s more than double.
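Checking the arithmetic from the quoted figures:

```python
baseline_w_cm2 = 165  # boiling heat transfer without the acoustic field
enhanced_w_cm2 = 338  # with acoustic zapping switched on

gain = (enhanced_w_cm2 - baseline_w_cm2) / baseline_w_cm2
print(f"{gain:.0%} improvement")  # 105% improvement, i.e. roughly double
```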

Ref: arxiv.org/abs/0801.0785: Acoustically Enhanced Boiling Heat Transfer

The Turing alternatives

January 9, 2008

Alternative Turing tests

Y’all heard of the Turing test for measuring machine intelligence. Seems kinda odd, doncha think, that after 50 years we ain’t never thought of any other ways to test machine intelligence.

Same thing occurred to Shane “Hind” Legg at the IDSIA (Istituto Dalle Molle di Studi sull’Intelligenza Artificiale) in Switzerland and his buddy Marcus Hutter, so they delved into the topic. Turns out, various researchers have come up with interesting machine intelligence tests which have been ignored by the community. So Shane and Marcus have listed them all for ya on the arXiv.

Here are some highlights:

Compression: a machine intelligence has to compress a corpus of knowledge. This can only be done effectively when the machine has some understanding of the knowledge which allows it to extract some kind of overall structure. So a measure of compression can be interpreted as a measure of understanding

Linguistic complexity: measuring a machine’s conversational ability using techniques developed for tracking the complexity of children’s language

Competitive games: a broad definition of intelligence is the ability to do well at a wide range of tasks. So one idea is to test machines against each other on a wide range of games

The paper contains other tests so feel free to take a look.
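The compression test is easy to sketch with off-the-shelf tools: a corpus with structure squeezes down far more than random bytes do. Here zlib is a crude stand-in for a real model of the text, so this illustrates the principle only:

```python
import os
import zlib

# Structured text compresses well; random bytes barely at all.
structured = b"the cat sat on the mat. " * 200
random_ish = os.urandom(len(structured))

for name, data in [("structured", structured), ("random", random_ish)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{name}: compressed to {ratio:.0%} of original size")
```

The better a machine models the regularities in its input, the shorter it can make the description, which is the intuition behind using compression as a proxy for understanding.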

Ref: arxiv.org/abs/0712.3825: Tests of Machine Intelligence