Archive for the ‘Nets 'n' webs’ Category

Criticality and the brain

April 8, 2008

Brain connections

Our understanding of how various parts of the brain function is advancing at breakneck speed, and yet we are as far away as ever from an overarching “theory of the brain” that unites these disparate discoveries under a single theme.

Now Dante Chialvo from Northwestern University in Chicago and colleagues attempt to do just that. Their proposal is that the brain is spontaneously poised at the border of a second order phase transition, just like the transition a ferromagnetic material undergoes as it switches from a non-magnetic to a magnetic phase.

One of the features of these transitions is the existence of a critical point at which both phases exist simultaneously, so that the distinction between them more or less disappears. At this so-called “criticality”, all kinds of curious phenomena have been found, including self-organising behaviour.
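
For a feel of what criticality means in the ferromagnetic case the authors use as an analogy, here is a minimal Python sketch of a 2D Ising model driven by a Metropolis update. The lattice size, temperatures and sweep counts are arbitrary choices for illustration; near the critical temperature (about 2.27 in the usual units) correlations span the whole lattice and the magnetisation fluctuates strongly, though a run this short only gives a rough flavour of that.

```python
import numpy as np

def ising_sweep(spins, beta, rng):
    """One Metropolis sweep of a 2D Ising lattice with periodic boundaries."""
    n = spins.shape[0]
    for _ in range(n * n):
        i, j = rng.integers(0, n, size=2)
        # Energy cost of flipping spin (i, j): dE = 2 * s_ij * (sum of neighbours), with J = 1.
        nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j] +
              spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
        dE = 2 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

rng = np.random.default_rng(0)
n = 24
# Compare magnetisation below, near and above the critical temperature (T_c ~ 2.27).
for T in (1.5, 2.27, 3.5):
    spins = rng.choice([-1, 1], size=(n, n))
    for _ in range(300):                      # equilibration sweeps
        ising_sweep(spins, 1.0 / T, rng)
    mags = []
    for _ in range(200):                      # measurement sweeps
        ising_sweep(spins, 1.0 / T, rng)
        mags.append(abs(spins.mean()))
    print(f"T = {T:4.2f}  <|m|> = {np.mean(mags):.2f}  std = {np.std(mags):.3f}")
```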

Chialvo and buddies say “all human behaviors, including thoughts, undirected or goal oriented actions or any state of mind, are the outcome of a dynamical system at or near a critical state.”

They make a list of features they would expect the brain to demonstrate in experiments were it operating close to criticality.

At large scales, they say, we should see cortical long range correlations in space and time as well as large scale anti-correlated cortical states. That certainly seems to be true of our brains in general.

And at small scales, we should see “neuronal avalanches” as the normal homeostatic state for most neocortical circuits. And sure enough, the group points to evidence for this.

The trouble is that these look very much like after-the-fact predictions, a feeling that is backed up by the absence of any testable hypotheses about the brain in this paper.

If the brain is close to criticality (which doesn’t seem like too far-fetched an idea), surely it would be possible to make some predictions about the results of experiments such as those involving human attention, optical illusions and the reaction to various stimuli.

So while Chialvo’s proposal may make the pretense of being a theory of the brain, to my mind it will have to settle for the status of “interesting idea” until somebody takes it significantly further.

Ref: arxiv.org/abs/0804.0032: The Brain: What is Critical about It?

The coming blackout

April 2, 2008

On Monday, 17th December 2007, Europe narrowly avoided disaster. A cold snap had lowered the temperature across much of the continent to several degrees below average, and that evening, as households across the continent switched on their heating systems, power consumption hit critical levels.

France, Italy and Spain all set new records for power consumption. By sheer luck, Switzerland and Germany, which were less cold, were able to provide some 1.6 GWe of spare capacity to cover the cracks in the system.

As it turned out, the rest of the winter was abnormally mild. But had the cold snap been more widespread, the European electricity supply could have collapsed.

The problem dates from about 30 years ago, when Europe’s grid system and generating capacity were built with a huge amount of headroom. Since then, as economies have boomed, politicians have had little incentive to upgrade the system. In the meantime, consumption has been increasing at a rate of 1-2 per cent per year, and today the spare capacity has all but gone. The simplest extrapolation is that demand will continue to grow at the same rate, so a crisis looms.
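
As a back-of-the-envelope illustration of why that extrapolation is worrying, here is a tiny Python sketch of how quickly a spare-capacity margin is eaten by compound growth. The 5 per cent starting margin is my own illustrative assumption, not a figure from the UCTE report or from Dittmar’s analysis.

```python
# Back-of-the-envelope: years until peak demand exhausts an assumed spare-capacity
# margin if consumption compounds at 1-2 per cent per year. The 5 per cent margin
# is an illustrative assumption, not a reported figure.
margin = 0.05

for growth in (0.01, 0.02):
    demand, years = 1.0, 0
    while demand < 1.0 + margin:
        demand *= 1.0 + growth
        years += 1
    print(f"{growth:.0%} annual growth: margin exhausted in about {years} years")
```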

Now the Union for the Co-ordination of Transmission of Electricity, an association of power providers in Europe, has issued a report detailing the system’s shortcomings. And an analysis on the arXiv by Michael Dittmar at the Swiss Federal Institute of Technology in Zurich paints an even gloomier picture, not least because there is no clear short-term path to reducing consumption or increasing generating capacity.

Europe has suffered a number of large blackouts in recent years, notably in Italy on 28-29 September 2003 and in France and Germany on 4 November 2006. But worse looks to be on the cards. Dittmar’s message is that the coming winter of 2008/9 will test the European grid to its limits.

Ref: arxiv.org/abs/0803.4421: The European Electricity Grid System and Winter Peak Load Stress

Proof that a minority of streets handle the majority of traffic

March 12, 2008

Gavle

In recent years, physicists have turned their penetrating gaze towards the structure of towns and cities. What they tend to do is measure the “connectedness” of a town by looking at how many other streets each street is connected to. It turns out that cities follow an 80/20 rule: 80 per cent of the streets have below-average connectedness while 20 per cent have above-average connectedness.

This is no surprise, since the same kind of 80/20 pattern crops up with alarming regularity in all kinds of networks, particularly social ones. (The most famous example is Pareto’s law, which states that 80 per cent of the wealth is owned by 20 per cent of the people.)

But so what? Poring over maps and sweating over street names may be a theoretical physicist’s idea of fun, but nobody has actually shown that the 80/20 rule has any tangible effect on street use.

Now Bin Jiang at the Hong Kong Polytechnic University has come up with some actual data from a real town. He says that 80 per cent of the traffic in a Swedish town called Gavle flows along 20 per cent of the streets. And the most highly connected 1 per cent of streets accounts for a phenomenal 20 per cent of the flow. What’s more, he says the flow is intimately linked to the topology of Gavle (a town of 70,000 people).
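
Jiang’s headline numbers boil down to a simple calculation: rank the streets, then ask what share of the total flow the top slice carries. Here is a minimal Python sketch of that calculation on invented, heavy-tailed traffic counts; the numbers are placeholders, not the Gavle data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented, heavy-tailed traffic counts standing in for the real Gavle measurements.
traffic = rng.pareto(a=1.2, size=1000) + 1.0

def top_share(flows, fraction):
    """Fraction of total flow carried by the top `fraction` of streets."""
    ranked = np.sort(flows)[::-1]
    k = max(1, int(len(ranked) * fraction))
    return ranked[:k].sum() / ranked.sum()

for frac in (0.01, 0.20):
    print(f"top {frac:.0%} of streets carry {top_share(traffic, frac):.0%} of the flow")
```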

So there you have it. Although it seems only common sense to imagine that most traffic flows along the best-connected streets, we now have some evidence to prove it. Good, solid, unspectacular physics.

Ref: arxiv.org/abs/0802.1284: Street Hierarchies: A Minority of Streets Account for a Majority of Traffic Flow

Can data overload protect our privacy?

March 10, 2008

Messenger

If you were chatting on MSN Messenger in June 2006, your conversation was being recorded and the details (but not the content) passed to Eric Horvitz and Jure Leskovec at Microsoft Research in Redmond, Washington. Using this data, these scientists have created “the largest social network constructed and analyzed to date”.

They’ve now published their results, which show the habits of people who use Messenger and the sheer scale of the chatter. But this study is noteworthy for another reason: it gives a curious insight into the limitations of this kind of analysis. The Microsoft team says it simply had too much data to crunch effectively.

Here’s what they did. The researchers used data such as IP addresses and log-in and log-out times, as well as self-reported information such as age, sex and zip code (which is obviously highly accurate), to carry out their analysis.

The bald details are that 30 billion IM conversations took place between 180 million people all over the world in June 2006.

The researchers found that people tend to chat to individuals who share the same language, age group and geographical location (in other words, to people like themselves). They also chat more often and for longer with members of the opposite sex.

Each account had on average 50 buddies and, in the IM world, people are separated by “7 degrees of separation”.
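
The “7 degrees” figure is just the average shortest-path distance between users in the buddy graph, and on a graph of 180 million nodes that has to be estimated by sampling sources rather than computed exactly. Here is a rough sketch of such an estimate using networkx on a small synthetic graph with about 50 buddies per node; the graph is made up, not the Messenger data.

```python
import random
import networkx as nx

# Toy stand-in for the buddy graph: 10,000 users with about 50 buddies each.
# The real Messenger graph had 180 million nodes, so average separation is
# estimated by sampling source nodes rather than computed exactly.
G = nx.gnm_random_graph(10_000, 250_000, seed=42)

random.seed(0)
hops = []
for source in random.sample(list(G.nodes), 50):
    lengths = nx.single_source_shortest_path_length(G, source)
    hops.extend(d for d in lengths.values() if d > 0)

print(f"estimated average separation: {sum(hops) / len(hops):.2f} hops")
```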

That’s about the strength of it and I’m underwhelmed. No fascinating insights into the correlation between chatting spikes and news broadcasts/ad breaks/episodes of Friends; or the patterns of chat in the workplace versus home using IP location changes; or how IM users travel the world. Just straightforward count ’em ‘n’ weep numbers.

But there’s a good reason for the lack of more detailed insight. The problem, say Horvitz and Leskovec, is the size of the database: 4.5 terabytes, which took 12 hours just to copy onto a dedicated eight-processor server. “The sheer size of the data limits the kinds of analyses one can perform,” they say.

So will data overload always protect us from Big Brother’s prying eyes? Perhaps in some circumstances like these but otherwise I wouldn’t count on it. It’s straightforward to sample big datasets like this (although that can introduce problems of its own).
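
One standard trick for cutting a stream like this down to size is reservoir sampling, which keeps a fixed-size uniform sample without ever holding the whole dataset in memory. The paper doesn’t say whether Horvitz and Leskovec did anything of the sort; this is just a generic sketch.

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    rng = random.Random(seed)
    sample = []
    for i, item in enumerate(stream):
        if i < k:
            sample.append(item)
        else:
            j = rng.randint(0, i)      # item i replaces an existing one with probability k/(i+1)
            if j < k:
                sample[j] = item
    return sample

# Example: sample 5 "conversations" out of a stream of a million without storing them all.
print(reservoir_sample(range(1_000_000), 5))
```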

I wouldn’t mind betting that with a little more effort, it would be possible to identify individuals from their travel and chatting patterns, perhaps by correlating the data with local telephone and business directories much in the same way this has been done with search data. However, it looks as if Horvitz and Leskovec have steered carefully around this issue.

Of course, Microsoft doesn’t need to do this since it can store a much fuller set of data anyway including the full text of the conversations and whatever data it has on the identity of the owners.

And you can be sure that more shadowy organisations with access to much greater computing resources will also have this full data set and be happily chewing through it as you read this.

Ref: arxiv.org/abs/0803.0939: Planetary-Scale Views on an Instant-Messaging Network

Food for thought

March 4, 2008

Food for thought

Evolution seems to crop up all over the place. In life, business, ideas. And now in recipes through the ages.

Yup, that’s recipes. For food. Osame Kinouchi from the Universidade de São Paulo in Brazil and buddies have studied the way in which the ingredients used in recipes vary around the world and through the ages. And they’ve found, they say, evidence of evolution.

The team studied the relationship between recipes and ingredients in four cookbooks: three editions of the Brazilian Dona Benta (1946, 1969 and 2004), the French Larousse Gastronomique, the British New Penguin Cookery Book, and the medieval Pleyn Delit.

They took the recipes from each book, counted the number of times each ingredient appeared in those recipes and ranked the ingredients by frequency.

What’s remarkable is that the frequency-rank distribution they found is more or less the same for each cookbook. Kinouchi and co say this can be explained if recipes evolve in much the way that living organisms do: in a landscape in which some ingredients can be thought of as fitter than others, random mutations take place, and some ingredients die out while others prosper.
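
The counting step itself is trivial, which is part of the appeal. Here is a minimal Python sketch of the rank-frequency calculation, using invented placeholder recipes rather than anything from the four cookbooks.

```python
from collections import Counter

# Invented placeholder recipes; the real study counted ingredients across four
# cookbooks and compared the resulting rank-frequency curves.
recipes = [
    ["flour", "egg", "butter", "sugar"],
    ["onion", "garlic", "tomato", "olive oil", "salt"],
    ["flour", "egg", "milk", "salt"],
    ["rice", "onion", "garlic", "salt"],
]

counts = Counter(ingredient for recipe in recipes for ingredient in recipe)

# Rank ingredients by how often they appear and print the rank-frequency list.
for rank, (ingredient, freq) in enumerate(counts.most_common(), start=1):
    print(f"{rank:2d}  {ingredient:10s} {freq}")
```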

Very clever…unless they’ve missed something.

Perhaps it’s not ingredients that produce this distribution but words themselves. I’d be interested to see whether the results would be significantly different were they to examine the frequency of adjectives or colours or numbers in these books rather than ingredients. If not, then recipes have nothing to do with the results they are presenting.

Of course, it’s possible that recipes have evolved in the way the group suggests. But the evidence they present here doesn’t look convincing to me.

Ref: arxiv.org/abs/0802.4393: The Nonequilibrium Nature of Culinary Evolution

Why silos burst

January 31, 2008

Force chain

Believe it or not, grain silos are interesting structures. They’ve been known to explode without warning, which is hard to explain since they are filled with, well, grain.

But grain turns out to be kinda interesting too. In recent years, researchers have begun to get a handle on some of the strange and counterintuitive ways in which grain behaves as it flows and as it is placed under pressure.

One of the most interesting developments has been the discovery of “force chains”, networks of particles that form as force is passed from one grain to the next (see picture). In this way, forces many orders of magnitude greater than expected can be transmitted through the medium.

John Wambaugh and colleagues at Duke University in Durham have been studying the force networks that are set up within a two-dimensional silo and how these can make the forces behave in an extraordinary, non-linear way.

When grain is added to the top of the silo, the pressure in the medium increases. But it goes on increasing, in a non-linear way, even after the addition of material has stopped, before eventually decaying: a so-called “giant overshoot” effect.

How to explain this? Usually, force chains break and reform as the pressure changes in a granular medium and this helps to spread the forces evenly within it.

But Wambaugh thinks the non-linear behaviour suggests that something else is going on. He says that in certain circumstances the force chains become locked in place, so that the additional pressure spreads much further and deeper than usual, creating the giant overshoot.

It might also explain why silos sometimes burst unexpectedly.

Ref: arxiv.org/abs/0801.3387: Force Networks and Elasticity in Granular Silos

How to reduce extremism? Travel!

January 29, 2008

Extremism

Andre Martins studies agent-based computer models of extremism at the University of Sao Paulo in Brazil.

We’ve heard from him before following his claim that extremism is an emergent phenomenon in our society.

Now he’s back with the results of a study on how to reduce extremism.

Martins creates a network model in which agents can hold any position on a continuous scale of opinion. Each agent updates its opinion using a simple calculation after observing the opinions of others nearby.

Martins defines extremism as “an agent who supports one choice fervently, even when a large group believes a different idea to be a better choice”. One of the impressive aspects of Martins’ model is that extreme behaviour emerges naturally, just as it does in real societies.

Now he has studied ways in which extremism can be reduced. He offers tantalising evidence that extremism is linked to the structure of a society because different types of networks produce different levels of extremism.

But his most interesting conclusion is that the mobility of agents within a network is crucial:

“The extremism problem can become far less important in societies where the mobility of its agents is above a certain threshold. Therefore, efforts to reduce such a mobility can have important negative impacts in the diminishing of extremism.”

So get moving.
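
To give a flavour of the kind of experiment Martins describes, here is a toy Python sketch: a continuous-opinion model on a small-world network with a mobility knob, ending with a count of agents stuck near the extremes. The update rule (a bounded-confidence compromise), the network and all the parameters are my own illustrative assumptions, not Martins’ actual model, and this toy version is not guaranteed to reproduce his mobility threshold.

```python
import random
import networkx as nx

def extremist_fraction(mobility, n=300, steps=60_000, confidence=0.4, seed=1):
    """Toy continuous-opinion model with a mobility knob.

    Illustrative bounded-confidence update, NOT Martins' actual model: a pair of
    agents compromise only if their opinions are close enough, and `mobility` is
    the chance of interacting with a random stranger rather than a neighbour.
    """
    rng = random.Random(seed)
    G = nx.watts_strogatz_graph(n, k=6, p=0.02, seed=seed)
    opinion = [rng.uniform(-1.0, 1.0) for _ in range(n)]

    for _ in range(steps):
        a = rng.randrange(n)
        b = rng.randrange(n) if rng.random() < mobility else rng.choice(list(G.neighbors(a)))
        if abs(opinion[a] - opinion[b]) < confidence:
            mid = 0.5 * (opinion[a] + opinion[b])
            opinion[a] += 0.5 * (mid - opinion[a])
            opinion[b] += 0.5 * (mid - opinion[b])

    return sum(abs(o) > 0.8 for o in opinion) / n

# Sweep the mobility parameter and report how many agents end up near the extremes.
for p in (0.0, 0.1, 0.5):
    print(f"mobility {p:.1f}: extremist fraction ~ {extremist_fraction(p):.2f}")
```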

Ref: arxiv.org/abs/0801.2411: Mobility and Social Network Effects on Extremist Opinions

How to maximise your PageRanking

November 24, 2007

Links

Google’s PageRank system rules the web, right? This is the algorithm that determines how far up the list yer site appears in a given Google search. A better ranking can mean big bucks for some sites.

The PageRank algorithm is a closely guarded secret. But a growing number of academics are trying to reverse engineer the algorithm so that they can better understand how it works (and presumably to boost their own ranking).

Everyone knows that ya can boost yer ranking by getting other popular sites to link to you. That ain’t always so easy.

But another question is how best to arrange links between pages within yer own site to boost your ranking.

When you have a group of pages, various theories have been put forward: should you make your links into a ring structure with no center, a star-like structure with a hub, or some other shape?

Nobody knows, or at least they didn’t until Cristobald de Kerchove at the Universite Catholique de Louvain in Belgium came along. Using some reasonable assumptions about the algorithm and a few impressive mathematical techniques, he’s worked out the optimal linking strategy for boosting your PageRanking.

This is it: arrange your pages in a simple forward chain so that each links to the next but at the same time include every possible backward link (see diagram above).

Simple when ya know how.
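
It is easy to sanity-check a layout like this with an off-the-shelf PageRank implementation. The sketch below builds the chain-with-all-backlinks structure for a five-page site, plus a ring and a star for comparison, and embeds each in a toy external web. The external web, the single inbound link and the single outbound link are my own assumptions rather than the setting analysed in the paper, so the numbers are only indicative.

```python
import itertools
import networkx as nx

def total_rank(internal_edges, n_pages=5, seed=0):
    """Total PageRank of a 5-page site embedded in a toy 'external web'.

    The external web (30 random pages), the single inbound link and the single
    outbound link from the last page are illustrative assumptions.
    """
    web = nx.gnp_random_graph(30, 0.1, seed=seed, directed=True)
    G = nx.relabel_nodes(web, lambda x: f"ext{x}")
    pages = [f"page{i}" for i in range(n_pages)]
    G.add_nodes_from(pages)
    G.add_edge("ext0", "page0")          # one link into the site from outside
    G.add_edge(pages[-1], "ext1")        # one link out, so the site is not a rank sink
    G.add_edges_from(internal_edges(pages))
    ranks = nx.pagerank(G, alpha=0.85)
    return sum(ranks[p] for p in pages)

def chain_with_backlinks(pages):
    """The layout described above: a forward chain plus every possible backward link."""
    forward = [(pages[i], pages[i + 1]) for i in range(len(pages) - 1)]
    backward = [(pages[j], pages[i]) for i, j in itertools.combinations(range(len(pages)), 2)]
    return forward + backward

def ring(pages):
    return [(pages[i], pages[(i + 1) % len(pages)]) for i in range(len(pages))]

def star(pages):
    hub = pages[0]
    return [(hub, p) for p in pages[1:]] + [(p, hub) for p in pages[1:]]

for name, layout in [("chain + backlinks", chain_with_backlinks), ("ring", ring), ("star", star)]:
    print(f"{name:18s} total PageRank of the site = {total_rank(layout):.4f}")
```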

Ref: arxiv.org/abs/0711.2867: Maximizing PageRank via outlinks

Web traffic and sand piles

November 16, 2007

Web traffic

Drop grains of sand onto a flat surface and they form a pile. Keep adding grains and eventually ya’ll witness an avalanche. The curious thing about avalanches is that yer can’t tell how big they is going to be. A single dropped grain could dislodge a handful of other grains or hundreds of grains or thousands or perhaps tens of thousands of ’em. How odd that the same trigger, the dropped grain of sand, could generate results that vary over several orders of magnitude.

Of course, the size of the avalanche don’t depend on the size of the grain at all. Instead, it depends on the complex network of forces that exists within the pile when the grain hits. It turns out that when these forces are balanced just right, the scale of an avalanche cannot be determined in advance. This balance is known as self-organized criticality, and it crops up in earthquakes, forest fires and stock market crashes, the sizes of which cannot be predicted in advance either.
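
The classic toy model behind self-organized criticality is the Bak-Tang-Wiesenfeld sandpile: drop grains onto a grid, topple any site that accumulates four or more, and record how many topplings each dropped grain triggers. Here is a minimal Python sketch; the grid size and number of grains are arbitrary.

```python
import numpy as np

def drop_grains(size=30, grains=20_000, seed=0):
    """Bak-Tang-Wiesenfeld sandpile: drop grains at random sites and record
    how many topplings each drop triggers (the avalanche size)."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=int)
    sizes = []
    for _ in range(grains):
        x, y = rng.integers(0, size, size=2)
        grid[x, y] += 1
        topples = 0
        unstable = [(x, y)]
        while unstable:
            i, j = unstable.pop()
            while grid[i, j] >= 4:
                grid[i, j] -= 4          # topple: shed one grain to each neighbour
                topples += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < size and 0 <= nj < size:   # grains at the edge fall off
                        grid[ni, nj] += 1
                        if grid[ni, nj] >= 4:
                            unstable.append((ni, nj))
        sizes.append(topples)
    return np.array(sizes)

sizes = drop_grains()
# Identical drops produce avalanches whose sizes span orders of magnitude.
print("largest avalanche:", sizes.max(), "topples; median:", int(np.median(sizes)))
```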

Now it looks as if web traffic is in this same state of self-organized criticality.

At least that’s what Mikhail Simkin and a friend at the University of California, Los Angeles, tell us. They have looked at the traffic at various popular websites and say it looks remarkably like the way avalanches occur, with very little activity for long periods interspersed with huge spikes in traffic that can vary by orders of magnitude.

Simkin argues that the web is like a pile of sand in a state of self-organized criticality. Instead of a network of forces between grains, the web depends on a constantly changing network of links between pages. At any moment, these links could generate a flurry of activity as a page gets dugg or slashdotted, for example. But the traffic from such an ‘avalanche’ can vary over several orders of magnitude in an entirely unpredictable way.

(This seems so obvious that ah can’t quite believe that Simkin is the first to suggest this, but if he is, well done.)

The lesson for all ya bloggers out there is to keep dropping the grains of sand.

Ref: arxiv.org/abs/0711.1235: A Theory of Web Traffic

The terrible truth about extremism

November 15, 2007

Extremism

Why is our way o’ living threatened by extremists? A natural question for anybody a-fretting and a-worrying about the state of world order. But the answer ain’t gonna please ya’ll. It’s looking increasingly as if extremism is an ordinary emergent property of societies like ours that we can’t do nothing about.

Andre Martins at the University of Sao Paulo in Brazil has modeled the way opinions meander ‘n’ flow through society. His laboratory is a virtual world populated by thousands of opinionated “agents” that interact with each other and change their minds according to various rules. Martins’ work consists of setting the model running and then putting his feet up for a snooze until a suitable amount of opinion forming has gone on.

Various people have tried to model opinions in this way but Martins is the first to reproduce the extremism we actually see in society. In other models each agent can take one of two views according to the opinion of those agents nearby. For example, one rule might be that an agent is obliged to change its view if two or more neighboring agents hold the opposite view. This kind of model can show how opinions spread through a society but extremism never evolves because the agent can hold only one view or the other.

Instead, Martins allows his agents to hold a continuous spectrum of opinions that ranges from one extreme to the other. Each agent’s opinion is then influenced by the agents nearby. The result is that:

“The appearance of extremists is naturally observed and it seems to be a characteristic of this model. This can help explain cases where people are led, by social pressure, to believe blindly in whatever opinion is shared by its local group, despite divergent voices in the larger society they live in.”

So what’s the bottom line? According to Martins, extremism is a natural property of social networks like ours. And that means we ain’t ever gonna get rid of it.

Ref: arxiv.org/abs/0711.1199: Continuous Opinions and Discrete Actions in Opinion Dynamics Problems