Archive for June 2009


Comments Off Posted on Monday 29 June 2009 at 11:45 pm by Seth Bell
In Happenings, Health & Medicine

Last week I had a cold. It was one of those ones you get after you haven’t slept properly for a few nights (and in my case, because I’d been pushing myself to work hard for once).  I felt terrible, but being British, a man, and generally lazy I made no effort to go to the doctor. And, of course, I had no Lemsips, Beechams or any kind of medicine at all in my flat.

Why am I telling you this? Well, it’s because my Chinese flatmate Will gave me some medicine. When he first offered me some I half expected a herbal remedy. But no, he produced a packet of tablets which consisted of two types: white tablets for the day time and black tablets for the night time.

I’m the kind of person who always reads the label on medicines. Not because I understand the technical jargon you find on them, I just find it reassuring to pretend I’m capable of deciding what the tablets might do to me. In this case though my knowledge of the Chinese language (i.e. none) prevented me from undergoing this ritual.

As a result I was apprehensive about taking the tablets. Which I found worrying in itself, because I completely trust my flatmate and know he wouldn’t give me anything dangerous. So why was I afraid, and what was I even afraid of? What’s more, I was more apprehensive about taking the black tablets than the white tablets. I think it’s probably because I’ve never taken (or even seen) a black tablet before.

In the end I just took both the tablets anyway, after realising that a) I was being irrational, and b) I felt so crappy that I was prepared to try anything. But because I couldn’t read the ingredients I wasn’t really convinced they would work, and instead thought ‘at the very least the placebo effect might kick in and make me feel better.’

When I told Colin this story we both got a bit unsure of whether the placebo effect can take place if you’ve already considered that the thing you are taking might work as a placebo. I’ve had a similar thought before about headache tablets. If I have a headache and take two headache tablets I always start to feel better about 20 minutes later (the only exception being when I’m hung-over). But, I always ponder, is this simply because I assume they will work, stop worrying about my headache and get on with things? Or is it because the paracetamol, caffeine and so on in headache tablets actually work on me? I imagine it’s mainly the latter, but I live in fear that if I ever lose my confidence in headache tablets they will no longer work on me.

In the case of the Chinese flu tablets, I did feel better the next day. It might be because of the stuff in them, or it might be because of some placebo effect. Or it might be that I had a good night’s sleep for once. But I still decided not to take any more, and left it up to time and nature to get me better.

So what is the point of this rambling parable? Well, I feel like I learned a few things. First, being able to read labels makes me feel much happier about taking medicine. Second, black tablets are slightly intimidating. Third, I don’t care if it’s a placebo effect that’s getting me better as long as I feel better. Finally, I should buy medicine before I get sick.


Comments Off Posted on Monday 29 June 2009 at 10:39 pm by Emma Stokes
In Biology, Health & Medicine

Blocking the action of a gene called Sirtuin-1 reduced the symptoms of type 2 diabetes in rats, scientists have found.

People with Type 2 diabetes suffer from high blood glucose concentrations due to insulin resistance and increased glucose production. To create a similar condition in rats, the researchers put a group of rats on a four-week diet of high-fat, fructose-rich meals.

Sirtuin-1 is a gene responsible for regulating glucose production in the liver. The researchers therefore then blocked Sirtuin-1 in the ‘diabetic’ rats by injecting them with a fragment of genetic information. This fragment – called an antisense oligonucleotide – interrupts and blocks gene expression and can be targeted to specific genes.

After Sirtuin-1 inhibition, the rats were more sensitive and responsive to insulin. The rate of glucose production fell back to normal levels, resulting in a decrease in blood plasma glucose. Thus the scientists believe the Sirtuin-1 gene is a cause of type 2 diabetes symptoms.

The results of this study are consistent with a recent mouse study which showed that decreased expression of Sirtuin-1 led to better insulin sensitivity. The next step is to develop inhibitors targeted to Sirtuin-1 in the liver; these will be tested in rats before moving on to primates and human clinical trials if successful.

For more information on animal research and this story, please see the Understanding Animal Research site.


Comments Off Posted on Monday 29 June 2009 at 10:18 pm by Emma Stokes
In Biology, Health & Medicine

Using fish, scientists have discovered a signalling pathway that could be used to treat skin cancers (melanomas). The pathway, PI3K (phosphoinositide 3-kinase) had a major effect on the progression of cancerous melanomas in zebrafish. Zebrafish are ideal for studying skin cancer as the melanomas are similar to those seen in humans, and the fish themselves are easy to observe because of their light-coloured, almost transparent skin.

Signalling pathways regulate cell division, migration and death. The pathways form a complex network to relay these various commands to cells. But when the signalling molecules mutate, the result is often excess cell division which can lead to cancer.

The team looked at two major pathways called Ras and PI3K. They found fish often developed melanomas which progressed rapidly if molecules in these pathways were mutated. The discovery that PI3K was directly involved indicates that it could be a suitable target for melanoma therapy.

The mutant zebrafish also passed on the mutations to their offspring. In this they were strikingly similar to the human inherited syndrome FAMM (familial atypical mole and melanoma).

This study highlights a potential target for therapy, but also gives scientists new insights into the mechanisms of melanomas, revealing other possible targets. But further research into these models will be needed so scientists can see whether they’re as promising as this initial study indicates.

For more information on animal research and this story, please see the Understanding Animal Research site.


2 Comments » Posted on Monday 29 June 2009 at 10:11 pm by Jacob Aron
In Happenings

I will be spending all this week helping out at World Conference of Science Journalists at Central Hall Westminster, so you can expect nightly blog posts on my activities that day. This morning, I headed down to Westminster at noon to begin setting up at the conference. This mostly involved boxes. Lots of boxes.

A quirk of architecture – a flight of five stairs – made moving said boxes an almost farcical affair. To get everything to the exhibition hall, we had to take the goods lift from the ground floor to the third floor. A short circuit of the building later brought us to a regular lift, which took everything back down to the first floor. Repeat as necessary.

Besides the various materials needed for the exhibition stands, we also had to put together around 800 delegate bags. This required an assembly line of various leaflets, but when the bags were ready they had to be taken to the reception area. Whilst everyone else toiled away stuffing bits of paper together, the job of shifting them all fell to me. So if you’re attending the WCSJ, when you pick up your nicely packed delegate bag, remember that I personally lugged around every single one of them!

Not much in the way of science then, and I unfortunately had to miss the conference reception at the Science Museum. It’ll be an early start tomorrow, but I’m hoping to blag my way into at least a few sessions. All will be reported in the evening.

Comments Off Posted on Sunday 28 June 2009 at 3:13 pm by Jacob Aron
In Chemistry, Climate Change & Environment, Getting It Right, Health & Medicine, Science Policy, Weekly Roundup

Sun in common-sense shocker

Sometimes I worry about being too negative on Just A Theory. With all the examples of media failings I write about, it’s easy to let the good ones slip past unnoticed. As such, I thought I’d congratulate The Sun’s Dr Keith for his recent article on misused medical terms. He informs us that we probably don’t have the flu (it’s a cold), there is no such thing as a nervous breakdown, and most of us are rarely “shocked”, in a medical sense.

New hope for Copenhagen

Later this year thousands of people will descend on Copenhagen to try and come up with a new global agreement on climate change. The United Nations, in conjunction with the International Advertising Association, have launched a campaign to re-brand the conference as Hopenhagen. The idea is to move from “coping” with climate change to a “hope” that action can be taken. A silly bit of marketing? Perhaps. But if it gets people talking, it’s probably a good idea.

Check this out. It’s awesome

“But what is it?” I hear you cry. Created by Japanese artist Sachiko Kodama, the strange substance in this art work is a ferrofluid. These odd liquids combine tiny magnetic particles with water or oil, and a surfactant, which prevents the particles sticking together. Ferrofluids react in the presence of a magnetic field, creating the wonderful structures in the video above.

Whilst they do have their practical uses, like forming a liquid seal in computer hard drives or marking areas of the body in an MRI scan, I think you’ll agree that just looking cool is good enough.


Comments Off Posted on Sunday 28 June 2009 at 2:25 pm by Sam Wong
In Inventions & Technology

This is just awesome. Maverick designers James Auger and Jimmy Loizeau have built a collection of robotic furniture that generates its own energy by catching and digesting vermin. No, really.

This picture shows an LCD clock with a strip of flypaper on a roller. At the bottom of the roller, a blade scrapes the flies off and they fall into a microbial fuel cell, to be digested by bacteria to yield energy.

Image: Auger-Loizeau

This lamp lures flies with ultraviolet LEDs, then traps them like a pitcher plant.

Image: Auger-Loizeau

This wall-mounted robot is designed to encourage spiders to build webs on it. A camera detects when a fly gets caught, then when the fly stops moving, a robotic arm nabs it from under the spider’s nose.

Image: Auger-Loizeau

Its energy needs are supplemented by the UV fly-killer depicted below.

Most gory of all is this innocuous-looking coffee table. In the centre is a trapdoor on which crumbs can be set as bait. One of the table legs is hollow, and mice can crawl up the inside. Infra-red motion sensors detect when a mouse walks onto the trapdoor, and send it tumbling into the fuel cell below. All it’s missing is the ability to emit a Bond-villain style evil laugh as the rodent is slowly devoured.

Sadly, as far as I’m aware, Ikea have no plans to sell flat-pack versions of any of these items.

The robots were inspired by the Ecobot, an energetically autonomous robot designed by Bristol Robotics Laboratory. Ecobot uses sewage to attract insects into its microbial fuel cell, where they get digested by sludge bacteria. As the bacteria metabolise the sugars yielded from breaking down the robot’s prey, electrons are released, which can be harnessed to generate an electric current. You can get your own, yeast-based microbial fuel cell from the National Centre for Biotechnology Education at Reading University.

A brief glance at Bristol Robotics Laboratory’s website reveals that they’re working on numerous fascinating biology-inspired projects, like robots that can sense their surroundings with rodent-like whiskers, robots that can keep their vision focused on something while moving (by mimicking the vestibulo-ocular reflex) and flying robots that can save energy by soaring like albatrosses. But I can’t help getting the impression that these people are hell-bent on bringing about humanity’s destruction at the mechanical hands of robots. Some of the other projects that they’re working on are robots with the ability to heal themselves, robots that can work as a team to find food, and even robots that can develop their own culture. A project they’re involved in called SYMBRION ‘may lead not only to extremely adaptive, evolvable and scalable robotic systems, but might also enable the robot organisms to reprogram themselves without human supervision; to develop their own cognitive structures and, finally, to allow new functionality to emerge: the most suitable for the given situation.’ And there I was thinking that Terminator was far-fetched.

If you think BRL’s projects are wacky, wait until you see Auger and Loizeau’s previous work. Their website catalogues an array of mind-bogglingly bizarre projects, all aiming to offer ‘services that contrast and question current design ideology’, and ‘instigate a broader analysis of what it means to exist in a technology rich environment and its cultural implications for the present and the near future.’

The interstitial space helmet is like a digital burka that brings to reality Susan Greenfield‘s nightmare scenario in which nobody interacts face-to-face and all interaction is mediated through telecommunications devices. Social tele-presence is a system that allows the user to effectively occupy someone else’s body, or even a dog’s. The subliminal watch is a device for those too lazy to look at their wrist and instead want the time zapped into them through electric shocks. The isophone aims to make phone calls a fully immersive experience by enclosing the user’s head in a digital helmet while the rest of their body floats in water.

Ramping up the ridiculousness yet further, we have digitised banality, a series of products that revel in their own pointlessness: a device that counts ripples in a lake, an electronic leaf that alerts its owner when it falls off a tree, and a chair that records an ever-increasing total of all the weight it has ever borne. Best of all is a range of inventions designed to aid animals, including an aquatic stealth jacket for whales, an acorn positioning system for squirrels, and ‘omnivore dentures’ that allow big cats to munch on vegetable matter.

Who says technology has to be useful? I think this stuff’s bloody brilliant.


2 Comments » Posted on Saturday 27 June 2009 at 5:49 pm by Jacob Aron
In Health & Medicine, Musings


It’s quite likely that a number of people reading this went out for a drink last night. After all, it was Friday and that’s what people do. I went to a rather enjoyable end-of-term party, and of course had a few beers. Alcohol consumption is such a normal component of our society that when you’re knocking a few back it’s difficult to remember it can actually be very harmful.

A series of papers published in The Lancet this week brings the message home. The first reports that 3.8%, or roughly one in 25, of all deaths worldwide are caused in some way by alcohol. This is about half the number caused by tobacco. Alcohol also contributes to 5% of years spent with disease or disability. Because of this, the authors recommend that the consumption of alcohol for certain health benefits should not be encouraged, as the harm far outweighs the gain.

These figures hide the details however. Due to gender differences in alcohol consumption, one in 16 men die from alcohol-related causes, compared to just one in 90 women. This is changing as the number of women drinking increases.

Although these statistics are worldwide, alcohol consumption is not the same across the globe. The average adult drinks around 12 units per week, but in Europe this nearly doubles to around 23 units per week. The UK Government recommend a maximum of 14 units for women and 21 for men per week.

Whilst consumption may be high in Europe, it is in Russia where alcohol use takes the worst toll. A study of over 48,000 Russian deaths found that alcohol was responsible for more than half in those aged 15 to 54. Perhaps unsurprisingly, in a nation where some industrial workers drink one bottle of vodka per day.

It’s not just the health costs of alcohol that are high. In a paper calling for action on alcohol, the authors estimate that high- and middle-income countries spend more than 1% of GDP on economic costs related to alcohol. You may remember 1% of global GDP as the figure proposed by the Stern report for tackling climate change.

In the same paper, the authors question why alcohol is not higher on the global health agenda compared to tobacco and illegal drugs, considering the harm it can cause. They blame well-organised alcohol lobbyists for blocking action to curb consumption, saying that this must be combated.

This series makes for difficult reading. As a non-smoker, I celebrated when the UK ban came in and allowed me to go to the pub without smelling like a chimney. Discussions of implementing a minimum price for alcohol, however, as these reports suggest, set me protesting. Perhaps more expensive alcohol would be a small price to pay, though, considering the health benefits to be gained.

Rehm, J., Mathers, C., Popova, S., Thavorncharoensap, M., Teerawattananon, Y., & Patra, J. (2009). Global burden of disease and injury and economic cost attributable to alcohol use and alcohol-use disorders The Lancet, 373 (9682), 2223-2233 DOI: 10.1016/S0140-6736(09)60746-7

Zaridze, D., Brennan, P., Boreham, J., Boroda, A., Karpov, R., Lazarev, A., Konobeevskaya, I., Igitov, V., Terechova, T., & Boffetta, P. (2009). Alcohol and cause-specific mortality in Russia: a retrospective case–control study of 48 557 adult deaths The Lancet, 373 (9682), 2201-2214 DOI: 10.1016/S0140-6736(09)61034-5

Casswell, S., & Thamarangsi, T. (2009). Reducing harm from alcohol: call to action The Lancet, 373 (9682), 2247-2257 DOI: 10.1016/S0140-6736(09)60745-5

Comments Off Posted on Friday 26 June 2009 at 8:48 am by Jacob Aron
In Getting It Wrong

Apparently women are at their happiest when the reach the age of 28. Don’t take my word for it though – the hair colour company Clairol have commissioned a “study” of 4,000 women. A spokesman for the company said:

“The age of 28 has been pinpointed as the time in a woman’s life their hair looks the best, body shape is at its peak and confidence is at an all-time high.”

The “research” consisted of asking women at what point they were happiest in 12 “key areas” of their lives. If you really care, the Telegraph story linked above lists the various nonsense answers.

It also seems that 56% of women worry about losing their looks as they age. That’s good news for Clairol then, as their entire business model is built on convincing women that they are unattractive and must buy their products in order to look good.

Psychologist Corinne Sweet offered these words of wisdom to women concerned about their appearance:

“Having a good hair day is essential to success both at work and in love, as many women still feel their hair is their crowning glory

“Considering it was found that women have six bad hair days a month, anything women can rely on to improve their hair at home, in the minimum of time with guaranteed results can mean a huge lift in well-being, confidence and self-esteem.”

Yes girls, nothing is more important than your hair. Without good hair, you cannot be successful. Buy Clairol products, or your life will be a meaningless mess. After all, “research” and a psychologist say so.

Corinne Sweet avoids the wrath of my scare quotes because, according to her biography at least, she is currently doing an MSc in Psychodynamics of Human Development at the Birkbeck Psychology Department. The rest of her website reveals she is very much a journalist/broadcaster though, so I imagine she was just approached by Clairol to slap her name on their bullshit in order to give it a veneer of respectability.

The Telegraph fell for it, and saw fit to report the “story” in their Health section. I hope all my female readers are already rushing to the chemists to stock up on Clairol products. Hurry, before it’s too late.


Comments Off Posted on Thursday 25 June 2009 at 7:03 pm by Jacob Aron
In Getting It Wrong, Health & Medicine


A Canadian study published in the journal Obesity has found that overweight people are 17% more likely to live longer than those of normal weight. In response, the Daily Mail instructed their readers to fatten up, but I would advise against it.

The study looked at data from the Canadian National Population Health Survey, which monitors the health of participants every two years. Using over 11,000 patient records, the researchers were able to track changes in Body Mass Index (BMI) and their relationship with mortality.

BMI is a commonly-used statistic for assessing a person’s body weight. It is calculated by dividing a person’s weight in kilograms by the square of their height in metres. Normal BMI is considered to be between 18.5 and 25, whilst 25 to 30 is overweight. Outside of this range are underweight and obese.
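
If you want to check where you fall, the sum is easy enough to do yourself. Here is a quick Python sketch of the calculation and the standard categories – this is just the textbook formula, nothing to do with the study’s own analysis:

def bmi(weight_kg, height_m):
    # Body Mass Index: weight in kilograms divided by height in metres squared
    return weight_kg / height_m ** 2

def bmi_category(value):
    # The standard cut-offs described above
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

example = bmi(80, 1.75)                 # about 26.1
print(example, bmi_category(example))   # falls in the 'overweight' band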

Unsurprisingly being underweight or obese was found to be bad news when it comes to living longer, although for younger participants aged 25-59 being underweight was not a concern. Whilst we might expect these results, the conclusion that being in the overweight category gives you a slight lifespan advantage requires deeper investigation.

The problem could lie with the way BMI is measured. For the average person BMI is a useful indicator of healthy body weight, but because it doesn’t actually measure total body fat it can be problematic. For particularly athletic or muscular people the formula doesn’t work, because muscle is denser than fat. Thus, those in the overweight category could actually be fit and healthy with large amounts of muscle tissue – exactly the kind of people we would expect to live longer.

The authors of the study caution against inferring causality as the Daily Mail has done. Getting fatter won’t necessarily help you live longer, and as the researchers point out there is a difference between a long life and a healthy one. Being overweight has been clearly linked with heart disease and diabetes amongst other conditions, so anyone following the Mail’s advice would be putting themselves at risk of developing these afflictions.

There is also the problem of continuous weight gain. Once you start putting it on, it can be hard to stop. The health survey data shows that a quarter of Canadians who were overweight in 1994/5 had become obese by 2002/3, and obesity will certainly up your chances of an early death.

Even the Mail must realise this, but I guess the sub-editor who wrote the headline didn’t read to the bottom of the article. They report the words of Dr David Haslam, chairman of the National Obesity Forum: “This study shouldn’t be used as an excuse to put on weight.”

Orpana, H., Berthelot, J., Kaplan, M., Feeny, D., McFarland, B., & Ross, N. (2009). BMI and Mortality: Results From a National Longitudinal Study of Canadian Adults Obesity DOI: 10.1038/oby.2009.191


Comments Off Posted on Thursday 25 June 2009 at 4:25 pm by Jessica Bland
In Biology, Getting It Right

Despite appearances, here at Just A Theory, we don’t spend all our time trying to bash the media’s representation of science. Sometimes we try to join in. This week I even managed to get something published internationally (hurray!): here is my short article in The Economist.

It’s on recent research that shows that dinosaurs were not as big as we thought. In the spirit of Just A Theory, I try to go beyond the “Jurassic Park was wrong” story and explain, with the help of one of the paper’s authors, a little bit about what they say in the actual paper. Leave a comment if you can – it only takes a second to register on The Economist website. It would be great to know whether people really want to know more than just whether Brachiosaurus weighed the same as three or seven African elephants…


Comments Off Posted on Wednesday 24 June 2009 at 10:37 am by Jacob Aron
In Health & Medicine, Inventions & Technology

Training to be a doctor is difficult, and not just for the medical students. For prospective physicians to have real life experience they must examine real patients, but this can be awkward for more intimate procedures such as breast exams.

Up until now the solution has been for students to practice on lifeless prosthetics, but a new initiative by the University of Florida, along with three other universities, uses a combination of prosthetics and computer technology to better simulate the experience.

A mannequin allows students to conduct the physical exam, whilst a computer representation of the patient, named Amanda Jones, responds on the screen above. This “mixed reality human” lets medical students converse with their virtual patient whilst conducting an exam.

A mixed reality breast exam in progress.

Students can talk to Amanda in realtime thanks to computer speech and voice recognition software. This allows them to discover her medical history and respond to questions or concerns during the exam.

Feedback is also provided by sensors within the prosthetic breast that send data to the computer simulation, providing a colour representation of the pressure students are applying.

Different situations can be programmed into the system, such as whether a breast abnormality is present or not, and dialogue lines can also be varied so the experience never feels scripted. Benjamin Lok, an assistant professor of computer and information sciences and engineering at the University of Florida, says this communication practice is key.

“Studies have shown that communication skills are actually a better predictor of outcome than medical skills,” Lok said. With the virtual patient, “all of a sudden, students have to not only practice their technique, but they also have to work on their empathy.”

Although the mixed reality system is not intended to replace real exams, it does help students get more experience when volunteers are scarce. Thanks to the success of the breast exam system, researchers are now looking into simulating other intimate procedures. Lok and team are now building a virtual prostate exam for students to practice on.


Comments Off Posted on Wednesday 24 June 2009 at 10:25 am by Colin Stuart
In Inventions & Technology, Space & Astronomy

The days of the lone astronomer are long gone. Modern astronomical research is a multi-national and highly organised outfit with million dollar telescopes perched high on mountain tops in some of the most remote places on Earth. These optical leviathans don’t even need a pupil at the eyepiece; computers are more than capable of doing that for us.

You might think, then, that this world of highly mechanised, souped-up star-spotters was beyond the clutches of your average Joe, but you’d be wrong. Around the world an army of enthusiastic amateurs, often armed with nothing more than their home computers, are reeling in the secrets of the universe. Meet the citizen astronomers.

Comets

David Evans works for SERTEC, a company based in Coleshill, Warwickshire, specialising in the manufacture of parts and components for the automotive industry. At least that’s what pays his bills. David’s real passion is astronomy and he has discovered nineteen comets previously unknown to science, all from the comfort of his home PC.

“My first discovery was confirmed 22 June 2002 by Derek Hammer of NASA. I found the comet in images which were taken by the SOHO Space Telescope on 13 June 2002,” David explained, referring to a telescope whose job it is to stare at the Sun. As these comets pass in front of the Sun their silhouettes can be spotted by those who have the patience to sift through the mountains of data produced by modern telescopes.

And that’s the appeal of citizen astronomers to those who research the cosmos for a living. Often a human is still better at discerning detail than computers, but the professional astronomers simply don’t have the time or the resources to analyse all the data. By farming it out in manageable chunks to citizen astronomers, more research can be done and the public get a real chance to contribute to cutting edge science.

GalaxyZoo

One extremely successful example is Galaxy Zoo, a citizen astronomy project designed to get members of the public classifying galaxies. Galaxies are huge collections of stars gathered together in space, and they come in many different shapes and sizes. The Galaxy Zoo community are presented with photos of galaxies and asked simple questions about what they can see. They might be asked to choose from a sliding scale as to how round it is, or how many spiral arms it’s got.

The beauty of Galaxy Zoo is that it sends out the same photo to many users and only if a consensus is reached between a high percentage of users do the team know they can trust the classification. Such has been the success of the project that a completely new type of galaxy has been discovered this way.
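
In spirit, that consensus step is no more complicated than a show of hands. Here is a toy Python sketch of the idea – purely illustrative, not Galaxy Zoo’s actual code, and the real project uses more sophisticated weighting of its volunteers:

from collections import Counter

def consensus(votes, threshold=0.8):
    # Accept a classification only if a large enough fraction of volunteers agree
    if not votes:
        return None
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) >= threshold else None

print(consensus(["spiral"] * 9 + ["elliptical"]))       # 'spiral' - 90% agreement
print(consensus(["spiral"] * 6 + ["elliptical"] * 4))   # None - no consensus, so the image isn't trusted yet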

Melanie-Jane Ryal, a personal assistant, is a keen Galaxy Zoo user. “The Galaxy Zoo project is amazingly easy to get involved with. All you have to do is register and then do a short test to ensure you know what you’re looking at. As an amateur it allows you to feel involved as you’re helping to classify galaxies that very few other people have seen,” she said. That’s the kicker: sometimes you get to be the very first human ever to lay eyes on a particular galaxy, a galaxy that contains billions of stars, and perhaps even other life forms.

SETI@home

And the search for aliens, or more officially The Search for Extraterrestrial Intelligence (SETI), hasn’t been overlooked by citizen astronomy; in fact it was one of the trail blazers. I’ve previously blogged about the SETI@home project celebrating its tenth year keenly listening to signals from space and trying to detect evidence of an interstellar phone call. But the key to the success of this project has been that, in true citizen astronomy style, the data is farmed out to you and me. SETI@home uses your spare computer power to work its way through the radio waves received by the giant Arecibo Telescope in Puerto Rico.

Once downloaded to your PC the SETI@home program gets to work whilst you’re not. When you’re away from your PC having a cuppa or fielding a phone call, SETI@home kicks in and starts using your computer to decipher the messages. No signal has yet been found that astronomers believe to be anything other than natural in origin, but thanks to home PCs they are getting through the data much faster than would otherwise be possible.

Public Engagement

It is appropriate that SETI@home is celebrating its inaugural decade just as astronomers are celebrating another temporal milestone. This year has been designated International Year of Astronomy, or IYA2009, to mark four centuries since Galileo first used the telescope to gaze at the heavens. IYA2009 has been an opportunity for professional astronomers to engage with the public, and Dr Marek Kukula, Public Astronomer at the Royal Observatory, Greenwich, sees citizen astronomy as an indispensable tool in this process.

“Citizen astronomy is a tremendous opportunity to engage members of the public with real scientific research in a way which would have been impossible only a few years ago,” he said. And he agrees that the astronomers get more than just an extension of their computing power. “It’s not a one-sided process either – the scientists also benefit enormously because it enables them to answer questions which they simply couldn’t tackle on their own, getting extra value out of the large amounts of data which are now routinely gathered by telescopes, space missions and earth-monitoring experiments.”

So citizen astronomy is many things. It’s an opportunity for astronomers to engage with the public. It’s an opportunity for that public to actively, and often indispensably, contribute to cutting edge research. But most importantly it’s a way for astronomers to unlock the scientific secrets hidden amongst the astronomically sized sets of data churned out by the myriad of hardware both in space and on the ground.

As we move into the 401st year of the telescope, the next great discovery could just come from you, your friends, or the citizen astronomer next door.


Comments Off Posted on Tuesday 23 June 2009 at 7:47 pm by Jacob Aron
In Biology

If you want someone to pay attention, speak to their right-hand side. That’s the advice of scientists Luca Tommasi and Daniele Marzoli from the University “Gabriele d’Annunzio” in Italy. They performed a series of three studies, published in the online journal Naturwissenschaften, which found humans are more likely to act on a request made to their right ear rather than their left.

Unfortunately this is one of those occasions where I am not able to read the paper, which is a shame because it actually sounds quite interesting, as it involved scientists going clubbing.

It seems that laboratory studies have already determined a right ear dominance, thought to be a result of the superior verbal processing of the left brain hemisphere which controls the right side of the body. In order to confirm these results in a real life environment, Tommasi and Marzoli hit the nightclubs.

The first study involved simple observation. They watched 286 clubbers involved in conversations over the loud music, and found that 72% used their right ear when listening.

In the next study, the researchers got involved. Stepping out on to the dancefloor, they went up to 160 clubbers and mumbled inaudible nonsense, waiting for their victim to turn their head and offer a particular ear. The researchers then covered their tracks by asking for a cigarette. The results showed that 58% offered their right ear, but only women had a consistent right-ear preference.

The final study saw the scientists intentionally addressing 176 clubbers in a particular ear. This time they didn’t mumble, instead directly asking for a cigarette. In the previous study, where the clubber chose which ear to offer, there was no link between the likelihood of being given a cigarette and the ear involved. With this more direct approach, the researchers found they were significantly more likely to receive a cigarette when addressing the person’s right ear.

I’ve got this wonderful (if rather stereotypical) image of lab-coated scientists running up to clubbers and whispering in their ears, all the while clutching clipboards. I’m sure it wasn’t quite like that, but it must have been fun for Tommasi and Marzoli to put in a funding request for a night on the town.

In all seriousness though, this is an interesting result because it shows once again the strange split in the two sides of our brain. This, say the authors, is one of the few studies to clearly demonstrate this difference in an everyday environment.


1 Comment » Posted on Monday 22 June 2009 at 4:30 pm by Jacob Aron
In Getting It Wrong

On Friday I noticed that a few papers had run a story about research into “taste dialects”, the notion that different regions of the country favour particular foods. Already I was sceptical, and then I noticed that the research had been performed on behalf of Costa Coffee. Hmm.

Well, for various reasons I didn’t get around to writing about it. I’d probably have moved on from this story this week if it weren’t for the appearance of a press release on EurekAlert from the University of Nottingham.

Normally EurekAlert serves as a pretty reliable source for scientific press releases, so I’m a bit surprised to see this kind of “research” cropping up. I’ve cracked out the scare quotes because some of the “findings” are so subjective that they can’t in any way be called science. For example:

People from the North East seek tastes that offer immediate satisfaction, borne from a history of hungry heavy industry workers demanding foods that offer immediate sustenance.

Maybe I’m being harsh. Maybe that isn’t complete bollocks, and the researchers somehow show a causal link between a history of heavy industry and a desire for instant satisfaction. I’ve got no way to tell, because the “research” wasn’t published anywhere.

That’s unsurprising, considering the people who carried it out. Whilst Professor Andy Taylor works in the University of Nottingham Flavour Research Group, food psychologist Greg Tucker works for The Marketing Clinic, which uses the scary-sounding Interrogation Research technique to come up with market research. Some choice quotes from that page:

The consumer is under strong social pressure to provide answers which are acceptable to society.

The Marketing Clinic can ascertain what is really going on and get into consumers unconscious thoughts.

I don’t know about you, but to me that sounds like they scream “THERE ARE 5 COFFEES AND THEY ALL TASTE GREAT!” at people until they agree. Coffee, after all, was the reason for commissioning this survey. No surprise to find this “result” buried away at the bottom of the list then:

Coffee is the earliest recalled taste memory for under eighteens. In all regions, people noted the importance of getting a ‘good’ rather than ‘average’ cup of coffee.

And where might one get a good cup of coffee in all regions? Why, Costa Coffee of course!

I’m disappointed that EurekAlert are participating in this shameless marketing exercise. If you still need convincing of the true nature of this “research”, one need only glance at the contact details at the end of the press release. Lucy Whittle of Paratus Communications can tell you everything you need to know. The company even proudly display their ability to get corporate nonsense into the papers, so I don’t think EurekAlert needs to give them any more help.

24 Comments » Posted on Monday 22 June 2009 at 12:03 pm by Sam Wong
In Biology, Climate Change & Environment

David Mitchell, as usual, wrote a very funny but also very wise column in the Observer yesterday about the Daily Mail’s ridiculous wheelie bin campaign, and about how our heightened sensitivity to injustices against us has overridden our sense of responsibility to society.

Our fear of being encroached upon has made us forget that there are few freedoms that can be fully exercised without impinging on someone else’s. The freedom to stab has long since been subordinated to the freedom not to be stabbed. But we still have the freedom not to recycle and to borrow or lend money recklessly, regardless of others’ freedom to live on a habitable planet and in a functional economy. We’ve hugely prioritised our rights over our duties because it’s only the former that tyrants try to take away.

A reader called Memoid posted a comment saying:

There’s not even been a hint of discussion about the right to have children yet, and that’s the debate we really need to have. And the world needs the vast majority of us to lose the debate.

He’s right, so let’s start the debate. There are 6.8 billion people on the planet. At the current rate, there will be 9.1 billion by 2050. Most of the increase will happen in developing countries, but even Britain’s population is expected to increase by 16 million in that time. And yet you rarely hear anyone talk about whether everyone can continue to have as many children as they like.

The Earth simply cannot provide enough food, energy and resources for that many people. And just think about the impact on the climate. How can we expect to make dramatic cuts in our carbon emissions if our population continues to grow?

People need to see having lots of children as the environmental sin that it is. You can turn all your lights off, cycle to work and insulate your house but having kids makes you more of an eco-criminal than the childless bloke next door who drives a gas-guzzler and takes 10 flights a year.

The idea of limiting one’s procreative activities will be very difficult for many to accept, for Darwinian as well as societal reasons. Surely having children is the most sacred of all human rights? I’m not advocating any government intervention in how big a family people choose to have. But I think the public needs to be more aware of the seriousness of the environmental ramifications of having children. Perhaps then more people might realise that this is one instance when our duty to society should take precedence over exercising our rights.

The Optimum Population Trust, of which David Attenborough became patron in April, runs a ‘Stop At Two’ campaign, and has a pledge that you can sign on its website. The idea will still seem outrageous to some, but I think signing the pledge is an absolutely reasonable step towards remediating unsustainable population growth.

(Incidentally, even if you plan to stop at two, it doesn’t always work out that way. My Dad found this out the second time my mum got pregnant: the egg that became me wasn’t the only one that got fertilised. As a result, my mum got her wish for three kids.)

This is all very easy for me to say. I’m 22 and single, and the prospect of having children feels almost as remote to me as arthritis. It could well be that in 10 years’ time I’ll turn out to be a massive hypocrite with three kids. But I hope, for everyone’s sake, that I will be able to restrain my reproductive urges in light of the bald truth: there are too many people on the planet already.

Comments Off Posted on Sunday 21 June 2009 at 7:52 pm by Jacob Aron
In Space & Astronomy, Weekly Roundup, Yes, But When?

That’s one small Tweet for man…

To mark the anniversary of the Apollo 11 mission next month, Nature are using Twitter to relive the Moon landing, 40 years on. You can follow @ApolloPlus40 in the run up to July 20th, and imagine what a mission to the Moon would be like in the internet age.

First image from Herschel

Emma covered the launch of Herschel and Planck, the two latest telescopes to be sent off into space, and now Herschel’s first image has been beamed back.

The first Herschel image.

It shows the Whirlpool Galaxy, also known as M51. First discovered by Charles Messier in 1773, it lies 23 million light-years away. Impressive stuff.

World’s first spaceport begins construction

I’ve been following the progress of Virgin Galactic for quite some time, as they bring the promise of commercial spaceflight ever closer to reality. I even blogged about the company in Just A Theory’s very first week. It’s quite exciting then to see construction begin for Spaceport America in New Mexico. The design is fantastically futuristic:

You can tell it's the future, look at all the blue lights.

Due to be completed in 18 months’ time, it will serve as the commercial base for Virgin Galactic, but other companies will eventually make use of the facility. I can’t wait.


1 Comment » Posted on Saturday 20 June 2009 at 7:45 pm by Jacob Aron
In Getting It Wrong, Mathematics

Cliff Arnall is the king of the “formula for” story. Earlier this year I wrote about his equation for calculating the date of Blue Monday, his self-styled worst day of the year.

At the time I failed to mention that Arnall actually trots out this rubbish not just once, but twice annually. When summer rolls round, it’s time for the happiest day of the year, which according to Arnall’s formula was yesterday.

The “story” was picked up by the Telegraph, Daily Mail, and Sun. Fact-checking obviously doesn’t occur on the happiest day of the year, because it seems that Arnall is still dining out on Cardiff University’s reputation, despite the institution making it very clear he only worked there as a part-time tutor.

I suppose it’s time to take a look at the formula now, but by this point do you really need me to tell you it’s nonsense? Here, in all its glory, is the “complicated equation” needed to calculate a day’s happiness rating, along with the variable definitions:

O + (N x S) + Cpm/T + He

  • O: Outdoors
  • N: Nature
  • S: Social interaction
  • Cpm: Childhood memories of summers
  • T: Temperature
  • He: Holidays

Not sure about the difference between outdoors and nature, and surely the value will be the same for each day; O = N = 1, unless there is a second outdoors that I don’t know about. Social interaction could actually be quantifiable, perhaps the number of conversations in a day, but it’s pretty unclear.

Cpm and He are both very bad notation. What is wrong with just C and H? The extra letters don’t add anything, they aren’t even an abbreviation, but they could easily be confused for additional variables. I guess this way looks more “scientific”.

In fact, the only scientifically measurable variable, temperature, is what makes this “formula” fall apart. Assuming you have at least some memory of your childhood, Cpm/T will rapidly grow to infinity as the temperature drops to 0 °C and completely dominate anything else in the equation.
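
If you want to see just how badly it falls apart, here is a quick Python sketch that plugs in some made-up values for everything except the temperature – the choices are as arbitrary as the formula itself:

def happiness(O=1, N=1, S=1, Cpm=1, He=1, T=20.0):
    # Arnall's formula exactly as printed: O + (N x S) + Cpm/T + He
    return O + (N * S) + Cpm / T + He

for temp in (30, 20, 10, 1, 0.1, 0.01):
    print(temp, round(happiness(T=temp), 2))
# As the temperature heads towards zero the Cpm/T term swamps everything else,
# so the coldest days come out as the 'happiest' ones.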

I don’t know about you, but I thought it was pretty warm out yesterday. It seems that Arnall’s Blue Monday, January 19th, would be a much better candidate for happiest day of the year according to this formula. Maybe he accidentally got his bullshit mixed up with his bollocks, and gave us all the wrong date. Now that’s a thought that makes me smile.

Comments Off Posted on Saturday 20 June 2009 at 11:53 am by Emma Stokes
In Biology, Health & Medicine

Scientists using mice have developed a new way to deliver gene therapies. By using hollow particles to deliver a gene into cells, they successfully reversed haemophilia symptoms.

Gene therapy can be used to treat diseases caused by a mutated or missing gene. The technique involves delivering a correct copy of the gene. However, current methods haven’t worked too well in patients; often the gene binds at the wrong place in the DNA or doesn’t integrate itself into the cell. The new technique, which uses very small nanoparticles to deliver the genes, aims to overcome these problems. The team also used a genetic element known as Sleeping Beauty to help integrate the genes into the cells’ DNA.

Haemophilia is a blood disorder caused by a lack of a protein called Factor VIII (FVIII). FVIII helps blood clot after injury, so a lack of the protein means blood cannot clot effectively. The team loaded the nanoparticles with the gene that produces the FVIII protein (along with the Sleeping Beauty element), and covered the particles with chemicals to seek out and selectively bind to specialised liver cells. They then injected the particles into mice and monitored the effect on blood clotting time and levels of the FVIII protein.

At five and 50 weeks the clotting times of the treated mice were about the same as in normal mice, and much shorter than in the untreated group. At 50 weeks the levels of Factor VIII in the blood of mice given the nanoparticles were also the same as in normal mice.

Using nanoparticles with the Sleeping Beauty genetic element seems to work well, and could represent a viable way to deliver gene therapies for various diseases.

More information is available on the Understanding Animal Research website.


Comments Off Posted on Friday 19 June 2009 at 3:07 pm by Emma Stokes
In Biology, Health & Medicine

Researchers have created a GM mouse that develops Parkinson’s disease. This mouse will allow them to study progression of the disease and test new treatments without extensive use of monkeys.

Parkinson’s disease can be caused by genetic mutations, and a number of different mutations are known to trigger the disease. The team looked at just one gene – LRRK2, which sits on chromosome 12. Because the genes responsible for causing Parkinson’s are very long, traditional genetic techniques are unsuitable. So the researchers used a technique called BAC (bacterial artificial chromosome), which uses sections of bacterial DNA to introduce the gene into the mouse DNA.

The mice produced using this technique showed all the signs of Parkinson’s seen in humans. This includes slowed movement and brain cell degeneration. At 10-12 months the transgenic mice were largely immobile with severe defects in their muscle function. However, treatment with levodopa (used to treat Parkinson’s in humans) reversed these defects.

This suggests that LRRK2 is being expressed in the mice in the same way as in humans, so the mice offer the first model of Parkinson’s disease based on a known genetic mutation, replicating features of the human disease.

This is interesting research showing just how important our ability to genetically modify organisms can be. The BAC method was actually nicked from the Human Genome Project, where it was used to sequence the genome – this is the first time it has been used in this context.

If further tests show the model to be as useful as this study suggests, it could lead to significant improvements in our understanding of Parkinson’s disease. Scientists will then be able to devise more tailor-made treatments for patients.

More information is available on the Understanding Animal Research website.


2 Comments » Posted on Friday 19 June 2009 at 8:27 am by Jacob Aron
In Getting It Right, Getting It Wrong


“Science is inevitably biased to some extent,” says Dr Daniele Fanelli, “because it’s made by human beings.” One might easily dismiss this claim as unfounded, but Fanelli has the numbers to back it up. His recent research paper combined over 20 previous studies on scientific misconduct, and found that nearly 2% of scientists admit to falsifying or fabricating data.

Whilst most scientists would shudder at the thought of distorting or inventing results, it seems that a small number are prepared to do so. Fanelli, a researcher in science and technology studies at the University of Edinburgh, believes quantifying and identifying this practice is essential to improving science.

He’s not alone. The UK Research Integrity Office (UKRIO) is an independent advisory body set up in 2006 to support good practice in research and help address cases of scientific misconduct. UKRIO head James Parry stresses that whilst misconduct is not a common occurrence, it is a problem. “We need to take steps to actively promote good conduct and research,” he says.

What causes a scientist to turn away from good conduct, and good science? Fame and fortune are obvious answers, but Fanelli argues some scientists might feel forced into it. “There is an excessive pressure to publish, an excessive reliance on publication record to assess scientific careers.” With scientists needing to keep up appearances, perhaps publishing a falsified paper in an obscure journal seems like the only solution.

It isn’t just smaller journals that fall foul of misconduct, as even the giants of the science publishing world can get it wrong. Parry recalls the case of Jan Hendrik Schön, a physicist at Bell Labs in New Jersey. Over the course of a few years Schön published a slew of papers on superconductivity in high profile journals, including Science and Nature. “It turned out he was faking results,” says Parry. “Some of the data used in one paper had actually been used in another – he’d just labelled it differently.”

Intentionally mislabelling data is high on the list of crimes against science, but Fanelli’s research shows that a much larger proportion of scientists are guilty of lesser offences. One third of those asked admit to a variety of “questionable research practices”, including dropping data based on gut feeling or allowing funding sources to influence a study. Whilst these may just be the research equivalent of a parking ticket or speeding fine, their high prevalence is worrying.

More worrying is that the true misconduct figures could be even higher. Scientists in the surveys Fanelli analysed were self-reporting, and may have chosen not to admit their misconduct. When asked about their colleagues, 14% reported knowing someone who had falsified results, whilst 72% suggested other questionable research practices were taking place. Even these figures don’t paint the whole picture, because one case of misconduct could be reported multiple times. “How these figures relate to the true frequency of misconduct is partly an open question,” says Fanelli.
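
A back-of-the-envelope simulation shows why “do you know someone who has falsified results?” can run well ahead of the underlying rate: if each honest scientist happens to observe a handful of colleagues, a 2% cheating rate is easily “known about” by 15-20% of respondents. This is just an illustration with invented numbers, not anything from Fanelli’s analysis:

import random

def fraction_knowing_a_cheat(n_scientists=100_000, p_cheat=0.02, colleagues=10, seed=1):
    # Toy model: each respondent observes a random handful of colleagues
    # and reports 'yes' if any one of them is a cheat.
    random.seed(seed)
    cheats = [random.random() < p_cheat for _ in range(n_scientists)]
    knows = 0
    for _ in range(n_scientists):
        sample = random.sample(range(n_scientists), colleagues)
        if any(cheats[i] for i in sample):
            knows += 1
    return knows / n_scientists

print(fraction_knowing_a_cheat())  # roughly 1 - 0.98**10, i.e. about 0.18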

Whilst just answering a survey might be easy, actually dealing with a colleague’s misconduct can be harder. “It’s a very stressful situation,” explains Parry, but the UKRIO can help. “If someone comes to us with concerns, we offer confidential and independent advice and guidance.” This support can play a crucial role in exposing potentially harmful misconduct, especially when it comes to health and biomedical research. “It’s the area where there is the most potential for mishap if things go wrong,” says Parry.

It is also the area with the most reported misconduct. “Medically related research has consistently higher admission rates,” says Fanelli. There are two possible explanations for this. Perhaps these researchers are more aware of issues surrounding scientific misconduct and so are more honest, or maybe misconduct rates simply are higher in medicine. Both explanations could be true.

Should we be concerned that we don’t know how many researchers are cooking the scientific books? Fanelli believes this behaviour is not necessarily bad for science, because dodgy data can be used to support research that is subsequently accepted as true. The 19th century scientist Gregor Mendel was posthumously accused of reporting data that was too good to be true, but his work forms the foundation of modern genetics. Thus science is self-correcting in the long term, but for contemporary research misconduct is more of a problem.

The solution, says Fanelli, is greater transparency. “Scientists should report more faithfully what they actually did.” He suggests that if dropping a few data points lends weight to an argument then scientists should go ahead and do so, but must admit to it. And of course, he practices what he preaches: “I’m trying to be as unbiased and objective as I possibly can.”

Fanelli, D. (2009). How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data PLoS ONE, 4 (5) DOI: 10.1371/journal.pone.0005738

1 Comment » Posted on Thursday 18 June 2009 at 5:35 pm by Jacob Aron
In Getting It Right, Psychology


Much of the scientific research into the effects of video games on players’ behaviour concludes that violent games promote aggression. Gamers (including myself) often dismiss these findings, resulting as they nearly always do from poorly designed studies. One infamous experiment used the length of time a person held an air horn down before and after gaming as a measure of aggression – nonsense.

I doubt gamers would say the same of this latest piece of research, published in the journal Personality and Social Psychology Bulletin, which shows that playing “prosocial” games can encourage people to be more helpful and considerate to others. Douglas Gentile, an Iowa State University psychologist, was lead author:

“Dozens of studies have documented a relationship between violent video games and aggressive behaviours,

“But this is one of the first that has documented the positive effects of playing prosocial games.”

The paper presents the findings of three separate studies conducted using different scientific methods and in different countries. This, say the authors, is the best way to establish the true effect of video games on behaviour.

One study asked young teenagers in Singapore to list their favourite games as well as filling out behavioural surveys. Those who played violent games were more likely to hurt others, but players of prosocial games were more likely to help others. Whilst this is useful data, you can’t prove a causal link with this kind of research – did the games make people behave this way, or did their behaviour make them choose certain games?

The second study comes closer to an answer. Nearly 2,000 Japanese children aged 10 to 16 completed two surveys, three to four months apart. Those who increased their exposure to prosocial games became more helpful when questioned later.

Finally, a group of US college students were assigned to play a prosocial, violent or neutral game. They then had to assign puzzles of varying difficulty levels to a partner, who stood to win $10 if they could complete them all. Those who played prosocial games were more likely to assign easy puzzles, whilst hard puzzles were the choice of the violent game players.

I found reading this research very interesting, and it challenged my opinions. The scientists involved weren’t on a “games are evil” crusade, and instead conducted a series of well designed studies that show video games can have both positive and negative effects on players’ behaviour.

It’s easy for me to dismiss those who would attack video games as nothing more than murder-training simulations. It’s harder to do so when they claim positive effects. As the paper concludes: “Video games are not inherently good or bad, just as any tool is not inherently good or bad.” In future, whether I’m on a one man crime spree in Grand Theft Auto, or spring-cleaning a house in Chibi Robo, I’ll be sure to think about the effect games can have.

Gentile, D., Anderson, C., Yukawa, S., Ihori, N., Saleem, M., Lim, K.M., Shibuya, A., Liau, A., Khoo, A., Bushman, B., Huesmann, L.R., & Sakamoto, A. (2009). The Effects of Prosocial Video Games on Prosocial Behaviors: International Evidence From Correlational, Longitudinal, and Experimental Studies. Personality and Social Psychology Bulletin, 35 (6), 752-763 DOI: 10.1177/0146167209333045

Comments Off Posted on Wednesday 17 June 2009 at 5:38 pm by Jacob Aron
In Happenings

Yesterday the Mission Impossible team were once again in the studio for another science-packed hour of radio. As always, you can stream it online if you missed it. This week we have:

  • Run-down of the latest science news
  • A look at the weird world of string theory
  • The always fantastic Call My Scientific Bluff
  • Interview with Andrew Maynard about nanotechnology
  • Discussion with our studio guest of the week, Stuart Clark about astronomy and his latest book, The Sun Kings
  • And a roundup of the latest Web2.0 news

Next week will be the final edition of Mission Impossible, so be sure to listen online, Tuesday 1pm at ICRadio.

Comments Off

2 Comments » Posted on Wednesday 17 June 2009 at 10:39 am by Jessica Bland
In Health & Medicine, Musings, Psychology

New Scientist this week reported the findings of an Australian study, which shows that the figure most men find attractive corresponds to the average UK size 14.

Shown outline sketches of different female torsos, 100 students from New South Wales were asked which they were most attracted to. Their preference for the fuller figure surprised researchers: previous research had shown that a 0.7 waist-to-hip ratio is most attractive irrespective of the woman’s size.

This is brilliant. I can eat as many ice creams as I like this summer, and I will only become more rather than less attractive. My stomach flab will start to roll; my thighs will wobble in places where they don’t normally have any jelly. But, apparently, none of that will matter to the boys.

Or will it? Put that body in skinny jeans and a white t-shirt and it might not have scored so highly. Put it in a leopard print bikini, a tight short skirt or a strapless dress and it would probably do even worse.

Fashion is not, on the whole, created for the fuller figure. Whilst the naked silhouette of a size 14 might be more attractive, the same body but dressed often suffers from unflattering and uncomfortable lines.

So I can only roll my eyes when The Daily Mail’s report on this study is accompanied by pictures of curvier celebrities. There is a giant leap between what is most attractive in a line drawing and what looks best in skin-tight leather.

And mankind, or at least one of them, is inclined to agree. I find myself making the same point as Tom Sykes – Daily Mail journalist and resident irritant. Instead of arguing about why we don’t see size fourteen on the catwalk, he goes for the Playboy angle. Size fourteen girls aren’t the fantasy. The fantasy is the Playboy centrefold because that’s what sells.

I don’t really agree with that: couldn’t the fantasy be constructed by the magazines rather than the other way round? Isn’t a young boy who buys Playboy being influenced by those images of glamour more than the images are pandering to his tastes?

Perhaps. But that’s not the point here.

What is interesting is that both Tom and I looked for ways to belittle the research. Before someone showed me his comments, I had already written that it was “100 students from New South Wales” that were surveyed and that only line drawings were used. He went a little further:

What it actually shows is that the 100 male students surveyed at the University of New South Wales are pathetic wimps, desperate for a quiet life and terrified of offending anyone.

But the sentiment is the same. The research’s results didn’t fit with the way we see things. And so we tried to find holes in it.

I can’t imagine Ronaldo making me his next trophy. But his and Paris Hilton’s romp in LA last week was no surprise. That’s how the world works. At least, that’s how the world I live in works.  And it’s a little painful to realise that even I am willing to dismiss science if it doesn’t fit.

Comments Off Posted on Tuesday 16 June 2009 at 12:20 pm by Jacob Aron
In Health & Medicine

It’s common knowledge that drinking lots of milk will give you healthy teeth and bones, but for once this piece of health advice actually has a scientific basis. Calcium, abundant in milk, is very important in building up bone strength, particularly in young adults whose bones are not fully developed. A study published in the July/August issue of the Journal of Nutrition Education and Behavior suggests that, in the US at least, young people just aren’t getting enough.

Using data from another study designed to examine what teens eat and why, researchers at the University of Minnesota analysed the calcium intake of 1,500 young adults, 45% of whom were male. The study initially quizzed participants with an average age of 16, with a follow-up around five years later.

They found that the majority of teens actually reduced their calcium intake as they grew up. At age 16, more than 72% of girls and 55% of boys had calcium intakes lower than the recommended level of 1.3 grams per day. Later in life these figures fall slightly, but so does the recommended level of calcium: in young adulthood, 68% of girls and 53% of boys fail to get 1 gram per day.

The study suggests that children who are given milk at mealtimes and are encouraged to have positive attitudes towards health and nutrition are more likely to have a higher calcium intake later in life. Time spent watching television, however, was associated with a lower intake – as, unsurprisingly, was lactose intolerance.

Dr. Nicole I. Larson of the School of Public Health at the University of Minnesota and colleagues suggest that encouraging more families to serve milk at mealtimes will combat the fall in calcium intake. As always, it boils down to simple health advice: drink more milk.

Comments Off

2 Comments » Posted on Monday 15 June 2009 at 9:40 pm by Jacob Aron
In Climate Change & Environment

ResearchBlogging.org

It is an undeniable fact that if we are to successfully hold back climate change, people are going to have to make some adjustments to their lifestyles. We simply can’t afford to continue using energy at the current rate, and carbon emissions must be cut. You already know this of course, in fact you are probably sick of hearing it. Therein lies the problem.

Change your ways, we are told. Don’t leave the bathroom light on. Be sure to put out the recycling. And have you considered installing solar panels? There is a constant niggling feeling that we must all do something to fight climate change, but no real idea if our actions have any impact. Surely, people say, there’s nothing that I can do?

New research published in last week’s Nature could have the answer. Scientists from Canada and the UK, led by Damon Matthews, a professor in Geography, Planning and the Environment at Concordia University, have come up with a way of quantifying a person’s individual impact on the climate. It works by simplifying the complex web of interactions involved in climate change into a single number: the global temperature increase per tonne of carbon emitted.

Matthews and colleagues calculated this figure, known as the carbon-climate response (CCR), by running computer simulations and examining historical climate data. Although other climate factors can vary significantly, the CCR appears to remain constant even over a period of 1,000 years. Depending on the model used, they estimate that releasing one trillion tonnes of carbon into the atmosphere will raise the Earth’s temperature by between 1 and 2.1 °C. To put it another way, for every tonne of carbon you emit in your day-to-day activities, the planet will warm by about 0.0000000000015 °C, or 1.5 × 10⁻¹² °C. This is a tiny amount, but it is easy to see how emissions add up.

This 2000 US Department of Energy report gives an average value of around 1.35 pounds of carbon dioxide released for every kilowatt hour of electricity used. We can convert this to pure carbon by multiplying by 12/44, a fraction which takes into account the relative atomic masses of carbon and oxygen. Converting again from pounds to tonnes gives a figure of around 0.00017 tonnes of carbon per kilowatt hour.

From this I calculate that leaving a 100W bulb switched on for a year releases around 0.15 tonnes of carbon into the atmosphere, resulting in a temperature increase of 2.25 × 10⁻¹³ °C. Again, this is a very small amount, but consider how many light fixtures there are in the entire UK when you include all households, offices, shops, schools, hospitals… the list goes on. Estimating the country’s population at 61 million, with 3 light bulbs per person (a number I have admittedly pulled out of thin air, but one that seems reasonable), that works out at a temperature increase of around 0.00004 °C. Now we’re talking slightly bigger numbers, especially when you consider this is just lighting, and just the UK.
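For anyone who wants to check the arithmetic, here is the same chain of conversions as a short Python sketch. The CCR midpoint (1.5 × 10⁻¹² °C per tonne), the US Department of Energy figure of 1.35 pounds of CO2 per kilowatt hour, and the three-bulbs-per-person guess all come from the post above; everything else is plain unit conversion.

```python
# Back-of-the-envelope carbon arithmetic from the post (not code from the paper).
CCR_PER_TONNE = 1.5e-12        # degrees C of warming per tonne of carbon emitted
LB_CO2_PER_KWH = 1.35          # US DoE average, pounds of CO2 per kWh of electricity
LB_TO_TONNE = 0.000453592      # pounds -> metric tonnes
C_FRACTION = 12.0 / 44.0       # mass fraction of carbon in CO2 (12 out of 44)

# Tonnes of carbon per kilowatt hour
tonnes_c_per_kwh = LB_CO2_PER_KWH * C_FRACTION * LB_TO_TONNE   # ~0.00017

# A 100 W bulb left on for a year
kwh_per_year = 0.1 * 24 * 365                                   # ~876 kWh
bulb_carbon = kwh_per_year * tonnes_c_per_kwh                   # ~0.15 tonnes of carbon
bulb_warming = bulb_carbon * CCR_PER_TONNE                      # ~2.2e-13 degrees C

# Scale up to a rough guess of three bulbs for each of 61 million people
uk_warming = 61_000_000 * 3 * bulb_warming                      # ~4e-5 degrees C

print(f"{tonnes_c_per_kwh:.5f} t C per kWh")
print(f"{bulb_carbon:.2f} t C per bulb-year -> {bulb_warming:.2e} C")
print(f"UK lighting guess: {uk_warming:.1e} C")
```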

The crux of the matter is that if we are to avoid catastrophic climate change, switching your lights off really does make a difference. Yes, the effects of these behaviour changes are small, but if everyone does them this new research shows that the effect on the climate can be significant. When you’re next told, for the nth time, to reduce your carbon footprint, remember that doing your bit really does matter.

Matthews, H., Gillett, N., Stott, P., & Zickfeld, K. (2009). The proportionality of global warming to cumulative carbon emissions Nature, 459 (7248), 829-832 DOI: 10.1038/nature08047

Comments Off Posted on Monday 15 June 2009 at 1:18 pm by Jessica Bland
In Getting It Wrong

In the posted version of my article on torture, I made a mistake. I misdescribed Dr. Basoglu’s definition of torture. He had corrected this previously, and he rightly pulled me up for posting the unchanged version:

Therefore, Basoglu argues, the definition of torture used in International law should be modified. “It would be based on four parameters” Intent, purpose and removal of control are all widely-accepted criteria for torture. But Basoglu adds a fourth criterion: “multiple stressors must be present.” So, both combinations of physical events and psychologically stressful situations would constitute torture under this definition.

Basoglu’s correction was:

“Intent and purpose are widely accepted criteria for torture but removal of control is not…It is (a) multiple stressors that interact with each other and (b) removal of control that define the contextual characteristics of captivity settings. It is these two criteria based on learning theory formulation of torture trauma that make the proposed definition novel and evidence-based.”

That is to say, it is not just the fourth of the parameters that is novel. It is also the third. And together they provide a contextual definition of torture.

I hope this will be one of the only times that Just A Theory bloggers have to tag their own writing with the ‘getting it wrong’ tag…

My apologies to Dr. Basoglu and to anyone who read the uncorrected text.

Comments Off

Comments Off Posted on Sunday 14 June 2009 at 4:36 pm by Jacob Aron
In Biology, Inventions & Technology, Psychology, Weekly Roundup

Tweeters aren’t psychic

Earlier this month I reported on Richard Wiseman’s Twitter experiment which hoped to use the social networking site to study psychic ability. Now the results are in, and you don’t need to be able to see the future to predict them.

The experiment consisted of Wiseman going to a location each day, then asking people to Tweet their impressions of where he was. They would then select from five pictures of possible locations. In all four trials, the majority voted for an incorrect location. Even those who confidently declared some form of psychic ability scored zero. Sorry guys, but you’re not special.

Who Pooped?

A strange question, yes, but one with an important answer. Scientists are able to determine the nature of an animal from its faecal matter alone, and now you can too in a game brought to you by the Minnesota Zoo. I guessed all three correctly, so perhaps there is a new career waiting for me.

Ten icons of science – but which is the best?

To celebrate its centenary, the Science Museum have selected 10 objects from their collection that changed the future. From the steam engine to penicillin, each invention or discovery has had a huge influence on our lives today, but which one gets your vote? I think I’ll have to go for the Pilot ACE computer – the first multi-tasking computer. So much of the modern world revolves around computers, and most importantly of all, you wouldn’t be reading Just A Theory without one!

Comments Off

7 Comments » Posted on Sunday 14 June 2009 at 9:55 am by Sam Wong
In Getting It Wrong

The physicist Alan Sokal famously satirised the field of postmodern cultural studies by writing a meaningless spoof paper and getting it published in a journal called Social Text. He described the paper, entitled ‘Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity’, as ‘a pastiche of left-wing cant, fawning references, grandiose quotations, and outright nonsense’. By getting it published, he sought to demonstrate that postmodernist academics were more interested in who wrote a paper and how it sounded than whether it said anything meaningful.

Sokal’s paper was published in a humanities journal with no peer review process. Could something similar happen in a peer-reviewed scientific journal? Concerned about how well papers would be scrutinised by open access journals that charge publication fees to the authors, Philip Davis decided to find out.

Davis, a graduate student at Cornell University in New York, was made suspicious by the glut of unsolicited e-mails he received from Bentham Science Publishers inviting him to submit papers to and even sit on the editorial board of journals for which he had no expertise.

To put their editorial standards to the test, Davis created a gobbledegook paper using a computer program called SCIgen. SCIgen was developed by three students at the Massachusetts Institute of Technology to generate a nonsensical paper to submit to the World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI) in 2005. Their paper, titled ‘Rooter: A Methodology for the Typical Unification of Access Points and Redundancy’, was accepted – not so surprising when you consider that the WMSCI charged speakers $390 to attend.

Davis’s paper was titled ‘Deconstructing Access Points’. Here’s a sample paragraph:

Several encrypted and ubiquitous heuristics have been proposed in the literature. On the other hand, the complexity of their method grows logarithmically as Boolean logic grows. Further, unlike many previous methods, we do not attempt to manage or develop the evaluation of I/O automata. Furthermore, Karthik Lakshminarayanan constructed several lossless solutions, and reported that they have tremendous effect on the deployment of Internet QoS. This is arguably unreasonable. As a result, the class of frameworks enabled by TriflingThamyn is fundamentally different from previous approaches [13, 21]. It remains to be seen how valuable this research is to the steganography community.

Just in case it wasn’t obvious enough that this was a hoax, Davis put down his institutional affiliation as the ‘Centre for Research in Applied Phrenology’ (CRAP). He submitted it to The Open Information Science Journal, and four months later it was accepted. He was invited to pay the $800 publication fee.

The journal claimed that they knew it was a hoax. ‘We tried to find out the identity of the individual by pretending the article had been accepted for publication when in fact it was not,’ Mahmood Alam, Bentham’s Director of Publications, told New Scientist. But on Friday, Bambang Parmanto, the editor-in-chief of The Open Information Science Journal resigned, blaming the mistake on ‘a breakdown in the process’.

Thanks to the internet, the subscription access model of scientific publishing is looking increasingly anachronistic, and it will surely only be a matter of time before all research papers are freely available for all to read. But the model in which journals make a profit from publication fees charged to authors is also undesirable: it risks excluding research from developing countries or less well-funded fields. Further, as Philip Davis has demonstrated, the greed of the publisher can mean that the review process is not as scrupulous as we would hope. It is important that we come up with a system in which scientific papers are published so that anyone can access them freely, but without compromising the integrity of the peer review apparatus. The best solution must be one in which no one makes a profit from publishing.

Comments Off Posted on Saturday 13 June 2009 at 4:41 pm by Jacob Aron
In Space & Astronomy

Three stories for you today from the great big universe out there. First up, astronomers have found evidence for the birth of a new planet orbiting a binary star system. A rotating molecular disk that has formed around a pair of stars known as V4046 Sagittarii is thought to be a planet in the making. It is also the first confirmation that planets can emerge from binary star systems, giving us new places to look in the search for other planets. David Wilner of the Harvard-Smithsonian Center for Astrophysics says:

“This is strong evidence that planets can form around binary stars, which expands the number of places we can look for extrasolar planets. Somewhere in our galaxy, an alien world may enjoy double sunrises and double sunsets.”

Whilst that star system is growing, another one is getting smaller. The red supergiant Betelgeuse, located in the top left of the constellation Orion, has shrunk by 15% in 15 years.

Researchers at the University of California, Berkeley have been monitoring the star, but don’t know the cause of the shrinkage. Betelgeuse is about ten times wider than the distance from the Earth to the Sun, meaning it has shrunk by a distance equivalent to the orbit of Venus.

Discoveries like these could get harder to make in the future however. Light pollution now means that one fifth of the world’s population cannot see the Milky Way in the night sky. Those missing out are mostly in mainland Europe, the UK and the US, according to Connie Walker, an astronomer from the U.S. National Optical Astronomy Observatory in Tucson, Arizona. She presented her findings to the American Astronomical Society at a meeting this Wednesday.

Comments Off

3 Comments » Posted on Friday 12 June 2009 at 2:31 pm by Jacob Aron
In Biology, Getting It Wrong, Health & Medicine

ResearchBlogging.org

Place your hand over your heart. Now move it to your stomach. How about your thyroid? Ok, that last one is a little trickier, but I’d be shocked to meet anyone who couldn’t do the first two. Well, it’s time to be shocked.

A study published in the journal BMC Family Practice has found an appalling lack of public knowledge of human anatomy. The research, carried out by psychologists at King’s College London, aimed to discover whether public understanding of anatomy had improved since a similar study in the 70s. It hasn’t.

Clue: It isn't D.

They gave over 700 people multiple choice questions like the example above. Most were patients currently undergoing treatment for one of six types of conditions; the researchers were interested to see whether a patient with respiratory problems would be able to identify the location of the lungs, for example. The rest of the sample (133 participants) were members of the public.

In the test above, 44% of the public failed to find the true location of the heart. For cardiac patients the results were even worse, with just over half seemingly unaware of the placement of their troublesome organ.

As the researchers rightly point out, this knowledge gap poses a significant problem for doctors trying to inform patients about their illness. They point to previous studies which show that many people do not know the difference between pairs of medical terms, like heart attack and myocardial infarction, or fracture and broken bone.

I’m not too worried about that kind of knowledge – I couldn’t tell you the difference between those terms, because I’m not a doctor. What I simply can’t fathom is how it is possible for anyone to not know where their heart is. We feel it beat every second of every day. After heavy exercise, our heartbeat is so strong that you can hear it. Other organs fare even worse: 72.9% could not correctly place the lungs. What do these people think is going on in their body?

We can take some comfort in the fact that, as you might expect, the study found levels of knowledge increased amongst more educated participants. There was also a slight decrease in knowledge among older participants, suggesting that education is slowly improving. Perhaps public understanding of anatomy is getting better then, but this research shows that a lot more work needs to be done.

Weinman, J., Yusuf, G., Berks, R., Rayner, S., & Petrie, K. (2009). How accurate is patients’ anatomical knowledge: a cross-sectional, questionnaire study of six patient groups and a general public sample. BMC Family Practice, 10 (1) DOI: 10.1186/1471-2296-10-43

Comments Off Posted on Friday 12 June 2009 at 8:38 am by Jessica Bland
In Getting It Right, Psychology, Science Policy

I wrote the following for Felix newspaper at Imperial College. I was interested in the US definition of torture. The story has become more relevant this week with the revelations about possible waterboarding used by the Met: an interesting case of copycat tactics which shows that US attitudes can have repercussions well outside their national borders. The news on Wednesday adds force to Shue’s comment that by not changing the US definition of torture, Obama has not done enough to prevent another Guantanamo…

———

We have a right not to be tortured. It is a basic human right – one that stretches across borders and cultures to societies that share few other values. The condemnation of torture is a constant where many other things are not.

But what do we mean by torture: forcing prisoners to stand for hours at a time? Playing them the same song over and over for three days? Recreating the feeling of drowning? Under international law, none of these is torture. They are mentally, but not physically, abusive. And torture is defined as physical abuse.

The public debate following Obama’s release of the details of CIA interrogations in Guantanamo has centred round whether or not these mentally abusive techniques are torturous enough to make them illegal. And new research published last week in the American Journal of Orthopsychiatry adds to the mounting evidence that they are, or at least that they should be.

Torture victims from former Yugoslavia countries and Turkey rated the stressfulness of their overall torture experience. Those that experienced high levels of cruel, inhuman and degrading treatment (CIDT), such as forced stress positions or waterboarding, rated their overall torture experience as more stressful than those who suffered physical torture. CIDT victims also showed higher rates of post-traumatic stress disorder.

“There is a widely held misconception of torture,” said Dr. Metin Basoglu, author of the study. “It is not just something that happens in the course of the interrogation process. It incorporates all of the other circumstances in which these events occur.”

Basoglu identifies 46 different contextual factors and it was the stress these caused that participants were asked to rate. “Think of it from the perspective of the person. They perceive a wide range of stressors, even when these stressors are not intentionally inflicted upon the person for torture purposes.”

It is not just that context is important. It is more important than the amount of physical pain. There is no clear correlation between increased physical pain and overall stress. But the correlation between CIDT and overall stress implies that psychological context is influential.

Therefore, Basoglu argues, the definition of torture used in International law should be modified. “It would be based on four parameters” Intent, purpose and removal of control are all widely-accepted criteria for torture. But Basoglu adds a fourth criterion: “multiple stressors must be present.” So, both combinations of physical events and psychologically stressful situations would constitute torture under this definition.

Others argue this kind of international redefinition is impractical and unnecessary. “Changing international law is not a relevant solution – it requires a lot of energy and negotiations. And it would take a long time to go through,” said Henry Shue, Professor of International Relations at Oxford University and an influential writer on torture. Instead, he believes that what needs to be changed is the US legal definition of torture.

“Under the UN Convention law both torture and what Basoglu calls CIDT are illegal. So the distinction between them does not much matter. But when the US ratified the convention in 1988, Reagan interpreted the convention as only applying to physical abuse and psychological conditions arising from physical abuse.” This meaning is the one that was incorporated into US law in 2006 in the Military Commissions Act.

Shue emphasised that “this is not something that started with Richard Cheney and George Bush”. But under the recent Bush administration it became law. And despite publishing torture memos detailing interrogation techniques used in Guantanamo, Obama has done nothing to reverse the distinction between the UN convention and US law.

“I am disappointed with Obama’s response on this issue. He has said he wishes to abolish torture, but has not addressed the definition of torture,” Shue said. He explained that it still leaves open the possibility of future Guantanamo like interrogations.

In the face of research like Basoglu’s, it is difficult to see how America can keep using the narrow Reagan definition. Redefining torture might seem like a pedantic effort in the case of international law. But in the US, we have already witnessed the horrifying consequences of leaving a gap between what is torturous and what is law. Let’s hope the Obama administration doesn’t let this linger – that they don’t let it become their first mistake.

Comments Off

5 Comments » Posted on Thursday 11 June 2009 at 11:38 am by Jacob Aron
In Chemistry

The periodic table is about to get a little bigger, with the addition of element 112. Whilst it was discovered over a decade ago, the “super-heavy” element has only now been officially recognised by the International Union of Pure and Applied Chemistry (IUPAC) – and it is in need of a name.

The honour will fall to Sigurd Hofmann and his team at the Centre for Heavy Ion Research, who first created a single atom of element 112 in 1996 by using a particle accelerator to fire a beam of zinc ions at lead atoms. This fuses the nuclei of the two elements, forming a new one.

Elements at the end of the periodic table are very large and heavy, making them unstable and liable to decay. After just a few milliseconds the nucleus falls apart, releasing energy and elements from higher up the periodic table.

The short-lived nature of element 112 has made pinning it down a rather tricky task, which is why it has taken so long to be officially added to the periodic table. So far, only four atoms have ever been observed. Now, it’s ready to take its rightful place alongside the other 111 elements.

The IUPAC uses a slightly strange system for the names of unconfirmed and undiscovered elements, in order to avoid people trying to nab a name before the element is official. It uses a mix of Greek and Latin to spell out the element’s atomic number, thus element 112 is currently known as ununbium – or “one one two”-ium.
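For the curious, the digit-to-root mapping is simple enough to write down. The sketch below is my own toy illustration of the scheme (not anything published by IUPAC): each digit of the atomic number becomes a Latin or Greek root, the roots are glued together with an “-ium” ending, and a doubled letter is collapsed.

```python
# Toy implementation of IUPAC's provisional element-naming scheme.
ROOTS = ["nil", "un", "bi", "tri", "quad", "pent", "hex", "sept", "oct", "enn"]

def systematic_name(z):
    """Return the provisional (name, symbol) for atomic number z."""
    roots = [ROOTS[int(d)] for d in str(z)]
    name = "".join(roots) + "ium"
    # Elide doubled letters: 'biium' -> 'bium', 'ennnil' -> 'ennil'
    name = name.replace("ii", "i").replace("nnn", "nn")
    symbol = "".join(r[0] for r in roots).capitalize()
    return name.capitalize(), symbol

print(systematic_name(112))   # ('Ununbium', 'Uub')
print(systematic_name(118))   # ('Ununoctium', 'Uuo')
```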

Professor Hofmann and team are currently working on a shortlist of names, but a number of suggestions have cropped up on Twitter. Ones that I’ve seen include Lehrerium, after Tom Lehrer of the Elements Song, Kryptonite, the strange space rock that weakened Superman, and Obamium, which probably doesn’t need explanation.

I think in honour of his 150th anniversary, Darwinium would be appropriate. But then, Darwin didn’t really have much to do with chemistry. Any other suggestions?

Comments Off Posted on Wednesday 10 June 2009 at 9:48 pm by Jacob Aron
In Inventions & Technology, Yes, But When?

Two stories today of new technology that could one day be built in to mobile phones. The first, developed by engineering students at Duke University, would allow people to write short notes by simply waving their phones in the air. The technology makes use of accelerometers already inside many newer phones, like the iPhone, which are used to switch the display from portrait to landscape view.

“We developed an application that uses the built-in accelerometers in cell phones to recognize human writing,” said Sandip Agrawal, one of the developers of the PhonePoint Pen. “By holding the phone like a pen, you can write short messages or draw simple diagrams in the air.

“The accelerometer converts the gestures to images, which can be sent to any e-mail address for future reference,” Ionut Constandache said. “Also, say you’re in a class and there is an interesting slide on the screen. We foresee being able to take a photo of the slide and write a quick note on it for future reference. The potential uses are practically limitless. That this prototype works validates the feasibility of such a pen.”

Whilst I can see the appeal of this, I’m not sure what more it offers over just using buttons or a touchpad keyboard. Perhaps if the keys are too small for you it would be a useful alternative, but I think I’ll pass.

Perhaps more useful is this second story: Nokia are working on a phone that charges itself without being plugged in. This seemingly magical feat is made possible by sucking in power from the sea of electromagnetic energy that surrounds the modern world.

We are constantly wading through radio, TV and WiFi signals emanating from all directions, and scientists at the Nokia Research Centre in Cambridge have created a phone that harvests tiny amounts of power from a wide range of frequencies. So far though they have only achieved a tiny 5 milliwatts, which isn’t much use. Their next goal is 20 milliwatts, which would allow a phone to remain on standby indefinitely. Ultimately they hope to reach 50 milliwatts, enough to slowly recharge the handset.

You’re going to have to wait a while for both of these new gadgets though. In the case of PhonePoint Pen the team expect to put a prototype up for download in the next few months, but the recharging phones are at least three to five years off.

Comments Off

3 Comments » Posted on Wednesday 10 June 2009 at 12:56 pm by Jessica Bland
In Musings, Science Policy

On Friday, the UK government department that represented science for the last couple of years, the Department for Innovation, Universities & Skills (DIUS), was disbanded. In addition, Lord Drayson changed his title from Minister of Science & Innovation to Minister of Science & Defence.

The obvious response from those with a stake in science’s political profile is to complain. And perhaps rightly so; a press release from the Chairman of the Innovation, Universities, Science and Skills Committee (IUSS), Phil Willis, showed that even he felt that science had been let down:

“The real casualty of this ill-thought out re-organisation is the nation’s strategic science base.”

But I disagree. Although we are right to complain about expensive reshuffles (according to the FT, £7 million was spent on setting up DIUS for it to last only 20 months), I don’t think that science has much to worry about.

Control of science-related policy is now with Lord Mandelson in his new Department for Business, Innovation & Skills (DBIS). This initially felt like the next in a quick succession of steps by government to commercialise science. First, politicians asked scientists to outline the commercial potential of their work in all new grant proposals. Then, they skimmed off research council money to be used only for projects with clear economic potential. As I mentioned in a blog entry in May, this has already caused a public fight between George Monbiot at the Guardian and Lord Drayson. Other science community publications have picked up on it as well: “The Economic Impact Fallacy” by Philip Moriarty in Physics World this month provides a forceful argument against these new economic shackles for science.

But Lord Drayson has promised to keep the science budget separate from the rest of DBIS. So, despite the other, recent disappointing changes to the structure of science funding, not much has changed this time.

Moreover, as was set out in an email dialogue-cum-blog from The Times’s science correspondents, there are some palpable advantages to the move:

1) Having two Lords and Cabinet Ministers, Mandelson and Drayson, behind science is not a bad thing. Particularly given Mandelson’s healthy relationship with No. 10.

2) Phil Willis has used the disbandment of IUSS as an opportunity to ask for a new Committee on Science & Engineering. Given that the previous committee was shared with innovation and universities, this move would be upping rather than diluting government’s science dosage.

There is one niggling doubt though. Lord Drayson has swapped Innovation for Defence in his shared role with Science. And as much as scientists are worried about becoming economic pawns, there is a much greater threat in getting too close to the military.

To be fair, Drayson did well defending his move yesterday on Twitter (a useful rundown of which is here). He stated clearly that the two roles are completely separate. And as my colleague Colin Stuart (@skyponderer) tweeted,

“Hats off for the chance for dialogue. Very impressed we can all chat to the Minister for Science about such key issues.”

At least Drayson is willing to engage openly on the subject. More hope came this morning when Lord Mandelson said:

“Lord Drayson will give the overwhelming bulk of his time, to science, innovation, and technology.”

1 Comment » Posted on Tuesday 9 June 2009 at 2:37 pm by Jacob Aron
In Climate Change & Environment

The science of climate change says that we should all be making changes to our lives in order to reduce the amount of carbon dioxide we release in to the atmosphere. Switch off those lights, buy a hybrid, and do your bit. But what about the scientists themselves?

Ryan Brook of the University of Calgary in Canada believes that researchers into climate change should be wary of saying “Do as I say, not as I do.” Scientists who undertake expeditions to the polar regions in the name of studying climate change actually have rather large carbon footprints themselves.

Writing in the June issue of Arctic, the journal of the UoC’s Arctic Institute of North America, Brook calculates that his own research footprint amounts to 8,300 kg of CO2 per year. In comparison, an average citizen of Toronto produces 8,600 kg of CO2 per year.

“My research footprint is about the same as the annual footprint of an average Toronto resident. Basically, I have two footprints—my own personal life, which is moderate, and my research footprint,” he says.

The figure is so high because of the numerous helicopters, planes and ships required to carry out climate change research. One possible solution is for scientists to purchase carbon offsets for their research. Whilst Brook says his colleagues “dismiss them as a sham”, he believes that buying offsets will promote dialogue and leadership from the scientific community.

“There aren’t necessarily any easy answers, but we need to start talking about it,” says Brook. “This is particularly important for the next generation of scientists being trained and I hope to see them become leaders in this issue.”

1 Comment » Posted on Monday 8 June 2009 at 11:12 pm by Seth Bell
In Inventions & Technology, Psychology

If there is one thing which I’ve learned from watching films, it’s that in the future robots are going to cause a lot of trouble for human beings. Once they become advanced enough they will join together to kill us, enslave us or perhaps just keep us in containers to be used as a power source. Terminator Salvation, released last week, tells the story of how human beings are fighting against the machines for their survival even in 2018.

Can’t we just work together and get along? Well, maybe we can. Researchers on the JAST (Joint-Action Science and Technology) project are looking into ways to build robots which can anticipate human action, allowing them to collaborate with human beings on simple tasks.

The JAST team is multidisciplinary, bringing together scientists from robotics, psychology and the cognitive sciences. In order to make human-robot interactions more natural, the team first examined how humans collaborate, in particular how we observe each other when working together on an activity.

When we watch someone doing a task our brains activate mirror neurons to map the activity. This allows us to work out what is going on and notice when someone makes a mistake. The JAST team have programmed their robot to use a similar observation based system, allowing it to compare a person’s action to the task at hand.

The net result is that the JAST robot does not just learn how to do a particular task, but rather learns how to work with a human being to accomplish it. For example, it can distinguish between different tools and the different ways in which its human partner can hold them for different functions.

This looks like a step towards improved human–robot relations. Maybe we will never need John Connor for our salvation, but I still can’t shake off the feeling that robots like these are just trying to lure us into a false sense of security…

You can see a video of the robot on the ICT results website. It doesn’t look as cool or sophisticated as most movie robots, but I do like its orange top.

1 Comment » Posted on Monday 8 June 2009 at 8:05 pm by Jacob Aron
In Psychology

As any hardcore gamer will tell you, sometimes the lure of “just one more level” proves too much. It’s only when the sun begins to rise that you realise perhaps you’ve been playing a little too long. Is the occasional late night something to worry about though?

Research presented today at SLEEP 2009, the 23rd Annual Meeting of the Associated Professional Sleep Societies, suggests that it might be. Amanda Woolems of the University of Arkansas found that “excessive” gamers sleep less than casual gamers. There was a positive correlation between hours played and sleepiness as rated by the Epworth Sleepiness Scale, and participants who reported that gaming interfered with their sleep slept 1.6 hours a night less than other gamers. Those claiming to be addicted to gaming also slept one hour less on weekdays.

The study surveyed 137 students, the majority (86) of whom were women. Just under 11% said that gaming interfered with their sleep, whilst close to 13% admitted a gaming addiction.

Fairly conclusive, you might think, but I’ve got one problem with this research: the definition of “excessive” gaming. Participants who spent more than seven hours a week using the internet and playing games fell under this definition. To my mind, this isn’t excessive, it’s normal – especially considering the mean age of the participants was 22. I’d be very surprised if any of them spent less than seven hours a week on the internet, let alone gaming. And why is internet use even being included, if the study is looking at gaming?

Unfortunately, details are scarce because this is research being presented at a conference rather than published, and it likely hasn’t undergone peer review yet. Whilst the definition of excessive shouldn’t influence the results too much, it does call into question the other factors measured in this research and how they were defined. Until I see anything more conclusive, I won’t worry about the occasional all-nighter.

Comments Off Posted on Sunday 7 June 2009 at 5:02 pm by Jacob Aron
In Health & Medicine, Science Policy, Weekly Roundup

New department for science

With all the political turmoil of the past week it may have slipped you by that the Department for Innovation, Universities and Skills (DIUS) is no more. As part of Gordon Brown’s reshuffle, it will merge with the Department for Business, Enterprise and Regulatory Reform (BERR) to become the new Department for Business, Innovation and Skills (DBIS).

What this means for science is unclear, though the government pledge that DBIS will “continue to invest in the UK’s world class science base and develop strategies for commercialising more of that science.” Lord Drayson, Minister for Science and Innovation in DIUS and now DBIS, stated that “The science ring-fence is safe and sound and the innovation agenda will further benefit from this move.”

Tetley: not everyone’s cup of tea

Tea makers Tetley have been banned from broadcasting an advert for green tea after the Advertising Standards Authority (ASA) ruled against misleading health claims.

The advert shows a woman about to go for a run before discovering it is raining. Instead, she makes a cup of tea, with a voice-over stating “For an easy way to help look after yourself pick up Tetley Green Tea. It’s full of antioxidants.”

Whilst the ASA dismissed four viewer complaints that Tetley were trying to equate green tea with exercise, they did decide the company were trying to claim health benefits beyond mere hydration, and banned the advert.

Whilst it’s nice to see advertisers being taken to task, I do wish the ASA would show some consistency. Why is this not allowed, when Miracle Gro can advertise their organic compost as “100% chemical free”?

Tomorrow’s World, today

The classic BBC science magazine programme Tomorrow’s World is being reinvented as Bang Goes The Theory, “a new series that looks at how science shapes the world around us.”

Terrible, terrible name aside, I’m cautiously optimistic about this new programme. The presenters all seem to have backgrounds in science and science communication, and there is even one PhD, Dr Yan Wong. The editing of the trailer (linked above) makes it look like they are trying a little too hard to be stylish, but I will reserve judgement until the first episode is broadcast. Unfortunately I can’t tell you when that is, as the BBC continue their aversion to actually telling you when their programmes start – “late July” is the best we’ve got.

Comments Off

10 Comments » Posted on Sunday 7 June 2009 at 4:18 pm by Sam Wong
In Health & Medicine

This week saw the launch of a new pill called Ateronon which, according to the press release, ‘is expected to revolutionise approaches to heart health’. Ateronon, we are told, ‘is the first formula proven to halt the oxidation of low-density lipoprotein (LDL) cholesterol, recognized as the key process of atherosclerotic build-up’. The active ingredient is lycopene, a pigment that occurs naturally in tomatoes. Lycopene isn’t very absorbable, but Nestle discovered that it can be made more absorbable by combining it with whey protein to make ‘lactolycopene’.

The new pill has been developed under license from Nestle by Cambridge Theranostics Ltd. The promotional material talks a lot about the legendary Mediterranean diet, which has been linked to a lower risk of heart disease. Because it’s made entirely from naturally occurring ingredients, it’s being treated as a food supplement and not a drug, meaning much less rigorous testing. Ateronon will be available over the counter from next month.

Does it work? Presumably they’ve published some research showing that it does. I put ‘ateronon’ into PubMed.

Your search for ateronon retrieved no results. However, a search for ‘afternoon’ retrieved the following [5500] items.

I decided to leave trawling through those results to find out whether the afternoon can prevent heart disease for another day. I tried ‘lactolycopene’ instead. Two results, one of which was relevant: a 2002 paper, looking at 33 people, which found that you get a similar amount of lycopene from the lactolycopene supplement as you get from tomato paste.

Evidence that lactolycopene could prevent heart attacks and strokes remained elusive, so I tried getting hold of Cambridge Theranostics. I didn’t get an answer, so I tried their PR company, and someone helpfully sent me some documents. One was an ‘expert report’ by Prof Alf A. Lindberg, dated 2006. It describes a pilot phase I study of 18 people with angina. After two months of taking lactolycopene, lipoprotein oxidation (a biochemical process linked with atherosclerosis) was blocked in all 18 patients. Seventeen of them showed clinical improvements, as measured by a questionnaire.

Another document they sent me described two studies by Dovgalevsky and Petyaev which involved giving lactolycopene to coronary heart disease patients. One had 12 subjects, the other 10. In both studies, lipoprotein oxidation was blocked in patients given lactolycopene. The patients also experienced ‘improvements in the clinical status as assessed by a validated clinical questionnaire’.

So the evidence for Ateronon’s efficacy is three small studies in which taking lactolycopene led to reduced lipoprotein oxidation and clinical improvements measured by a questionnaire. None of the studies tested more than 18 people. None of the studies tested healthy people. None of the studies tested whether lactolycopene can prevent heart attacks, strokes, or any actual disease. None of the studies has been published in a peer-reviewed journal. Perhaps most criminally, none of the studies compared lactolycopene with another drug or a placebo.

Maybe lactolycopene can prevent atherosclerosis. In fact I very much hope it does. But at the moment, the claims being made for Ateronon have not come close to being proven.

Comments Off Posted on Sunday 7 June 2009 at 3:05 pm by Emma Stokes
In Space & Astronomy

When Thomas Passvogel applied for a job at the European Space Agency in 1996 he did so for one reason. He had his sights set on the role of Programme Manager for the launch of a satellite called Herschel. Four years later his dream was realised, but before long he found himself co-ordinating the launch of two satellites, as a second satellite, Planck, joined the line-up.

Herschel and Planck have been hailed as two of the most sophisticated astronomical spacecraft ever built, and the project itself is an impressive example of worldwide teamwork between scientists and technicians. Although the two satellites are going to the same region of space, they are there to observe very different things.

Herschel is looking for clues as to how stars and galaxies are formed. The problem is that much of this process occurs in the heart of vast dust clouds, so it is difficult to see. This is where Herschel’s huge mirror comes in. It is the largest mirror ever to be launched into space, and allows Herschel to detect light in the far infrared. This type of light is able to penetrate the clouds and will hopefully provide a unique insight into what happens in these areas of deep space.

The Planck satellite will look at echoes of the Big Bang itself. Background radiation still lingers from the Big Bang moment, and is subject to temperature changes. Planck will monitor these changes, to hopefully give clues as to the universe’s origin, evolution and future.

Charles Lawrence from NASA describes the project as an “outstanding example of international collaboration… despite the issues of working together across different time zones.” Dr Passvogel admits it wasn’t easy to co-ordinate teams from around the world. “In theory,” he said, “all the pieces would arrive from all the different labs on the correct day, and would be assembled together. However in practice, there was much more to it.”

The only real hiccup in the project was a delay in the launch. In March, the European Space Agency revealed the launch was being postponed by at least a couple of weeks. Although this announcement came close to the original launch date, Dr Passvogel explained that this was the best scenario as “the time allowed us to be better prepared for the launch which went like clockwork.”

The two satellites were finally launched into space on May 6th, and nobody was more excited than Dr Passvogel. He described the feeling as “fantastic but emotional, like when your kids leave home. You’re happy for them because they’re living on their own; however it’s still emotional to let them go.”

It is clear when talking to both Dr Passvogel and Dr Lawrence that they are very proud of this project, and indeed passionate about their fields. “I can’t imagine doing anything else that I enjoy as much,” says Dr Lawrence, whilst Dr Passvogel describes the launch as “the most exciting moment of my academic career.”

Comments Off

Comments Off Posted on Saturday 6 June 2009 at 3:15 pm by Jacob Aron
In Biology, Climate Change & Environment, Space & Astronomy

ResearchBlogging.org

I’m almost tempted to leave you with just the title of this post, but perhaps a little bit of explanation is required. It seems that scientists from the British Antarctic Survey (BAS) have found a rather novel way to monitor penguin population levels in the ice region – using satellite imaging to search for their poo.

Peter Fretwell and Dr Philip Trathan of the BAS outlined their novel technique in a paper published this week in Global Ecology and Biogeography. Using images taken by space satellites they were able to identify colony locations of emperor penguins in Antarctica. Despite the image quality being too low to pick out individual penguins, they were able to infer the presence of a colony by the distinctive brown stain they left behind.

Spot the stain.

Penguin poo, or guano, stands out from the white and blue sea ice as the only brown around. By picking out these areas of discolouration, Fretwell and Trathan found a total of 38 colonies, 10 of which were previously unknown. Emperor penguins are vulnerable to changes in the sea ice, so accurate information about colony locations is important in assessing the impact of climate change on the population.
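To make the idea concrete, here is a toy sketch of what “picking out areas of discolouration” could look like – a made-up brown-pixel threshold in Python, purely illustrative and not the classification the BAS team actually used.

```python
import numpy as np

def guano_mask(rgb):
    """rgb: an H x W x 3 array of floats in [0, 1]. Flags 'brown' pixels:
    noticeably more red than green and more green than blue, unlike sea ice."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > g + 0.05) & (g > b + 0.05) & (r > 0.2)

# Fake four-pixel "scene": top row is ice, bottom row is brownish staining
scene = np.array([[[0.90, 0.90, 1.00], [0.80, 0.85, 0.95]],
                  [[0.50, 0.35, 0.20], [0.45, 0.30, 0.15]]])
mask = guano_mask(scene)
print(mask)          # [[False False]
                     #  [ True  True]]
print(mask.mean())   # fraction of the scene flagged as stained, here 0.5
```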

Whilst searching for poo from space might sound silly, this research actually has important consequences for animal conservation. Unfortunately this method, whilst useful for finding unknown colonies, cannot really provide accurate estimates of the number of birds at each location. As such, the researchers call for further work to determine emperor penguins’ vulnerability to climate change.

Fretwell, P., & Trathan, P. (2009). Penguins from space: faecal stains reveal the location of emperor penguin colonies Global Ecology and Biogeography DOI: 10.1111/j.1466-8238.2009.00467.x

Comments Off

Comments Off Posted on Friday 5 June 2009 at 5:43 pm by Jacob Aron
In Biology, Mathematics

Gangs of teenagers roaming the land are generally bad news, but not when it comes to ravens. Juvenile ravens hunting in packs have gotten some scientists very excited, as this behaviour was predicted by a mathematical model before ever being seen in the wild.

Dr Sasha Dall lectures in mathematical ecology at the University of Exeter, and in 2002 set out to solve an evolutionary puzzle: why do young ravens share their food? Natural selection tells us organisms should only help themselves and their relatives. It seems that no one told the ravens.

Typically, juvenile ravens spend their winters drifting in and out of communal roosts. They scavenge for food, usually sheep carcasses, by themselves. Having found a tasty meal they return to the roost and recruit other ravens for a feast the next day. These shared dwellings can house up to 100 individuals, but they don’t stick around. Each bird will move on every few days to another roost and probably won’t encounter their former roommates again.

“From an evolutionary perspective, this is a bit weird,” says Dall. The ravens are unrelated so will not pass on their genes by helping out others. They also don’t encounter the same individuals often enough to build up a sense of co-operation. Using a technique called game theory, in which many different strategies are played out, Dall built a model to explain this unusual behaviour.

The favoured hypothesis amongst ecologists was that roosts act as a kind of “information centre” to the advantage of all the juveniles. Individual birds are unlikely to find a carcass by themselves, but if every bird shares information about food locations then they all benefit.

Dall’s model showed that this strategy emerged naturally when ravens try to maximise their access to food. “In the long run, they find more carcasses than they otherwise would,” he says. Bringing a few friends along also allows young birds to chase off any adults who might lay claim to carcasses in their territory.
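To get a feel for why sharing pays, here is a toy back-of-the-envelope calculation of my own – not Dall’s game-theory model – assuming each juvenile finds a carcass on a given day with some small probability, and that a shared carcass feeds the whole roost.

```python
# Toy "information centre" arithmetic (illustrative assumptions, not Dall's model).
def chance_of_eating(p, roost_size, share):
    """p: chance one bird finds a carcass on its own in a day."""
    if share:
        # You eat if *anyone* in the roost found a carcass and reported it
        return 1 - (1 - p) ** roost_size
    return p   # foraging alone, you only eat what you find yourself

p = 0.05   # hypothetical daily success rate for a lone juvenile
for n in (1, 10, 50, 100):
    print(n, round(chance_of_eating(p, n, share=True), 3))
# 1 -> 0.05, 10 -> 0.401, 50 -> 0.923, 100 -> 0.994
```

Even with a modest individual success rate, a bird in a 100-strong roost that pools information eats almost every day.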

Problem solved then – except the model didn’t provide just one answer. “I did manage to predict this typical behaviour, but my model came up with another evolutionarily stable strategy,” explains Dall. According to the model, gangs of juvenile ravens should also fly around looking for food together, and never roost in the same place twice. But no-one had ever seen this kind of behaviour.

Perhaps this would have been dismissed as a purely mathematical curiosity, if it weren’t for Jonathan Wright, professor of biology at the Norwegian University of Science and Technology. Wright was studying a large raven roost in North Wales when he noticed that the juvenile birds were organising themselves into hunting packs, just as Dall predicted.

“I was surprised to discover that this behaviour had been observed somewhere,” says Dall. The variables used in the model, such as the size of the ravens’ search area, matched the real world exactly. The two scientists wrote up their findings in a joint paper, published earlier this year in the journal PLoS One.

So what will Dall turn his mathematical predictions to next? “The evolution of animal personality differences,” he says. Dall plans to investigate why animals of the same species behave differently within social groups. Perhaps game theory has the answer.

Comments Off

Comments Off Posted on Thursday 4 June 2009 at 10:52 am by Jacob Aron
In Happenings, Inventions & Technology, Just A Review

This review originally appeared in the most recent issue of Imperial College’s science magazine I, Science.

Wallace & Gromit, the nation’s most beloved plasticine duo, have arrived at the Science Museum. I went along with I,Science editor Mico Tatalovic to check out the new exhibition, Wallace & Gromit present A World of Cracking Ideas.

The duo are known for their crazy inventions that inevitably go horribly wrong, and it seemed that the Science Museum’s lifts were getting into the spirit of things. As we waited for a ride to the exhibition floor, one of the Museum’s sleek glass lifts arrived, but refused to open its doors before shooting off again. It eventually returned and we stepped aboard, only to find ourselves stuck between floors. “Perhaps we’ll get the stairs next time,” I said to Mico. Thankfully we were not trapped for long and, for the rest of the morning at least, the inventions on display behaved themselves.

Working in collaboration with Wallace & Gromit creators Aardman Animation, the Science Museum have recreated their home, 62 West Wallaby Street, and stuffed it full of things to see and do. With funding from the Intellectual Property Office, the £2m exhibition is designed to inspire the nation’s creativity and get us all inventing.

Visitors will find “Idea Stations” in each room of the house where they can scribble down their new creations, before sending them off to Wallace & Gromit through a suitably wacky delivery process, the Eureka Brainwave. This overhead conveyer belt channels ideas through the exhibition to the Thinking Cap Machine, which…turns them into paper hats. A bit of a let-down if you have just submitted your idea for the next iPod killer, but kids will love it.

As well as coming up with your own ideas, you can play around with Wallace & Gromit’s. In the living room you’ll find the Tellyscope, their answer to the television remote. After throwing enough balls at a target (both Mico and I were hopeless throwers), a television will move towards a massive sofa. Take a seat, and a series of levers move a gloved hand to select the button of your choice, which will play a short video clip. Very silly, very Wallace & Gromit. Other fun things include a slide down the plughole from the bathroom to the garden, where you’ll be able to take part in a modelling clay activity.

It’s not just Wallace & Gromit’s inventions on display though. The Science Museum have dug through their extensive catalogue to find examples of weird and wonderful inventions from the real world. Displays range from an early electric kettle to 1960s food packaging. You can also track the development of inventions like the telephone, from Alexander Graham Bell’s original to the latest shape-shifting Nokia prototype – unfortunately a model, and not the real thing just yet!

If old inventions aren’t your thing, there’s still a lot on show for Wallace & Gromit fans. Sets from the films are lovingly displayed, and simply walking through the house really feels like you’re taking part in one of their crazy adventures. It would be very easy to spend almost two hours taking in everything the exhibition has to offer.

I have just one very minor criticism, of an ideological nature. A message throughout the exhibit is the importance of protecting your intellectual property by registering inventions with the Intellectual Property Office, and I have no qualms about that. Up in the bathroom, in a display all about music, was a poster that left me feeling rather differently.

Nestled in a corner, away from the karaoke disco in the shower and a charming vinyl jukebox, it said that the music industry is the only way for artists to “avoid losing out to copycats” and “benefit from hitting all the right notes”. In other words, sign a record deal or go broke. In a world where internet exposure and digital distribution are making the music industry increasingly irrelevant, it struck me as nothing more than an out-of-place attempt at propaganda. I’m sure though that kids will just run past without a second glance as they head for something fun to do, so perhaps it doesn’t matter.

My woolly liberal sensibilities aside, Wallace & Gromit present A World of Cracking Ideas is well worth a visit. You might not learn anything as such, but you’ll be too busy having fun with all the crazy contraptions to care. The exhibition will run until 1st November 2009, and the usual fees apply: Adults £9, Concessions £7, with extra deals for families. Cracking good time, Gromit.

Comments Off

1 Comment » Posted on Thursday 4 June 2009 at 12:46 am by Jacob Aron
In Happenings

With the European Parliament elections tomorrow (or rather later today, as I’m a little late in posting this) I had planned to take a look at where all the major parties fielding candidates stand on science. Fellow bloggers Frank Swain of SciencePunk and Martin Robbins of Lay Scientist have gone one better though, and submitted nine questions to UKIP, Labour, the Conservatives, the Liberal Democrats, and the Green Party. They also have an editorial in the Guardian. You can check out their sites for the full details, but I thought I’d pick out some interesting points.

The questions cover a range of topics, from the obvious (climate change) to the more politically niche (open access). The three main parties gave predictable, fairly non-committal answers to many questions, but also failed completely to answer some. Perhaps more interesting were the responses from the smaller Green and UKIP parties. Polls suggest that many people are thinking of casting their lot in with these parties as a protest vote, but both parties’ approaches to science are quite worrying.

UKIP dismissed the importance of action on climate change, saying that we already do enough in terms of GDP spending. The Greens are of course very environmentally friendly, but would seek an EU-wide ban on embryonic stem cell research whilst also supporting alternative medicine like homoeopathy. Bizarrely, they also want to make zoos illegal.

Clearly, science is just one of many issues that you should consider at the ballot box tomorrow, and with expenses claims and the future of the European Union at the front of most voters’ minds it would be easy to ignore science altogether. If you are considering a protest vote with one of the smaller parties though, I would urge caution and suggest reading the manifesto small print. If you care about science, you might regret your vote.

Comments Off Posted on Wednesday 3 June 2009 at 10:44 am by Jessica Bland
In Happenings, Science Policy

Over the last two days, London’s Royal Society hosted a discussion meeting on new frontiers in science diplomacy. Participants represented everything from big science in the Middle East through to the Japanese diplomatic corps. And they each brought different ideas and suggestions for the interaction between international politics and science.

My original post was a run-down of the meeting’s uncomfortable moments – the points where even an outsider could sense the tension between different points of view. But worries about confidentiality moved me towards a more thematic discussion. SciDev.Net’s editor was blogging from the conference, with full permission from the speakers. And so, for a more detailed account of what went on, check it out here.

The final SciDev post outlines three messages that came across over both days. I want to pick up on the second of them: that ‘science diplomacy’ is an unhelpful umbrella term for several activities that need to be separated.

The most helpful codification of these activities came early on Monday from the director of the International Science Cooperation Division of Japan’s Ministry of Foreign Affairs. Jun Yangi divided science diplomacy into four dimensions. First, there is science used for diplomatic purposes; second, there is diplomacy for science and technology; third, there can be diplomacy based on science; and finally, there is science and technology as a source of soft, attractive power.

Other speakers would have done well to pick up on these distinctions more explicitly. In many cases the content of a talk was not contentious, but the implied definition of science diplomacy was not one that even the next person on the platform would have agreed with.

The introductory speeches provided a marked example of this. The UK’s Chief Scientific Advisor, John Beddington, pointed to the danger of science used for political ends – the first of the four Japanese dimensions.

Immediately following him was Nina Fedoroff, Science and Technology Advisor to the US Secretary of State. She started her address by distinguishing science diplomacy from the use of science in diplomacy. It seemed like she was drawing a similar line to Beddington between the first and the latter three dimensions. Except that her examples of science diplomacy were not really in the same vein. Building an Iraqi virtual science library to replace the books destroyed during the war has as much political as scientific colour.

Fedoroff advocated the incorporation of more scientists into the heart of government, whilst Beddington favoured a depoliticization of scientists and scientific discussion. A tighter definition of science diplomacy from the start might have forced them into a head-to-head discussion of this tension.

This definitional problem appeared on the second day as well. One particularly obvious instance was when the British Council representative distinguished science diplomacy from international science relations. Fifteen minutes later, his colleague, Professor Mohamed Hassan of the Academy of Sciences for the Developing World, defined science diplomacy as exactly those collaborative relations the Council member had distinguished it from.

Perhaps the definitional difference here was not a problem. Both contributors wanted to discuss relations; one distinguished them from diplomacy, the other did not.

It was, however, symptomatic of the same issue that divided Beddington and Fedoroff: the depth to which scientists should penetrate the political sphere. If Professor Hassan believes collaboration is diplomacy, then collaboration marks the implied limit of scientists’ diplomatic role. He is more cautious – more like Beddington. However, if the British Council want a separate category for science diplomacy, one that is closer to traditional diplomacy, then they are allowing scientists right into the centre of politics and offering a position closer to Fedoroff’s.

Defining science diplomacy is not just an academic debate. Different definitions map onto different national attitudes to scientists’ position in government and in politically sensitive international research. It might have been more diplomatic to sidestep the issue of an explicit definition in this conference. But a definition might be necessary in order to avoid creating rather than helping diplomatic issues in the future.

Comments Off

Comments Off Posted on Tuesday 2 June 2009 at 9:57 pm by Jacob Aron
In Happenings

Earlier today Colin and Sam joined me in broadcasting another episode of Mission Impossible, the official sci-comm radio show on IC Radio. If you missed it, you can listen again here. This week, Colin and I were presenting. I was a little nervous as it was my first time presenting, but I think it went ok. In the episode we have:

  • Run-down of the latest science news
  • Discussion with our studio guest of the week, Dr Tara La Force
  • Interview with Professor Mike Hulme about his latest book, Why We Disagree About Climate Change
  • Everyone’s favourite Call My Scientific Bluff
  • Interview with some of Imperial’s engineering students about their entry into the Isle of Man motorbike race
  • A roundup of the latest Web2.0 news
  • And an interview with astrobiologist Dr Lewis Dartnell

Enjoy the show!

Comments Off

Comments Off Posted on Monday 1 June 2009 at 1:39 pm by Jacob Aron
In Inventions & Technology, Psychology

I’ve been using Twitter for a while now, and I’m still not entirely sure what the point of it is and what you can do with it. Well, starting tomorrow we can add one more use for Twitter to the list: science.

Psychologist Richard Wiseman, in conjunction with New Scientist, plans to conduct the first scientific experiment on Twitter. Wiseman is a professor at the University of Hertfordshire, and specialises in studying possible psychic abilities. He plans to harness the power of the Twitter crowd to investigate remote viewing, the supposed ability of psychics to identify distant locations.

Every day this week at 3pm, he will travel to a randomly selected location and then ask everyone on Twitter to give him their thoughts on where he is. Thirty minutes later, he will follow up with another Tweet showing five photos – one of the real location, and four decoys. Participants will then vote for the photo they believe to be genuine.

Today’s experiment, taking place in just over an hour, is just a test run to make sure the system works, so Wiseman will only be counting data from the rest of the week. If you’d like to participate, simply follow him on Twitter. A word of warning: he has a rather unusual background image on his Twitter page!

But is it actually worth your while to do so? Is Twitter a suitable tool for this type of experiment? It all comes down to statistics. For each location, participants have a one in five chance of choosing the correct photo simply by random chance. If some form of psychic power truly does exist, we would expect a higher proportion of correct guesses.

The odds are stacked against you with each subsequent test, however. Whilst you may have a one in five chance of getting one location right, only one in 625 participants will correctly guess all four. Wiseman hopes that 10,000 people will answer his Twitter call. A quick calculation tells us that we can expect just 16 Twitterers to score four out of four by luck alone. If the results are significantly higher than this, it suggests something odd is going on – though not necessarily psychic powers!
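To make that arithmetic explicit, here is a quick back-of-the-envelope check in Python. The 10,000 figure is Wiseman’s hoped-for turnout; everything else is just the probability of guessing four five-way choices by luck.

```python
# Quick check of the numbers: four locations, five photos each,
# so a pure guesser gets all four right with probability (1/5)**4 = 1/625.
n_locations = 4
n_photos = 5
participants = 10_000  # Wiseman's hoped-for turnout

one_in = n_photos ** n_locations            # 625
expected_perfect = participants / one_in    # ~16 lucky perfect scores

print(f"Chance of guessing all four by luck: 1 in {one_in}")
print(f"Expected perfect scores among {participants} guessers: {expected_perfect:.0f}")
```

Anything well above that baseline of roughly 16 perfect scores would be the interesting result – though, as above, a surplus points to something odd rather than straight to psychic powers.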

So Twitter really is a good way of conducting this experiment. For very little cost, Wiseman can find the large numbers of people he needs to make his study work. There are problems – what if people simply re-Tweet what they view as a likely guess? – so I’m not sure we’re going to get any amazing results out of this trial, but I look forward to taking part.

Comments Off