The original iceman

Anthropology / Ecology / History

Ever wondered what was happening 5,000 years ago? Surprisingly, there’s still someone around from that time today. A 5,000-year-old person? Yep, it’s true. In fact, he’s actually about 5,300 years old. Sure, he might have stopped breathing a long time ago, but he’s looking remarkably good for his age. And whilst he no longer has the power of speech, there’s still an awful lot he can tell us about his life all those years ago. Meet Ötzi, the 5,300-year-old iceman.

Share the love: this post was written by science communication student Abraham Jones.

Surviving the test of time

Discovered on the border of Italy and Austria in 1991, Ötzi was named after the Ötztal Alps in which he was found. Europe’s oldest known natural human mummy, Ötzi owes his remarkable preservation to the location in which he died. Luckily for us (although not for him), his final resting place was a shallow and sheltered gully, high up in the Alps.

Ötzi, the original iceman (or at least a reconstruction, on display at the South Tyrol Museum of Archaeology in Italy). Is it just me, or does he look a bit like Harvey Keitel? Image credit: OetziTheIceman [CC BY-NC-ND 2.0] via Flickr.

Immediately after he died, Ötzi’s body was covered in a layer of snow, which later froze to form ice. This process of preservation was so effective that even Ötzi’s clothes survived, including his goat-hide coat, leather loincloth, and shoes made from deer hide and insulating hay. The ice acted as a protective layer from potential scavengers, while also stopping his body from rotting away. As a result, when some German tourists were descending from the same mountain many years later, they assumed they’d simply stumbled on the body of an unlucky hiker. When recovery crews came to identify and free the body, they soon realised that Ötzi was no ordinary corpse. Subsequent analysis of his body showed his death might not have been so accidental either…

A 5,300-year-old murder

The cause of Ötzi’s death has long been discussed and debated. Initially, it was thought he had simply died from exposure after being caught in a snowstorm. But further investigation revealed a much grislier end for our favourite mummified man. X-rays showed a flint arrowhead lodged in his left shoulder, indicating Ötzi had been shot. Whilst the arrow missed all vital organs (it ended up just two centimetres from his lungs), it did sever a major blood vessel in his arm, which would have caused massive bleeding. This kind of blood loss would have resulted in death in just a few short minutes.

But it appears he might not have even lasted that long, with further investigations revealing a fracture to his skull and a major brain bleed, consistent with either a fall or a blow to the head. A deep unhealed cut to the hand gives further evidence for a violent altercation just prior to his death. Was Ötzi involved in a fierce tribal conflict, or perhaps a personal argument that turned violent? Was he followed up the mountains by a bitter enemy hell-bent on revenge? Unfortunately, not even the best science can answer these questions, but there is still much that we can learn from Ötzi.

Trust your gut

In an attempt to learn more about the hours and days before Ötzi’s death, scientists performed a biopsy and examined the contents of his gut. As well as learning what he had for his last meal (some basic bread, plant leaves and animal meat), scientists could trace Ötzi’s movements based on pollen samples found in his stomach. Pollen from a particular tree only found in the warm conditions of a nearby valley indicates Ötzi’s presence there just 12 hours before his death. So what prompted Ötzi to leave the comfort of the valley for the cold and treacherous conditions of the Alps? While he was found carrying some tools and weapons, he had practically no food or water, and seemed poorly prepared for a dangerous trek up the mountain.

It seems that the more we learn, the more questions are thrown up: did Ötzi’s copper axe indicate he was a warrior, or was it merely a symbol of status? Did his bow and arrows mean he was a hunter, or a warrior? Or perhaps the herbs he carried meant he was a shaman? We can only guess at the true answers to many of these questions, but for someone who died over 5,000 years ago, Ötzi sure has taught us a lot.

Links and stuff

A healthy dilemma: fresh or frozen?

Biology / Health / Myths

In the modern world of refrigerators and freezers we are able to keep our food lasting longer than ever before. But is there a nutritional difference between fresh and frozen produce? Which should you be eating?

Share the love: this post was written by science communication student Brendan Ma.

Is fresh always best? Or does frozen food have ‘goodness’ too? Image credit: leibolmai [CC BY-NC-ND 2.0] via Flickr.

How fresh is ‘fresh’?

Your immediate response may be to shout “of course fresh is better, it’s fresh!”, and you may be right, but it depends on the circumstances. In most cases, the produce you see on supermarket shelves has been harvested under-ripe to avoid damage during transit to the store. This means the produce has not had time to reach its ‘peak nutrition’ before being harvested.

The moment the produce is picked, its nutritional content immediately begins to deteriorate. The act of picking a piece of fruit from its stem, or pulling a vegetable out of the ground essentially removes it from its nutrient source. As a living organism, the produce requires a source of energy to sustain life, and it finds these calories in its own stored nutrients.

After the produce is harvested, it is loaded onto a truck, boat or plane to travel for days (or even weeks) to the shops. It then sits on the shelves at the store for some time before you decide to buy it and take it home. Once it reaches your home, chances are it will sit in your refrigerator for a few more days before you eat it. Over this time, the food loses large amounts of its nutritional value.

Frozen alternatives

Frozen foods, on the other hand, are harvested at peak nutrition and snap frozen to −18 °C. This process of rapidly freezing produce locks in many of the vitamins and nutrients. While the harsh freezing may degrade some vitamins, the fact that the produce is harvested and frozen at peak nutrition means frozen produce often contains more nutrients than its ‘fresh’ counterpart. This difference grows further when the fresh produce is not locally in season and needs to be imported from interstate or overseas.

A series of studies has compared the nutritional content of frozen and fresh supermarket produce. Frozen broccoli had consistently higher levels of vitamins C and A. Frozen ‘super foods’ such as blueberries were much higher in anti-cancer nutrients and vitamins. A group of scientists from the Centre for Food Innovation at Sheffield Hallam University found that green beans lost 77% of their vitamin C within seven days of being harvested, while frozen brussels sprouts scored higher on all nutrient measurements than their fresh counterparts.
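
As a back-of-the-envelope illustration, we can turn that green bean figure into a daily loss rate. Assuming the vitamin C loss is roughly exponential (my simplifying assumption; the study only reports the seven-day figure), a few lines of Python give the idea:

```python
# Green beans reportedly lose 77% of their vitamin C within 7 days of harvest.
# Assuming the decay is roughly exponential (a simplification, not a claim
# from the study), estimate the equivalent daily loss.
retained_after_7_days = 1 - 0.77              # 23% of the vitamin C remains
daily_retention = retained_after_7_days ** (1 / 7)

print(f"retained per day: {daily_retention:.0%}")          # ~81%
print(f"lost per day: {1 - daily_retention:.0%}")          # ~19%
print(f"after 10 days: {daily_retention ** 10:.0%} left")  # ~12%
```

On that model, beans that spend a week and a half in transit and in your fridge keep only around a tenth of their original vitamin C.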

Of course, the rates at which nutrients degrade are not equal. Some vitamins last longer and do not degrade as rapidly as others. Water-soluble vitamins such as B and C remain high in frozen produce but are rapidly lost from fresh produce. Fat-soluble vitamins (A and D), on the other hand, are much hardier and more stable, and studies have shown no distinct difference between fresh and frozen.

Of course, if you pick a fresh vegetable from your garden or a local farmers’ market and consume it on the day of purchase, nothing can compare to the nutrition (or the taste), but unless you’re willing to shop daily, frozen produce may be a great nutritional alternative. In fact, it might be time to throw out the preconceived notion that ‘fresh’ is always best, and consider some frozen alternatives.

Links and stuff

Have a triple shot: coffee even better than it’s cracked up to be

Health / Medicine / Psychology

Every second, 26,000 cups of coffee are drunk around the world. That’s more than two billion cups of coffee consumed every day. Yet most of us think of coffee as unhealthy. If coffee is your guilty pleasure you’ll be pleased to hear you can forget the guilt: coffee is good for you.
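
Those opening numbers are easy to sanity-check. Here’s a quick back-of-the-envelope calculation in Python (the 26,000-cups-a-second figure is from the text; the rest is just unit conversion):

```python
# 26,000 cups every second, converted to a daily total
cups_per_second = 26_000
seconds_per_day = 24 * 60 * 60        # 86,400 seconds in a day

cups_per_day = cups_per_second * seconds_per_day
print(f"{cups_per_day:,} cups per day")   # 2,246,400,000: comfortably over two billion
```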

We’ve known for a long time that caffeine is a very effective stimulant: it can wake you up, keep you alert and help you concentrate. Of course, we also know coffee is addictive, and it’s not surprising we consider addictions to be bad for us. Caffeine is, after all, the world’s most popular drug.

But coffee is more than just caffeine. It’s actually chock-a-block full of different compounds, some of them with important health benefits. Some act as antioxidants, others reduce inflammation, and still others regulate insulin (the hormone involved in diabetes).

The life cycle of the humble coffee bean. But could coffee also extend your life? Image credit: Thomas [CC BY 2.0] via Flickr.

Are you sure coffee isn’t bad for me?

The first thing you’ll be pleased to hear: research published in 2008 found no link between drinking coffee and an increased risk of dying. This study followed about 130,000 people in their 40s and 50s for around 20 years. These volunteers were part of the Nurses’ Health Study (all female) and Health Professionals Follow-up Study (all male).

The researchers collected detailed health information about the volunteers, including their diet and coffee-drinking habits. At the same time they kept records of who died during the study. Even people drinking six cups of coffee a day were not at higher risk of death.

Could it really be good for me?

It gets even better. Not only will coffee not kill you, it may even protect you from a whole heap of nasty illnesses.

For example, research indicates coffee consumption reduces the risk of lethal prostate cancer in men. Drinking one to three cups of coffee a day (either normal or decaf), was linked to a 30% decreased risk of this cancer. Coffee also appears to reduce the risk of liver cancer by up to 40%.

Coffee drinking also reduces the risk of type 2 diabetes. Not only that, but within limits, the more coffee you drink, the lower your risk. Three to four cups of coffee a day was associated with a 25% reduced risk of developing type 2 diabetes, compared with people who drank fewer cups each day. Another study found that each additional cup of coffee reduced the risk by a further 7 or 8%. And in type 2 diabetes sufferers, drinking coffee reduced the risk of dying during a 20-year period by 30%.
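
Curious how the ‘each additional cup’ figure squares with the ‘three to four cups’ figure? Here’s a rough sketch in Python. Note that the assumption that per-cup reductions compound multiplicatively is mine, not something the studies state:

```python
# Hypothetical illustration: if each extra daily cup cuts type 2 diabetes
# risk by ~7% relative to the previous level (the compounding assumption
# is mine, not the studies'), the combined effect looks like this.
per_cup_reduction = 0.07

for cups in range(1, 5):
    relative_risk = (1 - per_cup_reduction) ** cups
    print(f"{cups} cup(s): relative risk {relative_risk:.2f} "
          f"({1 - relative_risk:.0%} lower)")

# Four cups gives a relative risk of ~0.75, i.e. ~25% lower, which sits
# nicely alongside the 'three to four cups, 25% reduced risk' finding.
```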

What about for my brain?

A study published in 2012 followed 124 individuals aged 65–88 years. All were showing the first signs of the forgetfulness that commonly leads to Alzheimer’s disease. The researchers measured levels of caffeine in their blood and assessed their brain function over two to four years. Not only was coffee drinking not associated with decreased brain function, but people with little or no caffeine in their bloodstream were much more likely to have progressed to Alzheimer’s than people who had three cups’ worth of caffeine in their system.

Similarly, coffee drinking improves our long-term memory. Researchers found caffeine significantly reduced rates of forgetting over a 24-hour period. And moderate caffeine intake reduces your risk of getting Parkinson’s disease by somewhere between 30 and 60%.

Why the bad rap?

Despite all of this, we all know coffee has a bad name. This is likely to be because of other behaviours that go hand-in-hand with coffee drinking. People who drink a lot of coffee often smoke, may not exercise very much and tend to have a more unhealthy diet in general. As a result, a lot of early studies that concluded coffee was bad may have been reporting on confounded results. This means it was the smoking, lack of exercise or diet that was to blame for poor health, not the coffee drinking.

Can you drink too much coffee?

But of course too much coffee may still not be a great idea.

If you’re drinking so much coffee that you get tremors, have sleeping problems, or feel stressed and uncomfortable, then obviously you’re drinking too much coffee. Dr Rob van Dam, Harvard School of Public Health

And if you’re pregnant, it’s probably still wise to avoid or limit your coffee intake. We don’t have a complete picture of the effects of coffee on the foetus, but we do know that caffeine can cross the placenta and that a foetus is not very good at breaking the caffeine down.

Unfortunately, all of this is terrible news for yours truly: I’m one of a small group of people actually allergic to caffeine. But for most of my fellow Melburnians: drink up, now you’ve got a whole lot more reason to enjoy your latte.

Links and stuff

Nature’s beautiful deathtraps

Biology / Botany / Ecology / Evolution

We’ve all heard of the Venus flytrap, but the plant kingdom contains many more wonderful examples of meat-eating species. Some plants have been known to feed on insects, frogs, and even birds and small mammals. But if plants are so good at getting energy from the sun, why would these eating habits ever have evolved in the first place?

Share the love: this post was written by science communication student Lachlan Stoney.

The Venus flytrap, Dionaea muscipula, is perhaps the best known of the 600-odd species of carnivorous plants. Image credit: Tristan Gillingwater via Wikimedia Commons

Meat eaters are relatively rare in the plant world, yet there is a stunning variety of carnivorous plants inhabiting almost all of the world’s continents. Out of the 300,000 or so species of flowering plants, we know of about 630 truly carnivorous ones.

All carnivorous plants rely on some kind of clever trap. Unsuspecting animals (usually insects) are lured with the promise of a nutritious reward like nectar. In some cases the animal will slip into a pool of digestive enzymes and drown; in others the animal will trigger the release of a special trapping structure on the plant (think Venus flytrap). Some species, such as the sundews, glue their prey to the spot, only to wrap them up and slowly digest them.

This fly’s fate has been sealed by the sticky mucilage of the Cape sundew, Drosera capensis. Image credit: Rosťa Kracík via Wikimedia Commons.

In addition to nectar, some species attract their prey with fluorescent rings on their leaves that are invisible to the human eye. These rings shine with ultraviolet light, which insects are highly sensitive to.

The incredible adaptations of carnivorous plants suggest a long history of evolutionary pressure. But if plants can get their energy from the sun through photosynthesis, why would the need to feed on animals ever have arisen?

Pitcher perfect

To understand why carnivorous tendencies might have evolved in plants, let’s look at one of the most famous groups of carnivorous plants: the pitcher plants. They are recognised by their jug-shaped body, the “pitcher”. With an ultra-slippery rim, the pitcher traps and drowns animals in a soup of digestive enzymes. The largest known species is probably Nepenthes rajah, which can have pitchers up to 20 cm wide. In addition to its usual diet of insects, N. rajah has been known to drown lizards, birds, and rats.

The world’s largest meat-eating plant, the pitcher plant Nepenthes rajah, has been known to consume lizards, birds and rats in its native Malaysian Borneo. Image credit: Jeremiah Harris via Wikimedia Commons.

Pitcher plants actually belong to many different plant families, rather than being from the same branch of the evolutionary tree. The pitcher body plan has evolved independently in different plant lineages from around the world — a good example of convergent evolution.

But what do all pitcher plants have in common, besides their appearance? It turns out that they all live exclusively in boggy environments that are acidic and poor in nutrients. The consumption of animals appears to be a way of supplementing a diet that would otherwise be poor in nutrients such as nitrogen and phosphorus.

This turns out to be true for all carnivorous plants. In addition to water and sunlight, all plants require a regular source of nutrients. This usually comes from the soil, but carnivorous plants have simply found a way to get around that.

Carving a niche

You might be thinking “why don’t they just live in better soil and get their nutrients that way, like normal plants do?” The answer is that by adapting to a harsh environment other plants can’t tolerate, carnivorous plants face much less competition. They have created a niche. This is a common theme in evolution; it explains, for example, why mangroves live in salty water, and even why the first land-dwelling animals left the oceans all those millions of years ago.

Nepenthes extincta?

As we continue to learn from carnivorous plants and even identify new varieties, isn’t it a shame that some species might never be studied and appreciated before they disappear for good? Of the 12 new species of Nepenthes discovered last year, all native to the Philippines, some are feared to be extinct already due to habitat loss.

Links and stuff

Can a night owl become an early bird?

Biology / Genetics / Health / Psychology

A couple of weeks ago I asked students in one of my classes whether they were early risers or night owls. Almost all identified as one or the other. But is either being up at dawn or burning the midnight oil simply habit, or is something else going on?

Humans are no different to most of the other creatures on earth. We have an internal body clock, which determines a 24-hour rhythm to our activity. This clock can be found in the hypothalamus, at the base of the brain. Across the animal kingdom, the majority of species are either nocturnal (active at night) or diurnal (active during the day).

I’m not nocturnal

I should know: I spent the best part of ten years following around a nocturnal species to better understand its social and mating behaviour.

Possibly not one of my better choices, given that I have always functioned best when I go to bed early and get up early. I always liked the Benjamin Franklin quote: “Early to bed and early to rise makes a man healthy, wealthy, and wise.”

It turns out that, although fundamentally diurnal, most humans also show distinct preferences for different times of the day and night. Your chronotype is your preferred time of sleeping and activity.

Ten percent of people qualify as true early birds, twenty percent as night owls, and everyone else falls in between. Morning people have brains that are most active at 9 am, whereas night people have brains primed for action at 9 pm. Research tells us that going to bed between 11 pm and midnight and waking between 7 and 8 am is the most common pattern among humans.

If you aren’t sure where you sit on the early-to-late continuum, you can take this test to find out.

But does it all come down to habit? Or do the brains and genes of early birds and night owls actually differ?

It’s in your genes

The simple answer is yes. A number of studies have identified genes that influence a person’s chronotype. The genes known as PER1, PER3 and ABCC9 all play a role in regulating our body clocks and vary predictably among people of different chronotypes.

And in your brain

Scans have also found true structural differences between the brains of early and late risers. In night owls, the quality of the white matter in the brain is compromised. White matter has the job of ensuring effective communication between nerve cells, and changes here have been linked to depression and other psychological problems.

Whether these structural differences in the brain are the cause — or result — of being a night owl, we don’t yet know.

Are you a night owl in a world made for early birds? And could you change if you wanted to? Image credit: Amir Jina via Flickr.

Does your chronotype matter?

On the one hand, no. If you are able to get enough sleep, feel alert when you need to, and are generally happy with your chronotype, there’s no problem.

But a large body of research highlights problems faced by night owls. The main issue is a potentially large mismatch for night owls between social and biological time. Although night owls may not feel tired until 2 am, they probably still need to be up at 7 am in order to get to work on time. This mismatch has been called social jetlag. Our lives are generally structured to suit morning people, not night people.

Studies have found that night owls who experience this conflict between internal and external time suffer more from mental distress and are more likely to smoke cigarettes and drink alcohol than early birds. Night owls also experience worse sleep and tend to score highly on personality tests looking at psychopathy and narcissism.

Can you change?

If you’re a night owl who wants to be an early bird, what are your chances of switching?

Probably very good, if you are willing to embrace some new habits, says sleep researcher Dr Simon Archer.

It is not all to do with your genetics. You can choose to follow a particular life pattern. You can override your genes. Dr Simon Archer

Camping might be the answer

Researchers took eight people camping in the wilderness of Colorado. Some were night owls, some early birds. No torches or electronic devices were allowed and within one week the circadian rhythms of all of the campers were synchronised and timed with sunrise and sunset. At the end of the week, all were happily rising at dawn. The key: taking away access to artificial light (think lights, mobile phones, tablets, TVs and computers) after the sun has set.

Regardless of your genetic predisposition, making changes to your habits could make all the difference.

But if you’re a night person and avoiding artificial light after sunset doesn’t appeal, another option is to change your schedule to better match your chronotype. There have been many calls for the work or school day to run from 11 am to 7 pm for the night owls among us.

Sound appealing? Now you know the science to convince your boss.

Links and stuff

This post was referenced on iflscience.com

The truth about sinister southpaws

Anthropology / Genetics / History / Psychology

Why call a valued assistant a ‘right-hand man’? Why does an awkward dancer have ‘two left feet’? And why, in times gone by, were left-handers thought to be possessed by the devil? Throughout the ages left-handers have been stigmatised and persecuted. But it turns out ‘handedness’ is determined before you even leave the womb.

Are you a southpaw?

Statistics tell me only about 10% of you reading this are left-handed. In contrast, chimpanzees show a 50:50 split in preferred hand. This 90:10 ratio is true in every human population in the world and is thought to have been true throughout human history.

Does this rarity explain the persecution? The very name tells us the negative perception of left-handers has a long history. ‘Left’ comes from the Old English ‘lyft’ meaning weak or broken and ‘sinister’ in Latin means both left and bad, or unlucky. In contrast the Latin for right is ‘dexter’, which gave us the word ‘dexterous’, meaning skillful.

Are more left-handers criminals?

Back in 1903, Cesare Lombroso, the father of modern criminology, announced left-handers were more than three times as common in criminal populations as in the rest of the population.

Left-handedness… may contribute to form one of the worst characters among the human species. Cesare Lombroso

In 1977, psychologist Theodore Blau claimed left-handed children were overrepresented among academically and behaviourally challenged children.

Another psychologist, Stanley Coren, argued left-handers die younger than their right-handed counterparts. On closer inspection, it turned out simply that the older a person was, the more likely they had been forced to override their natural left-handed preference and learn to write with their right hand, making left-handers appear scarce among the elderly. We know today only 1% of Chinese students write with their left hand, while the proportion of natural left-handers in any population is about 10%.

The truth about left-handers

We know people with psychotic disorders like schizophrenia are more likely to be left-handed. We also know if you are left-handed, you are twice as likely to be male. And we find a higher proportion of left-handers among architecture, art and music students than among science students.

There are also many famous left-handers: Plato, Charles Darwin, Bill Gates, Oprah Winfrey, Jimi Hendrix, Picasso, Leonardo da Vinci and Ned Flanders to name but a few. And five out of the last seven US presidents have been left-handed. People argue that left-handers are more introverted, creative and intelligent.

Five of the past seven US presidents, including Barack Obama, have been left-handed. Image credit: Pete Souza (public domain) via Wikimedia Commons.

Where does hand preference come from?

Research published last year showed hand preference begins in the womb and scientists have identified the network of genes responsible. These genes establish left-right differences in the brain, which in turn influences which hand is favoured while an embryo develops. These are also the genes that ensure our internal organs are correctly positioned in our bodies – for example, our heart and stomach on the left and our liver on the right.

So left-handedness isn’t a character flaw, but simply an outcome of genetic variation. And in fact, being left-handed may deliver some important advantages.

The right brain knows what the left hand is doing

Information collected on the right side of the body (for example the right ear and eye) goes to the left hemisphere of the brain for processing. At the same time, information from the left side goes to the right brain. Right-handers tend to do most of their processing of language and speech in the left side of the brain while the right side of the brain is responsible for emotions.

But left-handers don’t tend to divide these tasks so clearly between the two hemispheres. Left-handers are more likely to process language using both sides of the brain. And tests designed to look at the speed of information flow between the two halves of the brain have found that in left-handers, the two halves of the brain are better connected.

This greater connectivity means left-handers are at an advantage when it comes to processing and responding to lots of information arriving simultaneously or in quick succession (think being a pilot, or perhaps a gamer!).

Left-handers are also better at divergent thinking than right-handers. This may be because in the process of more information passing between the two brain hemispheres, there is more opportunity for novel ideas.

So rather than being sinister or unlucky, it seems left-handers may indeed be more creative. Which raises the question – why hasn’t left-handedness evolved to be more common?

Links and stuff

To trust or not to trust, all in the blink of an eye

Biology / Evolution / Psychology

We’ve known for a while that having certain facial features will lead people to judge you as either trustworthy or untrustworthy. And in fact our brains make those complex judgments much faster than we could have ever imagined. Now scientists have worked out how we do it.

It’s all about physiognomy

Back in 1772, Swiss poet Johann Kaspar Lavater popularised an ancient Greek practice in his Essays on Physiognomy. Put simply, physiognomy is the assessment of a person’s character from their face or other external features. For example, Lavater suggested “the nearer the eyebrows are to the eyes, the more earnest, deep and firm the character”.

Sound completely ridiculous? Maybe, but over the last few decades, research has shown we absolutely do judge people according to how they look. Scientists have even modelled the specific physical characteristics responsible for our first impressions.

Would you trust this face?

We all agree on who looks trustworthy

When it comes to trustworthiness, faces with high cheekbones, high inner eyebrows and smiles are consistently ranked as the most trustworthy.

In a study published earlier this month, researchers presented ten volunteers with 300 computer-generated faces. The ‘trustworthiness cues’ of the faces had been manipulated while keeping other facial features constant.

Exactly as predicted, there was remarkable agreement among the volunteers as to which of these faces were trustworthy. This was also true for non-manipulated pictures of strangers’ faces.

But can your brain judge the trustworthiness of a face you aren’t even aware you’ve seen? Yes, indeed it can.

We know who we trust even if we don’t ‘see’ their face

The researchers then showed a new group of volunteers the trustworthy and untrustworthy images for a split second, a mere fleeting glimpse. The pictures were flashed up and removed again so quickly that although people’s eyes were able to see the images, their brains weren’t able to actually register they had seen them.

To be absolutely sure the volunteers hadn’t consciously perceived the faces, the researchers also used a nifty technique called ‘backward masking’.

Backward masking means showing an irrelevant image so quickly after the picture of interest (in this case a face) that the study volunteers were prevented from consciously seeing the faces. This technique is known to prevent the brain from processing an image the eyes have seen.

What’s happening in the brain?

Of course the researchers wanted to know what was happening in these people’s brains. So at the same time as all this was going on, the brain activity of the volunteers was closely monitored and recorded.

The small almond-shaped region of the brain known as the amygdala was the area of interest. The amygdala is responsible for decision-making, memory and certain emotions. We know the amygdala is also in charge of sending out signals to enact the famous fight or flight response.

And what did the brain scans show? That even though the study participants hadn’t consciously seen the faces, the amygdala had ‘decided’ which faces were trustworthy.

Different parts of the amygdala ‘lit up’ in the scans in response to the trustworthy and untrustworthy faces. This shows that our brains are able to work out if someone is trustworthy even if we aren’t actually conscious we’ve seen their face.

So even if we believe we haven’t seen anything, our brains clearly show that at an unconscious level, we have! It’s hard to get your head around (excuse the pun).

Less than the blink of an eye

What’s more, it took just 33 milliseconds (a tenth of the time it takes to blink) to make this judgment. The fact our brains can judge something as potentially subtle and complex as trustworthiness without any conscious thought is pretty extraordinary in my book.

Our findings suggest that the brain automatically responds to a face’s trustworthiness before it is even consciously perceived. Jonathan Freeman, New York University

This research suggests we have evolved to be instantly tuned in to people we perceive as untrustworthy. Perhaps our brains have evolved to make these judgments fast enough to allow us to respond appropriately – and either approach someone or hightail it out of there. Probably a very useful skill in days gone by when tribal conflicts were commonplace.

In fact, other research has shown we are incredibly fast at making judgments about other traits too: attractiveness, likeability, competence and aggressiveness.

What remains to be decided is whether there is any link between how trustworthy a person looks and how trustworthy he or she actually is. That might be harder to test.

Links and stuff

Long pig, anyone?

Anthropology / Biology / History

This post isn’t for the squeamish. Whether we are talking about Hannibal Lecter, Sweeney Todd, the witch in Hansel and Gretel, or the many accounts from early human history, cannibalism makes us uncomfortable. And why wouldn’t it? Who wants to think about people eating other people or worse, being on the menu yourself?

Whether you are fascinated or repulsed by the topic, you won’t be surprised to hear there are researchers trying to understand the when, where and why of human cannibalism.

Cannibalism in the animal kingdom

We know more than 1,300 species of animal eat other individuals of the same species and this is likely to be a gross underestimate. A variety of spiders, insects, fish, birds and even mammals eat their own kind.

I’ll never forget the day, early in my zoological career, when we were out at dawn checking traps that we had set for antechinus (native Australian mouse-sized carnivores). One particular trap felt heavy in the hand and, upon peering inside, we expected to see two antechinuses. It was the mating season after all.

Instead, we found precisely one-and-a-half antechinuses.

Despite our tendency to view bonobos, our closest living relatives, as peace-loving, you don’t have to search far to find evidence of cannibalism. Have a look on YouTube if your stomach is up for it.

Perhaps equally disturbing, there is a particular kind of cannibalism common in some insects and spiders called matriphagy, when babies eat their mothers. Now that’s taking a mother’s love to the extreme!

Cannibalism in ancient humans

Cannibalism is believed to have occurred as long ago as 800,000 years in Homo antecessor, the earliest known human ancestor in Europe. The evidence comes from remains found in a cave in Spain.

Researchers found the remains of at least 11 humans mixed up with those of wild sheep, deer, bears, wolves and bison. The bones of both animals and early humans bear the signature marks of stone tools, which were used to prepare meals. Fossils from another Spanish cave also suggest cannibalism in Neanderthals, our closest extinct human relative.

How do we know that a person died as a result of cannibalism? It’s a bit gruesome, but there are a few clear signatures of cannibalism, such as the base of the skull being missing in an otherwise intact skeleton (in order to get at the brain).

And in not-so-ancient humans

Human cannibalism made big news in 2013 when researchers showed that English colonists had resorted to cannibalism in Jamestown in the Colony of Virginia during the deadly winter of 1609–1610, known as the ‘starving time’. Eighty percent of the colonists died during that winter.

We know cannibalism has been around a long time. But why do people eat people? Image: Cannibalism 1571 (detail). Public domain via Wikimedia Commons.

Why cannibalism?

Despite its current taboo status, evidence suggests that cannibalism has been a common practice in human history. But why did early humans resort to eating their own kind? There are thought to be examples of all of the following:

  • Gastronomic or dietary cannibalism (to supplement nutrition)
  • Starvation cannibalism (as a result of nutritional necessity)
  • Aggression cannibalism (fighting and hunting enemies)
  • Spiritual cannibalism (eating the dead as part of funeral rites)
  • Medicinal cannibalism (to tackle health concerns)
  • Psychotic cannibalism (a result of psychological imbalance)

James Cole, a lecturer at the University of Brighton in the UK, set out to explore whether early human cannibalism was more commonly due to ritual and social reasons (cultural cannibalism) or a vital source of nutrition (gastronomic cannibalism).

How nutritious are you?

As an important first step, Cole decided to work out how much nutrition is actually available in a human body. He developed a “nutritional template” for a human and, based on four men aged 35–60 years, argued that a whole cooked human cadaver would yield 81,472 calories. For comparison, a medium-sized apple contains about 80 calories.

Cole calculated that if every edible part is eaten, an arm yields 1,800 calories, a leg, 7,150 calories and a human heart, 722 calories. But half of these calories come from fat.

There are clearly nutrients and energy to be gained from eating another person, but that still doesn’t tell us whether nutrition was the primary motivation for the cannibalism. I should point out, for comparison, that a cow provides about 500,000 calories.
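
To put Cole’s figures in perspective, a little arithmetic helps. I’m assuming a round 2,000 calories as a daily adult requirement (a standard ballpark, not a number from Cole’s paper):

```python
# Rough comparisons built from Cole's published figures. The 2,000-calorie
# daily requirement is my assumed round number, not from the paper.
whole_body = 81_472    # calories in a whole cooked human cadaver
cow = 500_000          # calories in a cow, for comparison
apple = 80             # calories in a medium-sized apple
daily_need = 2_000     # assumed adult daily requirement

print(f"one body is roughly {whole_body / apple:,.0f} apples' worth of calories")
print(f"one body could feed one person for about {whole_body / daily_need:.0f} days")
print(f"a cow offers about {cow / whole_body:.0f} times the calories of a human")
```

Compared with a cow, then, a human is a fairly modest meal, which hints at why nutrition alone may not explain every case of cannibalism.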

Motives…

What can tell us more about the motives for cannibalism is how the human remains were treated. To explore this, researchers analyse and interpret the cut marks on bones.

The argument goes that when humans consumed other humans primarily for nutrition, the victims were treated just like any other prey. Hence the mix of bones in the Spanish cave, all with similar tool-inflicted marks. We would also expect the most nutritious parts of the body to be targeted.

In contrast, if the cannibalism is for cultural reasons, we might expect to see human remains treated very differently to those of other animals.

Based on this rationale, the simple answer is that Cole has found evidence for both nutritional and cultural cannibalism among early humans. He argues that in fact these categories shouldn’t be considered as mutually exclusive at all.

And as for the remaining burning question, do humans taste like chicken? Who knows. Not me! But perhaps ‘long pig’ provides the best clue:

Upon it once stood the temple and about it were enacted the rites of mystery, when the priests and elders fed on the long pig that speaks, when the drums beat till dawn and wild dances maddened the blood. Frederick O’Brien, White Shadows in the South Seas, 1919

Links and stuff

Every breath you take

Biology / Health

What is the longest a person can hold their breath for, and survive without obvious brain damage? No Googling… five, ten, maybe even 15 minutes? Take a deep breath… how about 22 minutes!?

That is the Guinness World Record held since 2012 by Stig Severinsen: “The man who doesn’t breathe”. Given what we know about the enormous risks of permanent damage if the brain is starved of oxygen for even short periods, how on earth can that be possible?

Try holding your breath now. How long can you last? The average person can hold their breath for about one minute. Although we naturally breathe about 12 times per minute, we can voluntarily hold our breath. But not to the point of going unconscious.

How long can you hold your breath? Image credit: Mohamed Iujaz Zuhair via Flickr.

Mammals have a very handy diving reflex

If you try holding your breath underwater, chances are you’ll find you can last longer. This is because of the ‘diving reflex’ — a physiological response to being submerged in cold water. As a result, our heart rates decrease by around 10% (in marine mammals like the sperm whale, the heart rate reduction can be up to 90% and many species can hold their breath for over an hour).

Blood vessels in the skin and limbs also constrict so that blood is directed away from the surface of the body and towards the brain and heart. Essentially the body shuts down any bits that aren’t necessary for survival in order to save energy for the parts that are!
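
To get a feel for what those percentages mean in beats per minute, here’s a toy calculation; the baseline heart rates are illustrative guesses on my part, not measured values:

```python
# Toy illustration of the diving reflex. The 10% and 90% reductions are
# from the text; the resting heart rates below are my assumed examples.
human_resting_bpm = 70    # a typical adult at rest (assumed)
whale_resting_bpm = 20    # a plausible sperm whale baseline (assumed)

print(f"Human while diving: about {human_resting_bpm * (1 - 0.10):.0f} bpm")
print(f"Sperm whale while diving: about {whale_resting_bpm * (1 - 0.90):.0f} bpm")
```

A heart ticking over at just a couple of beats per minute is a big part of how marine mammals stretch a single breath across an hour-long dive.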

But we still gasp for air after a minute or so

When we try to hold our breath, two things happen: we experience a shortage of oxygen and a build-up of carbon dioxide. But long before too little oxygen or too much carbon dioxide can hurt the brain, we gasp for air. Researchers call this moment of urgency the break point.

What triggers the break point? Recent research suggests nerve signals from the diaphragm may be in charge of telling our brain that taking a breath is overdue.

22 minutes, really?

So how on earth could Stig Severinsen hold his breath for 22 minutes? Well, it turns out there are different categories of breath holding.

In his famous Chinese Water Torture Cell act, which he first performed in 1912, Harry Houdini held his breath for more than three minutes.

For thousands of years pearl divers have retrieved pearls from the ocean floor with little to no technology to help them. There are reports of Japanese pearl divers lasting underwater for seven minutes on one breath.

Into the Big Blue

Do you remember the film The Big Blue about champion 20th century free divers Jacques Mayol and Enzo Maiorca? Free divers go as deep as they can on a single breath and often rely on the help of an air balloon to resurface (and sometimes a weighted sled to get down). One of the current records is 145 metres.

There is also a competitive discipline in free diving called “static apnea” – this is where a person holds his or her breath underwater, without moving, for as long as possible. The title is currently held by Frenchman Stéphane Mifsud after spending an extraordinary 11 minutes, 35 seconds below water on a single breath of air.

But to go beyond 11 minutes you need to do more than just take a really deep breath. Back in 1959 a physiologist called Hermann Rahn managed to hold his breath for nearly 14 minutes by slowing his metabolism (to reduce the body’s oxygen requirements), hyperventilating (to lower the levels of carbon dioxide in the blood) and filling his lungs with pure oxygen (via a gas cylinder).

Pure oxygen helps a lot

It is this act of inhaling pure oxygen (the air we breathe is only 21% oxygen) that makes all the difference. Stig Severinsen’s world record for “breath holding underwater” allows for the use of pure oxygen in preparation, whereas “static apnea” does not.
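
The difference pure oxygen makes is easy to estimate. Treating a lungful as a fixed volume and ignoring everything else going on in the body (a crude simplification on my part):

```python
# Crude estimate (my simplification): per identical lungful, how much more
# oxygen does breathing pure O2 provide compared with ordinary air?
air_o2_fraction = 0.21      # air is about 21% oxygen
pure_o2_fraction = 1.00     # a cylinder of pure oxygen

boost = pure_o2_fraction / air_o2_fraction
print(f"about {boost:.1f}x more oxygen per breath")   # ~4.8x
```

Nearly five times the oxygen per breath goes a long way towards explaining the gap between the 11-and-a-half-minute static apnea record and the 20-minute-plus marks set with pure oxygen.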

In 2008, illusionist and endurance artist David Blaine managed to hold his breath for 17 minutes by using a combination of these techniques while remaining completely still.

The first thing that I learned is when you’re holding your breath you should never move at all; that wastes energy. So I learned never to move. And I learned how to slow my heart rate down. I had to remain perfectly still and just relax and think that I wasn’t in my body. David Blaine

Don’t try this at home

Seriously, don’t! All of these practices are risky and extended breath-holding can result in brain damage and death. Nicholas Mevoli died in 2013 while attempting a new world record in free diving. And doctors have found abnormalities in free divers’ brains that suggest some form of brain damage is being caused by holding the breath for such long periods.

As for Stig Severinsen, he attributes his extraordinary ability to getting into the zone: “You have to get into a truly meditative state where you leave all your troubles behind”. It can’t hurt that he also has a lung capacity more than double the average man’s!

Links and stuff

A shocking lack of Zen

Biology / Health

When was the last time you were alone with your thoughts for more than a moment or two? Did you enjoy the peace and quiet? Or did you desperately seek distraction? New research suggests many of us will go to great lengths to avoid simply having to think to ourselves.

Can you entertain yourself?

It sounds like a very simple exercise. All you need to do is sit alone in an empty room for six to fifteen minutes. You can’t have your phone, or anything to read or write with. Other than that the only rules are that you have to stay awake and stay in your chair. Easy, right?

Psychologists from the University of Virginia and Harvard did this exercise multiple times, first with university students and later with a variety of people aged 18 to 77 years, and found essentially the same thing.

People don’t like doing nothing

More than half of the study participants rated the experience as somewhat or more than somewhat difficult. Nearly half admitted to not enjoying the experience much. They found the exercise far less enjoyable than reading magazines, doing crosswords or listening to music.

Results for two people had to be dropped from the study: in one case the researcher left a pen in the lab by mistake and the person used it to write a to-do list; another time an instruction sheet was left in the room and the study participant used it to practise origami. I would have expected paper planes!

How about at home?

The researchers allowed people to repeat the experiment at home and about a third admitted they cheated and either listened to music or used their phone.

The next step was to give people specific topics to think about, like planning a holiday. But even that didn’t help people to enjoy the experience any more.

Why do we find it so difficult to just do nothing? Image credit: Ed Yourdon via Flickr.

We all hate doing nothing

All of these studies suggest people would rather be doing something than nothing. The people who took part in this research didn’t like spending even brief periods of time alone in a room with nothing to do but think or daydream.

So it seems we are desperate for distractions.

Is pain better than no distraction?

The next question: would people prefer an activity that’s not very nice over no activity at all? Was the experience of time with no external distractions so bad that people would avoid it by inflicting pain on themselves? You’ve already guessed the answer. For some people, yes.

At the start of the next study, participants were given a mild electric shock. When asked if the shock was bad enough that they would be willing to pay to avoid being shocked again, three-quarters said yes.

But when those people were left alone in a room for 15 minutes without distraction, 67% of men and 25% of women gave themselves electric shocks as a distraction from simply being alone with their thoughts. On average people gave themselves one to two shocks, but one man pressed the button 190 times!

Participants simply said they preferred an electric shock over boredom.

Why is it so hard to go without distraction?

The psychologists said they set up this study expecting people to find it easy to amuse themselves. After all, we have big brains full of memories and the ability to reflect on the past, plan for the future and create imaginary worlds. But that clearly wasn’t the case.

I think [our] mind is built to engage in the world. So when we don’t give it anything to focus on, it’s kind of hard to know what to do. I suppose it’s kind of circular. We wouldn’t crave these things if we weren’t in need of distractions. But having so many available keeps us from learning how to disengage. Timothy Wilson, University of Virginia

Can we lay the blame firmly at the feet of social media? Probably not. Study participants who used social media less often were no better at daydreaming. In fact, enjoyment of time alone wasn’t related to social media use, smartphone use, or age.

Interestingly, Wilson and colleagues did find a small correlation between meditation experience and the ability to be happily alone with one’s thoughts. Time to practise your lotus pose? Ommmm…

Links and stuff