Science-- there's something for everyone

Saturday, June 30, 2012

The source of methane on Mars


Nine years ago, a large amount of methane was discovered in the Martian atmosphere. Because methane doesn’t stay in the atmosphere for long, something must be replenishing it. On Earth, it’s predominantly living organisms that do the replenishing. Although that's a possible source for Mars' methane as well, an inorganic origin seems more likely. Two years ago, I wrote that researchers had ruled out meteorites as that source.

Since then, Frank Keppler and a team of researchers from the Max Planck Institute for Chemistry, Utrecht University, the University of Edinburgh and the University of West Hungary have been successful in tracing the Martian methane to its source. They found that the methane comes from... meteorites. Yes, you read that right.

The researchers exposed the Murchison meteorite, a large meteorite that landed in Australia in 1969, to UV radiation at levels comparable to those found on the surface of Mars. Because Mars has no ozone layer, these levels are much higher than those on Earth.


Exposing the meteorite to UV radiation released large amounts of methane. Isotope analysis confirmed that the methane released from the Murchison meteorite was of extraterrestrial origin. In addition, the amount of methane released increased with increasing temperature. This correlates well with methane concentrations on Mars, where more methane is found in the warmer regions. The Murchison meteorite is thought to be of similar composition to the myriad meteorites that bombard Mars every day. 

Methane concentration on Mars: This chart depicts the calculated methane concentrations in parts per billion (ppb) on Mars during summer in the Northern hemisphere. Violet and blue indicate small quantities of methane; red areas indicate larger ones.
© NASA.


So why did I report previously that the methane could not have come from meteorites? In the prior study, researchers looked at how much methane could be released from meteorites as they ablated, or burned up, in the atmosphere. Apparently, not that much. However, meteorites that make it to the ground and are subsequently pummeled with UV radiation can and do give off significant amounts of methane.


And thus, science marches on.

Friday, June 29, 2012

Alzheimer’s disease caused by prions


Prions are infectious misfolded proteins (first purified in 1982 by Stanley Prusiner and his team from the University of California, San Francisco). Once introduced, prions induce normal proteins to misfold as well, propagating the malformed proteins throughout the infected tissue. Creutzfeldt–Jakob disease is a fatal neurological disease caused by prions in humans. Similar diseases in cows and sheep are called bovine spongiform encephalopathy (mad cow disease) and scrapie, respectively. Now Prusiner, along with Jan Stöhr, Joel Watts and their colleagues from the University of California, San Francisco, has shown that Alzheimer’s disease (AD) may also be caused by prions.

Autopsies of AD patients show that their brains contain plaques made of amyloid beta peptides. As these plaques spread throughout the brain, they are believed to be responsible for the neurological problems associated with AD. Prusiner and the other researchers hypothesized that AD is initiated by the introduction of an amyloid beta prion that sets off a wave of plaque-creating misfolded amyloid beta peptides.

To test this, the scientists used mice that were genetically engineered so that any plaques in their brains would fluoresce. Injecting the mice with either purified amyloid beta protein or with synthetically constructed amyloid beta peptides initiated a chain reaction of plaque formation in the mice’s brains. This strongly suggests that the peptides are acting as prions, that is, that they are infectious proteins.

For more on this story, read Ed Yong’s article in The Scientist.

Thursday, June 28, 2012

Lead bullets stunt California condor recovery


The California condor (Gymnogyps californianus) is one of the rarest birds in the world. It came very close to extinction in 1982, when the last 22 individuals were caught and placed in captive breeding programs. Those programs were largely successful, and as of 2010, there were almost 400 condors, half of them in the wild. That’s the good news.

The bad news is that the birds still require regular interventions to keep them from dying out, including food supplementation and vaccination. Free-flying condors are caught twice a year for health checks and often need prolonged stays in treatment facilities. And even with all that assistance, only 24 chicks have fledged in the wild. The rest of the free-flying birds were all hatched in captivity and released as adults.

One of the leading culprits for the original decline of California condors was lead poisoning. Unfortunately, Myra Finkelstein of the University of California, Santa Cruz and her colleagues have found that lead poisoning is still a huge problem for the raptors. Lead is a heavy metal that has adverse effects on a number of tissues and organs, including the nervous system. The Centers for Disease Control sets an acceptable threshold for human adults of 25 micrograms of lead per deciliter of blood (µg/dl) and for children of 5 µg/dl. Children with blood lead levels above 45 µg/dl require chelation therapy. Wildlife managers have similar guidelines for condors.

The researchers measured the blood lead levels of 150 birds between 1997 and 2010. Because each bird was caught multiple times, there were over 1,100 separate blood samples. Each year, from 50 to 88% of captured birds exceeded the safe limit, and about 20% had levels above 45 µg/dl, requiring chelation therapy for acute lead poisoning. Almost half the birds required chelation at least once during the study period, and many were poisoned multiple times. The highest recorded blood lead level was 610 µg/dl.

In the U.S., lead has been banned from gasoline since 1986 and from household paints since 1987. So where is all this lead coming from? To find out, the researchers used stable isotopic analysis. Lead, like many elements, comes in more than one isotope or weight. Lead paint has a different ratio of these isotopes than lead ammunition or background environmental lead. Captive condors that have never been in the wild have lead with the background ratio, but the lead found in most poisoned free-flying birds has the same ratio as that of lead ammunition. It’s not that condors are being shot (though unfortunately, and illegally, some are), it’s that they eat carcasses that have been shot with lead bullets. Remember, condors are obligate scavengers, meaning they only eat dead animals. The birds inadvertently swallow bullet or buckshot fragments while consuming their prey.
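The logic of that isotope matching is simple enough to sketch in a few lines of Python. The ratios below are hypothetical placeholders, not values from the study, and real analyses compare several isotope ratios at once; this is just the idea of attributing a sample to the nearest candidate source.

```python
# Illustrative sketch of lead-source attribution by isotope-ratio matching.
# The ratios below are hypothetical placeholders, NOT values from the study.

# Hypothetical 207Pb/206Pb ratios for candidate lead sources
sources = {
    "background": 0.83,
    "lead paint": 0.86,
    "lead ammunition": 0.82,
}

def likely_source(sample_ratio):
    """Return the candidate source whose isotope ratio is closest to the sample's."""
    return min(sources, key=lambda s: abs(sources[s] - sample_ratio))

# A condor blood sample with a ratio near that of ammunition
print(likely_source(0.821))  # -> 'lead ammunition'
```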

Without an outright ban on lead ammunition, what’s the outlook for the condor? The authors estimate that with the current monitoring and interventions (but without future releases of chicks raised in captivity) there should be a stable population of a hundred and fifty or more birds… in about 1800 years. In other words, things are not going well for the California condor.

Wednesday, June 27, 2012

Just for fun: Enrichment stations



Students from Rice University’s George R. Brown School of Engineering were set the task of designing feeding stations that could double as mental enrichment tools for animals at the Houston Zoo.  Here’s what they came up with:

For orangutans:

 


I think my daughter's pediatrician’s office had something similar in the waiting room.

And for giraffes:



Tuesday, June 26, 2012

Asian origin of primates



There’s no question that the later stages of human evolution, from about 40 million years ago (mya), took place in Africa. However, primitive primates from before that time have been found in Asia, not in Africa, leading anthropologists to conclude that Asia is the original cradle of the anthropoids, the primate group that includes monkeys, apes and humans. At some point, early primates migrated from Asia to Africa (quite a trek if you consider that at that time there was a large sea separating the two continents). Researchers from Thailand, Myanmar, France and the U.S., led by Jean-Jacques Jaeger of the Université de Poitiers, discovered a new fossil primate (Afrasia djijidae) that sheds light on when that might have occurred.

The oldest well-documented African anthropoid fossils (Afrotarsius libycus) are found in North Africa and date from 38 to 39 mya. There is no record of anthropoids in Africa from before that time. Meanwhile, A. djijidae was found in Myanmar and dates from about 37 mya. The two creatures are remarkably similar, although the Asian A. djijidae has slightly more primitive teeth. This strongly suggests that anthropoids first arrived on the shores of Africa just under 40 mya. From there, they evolved into a variety of anthropoids that eventually included Homo sapiens.

Only four A. djijidae upper molars have been discovered thus far, and it took six field seasons of sifting through tons of sediment just to find those. This might not seem like much to go on, but you’d be surprised how much a paleontologist can learn from teeth. The authors give a two-page description detailing every facet and angle of the molars.

Below, you can see fossil molars from the two primates superimposed on a map of the region at that time. Note the Tethys Sea dividing Africa from Eurasia.


Striking morphological resemblance between the right upper molars of the Asian Afrasia djijidae and the contemporaneous African Afrotarsius libycus supports an Asia-to-Africa anthropoid dispersal during the middle Eocene. The regions where the two taxa were discovered are positioned on a paleogeographic map of the Old World during the late Eocene (35 mya).

By the way, the name ‘djijidae’ was chosen in memory of a young girl from the village near where the teeth were found. I think that’s rather nice.

Monday, June 25, 2012

Bad news about iris scanners



I don’t know about you, but I look forward to the day when I can stop memorizing passwords and use biometric identification (BI) systems instead. Unfortunately, iris BI may not be the ideal solution. According to Samuel Fenker and Kevin Bowyer of the University of Notre Dame, iris BI systems suffer from a phenomenon called ‘template aging’. That is, the inaccuracies in matching the enrollment (template) scan to subsequent scans mount as time passes.

In the case of iris scans, accuracy begins declining almost immediately and rapidly gets worse. After only one year, the false reject rate (the rate at which the system fails to recognize that a scan comes from the enrolled person) went up by 25-60%. After three years, the false reject rate went up by 150%.

To be clear, these are increases in percentages rather than actual numbers. In other words, if the false reject rate for scans taken a month apart was one in a million, then three years later that same person would be rejected by the scanner two and a half times out of every million attempts. If this sounds like nothing to worry about, remember that if iris BI systems become widespread, with millions of people using them, there will be many false rejections every day.
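For the arithmetic-minded, here's the calculation from the previous paragraph spelled out. The one-in-a-million baseline is the same illustrative number as above, not a measured rate:

```python
# Worked example of the false-reject increase described above.
baseline_per_million = 1.0   # illustrative: one false reject per million attempts
increase_after_3yr = 1.50    # a 150% increase after three years

per_million_after = baseline_per_million * (1 + increase_after_3yr)
print(per_million_after)     # -> 2.5 false rejects per million attempts
```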

So what causes template aging? The most obvious source of this problem is that a person’s physical attributes have shifted in some way. Contrary to popular opinion, irises do change as people age. However, Bowyer points out that other causes, such as deteriorating imaging equipment or even posture and environment (which might influence how dilated a pupil is), can also contribute to template aging. It's difficult to reconcile the short time frames at which template aging is observed with any of these processes, though. Do people's eyes change that rapidly? Do they choose progressively odd positions in front of the scanner as time passes? It's hard to say what exactly is going on. In any case, it's important to be aware of the problem.

The authors believe that there will continue to be a place for iris BI systems, which is good news for companies that are already using them. Perhaps people will have to redo their template scans every year or so. On the other hand, we may discover what’s causing template aging and find a way to compensate for it.


Sunday, June 24, 2012

Injections without needles


Here’s an innovation I think we can all get behind: needleless injections. Andrew Taberner of the University of Auckland and Catherine Hogan and Ian Hunter from MIT have developed a controllable jet injection device that could replace standard needles for delivering drugs. Medicines would be delivered through the skin in the form of a high-pressure jet, with no stabbing. The tiny hole created by the injection would quickly and painlessly heal.

Jet injection systems aren’t new. They were used to deliver vaccines as far back as the '60s. I should note that this was before the notion was popularized by the original Star Trek series. Conventional jet injectors rely on a variety of energy sources, often chemical reactions or compressed springs or gases, to achieve the power necessary for high-pressure drug delivery. Unfortunately, this means that doctors cannot control the injection speed once it’s begun.

In contrast, Hunter and his team used a Lorentz-force motor (essentially a magnet surrounded by a coil of wire). When current is applied to the coil, it creates a force that drives a piston forward and delivers the drug. By altering the amount of current, researchers can control the velocity and pressure of the drug delivery in real time. For example, they can use greater force to penetrate the skin, and then less force to dissipate the drug once it’s passed through that barrier.

If I’m breaching a baby’s skin to deliver vaccine, I won’t need as much pressure as I would need to breach my skin. We can tailor the pressure profile to be able to do that, and that’s the beauty of this device.
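To picture what that real-time control makes possible, here's a hypothetical two-phase pressure profile sketched in Python. This is my own illustration of the idea, not the team's actual control code, and all the parameter values are made up:

```python
# Hypothetical two-phase injection profile (all parameters are made-up
# illustrations, not values from the paper).

def commanded_pressure(t_ms, breach_ms=5, p_breach=100.0, p_deliver=20.0):
    """Jet pressure (arbitrary units) at time t_ms into the injection:
    a brief high-pressure phase to breach the skin, then a lower-pressure
    phase to disperse the drug through the tissue."""
    return p_breach if t_ms < breach_ms else p_deliver

# A Lorentz-force motor's force scales with coil current, so commanding a
# pressure profile amounts to commanding a current profile in real time.
for t in (0, 2, 5, 10, 50):
    print(t, commanded_pressure(t))
```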

Needleless injection system

MIT-engineered device injects drug without needles, delivering a high-velocity jet of liquid that breaches the skin at the speed of sound.
Image courtesy of the MIT BioInstrumentation Lab.

Thus far, the device has only been tested on gels and animals, not humans. Due to concerns about infection, the World Health Organization no longer recommends conventional jet injectors for the delivery of vaccines. This new injector would have to demonstrate a better safety record to win over the medical establishment. I, for one, am rooting for it to do so!

You can see an explanation by Hunter and Hogan below:


Saturday, June 23, 2012

Crayfish have teeth like us?


Shmuel Bentov of Ben-Gurion University and his colleagues have found that one type of freshwater crayfish (Cherax quadricarinatus) has an enamel-like coating over its molars. Right now, I’m imagining Sebastian the crab from Disney’s The Little Mermaid breaking into a toothy grin. That’s not quite what the researchers found.

In reality, crayfish look like this:


They do not have teeth that look like this:
Digital image created by Sam Fentress 7 June, 2005.

They do, however, have mandibles with a grinding surface that is analogous to our molars.

Teeth have a hard outer surface, a softer, more pliable center for absorbing shock and tension, and a binding layer between the two. For vertebrates such as ourselves, the outer layer is composed of calcium phosphate hydroxyapatite. Invertebrates have employed a number of minerals to harden their teeth, including calcium carbonate, iron oxide and silica, but crystalline apatite has not been one of them. Apparently, C. quadricarinatus has not been keeping up with the literature, because its mandibles contain fluorapatite crystals that are quite similar to the apatite crystals in our teeth.

Unlike vertebrate teeth, the crayfish mandibles are part of their exoskeleton. As such, they are shed every time the animal molts. Despite this rather significant difference in tooth maintenance, C. quadricarinatus has teeth that are quite similar to the ones found in vertebrates. This is most likely an example of convergent evolution (separate lineages converging on a single solution to a common problem, in this case, how to process food).

Friday, June 22, 2012

Nerve transfer restores function


You may have heard about connecting brains directly to robotic arms. However, that’s not the only way to allow a paralyzed person to pick up a soda. Susan Mackinnon, Andrew Yee and Wilson Ray from the Washington University School of Medicine in St. Louis and their colleagues chose a different tactic to give a paralyzed man back the use of his hands.

The man had not been able to move his fingers since a car accident injured his spinal cord two years earlier. This was because the nerve that normally activates hand muscles (shown in red below) originates below the injury site at the C7 vertebra. Doctors determined that one of the nerves responsible for elbow function (green) originated above the injury and was still functional. The surgeons therefore attached the upper arm nerve to the hand nerve. Once this was done (in two separate surgeries, one for each hand), signals from the man’s brain that previously would have only affected his upper arms were now able to travel all the way to his hands.


To detour around the block in this patient's C7 spinal cord injury and return hand function, Mackinnon operated in the upper arms. There, the working nerves that connect above the injury (green) and the non-working nerves that connect below the injury (red) run parallel to each other, making it possible to tap into a functional nerve and direct those signals to a non-functional neighbor (yellow arrow).

This is by no means a quick fix. It took the patient ten months just to be able to pinch his thumbs and index fingers together. Nevertheless, everyone involved in this case has every reason to be pleased with the outcome. As of this writing, the patient can feed himself and even do some writing. There’s every expectation that he will continue to improve. 

If I lost the use of my hands, would I prefer to have my nerves rewired so I could regain that usage, or to control objects with my mind? That’s a tough question to answer for someone who has had decades of experience controlling things with her hands and zero experience controlling things with her mind. In any case, we’re far from being able to give patients that choice. The important thing is that paralysis is becoming less and less of a life sentence.

Thursday, June 21, 2012

Octopus: master of disguise



As we saw yesterday, octopuses have an amazing ability to camouflage themselves. This is particularly remarkable when you consider that any predator the octopus would be hiding from has a completely different vantage point. The octopus’s view of the sea floor is entirely different from that of a fish hunting from above. If the target area is littered with many colors and textures, as in a coral reef, the octopus has to choose whether to mimic a specific item or to match the general appearance of the background. Noam Josef and Nadav Shashar of Ben-Gurion University of the Negev, and Piero Amodio and Graziano Fiorito of Stazione Zoologica Anton Dohrn, tested some octopuses to see which way they would blend.

The researchers used eleven images of Octopus cyanea and O. vulgaris, taken in the wild by SCUBA divers. Because of the way the sampling was done, the authors were confident that each image represented a different specimen. However, as the octopuses were not tagged (and were indistinguishable to human eyes), there is some chance that the same individual was photographed twice. For each picture, the appearance of the mantle (body of the octopus) was compared to that of the total background area and to specific objects within that field of view.
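To give a flavor of how such a comparison can be made quantitative, here's a toy sketch in Python using mean brightness as a crude similarity score. This is purely my own illustration; the authors applied their own image-analysis metrics to the actual photographs:

```python
# Toy comparison of an octopus's mantle to (a) the whole background and
# (b) one nearby object, using mean brightness as a crude similarity score.
# Purely illustrative; not the metric used in the study.

def mean_brightness(pixels):
    return sum(pixels) / len(pixels)

def closer_match(mantle, background, one_object):
    d_background = abs(mean_brightness(mantle) - mean_brightness(background))
    d_object = abs(mean_brightness(mantle) - mean_brightness(one_object))
    return "object" if d_object < d_background else "background"

# Fake grayscale pixel values (0-255)
mantle = [120, 125, 118, 122]
background = [60, 200, 90, 180]   # a varied, rock-strewn field
rock = [119, 124, 121, 117]       # the single rock being impersonated
print(closer_match(mantle, background, rock))  # -> 'object'
```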

It turns out that octopuses mimic particular objects around them, not the general appearance of a stretch of sea floor. In other words, an octopus will try to impersonate one rock rather than a rock-strewn field. This means that you can often see the octopus easily enough; you just don't realize that it's an octopus. If you watch yesterday’s video clip again, you can find examples of this.

Wednesday, June 20, 2012

Just for fun: Camouflage!



Cephalopods are masters of disguise.  They can alter not only their color, but their texture and size as well.  Here are some great examples shown by oceanographer David Gallo at a TED talk:




Tuesday, June 19, 2012

Circadian rhythm disruption affects fertility in mice



I know what you’re thinking: Who needs more mice? Actually, this result could also be of importance for humans who seek to have children. Keith Summa, Martha Vitaterna and Fred Turek from Northwestern University have shown that major disruptions in day/night cycles are associated with dramatic decreases in successful pregnancies, at least for mice.

The researchers divided newly pregnant mice into three groups. One group (control) lived in a constant and uniform cycle of twelve hours light, twelve hours dark. The second group (phase-delayed) had their lights go on six hours later every five to six days, and the final group (phase-advanced) had reveille six hours earlier every five to six days. This means that the ‘daylight’ hours of the latter two groups had been shifted completely around the clock by the time they gave birth three weeks after conception.
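A quick back-of-the-envelope check of that "around the clock" claim, using the numbers above and taking the short end of the five-to-six-day interval:

```python
# Cumulative phase shift over a 21-day mouse gestation, with a 6-hour shift
# every 5 days (the study's interval was 5-6 days; this uses the short end).
gestation_days = 21
shift_hours = 6
interval_days = 5

n_shifts = gestation_days // interval_days   # 4 shifts fit in 21 days
print(n_shifts * shift_hours)                # -> 24 hours: a full trip around the clock
```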

Mice in the control group went on to successfully have pups 90% of the time. In contrast, the phase-delayed and phase-advanced groups had pups only 50% and 22% of the time, respectively. Like most humans, mice apparently find getting less sleep more troubling than getting more sleep.

Previous studies have shown a correlation between circadian rhythms and the ability to conceive. This new data shows that rapid shifting of the day/night schedule can also affect pregnancy outcomes after conception. Keep in mind that this study only involved 48 mice and that results from mouse studies don’t always apply to humans. Still, if you are attempting to have a child, you might consider avoiding shift-work if you can.

For a fascinating discussion of circadian rhythms, don’t miss Ed Yong’s post here.


Monday, June 18, 2012

Exercise and genotype affect cognition


It's well established that exercise can affect mental as well as physical capacities. However, your genotype (genetic makeup) may play a surprisingly large role in determining just how much influence exercise can have on your cognitive abilities. David Bucci and his colleagues from Dartmouth College ran some experiments to demonstrate this fact.

Fifty-four healthy, sedentary young adults completed their study. All participants underwent genetic testing to determine which type of brain-derived neurotrophic factor (BDNF) they had. BDNF, a protein involved in stimulating neuron growth and known to increase with exercise (at least in rats), comes in various flavors, or alleles. The most common alleles encode either a valine (Val) or a methionine (Met) at position 66 of the protein. A single base pair change in the DNA determines which of these types of BDNF a person will make.
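To see just how small that difference is, here's the substitution spelled out at the codon level. The Val66Met variant is commonly described as a G-to-A change in codon 66; the snippet below simply applies the standard genetic code:

```python
# The Val66Met variant of BDNF comes down to a single base in codon 66.
# (Codon identities per the standard genetic code; the variant is commonly
# described as a G-to-A substitution at this position.)
codon_table = {"GTG": "Val", "ATG": "Met"}

val_allele = "GTG"                   # codon 66 in the Val variant
met_allele = "A" + val_allele[1:]    # a single G -> A change gives "ATG"

print(codon_table[val_allele], "->", codon_table[met_allele])  # Val -> Met
```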

On day one of the study, volunteers were assessed for physical fitness. They were then given a ‘novel object recognition’ (NOR) test that consisted of two parts. First, they were shown a set of 50 images, one by one. They next took a fifteen-minute break to fill out a mental health questionnaire. Finally, they were shown 100 images, half of which they had seen before and half of which were brand new pictures. Their task was to distinguish between the two.

The participants were then divided into groups and told to return in four weeks. One group was instructed to continue not exercising at all (control). A second group was asked to exercise for at least 30 minutes four times a week. This group was further divided into two, one of which also exercised on the final test day (4W+) and one of which did not exercise that day (4W-). A last group remained sedentary for the entire four weeks but did exercise on the final test day (0W+). The participants all wore pedometers to prevent cheating.

Four weeks later, physical fitness exams and NOR tests were repeated. The only group to show improvement in the final NOR test was the 4W+ group. These were the people who had exercised for four weeks and on test day. It’s no great surprise that regular exercise improves cognitive function. But here’s the fascinating part. When the 4W+ group was further divided by genotype, it turned out that only those who were homozygous for the Val allele (meaning neither of their two copies of BDNF contained methionine at position 66) showed improvement. Met carriers did not show improvement.

The 4W+ group did accrue other benefits, such as lowered stress levels, regardless of genotype. I don’t think the take-home lesson from this experiment should be, ‘don’t bother exercising if you have a BDNF Met allele.’ However, knowing how one’s genotype can affect one’s health might lead to more personalized exercise regimens.

Sunday, June 17, 2012

Concussions are worse for women and kids



Concussions are often in the news, and for good reason. Over 300,000 sports-related concussions occur in the United States each year. And those are just the ones that result in visits to emergency rooms. As efforts are made to diagnose, treat and prevent concussions, one thing is becoming clear: not all people react to or recover from concussions equally.

First, what is a concussion? A concussion is simply a brain injury caused when the brain collides with the inside of the skull. This can be the result of a direct impact to the head or of the rapid deceleration of the entire body. Just as you catapult against your seat belt (or your windshield if you’re foolish enough to not buckle up) when your car suddenly stops, your brain will slam into the inside of your skull when your body suddenly stops.

Needless to say, brain injuries are bad things, especially for kids. High school students can have memory impairments for up to 2 weeks and reaction time impairments for up to 3 weeks after suffering from a concussion. In contrast, college students are usually back to normal in less than one week. Tracey Covassin of Michigan State University and her colleagues wanted to see whether gender also played a role in recovery from concussion.

Ideally, for a study of post-concussive cognitive impairment, you’d want to compare results in each individual before and after receiving a concussion. Unfortunately, you can’t hit people over the head with a crowbar, even for science. You can, however, give a baseline test to a large group of people and hope that some of them will later suffer a traumatic brain injury. Thus, the researchers administered the pre-concussion cognitive tests to 2000 high school and college athletes, and the post-concussion tests to the 222 of them who were thoughtful enough to provide a concussion sample.

As expected, high school kids had more prolonged deficits than college-aged kids. Women also suffered from a slightly greater loss of function than men.  In addition, women complained of more post-injury symptoms (though they may have just been more willing to admit to having symptoms). However, the differences between genders were not as significant as the differences between high school age and college age athletes. 

The data suggests that, like teenagers, women should be held back from rejoining their teams a bit longer than men. While one week may be more than enough recovery time for adult men, women and kids may require as much as three times as long to get back to normal.

Image by Patrick J. Lynch, medical illustrator, 2/2/2008.



Covassin T, Elbin RJ, Harris W, Parker T, & Kontos A (2012). The role of age and sex in symptoms, neurocognitive performance, and postural stability in athletes after concussion. The American Journal of Sports Medicine, 40(6), 1303-12. PMID: 22539534.




Saturday, June 16, 2012

Miniaturizing crops


Given the necessity to provide food for more people with less space (see my post on biodiversity), it makes sense to cultivate more efficient crops. Now there may be a way to do so. 


Like animals, plants produce and respond to a variety of steroidal hormones. One group of these molecules, the brassinosteroids (BRs), affects many facets of plant growth. By blocking this group of hormones, you can create plants that are much smaller than their ‘wild-type’ or normal cohorts. Such smaller plants would require less water and fertilizer for the same yield.

There are ways to block BRs, but they tend to be very expensive. The typical BR inhibitor is called brassinazole and costs over $20,000 per gram. This is far too expensive for most researchers, let alone farmers. Luckily, Burkhard Schulz of Purdue University and his colleagues from Purdue, Stanford and Seoul National University have found a substitute. It turns out that the common fungicide propiconazole (Pcz) will do the same thing for ten cents a gram.

When plants are treated with Pcz, they show a dose-dependent size reduction. In addition, some plants, such as sorghum and maize, are feminized. That is, they produce only female parts instead of both male and female. This makes it easier to control pollination. By carefully titrating the amount of Pcz given to plants, scientists can control the size and yield of that crop. This could also lead to slow-growing grasses that need much less care and water.

You can see Schulz’s explanation below. As an aside, nice shoes.





Friday, June 15, 2012

No bow shock as the sun travels around the galaxy



Our sun isn't preceded by a bow shock. Unless you know a whole lot more about cosmology than I do, I’ll need to give some background information before I can discuss this finding. In particular, we’re going to need a crash course in heliology, and lesson one is that this is the study of the sun.

Just as the Earth and other planets orbit the sun, the sun drags all its minions along as it in turn orbits the center of the galaxy. As it moves, the sun ejects a continuous stream of charged particles (the solar wind) that surrounds the sun and all the planets in a giant bubble. This inflated bubble, known as the heliosphere, includes a spherical inner region that encompasses the entire solar system and a much larger tear-drop shaped outer cocoon called the heliosheath.  The solar wind travels at high speed within the inner bubble, and then slows down abruptly as it passes through the termination shock into the heliosheath. Eventually, the solar wind dies out completely at the edge of the heliosphere. This final boundary between the entire heliosphere and the interstellar medium that makes up outer space is called the heliopause.

You can see a diagram of all this below.



There’s just one problem with this picture: there is no bow shock. Which brings us to the paper authored by David McComas of the University of Texas and his colleagues. They used the Interstellar Boundary Explorer (IBEX) spacecraft to show that the heliosphere is traveling through space too slowly to produce a bow shock.

The prevailing theory of the past quarter century had been that the heliosphere was plowing through space quickly enough to cause a bow shock, not unlike the sonic booms caused by supersonic jets and bullwhips. However, it seems that astronomers had overestimated the speed of our sun. To make matters worse, the researchers also found that the strength of the magnetic field outside the heliosphere had been underestimated, meaning it would have required even greater speeds to create a bow shock. Taken together, this means that there’s little chance that our solar system is pushing a bow shock ahead of it as it travels through space.
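The underlying test is a threshold comparison: a bow shock can only form if the heliosphere's speed through the interstellar medium exceeds the fastest wave speed that medium supports (the fast magnetosonic speed, which grows with magnetic field strength). Here's a schematic version in Python; the sound and Alfvén speeds are hypothetical round numbers, not the IBEX measurements:

```python
import math

# Schematic bow-shock test (illustrative numbers, not the IBEX measurements).
# A shock requires the Sun's speed through the interstellar medium to exceed
# the fast magnetosonic speed, sqrt(sound_speed**2 + alfven_speed**2).

sun_speed = 23.0      # km/s; roughly the slower speed IBEX reported
sound_speed = 20.0    # km/s; hypothetical value for the local medium
alfven_speed = 25.0   # km/s; hypothetical, and larger when the magnetic field is stronger

fast_magnetosonic = math.sqrt(sound_speed**2 + alfven_speed**2)
print(round(fast_magnetosonic, 1))    # ~32.0 km/s with these numbers
print(sun_speed > fast_magnetosonic)  # -> False: no bow shock, just a bow wave
```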

As McComas explains:
While bow shocks certainly exist ahead of many other stars, we're finding that our Sun's interaction doesn't reach the critical threshold to form a shock, so a wave is a more accurate depiction of what's happening ahead of our heliosphere -- much like the wave made by the bow of a boat as it glides through the water.


Thursday, June 14, 2012

Bad news for biodiversity and humankind


It has been known since the '80s that removing a single species (aka extinction) from a biological community can have profound effects on that ecosystem. But just how closely are biodiversity (a measure of the variety of species) and healthy ecosystems linked? To answer that question, Bradley Cardinale from the University of Michigan and 16 other authors from institutions in the U.S., U.K., Canada, Sweden and France reviewed two decades of research on biodiversity and its effect on local and global ecosystems.

Among their findings are the following:

The rate of conversion of resources (be they photons, plants or prey) into new biomass suffers as biodiversity decreases. In other words, fewer species means less food, fiber and other products. Other natural processes, such as decomposition and recycling, also decline as species number goes down.

It's a combination of the presence of specific key species and the total diversity that makes an ecosystem productive. You need both.

And perhaps most sinister: the change is nonlinear. That is, as you lose biodiversity, the rate at which that loss impacts our environment speeds up. Add to this the fact that the rate of species loss has been steadily increasing, and you have a crisis in the making.

In fact, another international study, led by Anthony Barnosky of the University of California, Berkeley and Elizabeth Hadly from Stanford University, suggests that we may already be at a tipping point of irreversible change. According to the 22 authors of that study, our present rate of climate change exceeds that seen at any time since the end of the dinosaurs.

It’s not only carbon emissions that matter, though, but land use as well. Once somewhere between 50 and 90% of an area has been altered (as for farms or cities), that region can no longer support the original ecosystem. As of right now, about 43% of the Earth’s total land surface has been completely modified for human use. That percentage is expected to increase to 50% by 2025.

The authors recommend that governments take immediate action to forestall the coming catastrophe. In particular, they insist that the Earth’s nations:
Reduce world population growth and per-capita resource use, replace fossil fuels with sustainable sources, develop more efficient food production and distribution without taking over more land, and better manage the land and ocean areas not already dominated by humans as reservoirs of biodiversity and ecosystem services.
Unfortunately, I put the chances that my government (the U.S.) will agree to these actions at somewhere between zip and null. However, there’s a United Nations Earth Summit planned for June 20-22 in Rio de Janeiro, Brazil to discuss biodiversity and climate change. Perhaps something can be accomplished.

Image caption: The Earth may be approaching a tipping point due to climate change and increasing population.
Credit: Cheng (Lily) Li.

Wednesday, June 13, 2012

Just for fun: Periodic Table symbol song



David Newman sings the periodic table:





If you didn't catch the names of all the elements from that, try this one:





Once you sing these songs a few times, you too will be able to fill in a periodic table from memory:







Hat tip: Skepchick.

Tuesday, June 12, 2012

The rewriteable DNA module



Wouldn’t it be great if we could insert a bit of programmable DNA into an organism’s genome and then modify that DNA after the fact to contain whatever instructions we wished? Well, we can’t quite do that yet. But we may be one step closer to that goal, thanks to the work of Jerome Bonnet, Pakpoom Subsoontorn and Drew Endy of Stanford University. They have designed what they refer to as a ‘rewriteable recombinase addressable data’ (RAD) module.

In essence, this is a cartridge of DNA that contains one set of instructions when inserted in one direction, but a different set of instructions when inverted. At any point, the researchers can determine which instructions are given to the cell (in this case E. coli) by controlling the orientation of that cartridge.

The researchers relied on the fact that some types of bacteriophages (also called phages) have perfected the art of shuttling their own DNA in and out of their hosts’ genomes. These are viruses that only attack bacteria. They use enzymes (integrases) to insert their genomes into that of the host cell at strategic locations. When the viruses are ready to leave the cell, other enzymes (excisionases) clip out the viral DNA and package it into protein coats.

The RAD module includes the genes for an integrase and an excisionase. Under some conditions, only the integrase is made; under others, both are made. When the integrase alone is produced, the module is inserted in one direction. When both are produced, the module is snipped out, inverted and reinserted in the opposite orientation. Because the RAD module also encodes fluorescent reporter proteins, the orientation of the module can be monitored by observing whether the cell glows red or green.

To be clear, once inserted, the RAD module can’t be rewritten to include any new instructions. It only contains whatever nucleotides it started with. In this case, it contained the instructions to glow red or to glow green. However, the researchers were able to control which of those two instructions the cell followed. After some fine-tuning to prevent spontaneous flipping, they were able to reliably switch back and forth between those two sets of instructions.
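The behavior amounts to a one-bit switch, which is easy to model. The toy Python class below is my own sketch of the logic described above, not the authors' implementation, and the red/green assignment is arbitrary:

```python
class RADModule:
    """Toy model of the one-bit RAD logic described above (not the authors' code)."""

    def __init__(self):
        self.orientation = "forward"

    def pulse(self, integrase=False, excisionase=False):
        # Integrase alone drives the cassette to one orientation; integrase
        # plus excisionase cuts it out and reinserts it the other way around.
        if integrase and excisionase:
            self.orientation = "reverse"
        elif integrase:
            self.orientation = "forward"

    def color(self):
        # Which fluorescent reporter is expressed depends on orientation;
        # the green/red assignment here is arbitrary.
        return "green" if self.orientation == "forward" else "red"

rad = RADModule()
rad.pulse(integrase=True)
print(rad.color())                           # green
rad.pulse(integrase=True, excisionase=True)
print(rad.color())                           # red
rad.pulse(integrase=True)
print(rad.color())                           # green again: a reliable switch
```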

I’m not convinced that this qualifies as a rewriteable system. In my opinion, it’s more like inserting an on/off switch. But there's no denying that it's an ingenious idea. More importantly, the RAD module could turn out to be extremely useful, especially if you could insert whatever genes you wished in place of the ones encoding red or green fluorescent proteins.

Monday, June 11, 2012

A potential cure for Type 1 diabetes


There may one day be a cure for people suffering from Type 1 diabetes. Defu Zeng of the City of Hope National Medical Center and more than a dozen colleagues succeeded in reversing Type 1 diabetes in 60% of their mouse subjects.

Contrary to its nickname ‘juvenile diabetes’, Type 1 diabetes can occur at any age, though it is most often diagnosed before early adulthood. In non-diabetics, the pancreas contains specialized 'beta cells' that produce insulin. Without beta cells, there is no insulin, and without insulin we can’t move glucose from our blood streams into our cells. People with Type 1 diabetes suffer from an autoimmune disease in which they destroy their own beta cells.

This means that transplanting beta cells into people with Type 1 diabetes is only a temporary fix. Before long, their immune systems will attack the new cells just as they did the original ones. Therefore, any permanent solution must cure the immune system of its taste for beta cells.

To combat this problem, the researchers used a combination of two methods that independently were not able to treat the disease. First, they used bone marrow transplants combined with immunosuppressive therapies to create ‘mixed chimeras’ between diabetic mice and healthy donors. The immune systems of these mice no longer attacked beta cells. Next, they used the hormone gastrin plus epidermal growth factor to stimulate the growth of new beta cells. In 60% of the mice treated with both regimens, Type 1 diabetes was fully reversed.

The authors are the first to caution that this technique is far from clinical trials, let alone FDA approval. Still, it may give new hope to the millions of people suffering from diabetes.

Sunday, June 10, 2012

UN Millennium Development Goal number four



The United Nations has put forth eight ‘Millennium Development Goals’ (MDGs) aimed at improving the health and happiness of people all over the world. These are rather ambitious goals. The first one is to end poverty and hunger, and the second is to ensure that every child on Earth receives at least a primary school education. Did I mention that these projects are to be accomplished by the year 2015?

The fourth goal is to reduce by two thirds (from a base 1990 rate) the child mortality rate (the number of children who die before the age of five). Researchers from Johns Hopkins, the London School of Hygiene and Tropical Medicine, the University of Edinburgh and the World Health Organization analyzed the causes of death for children under age five over the past decade to see whether MDG 4 is on track.

The scientists found that 64% of all early childhood deaths were due to infections, half of these from pneumonia, diarrhoea or malaria. 40% of deaths occurred within the first month of life, the majority of these from pre-term birth complications.

There were some striking differences in mortality rates in different regions. Everywhere except Africa, around 50% of deaths occurred within the first month of life. In Africa, only 30% of childhood deaths occurred during the neonatal stage. Not surprisingly, children in Africa also suffered from much higher rates of malaria and AIDS than those from any other regions. Death from injury was much more common in the Americas than anywhere else.

The good news is that this represents a decrease in childhood mortality by about 25% over the past decade. Most of this improvement is due to a reduction in infections. Unfortunately, there has not been a compensatory decrease in neonatal deaths, and without that, we’ll never reach the UN’s MDG 4.

In case you’re interested, here are the other five goals:
  • MDG 3: Promote gender equality and empower women.
  • MDG 5: Improve maternal health by reducing maternal mortality and providing reproductive health care.
  • MDG 6: Combat HIV/AIDS and other diseases.
  • MDG 7: Ensure environmental sustainability.
  • MDG 8: Develop a global partnership with developing countries.


Saturday, June 9, 2012

Recombination between RNA and DNA viruses



Viruses are classified into three broad groups depending on their nucleic acid content. There are RNA viruses, DNA viruses, and retroviruses (that contain RNA but use a DNA intermediate in order to replicate). Examples of these three types are rhinoviruses, parvoviruses and HIV, respectively. Although genetic material is readily exchanged within groups, gene transfer from one group to another has not been observed.  Until now.

Geoffrey Diemer and Kenneth Stedman from Portland State University discovered a virus with a genome consisting of a circular strand of DNA. So far, this isn’t unusual. However, this particular virus includes the gene for a protein that had only been seen in RNA viruses. This strongly suggests that the gene hopped from an RNA virus into a DNA virus, an unprecedented event.

Think for a minute what this must have entailed. The DNA within our cells resides in the nucleus. It gets transcribed into RNA, and that RNA migrates into the cytoplasm, where it is translated into proteins. The RNA virus skips this DNA step. Once an RNA virus enters a cell, the machinery in that cell’s cytoplasm translates the viral genes directly into protein. The RNA virus can ignore the cell nucleus, as no DNA is involved at all. To be incorporated into a DNA genome, the viral RNA gene would have had to be reverse transcribed into DNA, requiring an enzyme that retroviruses supply but that is not present in cells or other types of viruses. Yet somehow this event must have occurred to create the virus found by the researchers.

Exchanging bits between two different RNA viruses or between two DNA viruses requires little more than a bit of cutting and pasting. If viruses can also transfer genes between groups, the possibilities for new combinations go up dramatically.

Friday, June 8, 2012

Why do chimps nest on the ground?



All great apes spend each night in a freshly constructed nest. While gorillas often nest on the ground, the other great apes (chimpanzees, bonobos and orangutans) mostly nest in trees. Nevertheless, chimps have been observed to make the occasional ground nest. Why the move down from the trees? Kathelijne Koops and William McGrew of the University of Cambridge and their colleagues tested one possible reason.

The researchers hypothesized that males were putting their nests below the arboreal nests of sexually receptive females. In other words, the males were camping out below potential mates to guard them from the attentions of other males. To test this idea, the researchers went to the Seringbara region of the Nimba Mountains in the Republic of Guinea, a locale where chimps are known to nest on the ground much more frequently than in other regions.

Because nests are abandoned every day, the researchers were able to approach old nests and extract hairs for genetic sequencing. DNA tests showed that at least twelve individuals out of a total community of 36 adults nested on the ground. Thus, about a third of the chimps in the area regularly nested on the ground. Of those ground nests, 65% had definitely been occupied by males and only 9% by females (the rest were indeterminate—you can’t always get good DNA samples from hair). So far so good for the Juliet’s balcony theory. Unfortunately, the closest tree nest above these ground nests also tended to be occupied by males. So much for the guarding sexual partners hypothesis.

By the time our hominin ancestors started sleeping on the ground, they may have had the protection of fire to keep away predators. This clearly is not the case for ground-nesting chimps. Likewise, the decision of where to spend the night was not influenced by either weather or scarcity of proper sleeping trees. In fact, many of the chimps built nests both in trees and on the ground, sometimes on the same night. If it’s not weather, resources, predators or sex, on what basis do chimps decide to sleep on the ground? This study was not able to answer that question, which is too bad, because it might have shed some light on why our ancestors made the same decision a few million years ago.


Update: Since I wrote this, two new studies appeared on this subject, this time linking the choice of nest site to wind and humidity respectively. These studies suggest that chimps may simply be nesting wherever they feel most comfortable. I'm not convinced that the answer is that simple, especially since weather did not seem to be a determining factor in the original study.