March 13, 2012

What Have We Got in Common With a Gorilla? Insight Into Human Evolution from Gorilla Genome Sequence


Researchers have just completed the genome sequence for the gorilla -- the last genus of the living great apes to have its genome decoded. While confirming that our closest relative is the chimpanzee, the team shows that a substantial fraction of the human genome more closely resembles the gorilla genome than it does the chimpanzee genome.
This is the first time scientists have been able to compare the genomes of all four living great apes: humans, chimpanzees, gorillas and orang-utans. This study provides a unique perspective on our own origins and is an important resource for research into human evolution and biology, as well as for gorilla biology and conservation.
"The gorilla genome is important because it sheds light on the time when our ancestors diverged from our closest evolutionary cousins. It also lets us explore the similarities and differences between our genes and those of gorilla, the largest living primate," says Aylwyn Scally, first author from the Wellcome Trust Sanger Institute. "Using DNA from Kamilah, a female western lowland gorilla, we assembled a gorilla genome sequence and compared it with the genomes of the other great apes. We also sampled DNA sequences from other gorillas in order to explore genetic differences between gorilla species."
The team searched more than 11,000 genes in human, chimpanzee and gorilla for genetic changes important in evolution. Humans and chimpanzees are genetically closest to each other over most of the genome, but the team found many places where this is not the case: in 15% of the genome the human sequence is closer to the gorilla sequence than to the chimpanzee, and in another 15% the chimpanzee sequence is closer to the gorilla than to the human.
"Our most significant findings reveal not only differences between the species reflecting millions of years of evolutionary divergence, but also similarities in parallel changes over time since their common ancestor," says Dr Chris Tyler-Smith, senior author from the Wellcome Trust Sanger Institute. "We found that gorillas share many parallel genetic changes with humans including the evolution of our hearing. Scientists had suggested that the rapid evolution of human hearing genes was linked to the evolution of language. Our results cast doubt on this, as hearing genes have evolved in gorillas at a similar rate to those in humans."
This research also illuminates the timing of splits between species. Although we commonly think of species diverging at a single point in time, this does not always reflect reality: species can separate over an extended period of time.
The team found that divergence of gorillas from humans and chimpanzees occurred around ten million years ago. The split between eastern and western gorillas was much more recent, in the last million years or so, and was gradual, although they are now genetically distinct. This split is comparable in some ways to the split between chimpanzees and bonobos, or modern humans and Neanderthals.
"Our research completes the genetic picture for overall comparisons of the great apes," says Dr Richard Durbin, senior author from the Wellcome Trust Sanger Institute. "After decades of debate, our genetic interpretations are now consistent with the fossil record and provide a way for palaeontologists and geneticists to work within the same framework.
"Our data are the last genetic piece we can gather for this puzzle: there are no other living great ape genera to study."
Gorillas survive today in just a few isolated and endangered populations in the equatorial forests of central Africa. They are severely threatened and their numbers are diminishing. This research not only informs us about human evolution, but highlights the importance of protecting and conserving the full diversity of these remarkable species.


HP Labs Using Laser-Powered Chips for Faster, Energy-Efficient Computing


Nothing is faster than the speed of light (except, perhaps, those much-debated neutrinos). So you might think that is why HP Labs wants to use photonics (light) instead of electrons to set new computing speed limits. But you would be wrong.
HP Labs' laser-based chip project, codenamed Corona, would use light instead of electricity to communicate information between chip cores and also from the chips to memory in order to save space and to save energy.

A Short History of Data on the Wire

If you are old enough (or curious enough), you may have tested your soldering skills assembling electronic bits and pieces into primitive number crunchers consisting of transistors that were, literally, wired together. All those electronic parts and wires were replaced by integrated circuits, aka microchips, which remain the basis of the modern computer.
An integrated circuit typically consists of a silicon wafer that carries thousands, or even millions, of electronic components (resistors, capacitors, and transistors) "wired" together by electrodeposited metals. The ongoing miniaturization of these components is the subject of "Moore's Law", which observes that the number of components on an integrated circuit doubles roughly every 18 months.
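The doubling described above can be sketched as a simple exponential. This is a minimal illustration, not anything from HP's project; the 18-month (1.5-year) doubling period is the figure quoted in the text.

```python
# Sketch of Moore's-law growth: component count doubles every fixed period.
def projected_components(initial, years, doubling_period=1.5):
    """Project component count after `years`, doubling every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# Starting from one million components, three years out (two doublings):
print(round(projected_components(1_000_000, 3)))  # 4000000
```

Three years is two doubling periods, hence a factor of four.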
There are two major obstacles foreseeable with the continuing growth in computer processing speeds based on the current technology:
  1. Miniaturization approaches non-negotiable physical barriers, and
  2. Energy consumption increases wildly as data processing rates speed up.
Data transmission by laser addresses both issues.

Rainbows at Nano-scale

Communicating data by light is already widespread, via fiber optics. Scaling down laser light generation and integrating it onto microchips lies at the heart of the HP Corona project.
MIT has demonstrated the technology to generate laser light using materials that are compatible with microchip manufacturing processes, meaning that tiny lasers could be built directly into integrated circuits. Once generated, the laser light beams along a nano-scale waveguide many times thinner than a single fiber optic cable.
The trick that really saves space borrows a technique called Dense Wavelength Division Multiplexing (DWDM), currently used in telecommunications: carrying light of several different wavelengths multiplies the information sent over a single "light wire". HP Labs has demonstrated that up to 64 wavelengths can be managed by 64 "ring resonators" -- small circular loops along the path of the waveguide -- in less than one millimeter. Using electrical impulses at 10 GHz to "tune" the ring resonators yields 10 Gb/s of data transmission per wavelength, times 64 wavelengths, for 640 gigabits of data per second running along a single "wire".
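The arithmetic behind that 640 Gb/s figure is simply per-wavelength rate times wavelength count. A back-of-the-envelope check, using only the numbers quoted above:

```python
# Aggregate capacity of a wavelength-multiplexed link:
# each wavelength carries its own data stream, so capacities add.
def aggregate_gbps(num_wavelengths, gbps_per_wavelength):
    return num_wavelengths * gbps_per_wavelength

# The article's figures: 64 wavelengths, each modulated at 10 Gb/s.
print(aggregate_gbps(64, 10))  # 640
```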

Lightening the Energy Demand

Projecting from the energy consumption of current supercomputers, an exascale computer (performing 10^18, or a million trillion, operations per second) would require its own Hoover Dam's worth of power. Most of that power is consumed not in performing operations, but in communicating between parallel processors to schedule tasks and balance loads, or in sending data to memory.
Laser-based data transmission generates less heat than electrical transmission, and the greater communication bandwidth significantly reduces power requirements. According to Wired: "Using electronics for a 10-terabytes-per-second channel between a CPU and external memory would require 160 watts of power. But HP Labs researchers calculate that using integrated photonics lowers that to 6.4 watts."
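Those wattages can be converted into energy per bit, a common figure of merit for interconnects. This is a rough sketch using only the quoted numbers, assuming the 10-terabyte-per-second channel equals 80 terabits per second:

```python
# Energy per bit = power / bit rate, expressed in picojoules per bit.
def picojoules_per_bit(watts, terabits_per_second):
    bits_per_second = terabits_per_second * 1e12
    return watts / bits_per_second * 1e12  # joules/bit -> picojoules/bit

channel_tbps = 10 * 8  # 10 terabytes/s expressed in terabits/s

print(picojoules_per_bit(160, channel_tbps))  # electrical: ~2 pJ/bit
print(picojoules_per_bit(6.4, channel_tbps))  # photonic: ~0.08 pJ/bit
```

By this measure the photonic link uses roughly 25 times less energy per bit than the electrical one, matching the 160 W vs 6.4 W comparison.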
Even before exascale computing appears on every desktop, the energy savings could be significant.

Coming Soon to a Desktop Near You

Photonic data transfer between components within an integrated chip needs a decade more research, but data transfer between cores or from cores to memory could come to fruition soon. HP targets bringing a 3-D chip with high-bandwidth photonic communications between 256 processor cores to the market by 2017.
HP's Corona project races against other giants -- such as Intel (Runnemede), MIT (Angstrom), NVIDIA (Echelon), and Sandia National Labs (X-calibur) -- in the search to make high performance computing ubiquitous.


Apple Advances Toward the iWallet: Is This Good or Bad?


Apple has received a major granted patent for a system that will allow it to link users' iTunes accounts to mobile purchases, reported Forbes. To put it extremely simply, this is Apple's war on cold hard cash: the iWallet. But what does the potential end of paper and metal currency mean for the planet, and just how much better or worse off would we be using iPhones and iPads instead?

The Upsides

A dollar bill has 3 grams of embodied greenhouse gases in it. (A dollar coin has 15 grams, but lasts longer and is recyclable, among other advantages.)
Taking the physical money out of the equation makes that carbon footprint go away. And if Apple (and Google, which has a similar experiment going) can head off security questions, going out at night without cash could be a whole lot safer. Of course, you can still get mugged for your iPhone.
Also, shopping with your smartphone can make you a more sustainably-minded and informed consumer.

The Downsides

It's not all good. First of all, this is more of what Apple does best: promote and enable consumerism, which provokes existential questions about the green movement, not to mention our economy and society. But that's another discussion.
Assuming that we would buy as much with cash as with an iWallet, e-money comes with its own carbon footprint. However many good green things you can do with them, smartphones and tablets take an enormous toll on the planet through their production. Not to mention the ethical firestorm Apple just rode out over its factories in China.

So?

Apple and Google aren't the only ones attempting to "cash in" on this wave of the future: Square and Intuit already process payments via smartphone. If they all fail, more attempts will likely be made anyway. Like books, money is moving more and more out of our hands and onto our screens.
Ultimately, tackling consumerism itself, and not just its 21st-century incarnation, must be the way to go. Kill the snake by cutting off its head.
