New layer in human eye discovered

Published June 12, 2013

LiveScience

  • 0_21_450_Eye.jpg
    iStock
Scientists have discovered a previously unknown layer lurking in the human eye.

The newfound body part, dubbed Dua’s layer, is a skinny but tough structure measuring just 15 microns thick, where one micron is one-millionth of a meter and more than 25,000 microns equal an inch. It sits at the back of the cornea, the sensitive, transparent tissue at the very front of the human eye that helps to focus incoming light, researchers say.
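For a sense of scale, here is a quick unit-conversion check in Python (a minimal sketch based only on the figures quoted above; 25,400 microns per inch is the exact conversion):

```python
# Quick sanity check of the reported thickness of Dua's layer.
MICRONS_PER_METER = 1_000_000   # 1 micron = one-millionth of a meter
MICRONS_PER_INCH = 25_400       # exact: 1 inch = 2.54 cm = 25,400 microns

layer_thickness_um = 15         # reported thickness of Dua's layer

print(f"{layer_thickness_um} microns = {layer_thickness_um / MICRONS_PER_METER} m")
print(f"{layer_thickness_um} microns = {layer_thickness_um / MICRONS_PER_INCH:.6f} in")
# -> 1.5e-05 m, or roughly 0.0006 in: far thinner than a typical human hair
```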

The feature is named for its discoverer, Harminder Dua, a professor of ophthalmology and visual sciences at the University of Nottingham. Dua said in a statement that the finding will not only change what ophthalmologists know about human eye anatomy, but it will also make operations safer and simpler for patients with an injury in this layer.

“From a clinical perspective, there are many diseases that affect the back of the cornea, which clinicians across the world are already beginning to relate to the presence, absence or tear in this layer,” Dua said in a statement.

Dua and colleagues, for example, believe that a tear in Dua’s layer is what causes corneal hydrops, which occurs when water from inside the eye rushes in and leads to a fluid buildup in the cornea. This phenomenon is seen in patients with keratoconus, a degenerative eye disorder that causes the cornea to take on a cone shape.

Dua’s layer adds to the five previously known layers of the cornea: the corneal epithelium at the very front, followed by Bowman’s layer, the corneal stroma, Descemet’s membrane and the corneal endothelium at the very back.

Dua and colleagues found the new layer between the corneal stroma and Descemet’s membrane through corneal transplants and grafts on eyes donated for research. They injected tiny air bubbles to separate the different layers of the cornea and scanned each using an electron microscope.

The research was detailed in the journal Ophthalmology.

Copyright 2013 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Read more: http://www.foxnews.com/science/2013/06/12/new-body-part-layer-in-human-eye-discovered/?intcmp=features#ixzz2Wa7PexyT

Humans in 100,000 years: What will we look like?

By Michael Roppolo

Published June 12, 2013

FoxNews.com

  • Future faces - Today.jpg

    Modern-day humans may someday evolve to have larger eyes, more pigmented skin and thicker eyelids, thanks to genetic engineering technology. Here’s how they’ll change. (Nickolay Lamm / MyVoucherCodes.co.uk)

  • Future faces - 20,000 Years.jpg

    In 20,000 years, in a world where genetic engineering is commonplace and humans have established colonies in space, human knowledge of the universe will increase and as such, the size of the brain will increase, Dr. Alan Kwan theorizes. As a result, the human head will have to become larger to accommodate the larger brain size. (Nickolay Lamm / MyVoucherCodes.co.uk)

  • Future faces - 60,000 Years.jpg

    In 60,000 years, Dr. Alan Kwan states that after millennia of traveling through space, zygotic genome engineering will be used to create humans with larger eyes, more pigmented skin and thicker eyelids. This will be done in order to see better in the dimmer environment of space, to shield humans from UV rays and to alleviate the effects of low or no gravity, as experienced by today’s astronauts on the International Space Station. (Nickolay Lamm / MyVoucherCodes.co.uk)

  • Future faces - 100,000 Years.jpg

    100,000 years from now, Dr. Alan Kwan believes that future humans will have much larger eyes and “eye-shine” due to the tapetum lucidum, a layer of tissue behind the retina of the eye. This would be done to help protect our eyes from cosmic rays. (Nickolay Lamm / MyVoucherCodes.co.uk)

Homo sapiens have slowly evolved over thousands of millennia, but what happens when modern technology comes into play?

Visual artist Nickolay Lamm of Pittsburgh, Pa., tried to answer that question. Interested in illustrating what humans might look like in 100,000 years, he asked science for the answers.

“Because I’m not expert in evolution, [I] got in touch with Dr. [Alan] Kwan who gave me his educated guess at what we may look like,” Lamm told FoxNews.com in an email.

Working with Kwan, who has a PhD in computational genomics from Washington University, Lamm established “one possible timeline” for future human evolution of sorts. It’s not science — just a “thought experiment,” Kwan has clarified — but it’s fascinating to think about.

The projections, published on MyVoucherCodes.co.uk, are based on the assumption that by the 210th century, scientists will be able to modify human appearances before birth through zygotic genome engineering technology.

Kwan based his theories on the accepted idea that between 800,000 and 200,000 years ago, the Earth underwent a period of climatic fluctuation, which resulted in a tripling of human brain size, along with skull size. Scientists agree that the rapid changes in climate may have created a favorable environment for those with the ability to adapt to new challenges and situations.

This trend has noticeably continued: British scientists have found that modern humans have less prominent features and higher foreheads than people of medieval times.

“My goal is to get people talking and thinking about things they otherwise wouldn’t have. For example, this ‘Future Face’ project is getting people talking about whether or not something like ‘Gattaca’ may happen,” Lamm told FoxNews.com, referring to the 1997 movie starring Ethan Hawke.

Some have criticized Dr. Kwan for appearing to ignore common scientific knowledge. Such 100,000 year projections are “fantasy,” Razib Khan, a geneticist, told Matthew Herper of Forbes.

“This is more of a speculative look than a scientific look into one possible future where human engineering replaces natural evolution in determining human physiology, but we have been very happy that our humble project has garnered so much attention and provided a platform for others to share their own vision of the future,” Kwan said, according to Lamm.

Read more: http://www.foxnews.com/science/2013/06/12/humans-in-100000-years-what-will-look-like/?intcmp=features#ixzz2W8KlAmfd

400-year-old plant, found frozen, grows again

By Michael Roppolo

Published June 10, 2013

FoxNews.com

  • iceagminiiceage.jpg

    Canadian researchers discovered that plants that grew in areas now covered by glaciers can grow again. (NASA)

Canadian researchers have found that plants can come back to life, despite having been buried under ice for centuries, reports the Agence France-Presse.

Catherine La Farge, a University of Alberta researcher, had collected what was believed to be dead moss, or bryophytes, from a retreating glacier on Ellesmere Island, located off the northwest coast of Greenland.

Carbon dating helped determine that the plants were approximately 400 years old and had been frozen sometime during the Little Ice Age, which occurred between 1550 and 1850.
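The article doesn’t spell out the dating math, but the standard radiocarbon decay relationship gives a feel for it. A minimal Python sketch (the fraction-of-modern value below is hypothetical, chosen only to land near the reported ~400-year age; real dating also involves calibration against atmospheric records):

```python
import math

HALF_LIFE_C14 = 5730.0  # years, conventional carbon-14 half-life

def radiocarbon_age(fraction_modern: float) -> float:
    """Years elapsed, from the fraction of carbon-14 remaining relative
    to the modern reference (simplified: no atmospheric calibration)."""
    return -HALF_LIFE_C14 / math.log(2) * math.log(fraction_modern)

# Hypothetical measurement: about 95.3 percent of modern carbon-14 remaining
print(round(radiocarbon_age(0.953)))  # -> roughly 400 years
```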

La Farge later revived the plants in a lab, disproving the assumption that plants that grew in areas now covered by glaciers can never grow again.

After the researchers ground up the plant material and planted it in soil, seven of the 24 samples showed growth within weeks. Her findings were published in the latest issue of the journal Proceedings of the National Academy of Sciences.

“We know that bryophytes can remain dormant for many years (for example, in deserts) and then are reactivated, but nobody expected them to rejuvenate after nearly 400 years beneath a glacier,” La Farge said in a statement to the AFP.

Evolving from sea algae, moss has been around for 400 million years.

Mosses reproduce by cloning their cells, unlike other plants, La Farge told the AFP. “Any bryophyte cell can reprogram itself to initiate the development of an entire new plant. This is equivalent to stem cells in faunal systems.”

Read more: http://www.foxnews.com/science/2013/06/10/400-year-old-plant-found-frozen-grows-again/?intcmp=HPBucket#ixzz2VussaxMO

What the bomb reveals about the brain

By Stephanie Pappas

Published June 07, 2013

LiveScience

  • A 23-kiloton nuclear bomb detonated on April 18, 1953

    A 23-kiloton nuclear bomb detonated on April 18, 1953, at a Nevada test site released this mushroom cloud. (National Nuclear Security Administration / Nevada Site Office)

Aboveground nuclear bomb testing in the 1950s and 1960s inadvertently gave modern scientists a way to prove the adult brain regularly creates new neurons, research reveals.

Researchers used to believe that the brain changed little once it finished maturing. That view is now considered out of date, as studies have revealed how changeable or plastic the adult brain can be.

Much of this plasticity is related to the brain’s organization; brain cells can alter their connections and communications with other brain cells. What has been less clear is whether, and to what extent, the human brain grows brand-new neurons in adulthood.

“There was a lot in the literature showing there was neurogenesis in rodents and every animal studied,” said study researcher Kirsty Spalding, a biologist at the Karolinska Institute in Sweden. “But there was very little evidence of whether this happens in humans.” [Top 10 Mysteries of the Mind]

Tantalizing clues
Scientists had reason to believe it does. In adult mice, the hippocampus, a structure deep in the brain involved in memory and navigation, turns over cells all the time. Some of the biological markers linked to this turnover are seen in the human hippocampus. But the only direct evidence of new brain cells forming in the region came from a 1998 study in which researchers looked at the brains of five people who had been injected with a compound called BrdU, which cells take up into their DNA. (The compound was once used in experimental cancer studies, but is not used anymore for safety reasons.)

The BrdU study revealed that neurons in the hippocampuses of the participants contained the compound in their DNA, indicating these brain cells had formed after the injections. The oldest person in the study was 72, suggesting new neuron creation, known as neurogenesis, continues well into old age.

The 1998 study was the only direct evidence of such neurogenesis in the human hippocampus, however. Spalding and her colleagues wanted to change that. Ten years ago, they began a project to track the age of neurons in the human brain using an unusual tool: spare molecules left over from Cold War-era nuclear bomb tests.

Learning to love the bomb
Between 1945 and 1962, the United States conducted hundreds of aboveground nuclear bomb tests. These tests largely stopped with the Limited Test Ban Treaty of 1963, but their effects remained in the atmosphere. The neutrons sent flying by the bombs reacted with nitrogen in the atmosphere, creating a spike in carbon 14, an isotope (or variation) of carbon. [The 10 Greatest Explosions Ever]

This carbon 14, in turn, did what carbon in the atmosphere does. It combined with oxygen to form carbon dioxide, and was then taken in by plants, which use carbon dioxide in photosynthesis. Humans ate some of these plants, along with some of the animals that also ate these plants, and the carbon 14 inside ended up in their bodies.

When a cell divides, it uses this carbon 14, integrating it into the DNA of the newly forming cells. Carbon 14 decays far too slowly (its half-life is 5,730 years) for decay alone to date cells that are only decades old; instead, because atmospheric carbon 14 levels have fallen at a known rate since the tests stopped, scientists can match the carbon 14 concentration in a cell’s DNA to the atmospheric record and pinpoint when the cell was born.
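As a rough illustration of that matching step, here is a minimal Python sketch; the atmospheric values are invented round numbers standing in for the real bomb curve, not the study’s data:

```python
# Toy bomb-pulse dating: interpolate a DNA carbon-14 level against a
# simplified, invented atmospheric curve (relative to a pre-bomb baseline of 1.0).
ATMOSPHERIC_C14 = {
    1963: 1.90, 1970: 1.55, 1980: 1.30,
    1990: 1.18, 2000: 1.10, 2010: 1.05,
}

def estimate_birth_year(dna_c14_level: float) -> float:
    """Linear interpolation on the declining (post-1963) side of the curve."""
    years = sorted(ATMOSPHERIC_C14)
    for y0, y1 in zip(years, years[1:]):
        c0, c1 = ATMOSPHERIC_C14[y0], ATMOSPHERIC_C14[y1]
        if c1 <= dna_c14_level <= c0:  # level falls between these two years
            return y0 + (y1 - y0) * (dna_c14_level - c0) / (c1 - c0)
    raise ValueError("level outside the tabulated post-peak curve")

print(round(estimate_birth_year(1.24)))  # -> 1985 for this made-up level
```

A real analysis uses the measured atmospheric record, and accelerator mass spectrometry to detect the tiny amounts of carbon 14 in a DNA sample, rather than round numbers like these.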

Over the past decade, Spalding and her colleagues have used the technique in a variety of cells, including fat cells, refining it along the way until it became sensitive enough to measure tiny amounts of carbon 14 in small hippocampus samples. The researchers collected samples, with family permission, from autopsies in Sweden.

They found the tantalizing 1998 evidence was correct: Human hippocampuses do grow new neurons. In fact, about a third of the brain region is subject to cell turnover, with about 700 new neurons being formed each day in each hippocampus (humans have two, a mirror-image set on either side of the brain). Hippocampus neurons die each day, too, keeping the overall number more or less in balance, with some slow loss of cells with aging, Spalding said.
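Taken at face value, those figures add up quickly; a back-of-the-envelope calculation (illustrative arithmetic only, using the numbers quoted above):

```python
# Rough arithmetic from the figures reported in the study coverage above.
new_neurons_per_day_per_hippocampus = 700
hippocampi = 2  # one on each side of the brain

per_year = new_neurons_per_day_per_hippocampus * hippocampi * 365
print(f"~{per_year:,} new hippocampal neurons per year")  # ~511,000
print(f"~{per_year * 50:,} over 50 adult years")          # ~25,550,000
```

Of course, as the article notes, roughly as many neurons die each day, so the total count stays about level.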

This turnover occurs at a ridge in the hippocampus known as the dentate gyrus, a spot known to contribute to the formation of new memories. Researchers aren’t sure what the function of this constant renewal is, but it could relate to allowing the brain to cope with novel situations, Spalding told LiveScience.

“Neurogenesis gives a particular kind of plasticity to the brain, a cognitive flexibility,” she said.

Spalding and her colleagues had used the same techniques in other regions of the brain, including the cortex, the cerebellum and the olfactory bulb, and found no evidence of newborn neurons being integrated into those areas. The researchers now plan to study whether there are any links between neurogenesis and psychiatric conditions such as depression.

The new findings were detailed June 6 in the journal Cell.

Copyright 2013 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Read more: http://www.foxnews.com/science/2013/06/07/nuclear-bomb-tests-new-brain-cells/?intcmp=features#ixzz2VbNyFTnX

Woman to have ‘dolphin-assisted’ birth

Published May 28, 2013

FoxNews.com

  • Dolphins help battle addictive behavior
A pregnant woman and her husband have traveled to Hawaii where they plan on having a “dolphin-assisted birth,” a water delivery among dolphins, according to Medical Daily.

Heather Barrington, 27, and her husband Adam, 29, of South Carolina, are preparing for the July arrival of their first child through a series of prenatal and postnatal swims with a pod of dolphins at The Sirius Institute in Pohoa, Hawaii.

The Sirius Institute describes itself as “a research consortium with the purpose of ‘dolphinizing’ the planet.” It recently set up the Dolphin Attended, Water, Natural and Gentle Birth Center (DAWN), in response to what it says is increasing demand on its website from people looking to give birth near dolphins. The Sirius Institute claims that giving birth with dolphins is part of an ancient native Hawaiian practice.

While dolphin-assisted births are rare, dolphin-assisted therapy (DAT) has been used for more than 25 years in patients with mental and physical disabilities and autism, according to Medical Daily. During DAT, patients swim and play with dolphins living in captivity while completing tasks meant to improve skills like hand-eye coordination. However, scientists say there is little scientific evidence indicating that DAT is therapeutically effective.

Water births – without the presence of dolphins – have proven benefits, including more efficient contractions, improved blood circulation for the mother, less pain and more oxygen for the baby, according to the American Pregnancy Association (APA). However, the APA noted that few studies have been done examining the risks associated with water births.

In the event that a “dolphin-assisted” birth cannot occur, the couple has made plans to deliver with a midwife. Experts point out that dolphins are predators and can become aggressive, though dolphin-related injuries among people are relatively rare, Medical Daily reported.

“Having that connection with the pod of dolphins anytime – even if the birth doesn’t happen in the water – still brings peace, comfort and strength to the mother and baby during labor,” Heather told the South Charlotte News.

Click for more from Medical Daily.

Read more: http://www.foxnews.com/health/2013/05/28/woman-to-have-dolphin-assisted-birth/?intcmp=obnetwork#ixzz2UkRHKx8C

Could humans be cloned?

By Rachael Rettner

Published May 19, 2013

LiveScience

  • Battlestar Galactica cylons.jpg

    Actresses Tricia Helfer (left) and Grace Park (right), who played humanoid Cylons with countless clones on the TV show “Battlestar Galactica.” (Syfy)

  • Egg nucleus transfer final.jpg

    The first step in SCNT is enucleation, the removal of the nuclear genetic material (chromosomes) from a human egg. The egg is positioned with a holding pipette (on the left) and its chromosomes are visualized under a polarized light microscope. A hole is made in the egg’s shell (zona pellucida) using a laser, and a smaller pipette (on the right) is inserted through the opening. The chromosomes are then sucked into the pipette and slowly removed from the egg. (Cell, Tachibana et al.)

The news that researchers have used cloning to make human embryos for the purpose of producing stem cells may have some people wondering if it would ever be possible to clone a person.

Although it would be unethical, experts say it is likely biologically possible to clone a human being. But even putting ethics aside, the sheer amount of resources needed to do it is a significant barrier.

Since the 1950s, when researchers first cloned a frog, scientists have cloned dozens of animal species, including mice, cats, sheep, pigs and cows.

 

‘It’s grossly unethical.’

– Dr. Robert Lanza, chief scientific officer at the biotech company Advanced Cell Technology

 

In each case, researchers encountered problems that needed to be overcome with trial and error, said Dr. Robert Lanza, chief scientific officer at the biotech company Advanced Cell Technology, which works on cell therapies for human diseases, and has cloned animals.

With mice, researchers were able to use thousands of eggs, and conduct many experiments, to work out these problems, Lanza said. “It’s a numbers game,” he said.

But with primates, eggs are a very precious resource, and it is not easy to acquire them to conduct experiments, Lanza said.

In addition, researchers can’t simply apply what they’ve learned from cloning mice or cows to cloning people.

For instance, cloning an animal requires that researchers first remove the nucleus of an egg cell. When researchers do this, they also remove proteins that are essential to help cells divide, Lanza said. In mice, this isn’t a problem, because the embryo that is ultimately created is able to make these proteins again. But primates aren’t able to do this, and researchers think it may be one reason that attempts to clone monkeys have failed, Lanza said. [See How Stem Cell Cloning Works (Infographic)]

What’s more, cloned animals often have different kinds of genetic abnormalities that can prevent embryo implantation in a uterus, or cause the fetus to spontaneously abort, or the animal to die shortly after birth, Lanza said.

These abnormalities are common because cloned embryos have just one parent rather than two, which means that a molecular process known as “imprinting” does not occur properly in cloned embryos, Lanza said. Imprinting takes place during embryo development, and selectively silences certain genes from one parent or the other.

Problems with imprinting can result in extremely large placentas, which ultimately leads to problems with blood flow for the fetus, Lanza said. In one experiment, Lanza and colleagues cloned a species of cattle called banteng, and it was born at twice the size of a normal banteng. It had to be euthanized, Lanza said.

The extremely high rate of death and the risk of developmental abnormalities from cloning make cloning people unethical, Lanza said.

“It’s like sending your baby up in a rocket knowing there’s a 50-50 chance it’s going to blow up. It’s grossly unethical,” Lanza said.

Copyright 2013 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Read more: http://www.foxnews.com/science/2013/05/19/could-humans-be-cloned/#ixzz2TqNdBvr6

Can you train your brain? Lumosity, BrainHQ say yes

By John R. Quain

Personal Tech

Published May 15, 2013

FoxNews.com

  • brain power
    Flickr/illuminaut
Call it the great brain train.

Baby boomers, students, and the elderly all share at least one anxiety: Are my mental abilities holding me back? So it’s not surprising that online cognitive exercises, or brain training, are finding a particularly receptive audience these days.

One popular service from Lumosity now has 40 million members. Its exercises are generally entertaining — if a little humbling at first. New users fill out a very simple questionnaire about their concerns and focus (do you want to better remember people’s names, or improve your concentration and avoid distractions?). Then Lumosity creates a daily regimen of exercises for you.

Typical tasks include remembering ever more complex patterns, visual positions, or recalling multiple symbols or images in quick succession. The idea is to continually challenge the user in an attempt to increase particular mental functions, including working memory and executive function. Lumosity is $14.95 a month. A similar program, Posit Science’s BrainHQ, is $14 a month. I’ve tried both and found them each to be engaging — at least for 20 minutes a day.

 

‘It’s still the early days [in cognitive training research].’

– Dr. Joe Hardy, vice president of research and development at Lumosity

 

With Angelina Jolie’s revelations about her breast cancer risk this week, it’s particularly interesting to note a new study, also released this week, of women who had undergone breast cancer treatment. Dr. Shelli Kesler, a neuropsychologist at Stanford University, used a subset of Lumosity’s exercises to work with 41 breast cancer survivors in order to see if it could help them overcome what can be the mentally enervating effects of cancer treatment. She focused on executive functions, the ability to make decisions.

“This approach has the advantage of adapting and changing the difficulty level,” Dr. Kesler told FoxNews.com of the computer-based training, which the patients performed on their own, “but were highly motivated.” She said most patients exhibited significant improvement in executive functions after the 20- to 30-minute sessions, which occurred 4 times a week for 12 weeks.

In spite of several studies that show brain training can be effective — including a large study known as ACTIVE, or Advanced Cognitive Training for Independent and Vital Elderly, which showed it can be effective even years after the training is finished — such cognitive exercises have been controversial. A recent overview of research conducted by professors at the University of Oslo concluded that the exercises only made people better at…doing the exercises. However, the Oslo study only looked at one aspect, working memory, and did not take into account the tremendous variance in the ages of the participants in the studies. In other words, it cast a skeptical eye on cognitive training but was not by any means conclusive.

There is always reason for some skepticism. Even in research that yields positive results, not every person experiences gains. And it can vary depending on the goal. It helped women subjected to chemotherapy, but does it help students with learning issues? Can people in their 50s experience improvement, or is it too late? (Please don’t say it’s too late.)

“It’s still early days” in cognitive training research, Dr. Joe Hardy, vice president of research and development at Lumosity, told FoxNews.com. Consequently, the company is committed to doing further studies and continually improving its exercises based on new data. He said that’s why Lumosity is involved in 38 different university research projects at the moment.

True, other popular, supposedly intelligence-enhancing techniques have fallen flat. Crossword puzzles, for example, were supposed to boost our intellectual prowess. However, a recent National Institutes of Health-funded study of more than 600 individuals demonstrated no appreciable gains from doing the Sunday puzzles, whereas cognitive training exercises did show some positive results.

It’s obvious that at a very fundamental level you can train your brain. You can learn a new language or learn how to play the clarinet. But the issue isn’t whether practicing an instrument makes you better at playing an instrument. The question is, can brain games make you better at other intellectual endeavors?

In at least one specific area I’ve found it personally effective: Driving. So-called “useful field of view” exercises do seem to increase awareness on the road. I found that regular training gave me a heightened focus while behind the wheel, especially in city traffic, and independent studies seem to confirm the effect.

In an era in which healthy kids are taking ADHD drugs just to get better scores on their SATs, online cognitive training looks harmless and possibly quite beneficial. But it’s important to note that another factor plays an extremely important role in intelligence and mental alacrity: Exercise. Dr. Kesler emphasizes that exercise is essential in creating new neurons.

Of course, just as all the weight training and cardio workouts in the world won’t turn me into Roger Federer, simply exercising your brain on Lumosity won’t help you pass a test in American history if you didn’t study the Revolutionary War. You’ve got to do some work on your own.

So keep your expectations in check. Remember: Flash cards do make you better at performing mathematical calculations; just don’t expect them to turn you into Einstein.

Follow John R. Quain on Twitter @jqontech or find more tech coverage at J-Q.com.

Read more: http://www.foxnews.com/tech/2013/05/15/can-train-your-brain-lumosity-brainhq-say-yes/#ixzz2TQSsKDbb

Man brought back to life after being clinically dead for 40 minutes

Published May 13, 2013

news.com.au

  • Heart ECG
    iStock
An Australian man who was clinically dead for 40 minutes has been brought back to life by a brand new resuscitation technique.

Colin Fiedler, 39, from Victoria, was one of three cardiac arrest patients brought back to life after being dead for between 40 and 60 minutes at The Alfred hospital in Melbourne, using two new techniques in the emergency department.

The Alfred is testing a mechanical CPR machine, which performs constant chest compressions, and a portable heart-lung machine — normally used in the operating theatre — to keep oxygen and blood flowing to the patient’s brain and vital organs.

Fiedler had a heart attack and was clinically dead for 40 minutes before being revived last June.

“I’m so grateful, more than I could ever say,” he told the Herald Sun.

So far, seven cardiac arrest patients have been treated with the AutoPulse machine and extracorporeal membrane oxygenation.

It allows doctors to diagnose the cause of the cardiac arrest and treat it, but keep blood and oxygen flowing to the vital organs and brain, which reduces the risk of permanent disability.

Fiedler is one of the three patients who were revived and returned home without disability. In the ambulance, paramedics had given him a choice of two hospitals.

“For some reason, I said The Alfred, which is pretty lucky, because they are the only one that has it,” he said.

The system is available only at The Alfred, but senior intensive care physician Professor Stephen Bernard said the results from the first two years of the trial were exciting, and he hopes to eventually expand the system across Melbourne.

Click for more from news.com.au.

Read more: http://www.foxnews.com/health/2013/05/13/man-brought-back-to-life-after-being-clinically-dead-for-40-minutes/?intcmp=features#ixzz2TH8a4kBu

Genetically Modified Sheep That Glow in the Dark Created By Uruguay Scientists

 

Published April 29, 2013

Fox News Latino

  • Glow In The Dark Sheep.JPG

    Scientists at the Animal Reproduction Institute of Uruguay used a fluorescent protein from an Aequorea jellyfish to give sheep a distinct glowing green color when exposed to certain ultraviolet light. (PHOTO: FUNDACIÓN IRAUY / J. CALVELO)

These sheep really do shine from within.

A group of scientists in Uruguay have announced that they have successfully modified the genetic makeup of sheep to make them glow in the dark.

Scientists from the Animal Reproduction Institute of Uruguay said they used a fluorescent protein from an Aequorea jellyfish to give a flock of nine sheep a distinct glowing green color when exposed to certain ultraviolet light.

So far, aside from their unique genetic modification, the animals are developing normally and roaming the fields like any other sheep.

One of the team’s lead researchers, Alejo Menchaca, said that the modification was done not for medical research but out of a desire to “fine-tune the technique.”

Sheep are not the first animals to be modified to glow in the dark.

Scientists have created a glow-in-the-dark animal trend using zebrafish, cats, dogs, pigs, scorpions, worms, monkeys, mice, and more.

These seemingly wacky experiments do have a purpose, though.

Researchers believe these genetically modified animals can help us better understand diseases and how they develop in both animals and humans. Scientists have already used glow-in-the-dark cats to research HIV and AIDS.

“The application of the new technology,” scientists from the Roslin Institute at the University of Edinburgh told The Guardian, should provide “valuable information for the study of AIDS.”

Follow us on twitter.com/foxnewslatino
Like us at facebook.com/foxnewslatino

Read more: http://latino.foxnews.com/latino/news/2013/04/29/scientists-in-uruguay-create-genetically-modified-sheep-that-glow-in-dark/#ixzz2Rv69A6YA