31 December 2011

The Science of Champagne

With the close of 2011, Quantum Day wishes each and every one of you a great and prosperous 2012! Enjoy this last entry of the year!

Happy New Year!!!

Champagne is a sparkling wine produced by inducing the in-bottle secondary fermentation of the wine to effect carbonation. The term "Champagne" is used to refer to wine produced exclusively within the Champagne region of France. The primary grapes used in the production of Champagne are Pinot noir, Chardonnay and Pinot Meunier.

Champagne is very much identified with momentous celebrations such as sports victories, weddings, anniversaries and, of course, New Year's Eve. More champagne corks are popped on New Year's Eve than on any other day of the year.

2011 is the International Year of Chemistry, and to mark the end of this momentous year, the American Chemical Society (ACS) posted a video on the science and chemistry of champagne. The video explains that champagne, unlike other wines, undergoes a second fermentation in the bottle to trap carbon dioxide gas, which dissolves into the wine and forms the fabled bubbles in the bubbly. More than 600 different chemical compounds join carbon dioxide in champagne, each lending its own unique quality to the aroma and flavor.

Video: A Toast to the Chemistry of Champagne

But even with all of that flavor, champagne would be just another white wine without those tiny bubbles. As the bubbles ascend the length of a glass in tiny trains, they drag along molecules of those 600 flavor and aroma substances. When the bubbles burst at the surface, those molecules erupt into the air, tickling the nose and stimulating the senses.

Some accounts say that a French Benedictine monk named Dom Pierre Pérignon discovered champagne in the mid-1600s, and became the namesake of the famous champagne cuvée, Dom Pérignon. The video points out that early champagne makers had a tough time with that second fermentation. Some bottles wound up with no bubbles at all. Others got too much carbon dioxide and exploded under the enormous pressure, wasting the precious vintage.

So what's the best way to pour a glass of bubbly and maximize the sensory experience?

For an answer, the video turns to a study published in the Journal of Agricultural and Food Chemistry, one of more than 40 peer-reviewed scientific journals published by the ACS. Pouring champagne at an angle retains up to twice as much carbon dioxide in the champagne compared to pouring down the middle of the glass. Those additional bubbles carry out more of the hundreds of flavor compounds in champagne.


American Chemical Society
Bytesize Science
International Year of Chemistry
Journal of Agricultural and Food Chemistry
Science's Breakthrough of the Year: HIV treatment as prevention
What Is Metabolomics And Its Importance
Fabric Cleans Itself When Exposed to Sunlight
The Science of Food
Bartenders Use Physics in Mixing Cocktails
Danceroom Spectroscopy: Quantum Physics on the Dance Floor.
The Spaceships of Virgin Galactic
Gasoline from Algae
Move Over Nicotine Patch. Cigarette Candies Are Coming!
Professor to make US$400,000 Hamburger

30 December 2011

Healthy Diet Leads to Better Mental Performance and Minimizes Brain Shrinkage

Scientists found that elderly people with high blood levels of vitamins and omega-3 fatty acids performed better on mental acuity tests and showed less brain shrinkage. Brain shrinkage is typical of people with Alzheimer's disease.

The study also showed that those who ate less healthy food and more junk food had the opposite result.

The research team was composed of scientists from the Oregon Health and Science University in Portland, Ore., and the Linus Pauling Institute at Oregon State University.

Instead of using food questionnaires to gather data, the scientists directly measured a wide range of blood nutrient levels. They found positive effects in people with high levels of vitamins B, C, D and E and of healthy oils such as the omega-3 fatty acids most commonly found in fish.

Omega-3 fatty acids include α-linolenic acid (ALA), eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), all of which are polyunsaturated. They are commonly found in fish oils and in plant oils such as flaxseed oil, algal oil and hemp seed oil. Omega-3 is important in one's diet because mammals, including humans, cannot synthesize omega-3 fatty acids on their own. Omega-3 fatty acids reduce triglycerides, heart rate, blood pressure and atherosclerosis.

"This approach clearly shows the biological and neurological activity that's associated with actual nutrient levels, both good and bad," said Maret Traber, a principal investigator with the Linus Pauling Institute and co-author on the study.

"The vitamins and nutrients you get from eating a wide range of fruits, vegetables and fish can be measured in blood biomarkers," Traber said. "I'm a firm believer these nutrients have strong potential to protect your brain and make it work better."

Video: Healthy Eating

The study was done with 104 people, with an average age of 87 and no special risk factors for memory or mental acuity. It tested 30 different nutrient biomarkers in their blood, and 42 participants also had MRI scans to measure their brain volume.

"These findings are based on average people eating average American diets," Traber said. "If anyone right now is considering a New Year's resolution to improve their diet, this would certainly give them another reason to eat more fruits and vegetables."

Among the findings and observations:
  • The most favorable cognitive outcomes and brain size measurements were associated with two dietary patterns – high levels of marine fatty acids, and high levels of vitamins B, C, D and E.
  • Consistently worse cognitive performance was associated with a higher intake of the type of trans-fats found in baked and fried foods, margarine, fast food and other less-healthy dietary choices.
  • The range of demographic and lifestyle habits examined included age, gender, education, smoking, drinking, blood pressure, body mass index and many others.
  • The use of blood analysis helped to eliminate issues such as people's flawed recollection of what they ate, and personal variability in nutrients absorbed.
  • Much of the variation in mental performance depended on factors such as age or education, but nutrient status accounted for 17 percent of thinking and memory scores and 37 percent of the variation in brain size.
  • Cognitive changes related to different diets may be due both to impacts on brain size and cardiovascular function.
  • The epidemiology of Alzheimer's disease has suggested a role for nutrition, the researchers said in their study, but previous research using conventional analysis, looking in isolation at single nutrients or small groups, has been disappointing. The study of 30 different blood nutrient levels done in this research reflects a wider range of nutrients and adds specificity to the findings.
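To make the "variance explained" figures above concrete: the sketch below (with made-up data, not the study's measurements) shows how a simple least-squares fit yields the percentage of variation a predictor accounts for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: 104 participants, a composite "nutrient
# status" score, and a cognitive score it only partly explains.
n = 104
nutrients = rng.normal(size=n)                     # standardized biomarker score
cognition = 0.45 * nutrients + rng.normal(size=n)  # nutrient effect + other factors

# Variance explained (R^2) by a one-variable least-squares fit, the
# quantity behind statements like "accounted for 17 percent of scores".
slope, intercept = np.polyfit(nutrients, cognition, 1)
predicted = slope * nutrients + intercept
ss_res = np.sum((cognition - predicted) ** 2)
ss_tot = np.sum((cognition - cognition.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"variance explained: {r_squared:.0%}")
```

The study's actual analysis, covering 30 biomarkers and many covariates, is far more involved; this just shows what the percentage means.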

The study needs to be confirmed with further research and other variables tested, the scientists said.

This work is published in Neurology, the medical journal of the American Academy of Neurology, and was supported by the National Institutes of Health.


Oregon State University
American Academy of Neurology
National Institutes of Health
Poor Eating Habits At Work May Lead to Obesity, Diabetes and Other Disease
Science's Breakthrough of the Year: HIV treatment as prevention
New Developments in Treatment of Asthma, Allergies and Arthritis
What Is Metabolomics And Its Importance
Breakthrough in Fight Against Alzheimer's Disease
Alzheimer's Disease Risk Minimized by Eating Fish
Stem Cell Breakthrough for Parkinson's Disease Treatment
US$10 Million Contest to Sequence Centenarian Genome
Human Embryo Cloned for Stem Cell Production
Vaccine to Treat Lung Cancer Being Developed
Researchers Look into Lung Regeneration

29 December 2011

First Three Full Facial Transplantations in the US Successful

Most of the general public were made aware of Full Facial Transplantation through the 1997 movie, Face/Off, starring John Travolta and Nicolas Cage. In the movie, the two actors play an FBI agent and a terrorist who exchange faces through high-tech surgery.

Fourteen years later, Full Facial Transplantation is already a viable option in the treatment of severe facial deformities and injuries. The procedure does not let two people "swap" faces exactly as in the movie, but the technology is close enough to recreate that scenario in the real world.

Boston, MA - In March 2011, a surgical team at Brigham and Women's Hospital (BWH) performed the first full face transplantation (FFT) in the United States and went on to complete a total of three FFTs this year. Now, in the first research publication to evaluate FFT in the US, and the largest series worldwide, the researchers describe details of patient preparation, the novel design and execution of the operation, and a unique immunosuppression protocol that allows for the lowest long-term maintenance drug regimen. They also share details of the early functional outcomes and demonstrate that FFT is a viable option in the treatment of severe facial deformities and injuries. This research is published in the December 27, 2011 issue of the New England Journal of Medicine.

"Unlike conventional reconstruction, facial transplantation seeks to transform severely deformed features to a near-normal appearance and function that conventional reconstructive plastic surgical techniques cannot match," said lead author Dr. Bohdan Pomahac, Director of the Plastic Surgery Transplantation Program at BWH and lead surgeon in all three FFT procedures. "It truly is a life-giving procedure for these patients."

In an effort to advance the field of face transplantation, Pomahac and colleagues document the novel processes involved in a successful face transplant program from screening candidates to the transplant procedure itself and the follow up management of the recipients.

Video: Face Transplant Animation

Researchers describe the rigorous screening and consent process that each patient must pass, which includes evaluation by a team of physicians who determine whether the patient is physically and mentally prepared for the procedure through numerous clinical and psychological evaluations. Once a candidate is approved by the face transplant team and the Institutional Review Board at BWH, BWH physicians work closely with the New England Organ Bank (NEOB) to identify the criteria for suitable donors and the process for obtaining consent for this unique transplantation.

Next, researchers outline the details of the surgeries with a focus on the multi-disciplinary collaborative efforts of an entire team of clinicians. Surgeons and staff coordinate their tasks while preparing the recipient and simultaneously retrieving the donor tissue within a limited time frame. The researchers describe the similarities and differences among the procedures, noting the variations that occurred in the one FFT recipient who also concurrently received a bilateral hand transplant.

Video: Actual Full Face Transplantation Surgery

Lastly, researchers explain the care of the recipient post-transplant. Following the surgery, physicians monitor and adjust immunosuppressants (anti-rejection medications) while methodically screening for any signs of organ rejection. The researchers discuss single episodes of rejection that occurred in two patients, as well as other complications following surgery, such as the occurrence of infection. Pomahac and colleagues discuss how the transplanted tissue transformed and adapted to match the features of the recipient in each case.

"Our focus moving forward continues to be on monitoring and documenting the progress of patients who have undergone FFT, and refining the use of immunosuppressants, with the hope that one day patients will eventually need to take little or none," said Pomahac. "We are also learning how the brain reintegrates the new parts, and we closely follow motor and sensory return. An important part of the study is also the calculation of cost-effectiveness."
The Plastic Surgery Transplantation Program continues BWH's rich history in transplantation, which began with the first successful human organ transplant at the hospital in 1954; that tradition continues today with a concentration on FFT, hand transplantation and many other innovations and research in the field of transplantation.

The study was funded by a research contract between the United States Department of Defense and Brigham and Women's Hospital under the Biomedical Translational Initiative Program.

Brigham and Women's Hospital (BWH) is a 793-bed nonprofit teaching affiliate of Harvard Medical School and a founding member of Partners HealthCare, an integrated health care delivery network. BWH is the home of the Carl J. and Ruth Shapiro Cardiovascular Center, the most advanced center of its kind. BWH is committed to excellence in patient care with expertise in virtually every specialty of medicine and surgery. The BWH medical preeminence dates back to 1832, and today that rich history in clinical care is coupled with its national leadership in quality improvement and patient safety initiatives and its dedication to educating and training the next generation of health care professionals. Through investigation and discovery conducted at its Biomedical Research Institute (BRI), BWH is an international leader in basic, clinical and translational research on human diseases, involving more than 900 physician-investigators and renowned biomedical scientists and faculty supported by more than $537 M in funding. BWH is also home to major landmark epidemiologic population studies, including the Nurses' and Physicians' Health Studies and the Women's Health Initiative. For more information about BWH, please visit www.brighamandwomens.org

Brigham and Women's Hospital
Biomedical Research Institute (BRI)
New England Journal of Medicine
New England Organ Bank
New Findings in Electron Density Lead to Better Imaging Devices and Applications
New Developments in Treatment of Asthma, Allergies and Arthritis
Medical Treatments Through Photonics
What is Ultrasound Surgery
Color Your Eyes with Lasers: Cosmetic Eye Surgery
Human Embryo Cloned for Stem Cell Production

28 December 2011

Poor Eating Habits At Work May Lead to Obesity, Diabetes and Other Disease

In previous work published in the Public Library of Science Medicine (PLoS Medicine), a link was established between an increased risk of type 2 diabetes and rotating patterns of shift work, particularly among US nurses. Building on that study, an editorial published in the December issue of PLoS Medicine argues that the poor eating habits of shift workers should be considered a new occupational health hazard.

Around 15% to 20% of workers in Europe and the US engage in shift work, particularly in the health care industry. Shift work is a labor practice in which an establishment or institution provides service 24 hours a day, 7 days a week, with employees assigned time periods to report to work so the business can continue operating. Shift work is notoriously associated with poor patterns of eating, made worse by easy access to junk food compared with more healthy options.

People who work night shifts tend to have fewer eating options than those on day shifts. And with the limited time given for breaks, it is more convenient for workers to eat at fast food restaurants or use vending machines at the workplace.

The editorial continues that working patterns should now be considered a specific risk factor for eating-related disorders such as obesity and type 2 diabetes, which are currently at epidemic proportions. It suggests that firm action is needed to address this epidemic, i.e. that "governments need to legislate to improve the habits of consumers and take specific steps to ensure that it is easier and cheaper to eat healthily than not". The editorial specifically suggests that unhealthy eating could legitimately be considered a new form of occupational hazard and that workplaces, specifically those that employ shift workers, should lead the way in eliminating this hazard.

Video: New Culprit in the Obesity Epidemic: The Workplace

In related news, the number of people who suffer from one or more of the adverse complications of obesity, including type 2 diabetes and heart disease, is rapidly increasing. Currently, drugs designed to treat obesity have shown limited efficacy and are associated with serious side effects. This is largely because of the limited understanding of the effects of obesity on the body's natural mechanisms of weight control.

For example, while great strides have been made in understanding how the brain controls the desire to feed, as well as the processes underlying the balancing of energy intake and expenditure, little is known about how these mechanisms are altered by obesity. Two independent groups of researchers have now generated data that begin to address this issue.

In brief, a team of researchers led by Michael Schwartz, at the University of Washington, Seattle, has found that in both humans and rodents, obesity is associated with neuronal injury in an area of the brain crucial for body weight control (the hypothalamus). A second team of researchers, led by Jeffrey Flier, at Beth Israel Deaconess Medical Center, Boston, has determined that turnover of nerve cells in the hypothalamus is inhibited by obesity.

Video: Obesity, Diabetes and Energy Metabolism


Poor Diet in Shift Workers: A New Occupational Health Hazard?
Public Library of Science
Journal of Clinical Investigation
Science's Breakthrough of the Year: HIV treatment as prevention
New Developments in Treatment of Asthma, Allergies and Arthritis
The Science of Food
Alzheimer's Disease Risk Minimized by Eating Fish
Professor to make US$400,000 Hamburger
Human Embryo Cloned for Stem Cell Production

27 December 2011

Lightweight Solar Power Generator Enters Full Production For Military Use

The U.S. Department of the Navy, through funding from the Office of Naval Research (ONR), has put its solar generator, the Ground Renewable Expeditionary ENergy System (GREENS), into full production.

GREENS is a portable, 300-watt, hybrid solar/battery generator that uses light from the sun to produce electricity. It was developed to provide Marines with continuous power in the battlefield. In fact, several GREENS are already being used in the field.

Cliff Anderson is the logistics program officer in ONR's Expeditionary Maneuver Warfare & Combating Terrorism Department. He says that GREENS significantly reduces the amount of fuel that has to be delivered to the field. With less fuel to deliver, fewer warfighters are exposed to the hazards of road convoys. It also lowers the logistics costs associated with distributing fuel.

"That was really the objective: to get warfighters out of harm's way and reduce the cost of transporting fuel."

The GREENS project was conceived in 2008 when a Universal Needs Statement was submitted from Iraq for an expeditionary renewable power system. Approval for this project was expedited and technical execution took less than six months with the first unit tested in July 2009.

The system transitioned from the ONR to the Marine Corps Systems Command (MCSC). From there, GREENS went into production. The solar powered generator provided Marines with power to use various military devices. Several small Marine Corps outposts have even used GREENS as their sole energy source.

Video: Ground Renewable Expeditionary ENergy System

The overall goal for GREENS is that it will reduce the logistics burden for providing power to remote locations. It provides the military the AC and DC power it needs to charge typical communication, targeting, and computing devices. GREENS also reduces fossil fuel use otherwise needed for typical generators, and will lessen the need for fuel resupply, reducing the associated threats to vehicle convoys in Afghanistan and Iraq.

Justin Govar, chemical engineer and program manager, Expeditionary Power Sources Office, says, "GREENS is important because within the Marine Corps we are fighting in areas that are remote, that require very difficult logistical trains to get to."

"Infantry battalions that are far forward do not have immediate access to a wide range of logistics and maintenance equipment; therefore, any source of power that requires no [military-grade fuel], low maintenance and no special skills to operate becomes an instant success," said Maj Sean Sadlier, a logistics analyst with the Marine Corps Expeditionary Energy Office, who trained users on and tested GREENS in the field with India Company, 3rd Battalion, 5th Marine Regiment. Additionally, "GREENS is modular, portable, rugged and intuitive enough to deploy in a combat environment. Units trained on GREENS as part of pre-deployment training have provided positive feedback."

The Ground Renewable Expeditionary ENergy System supports the Marine Corps' objective of generating all power needed for sustainment and command, control, communications, computers and intelligence equipment in place in the field by 2025. This vision, as laid out in the USMC Expeditionary Energy Strategy, aligns with the Marine Corps Vision and Strategy 2025. The goal is to enable Marines to travel more lightly and quickly by reducing the amount of fuel needed.

GREENS was set up and tested at NSWC Carderock in West Bethesda, Md., and was subjected to 300-watt continuous power testing at NAVAIR China Lake during the 2010 Empire Challenge. Ambient temperatures recorded during the China Lake exercise exceeded 116 degrees Fahrenheit, and even under these extreme conditions the system provided an average of 85% of its rated energy. This result exceeded expectations and led to an MCSC request that the product be rapidly developed and readied for acquisition.

Even with the positive feedback, the Naval Surface Warfare Center is still looking into raising the average rated energy from 85% to 100%.

The solar-powered generator is composed of 1600-watt solar arrays and rechargeable batteries combined to provide 300 watts of continuous electricity for Marines in remote locations. Additionally, a toolkit feature allows Marines to enter their expected mission profile and determine which GREENS components they will need to take with them. GREENS can be rapidly deployed and is High Mobility Multipurpose Wheeled Vehicle (HMMWV) transportable.
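The article doesn't describe the toolkit's actual rules, but the kind of sizing calculation a mission profile implies can be sketched from the figures above (300 W rated output, roughly 85% delivered in field testing). The load figure and ceiling-division sizing below are illustrative assumptions, not Navy specifications.

```python
SYSTEM_OUTPUT_W = 300   # GREENS rated continuous output, from the article
FIELD_DERATE = 0.85     # China Lake testing delivered ~85% of rated energy

def daily_energy_wh(output_w=SYSTEM_OUTPUT_W, derate=FIELD_DERATE, hours=24):
    """Watt-hours one GREENS unit can realistically deliver per day."""
    return output_w * derate * hours

def systems_needed(load_wh_per_day):
    """Units needed to cover a hypothetical daily mission load."""
    per_system = daily_energy_wh()
    return int(-(-load_wh_per_day // per_system))  # ceiling division

print(daily_energy_wh())       # 6120.0 Wh/day per unit
print(systems_needed(15000))   # hypothetical 15 kWh/day outpost -> 3 units
```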


Office of Naval Research
Marine Corps Systems Command
Ground Renewable Expeditionary ENergy System
Naval Surface Warfare Center Carderock Division
Solar Paint That Can Generate Electricity
Guided Rockets Hit Fast-Moving Targets in Test
Digital Contact Lens for Heads Up Display and Augmented Reality
Advances in Lithium Ion Batteries: 1 Week Power on a 15 Minute Charge
The Eco Friendly Hybrid Car
Helium 3 to be Mined in the Moon
Gasoline from Algae

26 December 2011

Twitter Growth in US Aided By Other Social Networks

CAMBRIDGE, Mass. — We’ve all heard it: The Internet has flattened the world, allowing social networks to spring up overnight, independent of geography or socioeconomic status. Who needs face time with the people around you when you can email, text or tweet to and from almost anywhere in the world? Twitter, the social networking and microblogging site, is said to have more than 300 million users worldwide who follow, forward and respond to each other’s 140-character tweets about anything and everything, 24/7.

But MIT researchers who studied the growth of the newly hatched Twitter from 2006 to 2009 say the site’s growth in the United States actually relied primarily on media attention and traditional social networks based on geographic proximity and socioeconomic similarity. In other words, at least during those early years, birds of a feather flocked — and tweeted — together.

In their study of Twitter’s “contagion process,” the researchers looked at data from 16,000 U.S. cities, focusing on the 408 with the highest number of Twitter users and seeking to update traditional models of how information spreads and technology is adopted.

Just as marketing experts sometimes label consumers as early adopters, early majority adopters, late majority adopters or laggards, the researchers characterized cities in those terms, based on when Twitter accounts in a given city reached critical mass. Critical mass is generally defined as the point when something reaches 13.5 percent of the population, which for this study was 13.5 percent of the highest total number of Twitter users in a city through August 2009, the end of the study period.
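As a rough sketch of that classification (with made-up weekly sign-up counts, not the study's Twitter data), a city's critical-mass week can be computed like this:

```python
def critical_mass_week(weekly_signups, threshold=0.135):
    """Index of the week when cumulative sign-ups first reach
    `threshold` of the city's eventual total."""
    total = sum(weekly_signups)
    running = 0
    for week, count in enumerate(weekly_signups):
        running += count
        if running >= threshold * total:
            return week
    return None

# Two made-up cities: one adopts early, one lags.
early_city = [50, 80, 120, 60, 40, 30, 20, 10]
late_city = [2, 3, 5, 8, 20, 60, 150, 200]
print(critical_mass_week(early_city))  # week 1
print(critical_mass_week(late_city))   # week 5
```

Ranking cities by this week index is what lets the researchers label them early adopters, early majority, late majority or laggards.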

Video: Follow Your Interests. Discover Your World. Twitter

As with most technologies, the growth in popularity initially spread via young, tech-savvy “innovators,” in this case from Twitter’s birthplace in San Francisco to greater Boston. But the site’s popularity then took a more traditional route of traveling only short distances, implying face-to-face interactions; this approach made early adopters of Somerville, Mass., and Berkeley, Calif. — cities close to Boston and San Francisco, respectively. Twitter use then spread through early majority cities such as Santa Fe and Los Angeles and late majority cities such as Baltimore and Las Vegas before reaching laggards such as Palm Beach, Fla., and Newark, N.J. All these cities ultimately ranked among the 408 nationwide with the largest numbers of Twitter accounts.

“Even on the Internet where we may think the world is flat, it’s not,” says Marta González, assistant professor of civil and environmental engineering and engineering systems at MIT, who is co-author of a paper on this subject appearing this month in the journal PLoS ONE. “The big question for people in industry is ‘How do we find the right person or hub to adopt our new app so that it will go viral?’ But we found that the lone tech-savvy person can’t do it; this also requires word of mouth. The social network needs geographical proximity. … In the U.S. anyway, space and similarity matter.”

For nearly 50 years, marketers have studied the “diffusion of innovations” (named by Everett Rogers in his 1962 book of the same title) to predict how the purchase of expensive, durable goods such as cars and refrigerators will spread. But the diffusion of high-tech websites and cheap smartphone apps is thought to occur in a very different way.

“Nobody has ever really looked at the diffusion among innovators of a no-risk, free or low-cost product that’s only useful if other people join you. It’s a new paradigm in economics: what to do with all these new things that are free and easy to share,” says MIT graduate student Jameson Toole, a co-author of the paper.

Meeyoung Cha of the Korea Advanced Institute of Science and Technology is the third co-author, and also the person who had the prescience to begin downloading Twitter-published user data (via Twitter API) in May 2006, when there were only a couple of hundred users. She downloaded data through August 2009, when user growth dropped off for a time.

Video: US Twitter Adoption

Each circle represents a U.S. city containing Twitter users. As time goes on, circles grow in size as more users sign up in that location. When a location has reached a "critical mass" of users (13.5 percent of all eventual users have signed up), the location turns red. The line drawn across the center of the screen is a time series of the number of new users who signed up across the whole country in a given week.

González and Toole said their model of Twitter contagion didn’t fit Cha’s data until they added media influence, based on the number of news stories appearing weekly in Google News searches, data they acquired using Google Insights for Search, which provides historical search-engine data.

“Other studies have included news media in their models, but usually as a constant,” González says. “We saw that news media is not a constant. Instead, it’s media responding to people’s interest and vice versa, so we included it as random spikes.”
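The paper's actual model is more elaborate, but the idea of adding media as random spikes to a standard diffusion curve can be sketched along classic Bass-model lines. All parameter values here are illustrative assumptions, not the study's fitted values.

```python
import random

random.seed(42)

def simulate(weeks=120, market=1.0, p=0.002, q=0.08,
             spike_prob=0.05, spike_boost=0.02):
    """Bass-style adoption curve where occasional random 'news spikes'
    temporarily boost the external-influence term."""
    adopters = 0.0
    series = []
    for _ in range(weeks):
        media = spike_boost if random.random() < spike_prob else 0.0
        # external influence (p + media spike) plus word of mouth (q * share)
        new = (p + media + q * adopters / market) * (market - adopters)
        adopters += new
        series.append(adopters)
    return series

curve = simulate()
print(f"final adoption: {curve[-1]:.2f} of market")
```

Without the word-of-mouth term q the spikes alone produce only slow linear growth, which matches the researchers' point that media attention and local social networks had to work together.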

The study data include the growth spike that began April 15, 2009, when actor Ashton Kutcher challenged CNN to see who could first attract 1 million Twitter followers. Kutcher ultimately won, reaching the million mark in the wee hours of April 17, about half an hour before CNN. Popular talk-show host Oprah Winfrey invited Kutcher to appear on her show that same day; when she ceremoniously sent her first tweet, the pace of new news stories picked up again, and so did new Twitter accounts.

The Twitter bird was suddenly on all the wires, and Twitter’s user accounts increased fourfold because of the media attention, indicating that as recently as 2009, location-based social networks and media attention still held sway over computer-based social networks.


Quantum Day Twitter Site
Project: Modeling the diffusion of social contagion
Department of Civil and Environmental Engineering
Engineering Systems Division
Marta González
Hackers Getting More Advanced and Dangerous
Nightwork: The MIT Hacker
Bartenders Use Physics in Mixing Cocktails
Understanding Consciousness: Types of Consciousness
The Science of Understanding Stress
The Tech of Storytelling

25 December 2011

Follow Up Report on OPERA Faster Than Light Particles

18 March 2012
CERN UPDATE: ICARUS Experiment Indicates Neutrino Speed Consistent With Speed Of Light

When it was announced that particles called neutrinos had traveled faster than the speed of light, the whole science community was startled. One of the basic foundations of physics is that nothing can travel faster than the speed of light. The experiment, called OPERA (Oscillation Project with Emulsion-Tracking Apparatus), was conducted by physicists at the European Organization for Nuclear Research (CERN). Finding no mistake in their experiment, the physicists asked the world to take a second look.

Ramanath Cowsik, PhD, director of the McDonnell Center for the Space Sciences at Washington University in St. Louis and professor of physics in Arts & Sciences, responded.

Dr. Cowsik and his collaborators found what appears to be an insurmountable problem with the experiment.

The OPERA experiment, a collaboration between the CERN physics laboratory in Geneva, Switzerland, and the Laboratori Nazionali del Gran Sasso (LNGS) in Gran Sasso, Italy, timed particles called neutrinos traveling through Earth from the physics laboratory CERN to a detector in an underground laboratory in Gran Sasso 730 kilometers (450 miles) away.

OPERA reported online and in Physics Letters B in September that the neutrinos arrived at Gran Sasso around 60 nanoseconds faster than expected if they were traveling at the speed of light in a vacuum.

Video: Professor Marcus du Sautoy

Neutrinos are thought to have a tiny, but nonzero, mass. According to Einstein's theory of special relativity, any particle that has mass may come close to, but cannot quite reach, the speed of light. So superluminal (faster-than-light) neutrinos should not exist.

The neutrinos in the experiment were created by slamming speeding protons into a stationary target, producing a pulse of pions — unstable particles that were magnetically focused into a long tunnel where they decayed in flight into muons and neutrinos.

The muons were stopped at the end of the tunnel, but the neutrinos, which slip through matter like ghosts through walls, passed through the barrier and disappeared in the direction of Gran Sasso.

In their journal article, Cowsik and an international team of collaborators took a close look at the first step of this process. "We have investigated whether pion decays would produce superluminal neutrinos, assuming energy and momentum are conserved," he says.

The OPERA neutrinos had energies of about 17 gigaelectron volts. "They had a lot of energy but very little mass," Cowsik says, "so they should go very fast." The question is whether they went faster than the speed of light.

"We've shown in this paper that if the neutrino that comes out of a pion decay were going faster than the speed of light, the pion lifetime would get longer, and the neutrino would carry a smaller fraction of the energy shared by the neutrino and the muon," Cowsik says.

"What's more," he says, "these difficulties would only increase as the pion energy increases.

"So we are saying that in the present framework of physics, superluminal neutrinos would be difficult to produce," Cowsik explains.

In addition, he says, there's an experimental check on this theoretical conclusion. The creation of neutrinos at CERN is duplicated naturally when cosmic rays hit Earth's atmosphere.

A neutrino observatory called IceCube detects these neutrinos when they collide with other particles, generating muons that leave trails of light flashes as they plow through the thick, clear ice of Antarctica.

"IceCube has seen neutrinos with energies 10,000 times higher than those the OPERA experiment is creating," Cowsik says. "Thus, the energies of their parent pions should be correspondingly high. Simple calculations, based on the conservation of energy and momentum, dictate that the lifetimes of those pions should be too long for them ever to decay into superluminal neutrinos.

"But the observation of high-energy neutrinos by IceCube indicates that these high-energy pions do decay according to the standard ideas of physics, generating neutrinos whose speed approaches that of light but never exceeds it.

Cowsik's objection to the OPERA results isn't the only one that has been raised.

Physicists Andrew G. Cohen and Sheldon L. Glashow published a paper in Physical Review Letters in October showing that superluminal neutrinos would rapidly radiate energy in the form of electron-positron pairs.

"We are saying that, given physics as we know it today, it should be hard to produce any neutrinos with superluminal velocities, and Cohen and Glashow are saying that even if you did, they'd quickly radiate away their energy and slow down," Cowsik says.

Video: Interview with OPERA's spokespersons

"I have very high regard for the OPERA experimenters," Cowsik adds. "They got faster-than-light speeds when they analyzed their data in March, but they struggled for months to eliminate possible errors in their experiment before publishing it.

"Not finding any mistakes," Cowsik says, "they had an ethical obligation to publish so that the community could help resolve the difficulty. That's the demanding code physicists live by," he says.

Related Links

Washington University in St. Louis
Physical Review Letters
Physics Letters B
CERN UPDATE: ICARUS Experiment Indicate Neutrino Speed Consistent With Speed Of Light
CERN Update: Faster Than Speed of Light May Be Due To Hardware Fault
Particles Travel Faster Than Light Again
Speed of Light Theory To Be Challenged Again
Famous Scientists of the 21st Century
Project Sixtrack: The Large Hadron Collider and Your Computer
What Is The Higgs Boson And Why It Matters
What is String Theory?
Whats New @CERN 07 Nov 2011
Whats New @CERN 06 Dec 2011

May You Have A Merry Merry Quantum Christmas!

Hope all you guys have a Merry Merry Christmas!

Oh and the closest relevant science video I could get is umm.. MST3K's Patrick Swayze Christmas.

24 December 2011

New Technique For Cooling Quantum Gases

Physicists find a new way to cool quantum gases.

By using a quantum algorithm to remove excess energy, physicists at Harvard University have found a new way to cool synthetic materials. It is the first application of the technique, known as "algorithmic cooling," to ultracold atomic gases. This research may pave the way to new discoveries, from materials science to quantum computation.

The research is published in the journal Nature.

"Ultracold atoms are the coldest objects in the known universe," explains senior author Markus Greiner, associate professor of Physics at Harvard. "Their temperature is only a billionth of a degree above absolute zero temperature, but we will need to make them even colder if we are to harness their unique properties to learn about quantum mechanics."

Greiner and his colleagues study quantum many-body physics, the exotic and complex behaviors that result when simple quantum particles interact. It is these behaviors that give rise to high-temperature superconductivity and quantum magnetism, and that many physicists hope to employ in quantum computers.

"We simulate real-world materials by building synthetic counterparts composed of ultra-cold atoms trapped in laser lattices," says co-author Waseem Bakr, a graduate student in physics at Harvard. "This approach enables us to image and manipulate the individual particles in a way that has not been possible in real materials."

Video: A Guide to Quantum Mechanics

Observing the quantum mechanical effects that Greiner, Bakr and colleagues seek requires extreme temperatures.

"One typically thinks of the quantum world as being small," says Bakr, " but the truth is that many bizarre features of quantum mechanics, like entanglement, are equally dependent upon extreme cold."

When an object gets hotter, its constituent particles move around more. This motion obscures the quantum world, just as a shaken camera blurs a photograph.

The push to ever-lower temperatures is driven by techniques like "laser cooling" and "evaporative cooling," which are approaching their limits at nanokelvin temperatures. In a proof-of-principle experiment, the Harvard team has demonstrated that it can actively remove the fluctuations that constitute temperature, rather than merely waiting for hot particles to leave, as in evaporative cooling.

Almost like fitting one egg into each slot of an egg carton, the process of orbital excitation blockade removes excess atoms from a crystal until there is precisely one atom at each site.
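The egg-carton picture can be caricatured in a few lines of code. In this toy model (an analogy-level sketch, not the actual physics), random site occupations stand in for thermal fluctuations, and the blockade step trims every site down to exactly one atom, eliminating the fluctuations entirely:

```python
import random

random.seed(0)

def variance(xs):
    """Spread of the site occupations; a stand-in for 'temperature' here."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Sites start with a fluctuating number of atoms (the "heat")...
lattice = [random.randint(1, 3) for _ in range(100)]
# ...and the blockade leaves exactly one atom per site (one egg per slot).
cooled = [1 for _ in lattice]

print(variance(lattice))  # nonzero: occupation fluctuations present
print(variance(cooled))   # 0.0: fluctuations removed
```

The real experiment works with quantum states rather than classical counts, but the point of the analogy survives: the entropy lives in the occupation fluctuations, and removing the excess atoms removes it.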

"The collective behaviors of atoms at these temperatures remain an important open question, and the breathtaking control we now exert over individual atoms will be a powerful tool for answering it," said Greiner. "We are glimpsing a mysterious and wonderful world that has never been seen in this way before."

Greiner and Bakr's co-authors in Harvard's Department of Physics are Philipp Preiss, Eric Tai, Ruichao Ma and Jonathan Simon.

Their work was supported by the Army Research Office through the DARPA OLE program, the AFOSR MURI program, and by grants from the NSF.

Harvard University
Harvard's Department of Physics
Air Force Office of Scientific Research (AFOSR)
National Science Foundation
Army Research Office
Defense Advanced Research Projects Agency (DARPA)
Famous Scientists of the 21st Century
Project Sixtrack: The Large Hadron Collider and Your Computer
Quantum Computers: Tomorrows Technology
What Is The Higgs Boson And Why It Matters
What is String Theory?
Bartenders Use Physics in Mixing Cocktails
Danceroom Spectroscopy: Quantum Physics on the Dance Floor.
Whats New @CERN 06 Dec 2011
Multi-purpose Photonic Chip Developed for Quantum Computers
Medical Treatments Through Photonics
CERN Press Release: Higgs Particle Search Status Still Inconclusive
Solar Paint That Can Generate Electricity

23 December 2011

Science's Breakthrough of the Year: HIV treatment as prevention

AAAS PRESS RELEASE: A clinical trial that revitalized HIV research tops the journal's list of advances in 2011

The journal Science has lauded an eye-opening HIV study, known as HPTN 052, as the most important scientific breakthrough of 2011. This clinical trial demonstrated that people infected with HIV are 96 percent less likely to transmit the virus to their partners if they take antiretroviral drugs (ARVs).

The findings end a long-standing debate over whether ARVs could provide a double benefit by treating the virus in individual patients while simultaneously cutting transmission rates. It's now clear that ARVs can provide treatment as well as prevention when it comes to HIV, researchers agree.

In addition to recognizing HPTN 052 as the 2011 Breakthrough of the Year, Science and its publisher, AAAS, the nonprofit science society, have identified nine other groundbreaking scientific accomplishments from the past year and compiled them into a top 10 list that will appear in the 23 December issue.

Myron Cohen from the University of North Carolina's School of Medicine in Chapel Hill, N.C. and an international team of colleagues kicked off the HPTN 052 study in 2007 by enrolling 1,763 heterosexual couples from nine different countries: Brazil, India, Thailand, the United States, Botswana, Kenya, Malawi, South Africa and Zimbabwe. Each participating couple included one partner with an HIV infection.

The researchers administered ARVs to half of those HIV-infected individuals immediately and waited for the other half of the infected participants to develop CD4 counts below 250 — indicative of severe immune damage — before offering treatment. (A CD4 count below 200 indicates AIDS.)

Video: Get to know AAAS

Then, earlier this year, four years before the study was officially scheduled to end, an independent monitoring board decided that all infected study participants should receive ARVs at once. The board members had seen the dramatic effects of early ARV treatment on HIV transmission rates, and they recommended that the trial's findings be made public as soon as possible. Subsequently, the results of HPTN 052 appeared in the 11 August issue of the New England Journal of Medicine.

"This [HPTN 052 trial] does not mean that treating people alone will end an epidemic," said Science news correspondent Jon Cohen, who wrote about the trial for Science's Breakthrough of the Year feature. "But, combined with three other major biomedical preventions that have proven their worth in large clinical studies since 2005, many researchers now believe it is possible to break the back of the epidemic in specific locales with the right package of interventions."

Treatment with ARVs was already known to reduce the viral load, or the actual amount of HIV, in an infected individual. Many HIV/AIDS researchers had thus reasoned that treated individuals should also be less infectious. But, before HPTN 052, skeptics had contended that such a theory was unproven — and that the viral load might not reflect levels of virus in genital secretions.

"Most everyone expected that reducing the amount of virus in a person would somewhat reduce infectiousness," explained Jon Cohen. "What was surprising was the magnitude of protection and then the impact the results had among HIV/AIDS researchers, advocates and policy-makers."

These findings have added important momentum to a movement, already underway, that promotes the ongoing treatment of HIV to reduce viral loads in communities and could possibly eliminate HIV/AIDS epidemics in some countries. But moving forward won't be easy, researchers say.

Video: Myron Cohen on HPTN 052 HIV Prevention Trial

"There are huge hurdles when it comes to applying this clinical trial evidence to a population," said Jon Cohen. "Some 52 percent of the people who need ARVs immediately for their own health right now have no access — and that's 7.6 million people. What's more, there are all sorts of obstacles that hinder attempts to scale this up that have more to do with infrastructure than the purchase price of drugs."

Still, some researchers consider HPTN 052 a "game-changer" because of its near-100 percent efficacy in reducing HIV transmission rates. And, indeed, it has already spurred many clinicians and policy-makers into action. For all these reasons, Science spotlights the HPTN 052 study as the 2011 Breakthrough of the Year.

Science's list of nine other groundbreaking scientific achievements from 2011 follows.

The Hayabusa Mission: After some near-disastrous technical difficulties and a stunningly successful recovery, Japan's Hayabusa spacecraft returned to Earth with dust from the surface of a large, S-type asteroid. This asteroid dust represented the first direct sampling of a planetary body in 35 years, and analysis of the grains confirmed that the most common meteorites found on Earth, known as ordinary chondrites, are born from these much larger, S-type asteroids.

Unraveling Human Origins: Studying the genetic code of both ancient and modern human beings, researchers discovered that many humans still carry DNA variants inherited from archaic humans, such as the mysterious Denisovans in Asia and still-unidentified ancestors in Africa. One study this year revealed how archaic humans likely shaped our modern immune systems, and an analysis of Australopithecus sediba fossils in South Africa showed that the ancient hominin possessed both primitive and Homo-like traits.

Capturing a Photosynthetic Protein: In vivid detail, researchers in Japan have mapped the structure of Photosystem II, or PSII, the protein complex that plants use to split water into hydrogen and oxygen atoms. The crystal-clear image shows off the protein's catalytic core and reveals the specific orientation of atoms within. Now, scientists have access to this catalytic structure that is essential for life on Earth — one that may also hold the key to a powerful source of clean energy.

Pristine Gas in Space: Astronomers using the Keck telescope in Hawaii to probe the faraway universe wound up discovering two clouds of hydrogen gas that seem to have maintained their original chemistry for two billion years after the big bang. Other researchers identified a star that is almost completely devoid of metals, just as the universe's earliest stars must have been, but that formed much later. The discoveries show that pockets of matter persisted unscathed amid eons of cosmic violence.

Getting to Know the Microbiome: Research into the countless microbes that dwell in the human gut demonstrated that everyone has a dominant bacterium leading the gang in their digestive tract: Bacteroides, Prevotella or Ruminococcus. Follow-up studies revealed that one of these bacteria thrives on a high-protein diet while another prefers vegetarian fare. These findings and more helped to clarify the interplay between diet and microbes in nutrition and disease.

A Promising Malaria Vaccine: Early results of the clinical trial of a malaria vaccine, known as RTS,S, provided a shot in the arm to malaria vaccine research. The ongoing trial, which has enrolled more than 15,000 children from seven African countries, reassured malaria researchers, who are used to bitter disappointment, that discovering a malaria vaccine remains possible.

Strange Solar Systems: This year, astronomers got their first good views of several distant planetary systems and discovered that things are pretty weird out there. First, NASA's Kepler observatory helped identify a star system with planets orbiting in ways that today's models cannot explain. Then, researchers discovered a gas giant caught in a rare "retrograde" orbit, a planet circling a binary star system and 10 planets that seem to be freely floating in space — all unlike anything found in our own solar system.

Designer Zeolites: Zeolites are porous minerals that are used as catalysts and molecular sieves to convert oil into gasoline, purify water, filter air and produce laundry detergents (to name a few uses). This year, chemists really showed off their creativity by designing a range of new zeolites that are cheaper, thinner and better equipped to process larger organic molecules.

Clearing Senescent Cells: Experiments revealed that clearing senescent cells, or those that have stopped dividing, from the bodies of mice can delay the onset of age-related symptoms, such as cataracts and muscle weakness. Mice whose bodies were cleared of these loitering cells didn't live longer than their untreated cage-mates — but they did seem to live better, which provided researchers with some hope that banishing senescent cells might also prolong our golden years.
The American Association for the Advancement of Science (AAAS) is the world’s largest general scientific society, and publisher of the journal, Science as well as Science Translational Medicine and Science Signaling. AAAS was founded in 1848, and includes some 262 affiliated societies and academies of science, serving 10 million individuals. Science has the largest paid circulation of any peer-reviewed general science journal in the world, with an estimated total readership of 1 million. The non-profit AAAS (www.aaas.org) is open to all and fulfills its mission to “advance science and serve society” through initiatives in science policy; international programs; science education; and more. For the latest research news, log onto EurekAlert!, www.eurekalert.org, the premier science-news Web site, a service of AAAS.


The American Association for the Advancement of Science
Science Translational Medicine
Science Signaling
Vaccine to Treat Lung Cancer Being Developed
Breakthrough in Fight Against Alzheimer's Disease
What Is Metabolomics And Its Importance
Human Embryo Cloned for Stem Cell Production
US$10 Million Contest to Sequence Centenarian Genome
Researchers Look into Lung Regeneration
Photodynamic Therapy: Shining A Light To Fight Cancer
New Developments in Treatment of Asthma, Allergies and Arthritis

Mending A Broken Heart

Scientists may have found a way to help the human heart repair itself. This discovery may open up new dimensions in cardiac regeneration and repair.

Damaged heart tissue has little capacity for repair; the heart has no inherent system for regenerating muscle. But scientists are closing in on the chemical signals that could prompt the heart to produce replacement cardiac muscle cells. Researchers have identified a family of molecules that can stimulate stem cells to develop into beating heart muscle cells. The research is published by Cell Press in the December 21st issue of the journal Chemistry & Biology.

"Despite advances in modern medicine, management of myocardial infarction and heart failure remains a major challenge," explains senior study author Dr. Tao P. Zhong from Fudan University in Shanghai, China. "There is intense interest in developing agents that can influence stem cells to differentiate into cardiac cells as well as enhance the inherent regenerative capacities of the heart. Developing therapies that can stimulate heart muscle regeneration in areas of infarction would have enormous medical impact."

To search for new molecules involved in heart development, Dr. Zhong and colleagues developed a robust small molecule screen using a zebrafish system. The zebrafish is an excellent model organism to study heart growth and development because there are established genetic approaches that permit visualization of fluorescent beating hearts within transparent embryos. After screening nearly 4,000 compounds, the researchers discovered three structurally related molecules that could selectively enlarge the size of the embryonic heart. The compounds, cardionogen-1, -2, and -3, could promote or inhibit heart formation, depending on when they were administered during development.

Video: Beating Heart Stem Cells

Cardionogen treatment enlarged the zebrafish heart by stimulating production of new cardiac muscle cells from stem cells. The researchers went on to show that cardionogen could stimulate mouse embryonic stem cells to differentiate into beating cardiac muscle cells. The effects of cardionogen were linked to Wnt signaling, a pathway best known for its role in embryonic and heart development. Cardionogen opposes Wnt signaling to induce cardiac muscle cell formation. Importantly, the interaction of cardionogen with Wnt seemed to be restricted to specific cell types.

Taken together, the results identify the cardionogen family members as important modulators of cardiac muscle cell development. "Evaluating the potential of cardionogen on human adult and embryonic stem cells is the next logical step," concludes Dr. Zhong. "This may ultimately aid in design of therapeutic approaches to enhance repopulation of damaged heart muscle and restore function in diseased hearts."

Video: Adult Stem Cells Used To Rebuild Heart Tissue


Cell Press
Chemistry & Biology
Human Embryo Cloned for Stem Cell Production
Europe Court Rules Against Stem Cell Patent
US$10 Million Contest to Sequence Centenarian Genome
Researchers Look into Lung Regeneration
Newly Discovered Cardiac Stem Cells Repair Damaged Heart
What Is Metabolomics And Its Importance

22 December 2011

Frankincense Supply is Dwindling

The Three Kings brought three gifts: gold, frankincense, and myrrh. These represented the rarest and most precious tributes one could give a king. Now, frankincense has become even rarer and will continue to do so.

Frankincense, also called olibanum, is an aromatic resin obtained from the desert tree Boswellia. It is used in incense and perfumes.

There are four main species of Boswellia which produce true frankincense and each type of resin is available in various grades. The grades depend on the time of harvesting, and the resin is hand-sorted for quality.

Unfortunately, frankincense has now become even more rare. In the Journal of Applied Ecology, researchers report that the supply of frankincense will continue to decline. Boswellia trees have had trouble reproducing in recent years, and ecologists believed they were weakened when traders tapped them for resin.

Video: The spiritual, medicinal and historical significance of Frankincense

Working in Ethiopia over a period of two years, the researchers monitored 12 copses (thickets of small trees or shrubs) of B. papyrifera: six that had been tapped for resin and six that had not. They found that the tapped trees reproduced as well as the untapped ones, ruling out tapping as the major killer.

Instead, the biggest threats seemed to be grazing livestock, fires, and the longhorn beetle, which burrows into trees' bark, kills them, and leaves them as ready fuel for forest fires. If these problems aren't remedied soon, the team's models suggest that frankincense production could drop by 50% in the next 15 years: a tough blow to the economies of Ethiopia and Eritrea, which export it.
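For a sense of scale: if the decline were a steady exponential (an assumption of mine; the article gives only the 15-year figure), halving production over 15 years corresponds to losing roughly 4 to 5 percent of output every year:

```python
# Annual decline rate implied by a 50% drop over 15 years,
# assuming a steady exponential decline (an illustrative assumption)
halving_years = 15
annual_decline = 1 - 0.5 ** (1 / halving_years)
print(f"roughly {annual_decline:.1%} of production lost per year")
```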


ScienceShot: Future of Frankincense Not So Sweet
Journal of Applied Ecology
What Is Metabolomics And Its Importance
Glacial Water Shrinking in Peru
The Science of Food
Drug Makes Brain Tumors Glow Hot Pink
Helium 3 to be Mined in the Moon
Climate Change Findings Not As Severe

Solar Paint That Can Generate Electricity

Researchers at the University of Notre Dame have created an inexpensive “solar paint” that uses semiconducting nanoparticles to produce energy.

“We want to do something transformative, to move beyond current silicon-based solar technology,” says Prashant Kamat, John A. Zahm Professor of Science in Chemistry and Biochemistry and an investigator in Notre Dame’s Center for Nano Science and Technology (NDnano), who leads the research.

“By incorporating power-producing nanoparticles, called quantum dots, into a spreadable compound, we’ve made a one-coat solar paint that can be applied to any conductive surface without special equipment.”

The researchers focused on nano-sized particles of titanium dioxide, which were coated with either cadmium sulfide or cadmium selenide. The particles were then suspended in a water-alcohol mixture to create a paste. When this paste was brushed onto a conducting material and exposed to light, it created electricity.

The solar cell is easy to assemble because the middle layer can be painted onto a clear electrode. T-butanol, water, cadmium sulfide and titanium dioxide are mixed for thirty minutes. A clear electrode is masked off with office tape. Once the tape is in place, the mixture is spread onto the electrode and annealed with a heat gun. Afterwards, an electrolyte solution is sandwiched between the new electrode and a graphene composite electrode. Once this is done, the solar cell is ready for testing under a beam of artificial light.

Video: Painting Solar Cells with Nanoparticle Paste

Their research is described in the journal ACS Nano.

“The best light-to-energy conversion efficiency we’ve reached so far is 1 percent, which is well behind the usual 10 to 15 percent efficiency of commercial silicon solar cells,” explains Kamat.

“But this paint can be made cheaply and in large quantities. If we can improve the efficiency somewhat, we may be able to make a real difference in meeting energy needs in the future.”

“That’s why we’ve christened the new paint, Sun-Believable,” he adds.

Kamat and his team also plan to study ways to improve the stability of the new material.

NDnano is one of the leading nanotechnology centers in the world. Its mission is to study and manipulate the properties of materials and devices, as well as their interfaces with living systems, at the nano-scale.

This research was funded by the Department of Energy’s Office of Basic Energy Sciences.

University of Notre Dame
Center for Nano Science and Technology
Notre Dame researchers develop paint-on solar cells
Department of Energy's Office of Basic Energy Sciences
New Findings in Electron Density Lead to Better Imaging Devices and Applications
Application of Nanotechnology and Thermodynamics in Measuring Devices
Nanotechnology Electric Car Made From One Molecule
NASA Develops Material That Is Blacker Than Black
Fabric Cleans Itself When Exposed to Sunlight
Advances in Lithium Ion Batteries: 1 Week Power on a 15 Minute Charge
What Does 4G Technology Do For Mobile Phones?
New Way in OLED Production
The Wonders of Graphene

Expanding Vocabulary Through Hip Hop Music

Most music listeners have difficulty correctly understanding and remembering song lyrics.

However, studies show that young adults can learn African-American English (AAE) vocabulary from listening to hip hop music. The study shows a positive association between the number of hip-hop artists listened to by participants and AAE comprehension vocabulary scores. Participants in the study were also more likely to know a vocabulary item if the hip-hop artists they listened to used the word in their song lyrics. Together, these results suggest that young adults can acquire vocabulary through exposure to hip-hop music, a finding relevant for research on vocabulary acquisition, the construction of adolescent and adult identities, and the adoption of lexical innovations.

According to a study published in the Dec. 21 issue of the online journal PLoS ONE, people who listen to hip hop music can learn new vocabulary even though the lyrics may be difficult to understand. Paula Chesley of the Department of Linguistics, University of Alberta, found that the number of hip-hop artists a participant listened to was predictive of the participant's knowledge of words and phrases that are not common mainstream words and are used in hip-hop songs. Words such as "road dog" (friend) and "guap" (lots of money) are examples.

According to Chesley, these effects were seen even when other factors, such as demographics, general pop culture knowledge, and overall musical preferences, were taken into account.

Video: The Art of Hip-Hop Sampling at Duke University

Most work on vocabulary learning from media exposure has focused on infants or non-native speakers. Therefore, investigating how adolescents learn vocabulary from voluntary exposure to music reveals novel aspects of language learning, and takes into account the intention and motivation of the learner. Constructing a vocabulary can be a vital part of defining the speaker's identity, so further research into the mechanism of vocabulary development may continue to shed light on this important process.

Citation: Chesley P (2011) You Know What It Is: Learning Words through Listening to Hip-Hop.

About PLoS ONE: PLoS ONE is the first journal of primary research from all areas of science to employ a combination of peer review and post-publication rating and commenting, to maximize the impact of every report it publishes. PLoS ONE is published by the Public Library of Science (PLoS), the open-access publisher whose goal is to make the world's scientific and medical literature a public resource.

Public Library of Science
Research Suggests Hearing Disability May Be Linked To Dyslexia
Understanding Consciousness: Types of Consciousness
Dream Sleep Relieves Stress from Emotional Pain
Words About Size and Shape Help Promote Spatial Skills in Children
How Our Brains Keep Us Focused
New Insights Into Psychopathy
The Science of Understanding Stress
Noisy Toys May Cause Hearing Damage
The Tech of Storytelling
Steve Jobs Next Big Thing?
Mysterious Coded Manuscript Cracked After 300 Years