03 January 2013

Simulating Molecules Through Nonequilibrium Statistical Mechanics


Dynamic computer simulations of molecular systems depend on finite time steps, but these introduce apparent extra work that pushes the molecules around. Using models of water molecules in a box, researchers have learned to separate this "shadow work" from the protocol work explicitly modeled in the simulations.
Credit: Lawrence Berkeley National Laboratory
Scientists have devised a way, using nonequilibrium statistical mechanics, to identify the errors that creep into molecular simulations and separate them from the physics being studied.

Scientists turn to nonequilibrium statistical mechanics to simulate molecular behavior in a much more natural way. Most real systems are always in flux: they are continuously in motion and changing. This method tries to translate real-world mechanics into a computer simulation.

The Cell

The smallest form of life is a single cell, and the cell is made up of molecules. Scientists have long used microscopes to try to understand how these molecules interact with each other.

But microscopes are limited in the size of the objects they can observe. At the molecular level, where microscopes cannot observe clearly, scientists turn to computer simulations.

A computational microscope doesn't use lenses or glass; it uses a computer to simulate molecules and how they act and interact. An example would be a simulation of how a virus enters and infects a healthy host. By studying these interactions in a simulation, scientists can understand the virus's mechanics.

Simulations and Algorithms

At the heart of a computer simulation is its algorithm: a series of instructions describing how a model, or any of the model's components, behaves in a given scenario. There are many factors to take into account when creating an algorithm to simulate the behavior of a molecule, such as temperature, environment, time (duration), amount of light, and the motion and direction of the molecule.
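As a rough illustration only, the sketch below treats such an algorithm as a parameterized update rule applied over and over, one time step at a time. Every name and value in it (Scenario, run_simulation, the placeholder update rule) is a hypothetical stand-in, not code from any particular simulation package.

```python
# A minimal, illustrative sketch of an algorithm as "a series of instructions":
# the factors above become parameters, and the simulation is simply the repeated
# application of an update rule, one time step at a time.  All names and values
# here are hypothetical placeholders, not any particular molecular dynamics code.
from dataclasses import dataclass

@dataclass
class Scenario:
    temperature: float      # how vigorously the molecules are jostled (kelvin)
    time_step: float        # how finely time is sliced (picoseconds)
    duration: float         # total simulated time (picoseconds)
    environment: str        # e.g. "water box"

def run_simulation(scenario, initial_state, update_rule):
    """Apply the same instructions over and over, one discrete time step at a time."""
    state = initial_state
    n_steps = int(scenario.duration / scenario.time_step)
    for _ in range(n_steps):
        state = update_rule(state, scenario)
    return state

# Usage: any rule for how the state changes in one step can be plugged in;
# here a do-nothing placeholder stands in for the real physics.
final_state = run_simulation(Scenario(300.0, 0.002, 1.0, "water box"),
                             initial_state=0.0,
                             update_rule=lambda state, scenario: state)
```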

Scientists are continuously striving to perfect these simulations, which benefit fields of science such as physics, quantum mechanics, medicine, and biology.

Algorithms are precise: given a specific situation, the algorithm will instruct the simulation to behave in a particular way. Real-world scenarios aren't that "clean" or clear-cut; factors not included in the algorithm can get in the way and produce a completely different result.

Non-Equilibrium Statistical Mechanics

But what is nonequilibrium statistical mechanics? It is a method for determining the behavior of systems (in this case, molecules) that are not in equilibrium. Equilibrium is defined here as a state of balance in which the system, on average, does not change.

Natural systems are mostly out of equilibrium because of many underlying factors, and the changes driving them can take different forms: chemical, thermal, radiative, and mechanical (physical).

Nonequilibrium statistical mechanics tries to take all of these factors and the resulting changes into account when constructing a simulation. Because of the large number of factors and changes that must be calculated, computational errors are likely to occur.

Digital Representations of the Real World

Because modern computers have to depict the real world with digital representations of numbers instead of physical analogues, they must digitize the continuous passage of time into small slices in order to simulate it. This kind of simulation is essential in disciplines from medical and biological research, to new materials, to fundamental considerations of quantum mechanics, and the fact that it inevitably introduces errors is an ongoing problem for scientists.
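A toy example of what "digitizing time into small slices" means, offered purely as an illustration: the continuous motion of a simple oscillator is replaced by repeated finite jumps of size dt, and the computed answer drifts away from the exact solution by an amount that shrinks as the slices get smaller.

```python
# Illustration of digitized time: the continuous motion of a simple oscillator
# (exact solution x(t) = cos t) is replaced by repeated jumps of size dt using
# the forward Euler rule.  The answer drifts from the exact value, and the drift
# shrinks as the time slices get smaller.  Generic demo, not the paper's method.
import math

def euler_position(dt, t_final=10.0):
    x, v = 1.0, 0.0                      # start displaced by 1, at rest
    for _ in range(int(round(t_final / dt))):
        x, v = x + v * dt, v - x * dt    # one discrete slice of time
    return x

exact = math.cos(10.0)
for dt in (0.1, 0.01, 0.001):
    print(f"dt={dt:<6} simulated x(10)={euler_position(dt):+.4f}   exact={exact:+.4f}")
```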

Scientists at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have now identified and characterized the source of tenacious errors and come up with a way to separate the realistic aspects of a simulation from the artifacts of the computer method. The research was done by David Sivak and his advisor Gavin Crooks in Berkeley Lab's Physical Biosciences Division and John Chodera, a colleague at the California Institute of Quantitative Biosciences (QB3) at the University of California at Berkeley. The three report their results in Physical Review X.

"Our group uses a theoretical method called nonequilibrium statistical mechanics to study molecular machines, the protein complexes essential to processes like photosynthesis and DNA repair," says Sivak. "But when we applied common algorithms to model the behavior in biological molecules, we found persistent, significant errors in the simulation results."

Systems in equilibrium are relatively easy to simulate, but natural systems are often driven far from equilibrium by absorbing light, burning energy-dense chemical fuel, or other driving forces. Sivak, who recently joined the University of California at San Francisco as a Systems Biology Fellow, describes nonequilibrium statistical mechanics as "a way of understanding situations where conditions change abruptly and the system has to play catch-up," a kind of problem in which there are few exact analytical results.

The motion of biological molecules is hardly the only area where computer simulations of molecular-scale motion are essential. The need to use computers to test theories and model experiments that can't be done on a lab bench is ubiquitous, and the problems that Sivak and his colleagues encountered weren't new.

"A simulation of a physical process on a computer cannot use the exact, continuous equations of motion; the calculations must use approximations over discrete intervals of time," says Sivak. "It's well known that standard algorithms that use discrete time steps don't conserve energy exactly in these calculations."

One workhorse method for modeling molecular systems is Langevin dynamics, based on equations first developed by the French physicist Paul Langevin over a century ago to model Brownian motion. Brownian motion is the random movement of particles in a fluid (originally pollen grains on water) as they collide with the fluid's molecules – particle paths resembling a "drunkard's walk," which Albert Einstein had used just a few years earlier to establish the reality of atoms and molecules. Instead of impractical-to-calculate velocity, momentum, and acceleration for every molecule in the fluid, Langevin's method substituted an effective friction to damp the motion of the particle, plus a series of random jolts.
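In the spirit of that description, here is a minimal, hedged sketch of a discrete Langevin update for a free particle: the surrounding fluid is replaced by a friction term that damps the velocity plus Gaussian random jolts. The simple Euler-type discretization and all parameter values are illustrative choices, not the specific integrators analyzed in the paper.

```python
# Hedged sketch of Langevin's idea: the countless collisions with fluid
# molecules are replaced by a friction term that damps the particle's velocity
# plus random Gaussian "jolts."  A simple Euler-type discretization is used for
# clarity; it is not the specific integrator analyzed in the paper.
import numpy as np

def langevin_final_position(n_steps, dt=1e-3, gamma=1.0, kT=1.0, mass=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x, v = 0.0, 0.0
    jolt = np.sqrt(2.0 * gamma * kT * dt / mass)   # strength of the random jolts
    for _ in range(n_steps):
        force = 0.0                                 # free particle: pure Brownian motion
        v += (force / mass - gamma * v) * dt + jolt * rng.normal()
        x += v * dt
    return x

# Many independent walkers trace out "drunkard's walks"; the spread of their
# final positions grows like the square root of the elapsed time.
finals = [langevin_final_position(5_000, seed=s) for s in range(100)]
print("spread of final positions:", float(np.std(finals)))
```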

Video: Development of Non-equilibrium Statistical Mechanical Descriptions

When Sivak and his colleagues used Langevin dynamics to model the behavior of molecular machines, they saw significant differences between what their exact theories predicted and what their simulations produced. They tried to come up with a physical picture of what it would take to produce these wrong answers.

"It was as if extra work were being done to push our molecules around," Sivak says. "In the real world, this would be a driven physical process, but it existed only in the simulation, so we called it 'shadow work.' It took exactly the form of a nonequilibrium driving force."

They first tested this insight with "toy" models having only a single degree of freedom, and found that when they ignored the shadow work, the calculations were systematically biased. But when they accounted for the shadow work, accurate calculations could be recovered.
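A hedged sketch of that bookkeeping on a single-degree-of-freedom toy model: with no external driving, any energy change produced by the deterministic part of a discrete step is energy the exact continuous dynamics would never have added, so it can be tallied as shadow work. The splitting below, the deliberately crude Euler update used for the deterministic part (chosen so the effect is easy to see), and all parameters are assumptions for illustration, not the integrators studied in the paper.

```python
# Hedged sketch of "shadow work" bookkeeping on a toy model with one degree of
# freedom: a harmonic oscillator coupled to a Langevin thermostat.  With no
# external driving, the energy change across the deterministic part of each
# discrete step is energy that exact continuous dynamics would never have added,
# so it is tallied as shadow work.  A deliberately crude forward Euler update is
# used for the deterministic part so the effect is easy to see; the splitting
# and parameters are illustrative assumptions, not the paper's integrators.
import numpy as np

def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x             # unit mass, unit spring constant

def average_shadow_work(dt, n_steps=50_000, gamma=1.0, kT=1.0, seed=1):
    rng = np.random.default_rng(seed)
    x, v = 1.0, 0.0
    shadow_work = 0.0
    c = np.exp(-gamma * dt)                      # exact thermostat (Ornstein-Uhlenbeck) decay
    for _ in range(n_steps):
        # Stochastic thermostat substep: energy exchanged here is heat, not work.
        v = c * v + np.sqrt((1.0 - c * c) * kT) * rng.normal()
        # Deterministic substep: exact dynamics would conserve the energy here,
        # so whatever the discrete update adds or removes is shadow work.
        e_before = energy(x, v)
        x, v = x + dt * v, v - dt * x            # crude forward Euler step
        shadow_work += energy(x, v) - e_before
    return shadow_work / n_steps

for dt in (0.03, 0.1, 0.3):
    print(f"dt={dt}: average shadow work per step = {average_shadow_work(dt):+.2e}")
```

For this crude scheme the tallied shadow work per step is clearly positive and grows rapidly with the time step, echoing the article's later point that finer time slices mean less shadow work.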

"Next we looked at systems with hundreds or thousands of simple molecules," says Sivak. Using models of water molecules in a box, they simulated the state of the system over time, starting from a given thermal energy but with no "pushing" from outside. "We wanted to know how far the water simulation would be pushed by the shadow work alone."

The result confirmed that even in the absence of an explicit driving force, the finite-time-step Langevin dynamics simulation acted by itself as a driving nonequilibrium process. Systematic errors resulted from failing to separate this shadow work from the actual "protocol work" that they explicitly modeled in their simulations. For the first time, Sivak and his colleagues were able to quantify the magnitude of the deviations in various test systems.

Such simulation errors can be reduced in several ways, for example by dividing the evolution of the system into ever-finer time steps, because the shadow work is larger when the discrete time steps are larger. But doing so increases the computational expense.

The better approach is to use a correction factor that isolates the shadow work from the physically meaningful work, says Sivak. "We can apply results from our calculation in a meaningful way to characterize the error and correct for it, separating the physically realistic aspects of the simulation from the artifacts of the computer method."
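The paper develops this correction rigorously with nonequilibrium work relations; purely as an illustration of the reweighting idea, the hedged sketch below down-weights each sampled configuration by a Boltzmann-like factor in the shadow work it accumulated, exp(-W_shadow / kT). The function name, the synthetic data, and the assumption that per-sample shadow work values are already available are all illustrative, not the paper's exact estimator.

```python
# Hedged sketch of the reweighting idea: each sampled configuration is weighted
# by exp(-W_shadow / kT), so samples that absorbed more shadow work count for
# less in the corrected average.  The data here are made up for illustration;
# this shows the generic reweighting idea rather than the paper's exact estimator.
import numpy as np

def corrected_average(observable_samples, shadow_work_samples, kT=1.0):
    obs = np.asarray(observable_samples, dtype=float)
    w_sh = np.asarray(shadow_work_samples, dtype=float)
    weights = np.exp(-(w_sh - w_sh.min()) / kT)     # shift for numerical stability
    return obs.mean(), np.average(obs, weights=weights)

# Synthetic example: an observable correlated with the shadow work its sample
# picked up biases the naive average; reweighting shifts the estimate away from
# the heavily driven samples.
rng = np.random.default_rng(0)
w_shadow = rng.exponential(0.5, size=10_000)
observable = 1.0 + 0.8 * w_shadow + rng.normal(0.0, 0.1, size=10_000)
naive, corrected = corrected_average(observable, w_shadow)
print(f"naive average: {naive:.3f}   shadow-work-corrected average: {corrected:.3f}")
```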

RELATED LINKS

DOE/Lawrence Berkeley National Laboratory
California Institute of Quantitative Biosciences (QB3)
Physical Review X
DOE's Office of Science