
Memory considerations

Parameters defining array boundaries are given in the file dimpar.h; they also determine the required memory. They can be decreased to save memory, or increased if they are not sufficient for the simulated system.
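For orientation, the entries in dimpar.h are ordinary Fortran PARAMETER statements. The fragment below is only an illustrative sketch with made-up values; the actual layout, the default values, and any names other than those discussed in this section may differ in the distributed file.

C     dimpar.h -- illustrative sketch only; the values are not the
C     distributed defaults and usually need adjustment
      integer NTOT, NBLMX, MAXCF, NRQS, LHIST
C     maximum total number of atoms
      parameter (NTOT  = 20000)
C     maximum number of neighbours within the cutoff per atom
      parameter (NBLMX = 600)
C     number of points for time correlation functions (1 = TCF off)
      parameter (MAXCF = 1)
C     maximum number of different averages and of series of averages
      parameter (NRQS  = 500)
      parameter (LHIST = 100)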

One of the most memory-consuming arrays is the list of atom pairs within the cutoff distance. Its size is determined by the parameters NTOT (maximum number of atoms) and NBLMX (maximum number of neighbours within the cutoff for each atom). Since the list of neighbours is distributed among the available nodes, NBLMX can be decreased in the case of parallel execution.
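As a rough orientation, and assuming one 4-byte integer per list entry (an assumption about the storage, not a statement of the actual internal format), the pair list occupies on the order of NTOT*NBLMX*4 bytes: for example, NTOT = 20000 and NBLMX = 600 correspond to about 48 MB, and in a parallel run this requirement is divided among the nodes.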

Another memory-consuming parameter is MAXCF, which defines the number of points used for the calculation of time correlation functions. It can be set to 1 if time correlation functions are not calculated during the program run. Note also that the calculation of TCFs is not parallelized; for this reason it is often more appropriate to dump the trajectory and recalculate the TCFs afterwards using the tranal utility.

Much memory can also be taken by the array of intermediate averages for different quantities. Its size is NRQS*LHIST, where NRQS is the maximum number of different calculated averages and LHIST is the maximum number of series for which these averages are collected. This array may become very large for big macromolecules if the keyword Average_internal is set to yes: in that case all bond lengths, covalent angles and torsion angles are calculated and stored.
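For a sense of scale (assuming 8-byte reals for the stored averages, which is an assumption about the storage type), NRQS = 500 and LHIST = 100 correspond to only about 0.4 MB; with Average_internal set to yes for a large macromolecule, however, the number of tracked internal coordinates, and hence the required NRQS, grows with the size of the molecule.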

The program checks the array boundaries against the input data and stops if they are exceeded. After correcting dimpar.h, the code must be recompiled.

