From: David Hardy (dhardy_at_ks.uiuc.edu)
Date: Thu Jun 07 2018 - 14:18:06 CDT
What is the "patch problem" that you are trying to avoid by doubling the number of patches in each dimension? Since doubling along each of the three dimensions increases the total number of patches by a factor of 2 x 2 x 2 = 8, doing so could well be the cause of the out-of-memory error that you are experiencing.
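As a first test (my suggestion, not something from the original thread), you could simply comment out the patch-doubling options in the configuration file and rerun the minimization:

# Patch-grid doubling removed: each twoAway option doubles the patch
# count along one axis, so enabling all three multiplies the number
# of patches (and the associated per-patch memory) by 8.
#twoAwayX yes
#twoAwayY yes
#twoAwayZ yes

If the minimization then runs without the malloc() failure, the patch doubling was the culprit.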
--
David J. Hardy, Ph.D.
Beckman Institute
University of Illinois at Urbana-Champaign
405 N. Mathews Ave., Urbana, IL 61801
dhardy_at_ks.uiuc.edu, http://www.ks.uiuc.edu/~dhardy/

> On Jun 7, 2018, at 6:24 AM, Laura Tiessler <lauratiesslersala_at_gmail.com> wrote:
>
> Hi all,
> I am running MD using NAMD version NAMD/2.12-CrayIntel-17.08-cuda-8.0 on a system of about 44 million atoms (proteins in explicit water).
> At the first minimization stage, I got this memory error:
>
> Reason: Could not malloc()--are we out of memory?
>
> I am using structure and coordinates in binary format, and this is the input file:
>
> #
> # Input NAMD Configuration File.
> # Protein Minimization.
> #
>
> # molecular system
> usePluginIO yes
> structure ionized.js
> bincoordinates ionized.coor
>
> # force field
> paratypecharmm on
> parameters ./par_all36_prot.prm
> parameters ./par_all36_na.prm
> parameters ./toppar_water_ions_namd.str
> exclude scaled1-4
> 1-4scaling 1.0
>
> # Doubling the number of patches (trying to avoid patch problem)
> twoAwayX yes
> twoAwayY yes
> twoAwayZ yes
>
> # approximations
> switching on
> switchdist 10
> cutoff 12
> pairlistdist 13.5
>
> # constraints
> constraints on
> conskfile restraint.pdb
> conskcol B
> consref ionized.pdb
>
> # output files
> binaryoutput yes
> noPatchesOnOne yes
> outputname min_solv_namd.md
>
> # run minimization
> minimization on
> minimize 500
>
> Does anyone have suggestions for how I could solve this memory problem? Has anyone worked with large systems (>10 million atoms)?
>
> Thanks,
> regards
This archive was generated by hypermail 2.1.6 : Sat Sep 14 2019 - 23:19:35 CDT