Running Condor jobs with large memory requirements

By default, Condor assigns each process you launch 1 GB of RAM. If your job uses more memory than that, one of two things can happen.

  • Condor may evict it, causing it to return to the queue and stay in the idle ("I") state.
  • If the combination of jobs on a machine exceeds the available RAM and swap, the kernel's out-of-memory (OOM) killer will kill processes until enough memory is freed.

Both of these problems can be avoided by giving Condor a realistic estimate of how much memory your job needs.
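One way to arrive at a realistic estimate is to measure your job's peak memory use on a test run before submitting it. As a sketch (this is ordinary Python, not part of Condor), the standard resource module reports the peak resident set size of the current process:

```python
import resource

def peak_memory_mb():
    """Return this process's peak resident set size in MB.

    Note: ru_maxrss is reported in kilobytes on Linux
    (but in bytes on macOS), so we divide by 1024 here.
    """
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return peak_kb / 1024.0

# Call this at the end of your program to see roughly how much to request:
print("Peak memory: %.0f MB" % peak_memory_mb())
```

Round the observed figure up with some headroom when setting the memory request in your submit file.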

Running jobs larger than 1 GB

If you have a job with processes that consume more than 1 GB of memory, you can tell Condor how much RAM they require by adding the request_memory keyword to your submit file. The value is specified in megabytes.

Here's an example submit file for an executable called hugejob, which requires at least 7 GB of memory to run:

executable = hugejob
getenv = true
input =
output = hugejob.out
error = hugejob.err
log = hugejob.log
request_memory = 7*1024
queue
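The value 7*1024 is just arithmetic: the memory request is given in megabytes, so 7 GB works out to 7168 MB. A minimal sketch of the conversion (the helper name is mine, not Condor's):

```python
def gb_to_mb(gb):
    """Convert a size in GB to the megabyte figure the submit file expects."""
    return gb * 1024

print(gb_to_mb(7))  # prints 7168
```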
Topic revision: r9 - 2011-09-09 - 18:00:56 - brodbd
