python - How to submit a huge number of small jobs using the PBS queue system
I need to run a huge number of small jobs (each runs for several minutes) using the PBS queue system. The jobs all use the same script, but work on different inputs and take different amounts of time.
Because the number of jobs is so large, the PBS queue cannot handle them well. And because the jobs take different amounts of time, using pbsdsh is not efficient.
So my idea for a solution is to wrap a number of small jobs (for example, 100 of them) into one job, and submit that one job to a node with 16 cores. On that node, 16 processes (corresponding to 16 small jobs) run in parallel, one on each core. As soon as a process finishes on a core, a new process starts on that core. If I can do this, it would both greatly reduce the number of jobs (by a factor of 100) and not waste computing time.
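To make this concrete, here is a minimal Python sketch of what I have in mind, to be launched from a PBS job script that requests one 16-core node (my_script.py and the input file names are just placeholders):

    # run_bundle.py - run one bundle of small jobs inside a single PBS job.
    # Usage: python run_bundle.py input001.txt input002.txt ...
    import multiprocessing
    import subprocess
    import sys

    def run_one(input_file):
        # One small job: run the common script on one input file.
        result = subprocess.run(["python", "my_script.py", input_file])
        return input_file, result.returncode

    if __name__ == "__main__":
        inputs = sys.argv[1:]  # e.g. the 100 inputs bundled into this job
        # 16 workers, one per core; a worker picks up the next input as
        # soon as it finishes its current one, so no core sits idle.
        with multiprocessing.Pool(processes=16) as pool:
            for name, code in pool.imap_unordered(run_one, inputs):
                print(name, "exited with", code)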
Does anyone have a recommendation for a solution to this?
Thanks.
Snakemake might fit well in this situation. Take a look at the documentation for the --cluster and -j N options. When you run Snakemake with the -j N option, it will submit at most N jobs at a time; as each job finishes, it starts a new one.
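As a rough sketch (my_script.py, the data/ and results/ paths, and the input names are placeholders), a Snakefile for this pattern could look like:

    # Snakefile - each small job becomes one instance of the rule below.
    INPUTS = ["input_{:03d}".format(i) for i in range(100)]

    rule all:
        input:
            expand("results/{name}.out", name=INPUTS)

    rule small_job:
        input:
            "data/{name}.txt"
        output:
            "results/{name}.out"
        shell:
            "python my_script.py {input} > {output}"

You would then run something like

    snakemake --cluster "qsub -l nodes=1:ppn=1" -j 50

which keeps at most 50 jobs in the PBS queue at any time and submits a new one whenever a running job finishes, so the queue never gets flooded.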