python - How to submit a huge number of small jobs using the PBS queue system -


I need to run a huge number of small jobs (each runs for several minutes) using the PBS queue system. These jobs all use the same script, but work on different inputs and take different amounts of time.

Because the number of jobs is so large, the PBS queue cannot handle it well. And because different jobs take different amounts of time, using pbsdsh is not efficient.

So an ideal solution would be to wrap a number of jobs (for example, 100 small jobs) into one job. That one job could be submitted to a node with 16 cores. On the node, 16 processes (corresponding to 16 small jobs) would run in parallel, one on each core. As soon as a process finishes on a core, a new process would start on that core. If I could do this, it would both greatly reduce the number of jobs (by a factor of 100) and not waste computing time.
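For illustration, a minimal sketch of such a wrapper, assuming each small job can be invoked as a shell command and that the commands are listed one per line in a file (the file layout, the script name, and the 16-core count are assumptions, not part of the original question):

    import multiprocessing
    import subprocess
    import sys

    def run_job(cmd):
        """Run one small job as a shell command and return its exit code."""
        return subprocess.call(cmd, shell=True)

    if __name__ == "__main__":
        # One shell command per line, e.g. "python my_script.py input_042.txt"
        # (the command format is hypothetical).
        with open(sys.argv[1]) as f:
            commands = [line.strip() for line in f if line.strip()]

        # A pool of 16 workers, one per core on the node. As soon as one
        # small job finishes, the pool starts the next command on the
        # freed core, which is exactly the desired behavior.
        with multiprocessing.Pool(processes=16) as pool:
            exit_codes = pool.map(run_job, commands)

        print("succeeded:", exit_codes.count(0), "of", len(exit_codes))

This script would then be submitted as a single 16-core PBS job, with the command file as its argument.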

Does anyone have a recommendation for a solution to this?

Thanks.

Snakemake might fit well in this situation. Take a look at the documentation for the --cluster and -j N options. When you run snakemake with the -j N option, it will submit at most N jobs at a time; as each job finishes, it starts a new one.
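As a minimal sketch (assuming a Snakemake version that still supports --cluster, and with hypothetical file names, sample list, and script), a Snakefile for 100 small jobs might look like:

    # Snakefile -- the sample list, paths, and my_script.py are placeholders.
    SAMPLES = ["input_%03d" % i for i in range(100)]

    # The target rule: request all 100 outputs.
    rule all:
        input:
            expand("results/{sample}.out", sample=SAMPLES)

    # One rule instance per small job.
    rule small_job:
        input:
            "data/{sample}.txt"
        output:
            "results/{sample}.out"
        shell:
            "python my_script.py {input} > {output}"

You could then let Snakemake drive the PBS queue, keeping at most 100 jobs in flight (the qsub resource flags here are examples to adapt to your cluster):

    snakemake --cluster "qsub -l nodes=1:ppn=1" -j 100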


