Hi everybody,

I have a dataset of 347,899 protein sequences that I want to compare against each other (all-against-all BLAST). I have access to a compute cluster running Score (version 5.8.4.r3) as an MPI environment, with 25 nodes, each with 4 cores and 8 GB of RAM.
We have the latest version of mpiBLAST installed. I started an mpiBLAST job (comparing the 347,899 sequences against each other) on 44 processors using the following command lines:

    mpiformatdb -i 36FungalJGIanigNbcin_M40 --nfrags=42 -p T --skip-reorder

    mpisub 44 /usr/local/mpiblast_tool/bin/mpiblast -p blastp -d 36FungalJGIanigNbcin_M40 -i /users/zzalssn4/scratch/mpiblast/work/36FungalJGIanigNbcin_M40 -m 8 -e 1e-5 -o /users/zzalssn4/scratch/mpiblast/work/36FungalJGIanigNbcin_M40.outF42C44

This job ran for about 12 days and produced only 22% (10,122,202) of the 47,342,483 known significant matches, even though every process was still running at full load (>90% CPU) on all of the specified processors.

For comparison, the same all-against-all BLAST job using standard BLAST on 36 processors, where I split the dataset into 36 chunks and blasted each chunk against the complete dataset on a single processor (sketched below), completed in less than 24 hours and produced all 47,342,483 significant matches.
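In outline, that chunked run was something like the following (the round-robin split step and the chunk file names are illustrative rather than my exact script; the legacy NCBI formatdb and blastall tools are assumed to be on the PATH):

    # Format the complete dataset once as a protein database.
    DB=36FungalJGIanigNbcin_M40
    NCHUNKS=36
    formatdb -i $DB -p T

    # Split the query FASTA into NCHUNKS pieces, keeping each
    # (possibly multi-line) record intact.
    awk -v n=$NCHUNKS '/^>/ { f = sprintf("chunk_%02d.fa", i++ % n) }
                       { print > f }' $DB

    # BLAST each chunk against the complete database; in practice each
    # chunk ran as its own single-processor cluster job, so the
    # background loop here is only for illustration.
    for f in chunk_*.fa; do
        blastall -p blastp -d $DB -i $f -m 8 -e 1e-5 -o $f.out &
    done
    wait

    # With -m 8 tabular output, one line = one significant hit.
    cat chunk_*.fa.out | wc -l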
Maybe I am missing something in how I am running mpiBLAST, so I would appreciate some help on whether the running time of mpiBLAST can be improved for a dataset of this size. Hope to hear from you soon.

Regards,
Intikhab

--
Dr. Intikhab Alam
Research Associate
School of Computer Science
University of Manchester
LF7, Kilburn Building, Oxford Road
Manchester, M13 9PL
United Kingdom
http://www.cs.man.ac.uk/~ialam