Hi, it seems that I've managed to run LiRA successfully with Slurm on the cancer data (from Li et al.). Here is the summary information:

But when running `lira ppower` on single-neuron sequencing data (BAM files of 35G to 237G in size), some I/O errors occur. Here is part of the error output:

```
[E::hts_open] fail to open file '.bulk.6248_200804_cer/power_bams/100.bulk.6248_200804_cer.bam'
samtools: Could not open ".bulk.6248_200804_cer/power_bams/100.bulk.6248_200804_cer.bam": Remote I/O error
```

In `local.power.function`, `samtools view` is applied with the mapping result as input. When using `batch.size=10`, it seems that the function `submit.jobs` submits 10 bash scripts for each Slurm job, and each bash script then calls `local.power.function`. Is it possible that this is the cause of the error, and should I use a smaller `batch.size`?
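Since the failures look like transient "Remote I/O error" hiccups on a network filesystem rather than missing files, one low-risk workaround (independent of tuning `batch.size`) is to wrap the `samtools view` call in a small retry loop with backoff. This is a generic sketch, not part of LiRA; the `samtools` invocation in the usage comment uses placeholder paths:

```shell
#!/usr/bin/env bash
# Retry a command with exponential backoff; a generic mitigation for
# transient "Remote I/O error" failures on network filesystems.
retry() {
  local attempts=$1; shift
  local delay=1
  local i
  for ((i = 1; i <= attempts; i++)); do
    "$@" && return 0          # success: stop retrying
    echo "attempt $i/$attempts failed: $*" >&2
    if ((i < attempts)); then
      sleep "$delay"
      delay=$((delay * 2))    # back off: 1s, 2s, 4s, ...
    fi
  done
  return 1                    # all attempts failed
}

# Placeholder usage (paths/region are illustrative, not from LiRA):
# retry 3 samtools view -b sample.bam chr1 > subset.bam
```

If many concurrent jobs are hitting the same storage, a wrapper like this only papers over contention; lowering the number of simultaneous `samtools` processes (e.g. via a smaller `batch.size`) remains the more direct fix.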