
coconet run error~ #6

Open
ChaoXianSen opened this issue Oct 14, 2023 · 1 comment

Hi @Puumanamana!

I ran the pipeline with:

coconet run \
    --fasta R63.contigs.fa \
    --bam R63_sort.bam \
    --output coconet_binning_results \
    --threads 16

Error report:

00:50:09 (Mem: 0.2 GB) INFO: Using 16 threads
00:50:09 (Mem: 0.2 GB) INFO: Features: coverage, composition
00:50:11 (Mem: 0.2 GB) INFO: Processing 681,530 contigs
00:50:17 (Mem: 0.2 GB) INFO: Length filter (L>2048 bp) -> 43,057 contigs remaining
00:56:50 (Mem: 0.2 GB) INFO: Complete contigs filtering (DTR > 10 bp) -> 42993 contigs remaining
00:56:50 (Mem: 0.2 GB) INFO: Processing alignments and converting to h5 format
01:19:26 (Mem: 0.5 GB) INFO: Coverage filtering summary:
- 28,185,071 total reads
- 87.3% reads (mapped)
- 100.0% reads (primary alignments)
- 81.8% reads (mapq>30)
- 100.0% reads (coverage>50%)
- 74.7% reads (flag & 3596 == 0)
01:20:29 (Mem: 0.5 GB) INFO: 43,057 contigs are only present in the composition. Taking the intersection (0 contigs)
01:20:29 (Mem: 0.5 GB) INFO: Prevalence filter (prevalence>=2) -> 0 contigs remaining
01:20:29 (Mem: 0.5 GB) INFO: Making train/test examples
Traceback (most recent call last):
  File "/public/home/bioinfo_wang/00_software/miniconda3/envs/coconet/bin/coconet", line 10, in <module>
    sys.exit(main())
  File "/public/home/bioinfo_wang/00_software/miniconda3/envs/coconet/lib/python3.7/site-packages/coconet/coconet.py", line 61, in main
    make_train_test(cfg)
  File "/public/home/bioinfo_wang/00_software/miniconda3/envs/coconet/lib/python3.7/site-packages/coconet/coconet.py", line 167, in make_train_test
    assembly_idx = dict(test=np.random.choice(n_ctg, n_ctg_for_test))
  File "mtrand.pyx", line 909, in numpy.random.mtrand.RandomState.choice
ValueError: a must be greater than 0 unless no samples are taken

Any idea about this problem?

Thanks!
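For reference, the ValueError at the bottom of the traceback comes straight from NumPy: `np.random.choice(a, size)` refuses an integer population `a` of zero when samples are requested. A minimal reproduction with hypothetical values (`n_ctg` is 0 here to mirror the "0 contigs remaining" reported by the prevalence filter in the log; `n_ctg_for_test` is made up):

```python
import numpy as np

# n_ctg mirrors the "0 contigs remaining" after the prevalence filter;
# n_ctg_for_test is a hypothetical test-set size.
n_ctg = 0
n_ctg_for_test = 500

try:
    np.random.choice(n_ctg, n_ctg_for_test)
except ValueError as err:
    # Raised because the population (a=0) is empty while samples are requested.
    print(err)
```

So the crash is a downstream symptom: by the time `make_train_test` samples test contigs, every contig has already been filtered out.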


Vini2 commented Sep 2, 2024

Hi @Puumanamana,

I'm getting the same error with the following command.

coconet run --fasta contigs.fasta --bam bam_files/*.bam --output coconet_results --threads 64

This is the output I get.

15:39:32 (Mem:  0.4 GB)    <CoCoNet>    INFO: Using 64 threads
15:39:32 (Mem:  0.4 GB)    <CoCoNet>    INFO: Features: coverage, composition
15:39:34 (Mem:  0.4 GB) <preprocessing> INFO: Processing 571,798 contigs
15:39:38 (Mem:  0.4 GB) <preprocessing> INFO: Length filter (L>2048 bp) -> 31,394 contigs remaining
15:42:22 (Mem:  0.4 GB) <preprocessing> INFO: Complete contigs filtering (DTR > 10 bp) -> 31207 contigs remaining
15:42:22 (Mem:  0.4 GB) <preprocessing> INFO: Processing alignments and converting to h5 format
15:45:10 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 1,000 contigs processed
15:46:22 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 2,000 contigs processed
15:47:07 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 3,000 contigs processed
15:47:43 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 4,000 contigs processed
15:48:13 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 5,000 contigs processed
15:48:39 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 6,000 contigs processed
15:49:03 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 7,000 contigs processed
15:49:25 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 8,000 contigs processed
15:49:44 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 9,000 contigs processed
15:50:03 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 10,000 contigs processed
15:50:19 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 11,000 contigs processed
15:50:35 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 12,000 contigs processed
15:50:50 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 13,000 contigs processed
15:51:03 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 14,000 contigs processed
15:51:17 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 15,000 contigs processed
15:51:28 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 16,000 contigs processed
15:51:40 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 17,000 contigs processed
15:51:51 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 18,000 contigs processed
15:52:02 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 19,000 contigs processed
15:52:13 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 20,000 contigs processed
15:52:26 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 21,000 contigs processed
15:52:36 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 22,000 contigs processed
15:52:45 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 23,000 contigs processed
15:52:54 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 24,000 contigs processed
15:53:03 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 25,000 contigs processed
15:53:11 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 26,000 contigs processed
15:53:19 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 27,000 contigs processed
15:53:27 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 28,000 contigs processed
15:53:35 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 29,000 contigs processed
15:53:43 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 30,000 contigs processed
15:53:50 (Mem:  0.8 GB) <preprocessing> DEBUG: Coverage: 31,000 contigs processed
15:53:55 (Mem:  0.7 GB) <preprocessing> INFO: Coverage filtering summary:
                                              - 32,155,348 total reads
                                              - 94.9% reads (mapped)
                                              - 100.0% reads (primary alignments)
                                              - 89.1% reads (mapq>30)
                                              - 98.2% reads (coverage>50%)
                                              - 89.2% reads (flag & 3596 == 0)
15:54:11 (Mem:  0.7 GB) <preprocessing> INFO: 31,394 contigs are only present in the composition. Taking the intersection (0 contigs)
15:54:11 (Mem:  0.7 GB) <preprocessing> INFO: Prevalence filter (prevalence>=2) -> 0 contigs remaining
15:54:11 (Mem:  0.7 GB)   <learning>    INFO: Making train/test examples
Traceback (most recent call last):
  File "/home/mall0133/miniconda3/envs/coconet/bin/coconet", line 10, in <module>
    sys.exit(main())
  File "/home/mall0133/.local/lib/python3.7/site-packages/coconet/coconet.py", line 61, in main
    make_train_test(cfg)
  File "/home/mall0133/.local/lib/python3.7/site-packages/coconet/coconet.py", line 167, in make_train_test
    assembly_idx = dict(test=np.random.choice(n_ctg, n_ctg_for_test))
  File "mtrand.pyx", line 909, in numpy.random.mtrand.RandomState.choice
ValueError: a must be greater than 0 unless no samples are taken

I tried changing the --min-ctg-len parameter to 1024, 512, and 256, but every run failed with the same error.

I'd appreciate any help fixing this problem.

Thanks!
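Both tracebacks die on the same unguarded `np.random.choice(n_ctg, n_ctg_for_test)` call in `make_train_test` after the prevalence filter reports 0 contigs. As a sketch only (not CoCoNet's actual code; `sample_test_contigs` and its error wording are hypothetical), the call could be guarded so an empty contig set fails with an actionable message instead of a NumPy ValueError:

```python
import numpy as np

def sample_test_contigs(n_ctg, n_test, rng=np.random):
    """Hypothetical guard around the np.random.choice call that fails above."""
    if n_ctg <= 0:
        # Plausible cause given the logs: with too few samples/BAM files,
        # the prevalence>=2 filter can discard every remaining contig.
        raise RuntimeError(
            "No contigs left after filtering; check the prevalence and "
            "coverage filter settings."
        )
    # Never request more test contigs than exist.
    return rng.choice(n_ctg, min(n_test, n_ctg))

idx = sample_test_contigs(100, 10)
print(len(idx))  # 10
```

With a guard like this, the failure surfaces at the filtering step that caused it rather than deep inside NumPy's sampler.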
