Hi, I am trying to run Predict.lua on a toy text and WAV file from LibriSpeech.
The toy text (test.txt) is: HE HOPED THERE WOULD BE STEW FOR DINNER TURNIPS AND CARROTS AND BRUISED POTATOES AND FAT MUTTON PIECES TO BE LADLED OUT IN THICK PEPPERED FLOUR FATTENED SAUCE
When I run th Predict.lua -modelPath libri_deepspeech.t7.1 -audioPath prepare_datasets/LibriSpeech/test/test-clean/1089/134686/test.wav -dictionaryPath prepare_datasets/LibriSpeech/test/test-clean/1089/134686/test.txt
I get this error: /home/byuns9334/torch/install/bin/luajit: ./Mapper.lua:59: attempt to concatenate a nil value
stack traceback:
./Mapper.lua:59: in function 'tokensToText'
Predict.lua:44: in main chunk
[C]: in function 'dofile'
...9334/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:150: in main chunk
[C]: at 0x00405d50
How should I resolve this error?
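For context on what the traceback likely means: tokensToText in Mapper.lua presumably walks the predicted token indices and concatenates the character each index maps to, using a table built from the file given as -dictionaryPath. If an index has no entry in that table, the lookup yields nil and the concatenation fails, which matches the error above. Here is a minimal Python sketch of that failure mode (the function names and dictionary file format are assumptions for illustration, not the actual deepspeech.torch code):

```python
# Hypothetical reconstruction of the tokensToText failure mode.
# Assumption: the dictionary file lists the model's alphabet, one symbol
# per line, and the mapper builds an index -> character table from it.

def build_index_to_char(dictionary_lines):
    """Build an index -> character table, one symbol per dictionary line."""
    return {i: ch.strip() for i, ch in enumerate(dictionary_lines, start=1)}

def tokens_to_text(tokens, index_to_char):
    """Concatenate the character for each predicted token index."""
    out = []
    for t in tokens:
        ch = index_to_char.get(t)
        if ch is None:
            # In Lua, concatenating this nil raises exactly
            # "attempt to concatenate a nil value".
            raise KeyError(f"token {t} has no dictionary entry; "
                           "is -dictionaryPath pointing at the right file?")
        out.append(ch)
    return "".join(out)

# With a proper alphabet dictionary, every token id resolves:
good_dict = build_index_to_char(["A", "B", "C"])
print(tokens_to_text([1, 3, 2], good_dict))  # ACB

# Passing a transcript instead of the alphabet file leaves most
# token ids unmapped, so the lookup comes back empty:
bad_dict = build_index_to_char(["HE HOPED THERE WOULD BE STEW ..."])
try:
    tokens_to_text([1, 3, 2], bad_dict)
except KeyError as e:
    print("lookup failed:", e)
```

One cause consistent with this error, under the assumptions above, would be that -dictionaryPath expects the model's character dictionary rather than the transcript, so pointing it at test.txt leaves token ids unmapped; checking what file format the repo's Mapper.lua actually reads at line 59 should confirm or rule this out.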