[root@c8e3006d5ed0 ~/rwkv-cpp-cuda/examples/terminalchat/release]$ ./chat
/root/work/rwkv-cpp-cuda/examples/terminalchat/release/model.bin
n_layers: 32
n_embed: 4096
loading: xbuf
loading: embed
loading: layernorms
loading: state_xy
loading: state_aa
loading: state_bb
loading: state_pp
loading: state_dd
loading: buffer1
loading: buffer2
loading: buffer3
loading: buffer4
loading: mix_k
loading: mix_v
loading: mix_r
loading: km
loading: vm
loading: rm
loading: kr
loading: vr
loading: rr
loading: o1
loading: o2
loading: o3
loading: att_out
loading: att_out_r
loading: att_out_o
loading: ffn_mix_k
loading: ffn_mix_v
loading: ffn_k
loading: ffn_v
loading: ffn_r
loading: ffn_kr
loading: ffn_vr
loading: ffn_rr
loading: ffn_ko
loading: ffn_vo
loading: ffn_ro
loading: ffn_k_buffer
loading: ffn_v_buffer
loading: ffn_r_buffer
loading: decay
loading: bonus
loading: head
loading: head_r
loading: head_o
Loaded model
loading context
attout -0.0620827 -0.658374
attout 0.066268 -1.02534
attout 0.0661107 -0.864594
attout 0.584045 -0.852568
attout 0.818303 -1.11242
attout 0.858096 -0.945886
attout 0.616216 -0.832412
attout 0.894563 -0.913151
attout 1.11017 -1.40658
attout 0.576667 -1.19348
attout 0.776536 -0.428046
attout 0.632295 0.881825
attout
Ahh, sorry, you must have gotten a debug commit
should be good now
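For context, the repeated "attout ..." lines in the log above look like leftover debug printing of attention-output values from the debug commit mentioned. Below is a minimal C++ sketch of the kind of compile-time guard that keeps such prints out of normal builds; the macro and variable names are hypothetical and not taken from rwkv-cpp-cuda.

#include <cstdio>

// Hypothetical illustration only: gate verbose per-layer debug output behind a
// compile-time flag so leftover prints cannot leak into release builds.
// Build with -DRWKV_DEBUG_ATTOUT to enable the extra output.
#ifdef RWKV_DEBUG_ATTOUT
#define DEBUG_ATTOUT(x, y) std::printf("attout %g %g\n", (x), (y))
#else
#define DEBUG_ATTOUT(x, y) ((void)0)
#endif

int main() {
    // Sample values taken from the log above, purely for demonstration.
    float a = -0.0620827f, b = -0.658374f;
    DEBUG_ATTOUT(a, b);  // prints only when the debug flag is defined
    return 0;
}

With the flag left undefined, a build like this would emit only the loading messages and the prompt, which is consistent with the follow-up comment that the issue is resolved after the fix.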