1. dataloader.py: index the splits by name (train, val, test) instead of 0, 1, 2 (easier to understand).
2. Use the allow-growth GPU option configuration (in eval.py and train.py).
3. eval.py: use a better way to merge eval options with the saved options in infos.
4. Add an att_hid_size option, so that the hidden size of the attention in show-attend-tell can be changed.
5. All 3 models: fix the beam search code (a bug I introduced).
6. All 3 models: allow different optimization settings.
7. ShowAttendTellModel: use fc7 to initialize the state; the old version is kept in ShowAttendTellModel_old.py.
8. ShowAttendTellModel: fix a huge bug (the initial state was used as the state input at every time step).
9. ShowAttendTellModel: shorten the code by moving reusable parts into a function.
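The dataloader change in item 1 amounts to keying the split index by name rather than by magic number. A minimal sketch (the names `SPLIT_NAMES` and `build_split_ix` are illustrative, not the repo's exact code):

```python
# Map the old integer split codes to readable names.
SPLIT_NAMES = {0: 'train', 1: 'val', 2: 'test'}

def build_split_ix(images):
    """`images`: list of dicts, each with an integer 'split' field (0/1/2).
    Returns image indices grouped under 'train'/'val'/'test' keys."""
    split_ix = {'train': [], 'val': [], 'test': []}
    for ix, img in enumerate(images):
        split_ix[SPLIT_NAMES[img['split']]].append(ix)
    return split_ix
```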
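For item 3, one reasonable way to merge eval options with the options saved in infos is: start from the saved training options, let a whitelist of eval-time flags override them, and fill in any keys an old checkpoint lacks. A hedged sketch (function name and key names are assumptions, not the repo's actual API):

```python
def merge_eval_options(eval_opt, saved_opt,
                       override_keys=('batch_size', 'beam_size')):
    """Merge command-line eval options into the options saved in infos.
    Saved options win by default; whitelisted keys and keys missing from
    the old checkpoint are taken from the eval options."""
    merged = dict(saved_opt)
    for k, v in eval_opt.items():
        if k in override_keys or k not in merged:
            merged[k] = v
    return merged
```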
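The bug in item 8 is the classic RNN-unrolling mistake: feeding the initial state into every time step instead of threading the updated state through the loop. A language-agnostic sketch of the fix (the `unroll`/`step_fn` names are illustrative, not the model's real code):

```python
def unroll(step_fn, init_state, inputs):
    """Run step_fn over inputs, carrying the state forward each step."""
    state = init_state
    outputs = []
    for x in inputs:
        # Correct: pass the *current* state.
        # The buggy version passed init_state here at every step.
        out, state = step_fn(x, state)
        outputs.append(out)
    return outputs, state
```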
README.md (+2 −3)
```diff
@@ -20,6 +20,7 @@ Currently if you want to use my code, you need to train the model from scratch (
 -~~sample_max~~
 -~~eval on unseen images~~
 - eval on test
+- visualize attention map

 # Requirements
 Python 2.7
```
```diff
@@ -69,9 +70,7 @@ If you'd like to evaluate BLEU/METEOR/CIDEr scores during training in addition t

 ### Caption images after training

-In this case you want to run the evaluation script on a pretrained model checkpoint.
-I trained a decent one on the [MS COCO dataset](http://mscoco.org/) that you can run on your images.
-The pretrained checkpoint can be downloaded here: [pretrained checkpoint link](http://cs.stanford.edu/people/karpathy/neuraltalk2/checkpoint_v1.zip) (600MB). It's large because it contains the weights of a finetuned VGGNet. Now place all your images of interest into a folder, e.g. `blah`, and run
+Now place all your images of interest into a folder, e.g. `blah`, and run
```