An implementation of the image-to-poem model described in the paper "Beyond Narrative Description: Generating Poetry from Images by Multi-Adversarial Training", accepted as the best paper at ACM MM 2018.
Bei Liu, Jianlong Fu, Makoto P. Kato, Masatoshi Yoshikawa
Full text available at: https://arxiv.org/abs/1804.08473
The Img2poem model is a deep neural network that learns to generate poems from images.
The following dependencies are required; it is recommended to install them in a Conda environment (see the setup sketch after this list).
- python2.7
- tensorflow1.6
- mxnet
- opencv
- tqdm
- colorama
- flask
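A minimal setup sketch, assuming a Conda environment named `img2poem`; the exact package names and versions below are assumptions based on the list above and may need adjustment:

```
conda create -n img2poem python=2.7
conda activate img2poem
pip install tensorflow==1.6.0 mxnet opencv-python tqdm colorama flask
```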
Dataset | #Poems | Avg. lines/poem | Avg. words/line |
---|---|---|---|
MultiM-Poem | 8,292 | 7.2 | 5.7 |
UniM-Poem | 93,265 | 5.7 | 6.2 |
MultiM-Poem(Ex) | 26,161 | 5.4 | 5.9 |
Both datasets are provided as JSON files.
MultiM-Poem.json: image and poem pairs
[
  {
    "poem": str,
    "image_url": str,
    "id": int
  },
  ...
]
UniM-Poem.json: poem corpus
[
  {
    "poem": str,
    "id": int
  },
  ...
]
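A minimal sketch for loading and inspecting the two JSON files with plain Python (the file paths are assumptions; point them at wherever you saved the downloads):

```python
import json

# Image-poem pairs: each entry has "poem", "image_url", and "id".
with open("MultiM-Poem.json") as f:
    multim = json.load(f)

# Poem-only corpus: each entry has "poem" and "id".
with open("UniM-Poem.json") as f:
    unim = json.load(f)

print(len(multim), "image-poem pairs,", len(unim), "poems")

sample = multim[0]
print(sample["id"], sample["image_url"])
print(sample["poem"])
```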
Please download the pretrained models from https://1drv.ms/u/s!AkLgJBAHL_VFgSyyfpeGyGFZux56 and put them under "code/".
The following command generates a poem for an image:
python test.py
Type the relative path to the test image in the console, and the poem will be generated, e.g.:
../images/test.jpg
Example output:
the sun is singing in the forest wind
and let us go to the wind of the sun
let the sun be free
let us be the storm of heaven
and let us be the slow sun
we keep our own strength together
we live in love and hate
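To generate poems for many images without retyping paths, here is a hedged wrapper sketch that pipes each path into `test.py` on stdin; it assumes `test.py` reads a single relative path from the console and prints the poem to stdout, and that the wrapper is run from the `code/` directory:

```python
import os
import subprocess

image_dir = "../images"  # hypothetical directory of test images

for name in sorted(os.listdir(image_dir)):
    if not name.lower().endswith((".jpg", ".jpeg", ".png")):
        continue
    path = os.path.join(image_dir, name)
    # Feed the relative path on stdin, exactly as you would type it in the console.
    proc = subprocess.Popen(["python", "test.py"],
                            stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE)
    out, _ = proc.communicate((path + "\n").encode("utf-8"))
    print("==== " + name + " ====")
    print(out.decode("utf-8"))
```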
Here are some example poems generated by eight different methods for the same image.
If you find this repo useful in your research, please consider citing the following paper:
@inproceedings{liu2018beyond,
  title={Beyond narrative description: Generating poetry from images by multi-adversarial training},
  author={Liu, Bei and Fu, Jianlong and Kato, Makoto P and Yoshikawa, Masatoshi},
  booktitle={Proceedings of the 26th ACM International Conference on Multimedia},
  pages={783--791},
  year={2018}
}