
Applying TTA during inference on LFPnet #1

Open
ofirkris opened this issue Feb 15, 2022 · 1 comment

@ofirkris

In the following paper: "IMPROVING DEEP IMAGE MATTING VIA LOCAL SMOOTHNESS ASSUMPTION"
you applied TTA during inference to achieve better results for LFPnet. Can you point me to the relevant code so I can experiment with this on LFPnet?

@kfeng123
Owner

Hi,

First, I should say that even with TTA, our reported results are not better than those of LFPnet without TTA. Nevertheless, when training for 200 epochs our results are better, although those results are not reported in our paper.
We might achieve even better results by adding some global modules, but that technique is unrelated to the local smoothness assumption, the theme of our paper, so we did not pursue it.

In our paper, the reported results for LFPnet are taken directly from their paper; see arxiv.org/ftp/arxiv/papers/2109/2109.12252.pdf, page 6. Their TTA code is in github.com/QLYoo/LFPNet/blob/master/eval(TTA).py.

Our TTA code has only been tested with our proposed LSANet. The function that performs TTA is inference_rotation_multiscale() in the file matting/inference.py; this TTA scheme is also described in our paper.
We also provide some other TTA functions in matting/inference.py, but their results are not reported in the paper (some of them may be better than the reported one, but may take more time to run).
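
For readers who just want the general idea, here is a minimal sketch of rotation + multi-scale TTA for matting inference. The model signature, tensor shapes, and scale set are assumptions for illustration only; the actual implementation is inference_rotation_multiscale() in matting/inference.py.

```python
import torch
import torch.nn.functional as F

def tta_rotation_multiscale(model, image, trimap, scales=(0.8, 1.0, 1.25)):
    """Average the predicted alpha over 4 rotations x several scales.

    Assumed shapes: image (1, 3, H, W), trimap (1, 1, H, W);
    model(image, trimap) is assumed to return a (1, 1, h, w) alpha matte.
    """
    _, _, H, W = image.shape
    preds = []
    for s in scales:
        h, w = int(H * s), int(W * s)
        img_s = F.interpolate(image, size=(h, w), mode="bilinear", align_corners=False)
        tri_s = F.interpolate(trimap, size=(h, w), mode="nearest")
        for k in range(4):  # rotate by 0, 90, 180, 270 degrees
            img_r = torch.rot90(img_s, k, dims=(2, 3))
            tri_r = torch.rot90(tri_s, k, dims=(2, 3))
            with torch.no_grad():
                alpha = model(img_r, tri_r)
            # undo the rotation and resize back to the original resolution
            alpha = torch.rot90(alpha, -k, dims=(2, 3))
            alpha = F.interpolate(alpha, size=(H, W), mode="bilinear", align_corners=False)
            preds.append(alpha)
    return torch.stack(preds, dim=0).mean(dim=0)
```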

I think our TTA function may be better than the TTA functions used in other works, but I have not experimented to verify this claim.

If you have any issues when reproducing our results, please do not hesitate to contact me.

By the way, it seems that the code of LFPnet uses many techniques from FBANet. However, while LFPnet has been published, FBANet still has not been formally published. What a pity! FBANet is really good work.
