loss_t grows when training M-net? #28

Open
Jason-xin opened this issue Sep 2, 2019 · 1 comment
@Jason-xin

loss_t grows when training M-net?

@vlordier

You can train the Trimap (T-net) and Matting (M-net) networks separately by freezing whichever one you are not training: set requires_grad = False on its parameters in train.py.

For instance, after loading the model:

# depending on the training phase, freeze the part of the model
# that is not being trained
if args.train_phase == 't_net':
	for name, child in model.named_children():
		if name == 'm_net':
			for param in child.parameters():
				param.requires_grad = False

if args.train_phase == 'm_net':
	for name, child in model.named_children():
		if name == 't_net':
			for param in child.parameters():
				param.requires_grad = False
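
Note that requires_grad = False only stops gradients from being computed; it is also common to pass just the still-trainable parameters when building the optimizer. A minimal sketch, assuming the optimizer is created after the freezing step above (the optimizer type and learning rate here are placeholders, not the repo's actual settings):

import torch

# collect only the parameters that were not frozen above, so the
# optimizer never touches the frozen sub-network
trainable_params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable_params, lr=1e-4)  # lr is illustrative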

Also, you can change the loss so that each phase optimizes only its own part of the network:

if args.train_phase == 't_net':
	loss = L_t

if args.train_phase == 'm_net':
	loss = L_p

if args.train_phase == 'end_to_end':
	loss = L_p + 0.01 * L_t
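
For context, a rough sketch of where these pieces sit in a training step; the output and criterion names below are hypothetical stand-ins for whatever train.py actually computes:

optimizer.zero_grad()
trimap_out, alpha_out = model(image)           # T-net and M-net outputs (assumed names)
L_t = trimap_criterion(trimap_out, trimap_gt)  # trimap loss (hypothetical criterion)
L_p = alpha_criterion(alpha_out, alpha_gt)     # alpha prediction loss (hypothetical criterion)
# select the phase-dependent loss via the if-chain above, then:
loss.backward()
optimizer.step()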
