
the deploy.prototxt of LargeMargin_Softmax_Loss #20

Open
qinxianyuzi opened this issue Oct 19, 2017 · 5 comments

Comments

@qinxianyuzi

After finishing training, how can I use LargeMargin_Softmax_Loss in the deploy.prototxt? Thank you!

@mnikitin

You should export the LargeMarginInnerProduct weights into a new InnerProduct layer without a bias term.
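The key point is the blob count: LargeMarginInnerProduct stores a single weight blob, while a default InnerProduct expects two (weights plus bias). A minimal sketch of the export with plain numpy arrays standing in for the Caffe blobs (shapes and names are made up; real code would go through pycaffe's net.params):

```python
import numpy as np

# LargeMarginInnerProduct stores exactly one learnable blob: the weight
# matrix W of shape (num_output, input_dim). Shapes here are hypothetical.
num_output, input_dim = 2, 512
lmip_blobs = [np.random.randn(num_output, input_dim).astype(np.float32)]

# A default InnerProduct layer holds two blobs (weights + bias); with
# bias_term: false it holds only one, so the blob counts line up.
ip_blobs = [np.empty_like(lmip_blobs[0])]

# "Export" = copy the single weight blob across.
ip_blobs[0][...] = lmip_blobs[0]
```

With matching layer names and matching blob counts, Caffe's weight loading does this copy automatically when you point the deploy net at the trained caffemodel.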

@cocowf

cocowf commented Jan 21, 2018

I have the same question: what should the deploy.prototxt look like for LargeMargin_Softmax_Loss?
How should deploy.prototxt be edited?

@gxstudy

gxstudy commented Apr 11, 2018

May I ask how to export the LargeMarginInnerProduct weights into a new InnerProduct layer without a bias term in the deploy file?

Based on my understanding, when I train the network, I use:

layer {
  name: "fc9"
  type: "LargeMarginInnerProduct"
  bottom: "fc8"
  bottom: "label"
  top: "fc9"
  top: "lambda"
  param {
    name: "fc9"
    lr_mult: 10
  }
  largemargin_inner_product_param {
    num_output: 2
    type: QUADRUPLE
    base: 1000
    gamma: 0.000025
    power: 35
    iteration: 0
    lambda_min: 0
    weight_filler {
      type: "msra"
    }
  }
  include {
    phase: TRAIN
  }
}

In the deploy, I put:

layer {
  name: "fc9"
  type: "InnerProduct"
  bottom: "fc8"
  top: "fc9"
  param {
    lr_mult: 10
    decay_mult: 1
  }
  inner_product_param {
    num_output: 2
  }
}

Then I got an error when testing through the Python interface:

Check failed: target_blobs.size() == source_layer.blobs_size() (2 vs. 1) Incompatible number of blobs for layer fc9 *** Check failure stack trace: ***

I guess it is because LargeMarginInnerProduct has two bottoms and two tops. Could you please let me know where I am wrong? Thanks a lot. @mnikitin @wy1iu @ydwen

@mnikitin

@gxstudy Your deploy fc layer should look like:

layer {
  name: "fc9"
  type: "InnerProduct"
  bottom: "fc8"
  top: "fc9"
  inner_product_param {
    num_output: 2
    bias_term: false
  }
}
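For context on why a plain InnerProduct suffices at test time: the large-margin modification only changes the angle for the ground-truth class during training, so without labels the layer's forward pass reduces to the bias-free product y = W·x. A minimal numpy sketch of that deploy-time forward pass (shapes and the seed are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((2, 512)).astype(np.float32)  # learned fc9 weights
x = rng.standard_normal(512).astype(np.float32)       # fc8 activation, one sample

# Deploy-time fc9: InnerProduct with bias_term: false, i.e. no bias added
y = W @ x                 # two class scores
pred = int(np.argmax(y))  # predicted class index
```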

@gxstudy

gxstudy commented Apr 13, 2018

Thank you so much, it works. @mnikitin
