
A config-parsing bug when a batch_norm layer is placed inside a RecurrentLayerGroup #961

Closed
lcy-seso opened this issue Dec 20, 2016 · 2 comments
@lcy-seso (Contributor)
When a batch norm layer is one of the layers inside a RecurrentLayerGroup's step function, config parsing fails.

For example, in the config below, the input layer A lives inside a recurrent layer group whose name is "layer_group":

    Layer(name="B",
          type='batch_norm',
          active_type="relu",
          inputs=Input("A", image=Image(channels=128, img_size=1)), )

When inputs=Input("A", image=Image(channels=128, img_size=1)) is evaluated, input[0].input_layer_name is renamed by the MakeLayerNameInParentSubmodel function to A@layer_group.

Then, while the batch norm layer's config is being parsed, the following code (lines 1840–1847 of config_parser.py) runs to implicitly append the mean and std inputs:

        for i in xrange(2):
            inputs.append(
                Input(
                    inputs[0].input_layer_name,
                    initial_std=0.0,
                    initial_mean=0.0,
                    is_static=True,
                    is_shared=is_shared, ))

Here Input calls MakeLayerNameInParentSubmodel again, so inputs[0].input_layer_name (already A@layer_group at this point) becomes A@layer_group@layer_group. This second renaming causes the error.
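The double renaming can be reproduced in isolation. Below, make_name_in_parent_submodel is a hypothetical, heavily simplified stand-in for the real MakeLayerNameInParentSubmodel (not the actual parser code), and the idempotent variant sketches one possible fix direction, not necessarily the fix adopted for this issue:

```python
# Hypothetical simplification of MakeLayerNameInParentSubmodel:
# qualify a layer name with its enclosing submodel's name.
def make_name_in_parent_submodel(name, submodel="layer_group"):
    return name + "@" + submodel

first = make_name_in_parent_submodel("A")
print(first)   # A@layer_group -- the correct qualified name

# The implicitly appended mean/std inputs pass the already-qualified
# name back through the same function, qualifying it a second time:
second = make_name_in_parent_submodel(first)
print(second)  # A@layer_group@layer_group -- triggers the parse error

# One possible fix direction: make the renaming idempotent, so a name
# that already carries the submodel suffix is left unchanged.
def make_name_idempotent(name, submodel="layer_group"):
    suffix = "@" + submodel
    return name if name.endswith(suffix) else name + suffix

print(make_name_idempotent(first))  # stays A@layer_group
```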

@qingqing01 (Contributor)

See #966.

@lcy-seso (Contributor, Author)

Thanks @qingqing01 ~
