This repository has been archived by the owner on Nov 1, 2021. It is now read-only.

Fix zero gradient for subtensor assignment. #127

Open · wants to merge 2 commits into master
Commits on Jul 13, 2016

  1. Fix zero gradient for subtensor assignment.

    The variable being assigned has its gradient correctly computed as
    g[k], but later, when the gradient of the variable being assigned to
    is computed, g[k] is set to 0. That zeroing yields the correct
    gradient for the assignment target, but because the two gradients
    share the same storage, it also incorrectly overwrites the earlier
    gradient with zeros. This fixes that.
    bartvm authored and Bart van Merriënboer committed Jul 13, 2016
    db06c97
  2. 1fa109a
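
The aliasing bug described in the first commit message can be sketched in NumPy. This is a hypothetical illustration only: the function names and the use of NumPy are assumptions, since the repository's actual (Lua) code is not shown here. The key point is that slicing returns a view, so zeroing `g[k]` in place also clobbers a gradient that was taken from the same storage.

```python
import numpy as np

def subtensor_assign_grads_buggy(g, k):
    # Hypothetical sketch of the bug: gradient w.r.t. the assigned
    # value y is the slice g[k] -- a VIEW sharing storage with g.
    grad_y = g[k]
    # Gradient w.r.t. the assignment target x zeroes the assigned
    # positions in place, which also overwrites grad_y via the
    # shared storage.
    g[k] = 0
    grad_x = g
    return grad_x, grad_y

def subtensor_assign_grads_fixed(g, k):
    # The fix: detach grad_y from g's storage before zeroing.
    grad_y = g[k].copy()
    g[k] = 0
    return g, grad_y

g = np.array([1.0, 2.0, 3.0])
k = slice(0, 2)

gx, gy = subtensor_assign_grads_buggy(g.copy(), k)
# gy has been clobbered to zeros because it aliased g.

gx2, gy2 = subtensor_assign_grads_fixed(g.copy(), k)
# gy2 correctly holds the original slice values [1.0, 2.0],
# while gx2 is [0.0, 0.0, 3.0] as intended.
```

The fix amounts to copying the slice before mutating the underlying storage, so both gradients end up correct.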