Make reductionInferShape conservative to fix gorgonia#384 #411
Merged: chewxy merged 2 commits into gorgonia:master from wzzhu:master on Jun 15, 2020
Conversation
reductionInferShape currently does not respect the along parameter: it aggressively squeezes all size-1 dimensions. This not only affects normal tensor operations, but sometimes also breaks the backprop autodiff algorithm when the network contains BroadcastAdd, resulting in a crash when calling Grad(). This change strictly respects the along parameter, e.g.:

- (100, 1) along 0 reduces to shape (1) instead of ()
- (1, 64, 1, 64) along 3 reduces to (1, 64, 1)
- (64, 1, 3, 2) along (2, 3) reduces to (64, 1)

Unit tests have been fixed accordingly.
After changing the reduction ops to subtract len(along) from the result's dimensionality, SumOp's dimension needs to be adjusted to match; otherwise SymDiff crashes for Sum in calcBroadcastShape.
Member: LGTM. Sorry for the late review. Life has been nuts and I barely have time at the computer.