[AMP OP&Test] add fp16/bf16 unittest for softmax_with_cross_entropy ops #52412
Conversation
Your PR has been submitted successfully. Thank you for your contribution to this open-source project!
```python
self.dtype = np.uint16

# NOTE: numpy bf16 has very low accuracy, so use float32 for the numpy check.
date_type = np.float32 if core.is_compiled_with_rocm() else np.float64
```
Just use float32 here.
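A minimal sketch of the suggested simplification (the variable name is kept from the diff above; the `logits` line is a hypothetical illustration of how the dtype feeds the numpy reference, not part of the PR):

```python
# numpy has no native bf16, so compute the reference in float32;
# float64 precision buys nothing when comparing against bf16 results.
date_type = np.float32
logits = np.random.uniform(0.1, 1.0, self.shape).astype(date_type)
```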
```python
def test_check_output(self):
    if self.python_api is not None:
        self.check_output(atol=1e-2)
```
For bf16 the default atol is already 1e-2, so there is no need to set it here.
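A sketch of the test after dropping the redundant tolerance, assuming the OpTest default applies as the comment says:

```python
def test_check_output(self):
    if self.python_api is not None:
        # OpTest already defaults to atol=1e-2 for bf16 outputs.
        self.check_output()
```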
```python
def test_check_grad(self):
    if self.python_api is not None:
        self.check_grad(["Logits"], "Loss", max_relative_error=0.1)
```
For max_relative_error, try the default value first and see whether the test passes; adjust it only if it does not.
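A sketch of the gradient check with the default tolerance, as suggested:

```python
def test_check_grad(self):
    if self.python_api is not None:
        # Try the default max_relative_error first; loosen it only if CI fails.
        self.check_grad(["Logits"], "Loss")
```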
Done
The fp16 unit tests also need to try the default tolerances first.
```diff
@@ -508,12 +509,12 @@ def setUp(self):

     def test_check_output(self):
         if self.python_api is not None:
-            self.check_output(atol=1e-2)
+            self.check_output()
         self.check_output(atol=1e-2)
```
This atol needs to be removed as well.
```diff
@@ -917,6 +918,53 @@ def initParams(self):
         self.use_softmax = True


+class TestSoftmaxWithCrossEntropyOpBF16(TestSoftmaxWithCrossEntropyOp):
```
Add a skipIf decorator to skip places that do not support bf16.
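A sketch of the suggested guard, following the skipIf pattern common in Paddle op tests (it assumes `core.is_bfloat16_supported` is available via this test file's imports):

```python
import unittest

from paddle.fluid import core


@unittest.skipIf(
    not core.is_compiled_with_cuda()
    or not core.is_bfloat16_supported(core.CUDAPlace(0)),
    "place does not support bfloat16",
)
class TestSoftmaxWithCrossEntropyOpBF16(TestSoftmaxWithCrossEntropyOp):
    ...
```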
```python
self.attrs['axis'] = self.axis

def test_check_output(self):
    if self.python_api is not None:
```
Call check_output_with_place/check_grad_with_place here; otherwise the checks will fall back to CPU and fail.
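A minimal sketch of the place-pinned variant (assuming a CUDA place; the argument order follows OpTest's `check_output_with_place`/`check_grad_with_place`):

```python
def test_check_output(self):
    if self.python_api is not None:
        # Pin the check to the GPU; CPU has no bf16 kernel for this op.
        self.check_output_with_place(core.CUDAPlace(0))

def test_check_grad(self):
    if self.python_api is not None:
        self.check_grad_with_place(core.CUDAPlace(0), ["Logits"], "Loss")
```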
Done
LGTM
PR types
Others

PR changes
OPs

Describe
Add fp16/bf16 unit tests for softmax_with_cross_entropy.