funasr/export/models/modules/multihead_att.py
@@ -75,8 +75,8 @@
         return x, cache

-torch_version = float(".".join(torch.__version__.split(".")[:2]))
-if torch_version >= 1.8:
+torch_version = tuple([int(i) for i in torch.__version__.split(".")[:2]])
+if torch_version >= (1, 8):
     import torch.fx
     torch.fx.wrap('preprocess_for_attn')
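The motivation for this patch is that parsing a version string with `float()` breaks for double-digit minor versions: `float("1.10")` evaluates to `1.1`, which compares as *less than* `1.8`, so the `torch.fx` wrapping would be skipped on PyTorch 1.10+. Comparing integer tuples is correct because Python compares tuples element-wise. A minimal sketch illustrating the difference (the helper names here are illustrative, not from the patch):

```python
def parse_version_float(version: str) -> float:
    # Old approach: join major.minor and parse as a float.
    # "1.10" becomes 1.1, losing the distinction between 1.1 and 1.10.
    return float(".".join(version.split(".")[:2]))

def parse_version_tuple(version: str) -> tuple:
    # New approach: keep major and minor as integers; tuples
    # compare element-wise, so (1, 10) >= (1, 8) is True.
    return tuple(int(i) for i in version.split(".")[:2])

# Float parsing misclassifies PyTorch 1.10 as older than 1.8:
print(parse_version_float("1.10.0") >= 1.8)    # False (buggy)
# Tuple comparison handles it correctly:
print(parse_version_tuple("1.10.0") >= (1, 8)) # True
```

The same tuple technique is what `torch.__version__.split(".")[:2]` feeds in the patched code, so any future two-digit minor release continues to compare correctly.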