funasr/models/llm_asr_nar/model.py
@@ -315,8 +315,10 @@
 model_outputs = self.llm(inputs_embeds=inputs_embeds, attention_mask=attention_mask, labels=None)
 preds = torch.argmax(model_outputs.logits, -1)
 text = tokenizer.batch_decode(preds, add_special_tokens=False, skip_special_tokens=True)
 text = text[0].split(': ')[-1]
 text = text.strip()
 # preds = torch.argmax(model_outputs.logits, -1)
 ibest_writer = None
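The decode path in this hunk takes the argmax token IDs from the LLM logits, detokenizes them, then keeps only the text after the last `': '` (the model's reply portion of the prompt template) and trims whitespace. A minimal pure-Python sketch of that post-processing step, with the model and tokenizer omitted and the prompt format assumed for illustration:

```python
def postprocess(decoded: str) -> str:
    """Mirror `text[0].split(': ')[-1]` followed by `.strip()` from the patch:
    keep only the segment after the last ': ' and trim surrounding whitespace."""
    return decoded.split(': ')[-1].strip()

# Hypothetical decoded string; the actual prompt template is not shown in the hunk.
print(postprocess("USER: transcribe the audio ASSISTANT: hello world "))
```

Note that `split(': ')[-1]` silently returns the whole string when no `': '` is present, so the fallback behavior is benign for prompts without the expected separator.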