From c441eb08c44dfd4a7a8c68970fd3ebe7943d06ee Mon Sep 17 00:00:00 2001
From: shixian.shi <shixian.shi@alibaba-inc.com>
Date: Thu, 09 Mar 2023 15:26:03 +0800
Subject: [PATCH] tp_inference device bug

---
 funasr/runtime/python/onnxruntime/README.md |    2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/funasr/runtime/python/onnxruntime/README.md b/funasr/runtime/python/onnxruntime/README.md
index ca6f6b6..6ed9849 100644
--- a/funasr/runtime/python/onnxruntime/README.md
+++ b/funasr/runtime/python/onnxruntime/README.md
@@ -11,7 +11,7 @@
 
 ### Steps:
 1. Export the model.
-   - Command: (`Tips`: torch 1.11.0 is required.)
+   - Command: (`Tips`: torch >= 1.11.0 is required.)
 
       ```shell
       python -m funasr.export.export_model [model_name] [export_dir] [true]

--
Gitblit v1.9.1