Changed files:

- runtime/docs/SDK_advanced_guide_offline.md
- runtime/docs/SDK_advanced_guide_offline_en.md
- runtime/docs/SDK_advanced_guide_offline_en_zh.md
- runtime/docs/SDK_advanced_guide_offline_zh.md
- runtime/docs/SDK_advanced_guide_online.md
- runtime/docs/SDK_advanced_guide_online_zh.md
- runtime/onnxruntime/readme.md
- runtime/websocket/readme.md
- runtime/websocket/readme_zh.md
runtime/docs/SDK_advanced_guide_offline.md

````diff
@@ -83,13 +83,13 @@
 Introduction to run_server.sh parameters:
 ```text
 --download-model-dir: Model download address, download models from Modelscope by setting the model ID.
---model-dir: Modelscope model ID.
+--model-dir: modelscope model ID or local model path.
 --quantize: True for quantized ASR model, False for non-quantized ASR model. Default is True.
---vad-dir: Modelscope model ID.
+--vad-dir: modelscope model ID or local model path.
 --vad-quant: True for quantized VAD model, False for non-quantized VAD model. Default is True.
---punc-dir: Modelscope model ID.
+--punc-dir: modelscope model ID or local model path.
 --punc-quant: True for quantized PUNC model, False for non-quantized PUNC model. Default is True.
---itn-dir modelscope model ID
+--itn-dir modelscope model ID or local model path.
 --port: Port number that the server listens on. Default is 10095.
 --decoder-thread-num: Number of inference threads that the server starts. Default is 8.
 --io-thread-num: Number of IO threads that the server starts. Default is 1.
````
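The change above means the `-dir` flags accept a local directory in place of a modelscope model ID. A minimal launch sketch, using only the flags documented in the hunk (all directory names below are hypothetical placeholders, not real model paths):

```shell
# Build the launch command for the offline server with local model
# directories instead of modelscope IDs. Paths are placeholders.
cmd="bash run_server.sh \
  --model-dir /workspace/models/asr-offline-onnx \
  --vad-dir /workspace/models/vad-onnx \
  --punc-dir /workspace/models/punc-onnx \
  --port 10095 \
  --decoder-thread-num 8"
echo "$cmd"
```

The same flags still accept a modelscope model ID, in which case `--download-model-dir` controls where the model is fetched to.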
runtime/docs/SDK_advanced_guide_offline_en.md

````diff
@@ -65,13 +65,13 @@
 Introduction to run_server.sh parameters:
 ```text
 --download-model-dir: Model download address, download models from Modelscope by setting the model ID.
---model-dir: Modelscope model ID.
+--model-dir: modelscope model ID or local model path.
 --quantize: True for quantized ASR model, False for non-quantized ASR model. Default is True.
---vad-dir: Modelscope model ID.
+--vad-dir: modelscope model ID or local model path.
 --vad-quant: True for quantized VAD model, False for non-quantized VAD model. Default is True.
---punc-dir: Modelscope model ID.
+--punc-dir: modelscope model ID or local model path.
 --punc-quant: True for quantized PUNC model, False for non-quantized PUNC model. Default is True.
---itn-dir modelscope model ID
+--itn-dir modelscope model ID or local model path.
 --port: Port number that the server listens on. Default is 10095.
 --decoder-thread-num: Number of inference threads that the server starts. Default is 8.
 --io-thread-num: Number of IO threads that the server starts. Default is 1.
````
runtime/docs/SDK_advanced_guide_offline_en_zh.md

````diff
@@ -150,13 +150,13 @@
 **run_server.sh命令参数介绍**
 ```text
 --download-model-dir 模型下载地址,通过设置model ID从Modelscope下载模型
---model-dir modelscope model ID
+--model-dir modelscope model ID 或者 本地模型路径
 --quantize True为量化ASR模型,False为非量化ASR模型,默认是True
---vad-dir modelscope model ID
+--vad-dir modelscope model ID 或者 本地模型路径
 --vad-quant True为量化VAD模型,False为非量化VAD模型,默认是True
---punc-dir modelscope model ID
+--punc-dir modelscope model ID 或者 本地模型路径
 --punc-quant True为量化PUNC模型,False为非量化PUNC模型,默认是True
---itn-dir modelscope model ID
+--itn-dir modelscope model ID 或者 本地模型路径
 --port 服务端监听的端口号,默认为 10095
 --decoder-thread-num 服务端启动的推理线程数,默认为 8
 --io-thread-num 服务端启动的IO线程数,默认为 1
````
runtime/docs/SDK_advanced_guide_offline_zh.md

````diff
@@ -164,14 +164,14 @@
 **run_server.sh命令参数介绍**
 ```text
 --download-model-dir 模型下载地址,通过设置model ID从Modelscope下载模型
---model-dir modelscope model ID
+--model-dir modelscope model ID 或者 本地模型路径
 --quantize True为量化ASR模型,False为非量化ASR模型,默认是True
---vad-dir modelscope model ID
+--vad-dir modelscope model ID 或者 本地模型路径
 --vad-quant True为量化VAD模型,False为非量化VAD模型,默认是True
---punc-dir modelscope model ID
+--punc-dir modelscope model ID 或者 本地模型路径
 --punc-quant True为量化PUNC模型,False为非量化PUNC模型,默认是True
---lm-dir modelscope model ID
---itn-dir modelscope model ID
+--lm-dir modelscope model ID 或者 本地模型路径
+--itn-dir modelscope model ID 或者 本地模型路径
 --port 服务端监听的端口号,默认为 10095
 --decoder-thread-num 服务端启动的推理线程数,默认为 8
 --io-thread-num 服务端启动的IO线程数,默认为 1
````
runtime/docs/SDK_advanced_guide_online.md

````diff
@@ -100,14 +100,14 @@
 ### More details about the script run_server_2pass.sh:
 ```text
 --download-model-dir: Model download address, download models from Modelscope by setting the model ID.
---model-dir: Modelscope model ID.
+--model-dir: modelscope model ID or local model path.
 --online-model-dir modelscope model ID
 --quantize: True for quantized ASR model, False for non-quantized ASR model. Default is True.
---vad-dir: Modelscope model ID.
+--vad-dir: modelscope model ID or local model path.
 --vad-quant: True for quantized VAD model, False for non-quantized VAD model. Default is True.
---punc-dir: Modelscope model ID.
+--punc-dir: modelscope model ID or local model path.
 --punc-quant: True for quantized PUNC model, False for non-quantized PUNC model. Default is True.
---itn-dir modelscope model ID
+--itn-dir modelscope model ID or local model path.
 --port: Port number that the server listens on. Default is 10095.
 --decoder-thread-num: Number of inference threads that the server starts. Default is 8.
 --io-thread-num: Number of IO threads that the server starts. Default is 1.
````
runtime/docs/SDK_advanced_guide_online_zh.md

````diff
@@ -108,14 +108,14 @@
 **run_server_2pass.sh命令参数介绍**
 ```text
 --download-model-dir 模型下载地址,通过设置model ID从Modelscope下载模型
---model-dir modelscope model ID
---online-model-dir modelscope model ID
+--model-dir modelscope model ID 或者 本地模型路径
+--online-model-dir modelscope model ID 或者 本地模型路径
 --quantize True为量化ASR模型,False为非量化ASR模型,默认是True
---vad-dir modelscope model ID
+--vad-dir modelscope model ID 或者 本地模型路径
 --vad-quant True为量化VAD模型,False为非量化VAD模型,默认是True
---punc-dir modelscope model ID
+--punc-dir modelscope model ID 或者 本地模型路径
 --punc-quant True为量化PUNC模型,False为非量化PUNC模型,默认是True
---itn-dir modelscope model ID
+--itn-dir modelscope model ID 或者 本地模型路径
 --port 服务端监听的端口号,默认为 10095
 --decoder-thread-num 服务端启动的推理线程数,默认为 8
 --io-thread-num 服务端启动的IO线程数,默认为 1
````
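The 2pass server takes both an offline and a streaming (online) ASR model, and per the hunk above each `-dir` flag may likewise point at a local directory. A sketch of the launch command under that assumption (all paths below are hypothetical placeholders):

```shell
# Build the launch command for the 2pass server: offline model for final
# results, online model for streaming partials. Paths are placeholders.
cmd="bash run_server_2pass.sh \
  --model-dir /workspace/models/asr-offline-onnx \
  --online-model-dir /workspace/models/asr-online-onnx \
  --vad-dir /workspace/models/vad-onnx \
  --punc-dir /workspace/models/punc-onnx \
  --itn-dir /workspace/models/itn \
  --port 10095"
echo "$cmd"
```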
runtime/onnxruntime/readme.md

````diff
@@ -36,12 +36,12 @@
 ## Building for Windows
 ### Download onnxruntime
-https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl-shared.zip
-Download and unzip to d:\ffmpeg-master-latest-win64-gpl-shared
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/onnxruntime-win-x64-1.16.1.zip
+Download and unzip to d:\onnxruntime-win-x64-1.16.1
 ### Download ffmpeg
-https://github.com/microsoft/onnxruntime/releases/download/v1.16.1/onnxruntime-win-x64-1.16.1.zip
-Download and unzip to d:\onnxruntime-win-x64-1.16.1
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-master-latest-win64-gpl-shared.zip
+Download and unzip to d:\ffmpeg-master-latest-win64-gpl-shared
````
runtime/websocket/readme.md

````diff
@@ -6,14 +6,14 @@
 ## Building for Linux/Unix
 ### Download onnxruntime
 ```shell
-wget https://github.com/microsoft/onnxruntime/releases/download/v1.14.0/onnxruntime-linux-x64-1.14.0.tgz
+wget https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/onnxruntime-linux-x64-1.14.0.tgz
 tar -zxvf onnxruntime-linux-x64-1.14.0.tgz
 ```
 ### Download ffmpeg
 ```shell
-wget https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-N-111383-g20b8688092-linux64-gpl-shared.tar.xz
-tar -xvf ffmpeg-N-111383-g20b8688092-linux64-gpl-shared.tar.xz
+wget https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-master-latest-linux64-gpl-shared.tar.xz
+tar -xvf ffmpeg-master-latest-linux64-gpl-shared.tar.xz
 ```
 ### Install deps
@@ -31,6 +31,44 @@
 ```shell
 git clone https://github.com/alibaba-damo-academy/FunASR.git && cd FunASR/runtime/websocket
 mkdir build && cd build
-cmake -DCMAKE_BUILD_TYPE=release .. -DONNXRUNTIME_DIR=/path/to/onnxruntime-linux-x64-1.14.0 -DFFMPEG_DIR=/path/to/ffmpeg-N-111383-g20b8688092-linux64-gpl-shared
+cmake -DCMAKE_BUILD_TYPE=release .. -DONNXRUNTIME_DIR=/path/to/onnxruntime-linux-x64-1.14.0 -DFFMPEG_DIR=/path/to/ffmpeg-master-latest-linux64-gpl-shared
 make -j 4
 ```
+## Building for Windows
+### Download onnxruntime
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/onnxruntime-win-x64-1.16.1.zip
+Download to d:\onnxruntime-win-x64-1.16.1
+### Download ffmpeg
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-master-latest-win64-gpl-shared.zip
+Download to d:\ffmpeg-master-latest-win64-gpl-shared
+### Download openssl
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/openssl-1.1.1w.tar.gz
+Download to d:/src/openssl-1.1.1w
+Open x64 Native Tools Command Prompt and execute the following compilation steps
+```
+d:
+cd d:/src/openssl-1.1.1w
+perl Configure VC-WIN64A --prefix=d:/openssl-1.1.1w
+nmake
+nmake install
+```
+### Build runtime
+```
+git clone https://github.com/alibaba-damo-academy/FunASR.git
+cd FunASR/runtime/websocket
+mkdir build
+cd build
+cmake ../ -D OPENSSL_ROOT_DIR=d:/openssl-1.1.1w -D FFMPEG_DIR=d:/ffmpeg-master-latest-win64-gpl-shared -D ONNXRUNTIME_DIR=d:/onnxruntime-win-x64-1.16.1
+```
+Open FunASRWebscoket.sln in Visual Studio and complete the compilation.
````
runtime/websocket/readme_zh.md

````diff
@@ -6,14 +6,14 @@
 ## Linux/Unix 平台编译
 ### 下载 onnxruntime
 ```shell
-wget https://github.com/microsoft/onnxruntime/releases/download/v1.14.0/onnxruntime-linux-x64-1.14.0.tgz
+wget https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/onnxruntime-linux-x64-1.14.0.tgz
 tar -zxvf onnxruntime-linux-x64-1.14.0.tgz
 ```
 ### 下载 ffmpeg
 ```shell
-wget https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-N-111383-g20b8688092-linux64-gpl-shared.tar.xz
-tar -xvf ffmpeg-N-111383-g20b8688092-linux64-gpl-shared.tar.xz
+wget https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-master-latest-linux64-gpl-shared.tar.xz
+tar -xvf ffmpeg-master-latest-linux64-gpl-shared.tar.xz
 ```
 ### 安装依赖
@@ -32,24 +32,24 @@
 ```shell
 git clone https://github.com/alibaba-damo-academy/FunASR.git && cd FunASR/runtime/websocket
 mkdir build && cd build
-cmake -DCMAKE_BUILD_TYPE=release .. -DONNXRUNTIME_DIR=/path/to/onnxruntime-linux-x64-1.14.0 -DFFMPEG_DIR=/path/to/ffmpeg-N-111383-g20b8688092-linux64-gpl-shared
+cmake -DCMAKE_BUILD_TYPE=release .. -DONNXRUNTIME_DIR=/path/to/onnxruntime-linux-x64-1.14.0 -DFFMPEG_DIR=/path/to/ffmpeg-master-latest-linux64-gpl-shared
 make -j 4
 ```
 ## Windows 平台编译
 ### 下载 onnxruntime
-https://github.com/BtbN/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl-shared.zip
-下载并解压到 d:\ffmpeg-master-latest-win64-gpl-shared
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/onnxruntime-win-x64-1.16.1.zip
+下载并解压到 d:\onnxruntime-win-x64-1.16.1
 ### 下载 ffmpeg
-https://github.com/microsoft/onnxruntime/releases/download/v1.16.1/onnxruntime-win-x64-1.16.1.zip
-下载并解压到 d:\onnxruntime-win-x64-1.16.1
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/ffmpeg-master-latest-win64-gpl-shared.zip
+下载并解压到 d:\ffmpeg-master-latest-win64-gpl-shared
 ### 编译 openssl
-https://www.openssl.org/source/openssl-1.1.1w.tar.gz
+https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/dep_libs/openssl-1.1.1w.tar.gz
 下载解压到 d:/src/openssl-1.1.1w
````
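A common build failure with the cmake lines above is pointing `ONNXRUNTIME_DIR` or `FFMPEG_DIR` at a directory that was not fully unpacked. A small sketch of a sanity check, assuming only that both dependency archives unpack to a directory containing `include/` and `lib/` subdirectories:

```shell
# Return success only if the directory looks like an unpacked dependency
# (the onnxruntime and ffmpeg shared builds both ship include/ and lib/).
check_dep_dir() {
  [ -d "$1/include" ] && [ -d "$1/lib" ]
}

# Usage (path is a placeholder):
#   check_dep_dir /path/to/onnxruntime-linux-x64-1.14.0 || echo "incomplete dir"
```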