xuning / sherpaonnx
Commit ef8d112aaaa0509bba31f6bbb8eaf8e805c21613 (ef8d112a)
1 parent 03ff9db5

Fix whisper test script for the latest onnxruntime. (#494)

Authored by Fangjun Kuang on 2023-12-20 11:12:12 +0800
Committed by GitHub on 2023-12-20 11:12:12 +0800
Showing 3 changed files with 18 additions and 3 deletions.
.github/workflows/export-whisper-to-onnx.yaml
build-apk-two-pass.sh
scripts/whisper/test.py
.github/workflows/export-whisper-to-onnx.yaml
View file @ ef8d112

@@ -31,7 +31,7 @@ jobs:
       - name: Install dependencies
        shell: bash
        run: |
-          python3 -m pip install torch==1.13.0 -f https://download.pytorch.org/whl/cpu/torch_stable.html
+          python3 -m pip install torch==1.13.0 torchaudio==0.13.0 -f https://download.pytorch.org/whl/cpu/torch_stable.html
          python3 -m pip install openai-whisper==20230314 onnxruntime onnx

      - name: export ${{ matrix.model }}
@@ -108,6 +108,19 @@ jobs:
          repo_token: ${{ secrets.UPLOAD_GH_SHERPA_ONNX_TOKEN }}
          tag: asr-models

+      - name: Test ${{ matrix.model }}
+        shell: bash
+        run: |
+          python3 -m pip install kaldi-native-fbank
+          git checkout .
+          model=${{ matrix.model }}
+          src=sherpa-onnx-whisper-$model
+          python3 scripts/whisper/test.py \
+            --encoder $src/$model-encoder.int8.onnx \
+            --decoder $src/$model-decoder.int8.onnx \
+            --tokens $src/$model-tokens.txt \
+            $src/test_wavs/0.wav
+
      - name: Publish ${{ matrix.model }} to huggingface
        shell: bash
        env:
...
build-apk-two-pass.sh
View file @ ef8d112

@@ -74,11 +74,11 @@ git lfs pull --include "*.onnx"
 # remove .git to save spaces
 rm -rf .git
-rm README.md
+rm -fv README.md
 rm -rf test_wavs
 rm .gitattributes
-rm *.ort
+rm -fv *.ort
 rm tiny.en-encoder.onnx
 rm tiny.en-decoder.onnx
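The `-fv` flags in the change above matter when the script runs under `set -e` on a checkout where some targets are already gone: without `-f`, `rm` exits non-zero on a missing file, and an unmatched glob like `*.ort` is passed through literally, producing a "No such file or directory" failure. A minimal sketch (in a hypothetical temp directory, not the script's actual working tree):

```shell
set -e
tmpdir=$(mktemp -d)
cd "$tmpdir"

# Plain `rm *.ort` would abort here: the glob matches nothing, so rm
# receives the literal name "*.ort" and fails. With -f it exits 0,
# and -v logs anything that actually was removed.
rm -fv *.ort README.md

echo "cleanup survived missing files"
rm -rf "$tmpdir"
```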
scripts/whisper/test.py
View file @ ef8d112

@@ -82,6 +82,7 @@ class OnnxModel:
        self.encoder = ort.InferenceSession(
            encoder,
            sess_options=self.session_opts,
+            providers=["CPUExecutionProvider"],
        )

        meta = self.encoder.get_modelmeta().custom_metadata_map

@@ -113,6 +114,7 @@ class OnnxModel:
        self.decoder = ort.InferenceSession(
            decoder,
            sess_options=self.session_opts,
+            providers=["CPUExecutionProvider"],
        )

    def run_encoder(