load()
- audonnx.load(root, *, model_file='model.yaml', labels_file='labels.yaml', transform_file='transform.yaml', device='cpu', num_workers=1, session_options=None, auto_install=False)
Load model from folder.
Tries to load the model from a YAML file. Otherwise, it creates the object from an ONNX file (legacy mode). In this case it loads the labels and the transform from the corresponding YAML files, if provided, and the model is expected to be located at audeer.replace_file_extension(model_file, 'onnx').
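For the legacy fallback, the expected ONNX path is derived from model_file by replacing its extension. A minimal sketch of that derivation, assuming the default model_file='model.yaml':

>>> import audeer
>>> audeer.replace_file_extension('model.yaml', 'onnx')
'model.onnx'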
- Parameters
  - root (str) – root folder
  - model_file (str) – model YAML file, which needs to end with .yaml. In legacy mode, path to the model ONNX file
  - labels_file (str) – YAML file with labels
  - transform_file (str) – YAML file with transformation
  - device (Union[str, Tuple[str, Dict], Sequence[Union[str, Tuple[str, Dict]]]]) – set device ('cpu', 'cuda', or 'cuda:<id>') or a (list of) provider(s)
  - num_workers (Optional[int]) – number of threads for running onnxruntime inference on CPU. If None and session_options is None, onnxruntime chooses the number of threads
  - session_options (Optional[SessionOptions]) – onnxruntime.SessionOptions to use for inference. If None, the default options are used and the number of threads for running inference on CPU is determined by num_workers. Otherwise, the provided options are used and their inter_op_num_threads and intra_op_num_threads properties determine the number of threads for inference on CPU, while num_workers is ignored (see the sketch below)
  - auto_install (bool) – install missing packages needed to create the object
- Return type
  Model
- Returns
  model
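The interplay of num_workers and session_options can be shown with a short sketch. This is a hypothetical example, not taken from the package documentation: 'model-root' is a placeholder folder and the snippet assumes onnxruntime is installed. A pre-configured onnxruntime.SessionOptions is passed, so its thread settings take precedence and num_workers is ignored:

>>> import onnxruntime
>>> opts = onnxruntime.SessionOptions()
>>> opts.intra_op_num_threads = 2  # threads used within a single operator
>>> opts.inter_op_num_threads = 1  # threads used across independent operators
>>> model = load('model-root', session_options=opts)  # num_workers is ignored here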
Examples
>>> model = load('tests')
>>> model
Input:
  feature:
    shape: [18, -1]
    dtype: tensor(float)
    transform: opensmile.core.smile.Smile
Output:
  gender:
    shape: [2]
    dtype: tensor(float)
    labels: [female, male]
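The device argument also accepts onnxruntime provider specifications. A hedged sketch, assuming a CUDA-enabled onnxruntime build and the placeholder folder 'model-root', selects GPU 0 through an explicit provider tuple:

>>> model = load(
...     'model-root',
...     device=('CUDAExecutionProvider', {'device_id': 0}),
... )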