load()

audonnx.load(root, *, model_file='model.yaml', labels_file='labels.yaml', transform_file='transform.yaml', device='cpu', num_workers=1, session_options=None, auto_install=False)[source]

Load model from folder.

Tries to load the model from a YAML file.

Otherwise, it creates the object from an ONNX file (legacy mode). In that case, labels and the transform are loaded from the corresponding YAML files, if provided. The ONNX model is expected to be located at audeer.replace_file_extension(model_file, 'onnx').
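
In legacy mode, a call could look like this (a minimal sketch; the file names model.onnx, labels.yaml, and transform.yaml are assumptions about the folder contents):

>>> model = load(
...     'tests',
...     model_file='model.onnx',
...     labels_file='labels.yaml',
...     transform_file='transform.yaml',
... )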

Parameters
  • root (str) – root folder

  • model_file (str) – model YAML file, which needs to end with .yaml. In legacy mode, path to the model ONNX file

  • labels_file (str) – YAML file with labels

  • transform_file (str) – YAML file with transformation

  • device (Union[str, Tuple[str, Dict], Sequence[Union[str, Tuple[str, Dict]]]]) – set device ('cpu', 'cuda', or 'cuda:<id>') or a (list of) provider(s)

  • num_workers (Optional[int]) – number of threads for running onnxruntime inference on CPU. If None and session_options is None, onnxruntime chooses the number of threads

  • session_options (Optional[SessionOptions]) – onnxruntime.SessionOptions to use for inference. If None, the default options are used and the number of threads for running inference on CPU is determined by num_workers. Otherwise, the provided options are used; their inter_op_num_threads and intra_op_num_threads properties determine the number of CPU inference threads and num_workers is ignored (see the sketch after this list)

  • auto_install (bool) – install missing packages needed to create the object
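
A minimal sketch of passing custom session options together with an explicit device (onnxruntime.SessionOptions and its thread properties come from onnxruntime; the folder 'tests' is taken from the example below):

>>> import onnxruntime
>>> opts = onnxruntime.SessionOptions()
>>> opts.inter_op_num_threads = 1
>>> opts.intra_op_num_threads = 4
>>> model = load('tests', device='cpu', session_options=opts)
>>> # On a CUDA machine one could instead pass, e.g.,
>>> # device=('CUDAExecutionProvider', {'device_id': 0})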

Return type

Model

Returns

model

Examples

>>> model = load('tests')
>>> model
Input:
  feature:
    shape: [18, -1]
    dtype: tensor(float)
    transform: opensmile.core.smile.Smile
Output:
  gender:
    shape: [2]
    dtype: tensor(float)
    labels: [female, male]
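
The loaded model can then be applied to raw audio. A hedged sketch, assuming the returned Model object is callable with a signal array and its sampling rate (the silent signal is only for illustration):

>>> import numpy as np
>>> sampling_rate = 16000
>>> signal = np.zeros((sampling_rate,), dtype=np.float32)  # one second of silence
>>> prediction = model(signal, sampling_rate)  # prediction for the 'gender' output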