class ONNXRuntimeEngine
Microsoft's ONNX Runtime inference engine with DirectX/ML support.
This inference engine is Windows-only.
Base classes
- class InferenceEngine
Constructors, destructors, conversion operators
- ~ONNXRuntimeEngine() override
- ONNXRuntimeEngine()
Public functions
- void run() override
- void load() override
- auto getPreferredImageOrdering() const -> ImageOrdering override
- auto getName() const -> std::string override
- auto getSupportedModelFormats() const -> std::vector<ModelFormat> virtual
- auto getPreferredModelFormat() const -> ModelFormat virtual
- void setMaxBatchSize(int maxBatchSize) virtual
- auto getMaxBatchSize() const -> int
- auto getDeviceList() -> std::vector<InferenceDeviceInfo> override
Function documentation
std::vector<InferenceDeviceInfo> fast::ONNXRuntimeEngine::getDeviceList() override

Returns | a vector with info on each device
---|---

Get a list of devices available for this inference engine.