fast::TensorFlowEngine class

Base classes

class InferenceEngine

Constructors, destructors, conversion operators

~TensorFlowEngine() override
TensorFlowEngine()

Public functions

void load() override
void run() override
auto getName() const -> std::string override
auto getPreferredImageOrdering() const -> ImageOrdering override
auto getSupportedModelFormats() const -> std::vector<ModelFormat> virtual
auto getPreferredModelFormat() const -> ModelFormat virtual
auto getDeviceList() -> std::vector<InferenceDeviceInfo> virtual
void loadCustomPlugins(std::vector<std::string> filename) override

Protected variables

std::unique_ptr<tensorflow::Session> mSession
std::unique_ptr<tensorflow::SavedModelBundle> mSavedModelBundle
std::vector<std::string> mLearningPhaseTensors

Function documentation

std::vector<InferenceDeviceInfo> fast::TensorFlowEngine::getDeviceList() virtual

Returns    a vector with information about each available device

Get a list of the devices available to this inference engine.
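A minimal usage sketch for enumerating devices. The direct stack construction of TensorFlowEngine and the printed message are illustrative assumptions; the exact fields of InferenceDeviceInfo are not shown here, so consult its definition for the available members.

```cpp
#include <iostream>
#include <FAST/Algorithms/NeuralNetwork/InferenceEngines/TensorFlowEngine.hpp>

int main() {
    fast::TensorFlowEngine engine; // assumed: engine can be constructed directly

    // getDeviceList() returns a std::vector<InferenceDeviceInfo>,
    // one entry per device this engine can run inference on.
    auto devices = engine.getDeviceList();
    std::cout << "Found " << devices.size() << " device(s)" << std::endl;
}
```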

void fast::TensorFlowEngine::loadCustomPlugins(std::vector<std::string> filename) override

Parameters
filename    Paths of the custom operator (op) library files to load.

Load one or more custom operators (ops). This must be done BEFORE calling load(), which loads the model/graph.
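The call ordering above can be sketched as follows. The plugin path and the use of setFilename() to point at the model are illustrative assumptions; substitute whichever model-loading setup your FAST version uses.

```cpp
#include <FAST/Algorithms/NeuralNetwork/InferenceEngines/TensorFlowEngine.hpp>

int main() {
    fast::TensorFlowEngine engine; // assumed: engine can be constructed directly

    // Register custom ops first: loadCustomPlugins() must run BEFORE
    // load(), otherwise the graph's custom ops cannot be resolved.
    engine.loadCustomPlugins({"/path/to/custom_op_library.so"}); // hypothetical path

    // engine.setFilename("model.pb"); // assumed model-loading setup
    engine.load(); // now the model/graph can be loaded
}
```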