coder.TensorRTConfig
Parameters to configure deep learning code generation with the NVIDIA TensorRT library
Description
The coder.TensorRTConfig object contains parameters specific to TensorRT, the NVIDIA® high-performance deep learning inference optimizer and runtime library. codegen uses these parameters when generating
CUDA® code for deep neural networks.
To use a coder.TensorRTConfig object for code generation, create a code
configuration object by using the coder.gpuConfig
function, and set the DeepLearningConfig property of the object to the
coder.TensorRTConfig object.
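The steps above can be sketched as follows; this is a minimal configuration sketch, in which the 'mex' build type and C++ target language are illustrative choices rather than requirements:

```matlab
% Create a GPU code configuration object for a MEX target.
cfg = coder.gpuConfig('mex');
cfg.TargetLang = 'C++';

% Configure deep learning code generation to use the TensorRT library.
cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');
```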
Creation
Create a TensorRT configuration object by using the coder.DeepLearningConfig function with the target library set to
'tensorrt'.
Properties
Examples
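A minimal end-to-end sketch, assuming a hypothetical entry-point function myPredict that loads a network with coder.loadDeepLearningNetwork and accepts a 224-by-224-by-3 single-precision image; the function name, input size, and the 'fp16' precision setting are illustrative assumptions:

```matlab
cfg = coder.gpuConfig('mex');
cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');

% Optionally select the TensorRT inference precision.
% 'fp32' is the default; 'fp16' trades some accuracy for speed.
cfg.DeepLearningConfig.DataType = 'fp16';

% Generate CUDA MEX code for the hypothetical entry-point function
% myPredict, with a 224-by-224-by-3 single input as the example size.
codegen -config cfg myPredict -args {ones(224,224,3,'single')}
```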
Version History
Introduced in R2018b
See Also
Functions
codegen | imagePretrainedNetwork (Deep Learning Toolbox) | coder.DeepLearningConfig | coder.loadDeepLearningNetwork