TensorRT Tar File Installation

https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-515/tensorrt-install-guide/index.html#installing-tar

 

 

4.3. Tar File Installation

  Note: Before issuing the following commands, you'll need to replace 5.1.x.x with your specific TensorRT version. The following commands are examples.
  1. Install the following dependencies, if not already present:
    • CUDA Toolkit 9.0, 10.0, or 10.1
    • cuDNN 7.5.0
    • Python 2 or Python 3 (Optional)
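    A quick way to confirm these dependencies before continuing (a sketch only; the cuDNN header path below is an assumption and depends on how cuDNN was installed):
    $ nvcc --version
    $ grep -A 2 "define CUDNN_MAJOR" /usr/local/cuda/include/cudnn.h
    $ python3 --version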
  2. Download the TensorRT tar file that matches the Linux distribution you are using.
  3. Choose where you want to install TensorRT. This tar file will install everything into a subdirectory called TensorRT-5.1.x.x.
  4. Unpack the tar file.
    $ tar xzvf TensorRT-5.1.x.x.<os>.<arch>-gnu.cuda-x.x.cudnn7.x.tar.gz
    Where:
    • 5.1.x.x is your TensorRT version
    • <os> is Ubuntu-14.04.5, Ubuntu-16.04.5, Ubuntu-18.04.2, Red-Hat, or CentOS-Linux
    • <arch> is x86_64 or ppc64le
    • cuda-x.x is CUDA version 9.0, 10.0, or 10.1
    • cudnn7.x is cuDNN version 7.5
    This directory will have sub-directories like lib, include, data, etc.
    $ ls TensorRT-5.1.x.x
    bin  data  doc  graphsurgeon  include  lib  python  samples  targets  TensorRT-Release-Notes.pdf  uff
    
  5. Add the absolute path to the TensorRT lib directory to the environment variable LD_LIBRARY_PATH:
    $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<eg:TensorRT-5.1.x.x/lib>
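    To make this setting persist across shell sessions, you can append the same export to your shell startup file (a common convention, not a TensorRT requirement; replace the path with the directory where you unpacked the tar file):
    $ echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/TensorRT-5.1.x.x/lib' >> ~/.bashrc
    $ source ~/.bashrc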
  6. Install the Python TensorRT wheel file.
    $ cd TensorRT-5.1.x.x/python

     

    If using Python 2.7:
    $ sudo pip2 install tensorrt-5.1.x.x-cp27-none-linux_x86_64.whl

     

    If using Python 3.x:
    $ sudo pip3 install tensorrt-5.1.x.x-cp3x-none-linux_x86_64.whl
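    A quick import check confirms that the wheel was installed for the interpreter you intend to use (this assumes the tensorrt module exposes __version__; use python2 if you installed the Python 2.7 wheel):
    $ python3 -c "import tensorrt; print(tensorrt.__version__)"
    5.1.x.x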
  7. Install the Python UFF wheel file. This is only required if you plan to use TensorRT with TensorFlow.
    $ cd TensorRT-5.1.x.x/uff

     

    If using Python 2.7:
    $ sudo pip2 install uff-0.6.3-py2.py3-none-any.whl

     

    If using Python 3.x:
    $ sudo pip3 install uff-0.6.3-py2.py3-none-any.whl

     

    In either case:
    $ which convert-to-uff
    /usr/local/bin/convert-to-uff
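    You can also sanity-check the uff module from Python and list the converter's options (this assumes the uff package exposes __version__):
    $ python3 -c "import uff; print(uff.__version__)"
    0.6.3
    $ convert-to-uff --help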
    
  8. Install the Python graphsurgeon wheel file.
    $ cd TensorRT-5.1.x.x/graphsurgeon

     

    If using Python 2.7:
    $ sudo pip2 install graphsurgeon-0.4.1-py2.py3-none-any.whl
    

     

    If using Python 3.x:
    $ sudo pip3 install graphsurgeon-0.4.1-py2.py3-none-any.whl
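    As with the other wheels, you can confirm that the package is registered for your interpreter (pip show is used here because importing graphsurgeon may require TensorFlow to be installed; use pip2 show for Python 2.7):
    $ pip3 show graphsurgeon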
    
  9. Verify the installation:
    1. Ensure that the installed files are located in the correct directories. For example, run the tree -d command to check whether all supported installed files are in place in the lib, include, data, etc. directories.
    2. Build and run one of the shipped samples, for example, sampleMNIST in the installed directory. You should be able to compile and execute the sample without additional settings (see the example build commands after this list). For more information about sampleMNIST, see the "Hello World" For TensorRT sample.
    3. The Python samples are in the samples/python directory.
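    For step 2 above, a typical build-and-run sequence with the shipped per-sample Makefiles looks like the following (a sketch only; the sample_mnist binary name and the bin output directory follow the usual sample layout and may differ slightly between releases):
    $ cd TensorRT-5.1.x.x/samples/sampleMNIST
    $ make
    $ cd ../..
    $ ./bin/sample_mnist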