No module named 'graphsurgeon'
Note: Before issuing the following commands, you'll need to replace 5.1.x.x with your specific TensorRT version. The following commands are examples.
- Install the following dependencies, if not already present:
- Install the CUDA Toolkit 9.0, 10.0 or 10.1
- cuDNN 7.5.0
- Python 2 or Python 3 (Optional)
- Download the TensorRT tar file that matches the Linux distribution you are using.
- Choose where you want to install TensorRT. This tar file will install everything into a subdirectory called TensorRT-5.1.x.x.
- Unpack the tar file.
$ tar xzvf TensorRT-5.1.x.x.Ubuntu-1x.04.x.x86_64-gnu.cuda-x.x.cudnn7.x.tar.gz
Where:
- 5.1.x.x is your TensorRT version
- Ubuntu-1x.04.x is 14.04.5, 16.04.4 or 18.04.1
- cuda-x.x is CUDA version 9.0, 10.0, or 10.1
- cudnn7.x is cuDNN version 7.5
$ ls TensorRT-5.1.x.x
bin  data  doc  graphsurgeon  include  lib  python  samples  targets  TensorRT-Release-Notes.pdf  uff
- Add the absolute path to the TensorRT lib directory to the environment variable LD_LIBRARY_PATH:
$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<eg:TensorRT-5.1.x.x/lib>
- Install the Python TensorRT wheel file.
$ cd TensorRT-5.1.x.x/python
$ sudo pip2 install tensorrt-5.1.x.x-cp27-none-linux_x86_64.whl
$ sudo pip3 install tensorrt-5.1.x.x-cp3x-none-linux_x86_64.whl
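To confirm the wheel installed correctly, the module should import cleanly. A minimal check (assuming the package exposes a __version__ attribute, as the TensorRT Python bindings do):
import tensorrt
print(tensorrt.__version__)  # should print the installed TensorRT version, e.g. 5.1.x.x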
- Install the Python UFF wheel file. This is only required if you plan to use TensorRT with TensorFlow.
$ cd TensorRT-5.1.x.x/uff
$ sudo pip2 install uff-0.6.3-py2.py3-none-any.whl
$ sudo pip3 install uff-0.6.3-py2.py3-none-any.whl
$ which convert-to-uff
/usr/local/bin/convert-to-uff
- Install the Python graphsurgeon wheel file.
$ cd TensorRT-5.1.x.x/graphsurgeon
$ sudo pip2 install graphsurgeon-0.4.1-py2.py3-none-any.whl
$ sudo pip3 install graphsurgeon-0.4.1-py2.py3-none-any.whl
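As with the TensorRT wheel, a quick import check confirms that both converter packages are visible to your interpreter; this is a minimal sketch, not an exhaustive test:
import uff           # TensorFlow-to-UFF converter
import graphsurgeon  # graph-editing helper used by the UFF converter
print("uff and graphsurgeon imported successfully")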
- Verify the installation:
- Ensure that the installed files are located in the correct directories. For example, run the tree -d command to check that the installed files are in place in the lib, include, data, and other directories.
- Build and run one of the shipped samples, for example, sampleMNIST in the installed directory. You should be able to compile and execute the sample without additional settings. For more information about sampleMNIST, see the TensorRT Sample Support Guide.
- The Python samples are in the samples/python directory.
- Be sure to install the Python packages; otherwise, when you use the Python API to convert a TensorFlow model, you will get an error saying the graphsurgeon library cannot be found.
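The missing-module error typically surfaces during TensorFlow-to-UFF conversion, because the uff converter imports graphsurgeon internally. The following is a minimal sketch of such a conversion; the file name frozen_model.pb and the output node name logits are hypothetical, and the exact keyword arguments may vary slightly between UFF converter versions:
import uff

# Hypothetical paths and node names, for illustration only.
uff_model = uff.from_tensorflow_frozen_model(
    "frozen_model.pb",           # frozen TensorFlow graph
    output_nodes=["logits"],     # name(s) of the graph's output node(s)
    output_filename="model.uff"  # serialized UFF model written to disk
)
If graphsurgeon is not installed, the import inside the uff package fails and you see the "No module named 'graphsurgeon'" error above.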