This link is the full introduction and FAQ for the NVIDIA DeepStream SDK.

1Q: DeepStream SDK 3.0 on Jetson AGX Xavier, running deepstream-app -c configs/deepstream-app/source30_720p_dec_infer-resnet_tiled_display_int8.txt fails with:
(gst-plugin-scanner:28556): GStreamer-WARNING **: 09:31:40.171: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvinfer.so': /usr/lib/aarch64-linux-gnu/tegra/libnvll_infer.so.1.0.0: undefined symbol: initLibNvInferPlugins

1A: Run objdump -txT /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvinfer.so | grep initLibNvInferPlugins. If this returns nothing, initLibNvInferPlugins is not defined in the library: TensorRT changed this API name between the 5.0.0 and 5.0.3 release versions, so the plugin and the installed TensorRT release must match.

2. Irrespective of Tesla or Tegra, deepstream-app should be run with a config from the directory /ds package path/samples/configs/deepstream-app/. Running configs other than source30*** or source4*** is not the right way; those other files are parts included by source4*** or source30***, so run with the source4*** or source30*** configs.

3Q: With MJPEG as the input source, the pipeline fails with "Conversion not supported. gst_nvvidconv_planar_to_planar_conversion, 773".

3A: We do not support MJPEG as input at this time.

4Q: Does trt-yolo-app support a video stream as input?

4A: No, video stream input is not supported yet; only images are accepted as input.

5Q: Customers commonly need to output to a screen but only have a Tesla card, which is used as a compute card. There are two ways to get through:
1. Output to sink type 1 (FakeSink) or type 3 (File).
2. A hacky way is to use the Tesla P4 as a virtual display, but this is suggested only for development, since it takes some percentage of device memory and ultimately impacts inference performance. This first requires installing the NVIDIA graphics driver with OpenGL installed.
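The two sink options from 5Q are selected via the type key of a [sinkN] group in the deepstream-app config file. The fragment below is a sketch based on the sample configs shipped with DeepStream; exact keys and accepted values may differ between releases:

```ini
# Option 1: FakeSink (type=1) -- discards output, no display needed.
[sink0]
enable=1
type=1
sync=0

# Option 2: File sink (type=3) -- encode and write to disk instead.
# container/codec/output-file values here are illustrative.
#[sink0]
#enable=1
#type=3
#container=1
#codec=1
#output-file=out.mp4
```

Either option avoids the EGL/X display path entirely, which is why they work on a headless Tesla compute card.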
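The objdump pipeline from answer 1A can be wrapped in a small helper for checking any shared object. The function name check_symbol is introduced here for illustration only and is not part of any NVIDIA tooling:

```shell
# check_symbol: report whether a shared object exports a given symbol,
# using the same objdump | grep check as in answer 1A.
check_symbol() {
    # $1 = path to shared object, $2 = symbol name
    if objdump -txT "$1" 2>/dev/null | grep -q "$2"; then
        echo "defined"
    else
        echo "undefined"
    fi
}

# Example from the FAQ (Jetson/aarch64 path; adjust to your install):
# check_symbol /usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvinfer.so initLibNvInferPlugins
```

"undefined" here means the plugin was built against a TensorRT release that renamed the API (5.0.0 vs 5.0.3), so the DeepStream plugin and the installed TensorRT version need to match.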