YOLO-NAS-Pose-JetPack5 / yolo_nas_pose_l_fp32.onnx.fp16.engine.log
Commit: Upload TensorRT & ONNX Model files (127e10d)
&&&& RUNNING TensorRT.trtexec [TensorRT v8502] # /usr/src/tensorrt/bin/trtexec --onnx=yolo_nas_pose_l_fp32.onnx --fp16 --avgRuns=100 --duration=15 --saveEngine=yolo_nas_pose_l_fp32.onnx.fp16.engine
[12/28/2023-11:52:55] [I] === Model Options ===
[12/28/2023-11:52:55] [I] Format: ONNX
[12/28/2023-11:52:55] [I] Model: yolo_nas_pose_l_fp32.onnx
[12/28/2023-11:52:55] [I] Output:
[12/28/2023-11:52:55] [I] === Build Options ===
[12/28/2023-11:52:55] [I] Max batch: explicit batch
[12/28/2023-11:52:55] [I] Memory Pools: workspace: default, dlaSRAM: default, dlaLocalDRAM: default, dlaGlobalDRAM: default
[12/28/2023-11:52:55] [I] minTiming: 1
[12/28/2023-11:52:55] [I] avgTiming: 8
[12/28/2023-11:52:55] [I] Precision: FP32+FP16
[12/28/2023-11:52:55] [I] LayerPrecisions:
[12/28/2023-11:52:55] [I] Calibration:
[12/28/2023-11:52:55] [I] Refit: Disabled
[12/28/2023-11:52:55] [I] Sparsity: Disabled
[12/28/2023-11:52:55] [I] Safe mode: Disabled
[12/28/2023-11:52:55] [I] DirectIO mode: Disabled
[12/28/2023-11:52:55] [I] Restricted mode: Disabled
[12/28/2023-11:52:55] [I] Build only: Disabled
[12/28/2023-11:52:55] [I] Save engine: yolo_nas_pose_l_fp32.onnx.fp16.engine
[12/28/2023-11:52:55] [I] Load engine:
[12/28/2023-11:52:55] [I] Profiling verbosity: 0
[12/28/2023-11:52:55] [I] Tactic sources: Using default tactic sources
[12/28/2023-11:52:55] [I] timingCacheMode: local
[12/28/2023-11:52:55] [I] timingCacheFile:
[12/28/2023-11:52:55] [I] Heuristic: Disabled
[12/28/2023-11:52:55] [I] Preview Features: Use default preview flags.
[12/28/2023-11:52:55] [I] Input(s)s format: fp32:CHW
[12/28/2023-11:52:55] [I] Output(s)s format: fp32:CHW
[12/28/2023-11:52:55] [I] Input build shapes: model
[12/28/2023-11:52:55] [I] Input calibration shapes: model
[12/28/2023-11:52:55] [I] === System Options ===
[12/28/2023-11:52:55] [I] Device: 0
[12/28/2023-11:52:55] [I] DLACore:
[12/28/2023-11:52:55] [I] Plugins:
[12/28/2023-11:52:55] [I] === Inference Options ===
[12/28/2023-11:52:55] [I] Batch: Explicit
[12/28/2023-11:52:55] [I] Input inference shapes: model
[12/28/2023-11:52:55] [I] Iterations: 10
[12/28/2023-11:52:55] [I] Duration: 15s (+ 200ms warm up)
[12/28/2023-11:52:55] [I] Sleep time: 0ms
[12/28/2023-11:52:55] [I] Idle time: 0ms
[12/28/2023-11:52:55] [I] Streams: 1
[12/28/2023-11:52:55] [I] ExposeDMA: Disabled
[12/28/2023-11:52:55] [I] Data transfers: Enabled
[12/28/2023-11:52:55] [I] Spin-wait: Disabled
[12/28/2023-11:52:55] [I] Multithreading: Disabled
[12/28/2023-11:52:55] [I] CUDA Graph: Disabled
[12/28/2023-11:52:55] [I] Separate profiling: Disabled
[12/28/2023-11:52:55] [I] Time Deserialize: Disabled
[12/28/2023-11:52:55] [I] Time Refit: Disabled
[12/28/2023-11:52:55] [I] NVTX verbosity: 0
[12/28/2023-11:52:55] [I] Persistent Cache Ratio: 0
[12/28/2023-11:52:55] [I] Inputs:
[12/28/2023-11:52:55] [I] === Reporting Options ===
[12/28/2023-11:52:55] [I] Verbose: Disabled
[12/28/2023-11:52:55] [I] Averages: 100 inferences
[12/28/2023-11:52:55] [I] Percentiles: 90,95,99
[12/28/2023-11:52:55] [I] Dump refittable layers:Disabled
[12/28/2023-11:52:55] [I] Dump output: Disabled
[12/28/2023-11:52:55] [I] Profile: Disabled
[12/28/2023-11:52:55] [I] Export timing to JSON file:
[12/28/2023-11:52:55] [I] Export output to JSON file:
[12/28/2023-11:52:55] [I] Export profile to JSON file:
[12/28/2023-11:52:55] [I]
[12/28/2023-11:52:55] [I] === Device Information ===
[12/28/2023-11:52:55] [I] Selected Device: Orin
[12/28/2023-11:52:55] [I] Compute Capability: 8.7
[12/28/2023-11:52:55] [I] SMs: 8
[12/28/2023-11:52:55] [I] Compute Clock Rate: 0.624 GHz
[12/28/2023-11:52:55] [I] Device Global Memory: 7471 MiB
[12/28/2023-11:52:55] [I] Shared Memory per SM: 164 KiB
[12/28/2023-11:52:55] [I] Memory Bus Width: 128 bits (ECC disabled)
[12/28/2023-11:52:55] [I] Memory Clock Rate: 0.624 GHz
[12/28/2023-11:52:55] [I]
[12/28/2023-11:52:55] [I] TensorRT version: 8.5.2
[12/28/2023-11:52:56] [I] [TRT] [MemUsageChange] Init CUDA: CPU +220, GPU +0, now: CPU 249, GPU 3001 (MiB)
[12/28/2023-11:52:59] [I] [TRT] [MemUsageChange] Init builder kernel library: CPU +302, GPU +283, now: CPU 574, GPU 3306 (MiB)
[12/28/2023-11:52:59] [I] Start parsing network model
[12/28/2023-11:53:02] [I] [TRT] ----------------------------------------------------------------
[12/28/2023-11:53:02] [I] [TRT] Input filename: yolo_nas_pose_l_fp32.onnx
[12/28/2023-11:53:02] [I] [TRT] ONNX IR version: 0.0.8
[12/28/2023-11:53:02] [I] [TRT] Opset version: 17
[12/28/2023-11:53:02] [I] [TRT] Producer name: pytorch
[12/28/2023-11:53:02] [I] [TRT] Producer version: 2.1.2
[12/28/2023-11:53:02] [I] [TRT] Domain:
[12/28/2023-11:53:02] [I] [TRT] Model version: 0
[12/28/2023-11:53:02] [I] [TRT] Doc string:
[12/28/2023-11:53:02] [I] [TRT] ----------------------------------------------------------------
[12/28/2023-11:53:03] [I] Finish parsing network model
[12/28/2023-11:53:04] [I] [TRT] ---------- Layers Running on DLA ----------
[12/28/2023-11:53:04] [I] [TRT] ---------- Layers Running on GPU ----------
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] TRAIN_STATION: [trainStation1]
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] MYELIN: {ForeignNode[/pre_process/pre_process.0/Cast.../pre_process/pre_process.2/Mul]}
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONSTANT: (Unnamed Layer* 455) [Constant]
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONSTANT: (Unnamed Layer* 456) [Constant]
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONSTANT: (Unnamed Layer* 457) [Constant]
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stem/conv/rbr_reparam/Conv + /model/backbone/stem/conv/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/downsample/rbr_reparam/Conv + /model/backbone/stage1/downsample/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/conv2/conv/Conv + /model/backbone/stage1/blocks/conv2/act/Relu || /model/backbone/stage1/blocks/conv1/conv/Conv + /model/backbone/stage1/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage1.blocks.bottlenecks.0.alpha + (Unnamed Layer* 15) [Shuffle] + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage1.blocks.bottlenecks.1.alpha + (Unnamed Layer* 23) [Shuffle] + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage1/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage1/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/conv3/conv/Conv + /model/backbone/stage1/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/reduce_skip2/conv/Conv + /model/neck/neck2/reduce_skip2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/downsample/rbr_reparam/Conv + /model/backbone/stage2/downsample/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/downsample/conv/Conv + /model/neck/neck2/downsample/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/conv2/conv/Conv + /model/backbone/stage2/blocks/conv2/act/Relu || /model/backbone/stage2/blocks/conv1/conv/Conv + /model/backbone/stage2/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage2.blocks.bottlenecks.0.alpha + (Unnamed Layer* 44) [Shuffle] + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage2.blocks.bottlenecks.1.alpha + (Unnamed Layer* 52) [Shuffle] + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage2.blocks.bottlenecks.2.alpha + (Unnamed Layer* 60) [Shuffle] + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/Mul, /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/conv3/conv/Conv + /model/backbone/stage2/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/reduce_skip2/conv/Conv + /model/neck/neck1/reduce_skip2/act/Relu || /model/neck/neck2/reduce_skip1/conv/Conv + /model/neck/neck2/reduce_skip1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/downsample/rbr_reparam/Conv + /model/backbone/stage3/downsample/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/downsample/conv/Conv + /model/neck/neck1/downsample/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/conv2/conv/Conv + /model/backbone/stage3/blocks/conv2/act/Relu || /model/backbone/stage3/blocks/conv1/conv/Conv + /model/backbone/stage3/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.0.alpha + (Unnamed Layer* 83) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.1.alpha + (Unnamed Layer* 91) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.2.alpha + (Unnamed Layer* 99) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.3.alpha + (Unnamed Layer* 107) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.4.alpha + (Unnamed Layer* 115) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/conv3/conv/Conv + /model/backbone/stage3/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/reduce_skip1/conv/Conv + /model/neck/neck1/reduce_skip1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/downsample/rbr_reparam/Conv + /model/backbone/stage4/downsample/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/conv2/conv/Conv + /model/backbone/stage4/blocks/conv2/act/Relu || /model/backbone/stage4/blocks/conv1/conv/Conv + /model/backbone/stage4/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage4.blocks.bottlenecks.0.alpha + (Unnamed Layer* 134) [Shuffle] + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage4.blocks.bottlenecks.1.alpha + (Unnamed Layer* 142) [Shuffle] + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage4/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage4/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/conv3/conv/Conv + /model/backbone/stage4/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/context_module/cv1/conv/Conv + /model/backbone/context_module/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POOLING: /model/backbone/context_module/m.2/MaxPool
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POOLING: /model/backbone/context_module/m.1/MaxPool
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POOLING: /model/backbone/context_module/m.0/MaxPool
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/backbone/context_module/cv1/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/context_module/cv2/conv/Conv + /model/backbone/context_module/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/conv/conv/Conv + /model/neck/neck1/conv/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] DECONVOLUTION: /model/neck/neck1/upsample/ConvTranspose
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/reduce_after_concat/conv/Conv + /model/neck/neck1/reduce_after_concat/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/conv2/conv/Conv + /model/neck/neck1/blocks/conv2/act/Relu || /model/neck/neck1/blocks/conv1/conv/Conv + /model/neck/neck1/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.0.alpha + (Unnamed Layer* 171) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.1.alpha + (Unnamed Layer* 179) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.2.alpha + (Unnamed Layer* 187) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.3.alpha + (Unnamed Layer* 195) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/neck/neck1/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/conv3/conv/Conv + /model/neck/neck1/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/conv/conv/Conv + /model/neck/neck2/conv/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] DECONVOLUTION: /model/neck/neck2/upsample/ConvTranspose
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/neck/neck2/reduce_skip1/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/reduce_after_concat/conv/Conv + /model/neck/neck2/reduce_after_concat/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/conv2/conv/Conv + /model/neck/neck2/blocks/conv2/act/Relu || /model/neck/neck2/blocks/conv1/conv/Conv + /model/neck/neck2/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.0.alpha + (Unnamed Layer* 216) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.1.alpha + (Unnamed Layer* 224) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.2.alpha + (Unnamed Layer* 232) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv1/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv2/nonlinearity/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.3.alpha + (Unnamed Layer* 240) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/neck/neck2/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/conv3/conv/Conv + /model/neck/neck2/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/bbox_stem/seq/conv/Conv + /model/heads/head1/bbox_stem/seq/act/Relu || /model/heads/head1/pose_stem/seq/conv/Conv + /model/heads/head1/pose_stem/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/conv/conv/Conv + /model/neck/neck3/conv/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/reg_convs/reg_convs.0/seq/conv/Conv + /model/heads/head1/reg_convs/reg_convs.0/seq/act/Relu || /model/heads/head1/cls_convs/cls_convs.0/seq/conv/Conv + /model/heads/head1/cls_convs/cls_convs.0/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/pose_convs/pose_convs.0/seq/conv/Conv + /model/heads/head1/pose_convs/pose_convs.0/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/conv2/conv/Conv + /model/neck/neck3/blocks/conv2/act/Relu || /model/neck/neck3/blocks/conv1/conv/Conv + /model/neck/neck3/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/cls_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/reg_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/pose_convs/pose_convs.1/seq/conv/Conv + /model/heads/head1/pose_convs/pose_convs.1/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] SHUFFLE: /model/heads/Reshape + /model/heads/Transpose
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/pose_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] SOFTMAX: /model/heads/Softmax
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.0.alpha + (Unnamed Layer* 271) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.1.alpha + (Unnamed Layer* 294) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.2.alpha + (Unnamed Layer* 302) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.3.alpha + (Unnamed Layer* 310) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/neck/neck3/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/conv3/conv/Conv + /model/neck/neck3/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_stem/seq/conv/Conv + /model/heads/head2/pose_stem/seq/act/Relu || /model/heads/head2/bbox_stem/seq/conv/Conv + /model/heads/head2/bbox_stem/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/conv/conv/Conv + /model/neck/neck4/conv/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/reg_convs/reg_convs.0/seq/conv/Conv + /model/heads/head2/reg_convs/reg_convs.0/seq/act/Relu || /model/heads/head2/cls_convs/cls_convs.0/seq/conv/Conv + /model/heads/head2/cls_convs/cls_convs.0/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_convs/pose_convs.0/seq/conv/Conv + /model/heads/head2/pose_convs/pose_convs.0/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/conv2/conv/Conv + /model/neck/neck4/blocks/conv2/act/Relu || /model/neck/neck4/blocks/conv1/conv/Conv + /model/neck/neck4/blocks/conv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/cls_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/reg_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_convs/pose_convs.1/seq/conv/Conv + /model/heads/head2/pose_convs/pose_convs.1/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] SHUFFLE: /model/heads/Reshape_4 + /model/heads/Transpose_3
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] SOFTMAX: /model/heads/Softmax_1
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.0.alpha + (Unnamed Layer* 341) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/Conv_1
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.1.alpha + (Unnamed Layer* 364) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.2.alpha + (Unnamed Layer* 372) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv1/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv2/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.3.alpha + (Unnamed Layer* 380) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] COPY: /model/neck/neck4/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/conv3/conv/Conv + /model/neck/neck4/blocks/conv3/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/bbox_stem/seq/conv/Conv + /model/heads/head3/bbox_stem/seq/act/Relu || /model/heads/head3/pose_stem/seq/conv/Conv + /model/heads/head3/pose_stem/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/reg_convs/reg_convs.0/seq/conv/Conv + /model/heads/head3/reg_convs/reg_convs.0/seq/act/Relu || /model/heads/head3/cls_convs/cls_convs.0/seq/conv/Conv + /model/heads/head3/cls_convs/cls_convs.0/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_convs/pose_convs.0/seq/conv/Conv + /model/heads/head3/pose_convs/pose_convs.0/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/cls_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/reg_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_convs/pose_convs.1/seq/conv/Conv + /model/heads/head3/pose_convs/pose_convs.1/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] SHUFFLE: /model/heads/Reshape_8 + /model/heads/Transpose_6
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_convs/pose_convs.2/seq/conv/Conv + /model/heads/head3/pose_convs/pose_convs.2/seq/act/Relu
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] SOFTMAX: /model/heads/Softmax_2
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_pred/Conv
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/Conv_2
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] MYELIN: {ForeignNode[/model/heads/head1/Slice_1.../post_process/Reshape_2]}
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] NMS: batched_nms_26
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] DEVICE_TO_SHAPE_HOST: (Unnamed Layer* 459) [NMS]_1_output[DevicetoShapeHostCopy]
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] TRAIN_STATION: [trainStation2]
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] MYELIN: {ForeignNode[/model/heads/head1/Slice...graph2_/Concat_5]}
[12/28/2023-11:53:04] [I] [TRT] [GpuLayer] TRAIN_STATION: [trainStation3]
[12/28/2023-11:53:10] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +534, GPU +444, now: CPU 1350, GPU 4042 (MiB)
[12/28/2023-11:53:11] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +83, GPU +74, now: CPU 1433, GPU 4116 (MiB)
[12/28/2023-11:53:11] [I] [TRT] Local timing cache in use. Profiling results in this builder pass will not be stored.
[12/28/2023-12:57:56] [I] [TRT] Total Activation Memory: 8060146176
[12/28/2023-12:57:56] [I] [TRT] Detected 1 inputs and 1 output network tensors.
[12/28/2023-12:58:10] [I] [TRT] Total Host Persistent Memory: 376864
[12/28/2023-12:58:10] [I] [TRT] Total Device Persistent Memory: 61440
[12/28/2023-12:58:10] [I] [TRT] Total Scratch Memory: 134217728
[12/28/2023-12:58:10] [I] [TRT] [MemUsageStats] Peak memory usage of TRT CPU/GPU memory allocators: CPU 139 MiB, GPU 2131 MiB
[12/28/2023-12:58:10] [I] [TRT] [BlockAssignment] Started assigning block shifts. This will take 209 steps to complete.
[12/28/2023-12:58:10] [I] [TRT] [BlockAssignment] Algorithm ShiftNTopDown took 267.621ms to assign 16 blocks to 209 nodes requiring 160521216 bytes.
[12/28/2023-12:58:10] [I] [TRT] Total Activation Memory: 160521216
[12/28/2023-12:58:16] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +1, GPU +0, now: CPU 1885, GPU 5668 (MiB)
[12/28/2023-12:58:16] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in building engine: CPU +104, GPU +128, now: CPU 104, GPU 128 (MiB)
[12/28/2023-12:58:17] [I] Engine built in 3922.11 sec.
[12/28/2023-12:58:17] [I] [TRT] Loaded engine size: 105 MiB
[12/28/2023-12:58:18] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +0, now: CPU 1347, GPU 5215 (MiB)
[12/28/2023-12:58:18] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +103, now: CPU 0, GPU 103 (MiB)
[12/28/2023-12:58:18] [I] Engine deserialized in 0.277587 sec.
[12/28/2023-12:58:18] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +0, now: CPU 1348, GPU 5215 (MiB)
[12/28/2023-12:58:18] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +154, now: CPU 0, GPU 257 (MiB)
[12/28/2023-12:58:18] [I] Setting persistentCacheLimit to 0 bytes.
[12/28/2023-12:58:18] [I] Using random values for input onnx::Cast_0
[12/28/2023-12:58:18] [I] Created input binding for onnx::Cast_0 with dimensions 1x3x640x640
[12/28/2023-12:58:18] [I] Using random values for output graph2_flat_predictions
[12/28/2023-12:58:18] [I] Created output binding for graph2_flat_predictions with dimensions -1x57
[12/28/2023-12:58:18] [I] Starting inference
[12/28/2023-12:58:33] [I] Warmup completed 2 queries over 200 ms
[12/28/2023-12:58:33] [I] Timing trace has 437 queries over 15.0687 s
[12/28/2023-12:58:33] [I]
[12/28/2023-12:58:33] [I] === Trace details ===
[12/28/2023-12:58:33] [I] Trace averages of 100 runs:
[12/28/2023-12:58:33] [I] Average on 100 runs - GPU latency: 34.2122 ms - Host latency: 34.3209 ms (enqueue 34.261 ms)
[12/28/2023-12:58:33] [I] Average on 100 runs - GPU latency: 34.6286 ms - Host latency: 34.7419 ms (enqueue 34.6733 ms)
[12/28/2023-12:58:33] [I] Average on 100 runs - GPU latency: 34.32 ms - Host latency: 34.4264 ms (enqueue 34.3737 ms)
[12/28/2023-12:58:33] [I] Average on 100 runs - GPU latency: 34.1213 ms - Host latency: 34.2258 ms (enqueue 34.1666 ms)
[12/28/2023-12:58:33] [I]
[12/28/2023-12:58:33] [I] === Performance summary ===
[12/28/2023-12:58:33] [I] Throughput: 29.0005 qps
[12/28/2023-12:58:33] [I] Latency: min = 32.0405 ms, max = 46.5195 ms, mean = 34.4576 ms, median = 34.021 ms, percentile(90%) = 35.2378 ms, percentile(95%) = 37.8905 ms, percentile(99%) = 44.1553 ms
[12/28/2023-12:58:33] [I] Enqueue Time: min = 32.0081 ms, max = 46.4473 ms, mean = 34.3937 ms, median = 33.9883 ms, percentile(90%) = 35.166 ms, percentile(95%) = 37.6458 ms, percentile(99%) = 44.0703 ms
[12/28/2023-12:58:33] [I] H2D Latency: min = 0.0800781 ms, max = 0.119629 ms, mean = 0.0890405 ms, median = 0.0893555 ms, percentile(90%) = 0.0913086 ms, percentile(95%) = 0.0917969 ms, percentile(99%) = 0.103516 ms
[12/28/2023-12:58:33] [I] GPU Compute Time: min = 31.9365 ms, max = 46.3989 ms, mean = 34.3493 ms, median = 33.9121 ms, percentile(90%) = 35.1299 ms, percentile(95%) = 37.7999 ms, percentile(99%) = 44.0645 ms
[12/28/2023-12:58:33] [I] D2H Latency: min = 0.00292969 ms, max = 0.0566406 ms, mean = 0.0192997 ms, median = 0.0175781 ms, percentile(90%) = 0.0292969 ms, percentile(95%) = 0.03125 ms, percentile(99%) = 0.0371094 ms
[12/28/2023-12:58:33] [I] Total Host Walltime: 15.0687 s
[12/28/2023-12:58:33] [I] Total GPU Compute Time: 15.0106 s
[12/28/2023-12:58:33] [I] Explanations of the performance metrics are printed in the verbose logs.
[12/28/2023-12:58:33] [I]
&&&& PASSED TensorRT.trtexec [TensorRT v8502] # /usr/src/tensorrt/bin/trtexec --onnx=yolo_nas_pose_l_fp32.onnx --fp16 --avgRuns=100 --duration=15 --saveEngine=yolo_nas_pose_l_fp32.onnx.fp16.engine