YOLO-NAS-Pose-JetPack5 / yolo_nas_pose_l_fp16.onnx.fp16.engine.log
Upload TensorRT & ONNX Model files
127e10d
&&&& RUNNING TensorRT.trtexec [TensorRT v8502] # /usr/src/tensorrt/bin/trtexec --onnx=yolo_nas_pose_l_fp16.onnx --fp16 --avgRuns=100 --duration=15 --saveEngine=yolo_nas_pose_l_fp16.onnx.fp16.engine
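The run above builds the engine from ONNX and serializes it via --saveEngine. As a hedged sketch (not part of this log), the serialized engine can later be benchmarked without rebuilding by using the standard trtexec --loadEngine flag; the file path below simply mirrors the --saveEngine value from the command above.

```shell
# Sketch: reuse the already-built engine instead of re-parsing the ONNX model.
# --loadEngine, --fp16, --avgRuns and --duration are standard trtexec flags
# on TensorRT 8.5; the engine path matches --saveEngine in the run above.
/usr/src/tensorrt/bin/trtexec \
  --loadEngine=yolo_nas_pose_l_fp16.onnx.fp16.engine \
  --fp16 \
  --avgRuns=100 \
  --duration=15
```

Loading a prebuilt engine skips the (multi-minute) tactic-timing build phase, which matters on an 8-SM Orin like the device shown below.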
[12/28/2023-16:09:46] [I] === Model Options ===
[12/28/2023-16:09:46] [I] Format: ONNX
[12/28/2023-16:09:46] [I] Model: yolo_nas_pose_l_fp16.onnx
[12/28/2023-16:09:46] [I] Output:
[12/28/2023-16:09:46] [I] === Build Options ===
[12/28/2023-16:09:46] [I] Max batch: explicit batch
[12/28/2023-16:09:46] [I] Memory Pools: workspace: default, dlaSRAM: default, dlaLocalDRAM: default, dlaGlobalDRAM: default
[12/28/2023-16:09:46] [I] minTiming: 1
[12/28/2023-16:09:46] [I] avgTiming: 8
[12/28/2023-16:09:46] [I] Precision: FP32+FP16
[12/28/2023-16:09:46] [I] LayerPrecisions:
[12/28/2023-16:09:46] [I] Calibration:
[12/28/2023-16:09:46] [I] Refit: Disabled
[12/28/2023-16:09:46] [I] Sparsity: Disabled
[12/28/2023-16:09:46] [I] Safe mode: Disabled
[12/28/2023-16:09:46] [I] DirectIO mode: Disabled
[12/28/2023-16:09:46] [I] Restricted mode: Disabled
[12/28/2023-16:09:46] [I] Build only: Disabled
[12/28/2023-16:09:46] [I] Save engine: yolo_nas_pose_l_fp16.onnx.fp16.engine
[12/28/2023-16:09:46] [I] Load engine:
[12/28/2023-16:09:46] [I] Profiling verbosity: 0
[12/28/2023-16:09:46] [I] Tactic sources: Using default tactic sources
[12/28/2023-16:09:46] [I] timingCacheMode: local
[12/28/2023-16:09:46] [I] timingCacheFile:
[12/28/2023-16:09:46] [I] Heuristic: Disabled
[12/28/2023-16:09:46] [I] Preview Features: Use default preview flags.
[12/28/2023-16:09:46] [I] Input(s) format: fp32:CHW
[12/28/2023-16:09:46] [I] Output(s) format: fp32:CHW
[12/28/2023-16:09:46] [I] Input build shapes: model
[12/28/2023-16:09:46] [I] Input calibration shapes: model
[12/28/2023-16:09:46] [I] === System Options ===
[12/28/2023-16:09:46] [I] Device: 0
[12/28/2023-16:09:46] [I] DLACore:
[12/28/2023-16:09:46] [I] Plugins:
[12/28/2023-16:09:46] [I] === Inference Options ===
[12/28/2023-16:09:46] [I] Batch: Explicit
[12/28/2023-16:09:46] [I] Input inference shapes: model
[12/28/2023-16:09:46] [I] Iterations: 10
[12/28/2023-16:09:46] [I] Duration: 15s (+ 200ms warm up)
[12/28/2023-16:09:46] [I] Sleep time: 0ms
[12/28/2023-16:09:46] [I] Idle time: 0ms
[12/28/2023-16:09:46] [I] Streams: 1
[12/28/2023-16:09:46] [I] ExposeDMA: Disabled
[12/28/2023-16:09:46] [I] Data transfers: Enabled
[12/28/2023-16:09:46] [I] Spin-wait: Disabled
[12/28/2023-16:09:46] [I] Multithreading: Disabled
[12/28/2023-16:09:46] [I] CUDA Graph: Disabled
[12/28/2023-16:09:46] [I] Separate profiling: Disabled
[12/28/2023-16:09:46] [I] Time Deserialize: Disabled
[12/28/2023-16:09:46] [I] Time Refit: Disabled
[12/28/2023-16:09:46] [I] NVTX verbosity: 0
[12/28/2023-16:09:46] [I] Persistent Cache Ratio: 0
[12/28/2023-16:09:46] [I] Inputs:
[12/28/2023-16:09:46] [I] === Reporting Options ===
[12/28/2023-16:09:46] [I] Verbose: Disabled
[12/28/2023-16:09:46] [I] Averages: 100 inferences
[12/28/2023-16:09:46] [I] Percentiles: 90,95,99
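With these reporting options, the end of the log (truncated here) contains per-run latency summary lines with min/max/mean fields. As an illustration only, a mean latency can be pulled out of such a line with sed; the sample line and its values below are assumptions in trtexec's summary format, not figures from this log.

```shell
# Hedged sketch: extract the mean from a trtexec "GPU Compute Time" summary
# line. The sample line is illustrative; the real values in this log are
# truncated and unknown.
line='[12/28/2023-16:10:30] [I] GPU Compute Time: min = 60.1 ms, max = 65.2 ms, mean = 61.5 ms'
mean=$(printf '%s\n' "$line" | sed -n 's/.*mean = \([0-9.]*\) ms.*/\1/p')
echo "$mean"
```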
[12/28/2023-16:09:46] [I] Dump refittable layers: Disabled
[12/28/2023-16:09:46] [I] Dump output: Disabled
[12/28/2023-16:09:46] [I] Profile: Disabled
[12/28/2023-16:09:46] [I] Export timing to JSON file:
[12/28/2023-16:09:46] [I] Export output to JSON file:
[12/28/2023-16:09:46] [I] Export profile to JSON file:
[12/28/2023-16:09:46] [I]
[12/28/2023-16:09:46] [I] === Device Information ===
[12/28/2023-16:09:46] [I] Selected Device: Orin
[12/28/2023-16:09:46] [I] Compute Capability: 8.7
[12/28/2023-16:09:46] [I] SMs: 8
[12/28/2023-16:09:46] [I] Compute Clock Rate: 0.624 GHz
[12/28/2023-16:09:46] [I] Device Global Memory: 7471 MiB
[12/28/2023-16:09:46] [I] Shared Memory per SM: 164 KiB
[12/28/2023-16:09:46] [I] Memory Bus Width: 128 bits (ECC disabled)
[12/28/2023-16:09:46] [I] Memory Clock Rate: 0.624 GHz
[12/28/2023-16:09:46] [I]
[12/28/2023-16:09:46] [I] TensorRT version: 8.5.2
[12/28/2023-16:09:46] [I] [TRT] [MemUsageChange] Init CUDA: CPU +220, GPU +0, now: CPU 249, GPU 2974 (MiB)
[12/28/2023-16:09:49] [I] [TRT] [MemUsageChange] Init builder kernel library: CPU +302, GPU +283, now: CPU 574, GPU 3280 (MiB)
[12/28/2023-16:09:49] [I] Start parsing network model
[12/28/2023-16:09:49] [I] [TRT] ----------------------------------------------------------------
[12/28/2023-16:09:49] [I] [TRT] Input filename: yolo_nas_pose_l_fp16.onnx
[12/28/2023-16:09:49] [I] [TRT] ONNX IR version: 0.0.8
[12/28/2023-16:09:49] [I] [TRT] Opset version: 17
[12/28/2023-16:09:49] [I] [TRT] Producer name: pytorch
[12/28/2023-16:09:49] [I] [TRT] Producer version: 2.1.2
[12/28/2023-16:09:49] [I] [TRT] Domain:
[12/28/2023-16:09:49] [I] [TRT] Model version: 0
[12/28/2023-16:09:49] [I] [TRT] Doc string:
[12/28/2023-16:09:49] [I] [TRT] ----------------------------------------------------------------
[12/28/2023-16:09:49] [I] Finish parsing network model
[12/28/2023-16:09:50] [I] [TRT] ---------- Layers Running on DLA ----------
[12/28/2023-16:09:50] [I] [TRT] ---------- Layers Running on GPU ----------
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] TRAIN_STATION: [trainStation1]
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] MYELIN: {ForeignNode[/pre_process/pre_process.0/Cast.../pre_process/pre_process.2/Mul]}
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONSTANT: (Unnamed Layer* 458) [Constant]
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONSTANT: (Unnamed Layer* 459) [Constant]
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONSTANT: (Unnamed Layer* 460) [Constant]
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stem/conv/rbr_reparam/Conv + /model/backbone/stem/conv/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/downsample/rbr_reparam/Conv + /model/backbone/stage1/downsample/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/conv2/conv/Conv + /model/backbone/stage1/blocks/conv2/act/Relu || /model/backbone/stage1/blocks/conv1/conv/Conv + /model/backbone/stage1/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage1.blocks.bottlenecks.0.alpha + (Unnamed Layer* 15) [Shuffle] + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage1.blocks.bottlenecks.1.alpha + (Unnamed Layer* 23) [Shuffle] + /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage1/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage1/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage1/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage1/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage1/blocks/conv3/conv/Conv + /model/backbone/stage1/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/reduce_skip2/conv/Conv + /model/neck/neck2/reduce_skip2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/downsample/rbr_reparam/Conv + /model/backbone/stage2/downsample/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/downsample/conv/Conv + /model/neck/neck2/downsample/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/conv2/conv/Conv + /model/backbone/stage2/blocks/conv2/act/Relu || /model/backbone/stage2/blocks/conv1/conv/Conv + /model/backbone/stage2/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage2.blocks.bottlenecks.0.alpha + (Unnamed Layer* 44) [Shuffle] + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage2.blocks.bottlenecks.1.alpha + (Unnamed Layer* 52) [Shuffle] + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage2.blocks.bottlenecks.2.alpha + (Unnamed Layer* 60) [Shuffle] + /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/Mul, /model/backbone/stage2/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/bottlenecks/bottlenecks.1/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage2/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage2/blocks/conv3/conv/Conv + /model/backbone/stage2/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/reduce_skip2/conv/Conv + /model/neck/neck1/reduce_skip2/act/Relu || /model/neck/neck2/reduce_skip1/conv/Conv + /model/neck/neck2/reduce_skip1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/downsample/rbr_reparam/Conv + /model/backbone/stage3/downsample/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/downsample/conv/Conv + /model/neck/neck1/downsample/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/conv2/conv/Conv + /model/backbone/stage3/blocks/conv2/act/Relu || /model/backbone/stage3/blocks/conv1/conv/Conv + /model/backbone/stage3/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.0.alpha + (Unnamed Layer* 83) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.1.alpha + (Unnamed Layer* 91) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.2.alpha + (Unnamed Layer* 99) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.3.alpha + (Unnamed Layer* 107) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv1/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv2/rbr_reparam/Conv + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage3.blocks.bottlenecks.4.alpha + (Unnamed Layer* 115) [Shuffle] + /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/Mul, /model/backbone/stage3/blocks/bottlenecks/bottlenecks.4/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.1/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.2/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/bottlenecks/bottlenecks.3/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage3/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage3/blocks/conv3/conv/Conv + /model/backbone/stage3/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/reduce_skip1/conv/Conv + /model/neck/neck1/reduce_skip1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/downsample/rbr_reparam/Conv + /model/backbone/stage4/downsample/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/conv2/conv/Conv + /model/backbone/stage4/blocks/conv2/act/Relu || /model/backbone/stage4/blocks/conv1/conv/Conv + /model/backbone/stage4/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage4.blocks.bottlenecks.0.alpha + (Unnamed Layer* 134) [Shuffle] + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/Mul, /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.backbone.stage4.blocks.bottlenecks.1.alpha + (Unnamed Layer* 142) [Shuffle] + /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/Mul, /model/backbone/stage4/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage4/blocks/conv1/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage4/blocks/bottlenecks/bottlenecks.0/Add_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/stage4/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/stage4/blocks/conv3/conv/Conv + /model/backbone/stage4/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/context_module/cv1/conv/Conv + /model/backbone/context_module/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POOLING: /model/backbone/context_module/m.2/MaxPool
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POOLING: /model/backbone/context_module/m.1/MaxPool
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POOLING: /model/backbone/context_module/m.0/MaxPool
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/backbone/context_module/cv1/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/backbone/context_module/cv2/conv/Conv + /model/backbone/context_module/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/conv/conv/Conv + /model/neck/neck1/conv/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] DECONVOLUTION: /model/neck/neck1/upsample/ConvTranspose
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/reduce_after_concat/conv/Conv + /model/neck/neck1/reduce_after_concat/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/conv2/conv/Conv + /model/neck/neck1/blocks/conv2/act/Relu || /model/neck/neck1/blocks/conv1/conv/Conv + /model/neck/neck1/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.0.alpha + (Unnamed Layer* 171) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.1.alpha + (Unnamed Layer* 179) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.2.alpha + (Unnamed Layer* 187) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv1/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv2/rbr_reparam/Conv + /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck1.blocks.bottlenecks.3.alpha + (Unnamed Layer* 195) [Shuffle] + /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck1/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/neck/neck1/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck1/blocks/conv3/conv/Conv + /model/neck/neck1/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/conv/conv/Conv + /model/neck/neck2/conv/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] DECONVOLUTION: /model/neck/neck2/upsample/ConvTranspose
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/neck/neck2/reduce_skip1/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/reduce_after_concat/conv/Conv + /model/neck/neck2/reduce_after_concat/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/conv2/conv/Conv + /model/neck/neck2/blocks/conv2/act/Relu || /model/neck/neck2/blocks/conv1/conv/Conv + /model/neck/neck2/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.0.alpha + (Unnamed Layer* 216) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.1.alpha + (Unnamed Layer* 224) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.2.alpha + (Unnamed Layer* 232) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv1/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv1/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv2/rbr_reparam/Conv + /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/cv2/nonlinearity/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck2.blocks.bottlenecks.3.alpha + (Unnamed Layer* 240) [Shuffle] + /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck2/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/neck/neck2/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck2/blocks/conv3/conv/Conv + /model/neck/neck2/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/bbox_stem/seq/conv/Conv + /model/heads/head1/bbox_stem/seq/act/Relu || /model/heads/head1/pose_stem/seq/conv/Conv + /model/heads/head1/pose_stem/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/conv/conv/Conv + /model/neck/neck3/conv/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/reg_convs/reg_convs.0/seq/conv/Conv + /model/heads/head1/reg_convs/reg_convs.0/seq/act/Relu || /model/heads/head1/cls_convs/cls_convs.0/seq/conv/Conv + /model/heads/head1/cls_convs/cls_convs.0/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/pose_convs/pose_convs.0/seq/conv/Conv + /model/heads/head1/pose_convs/pose_convs.0/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/conv2/conv/Conv + /model/neck/neck3/blocks/conv2/act/Relu || /model/neck/neck3/blocks/conv1/conv/Conv + /model/neck/neck3/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/cls_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/reg_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/pose_convs/pose_convs.1/seq/conv/Conv + /model/heads/head1/pose_convs/pose_convs.1/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] SHUFFLE: /model/heads/Reshape + /model/heads/Transpose
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head1/pose_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] SOFTMAX: /model/heads/Softmax
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.0.alpha + (Unnamed Layer* 271) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.1.alpha + (Unnamed Layer* 294) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.2.alpha + (Unnamed Layer* 302) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv1/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv2/conv/Conv + /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck3.blocks.bottlenecks.3.alpha + (Unnamed Layer* 310) [Shuffle] + /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck3/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/neck/neck3/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck3/blocks/conv3/conv/Conv + /model/neck/neck3/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_stem/seq/conv/Conv + /model/heads/head2/pose_stem/seq/act/Relu || /model/heads/head2/bbox_stem/seq/conv/Conv + /model/heads/head2/bbox_stem/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/conv/conv/Conv + /model/neck/neck4/conv/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/reg_convs/reg_convs.0/seq/conv/Conv + /model/heads/head2/reg_convs/reg_convs.0/seq/act/Relu || /model/heads/head2/cls_convs/cls_convs.0/seq/conv/Conv + /model/heads/head2/cls_convs/cls_convs.0/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_convs/pose_convs.0/seq/conv/Conv + /model/heads/head2/pose_convs/pose_convs.0/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/conv2/conv/Conv + /model/neck/neck4/blocks/conv2/act/Relu || /model/neck/neck4/blocks/conv1/conv/Conv + /model/neck/neck4/blocks/conv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/cls_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/reg_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_convs/pose_convs.1/seq/conv/Conv + /model/heads/head2/pose_convs/pose_convs.1/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] SHUFFLE: /model/heads/Reshape_4 + /model/heads/Transpose_3
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head2/pose_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] SOFTMAX: /model/heads/Softmax_1
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.0.alpha + (Unnamed Layer* 341) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.0/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/Conv_1
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.1.alpha + (Unnamed Layer* 364) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.1/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.2.alpha + (Unnamed Layer* 372) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.2/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv1/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv1/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv2/conv/Conv + /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/cv2/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] POINTWISE: PWN(model.neck.neck4.blocks.bottlenecks.3.alpha + (Unnamed Layer* 380) [Shuffle] + /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/Mul, /model/neck/neck4/blocks/bottlenecks/bottlenecks.3/Add)
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] COPY: /model/neck/neck4/blocks/conv2/act/Relu_output_0 copy
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/neck/neck4/blocks/conv3/conv/Conv + /model/neck/neck4/blocks/conv3/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/bbox_stem/seq/conv/Conv + /model/heads/head3/bbox_stem/seq/act/Relu || /model/heads/head3/pose_stem/seq/conv/Conv + /model/heads/head3/pose_stem/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/reg_convs/reg_convs.0/seq/conv/Conv + /model/heads/head3/reg_convs/reg_convs.0/seq/act/Relu || /model/heads/head3/cls_convs/cls_convs.0/seq/conv/Conv + /model/heads/head3/cls_convs/cls_convs.0/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_convs/pose_convs.0/seq/conv/Conv + /model/heads/head3/pose_convs/pose_convs.0/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/cls_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/reg_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_convs/pose_convs.1/seq/conv/Conv + /model/heads/head3/pose_convs/pose_convs.1/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] SHUFFLE: /model/heads/Reshape_8 + /model/heads/Transpose_6
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_convs/pose_convs.2/seq/conv/Conv + /model/heads/head3/pose_convs/pose_convs.2/seq/act/Relu
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] SOFTMAX: /model/heads/Softmax_2
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/head3/pose_pred/Conv
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] CONVOLUTION: /model/heads/Conv_2
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] MYELIN: {ForeignNode[/model/heads/head1/Slice_1...cast_boxes_to_fp32]}
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] NMS: batched_nms_26
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] DEVICE_TO_SHAPE_HOST: (Unnamed Layer* 462) [NMS]_1_output[DevicetoShapeHostCopy]
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] TRAIN_STATION: [trainStation2]
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] MYELIN: {ForeignNode[/model/heads/head1/Slice...graph2_/Concat_5]}
[12/28/2023-16:09:50] [I] [TRT] [GpuLayer] TRAIN_STATION: [trainStation3]
[12/28/2023-16:10:01] [I] [TRT] [MemUsageChange] Init cuBLAS/cuBLASLt: CPU +534, GPU +353, now: CPU 1231, GPU 3827 (MiB)
[12/28/2023-16:10:03] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +82, GPU +65, now: CPU 1313, GPU 3892 (MiB)
[12/28/2023-16:10:03] [I] [TRT] Local timing cache in use. Profiling results in this builder pass will not be stored.
[12/28/2023-17:14:35] [I] [TRT] Total Activation Memory: 8058785792
[12/28/2023-17:14:35] [I] [TRT] Detected 1 inputs and 1 output network tensors.
[12/28/2023-17:14:49] [I] [TRT] Total Host Persistent Memory: 387328
[12/28/2023-17:14:49] [I] [TRT] Total Device Persistent Memory: 51712
[12/28/2023-17:14:49] [I] [TRT] Total Scratch Memory: 134217728
[12/28/2023-17:14:49] [I] [TRT] [MemUsageStats] Peak memory usage of TRT CPU/GPU memory allocators: CPU 69 MiB, GPU 2131 MiB
[12/28/2023-17:14:49] [I] [TRT] [BlockAssignment] Started assigning block shifts. This will take 196 steps to complete.
[12/28/2023-17:14:49] [I] [TRT] [BlockAssignment] Algorithm ShiftNTopDown took 90.8216ms to assign 13 blocks to 196 nodes requiring 160489472 bytes.
[12/28/2023-17:14:49] [I] [TRT] Total Activation Memory: 160489472
[12/28/2023-17:14:55] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU -16, now: CPU 1661, GPU 5679 (MiB)
[12/28/2023-17:14:55] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in building engine: CPU +16, GPU +128, now: CPU 16, GPU 128 (MiB)
[12/28/2023-17:14:55] [I] Engine built in 3909.9 sec.
[12/28/2023-17:14:56] [I] [TRT] Loaded engine size: 105 MiB
[12/28/2023-17:14:56] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU +0, now: CPU 1347, GPU 5281 (MiB)
[12/28/2023-17:14:56] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +103, now: CPU 0, GPU 103 (MiB)
[12/28/2023-17:14:56] [I] Engine deserialized in 0.281193 sec.
[12/28/2023-17:14:56] [I] [TRT] [MemUsageChange] Init cuDNN: CPU +0, GPU -1, now: CPU 1348, GPU 5280 (MiB)
[12/28/2023-17:14:56] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +154, now: CPU 0, GPU 257 (MiB)
[12/28/2023-17:14:56] [I] Setting persistentCacheLimit to 0 bytes.
[12/28/2023-17:14:56] [I] Using random values for input onnx::Cast_0
[12/28/2023-17:14:56] [I] Created input binding for onnx::Cast_0 with dimensions 1x3x640x640
[12/28/2023-17:14:56] [I] Using random values for output graph2_flat_predictions
[12/28/2023-17:14:56] [I] Created output binding for graph2_flat_predictions with dimensions -1x57
[12/28/2023-17:14:56] [I] Starting inference
[12/28/2023-17:15:12] [I] Warmup completed 1 queries over 200 ms
[12/28/2023-17:15:12] [I] Timing trace has 446 queries over 15.0628 s
[12/28/2023-17:15:12] [I]
[12/28/2023-17:15:12] [I] === Trace details ===
[12/28/2023-17:15:12] [I] Trace averages of 100 runs:
[12/28/2023-17:15:12] [I] Average on 100 runs - GPU latency: 33.8846 ms - Host latency: 34.0014 ms (enqueue 33.9409 ms)
[12/28/2023-17:15:12] [I] Average on 100 runs - GPU latency: 33.6823 ms - Host latency: 33.7942 ms (enqueue 33.7436 ms)
[12/28/2023-17:15:12] [I] Average on 100 runs - GPU latency: 33.5307 ms - Host latency: 33.6433 ms (enqueue 33.5802 ms)
[12/28/2023-17:15:12] [I] Average on 100 runs - GPU latency: 33.377 ms - Host latency: 33.4884 ms (enqueue 33.4514 ms)
[12/28/2023-17:15:12] [I]
[12/28/2023-17:15:12] [I] === Performance summary ===
[12/28/2023-17:15:12] [I] Throughput: 29.6093 qps
[12/28/2023-17:15:12] [I] Latency: min = 31.7529 ms, max = 44.1514 ms, mean = 33.7469 ms, median = 33.4268 ms, percentile(90%) = 34.5583 ms, percentile(95%) = 37.8339 ms, percentile(99%) = 42.0027 ms
[12/28/2023-17:15:12] [I] Enqueue Time: min = 31.7227 ms, max = 44.1133 ms, mean = 33.6924 ms, median = 33.4424 ms, percentile(90%) = 34.3545 ms, percentile(95%) = 37.7542 ms, percentile(99%) = 41.9526 ms
[12/28/2023-17:15:12] [I] H2D Latency: min = 0.0800781 ms, max = 0.114746 ms, mean = 0.0954138 ms, median = 0.0969238 ms, percentile(90%) = 0.100098 ms, percentile(95%) = 0.100586 ms, percentile(99%) = 0.103027 ms
[12/28/2023-17:15:12] [I] GPU Compute Time: min = 31.6406 ms, max = 44.0381 ms, mean = 33.6343 ms, median = 33.311 ms, percentile(90%) = 34.4478 ms, percentile(95%) = 37.7091 ms, percentile(99%) = 41.8918 ms
[12/28/2023-17:15:12] [I] D2H Latency: min = 0.00292969 ms, max = 0.0541992 ms, mean = 0.017179 ms, median = 0.0146484 ms, percentile(90%) = 0.0290527 ms, percentile(95%) = 0.0314941 ms, percentile(99%) = 0.0361328 ms
[12/28/2023-17:15:12] [I] Total Host Walltime: 15.0628 s
[12/28/2023-17:15:12] [I] Total GPU Compute Time: 15.0009 s
[12/28/2023-17:15:12] [I] Explanations of the performance metrics are printed in the verbose logs.
[12/28/2023-17:15:12] [I]
&&&& PASSED TensorRT.trtexec [TensorRT v8502] # /usr/src/tensorrt/bin/trtexec --onnx=yolo_nas_pose_l_fp16.onnx --fp16 --avgRuns=100 --duration=15 --saveEngine=yolo_nas_pose_l_fp16.onnx.fp16.engine