DEPTHAI_LEVEL=debug python3 main.py --name frog
[2022-01-18 13:28:48.081] [debug] Python bindings - version: 2.14.1.0.dev+27fa4519f289498e84768ab5229a1a45efb7e4df from 2022-01-12 23:20:36 +0100 build: 2022-01-12 22:35:15 +0000
[2022-01-18 13:28:48.081] [debug] Library information - version: 2.14.1, commit: 55408918f4fb299cad45d292f8268aa93b1c10bb from 2022-01-12 23:20:24 +0100, build: 2022-01-12 22:27:24 +0000
[2022-01-18 13:28:48.082] [debug] Initialize - finished
Creating pipeline...
Creating Color Camera...
Creating Face Detection Neural Network...
Creating Head pose estimation NN
Creating face recognition ImageManip/NN
[2022-01-18 13:28:48.155] [debug] Resources - Archive 'depthai-bootloader-fwp-0.0.15.tar.xz' open: 3ms, archive read: 69ms
[2022-01-18 13:28:48.425] [debug] Resources - Archive 'depthai-device-fwp-8c8315579295367f49100d0884157ad9c93339b6.tar.xz' open: 2ms, archive read: 340ms
[2022-01-18 13:28:48.766] [debug] Device - OpenVINO version: 2021.2
[2022-01-18 13:28:48.766] [debug] Patching OpenVINO FW version from 2021.4 to 2021.2
[14442C10D12853D000] [38.900] [system] [info] Memory Usage - DDR: 0.12 / 357.39 MiB, CMX: 2.05 / 2.50 MiB, LeonOS Heap: 6.71 / 78.54 MiB, LeonRT Heap: 2.88 / 25.72 MiB
[14442C10D12853D000] [38.900] [system] [info] Temperatures - Average: 36.30 °C, CSS: 37.48 °C, MSS 36.30 °C, UPA: 35.59 °C, DSS: 35.83 °C
[14442C10D12853D000] [38.900] [system] [info] Cpu Usage - LeonOS 10.28%, LeonRT: 1.83%
[2022-01-18 13:28:50.210] [debug] Schema dump: {"connections":[{"node1Id":10,"node1Output":"out","node1OutputGroup":"","node2Id":11,"node2Input":"in","node2InputGroup":""},{"node1Id":8,"node1Output":"out","node1OutputGroup":"","node2Id":10,"node2Input":"in","node2InputGroup":""},{"node1Id":4,"node1Output":"manip2_cfg","node1OutputGroup":"io","node2Id":9,"node2Input":"in","node2InputGroup":""},{"node1Id":4,"node1Output":"manip2_img","node1OutputGroup":"io","node2Id":8,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":4,"node1Output":"manip2_cfg","node1OutputGroup":"io","node2Id":8,"node2Input":"inputConfig","node2InputGroup":""},{"node1Id":6,"node1Output":"out","node1OutputGroup":"","node2Id":7,"node2Input":"in","node2InputGroup":""},{"node1Id":4,"node1Output":"manip_img","node1OutputGroup":"io","node2Id":6,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":4,"node1Output":"manip_cfg","node1OutputGroup":"io","node2Id":6,"node2Input":"inputConfig","node2InputGroup":""},{"node1Id":5,"node1Output":"out","node1OutputGroup":"","node2Id":2,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":0,"node1Output":"preview","node1OutputGroup":"","node2Id":5,"node2Input":"inputImage","node2InputGroup":""},{"node1Id":7,"node1Output":"out","node1OutputGroup":"","node2Id":4,"node2Input":"headpose_in","node2InputGroup":"io"},{"node1Id":7,"node1Output":"passthrough","node1OutputGroup":"","node2Id":4,"node2Input":"headpose_pass","node2InputGroup":"io"},{"node1Id":5,"node1Output":"out","node1OutputGroup":"","node2Id":4,"node2Input":"preview","node2InputGroup":"io"},{"node1Id":3,"node1Output":"passthrough","node1OutputGroup":"","node2Id":4,"node2Input":"face_pass","node2InputGroup":"io"},{"node1Id":3,"node1Output":"out","node1OutputGroup":"","node2Id":4,"node2Input":"face_det_in","node2InputGroup":"io"},{"node1Id":2,"node1Output":"out","node1OutputGroup":"","node2Id":3,"node2Input":"in","node2InputGroup":""},{"node1Id":0,"node1Output":"video","node1OutputGroup":"","node2Id":1,"node2Input":"in","node2InputGroup":""}],"globalProperties":{"calibData":null,"cameraTunin
gBlobSize":null,"cameraTuningBlobUri":"","leonCssFrequencyHz":700000000.0,"leonMssFrequencyHz":700000000.0,"pipelineName":null,"pipelineVersion":null,"xlinkChunkSize":-1},"nodes":[[0,{"id":0,"ioInfo":[[["","preview"],{"blocking":false,"group":"","name":"preview","queueSize":8,"type":0,"waitForMessage":false}],[["","still"],{"blocking":false,"group":"","name":"still","queueSize":8,"type":0,"waitForMessage":false}],[["","isp"],{"blocking":false,"group":"","name":"isp","queueSize":8,"type":0,"waitForMessage":false}],[["","video"],{"blocking":false,"group":"","name":"video","queueSize":8,"type":0,"waitForMessage":false}],[["","raw"],{"blocking":false,"group":"","name":"raw","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":false,"group":"","name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputControl"],{"blocking":true,"group":"","name":"inputControl","queueSize":8,"type":3,"waitForMessage":false}]],"name":"ColorCamera","properties":[185,18,185,20,0,3,0,185,3,0,0,0,185,5,0,0,0,0,0,185,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,255,0,0,0,129,48,4,129,48,4,133,48,4,133,48,4,255,255,0,136,0,0,240,65,136,0,0,128,191,136,0,0,128,191,1,185,4,0,0,0,0]}],[1,{"id":1,"ioInfo":[[["","in"],{"blocking":true,"group":"","name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,5,102,114,97,109,101,0]}],[2,{"id":2,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputImage"],{"blocking":true,"group":"","name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,3,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,133,44,1,133,44,1,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,2,7,0,0,1,1,0,0,134,0,0,16,0,4]}],[3,{"id":3,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","passthrough"],{"blocking":false,"group":"","name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":true,"group":"","name":"in","queueSize":5,"type":3,"waitForMessage":true}]],"name":"DetectionNetwork","properties":[185,12,1,130,192,117,20,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0,136,0,0,0,63,0,0,186,0,187,0,136,0,0,0,0]}],[4,{"id":4,"ioInfo":[[["io","manip_img"],{"blocking":false,"group":"io","name":"manip_img","queueSize":8,"type":0,"waitForMessage":false}],[["io","manip2_cfg"],{"blocking":false,"group":"io","name":"manip2_cfg","queueSize":8,"type":0,"waitForMessage":false}],[["io","face_det_in"],{"blocking":true,"group":"io","name":"face_det_in","queueSize":8,"type":3,"waitForMessage":false}],[["io","manip2_img"],{"blocking":false,"group":"io","name":"manip2_img","queueSize":8,"type":0,"waitForMessage":false}],[["io","face_pass"],{"blocking":true,"group":"io","name":"face_pass","queueSize":8,"type":3,"waitForMessage":false}],[["io","manip_cfg"],{"blocking":false,"group":"io","name":"manip_cfg","queueSize":8,"type":0,"waitForMessage":false}],[["io","preview"],{"blocking":true,"group":"io","name":"preview","queueSize":8,"type":3,"waitForMessage":false}],[["io","headpose_in"],{"blocking":true,"group":"io","name":"headpose_in","queueSize":8,"type":3,"wai
tForMessage":false}],[["io","headpose_pass"],{"blocking":true,"group":"io","name":"headpose_pass","queueSize":8,"type":3,"waitForMessage":false}]],"name":"Script","properties":[185,3,189,14,97,115,115,101,116,58,95,95,115,99,114,105,112,116,189,8,60,115,99,114,105,112,116,62,1]}],[5,{"id":5,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputImage"],{"blocking":true,"group":"","name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,3,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,0,0,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,2,7,0,0,0,0,0,0,134,0,155,52,0,20]}],[6,{"id":6,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputImage"],{"blocking":true,"group":"","name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,3,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,60,60,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,2,7,0,0,1,0,0,0,134,0,0,16,0,4]}],[7,{"id":7,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","passthrough"],{"blocking":false,"group":"","name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":true,"group":"","name":"in","queueSize":5,"type":3,"waitForMessage":true}]],"name":"NeuralNetwork","properties":[185,5,130,0,153,58,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0]}],[8,{"id":8,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","inputConfig"],{"blocking":true,"group":"","name":"inputConfig","queueSize":8,"type":3,"waitForMessage":false}],[["","inputImage"],{"blocking":true,"group":"","name":"inputImage","queueSize":8,"type":3,"waitForMessage":true}]],"name":"ImageManip","properties":[185,3,185,8,185,7,185,4,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,185,3,185,2,136,0,0,0,0,136,0,0,0,0,185,2,136,0,0,0,0,136,0,0,0,0,136,0,0,0,0,0,136,0,0,128,63,136,0,0,128,63,0,1,185,15,112,112,0,0,0,0,186,0,1,0,186,0,0,0,136,0,0,0,0,0,1,185,2,7,0,0,1,0,0,0,134,0,0,16,0,4]}],[9,{"id":9,"ioInfo":[[["","in"],{"blocking":true,"group":"","name":"in","queueSize":8,"type":3,"waitForMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,16,102,97,99,101,95,114,101,99,95,99,102,103,95,111,117,116,0]}],[10,{"id":10,"ioInfo":[[["","out"],{"blocking":false,"group":"","name":"out","queueSize":8,"type":0,"waitForMessage":false}],[["","passthrough"],{"blocking":false,"group":"","name":"passthrough","queueSize":8,"type":0,"waitForMessage":false}],[["","in"],{"blocking":true,"group":"","name":"in","queueSize":5,"type":3,"waitForMessage":true}]],"name":"NeuralNetwork","properties":[185,5,130,192,74,73,0,189,12,97,115,115,101,116,58,95,95,98,108,111,98,8,0,0]}],[11,{"id":11,"ioInfo":[[["","in"],{"blocking":true,"group":"","name":"in","queueSize":8,"type":3,"waitF
orMessage":true}]],"name":"XLinkOut","properties":[185,3,136,0,0,128,191,189,7,97,114,99,95,111,117,116,0]}]]} [2022-01-18 13:28:50.211] [debug] Asset map dump: {"map":{"/node/10/__blob":{"alignment":64,"offset":0,"size":4803264},"/node/3/__blob":{"alignment":64,"offset":8646720,"size":1340864},"/node/4/__script":{"alignment":64,"offset":8643520,"size":3137},"/node/7/__blob":{"alignment":64,"offset":4803264,"size":3840256}}} [14442C10D12853D000] [39.040] [system] [info] ImageManip internal buffer size '203904'B, shave buffer size '23552'B [14442C10D12853D000] [39.040] [system] [info] SIPP (Signal Image Processing Pipeline) internal buffer size '16384'B [14442C10D12853D000] [39.075] [system] [info] NeuralNetwork allocated resources: shaves: [0-12] cmx slices: [0-12] [14442C10D12853D000] [39.075] [system] [info] ColorCamera allocated resources: no shaves; cmx slices: [13-15] [14442C10D12853D000] [39.075] [system] [info] ImageManip allocated resources: shaves: [15-15] no cmx slices. [14442C10D12853D000] [39.089] [NeuralNetwork(10)] [info] Needed resources: shaves: 4, ddr: 1605632 [14442C10D12853D000] [39.089] [NeuralNetwork(10)] [warning] Network compiled for 4 shaves, maximum available 13, compiling for 6 shaves likely will yield in better performance [14442C10D12853D000] [39.091] [DetectionNetwork(3)] [info] Needed resources: shaves: 6, ddr: 2728832 [14442C10D12853D000] [39.323] [NeuralNetwork(7)] [info] Needed resources: shaves: 6, ddr: 21632 [14442C10D12853D000] [39.339] [NeuralNetwork(10)] [warning] The issued warnings are orientative, based on optimal settings for a single network, if multiple networks are running in parallel the optimal settings may vary [14442C10D12853D000] [39.339] [NeuralNetwork(10)] [info] Inference thread count: 2, number of shaves allocated per thread: 4, number of Neural Compute Engines (NCE) allocated per thread: 1 [14442C10D12853D000] [39.339] [DetectionNetwork(3)] [info] Inference thread count: 2, number of shaves allocated per thread: 6, number of Neural Compute Engines (NCE) allocated per thread: 1 [14442C10D12853D000] [39.340] [NeuralNetwork(7)] [info] Inference thread count: 2, number of shaves allocated per thread: 6, number of Neural Compute Engines (NCE) allocated per thread: 1 [14442C10D12853D000] [39.901] [system] [info] Memory Usage - DDR: 143.69 / 357.39 MiB, CMX: 2.30 / 2.50 MiB, LeonOS Heap: 21.75 / 78.54 MiB, LeonRT Heap: 7.52 / 25.72 MiB [14442C10D12853D000] [39.901] [system] [info] Temperatures - Average: 39.11 °C, CSS: 40.51 °C, MSS 38.65 °C, UPA: 38.88 °C, DSS: 38.41 °C [14442C10D12853D000] [39.901] [system] [info] Cpu Usage - LeonOS 27.42%, LeonRT: 81.06% [2022-01-18 13:28:51.658] [debug] DataOutputQueue (face_rec_cfg_out) closed [2022-01-18 13:28:51.689] [debug] Device about to be closed... 
[2022-01-18 13:28:51.700] [debug] Watchdog thread exception caught: Couldn't write data to stream: '__watchdog' (X_LINK_ERROR)
[2022-01-18 13:28:51.819] [debug] DataOutputQueue (arc_out) closed
[2022-01-18 13:28:51.819] [debug] Timesync thread exception caught: Couldn't read data from stream: '__timesync' (X_LINK_ERROR)
[2022-01-18 13:28:51.819] [debug] DataOutputQueue (frame) closed
[2022-01-18 13:28:51.819] [debug] Log thread exception caught: Couldn't read data from stream: '__log' (X_LINK_ERROR)
[2022-01-18 13:28:53.798] [debug] XLinkResetRemote of linkId: (0)
[2022-01-18 13:28:53.798] [debug] Device closed, 2109
Traceback (most recent call last):
  File "main.py", line 262, in
    cfg = recCfgQ.tryGet()
RuntimeError: Communication exception - possible device error/misconfiguration. Original message 'Unexpected Encoding Type'
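For context, here is a minimal host-side sketch of the read pattern that the traceback points at. It assumes the output queues are opened from the XLinkOut stream names visible in the log ("frame", "face_rec_cfg_out", "arc_out"); the pipeline construction, the loop body, and every variable name other than recCfgQ are assumptions, not the actual main.py.

```python
import depthai as dai

pipeline = dai.Pipeline()
# ... ColorCamera, ImageManip, DetectionNetwork, Script and NeuralNetwork nodes,
# plus the XLinkOut nodes behind the "frame", "face_rec_cfg_out" and "arc_out"
# streams, would be created here as in main.py (omitted in this sketch) ...

with dai.Device(pipeline) as device:
    # Host-side queues named after the XLinkOut streams seen in the schema dump
    frameQ = device.getOutputQueue("frame", maxSize=4, blocking=False)
    recCfgQ = device.getOutputQueue("face_rec_cfg_out", maxSize=4, blocking=False)
    arcQ = device.getOutputQueue("arc_out", maxSize=4, blocking=False)

    while True:
        # Non-blocking poll: tryGet() returns None when nothing is queued.
        # This is the call at main.py:262 that raised the RuntimeError above.
        cfg = recCfgQ.tryGet()
        if cfg is None:
            continue
        # ... consume the message (per the schema dump, this stream carries the
        # ImageManipConfig emitted by the Script node) ...
```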