【Cambricon hardware product model】Required*:
MLU220
【Operating system used】Required*:
ubuntu16.04
【Version and error information】Required*:
CNML: 7.10.2 ba20487
CNRT: 4.10.1 a884a9a
2022-12-27 18:57:58.569389: [cnrtWarning] [48412] [Card : NONE] Failed to initialize CNDEV. Host manage interface disabled
2022-12-27 18:57:58.571866: [cnrtError] [48412] [Card : NONE] No MLU can be found !
2022-12-27 18:57:58.571891: [cnrtError] [48412] [Card : NONE] Error occurred in cnrtInit during calling driver interface.
2022-12-27 18:57:58.571895: [cnrtError] [48412] [Card : NONE] Return value is 5, MLU_ERROR_NO_DEVICE, means that "No useful mlu device"
2022-12-27 18:57:58.571922: [cnmlError] No MLU device
2022-12-27 18:57:58.572040: [cnmlError] No MLU device
WARNING:root:The 'iteration' key in qconfig has been deprecated in this version. Users needn't
set this key. We will calculate quantization parameters by real forward times controlled by users.
[PosixPath('coco-eval/000000000502.jpg'), PosixPath('coco-eval/000000000508.jpg'), PosixPath('coco-eval/000000000510.jpg'), PosixPath('coco-eval/000000000514.jpg'), PosixPath('coco-eval/000000000520.jpg'), PosixPath('coco-eval/000000000529.jpg'), PosixPath('coco-eval/000000000531.jpg'), PosixPath('coco-eval/000000000532.jpg'), PosixPath('coco-eval/000000000536.jpg'), PosixPath('coco-eval/000000000540.jpg'), PosixPath('coco-eval/000000000542.jpg'), PosixPath('coco-eval/000000000544.jpg'), PosixPath('coco-eval/000000000560.jpg'), PosixPath('coco-eval/000000000562.jpg'), PosixPath('coco-eval/000000000564.jpg'), PosixPath('coco-eval/000000000569.jpg'), PosixPath('coco-eval/000000000572.jpg'), PosixPath('coco-eval/000000000575.jpg'), PosixPath('coco-eval/000000000581.jpg'), PosixPath('coco-eval/000000000584.jpg'), PosixPath('coco-eval/000000000589.jpg'), PosixPath('coco-eval/000000000590.jpg'), PosixPath('coco-eval/000000000595.jpg'), PosixPath('coco-eval/000000000597.jpg'), PosixPath('coco-eval/000000000599.jpg'), PosixPath('coco-eval/g.jpg')]
coco-eval/000000000564.jpg
coco-eval/000000000590.jpg
coco-eval/000000000510.jpg
coco-eval/000000000575.jpg
coco-eval/000000000562.jpg
coco-eval/000000000520.jpg
coco-eval/000000000599.jpg
coco-eval/000000000536.jpg
coco-eval/000000000569.jpg
coco-eval/000000000508.jpg
test.pth save over
******************** using mlu ************************
2022-12-27 18:57:59.574036: [cnrtError] [48412] [Card : NONE] input param is invalid device handle in cnrtSetCurrentDevice
[ERROR][/pytorch/catch/torch_mlu/csrc/aten/core/tensor_impl.cpp][line:866][cpu_data][thread:140569947932416][process:48412]:
Both cpu_storage and mlu_storage are not initialized!
Please check is there any invalid tensor operates such as:
output = input.cpu() or output = input.to("cpu") in pytorch model when doing mlu/mfus inference.
Can not call cpu_data on an empty tensor.
[WARNING][/pytorch/catch/torch_mlu/csrc/aten/operators/op_methods.cpp][line:68][copy_][thread:140569947932416][process:48412]:
copy_ Op cannot run on MLU device, start running on CPU!
[ERROR][/pytorch/catch/torch_mlu/csrc/aten/core/tensor_impl.cpp][line:866][cpu_data][thread:140569947932416][process:48412]:
Both cpu_storage and mlu_storage are not initialized!
Please check is there any invalid tensor operates such as:
output = input.cpu() or output = input.to("cpu") in pytorch model when doing mlu/mfus inference.
Can not call cpu_data on an empty tensor.
[WARNING][/pytorch/catch/torch_mlu/csrc/aten/operators/op_methods.cpp][line:2791][equal][thread:140569947932416][process:48412]:
equal Op cannot run on MLU device, start running on CPU!
[ERROR][/pytorch/catch/torch_mlu/csrc/aten/core/tensor_impl.cpp][line:866][cpu_data][thread:140569947932416][process:48412]:
Both cpu_storage and mlu_storage are not initialized!
Please check is there any invalid tensor operates such as:
output = input.cpu() or output = input.to("cpu") in pytorch model when doing mlu/mfus inference.
Can not call cpu_data on an empty tensor.
[WARNING][/pytorch/catch/torch_mlu/csrc/aten/operators/op_methods.cpp][line:68][copy_][thread:140569947932416][process:48412]:
copy_ Op cannot run on MLU device, start running on CPU!
[ERROR][/pytorch/catch/torch_mlu/csrc/aten/core/tensor_impl.cpp][line:866][cpu_data][thread:140569947932416][process:48412]:
Both cpu_storage and mlu_storage are not initialized!
Please check is there any invalid tensor operates such as:
output = input.cpu() or output = input.to("cpu") in pytorch model when doing mlu/mfus inference.
Traceback (most recent call last):
  File "quantification_transform.py", line 182, in <module>
    category = args.category)
  File "quantification_transform.py", line 155, in mlu_forward
    fusion_model(example_tensor)
  File "/torch/venv3/pytorch/lib/python3.6/site-packages/torch/nn/modules/module.py", line 541, in __call__
    result = self.forward(*input, **kwargs)
RuntimeError: Can not call cpu_data on an empty tensor. (cpu_data at /pytorch/catch/torch_mlu/csrc/aten/core/tensor_impl.cpp:870)
#0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 0x57 (0x7fd8f7d2e237 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libc10.so)
#1: torch_mlu::MLUTensorImpl::cpu_data() + 0xed4 (0x7fd810fc5bb4 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#2: torch_mlu::copy_to_cpu(at::Tensor&, at::Tensor const&) + 0x120 (0x7fd810fd4c80 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#3: torch_mlu::OpMethods::copy_(at::Tensor&, at::Tensor const&, bool) + 0x4c (0x7fd8114bb3cc in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#4: torch_mlu::CnmlOps::copy_(at::Tensor&, at::Tensor const&, bool) + 0x295 (0x7fd811270d15 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#5: torch_mlu::AtenMluType::copy_(at::Tensor&, at::Tensor const&, bool) + 0x3b (0x7fd810f9171b in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#6: <unknown function> + 0xb2b6d7 (0x7fd8cbe4f6d7 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#7: at::native::to(at::Tensor const&, c10::TensorOptions const&, bool, bool) + 0x342 (0x7fd8cbe50c22 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#8: <unknown function> + 0xe9e203 (0x7fd8cc1c2203 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#9: <unknown function> + 0x5bdb73 (0x7fd8114bab73 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#10: at::Tensor::cpu() const + 0xca (0x7fd81150630a in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#11: torch_mlu::OpMethods::equal(at::Tensor const&, at::Tensor const&) + 0x59 (0x7fd8114fc2b9 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#12: torch_mlu::CnmlOps::equal(at::Tensor const&, at::Tensor const&) + 0x214 (0x7fd811274b94 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#13: torch_mlu::AtenMluType::equal(at::Tensor const&, at::Tensor const&) + 0x37 (0x7fd810f924f7 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch_mlu/csrc/lib/libaten_mlu.so)
#14: <unknown function> + 0x2887bc4 (0x7fd8cdbabbc4 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#15: at::Tensor::equal(at::Tensor const&) const + 0x100 (0x7fd8f8b57020 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
#16: <unknown function> + 0x2cff884 (0x7fd8ce023884 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#17: <unknown function> + 0x2d6da7b (0x7fd8ce091a7b in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#18: <unknown function> + 0x2d6cce0 (0x7fd8ce090ce0 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#19: torch::jit::ConstantPooling(std::shared_ptr<torch::jit::Graph> const&) + 0xc1 (0x7fd8ce091681 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#20: <unknown function> + 0x2cca0eb (0x7fd8cdfee0eb in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#21: <unknown function> + 0x2ccb3a5 (0x7fd8cdfef3a5 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#22: <unknown function> + 0x2ccb57b (0x7fd8cdfef57b in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#23: <unknown function> + 0x2cc2a79 (0x7fd8cdfe6a79 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#24: torch::jit::Function::run(std::vector<c10::IValue, std::allocator<c10::IValue> >&) + 0x63 (0x7fd8ce2a84a3 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch.so)
#25: <unknown function> + 0x560b80 (0x7fd8f8a46b80 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
#26: <unknown function> + 0x561455 (0x7fd8f8a47455 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
#27: <unknown function> + 0x532414 (0x7fd8f8a18414 in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
#28: <unknown function> + 0x1dce3a (0x7fd8f86c2e3a in /torch/venv3/pytorch/lib/python3.6/site-packages/torch/lib/libtorch_python.so)
<omitting python frames>
#55: __libc_start_main + 0xf0 (0x7fd8fcff6840 in /lib/x86_64-linux-gnu/libc.so.6)
With the same model conversion code, converting yolov5 works fine, but converting the fastestdet model fails with the errors above.
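Since the torch_mlu error text repeatedly points at "output = input.cpu() or output = input.to(\"cpu\") in pytorch model when doing mlu/mfus inference", one difference between the two models may be that FastestDet's forward or post-processing pulls an intermediate tensor back to the CPU, which yolov5's graph does not. Below is a minimal sketch of that anti-pattern, assuming a hypothetical detection head; the class name and the .cpu() call site are illustrative only, not FastestDet's actual code:

import torch
import torch.nn as nn

class DetHead(nn.Module):
    """Hypothetical head, only to illustrate the pattern the error message warns about."""
    def forward(self, x):
        # Anti-pattern for mlu/mfus fused inference: moving an intermediate
        # tensor to the CPU inside forward(). At trace/fusion time the MLU
        # storage may not be materialized yet, so .cpu() hits an empty tensor
        # and raises "Can not call cpu_data on an empty tensor".
        anchors = x.detach().cpu()          # <-- avoid inside the fused graph
        return x * anchors.to(x.device)

A safer pattern is to keep every tensor inside forward() on the current device and do CPU-side work (decoding, NMS, numpy conversion) on the model outputs only after the fused call returns; it may be worth searching FastestDet's model code for .cpu(), .to("cpu"), .numpy() or .item() calls on that path.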