kagula
2021-11-11
Overview: a guide to learning the NVIDIA Jetson Nano 2GB.
Learning environment:
- An Intel x86 computer running Windows 10
- Jetson Nano 2GB developer kit
- CSI-interface IMX219 camera (best purchased together with the board), 64/128 GB SD card
1. The developer kit does not ship with a fan; the following may help you choose one
Both the NF-A4x20 5V PWM and NF-A4x10 5V PWM are great choices for upgrading the cooling of NVIDIA’s Jetson Nano Developer Kit. The NF-A4x20 5V PWM provides slightly better performance and is officially recommended by NVIDIA. Being slightly smaller (40x40x10mm instead of 40x40x20mm), the NF-A4x10 5V PWM is an excellent choice for more compact cases that cannot fit 20mm thick fans. A 3D printed mesh case design that fits the NF-A4x20 can be found here.
Both models support PWM-based speed control and can be connected directly to the 4-pin PWM fan header on the Jetson Nano Developer Kit, no adaptors are required. As the Jetson Nano provides 5V, please make sure to purchase the 5V PWM versions of the fans. The regular 12V PWM versions will not work. For a tutorial on how to set up PWM-based fan control, please see this thread.
The mounting holes of the NF-A4x20 5V PWM and NF-A4x10 5V PWM align with the holes in the heatsink of the NVIDIA Jetson Nano Developer Kit. Please note that the holes of the heatsink are 2.5mm in diameter and not threaded, so you can either use self-tapping M3 screws or regular M2 or M2.5 screws with nuts for mounting the fan to the heatsink. As it is not easy to hold the nuts in place, we recommend using self-tapping M3 screws, but please note that there can be aluminum shavings from tapping the thread, so make sure to clean off all shavings from the motherboard using canned air or a vacuum cleaner in order to avoid short-circuits. We recommend 15mm long screws for installing the NF-A4x10 and 25mm long screws for installing the NF-A4x20.
The fan speed can be adjusted by following the link below:
"Jetson nano: PWM fan speed control" (momodosky's blog on CSDN): install the hardware temperature tool lm-sensors with sudo apt install lm-sensors; once installed, run sensors to read the temperatures. Manual PWM fan speed: sudo sh -c 'echo XXX > /sys/devices/pwm-fan/target_pwm', where XXX is in the range 0-255 and sets the speed. For automatic speed control, Python 3 is required: check with python3 -V, and install with sudo apt install python3-dev if missing.
https://blog.csdn.net/momodosky/article/details/116118301
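The manual command above can be wrapped in a small script so that an out-of-range value never gets written to sysfs. This is only a sketch: clamp_pwm is a hypothetical helper name, and the sysfs path is the one quoted from the post above.

```shell
# Hypothetical helper: clamp a requested fan speed into the valid PWM range 0-255
clamp_pwm() {
    v="$1"
    [ "$v" -lt 0 ] && v=0
    [ "$v" -gt 255 ] && v=255
    echo "$v"
}

clamp_pwm 300   # prints 255

# Usage on the Nano (needs root; path as given in the CSDN post above):
#   sudo sh -c "echo $(clamp_pwm 300) > /sys/devices/pwm-fan/target_pwm"
```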
2. The developer kit does not include a case; with the material below you can make your own
Jetson Nano 2GB Developer Kit User Guide | NVIDIA Developer
Note: the page above contains the "Jetson Nano 2GB Developer Kit 3D CAD STEP Model", as well as instructions for installing the CSI-2 camera.
3. Installing and configuring the operating system; see
《Getting Started with Jetson Nano 2GB Developer Kit》
Getting Started with Jetson Nano 2GB Developer Kit | NVIDIA Developer
The work in this step: install and configure VNC so that the Jetson Nano can be reached via remote desktop from Windows 10, then verify that Docker can run the L4T image and that the camera works.
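One common way to enable the built-in Vino VNC server on the stock L4T Ubuntu 18.04 image is sketched below, based on NVIDIA's L4T VNC notes; verify the exact steps against the Getting Started guide for your JetPack version. The password 'mypassword' is a placeholder.

```shell
# Start the Vino VNC server automatically with the graphical session
cd /usr/lib/systemd/user/graphical-session.target.wants
sudo ln -s ../vino-server.service ./

# Disable the confirmation prompt and encryption so standard
# Windows VNC clients can connect
gsettings set org.gnome.Vino prompt-enabled false
gsettings set org.gnome.Vino require-encryption false

# Use classic VNC password authentication ('mypassword' is a placeholder)
gsettings set org.gnome.Vino authentication-methods "['vnc']"
gsettings set org.gnome.Vino vnc-password "$(echo -n 'mypassword' | base64)"
```

After a reboot, connect from Windows 10 with any VNC viewer pointed at the Nano's IP address.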
Unless your work or studies genuinely require it, do not upgrade the system or its packages. I ran into containers that would no longer run after an upgrade; for how to fix images that fail to run after upgrading, see reference [1].
Below is the version information of my Jetson Nano 2G:
$ uname -r
4.9.253-tegra
$ apt show nvidia-jetpack
Package: nvidia-jetpack
Version: 4.6-b199
Priority: standard
Section: metapackages
Maintainer: NVIDIA Corporation
Installed-Size: 199 kB
Depends: nvidia-cuda (= 4.6-b199), nvidia-opencv (= 4.6-b199), nvidia-cudnn8 (= 4.6-b199), nvidia-tensorrt (= 4.6-b199), nvidia-visionworks (= 4.6-b199), nvidia-container (= 4.6-b199), nvidia-vpi (= 4.6-b199), nvidia-l4t-jetson-multimedia-api (>> 32.6-0), nvidia-l4t-jetson-multimedia-api (<< 32.7-0)
Homepage: http://developer.nvidia.com/jetson
Download-Size: 29.4 kB
APT-Sources: https://repo.download.nvidia.com/jetson/t210 r32.6/main arm64 Packages
Description: NVIDIA Jetpack meta Package
$ cat /etc/nv_tegra_release
# R32 (release), REVISION: 6.1, GCID: 27863751, BOARD: t210ref, EABI: aarch64, DATE: Mon Jul 26 19:20:30 UTC 2021
$ docker --version
Docker version 20.10.2, build 20.10.2-0ubuntu1~18.04.2
$ sudo cat /proc/device-tree/nvidia,dtsfilename
/dvs/git/dirty/git-master_linux/kernel/kernel-4.9/arch/arm64/boot/dts/../../../../../../hardware/nvidia/platform/t210/batuu/kernel-dts/tegra210-p3448-0003-p3542-0000.dts
4. NVIDIA "Hello AI World": image classification and detection using the prebuilt container
Clicking the "Hello AI World" link in the NVIDIA documentation leads to "GitHub - dusty-nv/jetson-inference: Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson." Because of my network conditions, a direct git clone failed, so I downloaded the repository as a zip archive and extracted it on the Jetson Nano board, into my /home/kagula directory.
Anywhere the scripts try to download files, they may fail, so the models also had to be fetched as archives. Open the "Releases · dusty-nv/jetson-inference · GitHub" page, download the two model archives GoogleNet.tar.gz and SSD-Mobilenet-v2.tar.gz, and extract them into /home/kagula/Downloads/jetson-inference-master/data/networks.
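The extraction step above can be done in one go; a sketch, assuming the archives were saved next to the unpacked zip (adjust the paths to wherever you put them):

```shell
# Extract the downloaded model archives into the networks directory
# used by jetson-inference (paths as used in this guide)
NET_DIR=/home/kagula/Downloads/jetson-inference-master/data/networks
tar -xzf GoogleNet.tar.gz -C "$NET_DIR"
tar -xzf SSD-Mobilenet-v2.tar.gz -C "$NET_DIR"
```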
chmod +x /home/kagula/Downloads/jetson-inference-master/dockerrun.sh
Then run this script to enter the container. Once inside the container, change to the /jetson-inference/build/aarch64/bin directory.
Check the CSI camera: run "video-viewer csi://0"; a window appears on the desktop showing the frames captured from the camera at 720p.
imageNet classifies images using the GoogleNet model by default. The command below identifies the source image as a jellyfish and labels the result image accordingly; it takes roughly 77+ ms. The images/test/ directory in the container is shared with the host, which makes it easy to access the output.
#./imagenet images/jellyfish.jpg images/test/jellyfish.jpg
detectNet detects objects in images using the SSD-Mobilenet model by default. The command below finds four pedestrians in the source image and marks them in the result image; it takes roughly 181+ ms.
#./detectnet images/peds_0.jpg images/test/peds_0.jpg
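Both tools above follow the same input-to-output pattern. As a small illustration (out_path is a hypothetical helper, used only to build the mirrored output path), a loop can batch-process every sample image inside the container:

```shell
# Hypothetical helper: mirror an input image path into the shared
# images/test/ output directory
out_path() {
    echo "images/test/$(basename "$1")"
}

out_path images/jellyfish.jpg   # prints images/test/jellyfish.jpg

# Inside the container (working dir /jetson-inference/build/aarch64/bin):
#   for f in images/*.jpg; do ./imagenet "$f" "$(out_path "$f")"; done
```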
On the host, the images produced by the two commands above can be found under /home/kagula/Downloads/jetson-inference-master/data/images/test.
A ready-made C++ build environment: the container started with "dockerrun.sh" already contains a working C++ build environment and can build and run the my-recognition imageNet C++ example below. Before running it, download the ResNet-18.tar.gz model from the releases page mentioned above.
jetson-inference/imagenet-example-2.md at master · dusty-nv/jetson-inference · GitHub
To test the C++ build environment in the container, we can write a few lines of code in a test.cu file:
#include <stdio.h>

__global__ void cuda_hello()
{
    printf("\nHello CUDA\n\n");
}

int main()
{
    cuda_hello<<<1, 1>>>();
    cudaDeviceSynchronize();
    return 0;
}
nvcc test.cu && ./a.out
5. Common Docker examples
Copy the test.cpp file from the host into the container's /home directory:
$ sudo docker container ls
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
edf2aafa715f nvcr.io/nvidia/l4t-base:r32.4.3 "/bin/bash" 3 hours ago Up 3 hours exciting_chaplygin
$ sudo docker cp ./test.cpp edf2aafa715f:/home
To exit the container, use the exit command or Ctrl+D.
Mount a host directory into the container:
$ docker/run.sh --volume /my/host/path:/my/container/path # these should be absolute paths
References
[1]《Docker fails to create container after upgrading docker on Jetpack 4.9 #108》
https://github.com/dusty-nv/jetson-containers/issues/108
[2]《Taking Your First Picture with CSI or USB Camera》
https://developer.nvidia.com/embedded/learn/tutorials/first-picture-csi-usb-camera
[3]《Camera Streaming and Multimedia》
On the parameters for specifying input and output streams.
https://github.com/dusty-nv/jetson-inference/blob/master/docs/aux-streaming.md



