**Experiment: Deep Learning and Applications - Pedestrian Tracking**
 ------
 **1. Experiment Objectives**
 ------
 - Understand the basic processing pipeline of a pedestrian tracking model
 - Become familiar with the basic principles of pedestrian tracking models
 - Learn to fine-tune, train, and run inference with a pedestrian tracking model
 - Learn to apply a pedestrian tracking model to real problems, and understand how to use it in specific scenarios and tasks
 **2. Experiment Environment**
 ------
 **[Image Details]**
 Number of virtual machines: 1 (requires a GPU with >= 4 GB of memory)
 Virtual machine information:
1. Operating system: Ubuntu 20.04
2. Code location: /home/zkpk/experiment/yolo_tracking_main
3. MOT17 dataset location: examples/val_utils/data/MOT17
    (dataset download: https://motchallenge.net)
4. Installed software: Python 3.9, GPU driver, CUDA 11.3, cuDNN 8.4.1, torch==1.12.1+cu113, torchvision==0.13.1+cu113
5. Set up the Python environment according to requirements.txt (a setup sketch follows)
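
A minimal setup sketch for item 5, assuming pip is used and the CUDA 11.3 driver from item 4 is already installed:

```shell
cd /home/zkpk/experiment/yolo_tracking_main
# Install the exact torch/torchvision builds listed above from the CUDA 11.3 wheel index.
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 \
    --extra-index-url https://download.pytorch.org/whl/cu113
# Then install the project's remaining dependencies.
pip install -r requirements.txt
```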
**3. Experiment Content**
 ------
 - Prepare the multi-object tracking dataset MOT17 (download from https://motchallenge.net) and place it in the project at examples/val_utils/data/MOT17
 - Run pedestrian tracking experiments with different tracking algorithms
 - Fine-tune the tracking model's parameters based on the observed results
 - Perform pedestrian tracking on an offline video
 **4. Key Points**
 ------
 -  Download the dataset and place it in the specified folder
 -  Set up the Python virtual environment required by the algorithms
 -  Understand the algorithmic foundations of pedestrian tracking
 -  Have sufficient coding skills to solve practical problems
   
 **5. Experiment Results**
 ------
 Screenshot of the pedestrian tracking result:
 
   <center>Figure 1</center>
Pedestrian tracking video:
Object tracking
**6. Experiment Steps**
 ------
 - 6.1 Prepare the dataset: download the multi-object tracking dataset MOT17 from https://motchallenge.net and place it at examples/val_utils/data/MOT17, as shown in the figure and the layout sketch below:
   
   <center>Figure 2</center>
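
  For reference, the evaluation step in 6.3 reads sequences such as train/MOT17-02-FRCNN/img1 (visible in its log), so after extraction the dataset should follow the standard MOTChallenge layout, sketched roughly below:

  ```
  examples/val_utils/data/MOT17/
  └── train/
      ├── MOT17-02-FRCNN/
      │   ├── img1/          # per-frame JPEGs: 000001.jpg, 000002.jpg, ...
      │   ├── gt/            # ground-truth annotations (gt.txt)
      │   └── seqinfo.ini    # sequence metadata (resolution, frame rate, length)
      ├── MOT17-04-FRCNN/
      │   └── ...
      └── ...
  ```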
 - 6.2 Run real-time pedestrian tracking on a video. First, enter yolo_tracking_main/examples with the following command:
   
   ```shell
    cd /home/zkpk/experiment/yolo_tracking_main/examples
   ```
 Then run the track.py script with the following command:
 ```shell
 python track.py --yolo-model weights/yolov8n.pt --tracking-method deepocsort \
     --reid-model weights/lmbn_n_cuhk03_d.pt --source testvideo.mp4 \
     --conf 0.3 --iou 0.5
 # --tracking-method accepts any of: deepocsort, botsort, strongsort, ocsort, bytetrack
```
 These names correspond to five different multi-object trackers, any of which can be used to track pedestrians; a comparison sketch follows below.
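
 As a convenience, the hedged sketch below (it only recombines flags already used in this manual) runs the same clip through all five trackers so their results can be compared side by side:

 ```shell
 # Run the same video through each of the five supported trackers.
 # The ReID model is only consulted by the appearance-based trackers
 # (deepocsort, botsort, strongsort); motion-only trackers ignore it.
 for method in deepocsort botsort strongsort ocsort bytetrack; do
     python track.py --yolo-model weights/yolov8n.pt \
         --tracking-method "$method" \
         --reid-model weights/lmbn_n_cuhk03_d.pt \
         --source testvideo.mp4 --conf 0.3 --iou 0.5
 done
 ```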

A sample run log:
 ```
 Successfully loaded imagenet pretrained weights from "weights\osnet_x1_0_imagenet.pth"
 video 1/1 (1/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 63.4ms
 video 1/1 (2/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (3/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 11.0ms
 video 1/1 (4/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (5/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (6/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 11.0ms
 video 1/1 (7/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 13.0ms
 video 1/1 (8/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 14.0ms
 video 1/1 (9/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 14.0ms
 video 1/1 (10/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (11/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 13.0ms
 video 1/1 (12/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 9.0ms
 video 1/1 (13/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 11.0ms
 video 1/1 (14/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 14.0ms
 video 1/1 (15/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (16/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (17/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 9.0ms
 video 1/1 (18/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 13.0ms
 video 1/1 (19/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (20/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 14.0ms
 video 1/1 (21/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 13.0ms
 video 1/1 (22/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 11.0ms
 video 1/1 (23/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (24/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 15.0ms
 video 1/1 (25/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 11.0ms
 video 1/1 (26/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 10.0ms
 video 1/1 (27/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 11.0ms
 video 1/1 (28/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 13.0ms
 video 1/1 (29/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 4 persons, 13.0ms
 video 1/1 (30/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 4 persons, 13.0ms
 video 1/1 (31/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 14.0ms
 video 1/1 (32/2385) E:\PycharmProjects\yolo_tracking_main\examples\testvideo.mp4: 480x640 3 persons, 14.0ms
```
 - 6.3 If the tracking results from step 6.2 are unsatisfactory, the model parameters can be tuned against the MOT17 dataset (this is only possible once the dataset from step 6.1 is in place). Run the following command:
 ```shell
 # val.py is the evaluation entry point in examples/ (see the val:eval lines in the log below)
 python val.py --yolo-model weights/yolov8n.pt --tracking-method deepocsort --benchmark MOT17 --conf 0.45
```
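Since --conf is the main knob tuned in this step, a hedged sweep (reusing only the flags shown above) can help locate the best-scoring value; compare the MOT metrics printed at the end of each run:

```shell
# Evaluate several detection confidence thresholds on the MOT17 benchmark.
for conf in 0.25 0.35 0.45 0.55; do
    python val.py --yolo-model weights/yolov8n.pt \
        --tracking-method deepocsort --benchmark MOT17 --conf "$conf"
done
```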
Log from the parameter-tuning run:
```
 2023-11-17 17:34:48.482 | INFO     | val:eval:204 - Staring evaluation process on E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1
 2023-11-17 17:34:48.560 | INFO     | val:eval:204 - Staring evaluation process on E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1
 2023-11-17 17:35:00.221 | SUCCESS  | boxmot.appearance.reid_model_factory:load_pretrained_weights:207 - Successfully loaded pretrained weights from "E:\PycharmProjects\yolo_tracking_main\examples\weights\osnet_x0_25_msmt17.pt"
 2023-11-17 17:35:00.221 | WARNING  | boxmot.appearance.reid_model_factory:load_pretrained_weights:211 - The following layers are discarded due to unmatched keys or layer size: ('classifier.weight', 'classifier.bias')
 2023-11-17 17:35:00.228 | SUCCESS  | boxmot.appearance.reid_model_factory:load_pretrained_weights:207 - Successfully loaded pretrained weights from "E:\PycharmProjects\yolo_tracking_main\examples\weights\osnet_x0_25_msmt17.pt"
 2023-11-17 17:35:00.228 | WARNING  | boxmot.appearance.reid_model_factory:load_pretrained_weights:211 - The following layers are discarded due to unmatched keys or layer size: ('classifier.weight', 'classifier.bias')
 image 1/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000001.jpg: 736x1280 11 persons, 610.4ms
 image 1/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000001.jpg: 736x1280 25 persons, 652.3ms
 image 2/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000002.jpg: 736x1280 9 persons, 442.2ms
 image 2/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000002.jpg: 736x1280 22 persons, 454.5ms
 image 3/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000003.jpg: 736x1280 9 persons, 370.0ms
 image 3/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000003.jpg: 736x1280 24 persons, 450.9ms
 image 4/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000004.jpg: 736x1280 9 persons, 460.8ms
 image 4/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000004.jpg: 736x1280 23 persons, 385.0ms
 image 5/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000005.jpg: 736x1280 10 persons, 460.4ms
 image 5/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000005.jpg: 736x1280 22 persons, 399.6ms
 image 6/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000006.jpg: 736x1280 10 persons, 443.0ms
 image 7/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000007.jpg: 736x1280 10 persons, 460.8ms
 image 6/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000006.jpg: 736x1280 22 persons, 429.7ms
 image 8/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000008.jpg: 736x1280 10 persons, 434.5ms
 image 7/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000007.jpg: 736x1280 24 persons, 448.3ms
 image 9/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000009.jpg: 736x1280 10 persons, 386.9ms
 image 8/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000008.jpg: 736x1280 23 persons, 476.7ms
 image 10/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000010.jpg: 736x1280 11 persons, 869.1ms
 image 9/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000009.jpg: 736x1280 23 persons, 453.9ms
 image 11/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000011.jpg: 736x1280 11 persons, 460.8ms
 image 10/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000010.jpg: 736x1280 23 persons, 428.2ms
 image 12/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000012.jpg: 736x1280 9 persons, 439.9ms
 image 11/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000011.jpg: 736x1280 19 persons, 470.3ms
 image 13/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000013.jpg: 736x1280 10 persons, 440.8ms
 image 12/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000012.jpg: 736x1280 19 persons, 460.8ms
 image 14/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000014.jpg: 736x1280 9 persons, 434.2ms
 image 13/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000013.jpg: 736x1280 20 persons, 439.0ms
 image 15/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000015.jpg: 736x1280 8 persons, 384.9ms
 image 14/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000014.jpg: 736x1280 20 persons, 440.8ms
 image 16/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000016.jpg: 736x1280 8 persons, 462.8ms
 image 15/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000015.jpg: 736x1280 20 persons, 451.8ms
 image 17/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000017.jpg: 736x1280 8 persons, 470.7ms
 image 16/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000016.jpg: 736x1280 22 persons, 486.0ms
 image 18/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000018.jpg: 736x1280 7 persons, 410.9ms
 image 17/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000017.jpg: 736x1280 23 persons, 425.9ms
 image 19/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000019.jpg: 736x1280 7 persons, 380.0ms
 image 20/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000020.jpg: 736x1280 8 persons, 436.8ms
 image 18/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000018.jpg: 736x1280 23 persons, 447.8ms
 image 21/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000021.jpg: 736x1280 8 persons, 476.0ms
 image 19/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000019.jpg: 736x1280 21 persons, 518.6ms
 image 22/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000022.jpg: 736x1280 8 persons, 360.0ms
 image 20/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000020.jpg: 736x1280 22 persons, 388.6ms
 image 23/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000023.jpg: 736x1280 9 persons, 391.0ms
 image 21/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000021.jpg: 736x1280 22 persons, 416.9ms
 image 24/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000024.jpg: 736x1280 9 persons, 458.8ms
 image 22/1050 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-04-FRCNN\img1\000022.jpg: 736x1280 22 persons, 404.7ms
 image 25/600 E:\PycharmProjects\yolo_tracking_main\examples\val_utils\data\MOT17\train\MOT17-02-FRCNN\img1\000025.jpg: 736x1280 9 persons, 443.9ms
 ```
**7. Discussion Questions**
 ------
 -  Consider what further improvements could be made to pedestrian tracking algorithms
 -  Think about how the tracking model could be applied to hand-motion tracking
 -  Think about how to tune model and training parameters to improve the model's metrics
**8. Lab Report**
 ------
 Write the lab report according to the required report format.