{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Now with SORT tracking\n",
"\n",
"Using a sort implementation originally by Alex Bewley, but adapted by [Chris Fotache](https://github.com/cfotache/pytorch_objectdetecttrack/blob/master/README.md). For an example implementation, see [his notebook](https://github.com/cfotache/pytorch_objectdetecttrack/blob/master/PyTorch_Object_Tracking.ipynb).\n",
"\n"
]
},
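{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"The tracker exposes a single `update()` call. A minimal sketch of that interface as it is used in the loop further down (the column layout is inferred from that usage, not from upstream documentation):\n",
"\n",
"```python\n",
"# detections: one row per detection, [x1, y1, x2, y2, score, ...]\n",
"dets = np.array([[570.0, 257.9, 631.4, 369.7, 0.85, 1.0]])\n",
"tracks = mot_tracker.update(dets)\n",
"# tracks: one row per active track, [x1, y1, x2, y2, track_id]\n",
"```"
]
},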
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import cv2\n",
"from pathlib import Path\n",
"import numpy as np\n",
"from PIL import Image\n",
"import torch\n",
"from torchvision.io.video import read_video\n",
"import matplotlib.pyplot as plt\n",
"from torchvision.utils import draw_bounding_boxes\n",
"from torchvision.transforms.functional import to_pil_image\n",
"from torchvision.models.detection import retinanet_resnet50_fpn_v2, RetinaNet_ResNet50_FPN_V2_Weights\n",
" "
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"source = Path('../DATASETS/VIRAT_subset_0102x')\n",
"videos = source.glob('*.mp4')"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"video_path = list(videos)[0]\n",
"video_path = Path(\"../DATASETS/VIRAT_subset_0102x/VIRAT_S_010200_00_000060_000218.mp4\")\n",
"video_path"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"device(type='cuda')"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n",
"device"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Based on code from: https://stackabuse.com/retinanet-object-detection-with-pytorch-and-torchvision/"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"weights = RetinaNet_ResNet50_FPN_V2_Weights.DEFAULT\n",
"model = retinanet_resnet50_fpn_v2(weights=weights, score_thresh=0.35)\n",
"model.to(device)\n",
"# Put the model in inference mode\n",
"model.eval()\n",
"# Get the transforms for the model's weights\n",
"preprocess = weights.transforms().to(device)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"# hub.set_dir()"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"video = cv2.VideoCapture(str(video_path))"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"> The score_thresh argument defines the threshold at which an object is detected as an object of a class. Intuitively, it's the confidence threshold, and we won't classify an object to belong to a class if the model is less than 35% confident that it belongs to a class."
]
},
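{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Passing `score_thresh` to the constructor applies this filter inside the model. An equivalent post-hoc filter on a prediction dict would look roughly like this (a sketch, assuming the output format shown below):\n",
"\n",
"```python\n",
"keep = prediction['scores'] >= 0.35\n",
"filtered = {k: v[keep] for k, v in prediction.items()}\n",
"```"
]
},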
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"The result from a single prediction coming from `model(batch)` looks like:\n",
"\n",
"```python\n",
"{'boxes': tensor([[5.7001e+02, 2.5786e+02, 6.3138e+02, 3.6970e+02],\n",
" [5.0109e+02, 2.4508e+02, 5.5308e+02, 3.4852e+02],\n",
" [3.4096e+02, 2.7015e+02, 3.6156e+02, 3.1857e+02],\n",
" [5.0219e-01, 3.7588e+02, 9.7911e+01, 7.2000e+02],\n",
" [3.4096e+02, 2.7015e+02, 3.6156e+02, 3.1857e+02],\n",
" [8.3241e+01, 5.8410e+02, 1.7502e+02, 7.1743e+02]]),\n",
" 'scores': tensor([0.8525, 0.6491, 0.5985, 0.4999, 0.3753, 0.3746]),\n",
" 'labels': tensor([64, 64, 1, 64, 18, 86])}\n",
"```"
]
},
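{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"The integer labels index into the COCO category list that ships with the weights; label `1` is `person`, which is what the loop below filters on. A quick sketch for turning labels into names, using the same `weights.meta[\"categories\"]` lookup that is commented out in the loop:\n",
"\n",
"```python\n",
"categories = weights.meta[\"categories\"]\n",
"names = [categories[i] for i in prediction['labels']]\n",
"```"
]
},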
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"/home/ruben/suspicion/trajpred/sort_cfotache.py:36: NumbaDeprecationWarning: The 'nopython' keyword argument was not supplied to the 'numba.jit' decorator. The implicit default value for this argument is currently False, but it will be changed to True in Numba 0.59.0. See https://numba.readthedocs.io/en/stable/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit for details.\n",
" def iou(bb_test,bb_gt):\n"
]
}
],
"source": [
"from sort_cfotache import Sort\n",
"\n",
"mot_tracker = Sort()\n",
"\n",
"display_image = False"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"tracked_instances = {}"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Can't receive frame (stream end?). Exiting ...\n"
]
}
],
"source": [
"# TODO make into loop\n",
"%matplotlib inline\n",
"\n",
"\n",
"import pylab as pl\n",
"from IPython import display\n",
"from utils.timer import Timer\n",
"\n",
"i=0\n",
"timer = Timer()\n",
"while True:\n",
" timer.tic()\n",
" ret, frame = video.read()\n",
" i+=1\n",
" \n",
" if not ret:\n",
" print(\"Can't receive frame (stream end?). Exiting ...\")\n",
" break\n",
"\n",
" t = torch.from_numpy(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))\n",
" t.shape\n",
" # change axes of image loaded image to be compatilbe with torch.io.read_image (which has C,W,H format instead of W,H,C)\n",
" t = t.permute(2, 0, 1)\n",
" t.shape\n",
"\n",
" batch = preprocess(t)[None, :].to(device)\n",
" # no_grad can be used on inference, should be slightly faster\n",
" with torch.no_grad():\n",
" predictions = model(batch)\n",
" prediction = predictions[0] # we feed only one frame at the once\n",
"\n",
" mask = prediction['labels'] == 1 # if we want more than one: np.isin(prediction['labels'], [1,86])\n",
"\n",
" scores = prediction['scores'][mask]\n",
" labels = prediction['labels'][mask]\n",
" boxes = prediction['boxes'][mask]\n",
" \n",
" # TODO: introduce confidence and NMS supression: https://github.com/cfotache/pytorch_objectdetecttrack/blob/master/PyTorch_Object_Tracking.ipynb\n",
" # (which I _think_ we better do after filtering)\n",
" # alternatively look at Soft-NMS https://towardsdatascience.com/non-maximum-suppression-nms-93ce178e177c\n",
"\n",
" \n",
" # dets - a numpy array of detections in the format [[x1,y1,x2,y2,score],[x1,y1,x2,y2,score],...]\n",
" detections = np.array([np.append(bbox, [score, label]) for bbox, score, label in zip(boxes.cpu(), scores.cpu(), labels.cpu())])\n",
" # print(detections)\n",
" tracks = mot_tracker.update(detections)\n",
"\n",
" # now convert back to boxes and labels\n",
" # print(tracks)\n",
" boxes = np.array([t[:4] for t in tracks])\n",
" # initialize empty with the necesserary dimensions for drawing_bounding_boxes glitch\n",
" t_boxes = torch.from_numpy(boxes) if len(boxes) else torch.Tensor().new_empty([0, 6])\n",
" labels = [str(int(t[4])) for t in tracks]\n",
" # print(t_boxes, boxes, labels)\n",
"\n",
"\n",
" for track in tracks:\n",
" # TODO add to tracked_instances\n",
" track_id = str(int(track[4]))\n",
" if track_id not in tracked_instances:\n",
" tracked_instances[track_id] = []\n",
" tracked_instances[track_id].append(track)\n",
"\n",
" \n",
" # labels = [weights.meta[\"categories\"][i] for i in labels]\n",
"\n",
" if display_image:\n",
" box = draw_bounding_boxes(t, boxes=t_boxes,\n",
" labels=labels,\n",
" colors=\"cyan\",\n",
" width=2, \n",
" font_size=30,\n",
" # font='Arial'\n",
" )\n",
"\n",
" im = to_pil_image(box.detach())\n",
"\n",
" display.display(im, f\"frame {i}\")\n",
" # print(prediction)\n",
" print(\"time for frame: \", timer.toc(), \", avg:\", 1/timer.average_time, \"fps\")\n",
"\n",
" display.clear_output(wait=True)\n",
"\n",
" # break # for now\n",
" # pl.clf()\n",
" # # pl.plot(pl.randn(100))\n",
" # pl.figure(figsize=(24,50))\n",
" # # fig.axes[0].imshow(img)\n",
" # pl.imshow(im)\n",
" # display.display(pl.gcf(), f\"frame {i}\")\n",
" # display.clear_output(wait=True)\n",
" # time.sleep(1.0)\n",
"\n",
" # fig, ax = plt.subplots(figsize=(16, 12))\n",
" # ax.imshow(im)\n",
" # plt.show()\n",
"\n"
]
},
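{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"The TODO above mentions confidence and NMS suppression. One way to add it before handing detections to the tracker is `torchvision.ops.nms`; a minimal sketch (the 0.5 IoU threshold is an arbitrary starting point, not a tuned value):\n",
"\n",
"```python\n",
"from torchvision.ops import nms\n",
"\n",
"# indices of the boxes that survive NMS at IoU 0.5, highest-scoring kept first\n",
"keep = nms(boxes, scores, iou_threshold=0.5)\n",
"boxes, scores, labels = boxes[keep], scores[keep], labels[keep]\n",
"```"
]
},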
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## Project / Homography\n",
"\n",
"Now that all trajectories are captured (for a single video), these can then be projected onto a flat surface by [homography](https://en.wikipedia.org/wiki/Homography_(computer_vision)). The necessary $H$ matrix is already provided by VIRAT in the [homographies folder](https://data.kitware.com/#folder/56f581c88d777f753209c9d2) of their online data repository."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"\n",
"homography = list(source.glob('*img2world.txt'))[0]\n",
"H = np.loadtxt(homography, delimiter=',')\n",
"\n"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"The homography matrix helps to transform points from image space to a flat world plane. The `README_homography.txt` from VIRAT describes:\n",
"\n",
"> Roughly estimated 3-by-3 homographies are included for convenience. \n",
"> Each homography H provides a mapping from image coordinate to scene-dependent world coordinate.\n",
"> \n",
"> [xw,yw,zw]' = H*[xi,yi,1]'\n",
"> \n",
"> xi: horizontal axis on image with left top corner as origin, increases right.\n",
"> yi: vertical axis on image with left top corner as origin, increases downward.\n",
"> \n",
"> xw/zw: world x coordinate\n",
"> yw/zw: world y coordiante"
]
},
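{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"So to project a single image point: append a 1, multiply by $H$, and divide by the resulting $z_w$. A minimal sketch (the pixel coordinate is made up for illustration); `cv2.perspectiveTransform`, used below, does the same normalization for whole arrays of points:\n",
"\n",
"```python\n",
"xi, yi = 600.0, 350.0  # some pixel in the frame\n",
"xw, yw, zw = H @ np.array([xi, yi, 1.0])\n",
"world_point = (xw / zw, yw / zw)\n",
"```"
]
},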
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"(1200, 900)\n"
]
},
{
"data": {
"text/plain": [
"<PIL.PngImagePlugin.PngImageFile image mode=RGB size=1200x900>"
]
},
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"print(Image.open(\"../DATASETS/VIRAT_subset_0102x/VIRAT_0102_homography_img2world.png\").size)\n",
"Image.open(\"../DATASETS/VIRAT_subset_0102x/VIRAT_0102_homography_img2world.png\")\n"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 2000x800 with 2 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from matplotlib import pyplot as plt\n",
"\n",
"fig = plt.figure(figsize=(20,8))\n",
"ax1, ax2 = fig.subplots(1,2)\n",
"\n",
"ax1.set_aspect(1)\n",
"ax2.imshow(Image.open(\"../DATASETS/VIRAT_subset_0102x/VIRAT_S_0102.jpg\"))\n",
"\n",
"for track_id in tracked_instances:\n",
" # print(track_id)\n",
" bboxes = tracked_instances[track_id]\n",
" traj = np.array([[[0.5 * (det[0]+det[2]), det[3]]] for det in bboxes])\n",
" projected_traj = cv2.perspectiveTransform(traj,H)\n",
" # plt.plot(projected_traj[:,0])\n",
" ax1.plot(projected_traj[:,:,0].reshape(-1), projected_traj[:,:,1].reshape(-1))\n",
" ax2.plot(traj[:,:,0].reshape(-1), traj[:,:,1].reshape(-1))\n",
" \n",
"plt.show() "
]
}
],
"metadata": {
"kernelspec": {
"display_name": ".venv",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.2"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "1135f674f58caf91385e41dd32dc418daf761a3c5d4526b1ac3bad0b893c2eb5"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}