fix predict oom and add some comments
lyuwenyu committed Jul 28, 2023
1 parent 2d22a7c commit 3b5cbcf
Showing 5 changed files with 20 additions and 15 deletions.
10 changes: 8 additions & 2 deletions rtdetr_paddle/README.md
@@ -108,6 +108,11 @@ path/to/custom/data/

3. Modify model config [`pretrain_weights`](configs/rtdetr/_base_/rtdetr_r50vd.yml) to coco pretrained parameters url in model zoo.

```bash
# or override pretrain_weights on the command line

fleetrun --gpus=0,1,2,3 tools/train.py -c configs/rtdetr/rtdetr_r50vd_6x_coco.yml -o pretrain_weights=https://bj.bcebos.com/v1/paddledet/models/rtdetr_r50vd_6x_coco.pdparams --eval
```
</details>


@@ -171,10 +176,11 @@ trtexec --onnx=./rtdetr_r50vd_6x_coco.onnx \
<details>
<summary>1. Parameters and FLOPs </summary>

-1. Find and modify [paddle flops source code](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/hapi/dynamic_flops.py#L28)
+1. Find and modify the Paddle [`dynamic_flops.py`](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/hapi/dynamic_flops.py#L28) source file on your local machine

```python
-# anaconda3/lib/python3.8/site-packages/paddle/hapi/dynamic_flops.py
+# e.g. /path/to/anaconda3/lib/python3.8/site-packages/paddle/hapi/dynamic_flops.py

def flops(net, input_size, inputs=None, custom_ops=None, print_detail=False):
    if isinstance(net, nn.Layer):
        # If net is a dy2stat model, net.forward is StaticFunction instance,
```
2 changes: 1 addition & 1 deletion rtdetr_paddle/configs/rtdetr/_base_/rtdetr_r50vd.yml
@@ -7,7 +7,7 @@ ema_decay_type: "exponential"
ema_filter_no_grad: True
hidden_dim: 256
use_focal_loss: True
-eval_size: [640, 640]
+eval_size: [640, 640] # h, w


DETR:
2 changes: 1 addition & 1 deletion rtdetr_paddle/configs/rtdetr/_base_/rtdetr_reader.yml
@@ -22,7 +22,7 @@ TrainReader:
EvalReader:
sample_transforms:
- Decode: {}
-  - Resize: {target_size: [640, 640], keep_ratio: False, interp: 2}
+  - Resize: {target_size: [640, 640], keep_ratio: False, interp: 2} # target_size: (h, w)
- NormalizeImage: {mean: [0., 0., 0.], std: [1., 1., 1.], norm_type: none}
- Permute: {}
batch_size: 4
13 changes: 8 additions & 5 deletions rtdetr_paddle/ppdet/engine/trainer.py
@@ -650,7 +650,8 @@ def setup_metrics_for_loader():
for step_id, data in enumerate(tqdm(loader)):
self.status['step_id'] = step_id
# forward
-outs = self.model(data)
+with paddle.no_grad():
+    outs = self.model(data)

outs['bbox'] = outs['bbox'].numpy() # only in test mode
shift_amount = data['st_pix']
@@ -798,10 +799,12 @@ def setup_metrics_for_loader():
for step_id, data in enumerate(tqdm(loader)):
self.status['step_id'] = step_id
# forward
-if hasattr(self.model, 'modelTeacher'):
-    outs = self.model.modelTeacher(data)
-else:
-    outs = self.model(data)
+with paddle.no_grad():
+    if hasattr(self.model, 'modelTeacher'):
+        outs = self.model.modelTeacher(data)
+    else:
+        outs = self.model(data)

for _m in metrics:
_m.update(data, outs)

8 changes: 2 additions & 6 deletions rtdetr_paddle/ppdet/utils/visualizer.py
@@ -127,12 +127,8 @@ def draw_bbox(image, im_id, catid2name, bboxes, threshold):
# draw label
text = "{} {:.2f}".format(catid2name[catid], score)
# tw, th = draw.textsize(text)

-if int(PIL.__version__.split('.')[0]) < 10:
-    tw, th = draw.textsize(text)
-else:
-    left, top, right, bottom = draw.textbbox((0, 0), text)
-    tw, th = right - left, bottom - top
+left, top, right, bottom = draw.textbbox((0, 0), text)
+tw, th = right - left, bottom - top

draw.rectangle(
[(xmin + 1, ymin - th), (xmin + tw + 1, ymin)], fill=color)
