VA-GCN: PyTorch Implementation
This repo is a PyTorch implementation of VA-GCN, built on the PointNet/PointNet++ PyTorch codebase.
Classification (ModelNet10/40)
Data Preparation
Download the aligned ModelNet dataset here and save it in data/modelnet40_normal_resampled/.
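For orientation, the sketch below shows one way to read a single resampled sample, assuming each .txt file stores one point per line as six comma-separated values (x, y, z, nx, ny, nz); the function name and the example path are placeholders, not part of this repo.

```python
import numpy as np

def load_modelnet_sample(txt_path, num_points=1024, use_normals=False):
    """Load one resampled ModelNet sample (assumed layout: x,y,z,nx,ny,nz per line)."""
    data = np.loadtxt(txt_path, delimiter=",").astype(np.float32)    # (N, 6)
    data = data[:num_points]                         # keep the first num_points points
    xyz = data[:, :3]
    xyz = xyz - xyz.mean(axis=0)                     # center the shape at the origin
    xyz = xyz / np.max(np.linalg.norm(xyz, axis=1))  # scale it into the unit sphere
    if use_normals:
        return np.concatenate([xyz, data[:, 3:6]], axis=1)  # (num_points, 6)
    return xyz                                               # (num_points, 3)

# Hypothetical usage:
# pts = load_modelnet_sample("data/modelnet40_normal_resampled/airplane/airplane_0001.txt",
#                            num_points=1024, use_normals=True)
```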
Run
You can run different modes with the following commands.
- If you want to use offline processing of data, add --process_data on the first run. You can also download the pre-processed data here and save it in data/modelnet40_normal_resampled/ (a rough sketch of this caching step follows the commands below).
- If you want to train on ModelNet10, use --num_category 10.
```shell
# ModelNet40
## Select different models in ./models
## e.g., VA-GCN_cls without normal features
python train_classification.py --model VA-GCN_cls --log_dir VA-GCN_cls
python test_classification.py --log_dir VA-GCN_cls
## e.g., VA-GCN_cls with normal features
python train_classification.py --model VA-GCN_cls --use_normals --log_dir VA-GCN_cls_normal
python test_classification.py --use_normals --log_dir VA-GCN_cls_normal
```
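The --process_data option is about caching: the dataset is parsed once and reloaded from disk afterwards. The snippet below is only a minimal sketch of that idea, not the repository's actual implementation; load_modelnet_sample is the placeholder helper from the data-preparation sketch above, and the cache path is hypothetical.

```python
import os
import pickle

def cache_split(sample_paths, cache_file, num_points=1024, use_normals=True):
    """Parse every sample once and pickle the result so later runs skip the text parsing."""
    if os.path.exists(cache_file):
        with open(cache_file, "rb") as f:
            return pickle.load(f)
    data = [load_modelnet_sample(p, num_points, use_normals) for p in sample_paths]
    with open(cache_file, "wb") as f:
        pickle.dump(data, f)
    return data
```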
Performance
| Model | Accuracy (%) |
|---|---|
| PointNet (Official) | 89.2 |
| PointNet2 (Official) | 91.9 |
| PointNet2_SSG (Pytorch without normal) | 92.2 |
| PointNet2_SSG (Pytorch with normal) | 92.4 |
| PointNet2_MSG (Pytorch with normal) | 92.8 |
| VA-GCN (Pytorch with normal) | 93.5 |
| VA-GCN (Pytorch with normal)+MSI | 94.3 |
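The Accuracy column is overall (instance) accuracy on the ModelNet40 test set; evaluation scripts in this family of repos usually also report a class-averaged accuracy. A minimal sketch of both metrics, with hypothetical prediction and label arrays:

```python
import numpy as np

def classification_metrics(preds, labels, num_classes=40):
    """Overall (instance) accuracy and mean per-class accuracy."""
    preds, labels = np.asarray(preds), np.asarray(labels)
    overall_acc = float((preds == labels).mean())
    per_class = [float((preds[labels == c] == c).mean())
                 for c in range(num_classes) if (labels == c).any()]
    return overall_acc, float(np.mean(per_class))
```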
Part Segmentation (ShapeNet)
Data Preparation
Download the aligned ShapeNet part segmentation dataset here and save it in data/shapenetcore_partanno_segmentation_benchmark_v0_normal/.
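As a rough sketch of the expected layout (an assumption, not something this README states), each sample in this benchmark is a text file with one point per line: x, y, z, nx, ny, nz and a per-point part label. The loader below assumes that seven-column, space-separated format:

```python
import numpy as np

def load_shapenet_part_sample(txt_path, use_normals=True):
    """Assumed layout: x y z nx ny nz part_label per line (space-separated)."""
    data = np.loadtxt(txt_path).astype(np.float32)      # (N, 7)
    labels = data[:, 6].astype(np.int64)                # per-point part labels
    points = data[:, :6] if use_normals else data[:, :3]
    return points, labels
```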
Run
```shell
## Check model in ./models
## e.g., VA-GCN_part_seg
python train_partseg.py --model VA-GCN_part_seg --normal --log_dir VA-GCN_part_seg
python test_partseg.py --normal --log_dir VA-GCN_part_seg
```
Performance
| Model | Instance avg IoU (%) | Class avg IoU (%) |
|---|---|---|
| PointNet (Official) | 83.7 | 80.4 |
| PointNet2 (Official) | 85.1 | 81.9 |
| PointNet2 (Official) | 85.5 | 82.6 |
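For reference, the two IoU columns are computed differently: instance average IoU averages each shape's mean part IoU over all test shapes, while class average IoU first averages shape IoUs within each object category and then averages over the categories. A minimal sketch, assuming per-shape IoUs and category names are already computed:

```python
import numpy as np
from collections import defaultdict

def part_seg_metrics(shape_ious, shape_categories):
    """shape_ious[i]: mean IoU over the parts of shape i; shape_categories[i]: its object category."""
    instance_avg = float(np.mean(shape_ious))
    per_category = defaultdict(list)
    for iou, category in zip(shape_ious, shape_categories):
        per_category[category].append(iou)
    class_avg = float(np.mean([np.mean(v) for v in per_category.values()]))
    return instance_avg, class_avg
```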
Semantic Segmentation (S3DIS)
Data Preparation
Download the 3D indoor parsing dataset (S3DIS) here and save it in data/s3dis/Stanford3dDataset_v1.2_Aligned_Version/.
```shell
cd data_utils
python collect_indoor3d_data.py
```
Processed data will be saved in data/s3dis/stanford_indoor3d/.
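The processed rooms are typically stored as one .npy array per room with XYZ coordinates, RGB color, and a semantic label per point; the sketch below assumes that seven-column layout, and the example file name is a placeholder.

```python
import numpy as np

def load_s3dis_room(npy_path):
    """Assumed layout: one row per point -> x, y, z, r, g, b, semantic_label."""
    room = np.load(npy_path)                         # (N, 7)
    xyz = room[:, :3].astype(np.float32)
    rgb = room[:, 3:6].astype(np.float32) / 255.0    # scale colors from 0-255 to [0, 1]
    labels = room[:, 6].astype(np.int64)
    return xyz, rgb, labels

# Hypothetical usage:
# xyz, rgb, labels = load_s3dis_room("data/s3dis/stanford_indoor3d/Area_1_office_1.npy")
```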
Run
```shell
## Check model in ./models
## e.g., VA-GCN_sem_seg
python train_semseg.py --model VA-GCN_sem_seg --test_area 5 --log_dir VA-GCN_sem_seg
python test_semseg.py --log_dir VA-GCN_sem_seg --test_area 5 --visual
```
Visualization results will be saved in log/sem_seg/VA-GCN_sem_seg/visual/, and you can view these .obj files with MeshLab.
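If you want to export additional point clouds in the same viewable form, per-vertex colors can be written as plain vertex lines, which MeshLab renders as a colored point cloud. A minimal sketch (the function and the output path are placeholders):

```python
import numpy as np

def save_colored_obj(path, xyz, rgb):
    """Write points as 'v x y z r g b' lines (colors in [0, 1]) for viewing in MeshLab."""
    with open(path, "w") as f:
        for (x, y, z), (r, g, b) in zip(xyz, rgb):
            f.write(f"v {x:.6f} {y:.6f} {z:.6f} {r:.3f} {g:.3f} {b:.3f}\n")

# Hypothetical usage with random points:
# save_colored_obj("log/sem_seg/VA-GCN_sem_seg/visual/room_pred.obj",
#                  np.random.rand(100, 3), np.random.rand(100, 3))
```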
Performance
| Model | Class avg IoU (%) |
|---|---|
| PointNet (Pytorch) | 43.7 |
| PointNet2_ssg (Pytorch) | 53.5 |
| VA-GCN (Pytorch) | 56.9 |