```
├── LICENSE
├── README.md
├── config/
│   ├── __init__.py
│   ├── default.py
│   ├── test.yaml
│   └── train.yaml
├── datasets/
│   ├── __init__.py
│   ├── sampler.py
│   ├── scannet.py
│   ├── scannet/
│   │   ├── NYU40.mat
│   │   ├── SensorData.py
│   │   ├── batch_load_scannet_data.py
│   │   ├── download_scannet.py
│   │   ├── label_interpolate.py
│   │   ├── load_scannet_data.py
│   │   ├── meta_data/
│   │   │   ├── scannet_means.npz
│   │   │   ├── scannetv2-labels.combined.tsv
│   │   │   ├── scannetv2_test.txt
│   │   │   ├── scannetv2_train.txt
│   │   │   └── scannetv2_val.txt
│   │   ├── reader.py
│   │   ├── scannet_utils.py
│   │   ├── scannetv2_test.txt
│   │   ├── scannetv2_train.txt
│   │   └── scannetv2_val.txt
│   ├── transforms.py
│   └── visualization.py
├── demo/
│   ├── Overview.png
│   └── demo.gif
├── main.py
├── models/
│   ├── __init__.py
│   ├── backbone.py
│   ├── criterion.py
│   └── gru_fusion.py
```
## /LICENSE
``` path="/LICENSE"
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
## /README.md
## EPRecon: An Efficient Framework for Real-Time Panoptic 3D Reconstruction from Monocular Video
## Installation
```
conda create -n EPRecon python=3.9
conda activate EPRecon
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.7 -c pytorch -c nvidia
sudo apt-get install libsparsehash-dev
git clone -b v2.0.0 https://github.com/mit-han-lab/torchsparse.git
cd torchsparse
pip install tqdm
pip install .
git clone https://github.com/zhen6618/EPRecon.git
cd EPRecon
pip install -r requirements.txt
pip install sparsehash
pip install -U openmim
mim install mmcv-full
```
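A quick, optional sanity check after installation (assuming the environment above built successfully) is to confirm that the CUDA build of PyTorch, torchsparse, and mmcv all import cleanly:

```py
# Optional post-install check; run inside the EPRecon conda environment.
import torch
import torchsparse  # sparse 3D convolution backend built from source above
import mmcv         # installed via `mim install mmcv-full`

print('torch', torch.__version__, 'cuda available:', torch.cuda.is_available())
print('torchsparse', torchsparse.__version__)
print('mmcv', mmcv.__version__)
```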
## Dataset
1. Download and extract ScanNet by following the instructions provided at http://www.scan-net.org/.
```
python datasets/scannet/download_scannet.py
```
2. Generate depth, color, pose, and intrinsics files from each .sens file (change the file paths for your setup):
```
python datasets/scannet/reader.py
```
The expected ScanNet directory structure is described in [NeuralRecon](https://github.com/zju3dv/NeuralRecon).
3. Extract instance-level semantic labels (change your file path).
```
python datasets/scannet/batch_load_scannet_data.py
```
4. Label generation for panoptic reconstruction (change your file path):
```
# training/val split
python tools/tsdf_fusion/generate_gt.py --data_path datasets/scannet/ --save_name all_tsdf_9 --window_size 9
# test split
python tools/tsdf_fusion/generate_gt.py --test --data_path datasets/scannet/ --save_name all_tsdf_9 --window_size 9
```
5. Panoptic label interpolation (change your file path):
```
python datasets/scannet/label_interpolate.py
```
## Training
```
python main.py --cfg ./config/train.yaml
```
## Testing
```
python main.py --cfg ./config/test.yaml
```
## Generate Results for Evaluation
```
python tools/generate_semantic_instance.py
```
## Citation
```
@InProceedings{zhou2024epreconefficientframeworkrealtime,
title={EPRecon: An Efficient Framework for Real-Time Panoptic 3D Reconstruction from Monocular Video},
author={Zhen Zhou and Yunkai Ma and Junfeng Fan and Shaolin Zhang and Fengshui Jing and Min Tan},
year={2024},
booktitle={arXiv preprint arXiv:2409.01807},
}
```
## /config/__init__.py
```py path="/config/__init__.py"
from .default import _C as cfg
from .default import update_config
from .default import check_config
```
## /config/default.py
```py path="/config/default.py"
from yacs.config import CfgNode as CN
_C = CN()
_C.MODE = 'train'
_C.DATASET = 'scannet'
_C.BATCH_SIZE = 1
_C.LOADCKPT = ''
_C.LOGDIR = './checkpoints/debug'
_C.RESUME = True
_C.SUMMARY_FREQ = 20
_C.SAVE_FREQ = 1
_C.SEED = 1
_C.SAVE_SCENE_MESH = False
_C.SAVE_INCREMENTAL = False
_C.VIS_INCREMENTAL = False
_C.REDUCE_GPU_MEM = False
_C.LOCAL_RANK = 0
_C.DISTRIBUTED = False
# train
_C.TRAIN = CN()
_C.TRAIN.PATH = ''
_C.TRAIN.EPOCHS = 40
_C.TRAIN.LR = 0.001
_C.TRAIN.LREPOCHS = '12,24,36:2'
_C.TRAIN.WD = 0.0
_C.TRAIN.N_VIEWS = 5
_C.TRAIN.N_WORKERS = 8
_C.TRAIN.RANDOM_ROTATION_3D = True
_C.TRAIN.RANDOM_TRANSLATION_3D = True
_C.TRAIN.PAD_XY_3D = .1
_C.TRAIN.PAD_Z_3D = .025
_C.TRAIN.ACCUMULATION_STEPS = 4
_C.TRAIN.ONLY_INIT = False
_C.TRAIN.FUSE_TEMPORAL = True
_C.TRAIN.ONLY_OCC = False
_C.TRAIN.AUTOGRAD = False
# test
_C.TEST = CN()
_C.TEST.PATH = ''
_C.TEST.N_VIEWS = 5
_C.TEST.N_WORKERS = 4
# model
_C.MODEL = CN()
_C.MODEL.N_VOX = [128, 224, 192]
_C.MODEL.VOXEL_SIZE = 0.04
_C.MODEL.THRESHOLDS = [0, 0, 0]
_C.MODEL.N_LAYER = 3
_C.MODEL.TRAIN_NUM_SAMPLE = [4096, 16384, 65536]
_C.MODEL.TEST_NUM_SAMPLE = [32768, 131072]
_C.MODEL.LW = [1.0, 0.8, 0.64]
# TODO: images are currently loaded RGB, but the pretrained models expect BGR
_C.MODEL.PIXEL_MEAN = [103.53, 116.28, 123.675]
_C.MODEL.PIXEL_STD = [1., 1., 1.]
_C.MODEL.THRESHOLDS = [0, 0, 0]
_C.MODEL.POS_WEIGHT = 1.0
_C.MODEL.BACKBONE2D = CN()
_C.MODEL.BACKBONE2D.ARC = 'fpn-mnas'
_C.MODEL.SPARSEREG = CN()
_C.MODEL.SPARSEREG.DROPOUT = False
_C.MODEL.FUSION = CN()
_C.MODEL.FUSION.FUSION_ON = False
_C.MODEL.FUSION.HIDDEN_DIM = 64
_C.MODEL.FUSION.AVERAGE = False
_C.MODEL.FUSION.FULL = False
def update_config(cfg, args):
cfg.defrost()
cfg.merge_from_file(args.cfg)
cfg.merge_from_list(args.opts)
cfg.freeze()
def check_config(cfg):
pass
```
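`update_config` expects an argparse namespace carrying a `cfg` path and a list of `opts` overrides, matching the `--cfg` flag used in the README commands. The snippet below is a minimal sketch of that wiring (not the repository's actual `main.py`); extra `KEY VALUE` pairs after the yaml path are applied by `merge_from_list`, so overrides such as `BATCH_SIZE 2` need no edits to the yaml.

```py
import argparse
from config import cfg, update_config, check_config

parser = argparse.ArgumentParser()
parser.add_argument('--cfg', required=True, help='path to a yaml config, e.g. ./config/train.yaml')
parser.add_argument('opts', nargs=argparse.REMAINDER,
                    help='optional KEY VALUE pairs appended after the yaml path')
args = parser.parse_args()

update_config(cfg, args)  # merge_from_file(args.cfg), then merge_from_list(args.opts)
check_config(cfg)
print(cfg.MODE, cfg.TRAIN.EPOCHS, cfg.MODEL.N_VOX)
```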
## /config/test.yaml
```yaml path="/config/test.yaml"
DATASET: 'scannet'
BATCH_SIZE: 1
SAVE_SCENE_MESH: True
LOGDIR: './checkpoints'
SAVE_FREQ: 1
MODE: 'test'
RESUME: True
LOADCKPT: './weights/model_000007-init.ckpt'
VIS_INCREMENTAL: False
SAVE_INCREMENTAL: True
TRAIN:
PATH: 'datasets/scannet'
EPOCHS: 991
N_VIEWS: 9
LR: 1e-3
N_WORKERS: 8
LREPOCHS: '12,24,48:2'
TEST:
PATH: 'datasets/scannet'
N_VIEWS: 9
N_WORKERS: 4
MODEL:
N_LAYER: 3
N_VOX: [96, 96, 96]
VOXEL_SIZE: 0.04
TRAIN_NUM_SAMPLE: [15000, 60000, 120000]
TEST_NUM_SAMPLE: [15000, 60000, 120000]
BACKBONE2D:
ARC: 'fpn-mnas-1'
FUSION:
FUSION_ON: True
HIDDEN_DIM: 64
AVERAGE: False
FULL: True
LW: [1.0, 0.8, 0.64, 0.8]
THRESHOLDS: [0, 0, 0]
POS_WEIGHT: 1.5
```
## /config/train.yaml
```yaml path="/config/train.yaml"
DATASET: 'scannet'
BATCH_SIZE: 4
SAVE_SCENE_MESH: False
LOGDIR: './checkpoints'
SAVE_FREQ: 1
MODE: 'train'
RESUME: True
LOADCKPT: './checkpoints/Voxel_Img_Feats_24_07_25/model_000099.ckpt'
TRAIN:
PATH: 'datasets/scannet'
EPOCHS: 100 # 30, 55
N_VIEWS: 9
LR: 1e-4
N_WORKERS: 8
LREPOCHS: '70,90:10' # '6,12,24:2', '20,25:10', '35,45:10'
ACCUMULATION_STEPS: 8
ONLY_INIT: False
ONLY_OCC: False
AUTOGRAD: False
TEST:
PATH: 'datasets/scannet'
N_VIEWS: 9
N_WORKERS: 4
MODEL:
N_LAYER: 3
N_VOX: [96, 96, 96]
VOXEL_SIZE: 0.04
TRAIN_NUM_SAMPLE: [15000, 60000, 120000]
TEST_NUM_SAMPLE: [15000, 60000, 120000]
BACKBONE2D:
ARC: 'fpn-mnas-1'
FUSION:
FUSION_ON: True
HIDDEN_DIM: 64
AVERAGE: False
FULL: True
LW: [1.0, 0.8, 0.64, 1.2]
THRESHOLDS: [0, 0, 0]
POS_WEIGHT: 1.5
```
## /datasets/__init__.py
```py path="/datasets/__init__.py"
import importlib
# find the dataset definition by name, for example ScanNetDataset (scannet.py)
def find_dataset_def(dataset_name):
module_name = 'datasets.{}'.format(dataset_name)
module = importlib.import_module(module_name)
if dataset_name == 'scannet':
return getattr(module, "ScanNetDataset")
```
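As a usage note, the factory simply resolves the config's dataset name to the class defined in `datasets/scannet.py`:

```py
# Run from the repository root so the `datasets` package is importable.
from datasets import find_dataset_def

ScanNetDataset = find_dataset_def('scannet')  # -> datasets.scannet.ScanNetDataset
print(ScanNetDataset.__name__)
```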
## /datasets/sampler.py
```py path="/datasets/sampler.py"
import math
import torch
import torch.distributed as dist
from torch.utils.data.distributed import Sampler
class DistributedSampler(Sampler):
"""Sampler that restricts data loading to a subset of the dataset.
It is especially useful in conjunction with
:class:`torch.nn.parallel.DistributedDataParallel`. In such case, each
process can pass a DistributedSampler instance as a DataLoader sampler,
and load a subset of the original dataset that is exclusive to it.
.. note::
Dataset is assumed to be of constant size.
Arguments:
dataset: Dataset used for sampling.
num_replicas (optional): Number of processes participating in
distributed training.
rank (optional): Rank of the current process within num_replicas.
shuffle (optional): If true (default), sampler will shuffle the indices
.. warning::
In distributed mode, calling the ``set_epoch`` method is needed to
make shuffling work; each process will use the same random seed
otherwise.
    Example::
        >>> sampler = DistributedSampler(dataset) if is_distributed else None
        >>> loader = DataLoader(dataset, shuffle=(sampler is None),
        ...                     sampler=sampler)
        >>> for epoch in range(start_epoch, n_epochs):
        ...     if is_distributed:
        ...         sampler.set_epoch(epoch)
        ...     train(loader)
    """
def __init__(self, dataset, num_replicas=None, rank=None, shuffle=True):
if num_replicas is None:
if not dist.is_available():
raise RuntimeError("Requires distributed package to be available")
num_replicas = dist.get_world_size()
if rank is None:
if not dist.is_available():
raise RuntimeError("Requires distributed package to be available")
rank = dist.get_rank()
self.dataset = dataset
self.num_replicas = num_replicas
self.rank = rank
self.epoch = 0
self.num_samples = int(math.ceil(len(self.dataset) * 1.0 / self.num_replicas))
self.total_size = self.num_samples * self.num_replicas
self.shuffle = shuffle
def __iter__(self):
# deterministically shuffle based on epoch
g = torch.Generator()
g.manual_seed(self.epoch)
if self.shuffle:
indices = torch.randperm(len(self.dataset), generator=g).tolist()
else:
indices = list(range(len(self.dataset)))
# add extra samples to make it evenly divisible
indices += indices[:(self.total_size - len(indices))]
assert len(indices) == self.total_size
# subsample
# indices = indices[self.rank:self.total_size:self.num_replicas]
subsample_num = self.total_size // self.num_replicas
subsample_begin = subsample_num * self.rank
indices = indices[subsample_begin:subsample_begin + subsample_num]
assert len(indices) == self.num_samples
return iter(indices)
def __len__(self):
return self.num_samples
def set_epoch(self, epoch):
self.epoch = epoch
```
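The docstring's pattern in full: wrap the dataset only when running distributed, and call `set_epoch` before each epoch so every rank draws the same per-epoch permutation. A minimal sketch, with a placeholder list standing in for the real dataset:

```py
import torch.distributed as dist
from torch.utils.data import DataLoader
from datasets.sampler import DistributedSampler

dataset = list(range(100))  # placeholder; any map-style dataset works
is_distributed = dist.is_available() and dist.is_initialized()

sampler = DistributedSampler(dataset, shuffle=True) if is_distributed else None
loader = DataLoader(dataset, batch_size=4, shuffle=(sampler is None), sampler=sampler)

for epoch in range(2):
    if is_distributed:
        sampler.set_epoch(epoch)  # reseeds the per-epoch shuffle
    for batch in loader:
        pass  # training step goes here
```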
## /datasets/scannet.py
```py path="/datasets/scannet.py"
import os
import numpy as np
import pickle
import cv2
from PIL import Image
from torch.utils.data import Dataset
class ScanNetDataset(Dataset):
def __init__(self, datapath, mode, transforms, nviews, n_scales):
super(ScanNetDataset, self).__init__()
self.datapath = datapath
self.absolute_source_path = '/home/zhouzhen/Project/ScanNetV2_reader'
self.mode = mode
self.n_views = nviews
self.transforms = transforms
self.tsdf_file = 'all_tsdf_{}_1'.format(self.n_views)
assert self.mode in ["train", "val", "test"]
self.metas = self.build_list()
if mode == 'test':
self.source_path = 'scans_test'
else:
self.source_path = 'scans'
self.n_scales = n_scales
self.epoch = None
self.tsdf_cashe = {}
self.rgb_cashe = {}
self.semantic_cashe = {}
self.instance_cashe = {}
self.max_cashe = 50 # 100 TODO: CPU memory increases sharply!
def build_list(self):
with open(os.path.join(self.datapath, self.tsdf_file, 'fragments_{}.pkl'.format(self.mode)), 'rb') as f:
metas = pickle.load(f)
return metas
def __len__(self):
return len(self.metas)
def read_cam_file(self, filepath, vid):
intrinsics = np.loadtxt(os.path.join(filepath, 'intrinsic', 'intrinsic_color.txt'), delimiter=' ')[:3, :3]
intrinsics = intrinsics.astype(np.float32)
pose_path = "pose_" + "{:d}".format(vid)
extrinsics = np.loadtxt(os.path.join(filepath, 'pose', pose_path + '.txt'))
return intrinsics, extrinsics
def read_img(self, filepath):
img = Image.open(filepath)
return img
def read_depth(self, filepath):
# Read depth image and camera pose
depth_im = cv2.imread(filepath, -1).astype(
np.float32)
depth_im /= 1000. # depth is saved in 16-bit PNG in millimeters
depth_im[depth_im > 3.0] = 0
return depth_im
def read_scene_volumes_panoptic(self, data_path, scene):
if scene not in self.tsdf_cashe.keys():
if len(self.tsdf_cashe) > self.max_cashe:
self.tsdf_cashe = {}
self.rgb_cashe = {}
self.semantic_cashe = {}
self.instance_cashe = {}
full_tsdf_list = []
full_rgb_list = []
full_semantic_list = []
full_insatnce_list = []
for l in range(self.n_scales + 1):
# load full tsdf volume
full_tsdf = np.load(os.path.join(data_path, scene, 'full_tsdf_layer{}.npz'.format(l)), allow_pickle=True)
full_tsdf_list.append(full_tsdf.f.arr_0)
full_rgb = np.load(os.path.join(data_path, scene, 'full_rgb_layer{}.npz'.format(l)), allow_pickle=True)
full_rgb_list.append(full_rgb.f.arr_0)
full_semantic = np.load(os.path.join(data_path, scene, 'full_semantic_layer_interpolate{}.npz'.format(l)), allow_pickle=True)
full_semantic_list.append(full_semantic.f.arr_0)
full_instance = np.load(os.path.join(data_path, scene, 'full_instance_layer_interpolate{}.npz'.format(l)), allow_pickle=True)
full_insatnce_list.append(full_instance.f.arr_0)
self.tsdf_cashe[scene] = full_tsdf_list
self.rgb_cashe[scene] = full_rgb_list
self.semantic_cashe[scene] = full_semantic_list
self.instance_cashe[scene] = full_insatnce_list
return self.tsdf_cashe[scene], self.rgb_cashe[scene], self.semantic_cashe[scene], self.instance_cashe[scene]
def read_scene_volumes(self, data_path, scene):
if scene not in self.tsdf_cashe.keys():
if len(self.tsdf_cashe) > self.max_cashe:
self.tsdf_cashe = {}
full_tsdf_list = []
for l in range(self.n_scales + 1):
# load full tsdf volume
full_tsdf = np.load(os.path.join(data_path, scene, 'full_tsdf_layer{}.npz'.format(l)), allow_pickle=True)
full_tsdf_list.append(full_tsdf.f.arr_0)
self.tsdf_cashe[scene] = full_tsdf_list
return self.tsdf_cashe[scene]
def __getitem__(self, idx):
meta = self.metas[idx]
imgs = []
depth = []
extrinsics_list = []
intrinsics_list = []
if self.mode == 'train':
tsdf_list, rgb_list, semantic_list, instance_list = self.read_scene_volumes_panoptic(os.path.join(self.datapath, self.tsdf_file), meta['scene'])
else:
tsdf_list = self.read_scene_volumes(os.path.join(self.datapath, self.tsdf_file), meta['scene'])
for i, vid in enumerate(meta['image_ids']):
# load images
color_path = "color_" + "{:d}".format(vid)
imgs.append(self.read_img(os.path.join(self.absolute_source_path, self.source_path, meta['scene'], 'color', color_path + '.jpg')))
depth_path = "depth_" + "{:d}".format(vid)
depth.append(self.read_depth(os.path.join(self.absolute_source_path, self.source_path, meta['scene'], 'depth', depth_path + '.png')))
# load intrinsics and extrinsics
intrinsics, extrinsics = self.read_cam_file(os.path.join(self.absolute_source_path, self.source_path, meta['scene']), vid)
intrinsics_list.append(intrinsics)
extrinsics_list.append(extrinsics)
intrinsics = np.stack(intrinsics_list)
extrinsics = np.stack(extrinsics_list)
if self.mode == 'train':
items = {
'imgs': imgs,
'depth': depth,
'intrinsics': intrinsics,
'extrinsics': extrinsics,
'tsdf_list_full': tsdf_list,
'rgb_list_full': rgb_list,
'semantic_list_full': semantic_list,
'instance_list_full': instance_list,
'vol_origin': meta['vol_origin'],
'scene': meta['scene'],
'fragment': meta['scene'] + '_' + str(meta['fragment_id']),
'epoch': [self.epoch],
}
else:
items = {
'imgs': imgs,
'depth': depth,
'intrinsics': intrinsics,
'extrinsics': extrinsics,
'tsdf_list_full': tsdf_list,
'vol_origin': meta['vol_origin'],
'scene': meta['scene'],
'fragment': meta['scene'] + '_' + str(meta['fragment_id']),
'epoch': [self.epoch],
}
if self.transforms is not None:
items = self.transforms(items)
return items
```
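Note that `absolute_source_path` is hard-coded above and must be changed to wherever `reader.py` extracted the color/depth/pose files. A minimal construction sketch, assuming the TSDF ground truth from step 4 of the README is in place (with these settings the class looks for `all_tsdf_9_1/fragments_train.pkl`; the transforms pipeline from `datasets/transforms.py` is omitted here):

```py
from torch.utils.data import DataLoader
from datasets import find_dataset_def

ScanNetDataset = find_dataset_def('scannet')

# label_interpolate.py writes layers 0-2 per scene, so two extra coarse scales are loaded.
train_set = ScanNetDataset(
    datapath='datasets/scannet',  # must contain the fragments pickle and per-scene npz volumes
    mode='train',
    transforms=None,              # replace with the composed transforms from datasets/transforms.py
    nviews=9,
    n_scales=2,
)
loader = DataLoader(train_set, batch_size=1, num_workers=8, shuffle=True)
```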
## /datasets/scannet/NYU40.mat
Binary file available at https://raw.githubusercontent.com/zhen6618/EPRecon/refs/heads/main/datasets/scannet/NYU40.mat
## /datasets/scannet/SensorData.py
```py path="/datasets/scannet/SensorData.py"
import os, struct
import numpy as np
import zlib
import imageio
import cv2
import png
from PIL import Image
from contextlib import contextmanager
COMPRESSION_TYPE_COLOR = {-1:'unknown', 0:'raw', 1:'png', 2:'jpeg'}
COMPRESSION_TYPE_DEPTH = {-1:'unknown', 0:'raw_ushort', 1:'zlib_ushort', 2:'occi_ushort'}
@contextmanager
def print_array_on_one_line():
oldoptions = np.get_printoptions()
np.set_printoptions(linewidth=np.inf)
np.set_printoptions(linewidth=np.inf)
yield
np.set_printoptions(**oldoptions)
class RGBDFrame():
def load(self, file_handle):
self.camera_to_world = np.asarray(struct.unpack('f'*16, file_handle.read(16*4)), dtype=np.float32).reshape(4, 4)
self.timestamp_color = struct.unpack('Q', file_handle.read(8))[0]
self.timestamp_depth = struct.unpack('Q', file_handle.read(8))[0]
self.color_size_bytes = struct.unpack('Q', file_handle.read(8))[0]
self.depth_size_bytes = struct.unpack('Q', file_handle.read(8))[0]
self.color_data = b''.join(struct.unpack('c'*self.color_size_bytes, file_handle.read(self.color_size_bytes)))
self.depth_data = b''.join(struct.unpack('c'*self.depth_size_bytes, file_handle.read(self.depth_size_bytes)))
def decompress_depth(self, compression_type):
if compression_type == 'zlib_ushort':
return self.decompress_depth_zlib()
else:
raise
def decompress_depth_zlib(self):
return zlib.decompress(self.depth_data)
def decompress_color(self, compression_type):
if compression_type == 'jpeg':
return self.decompress_color_jpeg()
else:
raise
def dump_color_to_file(self, compression_type, filepath):
if compression_type == 'jpeg':
filepath += ".jpg"
else:
raise
f = open(filepath, "wb")
f.write(self.color_data)
f.close()
def decompress_color_jpeg(self):
return imageio.imread(self.color_data)
class SensorData:
def __init__(self, filename):
self.version = 4
self.load(filename)
def load(self, filename):
with open(filename, 'rb') as f:
version = struct.unpack('I', f.read(4))[0]
assert self.version == version
strlen = struct.unpack('Q', f.read(8))[0]
self.sensor_name = b''.join(struct.unpack('c'*strlen, f.read(strlen)))
self.intrinsic_color = np.asarray(struct.unpack('f'*16, f.read(16*4)), dtype=np.float32).reshape(4, 4)
self.extrinsic_color = np.asarray(struct.unpack('f'*16, f.read(16*4)), dtype=np.float32).reshape(4, 4)
self.intrinsic_depth = np.asarray(struct.unpack('f'*16, f.read(16*4)), dtype=np.float32).reshape(4, 4)
self.extrinsic_depth = np.asarray(struct.unpack('f'*16, f.read(16*4)), dtype=np.float32).reshape(4, 4)
self.color_compression_type = COMPRESSION_TYPE_COLOR[struct.unpack('i', f.read(4))[0]]
self.depth_compression_type = COMPRESSION_TYPE_DEPTH[struct.unpack('i', f.read(4))[0]]
self.color_width = struct.unpack('I', f.read(4))[0]
self.color_height = struct.unpack('I', f.read(4))[0]
self.depth_width = struct.unpack('I', f.read(4))[0]
self.depth_height = struct.unpack('I', f.read(4))[0]
self.depth_shift = struct.unpack('f', f.read(4))[0]
self.num_frames = struct.unpack('Q', f.read(8))[0]
self.frames = []
for i in range(self.num_frames):
frame = RGBDFrame()
frame.load(f)
self.frames.append(frame)
self.num_IMU_frames = struct.unpack('Q', f.read(8))[0]
def export_depth_images(self, output_path, image_size=None, frame_skip=1):
if not os.path.exists(output_path):
os.makedirs(output_path)
print('exporting', len(self.frames), frame_skip, ' depth frames to', output_path)
for f in range(0, len(self.frames), frame_skip):
depth_data = self.frames[f].decompress_depth(self.depth_compression_type)
depth = np.fromstring(depth_data, dtype=np.uint16).reshape(self.depth_height, self.depth_width)
if image_size is not None:
depth = cv2.resize(depth, (image_size[0], image_size[1]), interpolation=cv2.INTER_NEAREST)
filepath = os.path.join(output_path, f"frame-{f:06d}.depth.{int(image_size[0])}.png")
else:
filepath = os.path.join(output_path, f"depth_{f}.png")
with open(filepath, 'wb') as f: # write 16-bit
writer = png.Writer(width=depth.shape[1], height=depth.shape[0], bitdepth=16)
depth = depth.reshape(-1, depth.shape[1]).tolist()
writer.write(f, depth)
def export_color_images(self, output_path, image_size=None, frame_skip=1):
if not os.path.exists(output_path):
os.makedirs(output_path)
print('exporting', len(self.frames), frame_skip, 'color frames to', output_path)
for f in range(0, len(self.frames), frame_skip):
color = self.frames[f].decompress_color(self.color_compression_type)
if image_size is not None:
resized = Image.fromarray(color).resize((image_size[0], image_size[1]), resample=Image.BILINEAR)
filepath = os.path.join(output_path, f"frame-{f:06d}.color.{int(image_size[0])}.png")
resized.save(filepath)
else:
filepath = os.path.join(output_path, f"frame-{f:06d}.color")
self.frames[f].dump_color_to_file(self.color_compression_type, filepath)
def save_mat_to_file(self, matrix, filename):
with open(filename, 'w') as f:
for line in matrix:
np.savetxt(f, line[np.newaxis], fmt='%f')
def export_poses(self, output_path, frame_skip=1):
if not os.path.exists(output_path):
os.makedirs(output_path)
print('exporting', len(self.frames), frame_skip, 'camera poses to', output_path)
for f in range(0, len(self.frames), frame_skip):
self.save_mat_to_file(self.frames[f].camera_to_world, os.path.join(output_path, f"frame-{f:06d}.pose.txt"))
def export_intrinsics(self, output_path, scan_name):
default_intrinsics_path = os.path.join(output_path, 'intrinsic')
if not os.path.exists(default_intrinsics_path):
os.makedirs(default_intrinsics_path)
print('exporting camera intrinsics to', default_intrinsics_path)
self.save_mat_to_file(self.intrinsic_color, os.path.join(default_intrinsics_path, 'intrinsic_color.txt'))
self.save_mat_to_file(self.extrinsic_color, os.path.join(default_intrinsics_path, 'extrinsic_color.txt'))
self.save_mat_to_file(self.intrinsic_depth, os.path.join(default_intrinsics_path, 'intrinsic_depth.txt'))
self.save_mat_to_file(self.extrinsic_depth, os.path.join(default_intrinsics_path, 'extrinsic_depth.txt'))
```
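The README's step 2 runs `datasets/scannet/reader.py` (not shown in this section), which drives this class. A hedged sketch of the equivalent export calls for one downloaded scan follows; the actual `reader.py` may use different output paths and filenames.

```py
# Run from datasets/scannet/ so SensorData.py is importable.
from SensorData import SensorData

scene = 'scene0000_00'
sens_path = f'scans/{scene}/{scene}.sens'  # adjust to your download location
out_dir = f'scans/{scene}'

sd = SensorData(sens_path)
sd.export_depth_images(f'{out_dir}/depth')  # 16-bit PNGs in millimeters
sd.export_color_images(f'{out_dir}/color')
sd.export_poses(f'{out_dir}/pose')
sd.export_intrinsics(out_dir, scene)
```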
## /datasets/scannet/batch_load_scannet_data.py
```py path="/datasets/scannet/batch_load_scannet_data.py"
import argparse
import datetime
import os
from os import path as osp
import numpy as np
from load_scannet_data import export
# scannet provides 607 objects, which are summarized into 40 categories in nyuv2-40;
# many current instance segmentation tasks are processed according to the categories in Nyuv2,
# where [1, 2] is [wall, floor], and the remaining 18 categories are instance classes (sequence numbers are as follows)
# nyuv2-40:
# 1-wall 2-floor 3-cabinet 4-bed 5-chair
# 6-sofa 7-table 8-door 9-window 10-bookshelf
# 11-picture 12-counter 13-blinds 14-desk 15-shelves
# 16-curtain 17-dresser 18-pillow 19-mirror 20-floor mat
# 21-clothes 22-ceiling 23-books 24-refrigerator 25-television
# 26-paper 27-towel 28-shower curtain 29-box 30-whiteboard
# 31-person 32-night stand 33-toilet 34-sink 35-lamp
# 36-bathtub 37-bag 38-otherstructure 39-otherfurniture 40-otherprop
# nyuv2-20:
# 1-wall 2-floor 3-cabinet 4-bed 5-chair
# 6-sofa 7-table 8-door 9-window 10-bookshelf
# 11-picture 12-counter 14-desk 16-curtain 24-refridgerator
# 28-shower curtain 33-toilet 34-sink 36-bathtub 39-otherfurniture
# When reading data, if 0 appears, it means there is no category
OBJ_CLASS_IDS = np.array([3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 14, 16, 24, 28, 33, 34, 36, 39]) # for instance segmentation task
DONOTCARE_CLASS_IDS = np.array([])
def reassign_ids(A, B):
new_A = np.zeros_like(A)
# For labels of 0, 1, and 2, directly set the corresponding instance id to its label value
for label in [0, 1, 2]: # 0 (no category), 1 (wall background category), 2 (floor background category)
mask = B == label
new_A[mask] = label
mask = (B != 0) & (B != 1) & (B != 2)
unique_ids = np.unique(A[mask])
next_id = 3
for id in unique_ids:
id_mask = A == id
new_A[id_mask & mask] = next_id
next_id += 1
return new_A
def export_one_scan(scan_name,
output_filename_prefix,
max_num_point,
label_map_file,
scannet_dir,
test_mode=False):
mesh_file = osp.join(scannet_dir, scan_name, scan_name + '_vh_clean_2.ply')
agg_file = osp.join(scannet_dir, scan_name,
scan_name + '.aggregation.json')
seg_file = osp.join(scannet_dir, scan_name,
scan_name + '_vh_clean_2.0.010000.segs.json')
    # The meta file includes axisAlignment info for the train-set scans.
    meta_file = osp.join(scannet_dir, scan_name, f'{scan_name}.txt')
    # export() returns:
    #   mesh_vertices:     [N_voxels, 6] xyzrgb
    #   semantic_labels:   [N_voxels] class_id
    #   instance_labels:   [N_voxels] instance_id
    #   unaligned_bboxes:  [N_instance, 7] xyzwhl + class_id
    #   aligned_bboxes:    [N_instance, 7] xyzwhl + class_id
    #   instance2semantic: [N_instance] corresponding class_id for each instance_id
    #   axis_align_matrix: [4, 4] axis-align transformation matrix
mesh_vertices, semantic_labels, instance_labels, unaligned_bboxes, \
aligned_bboxes, instance2semantic, axis_align_matrix = export(
mesh_file, agg_file, seg_file, meta_file, label_map_file, None,
test_mode)
if not test_mode:
mask = np.logical_not(np.in1d(semantic_labels, DONOTCARE_CLASS_IDS))
mesh_vertices = mesh_vertices[mask, :]
semantic_labels = semantic_labels[mask]
instance_labels = instance_labels[mask]
num_instances = len(np.unique(instance_labels))
print(f'Num of instances: {num_instances}')
bbox_mask = np.in1d(unaligned_bboxes[:, -1], OBJ_CLASS_IDS)
unaligned_bboxes = unaligned_bboxes[bbox_mask, :]
bbox_mask = np.in1d(aligned_bboxes[:, -1], OBJ_CLASS_IDS)
aligned_bboxes = aligned_bboxes[bbox_mask, :]
assert unaligned_bboxes.shape[0] == aligned_bboxes.shape[0]
print(f'Num of care instances: {unaligned_bboxes.shape[0]}')
if max_num_point is not None:
max_num_point = int(max_num_point)
N = mesh_vertices.shape[0]
if N > max_num_point:
choices = np.random.choice(N, max_num_point, replace=False)
mesh_vertices = mesh_vertices[choices, :]
if not test_mode:
semantic_labels = semantic_labels[choices]
instance_labels = instance_labels[choices]
# 0 (no category), 1 (wall background category), 2 (floor background category) are fixed, the rest of the instances remain unchanged, and the ids are reordered
instance_labels = reassign_ids(instance_labels, semantic_labels)
np.save(f'{output_filename_prefix}_vert.npy', mesh_vertices)
if not test_mode:
np.save(f'{output_filename_prefix}_sem_label.npy', semantic_labels)
np.save(f'{output_filename_prefix}_ins_label.npy', instance_labels)
np.save(f'{output_filename_prefix}_unaligned_bbox.npy', unaligned_bboxes)
np.save(f'{output_filename_prefix}_aligned_bbox.npy', aligned_bboxes)
np.save(f'{output_filename_prefix}_axis_align_matrix.npy', axis_align_matrix)
def batch_export(max_num_point,
output_folder,
scan_names_file,
label_map_file,
scannet_dir,
test_mode=False):
if test_mode and not os.path.exists(scannet_dir):
# test data preparation is optional
return
if not os.path.exists(output_folder):
print(f'Creating new data folder: {output_folder}')
os.mkdir(output_folder)
scan_names = [line.rstrip() for line in open(scan_names_file)]
for scan_name in scan_names:
print('-' * 20 + 'begin')
print(datetime.datetime.now())
print(scan_name)
output_filename_prefix = osp.join(output_folder, scan_name)
if osp.isfile(f'{output_filename_prefix}_vert.npy'):
print('File already exists. skipping.')
print('-' * 20 + 'done')
continue
try:
export_one_scan(scan_name, output_filename_prefix, max_num_point, label_map_file, scannet_dir, test_mode)
except Exception:
print(f'Failed export scan: {scan_name}')
print('-' * 20 + 'done')
def main():
parser = argparse.ArgumentParser()
parser.add_argument(
'--max_num_point',
default=None,
help='The maximum number of the points.')
parser.add_argument(
'--output_folder',
default='panoptic_info_debug',
help='output folder of the result.')
parser.add_argument(
'--train_scannet_dir', default='scans', help='scannet data directory.')
parser.add_argument(
'--test_scannet_dir',
default='scans_test',
help='scannet data directory.')
parser.add_argument(
'--label_map_file',
default='meta_data/scannetv2-labels.combined.tsv',
help='The path of label map file.')
parser.add_argument(
'--train_scan_names_file',
default='meta_data/scannet_train.txt',
help='The path of the file that stores the scan names.')
parser.add_argument(
'--test_scan_names_file',
default='meta_data/scannetv2_test.txt',
help='The path of the file that stores the scan names.')
args = parser.parse_args()
batch_export(
args.max_num_point,
args.output_folder,
args.train_scan_names_file,
args.label_map_file,
args.train_scannet_dir,
test_mode=False)
batch_export(
args.max_num_point,
args.output_folder,
args.test_scan_names_file,
args.label_map_file,
args.test_scannet_dir,
test_mode=True)
if __name__ == '__main__':
main()
```
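`reassign_ids` keeps 0 (unannotated), 1 (wall), and 2 (floor) as fixed ids taken from the semantic labels and renumbers all remaining instances consecutively from 3. A small worked example (run from `datasets/scannet/` so the module imports):

```py
import numpy as np
from batch_load_scannet_data import reassign_ids

A = np.array([0, 7, 7, 12, 12, 3, 3])  # raw per-vertex instance ids
B = np.array([1, 5, 5,  6,  6, 2, 2])  # semantics: wall, chair, chair, sofa, sofa, floor, floor

print(reassign_ids(A, B))  # -> [1 3 3 4 4 2 2]: wall/floor copy their semantic id, instances restart at 3
```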
## /datasets/scannet/download_scannet.py
```py path="/datasets/scannet/download_scannet.py"
#!/usr/bin/env python
# Downloads ScanNet public data release
# Run with ./download-scannet.py (or python download-scannet.py on Windows)
# -*- coding: utf-8 -*-
import argparse
import os
import urllib.request
import tempfile
import subprocess
import time
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
BASE_URL = 'http://kaldir.vc.in.tum.de/scannet/'
TOS_URL = BASE_URL + 'ScanNet_TOS.pdf'
FILETYPES = ['.aggregation.json', '.sens', '.txt', '_vh_clean.ply', '_vh_clean_2.0.010000.segs.json', '_vh_clean_2.ply', '_vh_clean.segs.json', '_vh_clean.aggregation.json', '_vh_clean_2.labels.ply', '_2d-instance.zip', '_2d-instance-filt.zip', '_2d-label.zip', '_2d-label-filt.zip']
FILETYPES_TEST = ['.sens', '.txt', '_vh_clean.ply', '_vh_clean_2.ply']
PREPROCESSED_FRAMES_FILE = ['scannet_frames_25k.zip', '5.6GB']
TEST_FRAMES_FILE = ['scannet_frames_test.zip', '610MB']
LABEL_MAP_FILES = ['scannetv2-labels.combined.tsv', 'scannet-labels.combined.tsv']
DATA_EFFICIENT_FILES = ['limited-reconstruction-scenes.zip', 'limited-annotation-points.zip', 'limited-bboxes.zip', '1.7MB']
GRIT_FILES = ['ScanNet-GRIT.zip']
RELEASES = ['v2/scans', 'v1/scans']
RELEASES_TASKS = ['v2/tasks', 'v1/tasks']
RELEASES_NAMES = ['v2', 'v1']
RELEASE = RELEASES[0]
RELEASE_TASKS = RELEASES_TASKS[0]
RELEASE_NAME = RELEASES_NAMES[0]
LABEL_MAP_FILE = LABEL_MAP_FILES[0]
RELEASE_SIZE = '1.2TB'
V1_IDX = 1
def get_release_scans(release_file):
scan_lines = urllib.request.urlopen(release_file)
scans = []
for scan_line in scan_lines:
scan_id = scan_line.decode('utf8').rstrip('\n')
scans.append(scan_id)
return scans
def download_release(release_scans, out_dir, file_types, use_v1_sens):
if len(release_scans) == 0:
return
print('Downloading ScanNet ' + RELEASE_NAME + ' release to ' + out_dir + '...')
for scan_id in release_scans:
scan_out_dir = os.path.join(out_dir, scan_id)
download_scan(scan_id, scan_out_dir, file_types, use_v1_sens)
print('Downloaded ScanNet ' + RELEASE_NAME + ' release.')
def download_file(url, out_file):
out_dir = os.path.dirname(out_file)
if not os.path.isdir(out_dir):
os.makedirs(out_dir)
if not os.path.isfile(out_file):
print('\t' + url + ' > ' + out_file)
fh, out_file_tmp = tempfile.mkstemp(dir=out_dir)
f = os.fdopen(fh, 'w')
f.close()
# urllib.request.urlretrieve(url, out_file_tmp)
# os.rename(out_file_tmp, out_file)
# change for wget download
try:
            subprocess.run(['wget', url, '-O', out_file], check=True)  # -O (not -o) writes the download itself to out_file
except subprocess.CalledProcessError as e:
print("ERROR: wget failed")
time.sleep(10)
subprocess.run(['python', 'download_scannet.py', '-o', 'Scannet/'], check=True)
else:
print('WARNING: skipping download of existing file ' + out_file)
def download_scan(scan_id, out_dir, file_types, use_v1_sens):
print('Downloading ScanNet ' + RELEASE_NAME + ' scan ' + scan_id + ' ...')
if not os.path.isdir(out_dir):
os.makedirs(out_dir)
for ft in file_types:
v1_sens = use_v1_sens and ft == '.sens'
url = BASE_URL + RELEASE + '/' + scan_id + '/' + scan_id + ft if not v1_sens else BASE_URL + RELEASES[V1_IDX] + '/' + scan_id + '/' + scan_id + ft
out_file = out_dir + '/' + scan_id + ft
download_file(url, out_file)
print('Downloaded scan ' + scan_id)
def download_task_data(out_dir):
print('Downloading ScanNet v1 task data...')
files = [
LABEL_MAP_FILES[V1_IDX], 'obj_classification/data.zip',
'obj_classification/trained_models.zip', 'voxel_labeling/data.zip',
'voxel_labeling/trained_models.zip'
]
for file in files:
url = BASE_URL + RELEASES_TASKS[V1_IDX] + '/' + file
localpath = os.path.join(out_dir, file)
localdir = os.path.dirname(localpath)
if not os.path.isdir(localdir):
os.makedirs(localdir)
download_file(url, localpath)
print('Downloaded task data.')
def download_tfrecords(in_dir, out_dir):
print('Downloading tf records (302 GB)...')
if not os.path.exists(out_dir):
os.makedirs(out_dir)
split_to_num_shards = {'train': 100, 'val': 25, 'test': 10}
for folder_name in ['hires_tfrecords', 'lores_tfrecords']:
folder_dir = '%s/%s' % (in_dir, folder_name)
save_dir = '%s/%s' % (out_dir, folder_name)
if not os.path.exists(save_dir):
os.makedirs(save_dir)
for split, num_shards in split_to_num_shards.items():
for i in range(num_shards):
file_name = '%s-%05d-of-%05d.tfrecords' % (split, i, num_shards)
url = '%s/%s' % (folder_dir, file_name)
localpath = '%s/%s/%s' % (out_dir, folder_name, file_name)
download_file(url, localpath)
def download_label_map(out_dir):
print('Downloading ScanNet ' + RELEASE_NAME + ' label mapping file...')
files = [ LABEL_MAP_FILE ]
for file in files:
url = BASE_URL + RELEASE_TASKS + '/' + file
localpath = os.path.join(out_dir, file)
localdir = os.path.dirname(localpath)
if not os.path.isdir(localdir):
os.makedirs(localdir)
download_file(url, localpath)
print('Downloaded ScanNet ' + RELEASE_NAME + ' label mapping file.')
def main():
parser = argparse.ArgumentParser(description='Downloads ScanNet public data release.')
parser.add_argument('-o', '--out_dir', required=True, help='directory in which to download')
parser.add_argument('--task_data', action='store_true', help='download task data (v1)')
parser.add_argument('--label_map', action='store_true', help='download label map file')
parser.add_argument('--v1', action='store_true', help='download ScanNet v1 instead of v2')
parser.add_argument('--id', help='specific scan id to download')
parser.add_argument('--preprocessed_frames', action='store_true', help='download preprocessed subset of ScanNet frames (' + PREPROCESSED_FRAMES_FILE[1] + ')')
parser.add_argument('--test_frames_2d', action='store_true', help='download 2D test frames (' + TEST_FRAMES_FILE[1] + '; also included with whole dataset download)')
    parser.add_argument('--data_efficient', action='store_true', help='download data efficient task files (also included with whole dataset download)')
parser.add_argument('--tf_semantic', action='store_true', help='download google tensorflow records for 3D segmentation / detection')
parser.add_argument('--grit', action='store_true', help='download ScanNet files for General Robust Image Task')
parser.add_argument('--type', help='specific file type to download (.aggregation.json, .sens, .txt, _vh_clean.ply, _vh_clean_2.0.010000.segs.json, _vh_clean_2.ply, _vh_clean.segs.json, _vh_clean.aggregation.json, _vh_clean_2.labels.ply, _2d-instance.zip, _2d-instance-filt.zip, _2d-label.zip, _2d-label-filt.zip)')
args = parser.parse_args()
print('By pressing any key to continue you confirm that you have agreed to the ScanNet terms of use as described at:')
print(TOS_URL)
print('***')
print('Press any key to continue, or CTRL-C to exit.')
# key = input('')
if args.v1:
global RELEASE
global RELEASE_TASKS
global RELEASE_NAME
global LABEL_MAP_FILE
RELEASE = RELEASES[V1_IDX]
RELEASE_TASKS = RELEASES_TASKS[V1_IDX]
RELEASE_NAME = RELEASES_NAMES[V1_IDX]
LABEL_MAP_FILE = LABEL_MAP_FILES[V1_IDX]
assert((not args.tf_semantic) and (not args.grit)), "Task files specified invalid for v1"
release_file = BASE_URL + RELEASE + '.txt'
release_scans = get_release_scans(release_file)
file_types = FILETYPES;
release_test_file = BASE_URL + RELEASE + '_test.txt'
release_test_scans = get_release_scans(release_test_file)
file_types_test = FILETYPES_TEST;
out_dir_scans = os.path.join(args.out_dir, 'scans')
out_dir_test_scans = os.path.join(args.out_dir, 'scans_test')
out_dir_tasks = os.path.join(args.out_dir, 'tasks')
if args.type: # download file type
file_type = args.type
if file_type not in FILETYPES:
print('ERROR: Invalid file type: ' + file_type)
return
file_types = [file_type]
if file_type in FILETYPES_TEST:
file_types_test = [file_type]
else:
file_types_test = []
if args.task_data: # download task data
download_task_data(out_dir_tasks)
elif args.label_map: # download label map file
download_label_map(args.out_dir)
elif args.preprocessed_frames: # download preprocessed scannet_frames_25k.zip file
if args.v1:
print('ERROR: Preprocessed frames only available for ScanNet v2')
print('You are downloading the preprocessed subset of frames ' + PREPROCESSED_FRAMES_FILE[0] + ' which requires ' + PREPROCESSED_FRAMES_FILE[1] + ' of space.')
download_file(os.path.join(BASE_URL, RELEASE_TASKS, PREPROCESSED_FRAMES_FILE[0]), os.path.join(out_dir_tasks, PREPROCESSED_FRAMES_FILE[0]))
elif args.test_frames_2d: # download test scannet_frames_test.zip file
if args.v1:
print('ERROR: 2D test frames only available for ScanNet v2')
print('You are downloading the 2D test set ' + TEST_FRAMES_FILE[0] + ' which requires ' + TEST_FRAMES_FILE[1] + ' of space.')
download_file(os.path.join(BASE_URL, RELEASE_TASKS, TEST_FRAMES_FILE[0]), os.path.join(out_dir_tasks, TEST_FRAMES_FILE[0]))
elif args.data_efficient: # download data efficient task files
print('You are downloading the data efficient task files' + ' which requires ' + DATA_EFFICIENT_FILES[-1] + ' of space.')
for k in range(len(DATA_EFFICIENT_FILES)-1):
download_file(os.path.join(BASE_URL, RELEASE_TASKS, DATA_EFFICIENT_FILES[k]), os.path.join(out_dir_tasks, DATA_EFFICIENT_FILES[k]))
elif args.tf_semantic: # download google tf records
download_tfrecords(os.path.join(BASE_URL, RELEASE_TASKS, 'tf3d'), os.path.join(out_dir_tasks, 'tf3d'))
elif args.grit: # download GRIT file
download_file(os.path.join(BASE_URL, RELEASE_TASKS, GRIT_FILES[0]), os.path.join(out_dir_tasks, GRIT_FILES[0]))
elif args.id: # download single scan
scan_id = args.id
is_test_scan = scan_id in release_test_scans
if scan_id not in release_scans and (not is_test_scan or args.v1):
print('ERROR: Invalid scan id: ' + scan_id)
else:
out_dir = os.path.join(out_dir_scans, scan_id) if not is_test_scan else os.path.join(out_dir_test_scans, scan_id)
scan_file_types = file_types if not is_test_scan else file_types_test
use_v1_sens = not is_test_scan
if not is_test_scan and not args.v1 and '.sens' in scan_file_types:
print('Note: ScanNet v2 uses the same .sens files as ScanNet v1: Press \'n\' to exclude downloading .sens files for each scan')
# key = input('')
# if key.strip().lower() == 'n':
# scan_file_types.remove('.sens')
download_scan(scan_id, out_dir, scan_file_types, use_v1_sens)
else: # download entire release
if len(file_types) == len(FILETYPES):
print('WARNING: You are downloading the entire ScanNet ' + RELEASE_NAME + ' release which requires ' + RELEASE_SIZE + ' of space.')
else:
print('WARNING: You are downloading all ScanNet ' + RELEASE_NAME + ' scans of type ' + file_types[0])
print('Note that existing scan directories will be skipped. Delete partially downloaded directories to re-download.')
print('***')
print('Press any key to continue, or CTRL-C to exit.')
# key = input('')
if not args.v1 and '.sens' in file_types:
print('Note: ScanNet v2 uses the same .sens files as ScanNet v1: Press \'n\' to exclude downloading .sens files for each scan')
# key = input('')
# if key.strip().lower() == 'n':
# file_types.remove('.sens')
download_release(release_scans, out_dir_scans, file_types, use_v1_sens=True)
if not args.v1:
download_label_map(args.out_dir)
download_release(release_test_scans, out_dir_test_scans, file_types_test, use_v1_sens=False)
download_file(os.path.join(BASE_URL, RELEASE_TASKS, TEST_FRAMES_FILE[0]), os.path.join(out_dir_tasks, TEST_FRAMES_FILE[0]))
for k in range(len(DATA_EFFICIENT_FILES)-1):
download_file(os.path.join(BASE_URL, RELEASE_TASKS, DATA_EFFICIENT_FILES[k]), os.path.join(out_dir_tasks, DATA_EFFICIENT_FILES[k]))
if __name__ == "__main__": main()
```
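The script requires `-o/--out_dir`; typical invocations based on the options above (the full release is about 1.2TB, so `--id` and `--type` are the usual way to limit scope):

```
python download_scannet.py -o datasets/scannet --id scene0000_00 --type .sens
python download_scannet.py -o datasets/scannet --label_map
```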
## /datasets/scannet/label_interpolate.py
```py path="/datasets/scannet/label_interpolate.py"
import numpy as np
import os
from scipy.interpolate import NearestNDInterpolator
import time
def main():
root_path = 'datasets/scannet/all_tsdf_9'
folder_names = [folder for folder in os.listdir(root_path) if os.path.isdir(os.path.join(root_path, folder))]
count = 0
for folder_name in folder_names:
print(count / len(folder_names))
count += 1
folder_name = os.path.join(root_path, folder_name)
if os.path.exists(os.path.join(folder_name, 'full_instance_layer0.npz')):
for i in range(3):
instance_name = os.path.join(folder_name, 'full_instance_layer' + str(i) + '.npz')
semantic_name = os.path.join(folder_name, 'full_semantic_layer' + str(i) + '.npz')
instance = np.load(instance_name, allow_pickle=True)
instance = instance.f.arr_0
semantic = np.load(semantic_name, allow_pickle=True)
semantic = semantic.f.arr_0
'--- Nearest neighbor interpolation ---'
non_zero_indices = np.where(instance != 0)
non_zero_values = instance[non_zero_indices]
non_zero_coords = np.transpose(non_zero_indices)
interpolator = NearestNDInterpolator(non_zero_coords, non_zero_values)
grid_coords = np.indices(instance.shape).reshape(instance.ndim, -1).T
interpolated_values = interpolator(grid_coords).reshape(instance.shape)
new_instance_name = os.path.join(folder_name, 'full_instance_layer_interpolate' + str(i) + '.npz')
np.savez_compressed(new_instance_name, interpolated_values)
t1 = time.time()
non_zero_indices = np.where(semantic != 0)
non_zero_values = semantic[non_zero_indices]
non_zero_coords = np.transpose(non_zero_indices)
interpolator = NearestNDInterpolator(non_zero_coords, non_zero_values)
grid_coords = np.indices(semantic.shape).reshape(semantic.ndim, -1).T
interpolated_values = interpolator(grid_coords).reshape(semantic.shape)
print(time.time() - t1)
new_semantic_name = os.path.join(folder_name, 'full_semantic_layer_interpolate' + str(i) + '.npz')
np.savez_compressed(new_semantic_name, interpolated_values)
if __name__ == '__main__':
main()
```
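Each zero voxel is filled with the label of its nearest labeled voxel. A toy reproduction of the same `NearestNDInterpolator` pattern on a small 2D grid:

```py
import numpy as np
from scipy.interpolate import NearestNDInterpolator

labels = np.array([[0, 0, 5],
                   [0, 0, 0],
                   [2, 0, 0]])

nz = np.where(labels != 0)  # coordinates of labeled cells
interp = NearestNDInterpolator(np.transpose(nz), labels[nz])
grid = np.indices(labels.shape).reshape(labels.ndim, -1).T
filled = interp(grid).reshape(labels.shape)
print(filled)  # every cell now carries the label (2 or 5) of its nearest labeled cell
```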
## /datasets/scannet/load_scannet_data.py
```py path="/datasets/scannet/load_scannet_data.py"
import argparse
import inspect
import json
import os
import numpy as np
import scannet_utils
currentdir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe())))
def read_aggregation(filename):
assert os.path.isfile(filename)
object_id_to_segs = {}
label_to_segs = {}
with open(filename) as f:
data = json.load(f)
num_objects = len(data['segGroups'])
for i in range(num_objects):
object_id = data['segGroups'][i][
'objectId'] + 1 # instance ids should be 1-indexed
label = data['segGroups'][i]['label']
segs = data['segGroups'][i]['segments']
object_id_to_segs[object_id] = segs
if label in label_to_segs:
label_to_segs[label].extend(segs)
else:
label_to_segs[label] = segs
return object_id_to_segs, label_to_segs
def read_segmentation(filename):
assert os.path.isfile(filename)
seg_to_verts = {}
with open(filename) as f:
data = json.load(f)
num_verts = len(data['segIndices'])
for i in range(num_verts):
seg_id = data['segIndices'][i]
if seg_id in seg_to_verts:
seg_to_verts[seg_id].append(i)
else:
seg_to_verts[seg_id] = [i]
return seg_to_verts, num_verts
def extract_bbox(mesh_vertices, object_id_to_segs, object_id_to_label_id,
instance_ids):
num_instances = len(np.unique(list(object_id_to_segs.keys())))
instance_bboxes = np.zeros((num_instances, 7))
for obj_id in object_id_to_segs:
label_id = object_id_to_label_id[obj_id]
obj_pc = mesh_vertices[instance_ids == obj_id, 0:3]
if len(obj_pc) == 0:
continue
xyz_min = np.min(obj_pc, axis=0)
xyz_max = np.max(obj_pc, axis=0)
bbox = np.concatenate([(xyz_min + xyz_max) / 2.0, xyz_max - xyz_min,
np.array([label_id])])
# NOTE: this assumes obj_id is in 1,2,3,.,,,.NUM_INSTANCES
instance_bboxes[obj_id - 1, :] = bbox
return instance_bboxes
def export(mesh_file,
agg_file,
seg_file,
meta_file,
label_map_file,
output_file=None,
test_mode=False):
label_map = scannet_utils.read_label_mapping(
label_map_file, label_from='raw_category', label_to='nyu40id') # 1-40
mesh_vertices = scannet_utils.read_mesh_vertices_rgb(mesh_file)
# Load scene axis alignment matrix
lines = open(meta_file).readlines()
# test set data doesn't have align_matrix
axis_align_matrix = np.eye(4)
for line in lines:
if 'axisAlignment' in line:
axis_align_matrix = [
float(x)
for x in line.rstrip().strip('axisAlignment = ').split(' ')
]
break
axis_align_matrix = np.array(axis_align_matrix).reshape((4, 4))
# perform global alignment of mesh vertices
pts = np.ones((mesh_vertices.shape[0], 4))
pts[:, 0:3] = mesh_vertices[:, 0:3]
pts = np.dot(pts, axis_align_matrix.transpose()) # Nx4
aligned_mesh_vertices = np.concatenate([pts[:, 0:3], mesh_vertices[:, 3:]],
axis=1)
# Load semantic and instance labels
if not test_mode:
object_id_to_segs, label_to_segs = read_aggregation(agg_file)
seg_to_verts, num_verts = read_segmentation(seg_file)
label_ids = np.zeros(shape=(num_verts), dtype=np.uint32)
object_id_to_label_id = {}
for label, segs in label_to_segs.items():
label_id = label_map[label]
for seg in segs:
verts = seg_to_verts[seg]
label_ids[verts] = label_id
instance_ids = np.zeros(
shape=(num_verts), dtype=np.uint32) # 0: unannotated
for object_id, segs in object_id_to_segs.items():
for seg in segs:
verts = seg_to_verts[seg]
instance_ids[verts] = object_id
if object_id not in object_id_to_label_id:
object_id_to_label_id[object_id] = label_ids[verts][0]
unaligned_bboxes = extract_bbox(mesh_vertices, object_id_to_segs,
object_id_to_label_id, instance_ids)
aligned_bboxes = extract_bbox(aligned_mesh_vertices, object_id_to_segs,
object_id_to_label_id, instance_ids)
else:
label_ids = None
instance_ids = None
unaligned_bboxes = None
aligned_bboxes = None
object_id_to_label_id = None
if output_file is not None:
np.save(output_file + '_vert.npy', mesh_vertices)
if not test_mode:
np.save(output_file + '_sem_label.npy', label_ids)
np.save(output_file + '_ins_label.npy', instance_ids)
np.save(output_file + '_unaligned_bbox.npy', unaligned_bboxes)
np.save(output_file + '_aligned_bbox.npy', aligned_bboxes)
np.save(output_file + '_axis_align_matrix.npy', axis_align_matrix)
return mesh_vertices, label_ids, instance_ids, unaligned_bboxes, \
aligned_bboxes, object_id_to_label_id, axis_align_matrix
def main():
parser = argparse.ArgumentParser()
parser.add_argument(
'--scan_path',
required=True,
        help='path to scannet scene (e.g., data/ScanNet/v2/scene0000_00)')
parser.add_argument('--output_file', required=True, help='output file')
parser.add_argument(
'--label_map_file',
required=True,
help='path to scannetv2-labels.combined.tsv')
opt = parser.parse_args()
scan_name = os.path.split(opt.scan_path)[-1]
mesh_file = os.path.join(opt.scan_path, scan_name + '_vh_clean_2.ply')
agg_file = os.path.join(opt.scan_path, scan_name + '.aggregation.json')
seg_file = os.path.join(opt.scan_path,
scan_name + '_vh_clean_2.0.010000.segs.json')
meta_file = os.path.join(
opt.scan_path, scan_name +
'.txt') # includes axisAlignment info for the train set scans.
export(mesh_file, agg_file, seg_file, meta_file, opt.label_map_file,
opt.output_file)
if __name__ == '__main__':
main()
```
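Standalone usage for a single scene, matching the arguments above (paths are only examples):

```
python load_scannet_data.py \
    --scan_path scans/scene0000_00 \
    --output_file panoptic_info/scene0000_00 \
    --label_map_file meta_data/scannetv2-labels.combined.tsv
```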
## /datasets/scannet/meta_data/scannet_means.npz
Binary file available at https://raw.githubusercontent.com/zhen6618/EPRecon/refs/heads/main/datasets/scannet/meta_data/scannet_means.npz
## /datasets/scannet/meta_data/scannetv2-labels.combined.tsv
```tsv path="/datasets/scannet/meta_data/scannetv2-labels.combined.tsv"
id raw_category category count nyu40id eigen13id nyuClass nyu40class eigen13class ModelNet40 ModelNet10 ShapeNetCore55 synsetoffset wnsynsetid wnsynsetkey mpcat40 mpcat40index
1 wall wall 8277 1 12 wall wall Wall n04546855 wall.n.01 wall 1
2 chair chair 4646 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
22 books book 1678 23 2 book books Books n02870526 book.n.11 objects 39
3 floor floor 1553 2 5 floor floor Floor n03365592 floor.n.01 floor 2
5 door door 1483 8 12 door door Wall door n03221720 door.n.01 door 4
1163 object object 1313 40 7 otherprop Objects objects 39
16 window window 1209 9 13 window window Window n04587648 window.n.01 window 9
4 table table 1170 7 10 table table Table table table table 4379243 n04379243 table.n.02 table 5
56 trash can trash can 1090 39 6 garbage bin otherfurniture Furniture trash_bin 2747177 n02747177 ashcan.n.01 objects 39
13 pillow pillow 937 18 7 pillow pillow Objects pillow 3938244 n03938244 pillow.n.01 cushion 8
15 picture picture 862 11 8 picture picture Picture n03931044 picture.n.01 picture 6
41 ceiling ceiling 806 22 3 ceiling ceiling Ceiling n02990373 ceiling.n.01 ceiling 17
26 box box 775 29 7 box box Objects n02883344 box.n.01 objects 39
161 doorframe doorframe 768 8 12 door door Wall door doorframe.n.01 door 4
19 monitor monitor 765 40 7 monitor otherprop Objects monitor monitor tv or monitor 3211117 n03782190 monitor.n.04 objects 39
7 cabinet cabinet 731 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
9 desk desk 680 14 10 desk desk Table desk desk table 4379243 n03179701 desk.n.01 table 5
8 shelf shelf 641 15 6 shelves shelves Furniture bookshelf bookshelf 2871439 n02871439 bookshelf.n.01 shelving 31
10 office chair office chair 595 5 4 chair chair Chair chair chair chair 3001627 n04373704 swivel_chair.n.01 chair 3
31 towel towel 570 27 7 towel towel Objects n04459362 towel.n.01 towel 20
6 couch couch 502 6 9 sofa sofa Sofa sofa sofa sofa 4256520 n04256520 sofa.n.01 sofa 10
14 sink sink 488 34 7 sink sink Objects sink n04223580 sink.n.01 sink 15
48 backpack backpack 479 40 7 backpack otherprop Objects n02769748 backpack.n.01 objects 39
28 lamp lamp 419 35 7 lamp lamp Objects lamp lamp 3636649 n03636649 lamp.n.02 lighting 28
11 bed bed 370 4 1 bed bed Bed bed bed bed 2818832 n02818832 bed.n.01 bed 11
18 bookshelf bookshelf 360 10 6 bookshelf bookshelf Furniture bookshelf bookshelf 2871439 n02871439 bookshelf.n.01 shelving 31
71 mirror mirror 349 19 7 mirror mirror Objects n03773035 mirror.n.01 mirror 21
21 curtain curtain 347 16 13 curtain curtain Window curtain n03151077 curtain.n.01 curtain 12
40 plant plant 331 40 7 plant otherprop Objects plant n00017222 plant.n.02 plant 14
52 whiteboard whiteboard 327 30 7 whiteboard whiteboard Objects n03211616 display_panel.n.01 board_panel 35
96 radiator radiator 322 39 6 radiator otherfurniture Furniture n04041069 radiator.n.02 misc 40
22 book book 318 23 2 book books Books n02870526 book.n.11 objects 39
29 kitchen cabinet kitchen cabinet 310 3 6 cabinet cabinet Furniture n02933112 cabinet.n.01 cabinet 7
49 toilet paper toilet paper 291 40 7 toilet paper otherprop Objects n15075141 toilet_tissue.n.01 objects 39
29 kitchen cabinets kitchen cabinet 289 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
23 armchair armchair 281 5 4 chair chair Chair chair chair chair 3001627 n02738535 armchair.n.01 chair 3
63 shoes shoe 272 40 7 shoe otherprop Objects n04199027 shoe.n.01 clothes 38
24 coffee table coffee table 258 7 10 coffee table table Table table table table 4379243 n03063968 coffee_table.n.01 table 5
17 toilet toilet 256 33 7 toilet toilet Objects toilet toilet n04446276 toilet.n.01 toilet 18
47 bag bag 252 37 7 bag bag Objects suitcase 2773838 n02773838 bag.n.06 objects 39
32 clothes clothes 248 21 7 clothes clothes Objects n02728440 apparel.n.01 clothes 38
46 keyboard keyboard 246 40 7 keyboard otherprop Objects keyboard computer keyboard 3085013 n03085013 computer_keyboard.n.01 objects 39
65 bottle bottle 226 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
97 recycling bin recycling bin 225 39 6 garbage bin otherfurniture Furniture trash_bin 2747177 n02747177 ashcan.n.01 objects 39
34 nightstand nightstand 224 32 6 night stand night stand Furniture night_stand night_stand n03015254 chest_of_drawers.n.01 chest_of_drawers 13
38 stool stool 221 40 7 stool otherprop Objects stool n04326896 stool.n.01 stool 19
33 tv tv 219 25 11 television television TV tv or monitor 3211117 n03211117 display.n.06 tv_monitor 22
75 file cabinet file cabinet 217 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
36 dresser dresser 213 17 6 dresser dresser Furniture dresser dresser n03015254 chest_of_drawers.n.01 chest_of_drawers 13
64 computer tower computer tower 203 40 7 computer otherprop Objects n03082979 computer.n.01 objects 39
32 clothing clothes 165 21 7 clothes clothes Objects n02728440 apparel.n.01 clothes 38
101 telephone telephone 164 40 7 telephone otherprop Objects telephone 4401088 n04401088 telephone.n.01 objects 39
130 cup cup 157 40 7 cup otherprop Objects cup cup or mug 3797390 n03797390 mug.n.04 objects 39
27 refrigerator refrigerator 154 24 6 refridgerator refridgerator Furniture n04070727 refrigerator.n.01 appliances 37
44 end table end table 147 7 10 table table Table table table table 4379243 n04379243 table.n.02 table 5
131 jacket jacket 146 40 7 jacket otherprop Objects n03589791 jacket.n.01 clothes 38
55 shower curtain shower curtain 144 28 7 shower curtain shower curtain Objects curtain n04209239 shower_curtain.n.01 curtain 12
42 bathtub bathtub 144 36 7 bathtub bathtub Objects bathtub bathtub tub 2808440 n02808440 bathtub.n.01 bathtub 25
59 microwave microwave 141 40 7 microwave otherprop Objects microwave 3761084 n03761084 microwave.n.02 appliances 37
159 kitchen counter kitchen counter 140 12 6 counter counter Furniture table table table 4379243 n03116530 counter.n.01 counter 26
74 sofa chair sofa chair 129 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
82 paper towel dispenser paper towel dispenser 129 40 7 paper towel dispenser otherprop Objects objects 39
1164 bathroom vanity bathroom vanity 126 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 table 5
93 suitcase suitcase 118 40 7 luggage otherprop Objects n02773838 bag.n.06 objects 39
77 laptop laptop 111 40 7 laptop otherprop Objects laptop laptop 3642806 n03642806 laptop.n.01 objects 39
67 ottoman ottoman 111 39 6 ottoman otherfurniture Furniture stool n03380724 footstool.n.01 stool 19
128 shower walls shower wall 109 1 12 wall wall Wall n04546855 wall.n.01 wall 1
50 printer printer 106 40 7 printer otherprop Objects printer 4004475 n04004475 printer.n.03 appliances 37
35 counter counter 104 12 6 counter counter Furniture table table table 4379243 n03116530 counter.n.01 counter 26
69 board board 100 38 7 board otherstructure Objects board_panel 35
100 soap dispenser soap dispenser 99 40 7 otherprop Objects n04254120 soap_dispenser.n.01 objects 39
62 stove stove 95 38 7 stove otherstructure Objects stove 4330267 n04330267 stove.n.02 appliances 37
105 light light 93 38 7 light otherstructure Objects n03665366 light.n.02 lighting 28
1165 closet wall closet wall 90 1 12 wall wall Wall n04546855 wall.n.01 wall 1
165 mini fridge mini fridge 87 24 6 refridgerator refridgerator Furniture n03273913 electric_refrigerator.n.01 appliances 37
7 cabinets cabinet 79 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
5 doors door 76 8 12 door door Wall door n03221720 door.n.01 door 4
76 fan fan 75 40 7 fan otherprop Objects n03320046 fan.n.01 misc 40
230 tissue box tissue box 73 40 7 tissue box otherprop Objects n02883344 box.n.01 objects 39
54 blanket blanket 72 40 7 blanket otherprop Objects n02849154 blanket.n.01 objects 39
125 bathroom stall bathroom stall 71 38 7 otherstructure Objects n02873839 booth.n.02 misc 40
72 copier copier 70 40 7 otherprop Objects n03257586 duplicator.n.01 appliances 37
68 bench bench 66 39 6 bench otherfurniture Furniture bench bench 2828884 n02828884 bench.n.01 seating 34
145 bar bar 66 38 7 bar otherstructure Objects n02788689 bar.n.03 misc 40
157 soap dish soap dish 65 40 7 soap dish otherprop Objects n04254009 soap_dish.n.01 objects 39
1166 laundry hamper laundry hamper 65 40 7 laundry basket otherprop Objects objects 39
132 storage bin storage bin 63 40 7 storage bin otherprop Objects objects 39
1167 bathroom stall door bathroom stall door 62 8 12 door door Wall door n03221720 door.n.01 door 4
232 light switch light switch 61 38 7 light switch otherstructure Objects n04372370 switch.n.01 misc 40
134 coffee maker coffee maker 61 40 7 otherprop Objects n03063338 coffee_maker.n.01 appliances 37
51 tv stand tv stand 61 39 6 tv stand otherfurniture Furniture tv_stand n03290653 entertainment_center.n.01 furniture 36
250 decoration decoration 60 40 7 otherprop Objects n03169390 decoration.n.01 misc 40
1168 ceiling light ceiling light 59 38 7 light otherstructure Objects n03665366 light.n.02 lighting 28
342 range hood range hood 59 38 7 range hood otherstructure Objects range_hood n04053677 range_hood.n.01 misc 40
89 blackboard blackboard 58 38 7 blackboard otherstructure Objects n02846511 blackboard.n.01 board_panel 35
103 clock clock 58 40 7 clock otherprop Objects clock 3046257 n03046257 clock.n.01 objects 39
99 wardrobe closet wardrobe 54 39 6 wardrobe otherfurniture Furniture wardrobe n04550184 wardrobe.n.01 furniture 36
95 rail rail 53 38 7 railing otherstructure Objects n04047401 railing.n.01 railing 30
154 bulletin board bulletin board 53 38 7 board otherstructure Objects n03211616 display_panel.n.01 board_panel 35
140 mat mat 52 20 5 floor mat floor mat Floor n03727837 mat.n.01 floor 2
1169 trash bin trash bin 52 39 6 garbage bin otherfurniture Furniture trash_bin 2747177 n02747177 ashcan.n.01 objects 39
193 ledge ledge 51 38 7 otherstructure Objects n09337253 ledge.n.01 misc 40
116 seat seat 49 39 6 furniture otherfurniture Furniture n04161981 seat.n.03 furniture 36
202 mouse mouse 49 40 7 mouse otherprop Objects n03793489 mouse.n.04 objects 39
73 basket basket 48 40 7 basket otherprop Objects basket 2801938 n02801938 basket.n.01 objects 39
78 shower shower 48 38 7 otherstructure Objects n04208936 shower.n.01 shower 23
1170 dumbbell dumbbell 48 40 7 otherprop Objects n03255030 dumbbell.n.01 objects 39
79 paper paper 46 26 7 paper paper Objects n14974264 paper.n.01 objects 39
80 person person 46 31 7 person person Objects person n05217688 person.n.02 misc 40
141 windowsill windowsill 45 38 7 otherstructure Objects n04590263 windowsill.n.01 window 9
57 closet closet 45 39 6 wardrobe otherfurniture Furniture wardrobe misc 40
102 bucket bucket 45 40 7 bucket otherprop Objects n02909870 bucket.n.01 misc 40
261 sign sign 44 40 7 sign otherprop Objects n04217882 signboard.n.01 objects 39
118 speaker speaker 43 40 7 speaker otherprop Objects speaker 3691459 n03691459 loudspeaker.n.01 objects 39
136 dishwasher dishwasher 43 38 7 dishwasher otherstructure Objects dishwasher 3207941 n03207941 dishwasher.n.01 appliances 37
98 container container 43 40 7 container otherprop Objects n03094503 container.n.01 objects 39
1171 stair rail stair rail 42 38 7 banister otherstructure Objects n02788148 bannister.n.02 railing 30
170 shower curtain rod shower curtain rod 42 40 7 otherprop Objects curtain 12
1172 tube tube 41 40 7 otherprop Objects misc 40
1173 bathroom cabinet bathroom cabinet 39 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
79 papers paper 39 26 7 paper paper Objects n14974264 paper.n.01 objects 39
221 storage container storage container 39 40 7 container otherprop Objects objects 39
570 paper bag paper bag 39 37 7 bag bag Objects n04122825 sack.n.01 objects 39
138 paper towel roll paper towel roll 39 40 7 paper towel otherprop Objects n03887697 paper_towel.n.01 towel 20
168 ball ball 39 40 7 ball otherprop Objects objects 39
276 closet doors closet door 38 8 12 door door Wall door n03221720 door.n.01 door 4
106 laundry basket laundry basket 37 40 7 laundry basket otherprop Objects basket 2801938 n03050864 clothes_hamper.n.01 objects 39
214 cart cart 37 40 7 cart otherprop Objects n03484083 handcart.n.01 shelving 31
276 closet door closet door 35 8 12 door door Wall door n03221720 door.n.01 door 4
323 dish rack dish rack 35 40 7 dish rack otherprop Objects n03207630 dish_rack.n.01 objects 39
58 stairs stairs 35 38 7 stairs otherstructure Objects n04298308 stairway.n.01 stairs 16
86 blinds blinds 35 13 13 blinds blinds Window n02851099 blind.n.03 blinds 32
2 stack of chairs chair 35 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
399 purse purse 34 40 7 purse otherprop Objects n02774152 bag.n.04 objects 39
121 bicycle bicycle 33 40 7 bicycle otherprop Objects bicycle 2834778 n02834778 bicycle.n.01 objects 39
185 tray tray 32 40 7 tray otherprop Objects n04476259 tray.n.01 objects 39
300 plunger plunger 30 40 7 otherprop Objects n03970156 plunger.n.03 objects 39
180 paper cutter paper cutter 30 40 7 paper cutter otherprop Objects n03886940 paper_cutter.n.01 objects 39
163 toilet paper dispenser toilet paper dispenser 29 40 7 otherprop Objects objects 39
26 boxes box 29 29 7 box box Objects n02883344 box.n.01 objects 39
66 bin bin 28 40 7 bin otherprop Objects n02839910 bin.n.01 objects 39
208 toilet seat cover dispenser toilet seat cover dispenser 28 40 7 otherprop Objects objects 39
112 guitar guitar 28 40 7 guitar otherprop Objects guitar guitar 3467517 n03467517 guitar.n.01 objects 39
540 mailboxes mailbox 28 29 7 box box Objects mailbox 3710193 n03710193 mailbox.n.01 misc 40
395 handicap bar handicap bar 27 38 7 bar otherstructure Objects misc 40
166 fire extinguisher fire extinguisher 27 40 7 fire extinguisher otherprop Objects n03345837 fire_extinguisher.n.01 misc 40
122 ladder ladder 27 39 6 ladder otherfurniture Furniture stairs n03632277 ladder.n.01 stairs 16
120 column column 26 38 7 column otherstructure Objects n03074380 column.n.06 column 24
107 pipe pipe 25 40 7 pipe otherprop Objects n03944672 pipe.n.02 misc 40
283 vacuum cleaner vacuum cleaner 25 40 7 otherprop Objects n04517823 vacuum.n.04 objects 39
88 plate plate 24 40 7 plate otherprop Objects n03959485 plate.n.04 objects 39
90 piano piano 24 39 6 piano otherfurniture Furniture piano piano 3928116 n03928116 piano.n.01 furniture 36
177 water cooler water cooler 24 39 6 water cooler otherfurniture Furniture n04559166 water_cooler.n.01 misc 40
1174 cd case cd case 24 40 7 otherprop Objects objects 39
562 bowl bowl 24 40 7 bowl otherprop Objects bowl bowl 2880940 n02880940 bowl.n.03 objects 39
1175 closet rod closet rod 24 40 7 otherprop Objects n04100174 rod.n.01 misc 40
1156 bathroom counter bathroom counter 24 12 6 counter counter Furniture table table table 4379243 n03116530 counter.n.01 counter 26
84 oven oven 23 38 7 oven otherstructure Objects n03862676 oven.n.01 appliances 37
104 stand stand 23 39 6 stand otherfurniture Furniture table table table 4379243 n04301000 stand.n.04 table 5
229 scale scale 23 40 7 scale otherprop Objects n04141975 scale.n.07 objects 39
70 washing machine washing machine 23 39 6 washing machine otherfurniture Furniture washing_machine 4554684 n04554684 washer.n.03 appliances 37
325 broom broom 22 40 7 broom otherprop Objects n02906734 broom.n.01 objects 39
169 hat hat 22 40 7 hat otherprop Objects n03497657 hat.n.01 clothes 38
128 shower wall shower wall 22 1 12 wall wall Wall n04208936 shower.n.01 wall 1
331 guitar case guitar case 21 40 7 guitar case otherprop Objects objects 39
87 rack rack 21 39 6 stand otherfurniture Furniture n04038440 rack.n.05 shelving 31
488 water pitcher water pitcher 21 40 7 pitcher otherprop Objects n03950228 pitcher.n.02 objects 39
776 laundry detergent laundry detergent 21 40 7 otherprop Objects objects 39
370 hair dryer hair dryer 21 40 7 hair dryer otherprop Objects n03483316 hand_blower.n.01 objects 39
191 pillar pillar 21 38 7 column otherstructure Objects n03073977 column.n.07 column 24
748 divider divider 20 40 7 otherprop Objects wall 1
242 power outlet power outlet 19 40 7 otherprop Objects misc 40
45 dining table dining table 19 7 10 table table Table table table table 4379243 n04379243 table.n.02 table 5
417 shower floor shower floor 19 2 5 floor floor Floor n04208936 shower.n.01 floor 2
70 washing machines washing machine 19 39 6 washing machine otherfurniture Furniture washing_machine 4554684 n04554684 washer.n.03 appliances 37
188 shower door shower door 19 8 12 door door Wall door n04208936 shower.n.01 door 4
1176 coffee kettle coffee kettle 18 40 7 pot otherprop Objects n03612814 kettle.n.01 objects 39
1177 wardrobe cabinet wardrobe 18 39 6 wardrobe otherfurniture Furniture wardrobe n04550184 wardrobe.n.01 furniture 36
1178 structure structure 18 38 7 otherstructure Objects misc 40
18 bookshelves bookshelf 17 10 6 bookshelf bookshelf Furniture bookshelf bookshelf 2871439 n02871439 bookshelf.n.01 shelving 31
110 clothes dryer clothes dryer 17 39 6 otherfurniture Furniture n03251766 dryer.n.01 appliances 37
148 toaster toaster 17 40 7 toaster otherprop Objects n04442312 toaster.n.02 appliances 37
63 shoe shoe 17 40 7 shoe otherprop Objects n04199027 shoe.n.01 clothes 38
155 ironing board ironing board 16 39 6 ironing board otherfurniture Furniture n03586090 ironing_board.n.01 objects 39
572 alarm clock alarm clock 16 40 7 alarm clock otherprop Objects clock 3046257 n02694662 alarm_clock.n.01 objects 39
1179 shower head shower head 15 38 7 otherstructure Objects shower 23
28 lamp base lamp 15 35 7 lamp lamp Objects lamp lamp 3636649 n03636649 lamp.n.02 lighting 28
392 water bottle water bottle 15 40 7 bottle otherprop Objects bottle bottle 2876657 n04557648 water_bottle.n.01 objects 39
1180 keyboard piano keyboard piano 15 39 6 piano otherfurniture Furniture piano piano 3928116 n03928116 piano.n.01 furniture 36
609 projector screen projector screen 15 38 7 projector screen otherstructure Objects misc 40
1181 case of water bottles case of water bottles 15 40 7 otherprop Objects objects 39
195 toaster oven toaster oven 14 40 7 toaster oven otherprop Objects n04442441 toaster_oven.n.01 appliances 37
581 music stand music stand 14 39 6 music stand otherfurniture Furniture n03801760 music_stand.n.01 furniture 36
58 staircase stairs 14 38 7 stairs otherstructure Objects n04298308 stairway.n.01 stairs 16
1182 coat rack coat rack 14 40 7 otherprop Objects n03059103 coatrack.n.01 shelving 3
1183 storage organizer storage organizer 14 40 7 otherprop Objects shelving 3
139 machine machine 14 40 7 machine otherprop Objects n03699975 machine.n.01 appliances 37
1184 folded chair folded chair 14 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
1185 fire alarm fire alarm 14 40 7 otherprop Objects n03343737 fire_alarm.n.02 misc 40
156 fireplace fireplace 13 38 7 fireplace otherstructure Objects n03346455 fireplace.n.01 fireplace 27
408 vent vent 13 40 7 otherprop Objects n04526241 vent.n.01 misc 40
213 furniture furniture 13 39 6 furniture otherfurniture Furniture n03405725 furniture.n.01 furniture 36
1186 power strip power strip 13 40 7 otherprop Objects objects 39
1187 calendar calendar 13 40 7 otherprop Objects objects 39
1188 poster poster 13 11 8 picture picture Picture n03931044 picture.n.01 picture 6
115 toilet paper holder toilet paper holder 13 40 7 toilet paper holder otherprop Objects objects 39
1189 potted plant potted plant 12 40 7 plant otherprop Objects plant n00017222 plant.n.02 plant 14
304 stuffed animal stuffed animal 12 40 7 stuffed animal otherprop Objects n04399382 teddy.n.01 objects 39
1190 luggage luggage 12 40 7 luggage otherprop Objects n02774630 baggage.n.01 objects 39
21 curtains curtain 12 16 13 curtain curtain Window curtain n03151077 curtain.n.01 curtain 12
312 headphones headphones 12 40 7 otherprop Objects n03261776 earphone.n.01 objects 39
233 crate crate 12 39 6 crate otherfurniture Furniture n03127925 crate.n.01 objects 39
286 candle candle 12 40 7 candle otherprop Objects lamp n02948072 candle.n.01 objects 39
264 projector projector 12 40 7 projector otherprop Objects n04009552 projector.n.02 objects 39
110 clothes dryers clothes dryer 12 39 6 otherfurniture Furniture n03251766 dryer.n.01 appliances 37
1191 mattress mattress 12 4 1 bed bed Bed bed bed bed 2818832 n02818832 bed.n.01 bed 11
356 dustpan dustpan 12 40 7 otherprop Objects n03259009 dustpan.n.02 objects 39
25 drawer drawer 11 39 6 drawer otherfurniture Furniture n03233905 drawer.n.01 furniture 36
750 rod rod 11 40 7 otherprop Objects pistol 3948459 n03427202 gat.n.01 misc 40
269 globe globe 11 40 7 globe otherprop Objects objects 39
307 footrest footrest 11 39 6 foot rest otherfurniture Furniture stool n03380724 footstool.n.01 stool 19
410 piano bench piano bench 11 39 6 piano bench otherfurniture Furniture bench bench 2828884 n02828884 bench.n.01 seating 34
730 breakfast bar breakfast bar 11 38 7 bar otherstructure Objects counter 26
216 step stool step stool 11 40 7 step stool otherprop Objects stool n04315713 step_stool.n.01 stool 19
1192 hand rail hand rail 11 38 7 railing otherstructure Objects railing 30
119 vending machine vending machine 11 40 7 machine otherprop Objects n04525305 vending_machine.n.01 appliances 37
682 ceiling fan ceiling fan 11 40 7 fan otherprop Objects n03320046 fan.n.01 misc 40
434 swiffer swiffer 11 40 7 otherprop Objects objects 39
126 foosball table foosball table 11 39 6 foosball table otherfurniture Furniture table table table 4379243 n04379243 table.n.02 table 5
919 jar jar 11 40 7 jar otherprop Objects jar 3593526 n03593526 jar.n.01 objects 39
85 footstool footstool 11 39 6 ottoman otherfurniture Furniture stool n03380724 footstool.n.01 stool 19
1193 folded table folded table 10 7 10 table table Table table table table 4379243 n04379243 table.n.02 table 5
108 round table round table 10 7 10 table table Table table table table 4379243 n04114554 round_table.n.02 table 5
135 hamper hamper 10 40 7 basket otherprop Objects basket 2801938 n03482405 hamper.n.02 objects 39
1194 poster tube poster tube 10 40 7 otherprop Objects objects 39
432 case case 10 40 7 case otherprop Objects objects 39
53 carpet carpet 10 40 7 rug otherprop Objects n04118021 rug.n.01 floor 2
1195 thermostat thermostat 10 40 7 otherprop Objects n04422875 thermostat.n.01 misc 40
111 coat coat 10 40 7 jacket otherprop Objects n03057021 coat.n.01 clothes 38
305 water fountain water fountain 10 38 7 water fountain otherstructure Objects n03241335 drinking_fountain.n.01 misc 40
1125 smoke detector smoke detector 10 40 7 otherprop Objects misc 40
13 pillows pillow 9 18 7 pillow pillow Objects pillow 3938244 n03938244 pillow.n.01 cushion 8
1196 flip flops flip flops 9 40 7 shoe otherprop Objects n04199027 shoe.n.01 clothes 38
1197 cloth cloth 9 21 7 clothes clothes Objects n02728440 apparel.n.01 clothes 38
1198 banner banner 9 40 7 otherprop Objects n02788021 banner.n.01 misc 40
1199 clothes hanger clothes hanger 9 40 7 otherprop Objects n03057920 coat_hanger.n.01 objects 39
1200 whiteboard eraser whiteboard eraser 9 40 7 otherprop Objects objects 39
378 iron iron 9 40 7 otherprop Objects n03584829 iron.n.04 objects 39
591 instrument case instrument case 9 40 7 case otherprop Objects objects 39
49 toilet paper rolls toilet paper 9 40 7 toilet paper otherprop Objects n15075141 toilet_tissue.n.01 objects 39
92 soap soap 9 40 7 soap otherprop Objects n04253437 soap.n.01 objects 39
1098 block block 9 40 7 otherprop Objects misc 40
291 wall hanging wall hanging 8 40 7 otherprop Objects n03491178 hanging.n.01 picture 6
1063 kitchen island kitchen island 8 38 7 kitchen island otherstructure Objects n03620600 kitchen_island.n.01 counter 26
107 pipes pipe 8 38 7 otherstructure Objects misc 40
1135 toothbrush toothbrush 8 40 7 toothbrush otherprop Objects n04453156 toothbrush.n.01 objects 39
189 shirt shirt 8 40 7 otherprop Objects n04197391 shirt.n.01 clothes 38
245 cutting board cutting board 8 40 7 cutting board otherprop Objects n03025513 chopping_board.n.01 objects 39
194 vase vase 8 40 7 vase otherprop Objects vase jar 3593526 n04522168 vase.n.01 objects 39
1201 shower control valve shower control valve 8 38 7 otherstructure Objects n04208936 shower.n.01 shower 23
386 exercise machine exercise machine 8 40 7 machine otherprop Objects gym_equipment 33
1202 compost bin compost bin 8 39 6 garbage bin otherfurniture Furniture trash_bin 2747177 n02747177 ashcan.n.01 objects 39
857 shorts shorts 8 40 7 shorts otherprop Objects clothes 38
452 tire tire 8 40 7 otherprop Objects n04440749 tire.n.01 objects 39
1203 teddy bear teddy bear 7 40 7 stuffed animal otherprop Objects n04399382 teddy.n.01 objects 39
346 bathrobe bathrobe 7 40 7 otherprop Objects n02807616 bathrobe.n.01 clothes 38
152 handrail handrail 7 38 7 railing otherstructure Objects n02788148 bannister.n.02 railing 30
83 faucet faucet 7 40 7 faucet otherprop Objects faucet 3325088 n03325088 faucet.n.01 misc 40
1204 pantry wall pantry wall 7 1 12 wall wall Wall n04546855 wall.n.01 wall 1
726 thermos thermos 7 40 7 flask otherprop Objects bottle bottle 2876657 n04422727 thermos.n.01 objects 39
61 rug rug 7 40 7 rug otherprop Objects n04118021 rug.n.01 floor 2
39 couch cushions cushion 7 18 7 pillow pillow Objects n03151500 cushion.n.03 cushion 8
1117 tripod tripod 7 39 6 stand otherfurniture Furniture n04485082 tripod.n.01 objects 39
540 mailbox mailbox 7 29 7 box box Objects mailbox 3710193 n03710193 mailbox.n.01 misc 40
1205 tupperware tupperware 7 40 7 otherprop Objects objects 39
415 shoe rack shoe rack 7 40 7 shoe rack otherprop Objects shelving 31
31 towels towel 6 27 7 towel towel Objects n04459362 towel.n.01 towel 20
1206 beer bottles beer bottle 6 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
153 treadmill treadmill 6 39 6 treadmill otherfurniture Furniture n04477387 treadmill.n.01 gym_equipment 33
1207 salt salt 6 40 7 otherprop Objects objects 39
129 chest chest 6 39 6 chest otherfurniture Furniture dresser dresser chest_of_drawers 13
220 dispenser dispenser 6 40 7 otherprop Objects n03210683 dispenser.n.01 objects 39
1208 mirror doors mirror door 6 8 12 door door Wall door n03221720 door.n.01 door 4
231 remote remote 6 40 7 otherprop Objects remote_control 4074963 n04074963 remote_control.n.01 objects 39
1209 folded ladder folded ladder 6 39 6 ladder otherfurniture Furniture stairs n03632277 ladder.n.01 misc 40
39 cushion cushion 6 18 7 pillow pillow Objects n03151500 cushion.n.03 cushion 8
1210 carton carton 6 40 7 otherprop Objects objects 39
117 step step 6 38 7 otherstructure Objects n04314914 step.n.04 misc 40
822 drying rack drying rack 6 39 6 drying rack otherfurniture Furniture shelving 31
238 slippers slipper 6 40 7 shoe otherprop Objects n04241394 slipper.n.01 clothes 38
143 pool table pool table 6 39 6 pool table otherfurniture Furniture table table table 4379243 n03982430 pool_table.n.01 table 5
1211 soda stream soda stream 6 40 7 otherprop Objects objects 39
228 toilet brush toilet brush 6 40 7 toilet brush otherprop Objects objects 39
494 loft bed loft bed 6 4 1 bed bed Bed bed bed bed 2818832 n02818832 bed.n.01 bed 11
226 cooking pot cooking pot 6 40 7 pot otherprop Objects objects 39
91 heater heater 6 39 6 heater otherfurniture Furniture n03508101 heater.n.01 misc 40
1072 messenger bag messenger bag 6 37 7 bag bag Objects objects 39
435 stapler stapler 6 40 7 stapler otherprop Objects n04303497 stapler.n.01 objects 39
1165 closet walls closet wall 5 1 12 wall wall Wall n04546855 wall.n.01 wall 1
345 scanner scanner 5 40 7 otherprop Objects appliances 37
893 elliptical machine elliptical machine 5 40 7 machine otherprop Objects gym_equipment 33
621 kettle kettle 5 40 7 pot otherprop Objects n03612814 kettle.n.01 objects 39
1212 metronome metronome 5 40 7 otherprop Objects n03757604 metronome.n.01 objects 39
297 dumbell dumbell 5 40 7 otherprop Objects objects 39
1213 music book music book 5 23 2 book books Books n02870526 book.n.11 objects 39
1214 rice cooker rice cooker 5 40 7 otherprop Objects objects 39
1215 dart board dart board 5 38 7 board otherstructure Objects n03162940 dartboard.n.01 objects 39
529 sewing machine sewing machine 5 40 7 sewing machine otherprop Objects n04179913 sewing_machine.n.01 objects 39
1216 grab bar grab bar 5 38 7 railing otherstructure Objects railing 30
1217 flowerpot flowerpot 5 40 7 vase otherprop Objects vase jar 3593526 n04522168 vase.n.01 objects 39
1218 painting painting 5 11 8 picture picture Picture n03931044 picture.n.01 picture 6
1219 railing railing 5 38 7 railing otherstructure Objects n04047401 railing.n.01 railing 30
1220 stair stair 5 38 7 stairs otherstructure Objects stairs n04314914 step.n.04 stairs 16
525 toolbox toolbox 5 39 6 chest otherfurniture Furniture n04452615 toolbox.n.01 objects 39
204 nerf gun nerf gun 5 40 7 otherprop Objects objects 39
693 binders binder 5 40 7 binder otherprop Objects objects 39
179 desk lamp desk lamp 5 35 7 lamp lamp Objects lamp lamp 3636649 n03636649 lamp.n.02 lighting 28
1221 quadcopter quadcopter 5 40 7 otherprop Objects objects 39
1222 pitcher pitcher 5 40 7 pitcher otherprop Objects n03950228 pitcher.n.02 objects 39
1223 hanging hanging 5 40 7 otherprop Objects misc 40
1224 mail mail 5 40 7 otherprop Objects misc 40
1225 closet ceiling closet ceiling 5 22 3 ceiling ceiling Ceiling n02990373 ceiling.n.01 ceiling 17
1226 hoverboard hoverboard 5 40 7 otherprop Objects objects 39
1227 beanbag chair beanbag chair 5 39 6 bean bag otherfurniture Furniture n02816656 beanbag.n.01 chair 3
571 water heater water heater 5 40 7 water heater otherprop Objects n04560113 water_heater.n.01 misc 40
1228 spray bottle spray bottle 5 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
556 rope rope 5 40 7 rope otherprop Objects n04108268 rope.n.01 objects 39
280 plastic container plastic container 5 40 7 container otherprop Objects objects 39
1229 soap bottle soap bottle 5 40 7 soap otherprop Objects objects 39
1230 ikea bag ikea bag 4 37 7 bag bag Objects 2773838 n02773838 bag.n.06 objects 39
1231 sleeping bag sleeping bag 4 40 7 otherprop Objects n04235860 sleeping_bag.n.01 objects 39
1232 duffel bag duffel bag 4 37 7 bag bag Objects suitcase 2773838 n02773838 bag.n.06 objects 39
746 frying pan frying pan 4 40 7 frying pan otherprop Objects n03400231 frying_pan.n.01 objects 39
1233 oven mitt oven mitt 4 40 7 otherprop Objects objects 39
1234 pot pot 4 40 7 pot otherprop Objects n04235860 sleeping_bag.n.01 objects 39
144 hand dryer hand dryer 4 40 7 otherprop Objects objects 39
282 dollhouse dollhouse 4 39 6 doll house otherfurniture Furniture n03219483 dollhouse.n.01 objects 39
167 shampoo bottle shampoo bottle 4 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
1235 hair brush hair brush 4 40 7 otherprop Objects n02908217 brush.n.02 objects 39
1236 tennis racket tennis racket 4 40 7 otherprop Objects n04409806 tennis_racket.n.01 objects 39
1237 display case display case 4 40 7 case otherprop Objects objects 39
234 ping pong table ping pong table 4 39 6 ping pong table otherfurniture Furniture table table table 4379243 n04379243 table.n.02 table 5
563 boiler boiler 4 40 7 otherprop Objects misc 40
1238 bag of coffee beans bag of coffee beans 4 37 7 bag bag Objects suitcase 2773838 n02773838 bag.n.06 objects 39
1239 bananas banana 4 40 7 otherprop Objects n00021265 food.n.01 objects 39
1240 carseat carseat 4 40 7 otherprop Objects misc 40
366 helmet helmet 4 40 7 otherprop Objects helmet 3513137 n03513137 helmet.n.02 clothes 38
816 umbrella umbrella 4 40 7 umbrella otherprop Objects n04507155 umbrella.n.01 objects 39
1241 coffee box coffee box 4 40 7 otherprop Objects objects 39
719 envelope envelope 4 40 7 envelope otherprop Objects n03291819 envelope.n.01 objects 39
284 wet floor sign wet floor sign 4 40 7 sign otherprop Objects misc 40
1242 clothing rack clothing rack 4 39 6 stand otherfurniture Furniture n04038440 rack.n.05 shelving 31
247 controller controller 4 40 7 otherprop Objects n03096960 control.n.09 objects 39
1243 bath walls bathroom wall 4 1 12 wall wall Wall n04546855 wall.n.01 wall 1
1244 podium podium 4 39 6 otherfurniture Furniture n03159640 dais.n.01 furniture 36
1245 storage box storage box 4 29 7 box box Objects n02883344 box.n.01 objects 39
1246 dolly dolly 4 40 7 otherprop Objects misc 40
1247 shampoo shampoo 3 40 7 otherprop Objects n04183516 shampoo.n.01 objects 39
592 paper tray paper tray 3 40 7 paper tray otherprop Objects objects 39
385 cabinet door cabinet door 3 8 12 door door Wall door door 4
1248 changing station changing station 3 40 7 otherprop Objects misc 40
1249 poster printer poster printer 3 40 7 printer otherprop Objects printer 4004475 n04004475 printer.n.03 appliances 37
133 screen screen 3 40 7 otherprop Objects n03151077 curtain.n.01 curtain 12
301 soap bar soap bar 3 38 7 bar otherstructure Objects objects 39
1250 crutches crutches 3 40 7 otherprop Objects n03141823 crutch.n.01 objects 39
379 studio light studio light 3 38 7 light otherstructure Objects lighting 28
130 stack of cups cup 3 40 7 cup otherprop Objects cup cup or mug 3797390 n03797390 mug.n.04 objects 39
1251 toilet flush button toilet flush button 3 40 7 otherprop Objects objects 39
450 trunk trunk 3 40 7 otherprop Objects misc 40
1252 grocery bag grocery bag 3 37 7 bag bag Objects suitcase 2773838 n03461288 grocery_bag.n.01 objects 39
316 plastic bin plastic bin 3 40 7 bin otherprop Objects objects 39
1253 pizza box pizza box 3 29 7 box box Objects objects 39
385 cabinet doors cabinet door 3 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 door 4
1254 legs legs 3 31 7 person person Objects person n05217688 person.n.02 misc 40
461 car car 3 40 7 car otherprop Objects car car 2958343 n02958343 car.n.01 misc 40
1255 shaving cream shaving cream 3 40 7 otherprop Objects n04186051 shaving_cream.n.01 objects 39
1256 luggage stand luggage stand 3 39 6 stand otherfurniture Furniture n04038440 rack.n.05 shelving 31
599 shredder shredder 3 40 7 otherprop Objects n04210120 shredder.n.01 objects 39
281 statue statue 3 40 7 sculpture otherprop Objects n04306847 statue.n.01 misc 40
1257 urinal urinal 3 33 7 toilet toilet Objects toilet toilet n04515991 urinal.n.01 toilet 18
1258 hose hose 3 40 7 otherprop Objects n03539875 hose.n.03 misc 40
1259 bike pump bike pump 3 40 7 otherprop Objects objects 39
319 coatrack coatrack 3 40 7 otherprop Objects n03059103 coatrack.n.01 shelving 31
1260 bear bear 3 40 7 otherprop Objects objects 39
28 wall lamp lamp 3 35 7 lamp lamp Objects lamp lamp 3636649 n03636649 lamp.n.02 lighting 28
1261 humidifier humidifier 3 40 7 otherprop Objects objects 39
546 toothpaste toothpaste 3 40 7 toothpaste otherprop Objects objects 39
1262 mouthwash bottle mouthwash bottle 3 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
1263 poster cutter poster cutter 3 40 7 otherprop Objects objects 39
1264 golf bag golf bag 3 37 7 bag bag Objects suitcase 2773838 n03445617 golf_bag.n.01 objects 39
1265 food container food container 3 40 7 container otherprop Objects n03094503 container.n.01 objects 39
1266 camera camera 3 40 7 otherprop Objects objects 39
28 table lamp lamp 3 35 7 lamp lamp Objects lamp lamp 3636649 n04380533 table_lamp.n.01 lighting 28
1267 yoga mat yoga mat 3 20 5 floor mat floor mat Floor n03727837 mat.n.01 floor 2
1268 card card 3 40 7 otherprop Objects objects 39
1269 mug mug 3 40 7 cup otherprop Objects cup cup or mug 3797390 n03797390 mug.n.04 objects 39
188 shower doors shower door 3 38 7 otherstructure Objects n04208936 shower.n.01 door 4
689 cardboard cardboard 3 40 7 otherprop Objects objects 39
1270 rack stand rack stand 3 39 6 stand otherfurniture Furniture n04038440 rack.n.05 shelving 31
1271 boxes of paper boxes of paper 3 29 7 box box Objects n02883344 box.n.01 objects 39
1272 flag flag 3 40 7 otherprop Objects misc 40
354 futon futon 3 39 6 mattress otherfurniture Furniture n03408444 futon.n.01 sofa 10
339 magazine magazine 3 40 7 magazine otherprop Objects n06595351 magazine.n.01 objects 39
1009 exit sign exit sign 3 40 7 exit sign otherprop Objects misc 40
1273 rolled poster rolled poster 3 40 7 otherprop Objects objects 39
1274 wheel wheel 3 40 7 otherprop Objects objects 39
15 pictures picture 3 11 8 picture picture Picture n03931044 picture.n.01 picture 6
1275 blackboard eraser blackboard eraser 3 40 7 eraser otherprop Objects n03294833 eraser.n.01 objects 39
361 organizer organizer 3 40 7 otherprop Objects n03918737 personal_digital_assistant.n.01 objects 39
1276 doll doll 3 40 7 toy otherprop Objects n03219135 doll.n.01 objects 39
326 book rack book rack 3 39 6 bookrack otherfurniture Furniture objects 39
1277 laundry bag laundry bag 3 40 7 laundry basket otherprop Objects basket 2801938 n03050864 clothes_hamper.n.01 objects 39
1278 sponge sponge 3 40 7 otherprop Objects n01906749 sponge.n.04 objects 39
116 seating seat 3 39 6 furniture otherfurniture Furniture n04161981 seat.n.03 furniture 36
1184 folded chairs folded chair 2 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
1279 lotion bottle lotion bottle 2 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
212 can can 2 40 7 can otherprop Objects can 2946921 n02946921 can.n.01 objects 39
1280 lunch box lunch box 2 40 7 otherprop Objects objects 39
1281 food display food display 2 40 7 otherprop Objects misc 40
794 storage shelf storage shelf 2 40 7 otherprop Objects shelving 31
1282 sliding wood door sliding wood door 2 40 7 otherprop Objects door 4
955 pants pants 2 40 7 otherprop Objects n04489008 trouser.n.01 clothes 38
387 wood wood 2 40 7 otherprop Objects misc 40
69 boards board 2 38 7 board otherstructure Objects board_panel 35
65 bottles bottle 2 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
523 washcloth washcloth 2 40 7 otherprop Objects n04554523 washcloth.n.01 towel 20
389 workbench workbench 2 39 6 bench otherfurniture Furniture bench table 4379243 n04600486 workbench.n.01 table 5
29 open kitchen cabinet kitchen cabinet 2 3 6 cabinet cabinet Furniture n02933112 cabinet.n.01 cabinet 7
1283 organizer shelf organizer shelf 2 15 6 shelves shelves Furniture bookshelf bookshelf 2871439 n02871439 bookshelf.n.01 shelving 31
146 frame frame 2 38 7 otherstructure Objects misc 40
130 cups cup 2 40 7 cup otherprop Objects cup cup or mug 3797390 n03797390 mug.n.04 objects 39
372 exercise ball exercise ball 2 40 7 ball otherprop Objects n04285146 sports_equipment.n.01 gym_equipment 33
289 easel easel 2 39 6 stand otherfurniture Furniture n03262809 easel.n.01 furniture 36
440 garbage bag garbage bag 2 37 7 bag bag Objects suitcase 2773838 n02773838 bag.n.06 objects 39
321 roomba roomba 2 40 7 otherprop Objects objects 39
976 garage door garage door 2 38 7 garage door otherstructure Objects door door 4
1256 luggage rack luggage stand 2 39 6 stand otherfurniture Furniture n04038440 shelving 31
1284 bike lock bike lock 2 40 7 otherprop Objects objects 39
1285 briefcase briefcase 2 40 7 otherprop Objects n02900705 briefcase.n.01 objects 39
357 hand towel hand towel 2 27 7 towel towel Objects n03490006 hand_towel.n.01 towel 20
1286 bath products bath product 2 40 7 otherprop Objects objects 39
1287 star star 2 40 7 otherprop Objects n09444783 star.n.03 misc 40
365 map map 2 40 7 map otherprop Objects n03720163 map.n.01 misc 40
1288 coffee bean bag coffee bean bag 2 37 7 bag bag Objects suitcase 2773838 n02773838 bag.n.06 objects 39
81 headboard headboard 2 39 6 headboard otherfurniture Furniture n03502200 headboard.n.01 bed 11
1289 ipad ipad 2 40 7 otherprop Objects objects 39
1290 display rack display rack 2 39 6 stand otherfurniture Furniture n04038440 rack.n.05 shelving 31
948 traffic cone traffic cone 2 40 7 cone otherprop Objects cone objects 39
174 toiletry toiletry 2 40 7 otherprop Objects n04447443 toiletry.n.01 objects 39
1028 canopy canopy 2 40 7 otherprop Objects misc 40
1291 massage chair massage chair 2 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
1292 paper organizer paper organizer 2 40 7 otherprop Objects objects 39
1005 barricade barricade 2 40 7 otherprop Objects misc 40
235 platform platform 2 38 7 otherstructure Objects misc 40
1293 cap cap 2 40 7 hat otherprop Objects n03497657 hat.n.01 clothes 38
1294 dumbbell plates dumbbell plates 2 40 7 otherprop Objects objects 39
1295 elevator elevator 2 38 7 otherstructure Objects misc 40
1296 cooking pan cooking pan 2 40 7 pan otherprop Objects n03880531 pan.n.01 objects 39
1297 trash bag trash bag 2 37 7 bag bag Objects objects 39
1298 santa santa 2 40 7 otherprop Objects misc 40
1299 jewelry box jewelry box 2 29 7 box box Objects n02883344 box.n.01 objects 39
1300 boat boat 2 40 7 otherprop Objects misc 40
1301 sock sock 2 21 7 clothes clothes Objects n04254777 sock.n.01 clothes 38
1051 kinect kinect 2 40 7 kinect otherprop Objects objects 39
566 crib crib 2 39 6 crib otherfurniture Furniture furniture 36
1302 plastic storage bin plastic storage bin 2 40 7 container otherprop Objects n03094503 container.n.01 objects 39
1062 cooler cooler 2 24 6 refridgerator refridgerator Furniture n03102654 cooler.n.01 appliances 37
1303 kitchen apron kitchen apron 2 21 7 clothes clothes Objects n02728440 apparel.n.01 clothes 38
1304 dishwashing soap bottle dishwashing soap bottle 2 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
1305 xbox controller xbox controller 2 40 7 otherprop Objects objects 39
1306 banana holder banana holder 2 40 7 otherprop Objects objects 39
298 ping pong paddle ping pong paddle 2 40 7 otherprop Objects table 5
1307 airplane airplane 2 40 7 otherprop Objects misc 40
1308 conditioner bottle conditioner bottle 2 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
1309 tea kettle tea kettle 2 40 7 tea kettle otherprop Objects n04397768 teakettle.n.01 objects 39
43 bedframe bedframe 2 39 6 otherfurniture Furniture n02822579 bedstead.n.01 bed 11
1310 wood beam wood beam 2 38 7 otherstructure Objects beam 29
593 toilet paper package toilet paper package 2 40 7 otherprop Objects objects 39
1311 wall mounted coat rack wall mounted coat rack 2 40 7 otherprop Objects n03059103 coatrack.n.01 shelving 31
1312 film light film light 2 40 7 otherprop Objects lighting 28
749 ceiling lamp ceiling lamp 1 35 7 lamp lamp Objects lamp lamp 3636649 n03636649 lamp.n.02 lighting 28
623 chain chain 1 40 7 otherprop Objects chair 3
1313 sofa sofa 1 6 9 sofa sofa Sofa sofa sofa sofa 4256520 n04256520 sofa.n.01 sofa 10
99 closet wardrobe wardrobe 1 39 6 wardrobe otherfurniture Furniture wardrobe n04550184 wardrobe.n.01 furniture 36
265 sweater sweater 1 40 7 otherprop Objects n04370048 sweater.n.01 clothes 38
1314 kitchen mixer kitchen mixer 1 40 7 otherprop Objects appliances 37
99 wardrobe wardrobe 1 39 6 wardrobe otherfurniture Furniture wardrobe n04550184 wardrobe.n.01 furniture 36
1315 water softener water softener 1 40 7 otherprop Objects misc 40
448 banister banister 1 38 7 banister otherstructure Objects n02788148 bannister.n.02 railing 30
257 trolley trolley 1 40 7 trolley otherprop Objects n04335435 streetcar.n.01 misc 40
1316 pantry shelf pantry shelf 1 15 6 shelves shelves Furniture bookshelf bookshelf 2871439 n02871439 bookshelf.n.01 shelving 31
786 sofa bed sofa bed 1 4 1 bed bed Bed bed bed bed 2818832 n02818832 bed.n.01 bed 11
801 loofa loofa 1 40 7 otherprop Objects objects 39
972 shower faucet handle shower faucet handle 1 40 7 handle otherprop Objects shower 23
1317 toy piano toy piano 1 40 7 toy otherprop Objects n03964744 plaything.n.01 objects 39
1318 fish fish 1 40 7 otherprop Objects n02512053 fish.n.01 objects 39
75 file cabinets file cabinet 1 3 6 cabinet cabinet Furniture cabinet 2933112 n03337140 file.n.03 cabinet 7
657 cat litter box cat litter box 1 29 7 box box Objects objects 39
561 electric panel electric panel 1 40 7 otherprop Objects misc 40
93 suitcases suitcase 1 40 7 luggage otherprop Objects n02774630 baggage.n.01 objects 39
513 curtain rod curtain rod 1 38 7 curtain rod otherstructure Objects curtain 12
411 bunk bed bunk bed 1 39 6 bunk bed otherfurniture Furniture bed bed bed 2818832 n02920259 bunk_bed.n.01 bed 11
1122 chandelier chandelier 1 38 7 chandelier otherstructure Objects n03005285 chandelier.n.01 lighting 28
922 tape tape 1 40 7 tape otherprop Objects objects 39
88 plates plate 1 40 7 otherprop Objects n03959485 plate.n.04 objects 39
518 alarm alarm 1 40 7 alarm otherprop Objects clock 3046257 n02694662 alarm_clock.n.01 objects 39
814 fire hose fire hose 1 40 7 otherprop Objects n03346004 fire_hose.n.01 misc 40
1319 toy dinosaur toy dinosaur 1 40 7 toy otherprop Objects n03964744 plaything.n.01 objects 39
1320 cone cone 1 40 7 otherprop Objects objects 39
649 glass doors glass door 1 8 12 door door Wall door n03221720 door.n.01 door 4
607 hatrack hatrack 1 40 7 otherprop Objects n03059103 coatrack.n.01 shelving 31
819 subwoofer subwoofer 1 40 7 speaker otherprop Objects speaker 3691459 n04349401 subwoofer.n.01 objects 39
1321 fire sprinkler fire sprinkler 1 40 7 otherprop Objects misc 40
1322 trash cabinet trash cabinet 1 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
1204 pantry walls pantry wall 1 1 12 wall wall Wall n04546855 wall.n.01 wall 1
227 photo photo 1 40 7 photo otherprop Objects n03925226 photograph.n.01 picture 6
817 barrier barrier 1 40 7 otherprop Objects n02796623 barrier.n.01 misc 40
130 stacks of cups cup 1 40 7 otherprop Objects n03147509 cup.n.01 objects 39
712 beachball beachball 1 40 7 ball otherprop Objects n02814224 beach_ball.n.01 objects 39
1323 folded boxes folded boxes 1 40 7 otherprop Objects objects 39
1324 contact lens solution bottle contact lens solution bottle 1 40 7 bottle otherprop Objects bottle bottle 2876657 n02876657 bottle.n.01 objects 39
673 covered box covered box 1 29 7 box box Objects objects 39
459 folder folder 1 40 7 folder otherprop Objects n03376279 folder.n.02 objects 39
643 mail trays mail tray 1 40 7 mail tray otherprop Objects objects 39
238 slipper slipper 1 40 7 otherprop Objects n04241394 slipper.n.01 clothes 38
765 magazine rack magazine rack 1 39 6 stand otherfurniture Furniture n03704549 magazine_rack.n.01 shelving 31
1008 sticker sticker 1 40 7 sticker otherprop Objects n07272545 gummed_label.n.01 objects 39
225 lotion lotion 1 40 7 otherprop Objects n03690938 lotion.n.01 objects 39
1083 buddha buddha 1 40 7 otherprop Objects objects 39
813 file organizer file organizer 1 40 7 otherprop Objects objects 39
138 paper towel rolls paper towel roll 1 40 7 paper towel otherprop Objects n03887697 paper_towel.n.01 towel 20
1145 night lamp night lamp 1 35 7 lamp lamp Objects lamp lamp 3636649 n03636649 lamp.n.02 lighting 28
796 fuse box fuse box 1 40 7 otherprop Objects misc 40
1325 knife block knife block 1 40 7 otherprop Objects objects 39
363 furnace furnace 1 39 6 furnace otherfurniture Furniture n03404449 furnace.n.01
1174 cd cases cd case 1 40 7 otherprop Objects objects 39
38 stools stool 1 40 7 stool otherprop Objects stool n04326896 stool.n.01 stool 19
1326 hand sanitzer dispenser hand sanitzer dispenser 1 40 7 otherprop Objects n04254120 soap_dispenser.n.01 objects 39
997 teapot teapot 1 40 7 tea pot otherprop Objects n04398044 teapot.n.01 objects 39
1327 pen holder pen holder 1 40 7 otherprop Objects objects 39
1328 tray rack tray rack 1 40 7 otherprop Objects objects 39
1329 wig wig 1 40 7 otherprop Objects n04584207 wig.n.01 objects 39
182 switch switch 1 40 7 otherprop Objects n04372370 switch.n.01 misc 40
280 plastic containers plastic container 1 40 7 container otherprop Objects n03094503 container.n.01 objects 39
1330 night light night light 1 40 7 otherprop Objects lighting 28
1331 notepad notepad 1 40 7 otherprop Objects objects 39
1332 mail bin mail bin 1 40 7 otherprop Objects misc 40
1333 elevator button elevator button 1 40 7 otherprop Objects misc 40
939 gaming wheel gaming wheel 1 40 7 otherprop Objects objects 39
1334 drum set drum set 1 40 7 otherprop Objects objects 39
480 cosmetic bag cosmetic bag 1 37 7 bag bag Objects objects 39
907 coffee mug coffee mug 1 40 7 vessel otherprop Objects cup or mug 3797390 n03063599 coffee_mug.n.01 objects 39
1335 closet shelf closet shelf 1 15 6 shelves shelves Furniture bookshelf bookshelf 2871439 n02871439 bookshelf.n.01 shelving 31
1336 baby mobile baby mobile 1 40 7 otherprop Objects objects 39
829 diaper bin diaper bin 1 40 7 bin otherprop Objects objects 39
947 door wall door wall 1 1 12 wall wall Wall wall 1
1116 stepstool stepstool 1 40 7 step stool otherprop Objects objects 39
599 paper shredder shredder 1 40 7 otherprop Objects n04210120 shredder.n.01 objects 39
733 dress rack dress rack 1 40 7 otherprop Objects n03238762 dress_rack.n.01 misc 40
123 cover cover 1 40 7 blanket otherprop Objects objects 39
506 shopping bag shopping bag 1 37 7 bag bag Objects n04204081 shopping_bag.n.01 objects 39
569 sliding door sliding door 1 8 12 door door Wall door n04239074 sliding_door.n.01 door 4
1337 exercise bike exercise bike 1 40 7 machine otherprop Objects n04210120 shredder.n.01 gym_equipment 33
1338 recliner chair recliner chair 1 5 4 chair chair Chair chair chair chair 3001627 n03238762 dress_rack.n.01 chair 3
1314 kitchenaid mixer kitchen mixer 1 40 7 otherprop Objects appliances 37
1339 soda can soda can 1 40 7 can otherprop Objects can 2946921 n02946921 can.n.01 objects 39
1340 stovetop stovetop 1 38 7 stove otherstructure Objects stove 4330267 n04330267 stove.n.02 appliances 37
851 stepladder stepladder 1 39 6 ladder otherfurniture Furniture stairs n04315599 step_ladder.n.01 stairs 16
142 tap tap 1 40 7 faucet otherprop Objects faucet 3325088 n04559451 water_faucet.n.01 objects 39
436 cable cable 1 40 7 cables otherprop Objects objects 39
1341 baby changing station baby changing station 1 39 6 otherfurniture Furniture furniture 36
1342 costume costume 1 21 7 clothes clothes Objects n02728440 apparel.n.01 clothes 38
885 rocking chair rocking chair 1 5 4 chair chair Chair chair chair chair 3001627 n04099969 rocking_chair.n.01 chair 3
693 binder binder 1 40 7 binder otherprop Objects objects 39
815 media center media center 1 3 6 cabinet cabinet Furniture cabinet 2933112 n02933112 cabinet.n.01 cabinet 7
401 towel rack towel rack 1 40 7 otherprop Objects n04459773 towel_rack.n.01 misc 40
1343 medal medal 1 40 7 otherprop Objects objects 39
1184 stack of folded chairs folded chair 1 5 4 chair chair Chair chair chair chair 3001627 n03001627 chair.n.01 chair 3
1344 telescope telescope 1 40 7 otherprop Objects n04403638 telescope.n.01 objects 39
1345 closet doorframe closet doorframe 1 8 12 door door Wall door door 4
160 glass glass 1 38 7 glass otherstructure Objects n03438257 glass.n.02 misc 40
1126 baseball cap baseball cap 1 40 7 otherprop Objects cap 2954340 n02799323 baseball_cap.n.01 clothes 38
1346 battery disposal jar battery disposal jar 1 40 7 jar otherprop Objects jar 3593526 n03593526 jar.n.01 objects 39
332 mop mop 1 40 7 otherprop Objects n04367480 swab.n.02 objects 39
397 tank tank 1 40 7 otherprop Objects objects 39
643 mail tray mail tray 1 40 7 mail tray otherprop Objects objects 39
551 centerpiece centerpiece 1 40 7 centerpiece otherprop Objects n02994419 centerpiece.n.02 objects 39
1163 stick stick 1 40 7 stick otherprop Objects objects 39
1347 closet floor closet floor 1 2 5 floor floor Floor n03365592 floor.n.01 floor 2
1348 dryer sheets dryer sheets 1 40 7 otherprop Objects objects 39
803 bycicle bycicle 1 40 7 otherprop Objects misc 40
484 flower stand flower stand 1 39 6 stand otherfurniture Furniture furniture 36
1349 air mattress air mattress 1 4 1 bed bed Bed bed bed bed 2818832 n02690809 air_mattress.n.01 bed 11
1350 clip clip 1 40 7 otherprop Objects objects 39
222 side table side table 1 7 10 table table Table table table table 4379243 n04379243 table.n.02 table 5
1253 pizza boxes pizza box 1 29 7 box box Objects n02883344 box.n.01 objects 39
1351 display display 1 39 7 otherfurniture Furniture n03211117 display.n.06 misc 40
1352 postcard postcard 1 40 7 otherprop Objects objects 39
828 display sign display sign 1 40 7 sign otherprop Objects misc 40
1353 paper towel paper towel 1 40 7 paper towel otherprop Objects n03887697 paper_towel.n.01 towel 20
612 boots boot 1 40 7 shoe otherprop Objects n04199027 shoe.n.01 clothes 38
1354 tennis racket bag tennis racket bag 1 40 7 otherprop Objects objects 39
1355 air hockey table air hockey table 1 7 10 table table Table table table table 4379243 n04379243 table.n.02 table 5
1301 socks sock 1 21 7 clothes clothes Objects n04254777 sock.n.01 clothes 38
1356 food bag food bag 1 37 7 bag bag Objects objects 39
1199 clothes hangers clothes hanger 1 40 7 otherprop Objects n03057920 coat_hanger.n.01 misc 40
1357 starbucks cup starbucks cup 1 40 7 cup otherprop Objects cup cup or mug 3797390 n03797390 mug.n.04 objects 39
```
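The first row of the file above is a header; each subsequent row maps a raw ScanNet category to coarser label sets (NYU40, Eigen13, ModelNet/ShapeNet, mpcat40). Below is a minimal sketch of building a `raw_category -> nyu40id` lookup from this file; the helper is illustrative and not the repository's `scannet_utils` implementation, and the column names come from the header shown above.

```python
import csv

def load_label_map(tsv_path, from_col='raw_category', to_col='nyu40id'):
    """Build a mapping between two columns of scannetv2-labels.combined.tsv."""
    mapping = {}
    with open(tsv_path, newline='') as f:
        for row in csv.DictReader(f, delimiter='\t'):
            if row[to_col]:  # skip rows with an empty target column
                mapping[row[from_col]] = int(row[to_col])
    return mapping

label_map = load_label_map(
    'datasets/scannet/meta_data/scannetv2-labels.combined.tsv')
print(label_map.get('chair'), label_map.get('bookshelf'))  # 5, 10 per the rows above
```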
## /datasets/scannet/meta_data/scannetv2_test.txt
scene0707_00
scene0708_00
scene0709_00
scene0710_00
scene0711_00
scene0712_00
scene0713_00
scene0714_00
scene0715_00
scene0716_00
scene0717_00
scene0718_00
scene0719_00
scene0720_00
scene0721_00
scene0722_00
scene0723_00
scene0724_00
scene0725_00
scene0726_00
scene0727_00
scene0728_00
scene0729_00
scene0730_00
scene0731_00
scene0732_00
scene0733_00
scene0734_00
scene0735_00
scene0736_00
scene0737_00
scene0738_00
scene0739_00
scene0740_00
scene0741_00
scene0742_00
scene0743_00
scene0744_00
scene0745_00
scene0746_00
scene0747_00
scene0748_00
scene0749_00
scene0750_00
scene0751_00
scene0752_00
scene0753_00
scene0754_00
scene0755_00
scene0756_00
scene0757_00
scene0758_00
scene0759_00
scene0760_00
scene0761_00
scene0762_00
scene0763_00
scene0764_00
scene0765_00
scene0766_00
scene0767_00
scene0768_00
scene0769_00
scene0770_00
scene0771_00
scene0772_00
scene0773_00
scene0774_00
scene0775_00
scene0776_00
scene0777_00
scene0778_00
scene0779_00
scene0780_00
scene0781_00
scene0782_00
scene0783_00
scene0784_00
scene0785_00
scene0786_00
scene0787_00
scene0788_00
scene0789_00
scene0790_00
scene0791_00
scene0792_00
scene0793_00
scene0794_00
scene0795_00
scene0796_00
scene0797_00
scene0798_00
scene0799_00
scene0800_00
scene0801_00
scene0802_00
scene0803_00
scene0804_00
scene0805_00
scene0806_00
## /datasets/scannet/meta_data/scannetv2_train.txt
scene0000_00
scene0000_01
scene0000_02
scene0001_00
scene0001_01
scene0002_00
scene0002_01
scene0003_00
scene0003_01
scene0003_02
scene0004_00
scene0005_00
scene0005_01
scene0006_00
scene0006_01
scene0006_02
scene0007_00
scene0008_00
scene0009_00
scene0009_01
scene0009_02
scene0010_00
scene0010_01
scene0011_00
scene0011_01
scene0012_00
scene0012_01
scene0012_02
scene0013_00
scene0013_01
scene0013_02
scene0014_00
scene0015_00
scene0016_00
scene0016_01
scene0016_02
scene0017_00
scene0017_01
scene0017_02
scene0018_00
scene0019_00
scene0019_01
scene0020_00
scene0020_01
scene0021_00
scene0022_00
scene0022_01
scene0023_00
scene0024_00
scene0024_01
scene0024_02
scene0025_00
scene0025_01
scene0025_02
scene0026_00
scene0027_00
scene0027_01
scene0027_02
scene0028_00
scene0029_00
scene0029_01
scene0029_02
scene0030_00
scene0030_01
scene0030_02
scene0031_00
scene0031_01
scene0031_02
scene0032_00
scene0032_01
scene0033_00
scene0034_00
scene0034_01
scene0034_02
scene0035_00
scene0035_01
scene0036_00
scene0036_01
scene0037_00
scene0038_00
scene0038_01
scene0038_02
scene0039_00
scene0039_01
scene0040_00
scene0040_01
scene0041_00
scene0041_01
scene0042_00
scene0042_01
scene0042_02
scene0043_00
scene0043_01
scene0044_00
scene0044_01
scene0044_02
scene0045_00
scene0045_01
scene0046_00
scene0046_01
scene0046_02
scene0047_00
scene0048_00
scene0048_01
scene0049_00
scene0050_00
scene0050_01
scene0050_02
scene0051_00
scene0051_01
scene0051_02
scene0051_03
scene0052_00
scene0052_01
scene0052_02
scene0053_00
scene0054_00
scene0055_00
scene0055_01
scene0055_02
scene0056_00
scene0056_01
scene0057_00
scene0057_01
scene0058_00
scene0058_01
scene0059_00
scene0059_01
scene0059_02
scene0060_00
scene0060_01
scene0061_00
scene0061_01
scene0062_00
scene0062_01
scene0062_02
scene0063_00
scene0064_00
scene0064_01
scene0065_00
scene0065_01
scene0065_02
scene0066_00
scene0067_00
scene0067_01
scene0067_02
scene0068_00
scene0068_01
scene0069_00
scene0070_00
scene0071_00
scene0072_00
scene0072_01
scene0072_02
scene0073_00
scene0073_01
scene0073_02
scene0073_03
scene0074_00
scene0074_01
scene0074_02
scene0075_00
scene0076_00
scene0077_00
scene0077_01
scene0078_00
scene0078_01
scene0078_02
scene0079_00
scene0079_01
scene0080_00
scene0080_01
scene0080_02
scene0081_00
scene0081_01
scene0081_02
scene0082_00
scene0083_00
scene0083_01
scene0084_00
scene0084_01
scene0084_02
scene0085_00
scene0085_01
scene0086_00
scene0086_01
scene0086_02
scene0087_00
scene0087_01
scene0087_02
scene0088_00
scene0088_01
scene0088_02
scene0088_03
scene0089_00
scene0089_01
scene0089_02
scene0090_00
scene0091_00
scene0092_00
scene0092_01
scene0092_02
scene0092_03
scene0092_04
scene0093_00
scene0093_01
scene0093_02
scene0094_00
scene0095_00
scene0095_01
scene0096_00
scene0096_01
scene0096_02
scene0097_00
scene0098_00
scene0098_01
scene0099_00
scene0099_01
scene0100_00
scene0100_01
scene0100_02
scene0101_00
scene0101_01
scene0101_02
scene0101_03
scene0101_04
scene0101_05
scene0102_00
scene0102_01
scene0103_00
scene0103_01
scene0104_00
scene0105_00
scene0105_01
scene0105_02
scene0106_00
scene0106_01
scene0106_02
scene0107_00
scene0108_00
scene0109_00
scene0109_01
scene0110_00
scene0110_01
scene0110_02
scene0111_00
scene0111_01
scene0111_02
scene0112_00
scene0112_01
scene0112_02
scene0113_00
scene0113_01
scene0114_00
scene0114_01
scene0114_02
scene0115_00
scene0115_01
scene0115_02
scene0116_00
scene0116_01
scene0116_02
scene0117_00
scene0118_00
scene0118_01
scene0118_02
scene0119_00
scene0120_00
scene0120_01
scene0121_00
scene0121_01
scene0121_02
scene0122_00
scene0122_01
scene0123_00
scene0123_01
scene0123_02
scene0124_00
scene0124_01
scene0125_00
scene0126_00
scene0126_01
scene0126_02
scene0127_00
scene0127_01
scene0128_00
scene0129_00
scene0130_00
scene0131_00
scene0131_01
scene0131_02
scene0132_00
scene0132_01
scene0132_02
scene0133_00
scene0134_00
scene0134_01
scene0134_02
scene0135_00
scene0136_00
scene0136_01
scene0136_02
scene0137_00
scene0137_01
scene0137_02
scene0138_00
scene0139_00
scene0140_00
scene0140_01
scene0141_00
scene0141_01
scene0141_02
scene0142_00
scene0142_01
scene0143_00
scene0143_01
scene0143_02
scene0144_00
scene0144_01
scene0145_00
scene0146_00
scene0146_01
scene0146_02
scene0147_00
scene0147_01
scene0148_00
scene0149_00
scene0150_00
scene0150_01
scene0150_02
scene0151_00
scene0151_01
scene0152_00
scene0152_01
scene0152_02
scene0153_00
scene0153_01
scene0154_00
scene0155_00
scene0155_01
scene0155_02
scene0156_00
scene0157_00
scene0157_01
scene0158_00
scene0158_01
scene0158_02
scene0159_00
scene0160_00
scene0160_01
scene0160_02
scene0160_03
scene0160_04
scene0161_00
scene0161_01
scene0161_02
scene0162_00
scene0163_00
scene0163_01
scene0164_00
scene0164_01
scene0164_02
scene0164_03
scene0165_00
scene0165_01
scene0165_02
scene0166_00
scene0166_01
scene0166_02
scene0167_00
scene0168_00
scene0168_01
scene0168_02
scene0169_00
scene0169_01
scene0170_00
scene0170_01
scene0170_02
scene0171_00
scene0171_01
scene0172_00
scene0172_01
scene0173_00
scene0173_01
scene0173_02
scene0174_00
scene0174_01
scene0175_00
scene0176_00
scene0177_00
scene0177_01
scene0177_02
scene0178_00
scene0179_00
scene0180_00
scene0181_00
scene0181_01
scene0181_02
scene0181_03
scene0182_00
scene0182_01
scene0182_02
scene0183_00
scene0184_00
scene0185_00
scene0186_00
scene0186_01
scene0187_00
scene0187_01
scene0188_00
scene0189_00
scene0190_00
scene0191_00
scene0191_01
scene0191_02
scene0192_00
scene0192_01
scene0192_02
scene0193_00
scene0193_01
scene0194_00
scene0195_00
scene0195_01
scene0195_02
scene0196_00
scene0197_00
scene0197_01
scene0197_02
scene0198_00
scene0199_00
scene0200_00
scene0200_01
scene0200_02
scene0201_00
scene0201_01
scene0201_02
scene0202_00
scene0203_00
scene0203_01
scene0203_02
scene0204_00
scene0204_01
scene0204_02
scene0205_00
scene0205_01
scene0205_02
scene0206_00
scene0206_01
scene0206_02
scene0207_00
scene0207_01
scene0207_02
scene0208_00
scene0209_00
scene0209_01
scene0209_02
scene0210_00
scene0210_01
scene0211_00
scene0211_01
scene0211_02
scene0211_03
scene0212_00
scene0212_01
scene0212_02
scene0213_00
scene0214_00
scene0214_01
scene0214_02
scene0215_00
scene0215_01
scene0216_00
scene0217_00
scene0218_00
scene0218_01
scene0219_00
scene0220_00
scene0220_01
scene0220_02
scene0221_00
scene0221_01
scene0222_00
scene0222_01
scene0223_00
scene0223_01
scene0223_02
scene0224_00
scene0225_00
scene0226_00
scene0226_01
scene0227_00
scene0228_00
scene0229_00
scene0229_01
scene0229_02
scene0230_00
scene0231_00
scene0231_01
scene0231_02
scene0232_00
scene0232_01
scene0232_02
scene0233_00
scene0233_01
scene0234_00
scene0235_00
scene0236_00
scene0236_01
scene0237_00
scene0237_01
scene0238_00
scene0238_01
scene0239_00
scene0239_01
scene0239_02
scene0240_00
scene0241_00
scene0241_01
scene0241_02
scene0242_00
scene0242_01
scene0242_02
scene0243_00
scene0244_00
scene0244_01
scene0245_00
scene0246_00
scene0247_00
scene0247_01
scene0248_00
scene0248_01
scene0248_02
scene0249_00
scene0250_00
scene0250_01
scene0250_02
scene0251_00
scene0252_00
scene0253_00
scene0254_00
scene0254_01
scene0255_00
scene0255_01
scene0255_02
scene0256_00
scene0256_01
scene0256_02
scene0257_00
scene0258_00
scene0259_00
scene0259_01
scene0260_00
scene0260_01
scene0260_02
scene0261_00
scene0261_01
scene0261_02
scene0261_03
scene0262_00
scene0262_01
scene0263_00
scene0263_01
scene0264_00
scene0264_01
scene0264_02
scene0265_00
scene0265_01
scene0265_02
scene0266_00
scene0266_01
scene0267_00
scene0268_00
scene0268_01
scene0268_02
scene0269_00
scene0269_01
scene0269_02
scene0270_00
scene0270_01
scene0270_02
scene0271_00
scene0271_01
scene0272_00
scene0272_01
scene0273_00
scene0273_01
scene0274_00
scene0274_01
scene0274_02
scene0275_00
scene0276_00
scene0276_01
scene0277_00
scene0277_01
scene0277_02
scene0278_00
scene0278_01
scene0279_00
scene0279_01
scene0279_02
scene0280_00
scene0280_01
scene0280_02
scene0281_00
scene0282_00
scene0282_01
scene0282_02
scene0283_00
scene0284_00
scene0285_00
scene0286_00
scene0286_01
scene0286_02
scene0286_03
scene0287_00
scene0288_00
scene0288_01
scene0288_02
scene0289_00
scene0289_01
scene0290_00
scene0291_00
scene0291_01
scene0291_02
scene0292_00
scene0292_01
scene0293_00
scene0293_01
scene0294_00
scene0294_01
scene0294_02
scene0295_00
scene0295_01
scene0296_00
scene0296_01
scene0297_00
scene0297_01
scene0297_02
scene0298_00
scene0299_00
scene0299_01
scene0300_00
scene0300_01
scene0301_00
scene0301_01
scene0301_02
scene0302_00
scene0302_01
scene0303_00
scene0303_01
scene0303_02
scene0304_00
scene0305_00
scene0305_01
scene0306_00
scene0306_01
scene0307_00
scene0307_01
scene0307_02
scene0308_00
scene0309_00
scene0309_01
scene0310_00
scene0310_01
scene0310_02
scene0311_00
scene0312_00
scene0312_01
scene0312_02
scene0313_00
scene0313_01
scene0313_02
scene0314_00
scene0315_00
scene0316_00
scene0317_00
scene0317_01
scene0318_00
scene0319_00
scene0320_00
scene0320_01
scene0320_02
scene0320_03
scene0321_00
scene0322_00
scene0323_00
scene0323_01
scene0324_00
scene0324_01
scene0325_00
scene0325_01
scene0326_00
scene0327_00
scene0328_00
scene0329_00
scene0329_01
scene0329_02
scene0330_00
scene0331_00
scene0331_01
scene0332_00
scene0332_01
scene0332_02
scene0333_00
scene0334_00
scene0334_01
scene0334_02
scene0335_00
scene0335_01
scene0335_02
scene0336_00
scene0336_01
scene0337_00
scene0337_01
scene0337_02
scene0338_00
scene0338_01
scene0338_02
scene0339_00
scene0340_00
scene0340_01
scene0340_02
scene0341_00
scene0341_01
scene0342_00
scene0343_00
scene0344_00
scene0344_01
scene0345_00
scene0345_01
scene0346_00
scene0346_01
scene0347_00
scene0347_01
scene0347_02
scene0348_00
scene0348_01
scene0348_02
scene0349_00
scene0349_01
scene0350_00
scene0350_01
scene0350_02
scene0351_00
scene0351_01
scene0352_00
scene0352_01
scene0352_02
scene0353_00
scene0353_01
scene0353_02
scene0354_00
scene0355_00
scene0355_01
scene0356_00
scene0356_01
scene0356_02
scene0357_00
scene0357_01
scene0358_00
scene0358_01
scene0358_02
scene0359_00
scene0359_01
scene0360_00
scene0361_00
scene0361_01
scene0361_02
scene0362_00
scene0362_01
scene0362_02
scene0362_03
scene0363_00
scene0364_00
scene0364_01
scene0365_00
scene0365_01
scene0365_02
scene0366_00
scene0367_00
scene0367_01
scene0368_00
scene0368_01
scene0369_00
scene0369_01
scene0369_02
scene0370_00
scene0370_01
scene0370_02
scene0371_00
scene0371_01
scene0372_00
scene0373_00
scene0373_01
scene0374_00
scene0375_00
scene0375_01
scene0375_02
scene0376_00
scene0376_01
scene0376_02
scene0377_00
scene0377_01
scene0377_02
scene0378_00
scene0378_01
scene0378_02
scene0379_00
scene0380_00
scene0380_01
scene0380_02
scene0381_00
scene0381_01
scene0381_02
scene0382_00
scene0382_01
scene0383_00
scene0383_01
scene0383_02
scene0384_00
scene0385_00
scene0385_01
scene0385_02
scene0386_00
scene0387_00
scene0387_01
scene0387_02
scene0388_00
scene0388_01
scene0389_00
scene0390_00
scene0391_00
scene0392_00
scene0392_01
scene0392_02
scene0393_00
scene0393_01
scene0393_02
scene0394_00
scene0394_01
scene0395_00
scene0395_01
scene0395_02
scene0396_00
scene0396_01
scene0396_02
scene0397_00
scene0397_01
scene0398_00
scene0398_01
scene0399_00
scene0399_01
scene0400_00
scene0400_01
scene0401_00
scene0402_00
scene0403_00
scene0403_01
scene0404_00
scene0404_01
scene0404_02
scene0405_00
scene0406_00
scene0406_01
scene0406_02
scene0407_00
scene0407_01
scene0408_00
scene0408_01
scene0409_00
scene0409_01
scene0410_00
scene0410_01
scene0411_00
scene0411_01
scene0411_02
scene0412_00
scene0412_01
scene0413_00
scene0414_00
scene0415_00
scene0415_01
scene0415_02
scene0416_00
scene0416_01
scene0416_02
scene0416_03
scene0416_04
scene0417_00
scene0418_00
scene0418_01
scene0418_02
scene0419_00
scene0419_01
scene0419_02
scene0420_00
scene0420_01
scene0420_02
scene0421_00
scene0421_01
scene0421_02
scene0422_00
scene0423_00
scene0423_01
scene0423_02
scene0424_00
scene0424_01
scene0424_02
scene0425_00
scene0425_01
scene0426_00
scene0426_01
scene0426_02
scene0426_03
scene0427_00
scene0428_00
scene0428_01
scene0429_00
scene0430_00
scene0430_01
scene0431_00
scene0432_00
scene0432_01
scene0433_00
scene0434_00
scene0434_01
scene0434_02
scene0435_00
scene0435_01
scene0435_02
scene0435_03
scene0436_00
scene0437_00
scene0437_01
scene0438_00
scene0439_00
scene0439_01
scene0440_00
scene0440_01
scene0440_02
scene0441_00
scene0442_00
scene0443_00
scene0444_00
scene0444_01
scene0445_00
scene0445_01
scene0446_00
scene0446_01
scene0447_00
scene0447_01
scene0447_02
scene0448_00
scene0448_01
scene0448_02
scene0449_00
scene0449_01
scene0449_02
scene0450_00
scene0451_00
scene0451_01
scene0451_02
scene0451_03
scene0451_04
scene0451_05
scene0452_00
scene0452_01
scene0452_02
scene0453_00
scene0453_01
scene0454_00
scene0455_00
scene0456_00
scene0456_01
scene0457_00
scene0457_01
scene0457_02
scene0458_00
scene0458_01
scene0459_00
scene0459_01
scene0460_00
scene0461_00
scene0462_00
scene0463_00
scene0463_01
scene0464_00
scene0465_00
scene0465_01
scene0466_00
scene0466_01
scene0467_00
scene0468_00
scene0468_01
scene0468_02
scene0469_00
scene0469_01
scene0469_02
scene0470_00
scene0470_01
scene0471_00
scene0471_01
scene0471_02
scene0472_00
scene0472_01
scene0472_02
scene0473_00
scene0473_01
scene0474_00
scene0474_01
scene0474_02
scene0474_03
scene0474_04
scene0474_05
scene0475_00
scene0475_01
scene0475_02
scene0476_00
scene0476_01
scene0476_02
scene0477_00
scene0477_01
scene0478_00
scene0478_01
scene0479_00
scene0479_01
scene0479_02
scene0480_00
scene0480_01
scene0481_00
scene0481_01
scene0482_00
scene0482_01
scene0483_00
scene0484_00
scene0484_01
scene0485_00
scene0486_00
scene0487_00
scene0487_01
scene0488_00
scene0488_01
scene0489_00
scene0489_01
scene0489_02
scene0490_00
scene0491_00
scene0492_00
scene0492_01
scene0493_00
scene0493_01
scene0494_00
scene0495_00
scene0496_00
scene0497_00
scene0498_00
scene0498_01
scene0498_02
scene0499_00
scene0500_00
scene0500_01
scene0501_00
scene0501_01
scene0501_02
scene0502_00
scene0502_01
scene0502_02
scene0503_00
scene0504_00
scene0505_00
scene0505_01
scene0505_02
scene0505_03
scene0505_04
scene0506_00
scene0507_00
scene0508_00
scene0508_01
scene0508_02
scene0509_00
scene0509_01
scene0509_02
scene0510_00
scene0510_01
scene0510_02
scene0511_00
scene0511_01
scene0512_00
scene0513_00
scene0514_00
scene0514_01
scene0515_00
scene0515_01
scene0515_02
scene0516_00
scene0516_01
scene0517_00
scene0517_01
scene0517_02
scene0518_00
scene0519_00
scene0520_00
scene0520_01
scene0521_00
scene0522_00
scene0523_00
scene0523_01
scene0523_02
scene0524_00
scene0524_01
scene0525_00
scene0525_01
scene0525_02
scene0526_00
scene0526_01
scene0527_00
scene0528_00
scene0528_01
scene0529_00
scene0529_01
scene0529_02
scene0530_00
scene0531_00
scene0532_00
scene0532_01
scene0533_00
scene0533_01
scene0534_00
scene0534_01
scene0535_00
scene0536_00
scene0536_01
scene0536_02
scene0537_00
scene0538_00
scene0539_00
scene0539_01
scene0539_02
scene0540_00
scene0540_01
scene0540_02
scene0541_00
scene0541_01
scene0541_02
scene0542_00
scene0543_00
scene0543_01
scene0543_02
scene0544_00
scene0545_00
scene0545_01
scene0545_02
scene0546_00
scene0547_00
scene0547_01
scene0547_02
scene0548_00
scene0548_01
scene0548_02
scene0549_00
scene0549_01
scene0550_00
scene0551_00
scene0552_00
scene0552_01
scene0553_00
scene0553_01
scene0553_02
scene0554_00
scene0554_01
scene0555_00
scene0556_00
scene0556_01
scene0557_00
scene0557_01
scene0557_02
scene0558_00
scene0558_01
scene0558_02
scene0559_00
scene0559_01
scene0559_02
scene0560_00
scene0561_00
scene0561_01
scene0562_00
scene0563_00
scene0564_00
scene0565_00
scene0566_00
scene0567_00
scene0567_01
scene0568_00
scene0568_01
scene0568_02
scene0569_00
scene0569_01
scene0570_00
scene0570_01
scene0570_02
scene0571_00
scene0571_01
scene0572_00
scene0572_01
scene0572_02
scene0573_00
scene0573_01
scene0574_00
scene0574_01
scene0574_02
scene0575_00
scene0575_01
scene0575_02
scene0576_00
scene0576_01
scene0576_02
scene0577_00
scene0578_00
scene0578_01
scene0578_02
scene0579_00
scene0579_01
scene0579_02
scene0580_00
scene0580_01
scene0581_00
scene0581_01
scene0581_02
scene0582_00
scene0582_01
scene0582_02
scene0583_00
scene0583_01
scene0583_02
scene0584_00
scene0584_01
scene0584_02
scene0585_00
scene0585_01
scene0586_00
scene0586_01
scene0586_02
scene0587_00
scene0587_01
scene0587_02
scene0587_03
scene0588_00
scene0588_01
scene0588_02
scene0588_03
scene0589_00
scene0589_01
scene0589_02
scene0590_00
scene0590_01
scene0591_00
scene0591_01
scene0591_02
scene0592_00
scene0592_01
scene0593_00
scene0593_01
scene0594_00
scene0595_00
scene0596_00
scene0596_01
scene0596_02
scene0597_00
scene0597_01
scene0597_02
scene0598_00
scene0598_01
scene0598_02
scene0599_00
scene0599_01
scene0599_02
scene0600_00
scene0600_01
scene0600_02
scene0601_00
scene0601_01
scene0602_00
scene0603_00
scene0603_01
scene0604_00
scene0604_01
scene0604_02
scene0605_00
scene0605_01
scene0606_00
scene0606_01
scene0606_02
scene0607_00
scene0607_01
scene0608_00
scene0608_01
scene0608_02
scene0609_00
scene0609_01
scene0609_02
scene0609_03
scene0610_00
scene0610_01
scene0610_02
scene0611_00
scene0611_01
scene0612_00
scene0612_01
scene0613_00
scene0613_01
scene0613_02
scene0614_00
scene0614_01
scene0614_02
scene0615_00
scene0615_01
scene0616_00
scene0616_01
scene0617_00
scene0618_00
scene0619_00
scene0620_00
scene0620_01
scene0621_00
scene0622_00
scene0622_01
scene0623_00
scene0623_01
scene0624_00
scene0625_00
scene0625_01
scene0626_00
scene0626_01
scene0626_02
scene0627_00
scene0627_01
scene0628_00
scene0628_01
scene0628_02
scene0629_00
scene0629_01
scene0629_02
scene0630_00
scene0630_01
scene0630_02
scene0630_03
scene0630_04
scene0630_05
scene0630_06
scene0631_00
scene0631_01
scene0631_02
scene0632_00
scene0633_00
scene0633_01
scene0634_00
scene0635_00
scene0635_01
scene0636_00
scene0637_00
scene0638_00
scene0639_00
scene0640_00
scene0640_01
scene0640_02
scene0641_00
scene0642_00
scene0642_01
scene0642_02
scene0642_03
scene0643_00
scene0644_00
scene0645_00
scene0645_01
scene0645_02
scene0646_00
scene0646_01
scene0646_02
scene0647_00
scene0647_01
scene0648_00
scene0648_01
scene0649_00
scene0649_01
scene0650_00
scene0651_00
scene0651_01
scene0651_02
scene0652_00
scene0653_00
scene0653_01
scene0654_00
scene0654_01
scene0655_00
scene0655_01
scene0655_02
scene0656_00
scene0656_01
scene0656_02
scene0656_03
scene0657_00
scene0658_00
scene0659_00
scene0659_01
scene0660_00
scene0661_00
scene0662_00
scene0662_01
scene0662_02
scene0663_00
scene0663_01
scene0663_02
scene0664_00
scene0664_01
scene0664_02
scene0665_00
scene0665_01
scene0666_00
scene0666_01
scene0666_02
scene0667_00
scene0667_01
scene0667_02
scene0668_00
scene0669_00
scene0669_01
scene0670_00
scene0670_01
scene0671_00
scene0671_01
scene0672_00
scene0672_01
scene0673_00
scene0673_01
scene0673_02
scene0673_03
scene0673_04
scene0673_05
scene0674_00
scene0674_01
scene0675_00
scene0675_01
scene0676_00
scene0676_01
scene0677_00
scene0677_01
scene0677_02
scene0678_00
scene0678_01
scene0678_02
scene0679_00
scene0679_01
scene0680_00
scene0680_01
scene0681_00
scene0682_00
scene0683_00
scene0684_00
scene0684_01
scene0685_00
scene0685_01
scene0685_02
scene0686_00
scene0686_01
scene0686_02
scene0687_00
scene0688_00
scene0689_00
scene0690_00
scene0690_01
scene0691_00
scene0691_01
scene0692_00
scene0692_01
scene0692_02
scene0692_03
scene0692_04
scene0693_00
scene0693_01
scene0693_02
scene0694_00
scene0694_01
scene0695_00
scene0695_01
scene0695_02
scene0695_03
scene0696_00
scene0696_01
scene0696_02
scene0697_00
scene0697_01
scene0697_02
scene0697_03
scene0698_00
scene0698_01
scene0699_00
scene0700_00
scene0700_01
scene0700_02
scene0701_00
scene0701_01
scene0701_02
scene0702_00
scene0702_01
scene0702_02
scene0703_00
scene0703_01
scene0704_00
scene0704_01
scene0705_00
scene0705_01
scene0705_02
scene0706_00
## /datasets/scannet/meta_data/scannetv2_val.txt
scene0568_00
scene0568_01
scene0568_02
scene0304_00
scene0488_00
scene0488_01
scene0412_00
scene0412_01
scene0217_00
scene0019_00
scene0019_01
scene0414_00
scene0575_00
scene0575_01
scene0575_02
scene0426_00
scene0426_01
scene0426_02
scene0426_03
scene0549_00
scene0549_01
scene0578_00
scene0578_01
scene0578_02
scene0665_00
scene0665_01
scene0050_00
scene0050_01
scene0050_02
scene0257_00
scene0025_00
scene0025_01
scene0025_02
scene0583_00
scene0583_01
scene0583_02
scene0701_00
scene0701_01
scene0701_02
scene0580_00
scene0580_01
scene0565_00
scene0169_00
scene0169_01
scene0655_00
scene0655_01
scene0655_02
scene0063_00
scene0221_00
scene0221_01
scene0591_00
scene0591_01
scene0591_02
scene0678_00
scene0678_01
scene0678_02
scene0462_00
scene0427_00
scene0595_00
scene0193_00
scene0193_01
scene0164_00
scene0164_01
scene0164_02
scene0164_03
scene0598_00
scene0598_01
scene0598_02
scene0599_00
scene0599_01
scene0599_02
scene0328_00
scene0300_00
scene0300_01
scene0354_00
scene0458_00
scene0458_01
scene0423_00
scene0423_01
scene0423_02
scene0307_00
scene0307_01
scene0307_02
scene0606_00
scene0606_01
scene0606_02
scene0432_00
scene0432_01
scene0608_00
scene0608_01
scene0608_02
scene0651_00
scene0651_01
scene0651_02
scene0430_00
scene0430_01
scene0689_00
scene0357_00
scene0357_01
scene0574_00
scene0574_01
scene0574_02
scene0329_00
scene0329_01
scene0329_02
scene0153_00
scene0153_01
scene0616_00
scene0616_01
scene0671_00
scene0671_01
scene0618_00
scene0382_00
scene0382_01
scene0490_00
scene0621_00
scene0607_00
scene0607_01
scene0149_00
scene0695_00
scene0695_01
scene0695_02
scene0695_03
scene0389_00
scene0377_00
scene0377_01
scene0377_02
scene0342_00
scene0139_00
scene0629_00
scene0629_01
scene0629_02
scene0496_00
scene0633_00
scene0633_01
scene0518_00
scene0652_00
scene0406_00
scene0406_01
scene0406_02
scene0144_00
scene0144_01
scene0494_00
scene0278_00
scene0278_01
scene0316_00
scene0609_00
scene0609_01
scene0609_02
scene0609_03
scene0084_00
scene0084_01
scene0084_02
scene0696_00
scene0696_01
scene0696_02
scene0351_00
scene0351_01
scene0643_00
scene0644_00
scene0645_00
scene0645_01
scene0645_02
scene0081_00
scene0081_01
scene0081_02
scene0647_00
scene0647_01
scene0535_00
scene0353_00
scene0353_01
scene0353_02
scene0559_00
scene0559_01
scene0559_02
scene0593_00
scene0593_01
scene0246_00
scene0653_00
scene0653_01
scene0064_00
scene0064_01
scene0356_00
scene0356_01
scene0356_02
scene0030_00
scene0030_01
scene0030_02
scene0222_00
scene0222_01
scene0338_00
scene0338_01
scene0338_02
scene0378_00
scene0378_01
scene0378_02
scene0660_00
scene0553_00
scene0553_01
scene0553_02
scene0527_00
scene0663_00
scene0663_01
scene0663_02
scene0664_00
scene0664_01
scene0664_02
scene0334_00
scene0334_01
scene0334_02
scene0046_00
scene0046_01
scene0046_02
scene0203_00
scene0203_01
scene0203_02
scene0088_00
scene0088_01
scene0088_02
scene0088_03
scene0086_00
scene0086_01
scene0086_02
scene0670_00
scene0670_01
scene0256_00
scene0256_01
scene0256_02
scene0249_00
scene0441_00
scene0658_00
scene0704_00
scene0704_01
scene0187_00
scene0187_01
scene0131_00
scene0131_01
scene0131_02
scene0207_00
scene0207_01
scene0207_02
scene0461_00
scene0011_00
scene0011_01
scene0343_00
scene0251_00
scene0077_00
scene0077_01
scene0684_00
scene0684_01
scene0550_00
scene0686_00
scene0686_01
scene0686_02
scene0208_00
scene0500_00
scene0500_01
scene0552_00
scene0552_01
scene0648_00
scene0648_01
scene0435_00
scene0435_01
scene0435_02
scene0435_03
scene0690_00
scene0690_01
scene0693_00
scene0693_01
scene0693_02
scene0700_00
scene0700_01
scene0700_02
scene0699_00
scene0231_00
scene0231_01
scene0231_02
scene0697_00
scene0697_01
scene0697_02
scene0697_03
scene0474_00
scene0474_01
scene0474_02
scene0474_03
scene0474_04
scene0474_05
scene0355_00
scene0355_01
scene0146_00
scene0146_01
scene0146_02
scene0196_00
scene0702_00
scene0702_01
scene0702_02
scene0314_00
scene0277_00
scene0277_01
scene0277_02
scene0095_00
scene0095_01
scene0015_00
scene0100_00
scene0100_01
scene0100_02
scene0558_00
scene0558_01
scene0558_02
scene0685_00
scene0685_01
scene0685_02
## /datasets/scannet/reader.py
```py path="/datasets/scannet/reader.py"
import argparse
from concurrent.futures import process
import os, sys
from tqdm import tqdm
from multiprocessing.pool import Pool
from functools import partial
from multiprocessing import Manager
from SensorData import SensorData

# params
parser = argparse.ArgumentParser()
# data paths
parser.add_argument('--scans_folder', default='/root/paddlejob/workspace/env_run/zhouzhen05/Data/zhouzhen05/Data')
parser.add_argument('--scan_list_file', default='/root/paddlejob/workspace/env_run/zhouzhen05/Code/DRecon/datasets/scannet/meta_data/scannetv2_train.txt')
parser.add_argument('--single_debug_scan_id', required=False, default=None, help='single scan to debug')
parser.add_argument('--output_path', default='/root/paddlejob/workspace/env_run/zhouzhen05/Data/zhouzhen05/Data/Scannet_Reader/scans')
parser.add_argument('--export_depth_images', default=True)
parser.add_argument('--export_color_images', default=True)
parser.add_argument('--export_poses', default=True)
parser.add_argument('--export_intrinsics', default=True)
parser.add_argument('--num_workers', type=int, default=1)
parser.add_argument('--rgb_resize', nargs='+', type=int, default=None, help='width height')
parser.add_argument('--depth_resize', nargs='+', type=int, default=None, help='width height')
parser.set_defaults(export_depth_images=False, export_color_images=False, export_poses=False, export_intrinsics=False)
opt = parser.parse_args()
print(opt)


def process_scan(opt, scan_job, count=None, progress=None):
    filename = scan_job[0]
    output_path = scan_job[1]
    scan_name = scan_job[2]

    if not os.path.exists(output_path):
        os.makedirs(output_path)

    # load the data
    sys.stdout.write('loading %s...' % opt.scans_folder)
    sd = SensorData(filename)
    sys.stdout.write('loaded!\n')

    opt.export_depth_images, opt.export_color_images, opt.export_poses, opt.export_intrinsics = True, True, True, True
    if opt.export_depth_images:
        sd.export_depth_images(os.path.join(output_path, 'depth'), image_size=opt.depth_resize)
    if opt.export_color_images:
        sd.export_color_images(os.path.join(output_path, 'color'), image_size=opt.rgb_resize)
    if opt.export_poses:
        sd.export_poses(os.path.join(output_path, 'pose'))
    if opt.export_intrinsics:
        sd.export_intrinsics(output_path, scan_name)

    if progress is not None:
        progress.value += 1
        print(f"Completed scan {filename}, {progress.value} of total {count}.")


def main():
    if opt.single_debug_scan_id is not None:
        scans = [opt.single_debug_scan_id]
    else:
        f = open(opt.scan_list_file, "r")
        scans = f.readlines()
        scans = [scan.strip() for scan in scans]

    # input_files = [os.path.join(opt.scans_folder, f"{scan}/{scan}.sens") for
    # scan in scans]
    input_files = [os.path.join(opt.scans_folder, f"{scan}.sens") for scan in scans]
    output_dirs = [os.path.join(opt.output_path, scan) for scan in scans]
    scan_jobs = list(zip(input_files, output_dirs, scans))

    if opt.num_workers == 1:
        for scan_job in tqdm(scan_jobs):
            process_scan(opt, scan_job)
    else:
        pool = Pool(opt.num_workers)
        manager = Manager()
        count = len(scan_jobs)
        progress = manager.Value('i', 0)
        pool.map(
            partial(
                process_scan,
                opt,
                count=count,
                progress=progress
            ),
            scan_jobs,
        )


if __name__ == '__main__':
    main()
```
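For a single `.sens` file, the same export steps can be driven directly through `SensorData` without going through the argument parser. Below is a minimal sketch (not part of the repo); the paths and the scan name are placeholders, and the export methods are exactly the ones `reader.py` calls above:

```py
# Minimal sketch: export depth, color, poses, and intrinsics for one scan.
# Paths below are hypothetical; SensorData comes from the repo's SensorData.py.
import os
from SensorData import SensorData

sens_file = '/path/to/scans/scene0000_00.sens'    # placeholder
output_path = '/path/to/output/scene0000_00'      # placeholder
os.makedirs(output_path, exist_ok=True)

sd = SensorData(sens_file)
sd.export_depth_images(os.path.join(output_path, 'depth'), image_size=None)
sd.export_color_images(os.path.join(output_path, 'color'), image_size=None)
sd.export_poses(os.path.join(output_path, 'pose'))
sd.export_intrinsics(output_path, 'scene0000_00')
```

For many scans, `reader.py` itself parallelizes this loop over `--num_workers` processes via `multiprocessing.Pool`.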
## /datasets/scannet/scannet_utils.py
```py path="/datasets/scannet/scannet_utils.py"
"""Ref: https://github.com/ScanNet/ScanNet/blob/master/BenchmarkScripts
"""
import csv
import os
import numpy as np
from plyfile import PlyData
def represents_int(s):
"""Judge whether string s represents an int.
Args:
s(str): The input string to be judged.
Returns:
bool: Whether s represents int or not.
"""
try:
int(s)
return True
except ValueError:
return False
def read_label_mapping(filename,
label_from='raw_category',
label_to='nyu40id'):
assert os.path.isfile(filename)
mapping = dict()
with open(filename) as csvfile:
reader = csv.DictReader(csvfile, delimiter='\t')
for row in reader:
mapping[row[label_from]] = int(row[label_to])
if represents_int(list(mapping.keys())[0]):
mapping = {int(k): v for k, v in mapping.items()}
return mapping
def read_mesh_vertices(filename):
"""Read XYZ for each vertex.
Args:
filename(str): The name of the mesh vertices file.
Returns:
ndarray: Vertices.
"""
assert os.path.isfile(filename)
with open(filename, 'rb') as f:
plydata = PlyData.read(f)
num_verts = plydata['vertex'].count
vertices = np.zeros(shape=[num_verts, 3], dtype=np.float32)
vertices[:, 0] = plydata['vertex'].data['x']
vertices[:, 1] = plydata['vertex'].data['y']
vertices[:, 2] = plydata['vertex'].data['z']
return vertices
def read_mesh_vertices_rgb(filename):
"""Read XYZ and RGB for each vertex.
Args:
filename(str): The name of the mesh vertices file.
Returns:
Vertices. Note that RGB values are in 0-255.
"""
assert os.path.isfile(filename)
with open(filename, 'rb') as f:
plydata = PlyData.read(f)
num_verts = plydata['vertex'].count
vertices = np.zeros(shape=[num_verts, 6], dtype=np.float32)
vertices[:, 0] = plydata['vertex'].data['x']
vertices[:, 1] = plydata['vertex'].data['y']
vertices[:, 2] = plydata['vertex'].data['z']
vertices[:, 3] = plydata['vertex'].data['red']
vertices[:, 4] = plydata['vertex'].data['green']
vertices[:, 5] = plydata['vertex'].data['blue']
return vertices
```
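A short sketch of how these helpers fit together, assuming the label TSV from `meta_data/` and a reconstructed scene mesh are available locally (both paths are placeholders, not part of the repo):

```py
# Minimal sketch: map raw ScanNet categories to NYU40 ids and load colored vertices.
from scannet_utils import read_label_mapping, read_mesh_vertices_rgb

label_map = read_label_mapping(
    'meta_data/scannetv2-labels.combined.tsv',   # shipped with the repo
    label_from='raw_category', label_to='nyu40id')
print(label_map.get('chair'))   # NYU40 id for the raw category 'chair'

vertices = read_mesh_vertices_rgb('/path/to/scene0000_00_vh_clean_2.ply')  # placeholder mesh path
print(vertices.shape)           # (num_verts, 6): x, y, z, r, g, b with RGB in 0-255
```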
## /datasets/scannet/scannetv2_test.txt
scene0707_00
scene0708_00
scene0709_00
scene0710_00
scene0711_00
scene0712_00
scene0713_00
scene0714_00
scene0715_00
scene0716_00
scene0717_00
scene0718_00
scene0719_00
scene0720_00
scene0721_00
scene0722_00
scene0723_00
scene0724_00
scene0725_00
scene0726_00
scene0727_00
scene0728_00
scene0729_00
scene0730_00
scene0731_00
scene0732_00
scene0733_00
scene0734_00
scene0735_00
scene0736_00
scene0737_00
scene0738_00
scene0739_00
scene0740_00
scene0741_00
scene0742_00
scene0743_00
scene0744_00
scene0745_00
scene0746_00
scene0747_00
scene0748_00
scene0749_00
scene0750_00
scene0751_00
scene0752_00
scene0753_00
scene0754_00
scene0755_00
scene0756_00
scene0757_00
scene0758_00
scene0759_00
scene0760_00
scene0761_00
scene0762_00
scene0763_00
scene0764_00
scene0765_00
scene0766_00
scene0767_00
scene0768_00
scene0769_00
scene0770_00
scene0771_00
scene0772_00
scene0773_00
scene0774_00
scene0775_00
scene0776_00
scene0777_00
scene0778_00
scene0779_00
scene0780_00
scene0781_00
scene0782_00
scene0783_00
scene0784_00
scene0785_00
scene0786_00
scene0787_00
scene0788_00
scene0789_00
scene0790_00
scene0791_00
scene0792_00
scene0793_00
scene0794_00
scene0795_00
scene0796_00
scene0797_00
scene0798_00
scene0799_00
scene0800_00
scene0801_00
scene0802_00
scene0803_00
scene0804_00
scene0805_00
scene0806_00
## /datasets/scannet/scannetv2_train.txt
scene0191_00
scene0191_01
scene0191_02
scene0119_00
scene0230_00
scene0528_00
scene0528_01
scene0705_00
scene0705_01
scene0705_02
scene0415_00
scene0415_01
scene0415_02
scene0007_00
scene0141_00
scene0141_01
scene0141_02
scene0515_00
scene0515_01
scene0515_02
scene0447_00
scene0447_01
scene0447_02
scene0531_00
scene0503_00
scene0285_00
scene0069_00
scene0584_00
scene0584_01
scene0584_02
scene0581_00
scene0581_01
scene0581_02
scene0620_00
scene0620_01
scene0263_00
scene0263_01
scene0481_00
scene0481_01
scene0020_00
scene0020_01
scene0291_00
scene0291_01
scene0291_02
scene0469_00
scene0469_01
scene0469_02
scene0659_00
scene0659_01
scene0024_00
scene0024_01
scene0024_02
scene0564_00
scene0117_00
scene0027_00
scene0027_01
scene0027_02
scene0028_00
scene0330_00
scene0418_00
scene0418_01
scene0418_02
scene0233_00
scene0233_01
scene0673_00
scene0673_01
scene0673_02
scene0673_03
scene0673_04
scene0673_05
scene0585_00
scene0585_01
scene0362_00
scene0362_01
scene0362_02
scene0362_03
scene0035_00
scene0035_01
scene0358_00
scene0358_01
scene0358_02
scene0037_00
scene0194_00
scene0321_00
scene0293_00
scene0293_01
scene0623_00
scene0623_01
scene0592_00
scene0592_01
scene0569_00
scene0569_01
scene0413_00
scene0313_00
scene0313_01
scene0313_02
scene0480_00
scene0480_01
scene0401_00
scene0517_00
scene0517_01
scene0517_02
scene0032_00
scene0032_01
scene0613_00
scene0613_01
scene0613_02
scene0306_00
scene0306_01
scene0052_00
scene0052_01
scene0052_02
scene0053_00
scene0444_00
scene0444_01
scene0055_00
scene0055_01
scene0055_02
scene0560_00
scene0589_00
scene0589_01
scene0589_02
scene0610_00
scene0610_01
scene0610_02
scene0364_00
scene0364_01
scene0383_00
scene0383_01
scene0383_02
scene0006_00
scene0006_01
scene0006_02
scene0275_00
scene0451_00
scene0451_01
scene0451_02
scene0451_03
scene0451_04
scene0451_05
scene0135_00
scene0065_00
scene0065_01
scene0065_02
scene0104_00
scene0674_00
scene0674_01
scene0448_00
scene0448_01
scene0448_02
scene0502_00
scene0502_01
scene0502_02
scene0440_00
scene0440_01
scene0440_02
scene0071_00
scene0072_00
scene0072_01
scene0072_02
scene0509_00
scene0509_01
scene0509_02
scene0649_00
scene0649_01
scene0602_00
scene0694_00
scene0694_01
scene0101_00
scene0101_01
scene0101_02
scene0101_03
scene0101_04
scene0101_05
scene0218_00
scene0218_01
scene0579_00
scene0579_01
scene0579_02
scene0039_00
scene0039_01
scene0493_00
scene0493_01
scene0242_00
scene0242_01
scene0242_02
scene0083_00
scene0083_01
scene0127_00
scene0127_01
scene0662_00
scene0662_01
scene0662_02
scene0018_00
scene0087_00
scene0087_01
scene0087_02
scene0332_00
scene0332_01
scene0332_02
scene0628_00
scene0628_01
scene0628_02
scene0134_00
scene0134_01
scene0134_02
scene0238_00
scene0238_01
scene0092_00
scene0092_01
scene0092_02
scene0092_03
scene0092_04
scene0022_00
scene0022_01
scene0467_00
scene0392_00
scene0392_01
scene0392_02
scene0424_00
scene0424_01
scene0424_02
scene0646_00
scene0646_01
scene0646_02
scene0098_00
scene0098_01
scene0044_00
scene0044_01
scene0044_02
scene0510_00
scene0510_01
scene0510_02
scene0571_00
scene0571_01
scene0166_00
scene0166_01
scene0166_02
scene0563_00
scene0172_00
scene0172_01
scene0388_00
scene0388_01
scene0215_00
scene0215_01
scene0252_00
scene0287_00
scene0668_00
scene0572_00
scene0572_01
scene0572_02
scene0026_00
scene0224_00
scene0113_00
scene0113_01
scene0551_00
scene0381_00
scene0381_01
scene0381_02
scene0371_00
scene0371_01
scene0460_00
scene0118_00
scene0118_01
scene0118_02
scene0417_00
scene0008_00
scene0634_00
scene0521_00
scene0123_00
scene0123_01
scene0123_02
scene0045_00
scene0045_01
scene0511_00
scene0511_01
scene0114_00
scene0114_01
scene0114_02
scene0070_00
scene0029_00
scene0029_01
scene0029_02
scene0129_00
scene0103_00
scene0103_01
scene0002_00
scene0002_01
scene0132_00
scene0132_01
scene0132_02
scene0124_00
scene0124_01
scene0143_00
scene0143_01
scene0143_02
scene0604_00
scene0604_01
scene0604_02
scene0507_00
scene0105_00
scene0105_01
scene0105_02
scene0428_00
scene0428_01
scene0311_00
scene0140_00
scene0140_01
scene0182_00
scene0182_01
scene0182_02
scene0142_00
scene0142_01
scene0399_00
scene0399_01
scene0012_00
scene0012_01
scene0012_02
scene0060_00
scene0060_01
scene0370_00
scene0370_01
scene0370_02
scene0310_00
scene0310_01
scene0310_02
scene0661_00
scene0650_00
scene0152_00
scene0152_01
scene0152_02
scene0158_00
scene0158_01
scene0158_02
scene0482_00
scene0482_01
scene0600_00
scene0600_01
scene0600_02
scene0393_00
scene0393_01
scene0393_02
scene0562_00
scene0174_00
scene0174_01
scene0157_00
scene0157_01
scene0161_00
scene0161_01
scene0161_02
scene0159_00
scene0254_00
scene0254_01
scene0115_00
scene0115_01
scene0115_02
scene0162_00
scene0163_00
scene0163_01
scene0523_00
scene0523_01
scene0523_02
scene0459_00
scene0459_01
scene0175_00
scene0085_00
scene0085_01
scene0279_00
scene0279_01
scene0279_02
scene0201_00
scene0201_01
scene0201_02
scene0283_00
scene0456_00
scene0456_01
scene0429_00
scene0043_00
scene0043_01
scene0419_00
scene0419_01
scene0419_02
scene0368_00
scene0368_01
scene0348_00
scene0348_01
scene0348_02
scene0442_00
scene0178_00
scene0380_00
scene0380_01
scene0380_02
scene0165_00
scene0165_01
scene0165_02
scene0181_00
scene0181_01
scene0181_02
scene0181_03
scene0333_00
scene0614_00
scene0614_01
scene0614_02
scene0404_00
scene0404_01
scene0404_02
scene0185_00
scene0126_00
scene0126_01
scene0126_02
scene0519_00
scene0236_00
scene0236_01
scene0189_00
scene0075_00
scene0267_00
scene0192_00
scene0192_01
scene0192_02
scene0281_00
scene0420_00
scene0420_01
scene0420_02
scene0195_00
scene0195_01
scene0195_02
scene0597_00
scene0597_01
scene0597_02
scene0041_00
scene0041_01
scene0111_00
scene0111_01
scene0111_02
scene0666_00
scene0666_01
scene0666_02
scene0200_00
scene0200_01
scene0200_02
scene0536_00
scene0536_01
scene0536_02
scene0390_00
scene0280_00
scene0280_01
scene0280_02
scene0344_00
scene0344_01
scene0205_00
scene0205_01
scene0205_02
scene0484_00
scene0484_01
scene0009_00
scene0009_01
scene0009_02
scene0302_00
scene0302_01
scene0209_00
scene0209_01
scene0209_02
scene0210_00
scene0210_01
scene0395_00
scene0395_01
scene0395_02
scene0683_00
scene0601_00
scene0601_01
scene0214_00
scene0214_01
scene0214_02
scene0477_00
scene0477_01
scene0439_00
scene0439_01
scene0468_00
scene0468_01
scene0468_02
scene0546_00
scene0466_00
scene0466_01
scene0220_00
scene0220_01
scene0220_02
scene0122_00
scene0122_01
scene0130_00
scene0110_00
scene0110_01
scene0110_02
scene0327_00
scene0156_00
scene0266_00
scene0266_01
scene0001_00
scene0001_01
scene0228_00
scene0199_00
scene0219_00
scene0464_00
scene0232_00
scene0232_01
scene0232_02
scene0299_00
scene0299_01
scene0530_00
scene0363_00
scene0453_00
scene0453_01
scene0570_00
scene0570_01
scene0570_02
scene0183_00
scene0239_00
scene0239_01
scene0239_02
scene0373_00
scene0373_01
scene0241_00
scene0241_01
scene0241_02
scene0188_00
scene0622_00
scene0622_01
scene0244_00
scene0244_01
scene0691_00
scene0691_01
scene0206_00
scene0206_01
scene0206_02
scene0247_00
scene0247_01
scene0061_00
scene0061_01
scene0082_00
scene0250_00
scene0250_01
scene0250_02
scene0501_00
scene0501_01
scene0501_02
scene0320_00
scene0320_01
scene0320_02
scene0320_03
scene0631_00
scene0631_01
scene0631_02
scene0255_00
scene0255_01
scene0255_02
scene0047_00
scene0265_00
scene0265_01
scene0265_02
scene0004_00
scene0336_00
scene0336_01
scene0058_00
scene0058_01
scene0260_00
scene0260_01
scene0260_02
scene0243_00
scene0603_00
scene0603_01
scene0093_00
scene0093_01
scene0093_02
scene0109_00
scene0109_01
scene0434_00
scene0434_01
scene0434_02
scene0290_00
scene0627_00
scene0627_01
scene0470_00
scene0470_01
scene0137_00
scene0137_01
scene0137_02
scene0270_00
scene0270_01
scene0270_02
scene0271_00
scene0271_01
scene0504_00
scene0274_00
scene0274_01
scene0274_02
scene0036_00
scene0036_01
scene0276_00
scene0276_01
scene0272_00
scene0272_01
scene0499_00
scene0698_00
scene0698_01
scene0051_00
scene0051_01
scene0051_02
scene0051_03
scene0108_00
scene0245_00
scene0369_00
scene0369_01
scene0369_02
scene0284_00
scene0289_00
scene0289_01
scene0286_00
scene0286_01
scene0286_02
scene0286_03
scene0031_00
scene0031_01
scene0031_02
scene0545_00
scene0545_01
scene0545_02
scene0557_00
scene0557_01
scene0557_02
scene0533_00
scene0533_01
scene0116_00
scene0116_01
scene0116_02
scene0611_00
scene0611_01
scene0688_00
scene0294_00
scene0294_01
scene0294_02
scene0295_00
scene0295_01
scene0296_00
scene0296_01
scene0596_00
scene0596_01
scene0596_02
scene0532_00
scene0532_01
scene0637_00
scene0638_00
scene0121_00
scene0121_01
scene0121_02
scene0040_00
scene0040_01
scene0197_00
scene0197_01
scene0197_02
scene0410_00
scene0410_01
scene0305_00
scene0305_01
scene0615_00
scene0615_01
scene0703_00
scene0703_01
scene0555_00
scene0297_00
scene0297_01
scene0297_02
scene0582_00
scene0582_01
scene0582_02
scene0023_00
scene0094_00
scene0013_00
scene0013_01
scene0013_02
scene0136_00
scene0136_01
scene0136_02
scene0407_00
scene0407_01
scene0062_00
scene0062_01
scene0062_02
scene0386_00
scene0318_00
scene0554_00
scene0554_01
scene0497_00
scene0213_00
scene0258_00
scene0323_00
scene0323_01
scene0324_00
scene0324_01
scene0016_00
scene0016_01
scene0016_02
scene0681_00
scene0398_00
scene0398_01
scene0227_00
scene0090_00
scene0066_00
scene0262_00
scene0262_01
scene0155_00
scene0155_01
scene0155_02
scene0352_00
scene0352_01
scene0352_02
scene0038_00
scene0038_01
scene0038_02
scene0335_00
scene0335_01
scene0335_02
scene0261_00
scene0261_01
scene0261_02
scene0261_03
scene0640_00
scene0640_01
scene0640_02
scene0080_00
scene0080_01
scene0080_02
scene0403_00
scene0403_01
scene0282_00
scene0282_01
scene0282_02
scene0682_00
scene0173_00
scene0173_01
scene0173_02
scene0522_00
scene0687_00
scene0345_00
scene0345_01
scene0612_00
scene0612_01
scene0411_00
scene0411_01
scene0411_02
scene0625_00
scene0625_01
scene0211_00
scene0211_01
scene0211_02
scene0211_03
scene0676_00
scene0676_01
scene0179_00
scene0498_00
scene0498_01
scene0498_02
scene0547_00
scene0547_01
scene0547_02
scene0269_00
scene0269_01
scene0269_02
scene0366_00
scene0680_00
scene0680_01
scene0588_00
scene0588_01
scene0588_02
scene0588_03
scene0346_00
scene0346_01
scene0359_00
scene0359_01
scene0014_00
scene0120_00
scene0120_01
scene0212_00
scene0212_01
scene0212_02
scene0176_00
scene0049_00
scene0259_00
scene0259_01
scene0586_00
scene0586_01
scene0586_02
scene0309_00
scene0309_01
scene0125_00
scene0455_00
scene0177_00
scene0177_01
scene0177_02
scene0326_00
scene0372_00
scene0171_00
scene0171_01
scene0374_00
scene0654_00
scene0654_01
scene0445_00
scene0445_01
scene0475_00
scene0475_01
scene0475_02
scene0349_00
scene0349_01
scene0234_00
scene0669_00
scene0669_01
scene0375_00
scene0375_01
scene0375_02
scene0387_00
scene0387_01
scene0387_02
scene0312_00
scene0312_01
scene0312_02
scene0384_00
scene0385_00
scene0385_01
scene0385_02
scene0000_00
scene0000_01
scene0000_02
scene0376_00
scene0376_01
scene0376_02
scene0301_00
scene0301_01
scene0301_02
scene0322_00
scene0542_00
scene0079_00
scene0079_01
scene0099_00
scene0099_01
scene0476_00
scene0476_01
scene0476_02
scene0394_00
scene0394_01
scene0147_00
scene0147_01
scene0067_00
scene0067_01
scene0067_02
scene0397_00
scene0397_01
scene0337_00
scene0337_01
scene0337_02
scene0431_00
scene0223_00
scene0223_01
scene0223_02
scene0010_00
scene0010_01
scene0402_00
scene0268_00
scene0268_01
scene0268_02
scene0679_00
scene0679_01
scene0405_00
scene0128_00
scene0408_00
scene0408_01
scene0190_00
scene0107_00
scene0076_00
scene0167_00
scene0361_00
scene0361_01
scene0361_02
scene0216_00
scene0202_00
scene0303_00
scene0303_01
scene0303_02
scene0446_00
scene0446_01
scene0089_00
scene0089_01
scene0089_02
scene0360_00
scene0150_00
scene0150_01
scene0150_02
scene0421_00
scene0421_01
scene0421_02
scene0454_00
scene0626_00
scene0626_01
scene0626_02
scene0186_00
scene0186_01
scene0538_00
scene0479_00
scene0479_01
scene0479_02
scene0656_00
scene0656_01
scene0656_02
scene0656_03
scene0525_00
scene0525_01
scene0525_02
scene0308_00
scene0396_00
scene0396_01
scene0396_02
scene0624_00
scene0292_00
scene0292_01
scene0632_00
scene0253_00
scene0021_00
scene0325_00
scene0325_01
scene0437_00
scene0437_01
scene0438_00
scene0590_00
scene0590_01
scene0400_00
scene0400_01
scene0541_00
scene0541_01
scene0541_02
scene0677_00
scene0677_01
scene0677_02
scene0443_00
scene0315_00
scene0288_00
scene0288_01
scene0288_02
scene0422_00
scene0672_00
scene0672_01
scene0184_00
scene0449_00
scene0449_01
scene0449_02
scene0048_00
scene0048_01
scene0138_00
scene0452_00
scene0452_01
scene0452_02
scene0667_00
scene0667_01
scene0667_02
scene0463_00
scene0463_01
scene0078_00
scene0078_01
scene0078_02
scene0636_00
scene0457_00
scene0457_01
scene0457_02
scene0465_00
scene0465_01
scene0577_00
scene0151_00
scene0151_01
scene0339_00
scene0573_00
scene0573_01
scene0154_00
scene0096_00
scene0096_01
scene0096_02
scene0235_00
scene0168_00
scene0168_01
scene0168_02
scene0594_00
scene0587_00
scene0587_01
scene0587_02
scene0587_03
scene0229_00
scene0229_01
scene0229_02
scene0512_00
scene0106_00
scene0106_01
scene0106_02
scene0472_00
scene0472_01
scene0472_02
scene0489_00
scene0489_01
scene0489_02
scene0425_00
scene0425_01
scene0641_00
scene0526_00
scene0526_01
scene0317_00
scene0317_01
scene0544_00
scene0017_00
scene0017_01
scene0017_02
scene0042_00
scene0042_01
scene0042_02
scene0576_00
scene0576_01
scene0576_02
scene0347_00
scene0347_01
scene0347_02
scene0436_00
scene0226_00
scene0226_01
scene0485_00
scene0486_00
scene0487_00
scene0487_01
scene0619_00
scene0097_00
scene0367_00
scene0367_01
scene0491_00
scene0492_00
scene0492_01
scene0005_00
scene0005_01
scene0543_00
scene0543_01
scene0543_02
scene0657_00
scene0341_00
scene0341_01
scene0534_00
scene0534_01
scene0319_00
scene0273_00
scene0273_01
scene0225_00
scene0198_00
scene0003_00
scene0003_01
scene0003_02
scene0409_00
scene0409_01
scene0331_00
scene0331_01
scene0505_00
scene0505_01
scene0505_02
scene0505_03
scene0505_04
scene0506_00
scene0057_00
scene0057_01
scene0074_00
scene0074_01
scene0074_02
scene0091_00
scene0112_00
scene0112_01
scene0112_02
scene0240_00
scene0102_00
scene0102_01
scene0513_00
scene0514_00
scene0514_01
scene0537_00
scene0516_00
scene0516_01
scene0495_00
scene0617_00
scene0133_00
scene0520_00
scene0520_01
scene0635_00
scene0635_01
scene0054_00
scene0473_00
scene0473_01
scene0524_00
scene0524_01
scene0379_00
scene0471_00
scene0471_01
scene0471_02
scene0566_00
scene0248_00
scene0248_01
scene0248_02
scene0529_00
scene0529_01
scene0529_02
scene0391_00
scene0264_00
scene0264_01
scene0264_02
scene0675_00
scene0675_01
scene0350_00
scene0350_01
scene0350_02
scene0450_00
scene0068_00
scene0068_01
scene0237_00
scene0237_01
scene0365_00
scene0365_01
scene0365_02
scene0605_00
scene0605_01
scene0539_00
scene0539_01
scene0539_02
scene0540_00
scene0540_01
scene0540_02
scene0170_00
scene0170_01
scene0170_02
scene0433_00
scene0340_00
scene0340_01
scene0340_02
scene0160_00
scene0160_01
scene0160_02
scene0160_03
scene0160_04
scene0059_00
scene0059_01
scene0059_02
scene0056_00
scene0056_01
scene0478_00
scene0478_01
scene0548_00
scene0548_01
scene0548_02
scene0204_00
scene0204_01
scene0204_02
scene0033_00
scene0145_00
scene0483_00
scene0508_00
scene0508_01
scene0508_02
scene0180_00
scene0148_00
scene0556_00
scene0556_01
scene0416_00
scene0416_01
scene0416_02
scene0416_03
scene0416_04
scene0073_00
scene0073_01
scene0073_02
scene0073_03
scene0034_00
scene0034_01
scene0034_02
scene0639_00
scene0561_00
scene0561_01
scene0298_00
scene0692_00
scene0692_01
scene0692_02
scene0692_03
scene0692_04
scene0642_00
scene0642_01
scene0642_02
scene0642_03
scene0630_00
scene0630_01
scene0630_02
scene0630_03
scene0630_04
scene0630_05
scene0630_06
scene0706_00
scene0567_00
scene0567_01
## /datasets/scannet/scannetv2_val.txt
scene0568_00
scene0568_01
scene0568_02
scene0304_00
scene0488_00
scene0488_01
scene0412_00
scene0412_01
scene0217_00
scene0019_00
scene0019_01
scene0414_00
scene0575_00
scene0575_01
scene0575_02
scene0426_00
scene0426_01
scene0426_02
scene0426_03
scene0549_00
scene0549_01
scene0578_00
scene0578_01
scene0578_02
scene0665_00
scene0665_01
scene0050_00
scene0050_01
scene0050_02
scene0257_00
scene0025_00
scene0025_01
scene0025_02
scene0583_00
scene0583_01
scene0583_02
scene0701_00
scene0701_01
scene0701_02
scene0580_00
scene0580_01
scene0565_00
scene0169_00
scene0169_01
scene0655_00
scene0655_01
scene0655_02
scene0063_00
scene0221_00
scene0221_01
scene0591_00
scene0591_01
scene0591_02
scene0678_00
scene0678_01
scene0678_02
scene0462_00
scene0427_00
scene0595_00
scene0193_00
scene0193_01
scene0164_00
scene0164_01
scene0164_02
scene0164_03
scene0598_00
scene0598_01
scene0598_02
scene0599_00
scene0599_01
scene0599_02
scene0328_00
scene0300_00
scene0300_01
scene0354_00
scene0458_00
scene0458_01
scene0423_00
scene0423_01
scene0423_02
scene0307_00
scene0307_01
scene0307_02
scene0606_00
scene0606_01
scene0606_02
scene0432_00
scene0432_01
scene0608_00
scene0608_01
scene0608_02
scene0651_00
scene0651_01
scene0651_02
scene0430_00
scene0430_01
scene0689_00
scene0357_00
scene0357_01
scene0574_00
scene0574_01
scene0574_02
scene0329_00
scene0329_01
scene0329_02
scene0153_00
scene0153_01
scene0616_00
scene0616_01
scene0671_00
scene0671_01
scene0618_00
scene0382_00
scene0382_01
scene0490_00
scene0621_00
scene0607_00
scene0607_01
scene0149_00
scene0695_00
scene0695_01
scene0695_02
scene0695_03
scene0389_00
scene0377_00
scene0377_01
scene0377_02
scene0342_00
scene0139_00
scene0629_00
scene0629_01
scene0629_02
scene0496_00
scene0633_00
scene0633_01
scene0518_00
scene0652_00
scene0406_00
scene0406_01
scene0406_02
scene0144_00
scene0144_01
scene0494_00
scene0278_00
scene0278_01
scene0316_00
scene0609_00
scene0609_01
scene0609_02
scene0609_03
scene0084_00
scene0084_01
scene0084_02
scene0696_00
scene0696_01
scene0696_02
scene0351_00
scene0351_01
scene0643_00
scene0644_00
scene0645_00
scene0645_01
scene0645_02
scene0081_00
scene0081_01
scene0081_02
scene0647_00
scene0647_01
scene0535_00
scene0353_00
scene0353_01
scene0353_02
scene0559_00
scene0559_01
scene0559_02
scene0593_00
scene0593_01
scene0246_00
scene0653_00
scene0653_01
scene0064_00
scene0064_01
scene0356_00
scene0356_01
scene0356_02
scene0030_00
scene0030_01
scene0030_02
scene0222_00
scene0222_01
scene0338_00
scene0338_01
scene0338_02
scene0378_00
scene0378_01
scene0378_02
scene0660_00
scene0553_00
scene0553_01
scene0553_02
scene0527_00
scene0663_00
scene0663_01
scene0663_02
scene0664_00
scene0664_01
scene0664_02
scene0334_00
scene0334_01
scene0334_02
scene0046_00
scene0046_01
scene0046_02
scene0203_00
scene0203_01
scene0203_02
scene0088_00
scene0088_01
scene0088_02
scene0088_03
scene0086_00
scene0086_01
scene0086_02
scene0670_00
scene0670_01
scene0256_00
scene0256_01
scene0256_02
scene0249_00
scene0441_00
scene0658_00
scene0704_00
scene0704_01
scene0187_00
scene0187_01
scene0131_00
scene0131_01
scene0131_02
scene0207_00
scene0207_01
scene0207_02
scene0461_00
scene0011_00
scene0011_01
scene0343_00
scene0251_00
scene0077_00
scene0077_01
scene0684_00
scene0684_01
scene0550_00
scene0686_00
scene0686_01
scene0686_02
scene0208_00
scene0500_00
scene0500_01
scene0552_00
scene0552_01
scene0648_00
scene0648_01
scene0435_00
scene0435_01
scene0435_02
scene0435_03
scene0690_00
scene0690_01
scene0693_00
scene0693_01
scene0693_02
scene0700_00
scene0700_01
scene0700_02
scene0699_00
scene0231_00
scene0231_01
scene0231_02
scene0697_00
scene0697_01
scene0697_02
scene0697_03
scene0474_00
scene0474_01
scene0474_02
scene0474_03
scene0474_04
scene0474_05
scene0355_00
scene0355_01
scene0146_00
scene0146_01
scene0146_02
scene0196_00
scene0702_00
scene0702_01
scene0702_02
scene0314_00
scene0277_00
scene0277_01
scene0277_02
scene0095_00
scene0095_01
scene0015_00
scene0100_00
scene0100_01
scene0100_02
scene0558_00
scene0558_01
scene0558_02
scene0685_00
scene0685_01
scene0685_02
## /datasets/transforms.py
```py path="/datasets/transforms.py"
from PIL import Image, ImageOps
import numpy as np
from utils import coordinates
import transforms3d
import torch
from tools.tsdf_fusion.fusion import TSDFVolumeTorch
class Compose(object):
""" Apply a list of transforms sequentially"""
def __init__(self, transforms):
self.transforms = transforms
def __call__(self, data):
for transform in self.transforms:
data = transform(data)
return data
class ToTensor(object):
""" Convert to torch tensors"""
def __call__(self, data):
data['imgs'] = torch.Tensor(np.stack(data['imgs']).transpose([0, 3, 1, 2]))
data['intrinsics'] = torch.Tensor(data['intrinsics'])
data['extrinsics'] = torch.Tensor(data['extrinsics'])
if 'depth' in data.keys():
data['depth'] = torch.Tensor(np.stack(data['depth']))
if 'tsdf_list_full' in data.keys():
for i in range(len(data['tsdf_list_full'])):
if not torch.is_tensor(data['tsdf_list_full'][i]):
data['tsdf_list_full'][i] = torch.Tensor(data['tsdf_list_full'][i])
if 'rgb_list_full' in data:
data['rgb_list_full'][i] = torch.Tensor(data['rgb_list_full'][i])
data['semantic_list_full'][i] = torch.Tensor(data['semantic_list_full'][i])
data['instance_list_full'][i] = torch.Tensor(data['instance_list_full'][i])
return data
class IntrinsicsPoseToProjection(object):
""" Convert intrinsics and extrinsics matrices to a single projection matrix"""
def __init__(self, n_views, stride=1):
self.nviews = n_views
self.stride = stride
def rotate_view_to_align_xyplane(self, Tr_camera_to_world):
# world space normal [0, 0, 1] camera space normal [0, -1, 0]
z_c = np.dot(np.linalg.inv(Tr_camera_to_world), np.array([0, 0, 1, 0]))[: 3]
axis = np.cross(z_c, np.array([0, -1, 0]))
axis = axis / np.linalg.norm(axis)
theta = np.arccos(-z_c[1] / (np.linalg.norm(z_c)))
quat = transforms3d.quaternions.axangle2quat(axis, theta)
rotation_matrix = transforms3d.quaternions.quat2mat(quat)
return rotation_matrix
def __call__(self, data):
middle_pose = data['extrinsics'][self.nviews // 2]
rotation_matrix = self.rotate_view_to_align_xyplane(middle_pose)
rotation_matrix4x4 = np.eye(4)
rotation_matrix4x4[:3, :3] = rotation_matrix
data['world_to_aligned_camera'] = torch.from_numpy(rotation_matrix4x4).float() @ middle_pose.inverse()
proj_matrices = []
for intrinsics, extrinsics in zip(data['intrinsics'], data['extrinsics']):
view_proj_matrics = []
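# one projection matrix per pyramid level: world-to-camera extrinsics composed with intrinsics scaled down by stride * 2 ** i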
for i in range(3):
# from (camera to world) to (world to camera)
proj_mat = torch.inverse(extrinsics.data.cpu())
scale_intrinsics = intrinsics / self.stride / 2 ** i
scale_intrinsics[-1, -1] = 1
proj_mat[:3, :4] = scale_intrinsics @ proj_mat[:3, :4]
view_proj_matrics.append(proj_mat)
view_proj_matrics = torch.stack(view_proj_matrics)
proj_matrices.append(view_proj_matrics)
data['proj_matrices'] = torch.stack(proj_matrices)
data.pop('intrinsics')
data.pop('extrinsics')
return data
def pad_scannet(img, intrinsics):
""" Scannet images are 1296x968 but 1296x972 is 4x3
so we pad vertically 4 pixels to make it 4x3
"""
w, h = img.size
if w == 1296 and h == 968:
img = ImageOps.expand(img, border=(0, 2))
intrinsics[1, 2] += 2
return img, intrinsics
class ResizeImage(object):
""" Resize everything to given size.
Intrinsics are assumed to refer to image prior to resize.
After resize everything (ex: depth) should have the same intrinsics
"""
def __init__(self, size):
self.size = size
def __call__(self, data):
for i, im in enumerate(data['imgs']):
im, intrinsics = pad_scannet(im, data['intrinsics'][i])
w, h = im.size
im = im.resize(self.size, Image.BILINEAR)
intrinsics[0, :] /= (w / self.size[0])
intrinsics[1, :] /= (h / self.size[1])
data['imgs'][i] = np.array(im, dtype=np.float32)
data['intrinsics'][i] = intrinsics
return data
def __repr__(self):
return self.__class__.__name__ + '(size={0})'.format(self.size)
class RandomTransformSpace(object):
""" Apply a random 3x4 linear transform to the world coordinate system.
This affects pose as well as TSDFs.
"""
def __init__(self, voxel_dim, voxel_size, random_rotation=True, random_translation=True,
paddingXY=1.5, paddingZ=.25, origin=[0, 0, 0], max_epoch=999, max_depth=3.0):
"""
Args:
voxel_dim: tuple of 3 ints (nx,ny,nz) specifying
the size of the output volume
voxel_size: floats specifying the size of a voxel
random_rotation: whether or not to apply a random rotation
random_translation: whether or not to apply a random translation
paddingXY: amount to allow cropping beyond maximum extent of TSDF
paddingZ: amount to allow cropping beyond maximum extent of TSDF
origin: origin of the voxel volume (xyz position of voxel (0,0,0))
max_epoch: maximum epoch
max_depth: maximum depth
"""
self.voxel_dim = voxel_dim
self.origin = origin
self.voxel_size = voxel_size
self.random_rotation = random_rotation
self.random_translation = random_translation
self.max_depth = max_depth
self.padding_start = torch.Tensor([paddingXY, paddingXY, paddingZ])
# no need to pad above (bias towards floor in volume)
self.padding_end = torch.Tensor([paddingXY, paddingXY, 0])
# each epoch has the same transformation
self.random_r = torch.rand(max_epoch)
self.random_t = torch.rand((max_epoch, 3))
def __call__(self, data):
origin = torch.Tensor(data['vol_origin'])
if (not self.random_rotation) and (not self.random_translation):
T = torch.eye(4)
else:
# construct rotation matrix about the z axis
if self.random_rotation:
r = self.random_r[data['epoch'][0]] * 2 * np.pi
else:
r = 0
# first construct it in 2d so we can rotate bounding corners in the plane
R = torch.tensor([[np.cos(r), -np.sin(r)],
[np.sin(r), np.cos(r)]], dtype=torch.float32)
# get corners of bounding volume
voxel_dim_old = torch.tensor(data['tsdf_list_full'][0].shape) * self.voxel_size
xmin, ymin, zmin = origin
xmax, ymax, zmax = origin + voxel_dim_old
corners2d = torch.tensor([[xmin, xmin, xmax, xmax],
[ymin, ymax, ymin, ymax]], dtype=torch.float32)
# rotate corners in plane
corners2d = R @ corners2d
# get new bounding volume (add padding for data augmentation)
xmin = corners2d[0].min()
xmax = corners2d[0].max()
ymin = corners2d[1].min()
ymax = corners2d[1].max()
zmin = zmin
zmax = zmax
# randomly sample a crop
voxel_dim = list(data['tsdf_list_full'][0].shape)
start = torch.Tensor([xmin, ymin, zmin]) - self.padding_start
end = (-torch.Tensor(voxel_dim) * self.voxel_size +
torch.Tensor([xmax, ymax, zmax]) + self.padding_end)
if self.random_translation:
t = self.random_t[data['epoch'][0]]
else:
t = .5
t = t * start + (1 - t) * end - origin
T = torch.eye(4)
T[:2, :2] = R
T[:3, 3] = -t
for i in range(len(data['extrinsics'])):
data['extrinsics'][i] = T @ data['extrinsics'][i]
data['vol_origin'] = torch.tensor(self.origin, dtype=torch.float, device=T.device)
data = self.transform(data, T.inverse(), old_origin=origin)
# print('T: ', T)
return data
def transform(self, data, transform=None, old_origin=None,
align_corners=False):
""" Applies a 3x4 linear transformation to the TSDF.
Each voxel is moved according to the transformation and a new volume
is constructed with the result.
Args:
data: items from data loader
transform: 4x4 linear transform
old_origin: origin of the voxel volume (xyz position of voxel (0, 0, 0))
default (None) is the same as the input
align_corners:
Returns:
Items with new TSDF and occupancy in the transformed coordinates
"""
# ----------computing visual frustum hull------------
bnds = torch.zeros((3, 2))
bnds[:, 0] = np.inf
bnds[:, 1] = -np.inf
for i in range(data['imgs'].shape[0]): # N_views
size = data['imgs'][i].shape[1:]
cam_intr = data['intrinsics'][i]
cam_pose = data['extrinsics'][i]
view_frust_pts = get_view_frustum(self.max_depth, size, cam_intr, cam_pose)
bnds[:, 0] = torch.min(bnds[:, 0], torch.min(view_frust_pts, dim=1)[0])
bnds[:, 1] = torch.max(bnds[:, 1], torch.max(view_frust_pts, dim=1)[0])
# -------adjust volume bounds-------
num_layers = 3
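# snap the crop center (in voxels) to a multiple of 2 ** num_layers so the same crop is representable at every resolution level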
center = (torch.tensor(((bnds[0, 1] + bnds[0, 0]) / 2, (bnds[1, 1] + bnds[1, 0]) / 2, -0.2)) - data[
'vol_origin']) / self.voxel_size
center[:2] = torch.round(center[:2] / 2 ** num_layers) * 2 ** num_layers
center[2] = torch.floor(center[2] / 2 ** num_layers) * 2 ** num_layers
origin = torch.zeros_like(center)
# origin[:2] = center[:2] - torch.tensor(self.voxel_dim[:2]) // 2
origin[:2] = center[:2] - torch.div(torch.tensor(self.voxel_dim[:2]), 2, rounding_mode='floor')
origin[2] = center[2]
vol_origin_partial = origin * self.voxel_size + data['vol_origin']
data['vol_origin_partial'] = vol_origin_partial
# ------get partial tsdf and occupancy ground truth--------
if 'tsdf_list_full' in data.keys():
# -------------grid coordinates------------------
old_origin = old_origin.view(1, 3) # scene
x, y, z = self.voxel_dim # e.g., [96, 96, 96]
coords = coordinates(self.voxel_dim, device=old_origin.device)
world = coords.type(torch.float) * self.voxel_size + vol_origin_partial.view(3, 1)
world = torch.cat((world, torch.ones_like(world[:1])), dim=0)
world = transform[:3, :] @ world # augmentation
coords = (world - old_origin.T) / self.voxel_size
if 'rgb_list_full' in data:
data['tsdf_list'] = []
data['occ_list'] = []
data['rgb_list'] = []
data['semantic_list'] = []
data['instance_list'] = []
for l, (tsdf_s, rgb_s, semantic_s, instance_s) in enumerate(
zip(data['tsdf_list_full'], data['rgb_list_full'], data['semantic_list_full'], data['instance_list_full'])):
# ------get partial tsdf and occ-------
# vol_dim_s = torch.tensor(self.voxel_dim) // 2 ** l
vol_dim_s = torch.div(torch.tensor(self.voxel_dim), 2 ** l, rounding_mode='floor')
tsdf_vol = TSDFVolumeTorch(vol_dim_s, vol_origin_partial,
voxel_size=self.voxel_size * 2 ** l, margin=3) # stage 0: voxel_size=0.04, sdf_trunc=0.12, stage 1: 0.08, 0.24, stage 2: 0.16, 0.48,
for i in range(data['imgs'].shape[0]):
depth_im = data['depth'][i]
cam_intr = data['intrinsics'][i]
cam_pose = data['extrinsics'][i]
tsdf_vol.integrate(depth_im, cam_intr, cam_pose, obs_weight=1.)
tsdf_vol, weight_vol = tsdf_vol.get_volume()
occ_vol = torch.zeros_like(tsdf_vol).bool()
occ_vol[(tsdf_vol < 0.999) & (tsdf_vol > -0.999) & (weight_vol > 1)] = True # at least 2 views
# grid sample expects coords in [-1,1]
coords_world_s = coords.view(3, x, y, z)[:, ::2 ** l, ::2 ** l, ::2 ** l] / 2 ** l
dim_s = list(coords_world_s.shape[1:])
coords_world_s = coords_world_s.view(3, -1)
old_voxel_dim = list(tsdf_s.shape)
coords_world_s = 2 * coords_world_s / (torch.Tensor(old_voxel_dim) - 1).view(3, 1) - 1 # [-1, 1]
coords_world_s = coords_world_s[[2, 1, 0]].T.view([1] + dim_s + [3])
# bilinear interpolation near surface,
# no interpolation along the -1/1 boundary
tsdf_vol = torch.nn.functional.grid_sample(
tsdf_s.view([1, 1] + old_voxel_dim),
coords_world_s, mode='nearest', align_corners=align_corners
).squeeze()
tsdf_vol_bilin = torch.nn.functional.grid_sample(
tsdf_s.view([1, 1] + old_voxel_dim), coords_world_s, mode='bilinear',
align_corners=align_corners
).squeeze()
mask = tsdf_vol.abs() < 1
tsdf_vol[mask] = tsdf_vol_bilin[mask]
rgb_vol = torch.zeros_like(coords_world_s).squeeze(0)
for rgb_i in range(3):
rgb_vol[..., rgb_i] = torch.nn.functional.grid_sample(
rgb_s[..., rgb_i].view([1, 1] + old_voxel_dim),
coords_world_s, mode='nearest', align_corners=align_corners,
padding_mode='zeros'
).squeeze()
semantic_vol = torch.nn.functional.grid_sample(
semantic_s.view([1, 1] + old_voxel_dim),
coords_world_s, mode='nearest', align_corners=align_corners,
padding_mode='zeros'
).squeeze()
instance_vol = torch.nn.functional.grid_sample(
instance_s.view([1, 1] + old_voxel_dim),
coords_world_s, mode='nearest', align_corners=align_corners,
padding_mode='zeros'
).squeeze()
# padding_mode='ones' does not exist for grid_sample so replace
# elements that were on the border with 1.
# voxels beyond the full volume (prior to cropping) should be marked as empty
# debug-only counters (unused):
# tsdf_np, coords_np = tsdf_vol.numpy(), coords_world_s.numpy()
# count_neg_one = np.count_nonzero(tsdf_np == -1)
# count_one = np.count_nonzero(tsdf_np == 1)
mask = (coords_world_s.abs() >= 1).squeeze(0).any(3)
tsdf_vol[mask] = 1
rgb_vol[mask] = torch.zeros(3)
semantic_vol[mask] = 0
instance_vol[mask] = 0
data['tsdf_list'].append(tsdf_vol)
data['occ_list'].append(occ_vol)
data['rgb_list'].append(rgb_vol)
data['semantic_list'].append(semantic_vol)
data['instance_list'].append(instance_vol)
data.pop('tsdf_list_full')
data.pop('rgb_list_full')
data.pop('semantic_list_full')
data.pop('instance_list_full')
data.pop('depth')
else:
data['tsdf_list'] = []
data['occ_list'] = []
for l, tsdf_s in enumerate(data['tsdf_list_full']):
# ------get partial tsdf and occ-------
# vol_dim_s = torch.tensor(self.voxel_dim) // 2 ** l
vol_dim_s = torch.div(torch.tensor(self.voxel_dim), 2 ** l, rounding_mode='floor')
tsdf_vol = TSDFVolumeTorch(vol_dim_s, vol_origin_partial,
voxel_size=self.voxel_size * 2 ** l,
margin=3) # stage 0: voxel_size=0.04, sdf_trunc=0.12, stage 1: 0.08, 0.24, stage 2: 0.16, 0.48,
for i in range(data['imgs'].shape[0]):
depth_im = data['depth'][i]
cam_intr = data['intrinsics'][i]
cam_pose = data['extrinsics'][i]
tsdf_vol.integrate(depth_im, cam_intr, cam_pose, obs_weight=1.)
tsdf_vol, weight_vol = tsdf_vol.get_volume()
occ_vol = torch.zeros_like(tsdf_vol).bool()
occ_vol[(tsdf_vol < 0.999) & (tsdf_vol > -0.999) & (weight_vol > 1)] = True
# grid sample expects coords in [-1,1]
coords_world_s = coords.view(3, x, y, z)[:, ::2 ** l, ::2 ** l, ::2 ** l] / 2 ** l
dim_s = list(coords_world_s.shape[1:])
coords_world_s = coords_world_s.view(3, -1)
old_voxel_dim = list(tsdf_s.shape)
coords_world_s = 2 * coords_world_s / (torch.Tensor(old_voxel_dim) - 1).view(3, 1) - 1  # position of the fragment within the full scene, normalized to [-1, 1]
coords_world_s = coords_world_s[[2, 1, 0]].T.view([1] + dim_s + [3])
# bilinear interpolation near surface,
# no interpolation along the -1/1 boundary
tsdf_vol = torch.nn.functional.grid_sample(
tsdf_s.view([1, 1] + old_voxel_dim),
coords_world_s, mode='nearest', align_corners=align_corners
).squeeze()
tsdf_vol_bilin = torch.nn.functional.grid_sample(
tsdf_s.view([1, 1] + old_voxel_dim), coords_world_s, mode='bilinear',
align_corners=align_corners
).squeeze()
mask = tsdf_vol.abs() < 1
tsdf_vol[mask] = tsdf_vol_bilin[mask]
# dd = ((tsdf_vol < 0.999) & (tsdf_vol > -0.999)).sum()  # debug-only count of near-surface voxels
# padding_mode='ones' does not exist for grid_sample so replace
# elements that were on the border with 1.
# voxels beyond the full volume (prior to cropping) should be marked as empty
# debug-only counters (unused):
# tsdf_np, coords_np = tsdf_vol.numpy(), coords_world_s.numpy()
# count_neg_one = np.count_nonzero(tsdf_np == -1)
# count_one = np.count_nonzero(tsdf_np == 1)
mask = (coords_world_s.abs() >= 1).squeeze(0).any(3)
tsdf_vol[mask] = 1
data['tsdf_list'].append(tsdf_vol)
data['occ_list'].append(occ_vol)
data.pop('tsdf_list_full')
data.pop('depth')
data.pop('epoch')
return data
def __repr__(self):
return self.__class__.__name__
def rigid_transform(xyz, transform):
"""Applies a rigid transform to an (N, 3) pointcloud.
"""
xyz_h = torch.cat([xyz, torch.ones((len(xyz), 1))], dim=1)
xyz_t_h = (transform @ xyz_h.T).T
return xyz_t_h[:, :3]
def get_view_frustum(max_depth, size, cam_intr, cam_pose):
"""Get corners of 3D camera view frustum of depth image
"""
im_h, im_w = size
im_h = int(im_h)
im_w = int(im_w)
view_frust_pts = torch.stack([
(torch.tensor([0, 0, 0, im_w, im_w]) - cam_intr[0, 2]) * torch.tensor(
[0, max_depth, max_depth, max_depth, max_depth]) /
cam_intr[0, 0],
(torch.tensor([0, 0, im_h, 0, im_h]) - cam_intr[1, 2]) * torch.tensor(
[0, max_depth, max_depth, max_depth, max_depth]) /
cam_intr[1, 1],
torch.tensor([0, max_depth, max_depth, max_depth, max_depth])
])
view_frust_pts = rigid_transform(view_frust_pts.T, cam_pose).T
return view_frust_pts
```
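The index juggling in `transform()` (normalizing voxel indices to [-1, 1] and permuting them from (x, y, z) to (z, y, x) before `grid_sample`) is the part most easily gotten wrong. Below is a minimal, self-contained sketch of that convention; it is not repository code, the tensor sizes are arbitrary, and it uses `align_corners=True` so that an identity grid reproduces the input volume exactly.
```py
import torch
import torch.nn.functional as F

# toy volume indexed [N, C, X, Y, Z], mirroring tsdf_s.view([1, 1] + old_voxel_dim)
vol = torch.arange(2 * 3 * 4, dtype=torch.float32).view(1, 1, 2, 3, 4)
dims = torch.tensor(vol.shape[2:], dtype=torch.float32)  # [X, Y, Z]

# integer voxel coordinates, shape (3, X*Y*Z)
xs = torch.arange(2).view(2, 1, 1).expand(2, 3, 4)
ys = torch.arange(3).view(1, 3, 1).expand(2, 3, 4)
zs = torch.arange(4).view(1, 1, 4).expand(2, 3, 4)
coords = torch.stack((xs, ys, zs)).reshape(3, -1).float()

# normalize to [-1, 1] and permute (x, y, z) -> (z, y, x), as in transform():
# for an input of shape [N, C, D, H, W], grid_sample reads the last grid axis as (W, H, D)
grid = (2 * coords / (dims - 1).view(3, 1) - 1)[[2, 1, 0]].T.view(1, 2, 3, 4, 3)

resampled = F.grid_sample(vol, grid, mode='nearest', align_corners=True).squeeze()
assert torch.equal(resampled, vol.squeeze())  # the identity grid reproduces the volume
```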
## /datasets/visualization.py
```py path="/datasets/visualization.py"
import numpy as np
import matplotlib.pyplot as plt
import pyvista as pv
from mpl_toolkits.mplot3d import Axes3D
def plot_voxels(voxels, title="Voxel Grid"):
grid = pv.UniformGrid()
grid.dimensions = np.array(voxels.shape) + 1
grid.origin = (0, 0, 0) # The bottom left corner of the data set
grid.spacing = (1, 1, 1) # These are the cell sizes along each axis
grid.cell_data["values"] = voxels.flatten(order="F") # Flatten the array!
plotter = pv.Plotter()
plotter.add_volume(grid, scalars="values", opacity='sigmoid', show_scalar_bar=False)
plotter.set_background("white")
plotter.camera_position = 'iso'
plotter.add_title(title)
plotter.show()
def visualize_mesh(input_A=None, input_B=None, type='xyz', use_path=False):
# A = np.load('datasets/scannet/panoptic_info/scene0000_00_vert.npy') xyz(N, 3) or xyzrgb(N, 6)
# B = np.load('datasets/scannet/panoptic_info/scene0000_00_sem_label.npy') class_id(N,)
# C = np.load('datasets/scannet/panoptic_info/scene0000_00_ins_label.npy') instance_id(N,)
if use_path:
A = np.load(input_A)
else:
A = input_A
N = A.shape[0]
A = A.reshape(N, -1)
if type == 'xyz':
points = pv.PolyData(A[:, :3])
gray_color = np.array([0.5, 0.5, 0.5])
points.point_data['gray_colors'] = np.tile(gray_color, (N, 1))
plotter = pv.Plotter()
plotter.add_points(points, scalars='gray_colors', rgb=True, point_size=5)
plotter.add_axes(interactive=True)
plotter.show_grid()
origin = pv.Sphere(radius=1.0, center=(0, 0, 0), direction=(0, 0, 1))
plotter.add_mesh(origin, color='red', show_edges=False)
plotter.view_vector([-1, -1, 0])
plotter.reset_camera()
plotter.show(title='PointCloud Gray Visualization')
elif type == 'rgb': # 0-255
rgb_values = A[:, 3:]
non_zero_indices = ~np.all(rgb_values == 0, axis=1)
A = A[non_zero_indices]
points_rgb = pv.PolyData(A[:, :3])
points_rgb['rgb'] = A[:, 3:6] / 255
plotter = pv.Plotter()
plotter.add_points(points_rgb, rgb=True, point_size=5)
plotter.add_axes(interactive=True)
plotter.show_grid()
origin = pv.Sphere(radius=1.0, center=(0, 0, 0), direction=(0, 0, 1))
plotter.add_mesh(origin, color='red', show_edges=False)
plotter.view_vector([-1, -1, 0])
plotter.reset_camera()
plotter.show(title='Point Cloud with RGB Colors')
elif type == 'semantic':
if use_path:
B = np.load(input_B)
else:
B = input_B
B = B.reshape(N, -1)
A = A[B.reshape(-1) > 0]
B = B[B.reshape(-1) > 0]
# manual_colors = np.array([
# [0, 0, 1],
# [1, 0, 0],
# [0, 1, 0]
# ])
# num_classes = 51
# colors = plt.get_cmap('Set3', num_classes)
# generated_colors = [colors(i / num_classes) for i in range(num_classes)]
# generated_colors = np.array(generated_colors)[:, :3]
#
# semantic_colors = np.vstack((manual_colors, generated_colors[3:num_classes]))
#
# semantic_class_colors = np.array([semantic_colors[i[0]] for i in B])
semantic_class_colors = np.random.rand(51, 3)
semantic_class_colors = semantic_class_colors[B.flatten()]
points_semantic = pv.PolyData(A[:, :3])
points_semantic['semantic_colors'] = semantic_class_colors
plotter = pv.Plotter()
plotter.add_points(points_semantic, rgb=True, point_size=5)
plotter.add_axes(interactive=True)
plotter.show_grid()
origin = pv.Sphere(radius=1.0, center=(0, 0, 0), direction=(0, 0, 1))
plotter.add_mesh(origin, color='red', show_edges=False)
plotter.view_vector([-1, -1, 0])
plotter.reset_camera()
plotter.show(title='Point Cloud with Semantic Labels')
elif type == 'instance':
if use_path:
B = np.load(input_B)
else:
B = input_B
B = B.reshape(N, -1)
A = A[B.reshape(-1) != 0]
B = B[B.reshape(-1) != 0]
points = pv.PolyData(A[:, :3])
instance_colors = np.random.rand(np.max(B)+1, 3)
instance_colors = instance_colors[B.flatten()]
points.point_data['instance_colors'] = instance_colors
plotter = pv.Plotter()
plotter.add_points(points, scalars='instance_colors', rgb=True, point_size=5)
plotter.add_axes(interactive=True)
plotter.show_grid()
origin = pv.Sphere(radius=1.0, center=(0, 0, 0), direction=(0, 0, 1))
plotter.add_mesh(origin, color='red', show_edges=False)
plotter.view_vector([-1, -1, 0])
plotter.reset_camera()
plotter.show(title='Instance Segmentation Result')
elif type == 'tsdf':
if use_path:
B = np.load(input_B)
else:
B = input_B
B = B.reshape(N, -1)
A = A[B.reshape(-1) > 0]
B = B[B.reshape(-1) > 0]
# normalized_tsdf = (B - np.min(B)) / (np.max(B) - np.min(B)) * 2 - 1
normalized_tsdf = B
grid = pv.PolyData(A[:, :3])
grid.point_data['tsdf'] = normalized_tsdf.flatten()
plotter = pv.Plotter()
plotter.add_mesh(grid, scalars='tsdf', cmap='plasma')
plotter.add_axes(interactive=True)
plotter.show_grid()
origin = pv.Sphere(radius=1.0, center=(0, 0, 0), direction=(0, 0, 1))
plotter.add_mesh(origin, color='red', show_edges=False)
plotter.view_vector([-1, -1, 0])
plotter.reset_camera()
plotter.show(title='TSDF Visualization')
else:
pass
if __name__ == '__main__':
path_A = 'datasets/scannet/panoptic_info/scene0000_00_vert.npy'
path_B = 'datasets/scannet/panoptic_info/scene0000_00_sem_label.npy'
path_C = 'datasets/scannet/panoptic_info/scene0000_00_ins_label.npy'
# visualize_mesh(path_A, type='xyz', use_path=True)
# visualize_mesh(path_A, type='rgb', use_path=True)
# visualize_mesh(path_A, path_B, type='semantic', use_path=True)
# visualize_mesh(path_A, path_C, type='instance', use_path=True)
visualize_mesh(path_A, path_B, type='tsdf', use_path=True)
```
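For reference, a small usage sketch (not part of the repository; synthetic data, run from the repository root with a display available): `visualize_mesh` also accepts in-memory arrays when `use_path=False`, taking an xyz or xyzrgb vertex array plus optional per-point labels.
```py
import numpy as np
from datasets.visualization import visualize_mesh

rng = np.random.default_rng(0)
verts = np.concatenate([rng.random((1000, 3)) * 5.0,               # xyz positions
                        rng.integers(0, 256, (1000, 3))], axis=1)  # rgb in [0, 255]
sem = rng.integers(1, 21, 1000)                                    # per-point semantic ids

visualize_mesh(verts, type='xyz')            # plain gray point cloud
visualize_mesh(verts, type='rgb')            # colored by per-point RGB
visualize_mesh(verts, sem, type='semantic')  # colored by a random class palette
```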
## /demo/Overview.png
Binary file available at https://raw.githubusercontent.com/zhen6618/EPRecon/refs/heads/main/demo/Overview.png
## /demo/demo.gif
Binary file available at https://raw.githubusercontent.com/zhen6618/EPRecon/refs/heads/main/demo/demo.gif
## /main.py
```py path="/main.py"
import argparse
import os
import time
import datetime
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel
from torch.utils.data import DataLoader
from tensorboardX import SummaryWriter
from loguru import logger
import torch.cuda.amp as amp
import psutil
import tracemalloc
from memory_profiler import profile
from utils import tensor2float, save_scalars, DictAverageMeter, SaveScene, make_nograd_func
from datasets import transforms, find_dataset_def
from models import NeuralRecon
from config import cfg, update_config
from datasets.sampler import DistributedSampler
from ops.comm import *
def args():
parser = argparse.ArgumentParser(description='A PyTorch Implementation of NeuralRecon')
# general
parser.add_argument('--cfg',
help='experiment configure file name',
default='./config/test.yaml', # approx. memory footprint: test ~3 GB, train ~8.7 GB
# required=True,
type=str)
parser.add_argument('opts',
help="Modify config options using the command-line",
default=None,
nargs=argparse.REMAINDER)
# distributed training
parser.add_argument('--gpu',
help='gpu id for multiprocessing training',
type=str)
parser.add_argument('--world-size',
default=1,
type=int,
help='number of nodes for distributed training')
parser.add_argument('--dist-url',
default='tcp://127.0.0.1:23456',
type=str,
help='url used to set up distributed training')
parser.add_argument('--local_rank',
default=0,
type=int,
help='node rank for distributed training')
# parse arguments and check
args = parser.parse_args()
return args
args = args()
update_config(cfg, args)
cfg.defrost()
num_gpus = int(os.environ["WORLD_SIZE"]) if "WORLD_SIZE" in os.environ else 1
print('number of gpus: {}'.format(num_gpus))
cfg.DISTRIBUTED = num_gpus > 1
if cfg.DISTRIBUTED:
torch.cuda.set_device(args.local_rank)
torch.distributed.init_process_group(
backend="nccl", init_method="env://"
)
synchronize()
cfg.LOCAL_RANK = args.local_rank
cfg.freeze()
torch.manual_seed(cfg.SEED)
torch.cuda.manual_seed(cfg.SEED)
# create logger
if is_main_process():
if not os.path.isdir(cfg.LOGDIR):
os.makedirs(cfg.LOGDIR)
current_time_str = str(datetime.datetime.now().strftime('%Y%m%d_%H%M%S'))
logfile_path = os.path.join(cfg.LOGDIR, f'{current_time_str}_{cfg.MODE}.log')
print('creating log file', logfile_path)
logger.add(logfile_path, format="{time} {level} {message}", level="INFO")
tb_writer = SummaryWriter(cfg.LOGDIR)
# Augmentation
if cfg.MODE == 'train':
n_views = cfg.TRAIN.N_VIEWS
random_rotation = cfg.TRAIN.RANDOM_ROTATION_3D
random_translation = cfg.TRAIN.RANDOM_TRANSLATION_3D
paddingXY = cfg.TRAIN.PAD_XY_3D
paddingZ = cfg.TRAIN.PAD_Z_3D
else:
n_views = cfg.TEST.N_VIEWS
random_rotation = False
random_translation = False
paddingXY = 0
paddingZ = 0
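# data pipeline: resize images to 640x480, convert to tensors, apply the random world-space crop/rotation, then fuse per-view intrinsics/extrinsics into projection matrices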
transform = []
transform += [transforms.ResizeImage((640, 480)),
transforms.ToTensor(),
transforms.RandomTransformSpace(
cfg.MODEL.N_VOX, cfg.MODEL.VOXEL_SIZE, random_rotation, random_translation,
paddingXY, paddingZ, max_epoch=cfg.TRAIN.EPOCHS),
transforms.IntrinsicsPoseToProjection(n_views, 4),
]
transforms = transforms.Compose(transform)
# dataset, dataloader
MVSDataset = find_dataset_def(cfg.DATASET)
train_dataset = MVSDataset(cfg.TRAIN.PATH, "train", transforms, cfg.TRAIN.N_VIEWS, len(cfg.MODEL.THRESHOLDS) - 1)
test_dataset = MVSDataset(cfg.TEST.PATH, "test", transforms, cfg.TEST.N_VIEWS, len(cfg.MODEL.THRESHOLDS) - 1)
if cfg.DISTRIBUTED:
train_sampler = DistributedSampler(train_dataset, shuffle=False)
TrainImgLoader = torch.utils.data.DataLoader(
train_dataset,
batch_size=cfg.BATCH_SIZE,
sampler=train_sampler,
num_workers=cfg.TRAIN.N_WORKERS,
pin_memory=True,
drop_last=True
)
test_sampler = DistributedSampler(test_dataset, shuffle=False)
TestImgLoader = torch.utils.data.DataLoader(
test_dataset,
batch_size=cfg.BATCH_SIZE,
sampler=test_sampler,
num_workers=cfg.TEST.N_WORKERS,
pin_memory=True,
drop_last=False
)
else:
TrainImgLoader = DataLoader(train_dataset, cfg.BATCH_SIZE, shuffle=False, num_workers=cfg.TRAIN.N_WORKERS,
drop_last=True)
TestImgLoader = DataLoader(test_dataset, cfg.BATCH_SIZE, shuffle=False, num_workers=cfg.TEST.N_WORKERS,
drop_last=False)
# model, optimizer
model = NeuralRecon(cfg)
if cfg.DISTRIBUTED:
model.cuda()
model = DistributedDataParallel(
model, device_ids=[cfg.LOCAL_RANK], output_device=cfg.LOCAL_RANK,
# this should be removed if we update BatchNorm stats
broadcast_buffers=False,
find_unused_parameters=True
)
else:
model = torch.nn.DataParallel(model, device_ids=[0])
model.cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=cfg.TRAIN.LR, betas=(0.9, 0.999), weight_decay=cfg.TRAIN.WD)
if cfg.TRAIN.AUTOGRAD:
scaler = amp.GradScaler()
accumulation_steps = cfg.TRAIN.ACCUMULATION_STEPS
def print_system_memory(tag=""):
mem = psutil.virtual_memory()
# print(f"[{tag}] Total: {mem.total / 1024 ** 3:.2f} GB")
# print(f"[{tag}] Available: {mem.available / 1024 ** 3:.2f} GB")
print(f"[{tag}] Used: {mem.used / 1024 ** 3:.2f} GB")
# print(f"[{tag}] Free: {mem.free / 1024 ** 3:.2f} GB")
print("")
# main function
@profile
def train():
# load parameters
start_epoch = 0
if cfg.RESUME:
saved_models = [fn for fn in os.listdir(cfg.LOGDIR) if fn.endswith(".ckpt")]
saved_models = sorted(saved_models, key=lambda x: int(x.split('_')[-1].split('.')[0]))
if len(saved_models) != 0:
# use the latest checkpoint file
loadckpt = os.path.join(cfg.LOGDIR, saved_models[-1])
logger.info("resuming " + str(loadckpt))
map_location = {'cuda:%d' % 0: 'cuda:%d' % cfg.LOCAL_RANK}
state_dict = torch.load(loadckpt, map_location=map_location)
model.load_state_dict(state_dict['model'], strict=True)
optimizer.param_groups[0]['initial_lr'] = state_dict['optimizer']['param_groups'][0]['lr']
optimizer.param_groups[0]['lr'] = state_dict['optimizer']['param_groups'][0]['lr']
start_epoch = state_dict['epoch'] + 1
elif cfg.LOADCKPT != '':
# load checkpoint file specified by args.loadckpt
logger.info("loading model {}".format(cfg.LOADCKPT))
map_location = {'cuda:%d' % 0: 'cuda:%d' % cfg.LOCAL_RANK}
state_dict = torch.load(cfg.LOADCKPT, map_location=map_location)
model.load_state_dict(state_dict['model'], strict=True)
print('Full LoadCkpt!')
# Weight initialization
load_c = 'none'
if load_c == 'backbone2d':
backbone2d_state_dict = {k[len('module.backbone2d.'):]: v for k, v in state_dict['model'].items() if k.startswith('module.backbone2d')}
model.module.backbone2d.load_state_dict(backbone2d_state_dict, strict=True)
elif load_c == 'init':
print('init 1')
backbone2d_state_dict = {k[len('module.backbone2d.'):]: v for k, v in state_dict['model'].items() if k.startswith('module.backbone2d')}
model.module.backbone2d.load_state_dict(backbone2d_state_dict, strict=True)
initialization_state_dict = {k[len('module.neucon_net.initialization.'):]: v for k, v in state_dict['model'].items() if k.startswith('module.neucon_net.initialization')}
model.module.neucon_net.initialization.load_state_dict(initialization_state_dict, strict=True)
freeze = 'init'
if freeze == 'mansnet':
for param in model.module.backbone2d.conv0[:6].parameters(): # All sub-layers under the first 6 layers of backbone2d.conv0 are frozen
param.requires_grad = False
elif freeze == 'init':
print('init 2')
for param in model.module.backbone2d.parameters():
param.requires_grad = False
for param in model.module.neucon_net.initialization.parameters():
param.requires_grad = False
# # Check if the parameter is set to non-trainable
# for name, param in model.named_parameters():
# print(name, param.requires_grad)
# # Submodule parameter calculation
# param_backbone2d = sum(p.numel() for p in model.module.backbone2d.parameters())
# param_backbone_occ_pano = sum(p.numel() for p in model.module.backbone_occ_pano.parameters())
# param_initialization = sum(p.numel() for p in model.module.neucon_net.initialization.parameters())
# param_gru_fusion = sum(p.numel() for p in model.module.neucon_net.gru_fusion.parameters())
# param_sp_convs = sum(p.numel() for p in model.module.neucon_net.sp_convs.parameters())
# param_tsdf_preds = sum(p.numel() for p in model.module.neucon_net.tsdf_preds.parameters())
# param_occ_preds = sum(p.numel() for p in model.module.neucon_net.occ_preds.parameters())
# param_panoptic_feat_fusion = sum(p.numel() for p in model.module.neucon_net.panoptic_feat_fusion.parameters())
# param_panoptic = sum(p.numel() for p in model.module.neucon_net.panoptic.parameters())
logger.info("start at epoch {}".format(start_epoch))
logger.info('Number of model parameters: {} M'.format(sum([p.data.nelement() for p in model.parameters()]) / 1e6))
milestones = [int(epoch_idx) for epoch_idx in cfg.TRAIN.LREPOCHS.split(':')[0].split(',')]
lr_gamma = 1 / float(cfg.TRAIN.LREPOCHS.split(':')[1])
lr_scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones, gamma=lr_gamma,
last_epoch=start_epoch - 1)
for epoch_idx in range(start_epoch, cfg.TRAIN.EPOCHS):
logger.info('Epoch {}:'.format(epoch_idx))
lr_scheduler.step()
print("Current learning rate:", optimizer.param_groups[0]['lr'])
TrainImgLoader.dataset.epoch = epoch_idx
TrainImgLoader.dataset.tsdf_cashe = {}
optimizer.zero_grad()
step_count = 0
model.train()
# training
for batch_idx, sample in enumerate(TrainImgLoader):
global_step = len(TrainImgLoader) * epoch_idx + batch_idx
do_summary = global_step % cfg.SUMMARY_FREQ == 0
start_time = time.time()
if cfg.TRAIN.AUTOGRAD:
with amp.autocast():
outputs, loss_dict = model(sample)
loss = loss_dict['total_loss']
if loss == 0:
continue
scaler.scale(loss).backward()
step_count += 1
if step_count % accumulation_steps == 0:
scaler.unscale_(optimizer)
torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
# grads = []
# for param in model.parameters():
# if param.grad is not None:
# grads.append(param.grad.clone().detach())
# else:
# grads.append(None)
scaler.step(optimizer)
scaler.update()
optimizer.zero_grad()
step_count = 0
else:
outputs, loss_dict = model(sample)
loss = loss_dict['total_loss']
if loss == 0:
continue
loss.backward()
step_count += 1
if step_count % accumulation_steps == 0:
torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
optimizer.step()
optimizer.zero_grad()
step_count = 0
"""
# Monitor memory after backpropagation
print_system_memory("After backpropagation")
# Get the memory usage of the current process
process = psutil.Process(os.getpid())
print(f"Total memory usage: {process.memory_info().rss / 1024 ** 2} MB")
# Get current and peak memory usage
current, peak = tracemalloc.get_traced_memory()
print(f"Current memory usage is {current / 10**6}MB; Peak was {peak / 10**6}MB")
"""
loss, scalar_outputs = tensor2float(loss), tensor2float(loss_dict)
if is_main_process():
logger.info(
'Epoch {}/{}, Iter {}/{}, train loss = {:.3f}, time = {:.3f}'.format(epoch_idx, cfg.TRAIN.EPOCHS,
batch_idx,
len(TrainImgLoader), loss,
time.time() - start_time))
if do_summary and is_main_process():
save_scalars(tb_writer, 'train', scalar_outputs, global_step)
del scalar_outputs
# checkpoint
if (epoch_idx + 1) % cfg.SAVE_FREQ == 0 and is_main_process():
torch.save({
'epoch': epoch_idx,
'model': model.state_dict(),
'optimizer': optimizer.state_dict()},
"{}/model_{:0>6}.ckpt".format(cfg.LOGDIR, epoch_idx))
def test(from_latest=False):
ckpt_list = []
# while True:
saved_models = [fn for fn in os.listdir(cfg.LOGDIR) if fn.endswith(".ckpt")]
saved_models = sorted(saved_models, key=lambda x: int(x.split('_')[-1].split('.')[0]))
saved_models = [saved_models[-1]]
model.train() # keep BatchNorm/Dropout in train-mode behaviour during evaluation
if from_latest:
saved_models = saved_models[-1:]
for ckpt in saved_models:
# ckpt = 'model_000099.ckpt'  # uncomment to pin evaluation to a specific checkpoint
if ckpt not in ckpt_list:
# use the latest checkpoint file
loadckpt = os.path.join(cfg.LOGDIR, ckpt)
logger.info("resuming " + str(loadckpt))
state_dict = torch.load(loadckpt)
model.load_state_dict(state_dict['model'])
epoch_idx = state_dict['epoch']
TestImgLoader.dataset.tsdf_cashe = {}
logger.info('Number of model parameters: {} M'.format(sum([p.data.nelement() for p in model.parameters()]) / 1e6))
avg_test_scalars = DictAverageMeter()
save_mesh_scene = SaveScene(cfg)
batch_len = len(TestImgLoader)
for batch_idx, sample in enumerate(TestImgLoader):
if not isinstance(sample, dict):
continue
for n in sample['fragment']:
logger.info(n)
# save mesh if SAVE_SCENE_MESH and is the last fragment
save_scene = cfg.SAVE_SCENE_MESH and batch_idx == batch_len - 1
# save_scene = True
save_mesh_scene.keyframe_id = batch_idx
start_time = time.time()
loss, scalar_outputs, outputs = test_sample(sample, save_scene)
logger.info('Epoch {}, Iter {}/{}, test loss = {:.3f}, time = {:.3f}'.format(epoch_idx, batch_idx,
len(TestImgLoader),
loss,
time.time() - start_time))
avg_test_scalars.update(scalar_outputs)
del scalar_outputs
if batch_idx % 100 == 0:
logger.info("Iter {}/{}, test results = {}".format(batch_idx, len(TestImgLoader),
avg_test_scalars.mean()))
# save mesh
if cfg.SAVE_SCENE_MESH:
save_mesh_scene(outputs, sample, epoch_idx)
save_scalars(tb_writer, 'fulltest', avg_test_scalars.mean(), epoch_idx)
logger.info("epoch {} avg_test_scalars:".format(epoch_idx), avg_test_scalars.mean())
ckpt_list.append(ckpt)
time.sleep(10)
# def train_sample(sample):
# model.train()
# optimizer.zero_grad()
#
# outputs, loss_dict = model(sample)
# loss = loss_dict['total_loss']
#
# if loss != 0:
# loss.backward()
# torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
# optimizer.step()
#
# return tensor2float(loss), tensor2float(loss_dict)
@make_nograd_func
def test_sample(sample, save_scene=False):
with torch.no_grad():
outputs, loss_dict = model(sample, save_scene, training=False)
loss = loss_dict['total_loss']
return tensor2float(loss), tensor2float(loss_dict), outputs
if __name__ == '__main__':
# Enable memory tracking
tracemalloc.start()
if cfg.MODE == "train":
train()
elif cfg.MODE == "test":
test()
# Stop memory tracking
tracemalloc.stop()
```
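The checkpoints written by `train()` are plain `torch.save` dictionaries holding the epoch index plus model and optimizer state dicts. A minimal inspection sketch (the path below is a placeholder; parameter names carry the `module.` prefix because the model is saved while wrapped in DDP/DataParallel):
```py
import torch

ckpt = torch.load('log/model_000047.ckpt', map_location='cpu')  # placeholder path
print(sorted(ckpt.keys()))        # ['epoch', 'model', 'optimizer']
print(ckpt['epoch'])              # epoch index at which the checkpoint was saved
print(next(iter(ckpt['model'])))  # first parameter name, prefixed with 'module.'
```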
## /models/__init__.py
```py path="/models/__init__.py"
from models.neuralrecon import NeuralRecon
```
## /models/backbone.py
```py path="/models/backbone.py"
import torch.nn as nn
import torch.nn.functional as F
import torchvision
def _round_to_multiple_of(val, divisor, round_up_bias=0.9):
""" Asymmetric rounding to make `val` divisible by `divisor`. With default
bias, will round up, unless the number is no more than 10% greater than the
smaller divisible value, i.e. (83, 8) -> 80, but (84, 8) -> 88. """
assert 0.0 < round_up_bias < 1.0
new_val = max(divisor, int(val + divisor / 2) // divisor * divisor)
return new_val if new_val >= round_up_bias * val else new_val + divisor
def _get_depths(alpha):
""" Scales tensor depths as in reference MobileNet code, prefers rouding up
rather than down. """
depths = [32, 16, 24, 40, 80, 96, 192, 320]
return [_round_to_multiple_of(depth * alpha, 8) for depth in depths]
class MnasMulti(nn.Module):
def __init__(self, alpha=1.0):
super(MnasMulti, self).__init__()
depths = _get_depths(alpha)
if alpha == 1.0:
MNASNet = torchvision.models.mnasnet1_0(pretrained=True, progress=True)
else:
MNASNet = torchvision.models.MNASNet(alpha=alpha)
self.conv0 = nn.Sequential(
MNASNet.layers._modules['0'],
MNASNet.layers._modules['1'],
MNASNet.layers._modules['2'],
MNASNet.layers._modules['3'],
MNASNet.layers._modules['4'],
MNASNet.layers._modules['5'],
MNASNet.layers._modules['6'],
MNASNet.layers._modules['7'],
MNASNet.layers._modules['8'],
)
self.conv1 = MNASNet.layers._modules['9']
self.conv2 = MNASNet.layers._modules['10']
self.out1 = nn.Conv2d(depths[4], depths[4], 1, bias=False)
self.out_channels = [depths[4]]
final_chs = depths[4]
self.inner1 = nn.Conv2d(depths[3], final_chs, 1, bias=True)
self.inner2 = nn.Conv2d(depths[2], final_chs, 1, bias=True)
self.out2 = nn.Conv2d(final_chs, depths[3], 3, padding=1, bias=False)
self.out3 = nn.Conv2d(final_chs, depths[2], 3, padding=1, bias=False)
self.out_channels.append(depths[3])
self.out_channels.append(depths[2])
def forward(self, x):
conv0 = self.conv0(x)
conv1 = self.conv1(conv0)
conv2 = self.conv2(conv1)
intra_feat = conv2
outputs = []
out = self.out1(intra_feat)
outputs.append(out)
intra_feat = F.interpolate(intra_feat, scale_factor=2, mode="nearest") + self.inner1(conv1)
out = self.out2(intra_feat)
outputs.append(out)
intra_feat = F.interpolate(intra_feat, scale_factor=2, mode="nearest") + self.inner2(conv0)
out = self.out3(intra_feat)
outputs.append(out)
return outputs[::-1]
```
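As a quick sanity check (not repository code; `alpha=1.0` downloads ImageNet-pretrained MNASNet weights on first use), the sketch below runs `MnasMulti` on a dummy 640x480 image. `forward` returns the pyramid fine-to-coarse: 24, 40 and 80 channels at strides 4, 8 and 16.
```py
import torch
from models.backbone import MnasMulti

backbone = MnasMulti(alpha=1.0).eval()
with torch.no_grad():
    feats = backbone(torch.randn(1, 3, 480, 640))  # [N, C, H, W]
for f in feats:
    print(tuple(f.shape))
# expected: (1, 24, 120, 160), (1, 40, 60, 80), (1, 80, 30, 40)
```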
## /models/criterion.py
```py path="/models/criterion.py"
import logging
import torch
import torch.nn.functional as F
from torch import nn
def compute_pos_weight(targets):
"""
targets: [N,] bool
"""
n_all = targets.reshape(-1).shape[0]
n_p = targets.reshape(-1).sum()
pos_weight = (n_all - n_p).float() / n_p
return pos_weight
def dice_loss(
inputs: torch.Tensor,
targets: torch.Tensor,
num_masks: float,
):
"""
Compute the DICE loss, similar to generalized IOU for masks
Args:
inputs: A float tensor of arbitrary shape.
The predictions for each example.
targets: A float tensor with the same shape as inputs. Stores the binary
classification label for each element in inputs
(0 for the negative class and 1 for the positive class).
"""
inputs = inputs.sigmoid()
inputs = inputs.flatten(1)
numerator = 2 * (inputs * targets).sum(-1)
denominator = inputs.sum(-1) + targets.sum(-1)
loss = 1 - (numerator + 1) / (denominator + 1)
return loss.sum() / num_masks # loss per bs per mask
def sigmoid_ce_loss(
inputs: torch.Tensor,
targets: torch.Tensor,
num_masks: float,
):
"""
Args:
inputs: A float tensor of arbitrary shape.
The predictions for each example.
targets: A float tensor with the same shape as inputs. Stores the binary
classification label for each element in inputs
(0 for the negative class and 1 for the positive class).
Returns:
Loss tensor
"""
loss = []
for b in range(targets.shape[0]):
pos_weight = compute_pos_weight(targets[b])
pos_weight = torch.clamp(pos_weight, max=30)
loss.append(F.binary_cross_entropy_with_logits(inputs[b].view(-1, 1), targets[b].view(-1, 1), pos_weight=pos_weight, reduction="none").mean())
# loss.append(F.binary_cross_entropy_with_logits(inputs[b].view(-1, 1), targets[b].view(-1, 1), reduction="none").mean())
loss = sum(loss) / len(loss) # loss per bs per voxel
return loss
def calculate_uncertainty(logits):
"""
We estimate uncertainty as the L1 distance between 0.0 and the logit prediction in 'logits' for the
foreground class in `classes`.
Args:
logits (Tensor): A tensor of shape (R, 1, ...), class-agnostic, where R is the
total number of predicted masks in all images. The values are logits.
Returns:
scores (Tensor): A tensor of shape (R, 1, ...) that contains uncertainty scores with
the most uncertain locations having the highest uncertainty score.
"""
assert logits.shape[1] == 1
gt_class_logits = logits.clone()
return -(torch.abs(gt_class_logits))
class SetCriterion(nn.Module):
"""This class computes the loss for DETR.
The process happens in two steps:
1) we compute hungarian assignment between ground truth boxes and the outputs of the model
2) we supervise each pair of matched ground-truth / prediction (supervise class and box)
"""
def __init__(self, num_classes, matcher, weight_dict, eos_coef, losses,
num_points=12544, oversample_ratio=3.0, importance_sample_ratio=0.75):
"""Create the criterion.
Parameters:
num_classes: number of object categories, omitting the special no-object category
matcher: module able to compute a matching between targets and proposals
weight_dict: dict containing as key the names of the losses and as values their relative weight.
eos_coef: relative classification weight applied to the no-object category
losses: list of all the losses to be applied. See get_loss for list of available losses.
"""
super().__init__()
self.num_classes = num_classes
self.matcher = matcher
self.weight_dict = weight_dict
self.eos_coef = eos_coef
self.losses = losses
empty_weight = torch.ones(self.num_classes + 1)
empty_weight[0] = self.eos_coef
self.register_buffer("empty_weight", empty_weight)
self.valid_classes = torch.Tensor([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 14, 16, 24, 28, 33, 34, 36, 39])
self.true_classes = torch.arange(1, 21)
# # pointwise mask loss parameters
# self.num_points = num_points
# self.oversample_ratio = oversample_ratio
# self.importance_sample_ratio = importance_sample_ratio
def loss_labels(self, outputs, targets, indices, num_masks):
"""Classification loss (NLL)
targets dicts must contain the key "labels" containing a tensor of dim [nb_target_boxes]
"""
assert "pred_logits" in outputs
src_logits = outputs["pred_logits"].float()
idx = self._get_src_permutation_idx(indices)
target_classes_o = torch.cat([t["labels"][J] for t, (_, J) in zip(targets, indices)])
if len(target_classes_o) == 0:
losses = {"loss_ce": torch.Tensor([0.0]).cuda()[0] * src_logits.sum()}
else:
target_classes = torch.full(
src_logits.shape[:2], 0, dtype=torch.int64, device=src_logits.device
)
target_classes[idx] = target_classes_o
loss_ce = F.cross_entropy(src_logits.transpose(1, 2), target_classes, self.empty_weight)
losses = {"loss_ce": loss_ce}
return losses
def loss_masks(self, outputs, targets, indices, num_masks):
"""Compute the losses related to the masks: the focal loss and the dice loss.
targets dicts must contain the key "masks" containing a tensor of dim [nb_target_boxes, h, w]
"""
assert "pred_masks" in outputs
src_idx = self._get_src_permutation_idx(indices)
tgt_idx = self._get_tgt_permutation_idx(indices)
src_masks = outputs["pred_masks"]
src_masks = src_masks[src_idx]
masks = [t["masks"] for t in targets]
target_masks = masks[0].unsqueeze(0)
target_masks = target_masks.to(src_masks)
target_masks = target_masks[tgt_idx]
if len(src_masks) == 0 or len(target_masks) == 0:
losses = {
"loss_masks": torch.Tensor([0.0]).cuda()[0] * src_masks.sum(),
"loss_dice": torch.Tensor([0.0]).cuda()[0] * src_masks.sum(),
}
else:
losses = {
"loss_mask": sigmoid_ce_loss(src_masks, target_masks, num_masks), # loss per voxel
"loss_dice": dice_loss(src_masks, target_masks, num_masks),
}
del src_masks
del target_masks
return losses
def _get_src_permutation_idx(self, indices):
# permute predictions following indices
batch_idx = torch.cat([torch.full_like(src, i) for i, (src, _) in enumerate(indices)])
src_idx = torch.cat([src for (src, _) in indices])
return batch_idx, src_idx
def _get_tgt_permutation_idx(self, indices):
# permute targets following indices
batch_idx = torch.cat([torch.full_like(tgt, i) for i, (_, tgt) in enumerate(indices)])
tgt_idx = torch.cat([tgt for (_, tgt) in indices])
return batch_idx, tgt_idx
def get_loss(self, loss, outputs, targets, indices, num_masks):
loss_map = {
'labels': self.loss_labels,
'masks': self.loss_masks,
}
assert loss in loss_map, f"do you really want to compute {loss} loss?"
return loss_map[loss](outputs, targets, indices, num_masks)
def forward(self, outputs, targets):
"""This performs the loss computation.
Parameters:
outputs: dict of tensors, see the output specification of the model for the format
targets: list of dicts, such that len(targets) == batch_size.
The expected keys in each dict depends on the losses applied, see each loss' doc
"""
device = outputs['pred_masks'].device
self.valid_classes = self.valid_classes.to(device)
# Only 20 classes remain
# 1. Delete the masks that are not in the 20 classes
valid_mask_in_20 = torch.zeros_like(targets[0]['masks'][0], dtype=torch.bool)
is_in_valid_classes = torch.zeros_like(targets[0]['labels'], dtype=torch.bool) # False
for idx, cls in enumerate(targets[0]['labels']):
if cls in self.valid_classes:
is_in_valid_classes[idx] = True
valid_mask_in_20 = valid_mask_in_20 | targets[0]['masks'][idx]
targets[0]['labels'] = targets[0]['labels'][is_in_valid_classes]
targets[0]['masks'] = targets[0]['masks'][is_in_valid_classes][:, valid_mask_in_20]
count3_mask = targets[0]['masks'].sum(dim=1)
# 2. Mapping semantic categories to 20 classes
for idx, cls in enumerate(targets[0]['labels']):
idx_20 = torch.where(self.valid_classes == cls)[0]
targets[0]['labels'][idx] = idx_20 + 1
if targets[0]['masks'].sum() == 0:
return {"loss_dice": torch.Tensor([0.0]).cuda()[0] * outputs['pred_logits'].sum()}
outputs['pred_masks'] = outputs['pred_masks'][..., valid_mask_in_20]
if "aux_outputs" in outputs:
for aux_output in outputs["aux_outputs"]:
# aux_output['pred_masks'] = aux_output['pred_masks'][..., valid_voxel_index][..., valid_mask_in_20]
aux_output['pred_masks'] = aux_output['pred_masks'][..., valid_mask_in_20]
# Remove classes with <100 voxels to reduce noise
valid_mask_in_20 = torch.zeros_like(targets[0]['masks'][0], dtype=torch.bool)
is_in_valid_classes = torch.zeros_like(targets[0]['labels'], dtype=torch.bool)
for idx in range(len(targets[0]['labels'])):
if targets[0]['masks'][idx].sum() > 100:
is_in_valid_classes[idx] = True
valid_mask_in_20 = valid_mask_in_20 | targets[0]['masks'][idx]
targets[0]['labels'] = targets[0]['labels'][is_in_valid_classes]
targets[0]['masks'] = targets[0]['masks'][is_in_valid_classes][:, valid_mask_in_20]
count4_mask = targets[0]['masks'].sum(dim=1)
if targets[0]['masks'].sum() == 0:
return {"loss_dice": torch.Tensor([0.0]).cuda()[0] * outputs['pred_logits'].sum()}
outputs['pred_masks'] = outputs['pred_masks'][..., valid_mask_in_20]
if "aux_outputs" in outputs:
for aux_output in outputs["aux_outputs"]:
aux_output['pred_masks'] = aux_output['pred_masks'][..., valid_mask_in_20]
# start compute loss
outputs_without_aux = {k: v for k, v in outputs.items() if k != "aux_outputs"}
# print('pano_targets: ', targets[0]['labels'], len(targets[0]['labels']), 'pano_masks: ', targets[0]['masks'].shape[1])
indices = self.matcher(outputs_without_aux, targets)
# Compute the average number of target boxes across all nodes, for normalization purposes
num_masks = sum(len(t["labels"]) for t in targets)
num_masks = torch.as_tensor(
[num_masks], dtype=torch.float, device=device
)
num_masks = torch.clamp(num_masks, min=1).item()
# Compute all the requested losses
losses = {}
for loss in self.losses:
losses.update(self.get_loss(loss, outputs, targets, indices, num_masks)) # loss per bs per query/mask
# In case of auxiliary losses, we repeat this process with the output of each intermediate layer.
if "aux_outputs" in outputs:
for i, aux_outputs in enumerate(outputs["aux_outputs"]):
indices = self.matcher(aux_outputs, targets)
for loss in self.losses:
l_dict = self.get_loss(loss, aux_outputs, targets, indices, num_masks)
l_dict = {k + f"_{i}": v for k, v in l_dict.items()}
losses.update(l_dict)
return losses
def __repr__(self):
head = "Criterion " + self.__class__.__name__
body = [
"matcher: {}".format(self.matcher.__repr__(_repr_indent=8)),
"losses: {}".format(self.losses),
"weight_dict: {}".format(self.weight_dict),
"num_classes: {}".format(self.num_classes),
"eos_coef: {}".format(self.eos_coef),
# "num_points: {}".format(self.num_points),
# "oversample_ratio: {}".format(self.oversample_ratio),
# "importance_sample_ratio: {}".format(self.importance_sample_ratio),
]
_repr_indent = 4
lines = [head] + [" " * _repr_indent + line for line in body]
return "\n".join(lines)
```
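A small CPU-only sketch (not repository code) of the two standalone mask losses: both take per-mask logits of shape [num_masks, num_voxels] and binary targets of the same shape, and `sigmoid_ce_loss` additionally reweights positives per mask via `compute_pos_weight`.
```py
import torch
from models.criterion import dice_loss, sigmoid_ce_loss

logits = torch.randn(2, 500)   # 2 predicted masks over 500 voxels
targets = torch.zeros(2, 500)  # sparse binary ground-truth masks
targets[0, :80] = 1.0
targets[1, 100:250] = 1.0

print(dice_loss(logits, targets, num_masks=2.0))        # dice averaged over num_masks
print(sigmoid_ce_loss(logits, targets, num_masks=2.0))  # mean pos-weighted BCE per mask
```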