---
dataset_info:
features:
- name: rgb
dtype: image
- name: ir
dtype: image
splits:
- name: train
num_bytes: 743157232.6539416
num_examples: 4113
- name: test
num_bytes: 186676141.08705837
num_examples: 1029
download_size: 928212503
dataset_size: 929833373.7409999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
A paired RGB/IR image dataset built from a modified version of the FLIR dataset.
Usage:
```py
from torchvision import transforms
from datasets import load_dataset

dataset_name = "newguyme/flir_paired"
# requires a Hugging Face auth token (e.g., via `huggingface-cli login`)
dataset_flir_paired = load_dataset(dataset_name, split="train", use_auth_token=True)

# `config.image_size` is the target resolution from your training configuration
flir_preprocess = transforms.Compose(
    [
        transforms.Resize((config.image_size, config.image_size)),
        # transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize([0.5], [0.5]),
    ]
)

def flir_transform(examples):
    # convert both modalities to 3-channel images and apply the same preprocessing
    rgb = [flir_preprocess(image.convert("RGB")) for image in examples["rgb"]]
    ir = [flir_preprocess(image.convert("RGB")) for image in examples["ir"]]
    return {"rgb_t_3ch": rgb, "ir_t_3ch": ir}

dataset_flir_paired.set_transform(flir_transform)
```
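With the transform in place, each example exposes paired tensors under the keys `rgb_t_3ch` and `ir_t_3ch`. A minimal sketch of batching the dataset with a PyTorch `DataLoader` (the batch size here is an arbitrary placeholder):

```py
from torch.utils.data import DataLoader

# the default collate function stacks the per-example tensors into a batch
loader = DataLoader(dataset_flir_paired, batch_size=8, shuffle=True)

batch = next(iter(loader))
# both modalities share the same shape: (batch, 3, image_size, image_size)
print(batch["rgb_t_3ch"].shape, batch["ir_t_3ch"].shape)
```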