Datasets:
Modalities: Geospatial
Languages: English
Size: 1M < n < 10M
Tags: street view imagery, open data, data fusion, urban analytics, GeoAI, volunteered geographic information
from huggingface_hub import HfApi, hf_hub_download

def download_folder(repo_id, repo_type, folder_path, local_dir):
    """
    Download an entire folder from a Hugging Face dataset repository.

    Parameters
    ----------
    repo_id : string
        The ID of the repository (e.g., 'username/repo_name').
    repo_type : string
        Type of the repo, 'dataset' or 'model'.
    folder_path : string
        The path to the folder within the repository.
    local_dir : string
        Local folder to download the data to. This mimics git behaviour.
    """
    api = HfApi()
    # List all files in the repo, keep the ones within folder_path
    all_files = api.list_repo_files(repo_id, repo_type=repo_type)
    files_list = [f for f in all_files if f.startswith(folder_path)]
    # Download each of those files
    for file_path in files_list:
        hf_hub_download(repo_id=repo_id, repo_type=repo_type,
                        filename=file_path, local_dir=local_dir)

# Download the entire data/ folder
repo_id = "NUS-UAL/global-streetscapes"  # you can replace this with other Hugging Face repos
repo_type = "dataset"  # required by the API when the repo is a dataset
folder_path = "data/"  # replace with the folder you want within the repo
local_dir = "global-streetscapes/"  # the local folder on your computer where the data will be downloaded
# By default, Hugging Face downloads files to the ~/.cache/huggingface folder
download_folder(repo_id, repo_type, folder_path, local_dir)

# Download 2 additional files
hf_hub_download(repo_id=repo_id, repo_type=repo_type,
                filename="cities688.csv", local_dir=local_dir)
hf_hub_download(repo_id=repo_id, repo_type=repo_type,
                filename="info.csv", local_dir=local_dir)
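As an alternative to listing and downloading files one by one, `huggingface_hub` also provides `snapshot_download`, which can fetch a whole folder in a single call via glob-style `allow_patterns`. The sketch below is an untested equivalent of the script above (the pattern `"data/*"` and the extra CSV names are taken from it); check your installed `huggingface_hub` version supports `allow_patterns` before relying on it.

```python
from huggingface_hub import snapshot_download

# Mirror the manual loop above: grab everything under data/
# plus the two standalone CSV files, in one call.
snapshot_download(
    repo_id="NUS-UAL/global-streetscapes",
    repo_type="dataset",
    allow_patterns=["data/*", "cities688.csv", "info.csv"],
    local_dir="global-streetscapes/",
)
```

`snapshot_download` resolves the file list server-side and caches downloads, so re-running it only fetches files that changed.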