
Use FastCUT with public map images and location data from a few cities to generate realistic synthetic location data for any city in the world. Based on https://taesung.me/ContrastiveUnpairedTranslation/

scollinstc/GAN-location-generator

 
 


Contrastive Unpaired Translation (CUT)

Train a GAN on public location data from a few cities to predict realistic e-bike locations across the world.

Example Results

(Screenshot of example results, 2022-01-17)

Prerequisites

  • Linux or macOS
  • Python 3
  • NVIDIA GPU + CUDA CuDNN

Getting started

  • Clone this repo:
git clone https://github.com/gretelai/contrastive-unpaired-translation
cd contrastive-unpaired-translation
  • Install PyTorch 1.1 and other dependencies (e.g., torchvision, visdom, dominate, gputil).

    For pip users, please type the command pip install -r requirements.txt.

    For Conda users, you can create a new Conda environment using conda env create -f environment.yml.

Create or download location datasets

Download the ebike_locations dataset

  • Download the ebike_locations dataset (Maps -> Maps with Scooter Locations)
sh datasets/download_ebike_dataset.sh

The dataset is downloaded and unzipped at ./datasets/ebike_locations/.

Or, create your own training and test sets

  • To create the training dataset from our public e-bike dataset of scooter locations in the USA, run:

python -m location_utils.create_training_data

  • To create a test dataset for a new location, run:

python -m location_utils.create_test_dataset --lat 35.652832 --lon 139.839478 --name Tokyo
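Scripts like this typically work in slippy-map tile coordinates when fetching map imagery around a point. For orientation, the standard OpenStreetMap-style conversion from latitude/longitude to tile indices is sketched below; deg2num is an illustrative helper, not a function from this repo:

```python
import math

def deg2num(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    """Convert a latitude/longitude pair to slippy-map tile indices (x, y)."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom  # number of tiles per axis at this zoom level
    xtile = int((lon_deg + 180.0) / 360.0 * n)
    ytile = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return xtile, ytile

# The Tokyo coordinate used in the command above, at zoom 12:
print(deg2num(35.652832, 139.839478, 12))
```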

FastCUT Training and Test

  • To view training results and loss plots, run python -m visdom.server and open http://localhost:8097 in your browser.

  • Train the FastCUT model:

python train.py --dataroot ./datasets/ebike_locations --name locations_FastCUT --CUT_mode FastCUT --n_epochs 50

The checkpoints will be stored at ./checkpoints/locations_FastCUT/, with intermediate training visuals under the web/ subdirectory.

  • Test the FastCUT model:
python test.py --dataroot ./datasets/ebike_locations --name locations_FastCUT --CUT_mode FastCUT --num_test 500 --phase test --preprocess scale_width --load_size 256

The test results will be saved to an HTML file: ./results/locations_FastCUT/test_latest/index.html.

Training using our launcher scripts

Please see experiments/location_launcher.py, which generates the command-line arguments shown above. The launcher scripts are useful for managing the rather complicated sets of training and testing arguments.

Using the launcher, the command below generates and runs the FastCUT training command:

python -m experiments locations train 1  # FastCUT

To test using the launcher, run:

python -m experiments locations test 1   # FastCUT
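In essence, a launcher entry expands a short experiment spec into a full command line. A minimal sketch of that idea, assuming nothing about the repo's actual launcher API (build_command and the option dict below are illustrative):

```python
def build_command(script: str, options: dict) -> str:
    """Expand an option dict into a single 'python <script> --key value ...' string."""
    flags = " ".join(f"--{key} {value}" for key, value in options.items())
    return f"python {script} {flags}"

# Hypothetical spec reproducing the training command from the section above:
train_opts = {
    "dataroot": "./datasets/ebike_locations",
    "name": "locations_FastCUT",
    "CUT_mode": "FastCUT",
    "n_epochs": 50,
}
print(build_command("train.py", train_opts))
# python train.py --dataroot ./datasets/ebike_locations --name locations_FastCUT --CUT_mode FastCUT --n_epochs 50
```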

Or, use a pre-trained CUT model

To use a pretrained model instead of training your own, run the following commands.

# Download and unzip the pretrained models
wget https://gretel-public-website.s3.amazonaws.com/datasets/fastcut_models/pretrained_models.tar.gz
tar -zxvf pretrained_models.tar.gz

# Generate outputs. The dataset paths might need to be adjusted.
# To do this, modify the lines of experiments/pretrained_launcher.py
# [id] corresponds to the respective commands defined in pretrained_launcher.py
# 6 - FastCUT on ebike_data
python -m experiments pretrained run_test [id]

Create the geo dataset

  • To convert synthetic images to locations, run:
python -m location_utils.images_to_geo --image_path results/locations_FastCUT/test_latest/images/fake_B --name Tokyo.csv
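Going from generated map images back to coordinates requires georeferencing each pixel. The actual logic lives in location_utils, but as a rough illustration, the inverse slippy-map formula maps a (possibly fractional) tile coordinate back to latitude/longitude; num2deg is illustrative, not the repo's API:

```python
import math

def num2deg(xtile: float, ytile: float, zoom: int) -> tuple[float, float]:
    """Convert (possibly fractional) slippy-map tile coordinates to (lat, lon).

    Fractional inputs let a pixel position inside a tile be georeferenced,
    e.g. xtile + px / 256 for pixel column px in a 256-pixel-wide tile.
    """
    n = 2 ** zoom
    lon_deg = xtile / n * 360.0 - 180.0
    lat_rad = math.atan(math.sinh(math.pi * (1 - 2 * ytile / n)))
    return math.degrees(lat_rad), lon_deg

print(num2deg(1.0, 1.0, 1))  # top-left corner of tile (1, 1) at zoom 1 -> (0.0, 0.0)
```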

Citation

Contrastive Learning for Unpaired Image-to-Image Translation
Taesung Park, Alexei A. Efros, Richard Zhang, Jun-Yan Zhu
UC Berkeley and Adobe Research
In ECCV 2020

@inproceedings{park2020cut,
  title={Contrastive Learning for Unpaired Image-to-Image Translation},
  author={Taesung Park and Alexei A. Efros and Richard Zhang and Jun-Yan Zhu},
  booktitle={European Conference on Computer Vision},
  year={2020}
}
