Method: instead of using the task classifier's softmax confidence to weight samples for pseudo labeling, use the discriminator's (domain classifier's) confidence, based on how source-like each sample's feature representation appears. In other words, we repurpose the discriminator so that it not only aids in producing domain-invariant representations (as in DANN) but also provides pseudo-labeling confidence.
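The weighting described above can be sketched as follows. This is a minimal, hypothetical illustration (the function name and plain-Python types are assumptions, not the repository's actual API): pseudo labels come from the task classifier's argmax, while the per-sample weight is either the discriminator's source-likeness probability (proposed) or the task softmax confidence (baseline, as selected by `--nouse_domain_confidence`).

```python
import math

def pseudo_label_weights(task_logits, domain_probs, use_domain_confidence=True):
    # task_logits: list of per-sample class-logit lists from the task classifier
    # domain_probs: per-sample discriminator probability that the sample's
    #   features look source-like (1.0 = indistinguishable from source)
    pseudo_labels, weights = [], []
    for logits, d in zip(task_logits, domain_probs):
        # numerically stable softmax over the task logits
        m = max(logits)
        exps = [math.exp(x - m) for x in logits]
        total = sum(exps)
        softmax = [e / total for e in exps]
        # pseudo label = most likely class under the task classifier
        pseudo_labels.append(softmax.index(max(softmax)))
        # proposed: weight by domain confidence; baseline: softmax confidence
        weights.append(d if use_domain_confidence else max(softmax))
    return pseudo_labels, weights
```

A confidently classified target sample whose features still look target-like would get a high softmax weight but a low domain-confidence weight, which is exactly the case the proposed weighting is meant to down-weight.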
Steps:
For example, to train on the USPS to MNIST adaptation task with no adaptation:
./kamiak_queue.sh test1 --model=vada_small --source=usps --target=mnist --method=none
To weight pseudo labels with the domain classifier's confidence (the proposed method) or with the task classifier's softmax confidence, respectively:
./kamiak_queue.sh test1 --model=vada_small --source=usps --target=mnist --method=pseudo
./kamiak_queue.sh test1 --model=vada_small --source=usps --target=mnist --method=pseudo --nouse_domain_confidence --debugnum=1
To instead use instance weighting:
./kamiak_queue.sh test1 --model=vada_small --source=usps --target=mnist --method=instance
./kamiak_queue.sh test1 --model=vada_small --source=usps --target=mnist --method=instance --nouse_domain_confidence --debugnum=1
Or, to run these without adversarial training:
./kamiak_queue.sh test2 --model=vada_small --source=usps --target=mnist --method=pseudo --nodomain_invariant
./kamiak_queue.sh test2 --model=vada_small --source=usps --target=mnist --method=pseudo --nouse_domain_confidence --debugnum=1 --nodomain_invariant
Note: you probably need --nocompile_metrics on any SynSigns to GTSRB adaptation; otherwise it may run out of memory. Also, these examples assume you're using SLURM. If not, you can modify kamiak_queue.sh to run the scripts directly with bash rather than queueing them with sbatch.
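For running without SLURM, the change inside kamiak_queue.sh is a one-line swap along these lines (the exact line in the script may differ; this is a hypothetical sketch, not the script's actual contents):

```shell
# Inside kamiak_queue.sh, replace the sbatch submission, e.g.:
#   sbatch kamiak_train.srun "$@"
# with a direct invocation that runs in the foreground:
#   bash kamiak_train.srun "$@"
```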
For example, to evaluate the "test1" models trained above:
sbatch kamiak_eval.srun test1 --eval_batch=2048 --jobs=1