Sync NATS-Bench's v1.0 and update algorithm names
@@ -7,6 +7,7 @@ We analyze the validity of our benchmark in terms of various criteria and perfor
 We also show the versatility of NATS-Bench by benchmarking 13 recent state-of-the-art NAS algorithms on it. All logs and diagnostic information trained using the same setup for each candidate are provided.
 This facilitates a much larger community of researchers to focus on developing better NAS algorithms in a more comparable and computationally effective environment.
 
+**You can use `pip install nats_bench` to install the library of NATS-Bench.**
 
 The structure of this Markdown file:
 - [How to use NATS-Bench?](#How-to-Use-NATS-Bench)
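The line added above installs the query API. As a minimal sketch (not part of this diff), the installed `nats_bench` package is typically used as below; `create` and `get_more_info` follow the package's public interface, but the exact arguments, hyper-parameter tags, and returned fields may differ across versions, so treat the snippet as illustrative.

```python
# Minimal sketch: querying NATS-Bench after `pip install nats_bench`.
# Assumes the benchmark file has been downloaded; fast_mode loads records lazily.
from nats_bench import create

# 'tss' = topology search space, 'sss' = size search space.
api = create(None, 'tss', fast_mode=True, verbose=False)

print('The topology search space contains {:} architectures'.format(len(api)))

# Training/evaluation statistics of architecture No.12 on CIFAR-10
# ('hp' selects the training-epoch setting; available tags vary by search space).
info = api.get_more_info(12, 'cifar10', hp='200')
print(info)
```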
@@ -175,18 +176,18 @@ python ./exps/NATS-algos/search-size.py --dataset cifar100 --data_path $TORCH_HO
 python ./exps/NATS-algos/search-size.py --dataset ImageNet16-120 --data_path $TORCH_HOME/cifar.python/ImageNet16 --algo tas --rand_seed 777
 
 
-Run the channel search strategy in FBNet-V2
+Run the channel search strategy in FBNet-V2 -- masking + Gumbel-Softmax :
 
-python ./exps/NATS-algos/search-size.py --dataset cifar10 --data_path $TORCH_HOME/cifar.python --algo fbv2 --rand_seed 777
-python ./exps/NATS-algos/search-size.py --dataset cifar100 --data_path $TORCH_HOME/cifar.python --algo fbv2 --rand_seed 777
-python ./exps/NATS-algos/search-size.py --dataset ImageNet16-120 --data_path $TORCH_HOME/cifar.python/ImageNet16 --algo fbv2 --rand_seed 777
+python ./exps/NATS-algos/search-size.py --dataset cifar10 --data_path $TORCH_HOME/cifar.python --algo mask_gumbel --rand_seed 777
+python ./exps/NATS-algos/search-size.py --dataset cifar100 --data_path $TORCH_HOME/cifar.python --algo mask_gumbel --rand_seed 777
+python ./exps/NATS-algos/search-size.py --dataset ImageNet16-120 --data_path $TORCH_HOME/cifar.python/ImageNet16 --algo mask_gumbel --rand_seed 777
 
 
-Run the channel search strategy in TuNAS:
+Run the channel search strategy in TuNAS -- masking + sampling :
 
-python ./exps/NATS-algos/search-size.py --dataset cifar10 --data_path $TORCH_HOME/cifar.python --algo tunas --arch_weight_decay 0 --rand_seed 777 --use_api 0
-python ./exps/NATS-algos/search-size.py --dataset cifar100 --data_path $TORCH_HOME/cifar.python --algo tunas --arch_weight_decay 0 --rand_seed 777
-python ./exps/NATS-algos/search-size.py --dataset ImageNet16-120 --data_path $TORCH_HOME/cifar.python/ImageNet16 --algo tunas --arch_weight_decay 0 --rand_seed 777
+python ./exps/NATS-algos/search-size.py --dataset cifar10 --data_path $TORCH_HOME/cifar.python --algo mask_rl --arch_weight_decay 0 --rand_seed 777 --use_api 0
+python ./exps/NATS-algos/search-size.py --dataset cifar100 --data_path $TORCH_HOME/cifar.python --algo mask_rl --arch_weight_decay 0 --rand_seed 777
+python ./exps/NATS-algos/search-size.py --dataset ImageNet16-120 --data_path $TORCH_HOME/cifar.python/ImageNet16 --algo mask_rl --arch_weight_decay 0 --rand_seed 777
 ```
 
 ### Final Discovered Architectures for Each Algorithm
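The renaming above (`fbv2` to `mask_gumbel`, `tunas` to `mask_rl`) spells out what the two size-search baselines do: both keep one maximum-width layer and mask its output channels, and they differ only in how a width is chosen, via a Gumbel-Softmax relaxation (FBNet-V2 style) or via sampling with an RL-style update (TuNAS style). The sketch below is a self-contained toy illustration of that distinction, not the code in `./exps/NATS-algos/search-size.py`; the candidate widths, module names, and update rule are assumptions.

```python
# Illustrative sketch (not the repository's implementation) of the two
# channel-search relaxations behind the renamed algorithms:
#   mask_gumbel -- FBNet-V2 style: masking + Gumbel-Softmax
#   mask_rl     -- TuNAS style:    masking + sampling (REINFORCE-like update)
import torch
import torch.nn as nn
import torch.nn.functional as F

CANDIDATE_WIDTHS = [8, 16, 32, 64]   # hypothetical candidate channel counts
MAX_WIDTH = max(CANDIDATE_WIDTHS)


def build_masks(candidate_widths, max_width):
    """One binary mask per candidate width over the max-width channel dimension."""
    masks = torch.zeros(len(candidate_widths), max_width)
    for i, w in enumerate(candidate_widths):
        masks[i, :w] = 1.0
    return masks


class MaskedConv(nn.Module):
    """A max-width conv whose output channels are masked to emulate smaller widths."""

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, MAX_WIDTH, 3, padding=1)
        self.alpha = nn.Parameter(torch.zeros(len(CANDIDATE_WIDTHS)))  # width logits
        self.register_buffer('masks', build_masks(CANDIDATE_WIDTHS, MAX_WIDTH))

    def forward_gumbel(self, x, tau=1.0):
        # masking + Gumbel-Softmax: a differentiable soft mixture of the width masks.
        weights = F.gumbel_softmax(self.alpha, tau=tau)          # (num_widths,)
        mask = (weights.unsqueeze(1) * self.masks).sum(dim=0)    # (MAX_WIDTH,)
        return self.conv(x) * mask.view(1, -1, 1, 1)

    def forward_sample(self, x):
        # masking + sampling: draw one width, mask with it, and keep the log-prob
        # so a REINFORCE-style update can reward the sampled width afterwards.
        dist = torch.distributions.Categorical(logits=self.alpha)
        idx = dist.sample()
        out = self.conv(x) * self.masks[idx].view(1, -1, 1, 1)
        return out, dist.log_prob(idx)


if __name__ == '__main__':
    net, x = MaskedConv(), torch.randn(2, 3, 32, 32)
    y_soft = net.forward_gumbel(x)        # mask_gumbel-style forward pass
    y_hard, logp = net.forward_sample(x)  # mask_rl-style forward pass
    print(y_soft.shape, y_hard.shape, logp.item())
```

In a search of this kind, the soft-masked output feeds the usual cross-entropy loss directly, while the sampled variant uses the stored log-probability to reward widths that lead to higher validation accuracy.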
@@ -250,7 +251,7 @@ GDAS:
 If you find that NATS-Bench helps your research, please consider citing it:
 ```
 @article{dong2020nats,
-title={NATS-Bench: Benchmarking NAS algorithms for Architecture Topology and Size},
+title={{NATS-Bench}: Benchmarking NAS algorithms for Architecture Topology and Size},
 author={Dong, Xuanyi and Liu, Lu and Musial, Katarzyna and Gabrys, Bogdan},
 journal={arXiv preprint arXiv:2009.00437},
 year={2020}