Computational Cost of Neural Architecture Search on the Cloud Infrastructure
Neural Architecture Search (NAS) is a method for automating the design of neural network architectures: an algorithm such as an evolutionary algorithm or reinforcement learning searches through a predefined space of candidate architectures. NAS underpins several open-source AutoML tools, including AutoKeras, Auto-PyTorch, and AutoGluon, which use it to find the best-performing DNN model for a given task. In this project, you will investigate the performance vs. computational cost trade-off of NAS approaches across a variety of cloud technologies: virtual machines, containers, and cloud functions.
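To make the search process concrete, the following is a minimal sketch of NAS by random search over a small predefined space. The search space, the candidate encoding, and the `evaluate` function are all illustrative assumptions; in a real NAS run, `evaluate` would train and validate each candidate network, which is where the computational cost this project measures comes from.

```python
import random

# Hypothetical search space: each architecture is described by depth,
# layer width, and activation function (illustrative values only).
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate architecture uniformly from the predefined space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training + validation accuracy.

    A real NAS system would train the candidate network here, which is
    the dominant computational cost of the whole search.
    """
    # Toy proxy score that favours moderate depth and width, so the
    # example runs instantly without any actual training.
    return -abs(arch["depth"] - 4) - abs(arch["width"] - 128) / 64

def random_search(n_trials, seed=0):
    """Sample n_trials candidates and return the highest-scoring one."""
    rng = random.Random(seed)
    candidates = [sample_architecture(rng) for _ in range(n_trials)]
    return max(candidates, key=evaluate)

best = random_search(n_trials=20)
print(best)
```

Evolutionary or reinforcement-learning searchers replace the uniform sampler with a smarter proposal strategy, but the cost structure is the same: total cost scales with the number of candidates evaluated, which is why where and how those evaluations run (VMs, containers, or cloud functions) matters.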