
Microsoft unveils Archai Neural Architecture Search platform

October 4, 2020

Neural architecture search, or NAS, automates the search for the best-performing neural network architectures. Recent advances in NAS methods have made it possible to build faster, more compact, and more power-efficient problem-specific networks.

However, many of these methods rely on tricks that are not documented in any easily discoverable way. While the tricks are effective, they often obscure the performance of the search algorithm itself. Because different NAS methods use different enhancements, and some use none at all, comparing them fairly is very hard, and reproducing them is harder still. Promising methods can fail when transferred to other systems and datasets. Engineers applying these NAS methods also face difficulties: the complexity of the methods, the inability to compare them, conflicting research claims, fragmented code bases in research repositories, poorly managed hyperparameters, and the lack of plug-and-play techniques.

Microsoft took it upon itself to solve these problems and make cutting-edge NAS research more accessible. It recognised that a unified NAS framework could standardise and regulate the algorithms used in NAS research, so that reproduction and troubleshooting become much easier. This would accelerate development and allow for more ambitious NAS projects, even opening realms of NAS not studied before. With this in mind, Microsoft developed Archai, an open-source project available on GitHub. Short for Architecture AI, the name also means 'first principles' and captures the spirit of the work perfectly.

Archai will make NAS easier for researchers

Archai allows standard NAS algorithms to be executed from a single command line. The implementations currently available are Differentiable Architecture Search (DARTS), Petridish, Differentiable ArchiTecture Approximation (DATA), and eXperts Neural Architecture Search (XNAS). It is also easy to add new algorithms to Archai, to experiment with standardised datasets, and to add new datasets through unified interfaces. In addition, Archai isolates hyperparameters in a configuration system that makes assumptions and settings explicit. Architecture search is sensitive to its hyperparameters, so unified hyperparameter configuration lets users test different algorithms on a level playing field.
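
To illustrate the idea of a unified configuration, here is a minimal sketch. The names run_search and the configuration keys are hypothetical stand-ins, not Archai's actual API; the point is only the pattern of feeding several algorithms identical settings:

    # Hypothetical sketch (not Archai's actual API): several NAS algorithms
    # driven from one shared hyperparameter configuration, so every run
    # sees identical settings.
    shared = {
        "trainer": {"epochs": 600, "batch_size": 96},
        "augmentation": {"cutout": True, "autoaugment": False},
    }

    def run_search(algo: str, config: dict) -> None:
        # A real implementation would dispatch to the algorithm's searcher;
        # this stand-in only shows each algorithm receiving the same config.
        print(f"searching with {algo}: {config['trainer']}")

    for algo in ["darts", "petridish", "data", "xnas"]:
        run_search(algo, shared)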

Some of the key features of Archai

Declarative approach and reproducibility: Many NAS research works include enhancements that are small but difficult to catalogue, yet significantly affect neural network performance. Some works use 600 epochs for final architecture training while others use 1,500. Some employ AutoAugment for data augmentation while others use only Cutout. Microsoft researched the different codebases to extract these bags of tricks. Archai can switch any of them on or off through simple configuration, and the switches apply to all algorithms. Extracting the tricks also allows Archai to serve as a general-purpose framework capable of training manually designed neural networks with great efficiency. Recent research shows that using these tricks judiciously is usually more important than the differences between the architectures themselves.
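
As a rough illustration of the declarative approach, the sketch below uses made-up flag names, not Archai's configuration schema. It shows how a training recipe could gate each trick behind a switch that any algorithm's run can read:

    # Hypothetical sketch of a declarative bag of tricks: each enhancement
    # is a named switch in the recipe, so it can be toggled without code
    # changes. Flag names are illustrative, not Archai's schema.
    from typing import List

    def build_augmentations(recipe: dict) -> List[str]:
        """Assemble the augmentation pipeline that the recipe declares."""
        pipeline = []
        if recipe.get("autoaugment"):
            pipeline.append("AutoAugment")  # learned augmentation policy
        if recipe.get("cutout"):
            pipeline.append("Cutout")       # random square occlusion
        return pipeline

    # Two algorithms can now be compared on the same recipe, or one
    # algorithm re-run with a single trick flipped off.
    recipe = {"epochs": 600, "autoaugment": False, "cutout": True}
    print(build_augmentations(recipe))  # ['Cutout']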

Search-space abstractions: Current NAS research mostly focuses on small search spaces. Archai provides abstractions that expand search spaces significantly and in a more generalised way, and these abstractions are available to all algorithms. Microsoft hopes the research community will use them to push into larger search spaces and explore uncharted areas.
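
A minimal sketch of what such an abstraction might look like follows; the class names are hypothetical, not Archai's concrete types. If every search space exposes the same small interface, any algorithm written against that interface works on any space:

    # Hypothetical sketch of a search-space abstraction. Any algorithm
    # written against SearchSpace runs on either space below; random
    # sampling stands in for a real search procedure.
    import random
    from abc import ABC, abstractmethod

    class SearchSpace(ABC):
        @abstractmethod
        def sample(self) -> dict:
            """Return a random architecture description."""

    class CellSpace(SearchSpace):
        OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]

        def sample(self) -> dict:
            return {"cell": [random.choice(self.OPS) for _ in range(4)]}

    class MacroSpace(SearchSpace):
        def sample(self) -> dict:
            return {"depth": random.randint(8, 20),
                    "width": random.choice([16, 32, 64])}

    def random_search(space: SearchSpace, trials: int = 3) -> list:
        return [space.sample() for _ in range(trials)]

    print(random_search(CellSpace()))
    print(random_search(MacroSpace()))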

Mixing and matching of different techniques: Users can mix and match different techniques to create new options. Archai makes it easy to apply Petridish's growth method to DARTS, to apply L1 regularisation over architecture weights to other algorithms, or to run the online-learning-motivated update rules proposed by XNAS and Geometric NAS in new search spaces or even in new algorithms. The possibilities are endless, and Archai's modular components make mixing and matching simple.
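
A toy sketch of this modularity, with purely illustrative stand-ins for the real components (petridish_growth and l1_update are not Archai functions), shows how independent pieces can be recombined into a new search variant:

    # Hypothetical sketch of mixing components: a growth strategy borrowed
    # from one method paired with a regularised update rule from another.
    from typing import Callable, List, Tuple

    def petridish_growth(arch: List[str]) -> List[str]:
        """Grow the network by appending a candidate layer, Petridish-style."""
        return arch + ["candidate_layer"]

    def l1_update(weights: List[float], lr: float = 0.1,
                  lam: float = 0.01) -> List[float]:
        """Stand-in for an L1-regularised architecture-weight update."""
        return [w - lr * lam * (1.0 if w > 0 else -1.0) for w in weights]

    def search_step(arch: List[str], weights: List[float],
                    grow: Callable, update: Callable) -> Tuple[list, list]:
        # Growth strategy and update rule are independent components, so
        # any pairing of them yields a new search variant.
        return grow(arch), update(weights)

    arch, weights = ["stem"], [0.5, -0.3, 0.8]
    arch, weights = search_step(arch, weights, petridish_growth, l1_update)
    print(arch, weights)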

Generalised Pareto frontier search: NAS becomes necessary when deploying neural networks on constrained platforms such as smartphones or embedded devices. Researchers can expect constraints on power consumption, latency, memory usage, available FLOPs, and more, and the model must work within them even if accuracy is reduced. Designing optimal networks under many such constraints by hand is difficult, and current NAS algorithms will almost always outperform manual designs at this task. Archai can generate galleries of architectures with specified characteristics. Microsoft's own NAS method, Petridish, was designed to facilitate exactly this; it is now available through Archai with a higher-performing, distributed implementation. Microsoft also intends to generalise Pareto frontier generation across all algorithms, allowing almost all of them to leverage Petridish's signature technique to produce similar model galleries.
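
A minimal sketch of the Pareto idea, with numbers invented purely for illustration: the gallery keeps only the candidates that no other candidate beats on every axis at once.

    # Hypothetical sketch of Pareto-front gallery selection over two axes:
    # accuracy (higher is better) and latency (lower is better).
    models = [
        {"name": "a", "accuracy": 0.95, "latency_ms": 40},
        {"name": "b", "accuracy": 0.93, "latency_ms": 15},
        {"name": "c", "accuracy": 0.90, "latency_ms": 30},  # beaten by b
        {"name": "d", "accuracy": 0.88, "latency_ms": 10},
    ]

    def dominates(m1: dict, m2: dict) -> bool:
        """m1 dominates m2 if it is no worse on both axes and better on one."""
        no_worse = (m1["accuracy"] >= m2["accuracy"]
                    and m1["latency_ms"] <= m2["latency_ms"])
        better = (m1["accuracy"] > m2["accuracy"]
                  or m1["latency_ms"] < m2["latency_ms"])
        return no_worse and better

    # The gallery is every model not dominated by another candidate.
    gallery = [m for m in models
               if not any(dominates(o, m) for o in models if o is not m)]
    print([m["name"] for m in gallery])  # ['a', 'b', 'd']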

Other features offered by Archai include logging, publication-ready experiment reports, mixed-precision training, distributed training, and more, along with cross-platform code that runs on Linux, OS X, and Windows. A full list of features can be found on the Archai GitHub page.

About Archai

Archai is a platform for neural architecture search (NAS) that allows users to create efficient deep networks for their applications. By allowing different techniques to be mixed and matched within a common code base, Microsoft intends to make NAS research easier. Non-experts, too, can use its easy-to-use tools and systems in their own projects.