NAS for Ternary Neural Networks
Introduction
- Neural Architecture Search (NAS) for Ternary Neural Networks
- Goal: Design an algorithm that can automatically find the best TNN architecture for a given task
Task
- Breast Cancer dataset
- Baseline from the default implementation of TNN
- Power: 6.12kW
- Accuracy: 98.05%
- Area: 108103083.0 \(nm^2\)
Implementation
- Utilizing DEAP library
- NSGA-II, SPEA2 or NSGA-III selection
- Chromosome encoding
- Multi-objective fitness - each candidate architecture is scored on accuracy, area, and power
Experiments and Hypotheses
- Different evolutionary algorithms (NSGA-II, SPEA2, NSGA-III)
- Probabilities of crossover and mutation
- Population size
- Number of generations
- Extra objectives
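The experiments above amount to a hyperparameter grid, where each setting drives one full NAS run. A sketch (the value grids and the `run_nas` call are hypothetical placeholders):

```python
import itertools

# Assumed example grids; the actual values tested are not fixed here.
algorithms = ["NSGA-II", "SPEA2", "NSGA-III"]
cx_probs = [0.5, 0.7, 0.9]     # crossover probabilities
mut_probs = [0.05, 0.1, 0.2]   # mutation probabilities
pop_sizes = [50, 100, 200]
n_gens = [50, 100]

grid = list(itertools.product(algorithms, cx_probs, mut_probs,
                              pop_sizes, n_gens))
for algo, cxpb, mutpb, mu, ngen in grid:
    # run_nas(algo, cxpb, mutpb, mu, ngen)  # hypothetical: one NAS run
    pass
```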
SPEA2 vs NSGA-II vs NSGA-III
- NSGA-II was the first choice, but its results were not very good
- SPEA2 was very slow and could not find good solutions quickly enough
- NSGA-III was chosen because it handles many-objective optimization well
SPEA2
NSGA-II
NSGA-III
Comparison of different probabilities of crossover
Comparison of different population sizes
Results
- NSGA-III is the best-performing algorithm (it finds the best solutions)
- Best solution:
- Accuracy: 98.55%
- Area: 8439297.0 \(nm^2\)
- Power: 0.5236 kW
A more than tenfold reduction in power consumption with a slight increase in accuracy.
Problems 1
- Wrong choice of chromosome encoding - the algorithm gets stuck permuting the same architecture
- Low chance of reaching optimal solutions that lie in harder-to-find regions of the search space
Problems 2
Pareto front showing the trade-offs between accuracy, area, and power consumption across solutions
Short summary
- Accuracy increased by 0.5%
- Area decreased by ~92%
- Power decreased by ~90%
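The summary figures follow directly from the baseline and best-solution numbers reported earlier:

```python
# Baseline vs. best solution found, as reported in the slides.
baseline = {"accuracy": 98.05, "area": 108103083.0, "power": 6.12}
best     = {"accuracy": 98.55, "area": 8439297.0,   "power": 0.5236}

acc_gain   = best["accuracy"] - baseline["accuracy"]  # +0.50 points
area_drop  = 1 - best["area"] / baseline["area"]      # ~0.92 (92%)
power_drop = 1 - best["power"] / baseline["power"]    # ~0.91 (91%)
power_x    = baseline["power"] / best["power"]        # ~11.7x reduction
```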
Thank you for your attention!