TY - CONF
T1 - Designing neural network hardware accelerators with decoupled objective evaluations
T2 - NIPS workshop on Bayesian Optimization
Y1 - 2016
A1 - Hernández-Lobato, José Miguel
A1 - Gelbart, Michael A
A1 - Reagen, Brandon
A1 - Adolf, Robert
A1 - Hernández-Lobato, Daniel
A1 - Whatmough, Paul N
A1 - Brooks, David
A1 - Wei, Gu-Yeon
A1 - Adams, Ryan P
AB - Software-based implementations of deep neural network predictions consume large amounts of energy, limiting their deployment in power-constrained environments. Hardware acceleration is a promising alternative, but it is challenging to design accelerators that achieve both low prediction error and low energy consumption. Bayesian optimization can be used to address this design problem. However, most existing techniques collect data in a coupled way, always evaluating the two objectives (energy and error) jointly at the same input, which is inefficient. Instead, in this work we consider a decoupled approach in which, at each iteration, we choose which objective to evaluate next and at which input. We show that decoupled evaluations produce better solutions when computational resources are limited. Our results also indicate that evaluating the prediction error is more important than evaluating the energy consumption.
JF - NIPS workshop on Bayesian Optimization
UR - https://jmhldotorg.files.wordpress.com/2013/10/hardwarenn_nipsworkshopbo20161.pdf
ER -