%0 Conference Paper
%B NIPS workshop on Bayesian Optimization
%D 2016
%T Designing neural network hardware accelerators with decoupled objective evaluations
%A Hernández-Lobato, José Miguel
%A Gelbart, Michael A
%A Reagen, Brandon
%A Adolf, Robert
%A Hernández-Lobato, Daniel
%A Whatmough, Paul N
%A Brooks, David
%A Wei, Gu-Yeon
%A Adams, Ryan P
%X Software-based implementations of deep neural network predictions consume large amounts of energy, limiting their deployment in power-constrained environments. Hardware acceleration is a promising alternative. However, it is challenging to efficiently design accelerators that have both low prediction error and low energy consumption. Bayesian optimization can be used to accelerate the design process. However, most of the existing techniques collect data in a coupled way by always evaluating the two objectives (energy and error) jointly at the same input, which is inefficient. Instead, in this work we consider a decoupled approach in which, at each iteration, we choose which objective to evaluate next and at which input. We show that considering decoupled evaluations produces better solutions when computational resources are limited. Our results also indicate that evaluating the prediction error is more important than evaluating the energy consumption.
%P 10
%G eng
%U https://jmhldotorg.files.wordpress.com/2013/10/hardwarenn_nipsworkshopbo20161.pdf