Efficient Sampled Softmax for Tensorflow (2004.05244v1)
Published 10 Apr 2020 in cs.LG
Abstract: This short paper discusses an efficient implementation of \emph{sampled softmax loss} for TensorFlow. The speedup over the default implementation is achieved by simplifying the graph for the forward and backward passes.
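For reference, sampled softmax approximates the full softmax cross-entropy by scoring only the true class plus a small set of sampled negative classes, with logits corrected by the log of the sampling probability. The following is a minimal NumPy sketch of that computation under a uniform candidate sampler; it illustrates the loss the paper optimizes, not the paper's implementation (all array names and sizes here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, dim, batch, num_sampled = 1000, 16, 4, 20

W = rng.normal(size=(num_classes, dim))  # output class embeddings
b = np.zeros(num_classes)                # output biases
h = rng.normal(size=(batch, dim))        # hidden activations
labels = rng.integers(0, num_classes, size=batch)

# Uniform candidate sampling; q is the per-class sampling probability,
# whose log is subtracted from the logits as the standard correction.
sampled = rng.choice(num_classes, size=num_sampled, replace=False)
q = 1.0 / num_classes

# Logits restricted to the true class and the sampled negatives.
true_logits = np.einsum('bd,bd->b', h, W[labels]) + b[labels] - np.log(q)
samp_logits = h @ W[sampled].T + b[sampled] - np.log(q)

# Softmax cross-entropy over the (1 + num_sampled)-way problem,
# with the true class placed in column 0.
logits = np.concatenate([true_logits[:, None], samp_logits], axis=1)
logits -= logits.max(axis=1, keepdims=True)  # numerical stability
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -log_probs[:, 0].mean()
```

In TensorFlow this corresponds to `tf.nn.sampled_softmax_loss`, which is the default implementation whose graph the paper simplifies.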