The HOGWILD! Algorithm

[File Attributes]:
File name: HOGWILD演算法
File size: 267KB
File format: PDF
Updated: 2017-08-25 10:32:11
Tags: ML AI ALGORITHM

Stochastic Gradient Descent (SGD) is a popular algorithm that can achieve state-of-the-art performance on a variety of machine learning tasks. Several researchers have recently proposed schemes to parallelize SGD, but all require performance-destroying memory locking and synchronization. This work aims to show, using novel theoretical analysis, algorithms, and implementation, that SGD can be implemented without any locking. We present an update scheme called Hogwild! which allows processors access to shared memory with the possibility of overwriting each other's work. We show that when the associated optimization problem is sparse, meaning most gradient updates only modify small parts of the decision variable, then Hogwild! achieves a nearly optimal rate of convergence. We demonstrate experimentally that Hogwild! outperforms alternative schemes that use locking by an order of magnitude.
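The abstract describes the core mechanism: many threads apply sparse SGD updates to a single shared parameter vector with no locks, accepting that updates occasionally overwrite one another. Below is a minimal Python sketch of that idea on a synthetic sparse logistic-regression problem. All names and values here (worker, SUPPORT, the step size, the synthetic data) are illustrative assumptions, not the paper's code; the actual Hogwild! implementation uses native threads in C, and CPython's GIL limits the true parallelism of this sketch.

```python
import threading
import numpy as np

# Hogwild!-style lock-free SGD sketch (illustrative, not the paper's code).
NUM_THREADS = 4
STEPS_PER_THREAD = 2500
STEP_SIZE = 0.01
NUM_FEATURES = 1000
NUM_EXAMPLES = 10000
SUPPORT = 10  # each example touches only 10 of the 1000 coordinates

# Synthetic sparse data: per example, the indices and values of its support.
rng = np.random.default_rng(0)
rows = [rng.choice(NUM_FEATURES, size=SUPPORT, replace=False)
        for _ in range(NUM_EXAMPLES)]
vals = [rng.normal(size=SUPPORT) for _ in range(NUM_EXAMPLES)]
labels = rng.choice([-1.0, 1.0], size=NUM_EXAMPLES)

# Shared decision variable: every thread reads and writes it with NO locks.
w = np.zeros(NUM_FEATURES)

def worker(seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(STEPS_PER_THREAD):
        i = int(local_rng.integers(NUM_EXAMPLES))
        idx, x, y = rows[i], vals[i], labels[i]
        # Sparse read: only the coordinates this example touches.
        margin = y * float(np.dot(w[idx], x))
        # Gradient of the logistic loss log(1 + exp(-y w.x)) on the support.
        g = -y * x / (1.0 + np.exp(margin))
        # Lock-free read-modify-write: another thread may overwrite these
        # entries concurrently. Hogwild! tolerates such races because
        # sparse updates rarely collide on the same coordinates.
        w[idx] -= STEP_SIZE * g

threads = [threading.Thread(target=worker, args=(s,)) for s in range(NUM_THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("trained weight norm:", np.linalg.norm(w))
```

The sparsity assumption is what makes the racy update acceptable: each step modifies only a handful of coordinates, so concurrent threads seldom write to the same entries, and the paper's analysis shows the occasional collision does not destroy the convergence rate.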
