College News

Academic Talk Announcements


The School of Mathematics and Statistics Invites Professor Shaogao Lv of Nanjing Audit University to Give an Academic Talk

Published: 2021-12-27

Title: Quantized SGD in federated learning: communication, optimization and generalization

Time: 15:00, December 28, 2021

Meeting link: https://meeting.tencent.com/dm/cBL0NlKYHNLZ

Meeting ID: 767-416-360

Host: Office of Scientific Research / School of Mathematics and Statistics

Speaker: Shaogao Lv

About the speaker: Shaogao Lv is a professor at the School of Statistics and Data Science, Nanjing Audit University, and serves as an external doctoral supervisor. He received his Ph.D. in science in 2011 through the joint training program of the University of Science and Technology of China and the City University of Hong Kong. His main research area is statistical machine learning; his current interests include federated learning, reproducing kernel methods, and deep learning with graph neural networks. He has published more than 20 papers in SCI-indexed international venues in statistics and artificial intelligence, including the Annals of Statistics, the Journal of Machine Learning Research, NeurIPS, and the Journal of Econometrics. He has led two projects funded by the National Natural Science Foundation of China, and has long served as a program committee member or reviewer for the leading AI conferences NeurIPS, ICML, AAAI, and AISTATS.

报告摘要:Quantization techniques have become one of commonly used techniques to improve communication efficiency of federated learning. However, randomized quantization in federated learning will cause additional variance and thus impact accuracy. Besides, few studies in related literature consider the impact of quantization on generalization ability of a given algorithm. In this paper, we are interested in the interplay among some key factors of the widely-used distributed SGD with quantization, including quantization level, optimization error and generalization performance. For convex objectives, our main results show that there exist several trade-offs among communication efficiency, optimization error and generalization performance. For non-convex objectives, our theoretical results indicate that its quantization level has a more significant impact on generalization ability, compared to the convex cases. In the non-convex cases, our derived generalization bounds also show that early stopping may be required to guarantee certain generalization accuracy, even if the step size in SGD is very small. Finally, several numerical experiments using the logistic model and neural networks are implemented to support our theoretical findings.

