

[Academic Lecture] Notice of the 560th Session of the Graduate "Lingxi Academic Hall": Lecture by Professor Yangyang Xu

Published: September 18, 2020    Source: Graduate School

To all faculty and students:

The university will hold a Graduate "Lingxi Academic Hall" lecture by Professor Yangyang Xu on September 26, 2020. Details are as follows:

1. Lecture Overview

Speaker: Professor Yangyang Xu

Time: 10:00 a.m., Saturday, September 26, 2020

Venue: Tencent Meeting (meeting ID: 688 912 696)

Title: Accelerating stochastic gradient methods

Abstract: Stochastic gradient methods have been extensively used to train machine learning models, in particular deep learning models. Various techniques have been applied to accelerate them, either numerically or theoretically, such as momentum acceleration and adaptive learning rates. In this talk, I will present two ways to accelerate stochastic gradient methods. The first is to accelerate the popular adaptive (Adam-type) stochastic gradient method by asynchronous (async) parallel computing. Numerically, async-parallel computing can achieve significantly higher parallelization speed-up than its sync-parallel counterpart. Several previous works have studied async-parallel non-adaptive stochastic gradient methods; however, a non-adaptive stochastic gradient method often converges significantly more slowly than an adaptive one. I will show that our async-parallel adaptive stochastic gradient method can achieve near-linear speed-up on top of the fast convergence of an adaptive stochastic gradient method. In the second part, I will present a momentum-accelerated proximal stochastic gradient method, which has provably faster convergence than a standard proximal stochastic gradient method. I will also show experimental results demonstrating its superiority in training a sparse deep learning model.
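For readers unfamiliar with the two techniques named in the abstract, the following minimal Python sketch illustrates a standard Adam-type adaptive update and a momentum-accelerated proximal stochastic gradient step for an l1-regularized problem. This is an illustrative sketch only, not Professor Xu's algorithm: the function names, hyperparameter values, and the choice of the l1 soft-thresholding proximal operator are assumptions made for the example.

    import numpy as np

    def adam_step(x, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam-type adaptive stochastic gradient update (standard Adam).
        x: parameters, g: stochastic gradient, m/v: first/second moment
        estimates, t: 1-based iteration counter."""
        m = beta1 * m + (1 - beta1) * g        # first-moment (momentum) estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment (adaptive) estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
        return x, m, v

    def soft_threshold(x, tau):
        """Proximal operator of tau * ||x||_1 (soft-thresholding)."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def prox_sgd_momentum_step(x, g, m, lr=1e-2, beta=0.9, lam=1e-3):
        """One momentum-accelerated proximal stochastic gradient step for
        min_x f(x) + lam * ||x||_1, where g is a stochastic gradient of f.
        Hypothetical hyperparameters; the talk's method may differ."""
        m = beta * m + (1 - beta) * g              # momentum on the smooth part
        x = soft_threshold(x - lr * m, lr * lam)   # gradient step, then prox
        return x, m

In this sketch, the l1 proximal step is what drives many parameters exactly to zero, which is how a proximal method produces the kind of sparse deep learning model mentioned at the end of the abstract.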

2. Faculty and students from all schools are welcome to attend. Please turn off your mobile phone or set it to silent mode during the lecture.

Student Affairs Department of the Party Committee, Northwestern Polytechnical University

School of Mathematics and Statistics

MIIT Key Laboratory of Dynamics and Control of Complex Systems

September 18, 2020

Speaker Biography

Dr. Yangyang Xu (徐揚揚) is a tenure-track assistant professor in the Department of Mathematical Sciences at Rensselaer Polytechnic Institute. He received his B.S. in Computational Mathematics from Nanjing University in 2007, his M.S. in Operations Research from the Chinese Academy of Sciences in 2010, and his Ph.D. from the Department of Computational and Applied Mathematics at Rice University in 2014. His research interests are optimization theory and methods and their applications in machine learning, statistics, and signal processing. He has developed optimization algorithms for compressed sensing, matrix completion, and tensor factorization and learning. Recently, his research has focused on first-order methods, operator splitting, stochastic optimization methods, and high-performance parallel computing. He has published over 30 papers in prestigious journals and conference proceedings, and he was awarded a gold medal at the 2017 International Consortium of Chinese Mathematicians.
