Forum Registration | Mathematical Foundations: Major Theoretical Challenges and Recent Advances in Artificial Intelligence


Join 6 Turing Award laureates and more than 100 experts

to explore the next decade of artificial intelligence

Press and hold the image or click "Read the original" to register: an insiders' gathering, free to attend for the first time

Countdown to the Beijing BAAI Conference: 10 days


On June 21-24, 2020, the second Beijing BAAI Conference (official site: https://2020.baai.ac.cn) will bring together more than one hundred leaders in artificial intelligence, including 6 Turing Award laureates, to review the past, look ahead, and systematically explore "the next decade of artificial intelligence". The conference will host 19 topical forums covering the mathematical foundations of AI, natural language processing, intelligent system architectures and chips, AI ethics, governance and sustainable development, machine learning, intelligent information retrieval and mining, the cognitive and neural basis of intelligence, machine perception, decision intelligence, AI for healthcare, AI entrepreneurship, AI for transportation, AI + big data + epidemic response, AI frameworks, graph neural networks, knowledge-driven intelligence, reinforcement learning, young scientists' frontiers of machine learning, and women in AI. Together the forums span fundamental research and innovative applications, analyze the latest developments in AI in the current context, and explore directions for future development.

 

This series walks you through each forum, introducing the speakers and the frontier trends they cover. Today we introduce the forum on the mathematical foundations of artificial intelligence, to be held on the afternoon of June 21.

 

Forum Chair

 

Zhang Pingwen (张平文)

 

Professor in the School of Mathematical Sciences at Peking University, Member of the Chinese Academy of Sciences, and Chief Scientist at BAAI. His research centers on computational mathematics and scientific computing, spanning the mathematical theory and computational methods of complex fluids, moving mesh methods and their applications, and multiscale algorithms and analysis, where he has produced a series of important original results; to date he has published more than 100 papers in journals such as JAMS, SINUM, and PRL. His honors include the Second Prize of the State Natural Science Award, the First Prize of the Ministry of Education Natural Science Award for Higher Education Institutions, the National Science Fund for Distinguished Young Scholars, the Feng Kang Prize of Scientific Computing, a Cheung Kong Distinguished Professorship of the Ministry of Education, national-level selection in the "Hundred, Thousand and Ten Thousand Talents Project", and leadership of an NSFC "Innovative Research Group".

Talks and Speakers

 

1. Overparametrization and the Bias-Variance Dilemma

Abstract: For several machine learning methods such as neural networks, good generalisation performance has been reported in the overparametrized regime. In view of the classical bias-variance trade-off, this behaviour is highly counterintuitive. The talk summarises recent theoretical results on overparametrization and the bias-variance trade-off. This is joint work with Alexis Derumigny.

    

Speaker: Johannes Schmidt-Hieber

Professor of Statistics at the University of Twente, the Netherlands. His current research focuses on the theoretical foundations of machine learning. He is an associate editor of the Annals of Statistics and Bernoulli.

2. Optimality Conditions for Constrained Minimax Optimization

Abstract: Minimax optimization problems arise both from modern machine learning, including generative adversarial networks, adversarial training, and multi-agent reinforcement learning, and from traditional research areas such as saddle point problems, numerical partial differential equations, and optimality conditions for equality-constrained optimization. For the unconstrained, continuous, nonconvex-nonconcave setting, Jin, Netrapalli and Jordan (2019) carefully considered the very basic question of what a proper definition of a local optimum of a minimax optimization problem is, and proposed a notion of local optimality called the local minimax point. We extend the definition of local minimax points to constrained nonconvex-nonconcave minimax optimization problems. By analyzing Jacobian uniqueness conditions for the lower-level maximization problem and the strong regularity of the Karush-Kuhn-Tucker conditions of the maximization problem, we provide both necessary and sufficient optimality conditions for local minimax points of constrained minimax optimization problems.
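Why min-max problems need their own optimality and algorithm theory can be seen on the simplest bilinear example (an illustrative toy, not from the talk): for min over x, max over y of f(x, y) = x*y, the unique stationary point is (0, 0), yet naive simultaneous gradient descent-ascent spirals away from it, while the extragradient method converges.

```python
import math

# min_x max_y f(x, y) = x * y; grad_x f = y, grad_y f = x.
# The unique stationary point is (0, 0).

def gda(x, y, eta=0.1, steps=2000):
    """Simultaneous gradient descent-ascent: spirals outward here."""
    for _ in range(steps):
        x, y = x - eta * y, y + eta * x
    return x, y

def extragradient(x, y, eta=0.1, steps=2000):
    """Extragradient: a prediction half-step, then an update using
    gradients evaluated at the predicted point; converges here."""
    for _ in range(steps):
        xh, yh = x - eta * y, y + eta * x   # prediction
        x, y = x - eta * yh, y + eta * xh   # correction
    return x, y

print("GDA distance from (0,0):          ", math.hypot(*gda(1.0, 1.0)))
print("extragradient distance from (0,0):", math.hypot(*extragradient(1.0, 1.0)))
```

A short calculation explains both behaviors: each GDA step multiplies the squared distance to the origin by 1 + eta^2 (divergence), while each extragradient step multiplies it by 1 - eta^2 + eta^4 < 1 for eta < 1 (convergence).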

 

Speaker: Dai Yu-Hong (戴彧虹)

BAAI Principal Investigator; Feng Kang Distinguished Professor and Assistant President of the Academy of Mathematics and Systems Science (AMSS), Chinese Academy of Sciences. His research interests lie mainly in nonlinear optimization, integer programming, and optimization problems arising in AI and various applications. He is particularly interested in proposing simple but efficient optimization methods and in establishing theoretical properties of existing elegant optimization methods, and has published widely. His honors include the fifth Zhong Jiaqing Mathematics Award, the Second Prize of the National Natural Science Award of China (2006), the tenth China Youth Science and Technology Award, a Best Paper Award at an international conference on communications, the China National Funds for Distinguished Young Scientists, the Feng Kang Prize of Scientific Computing, the Shiing-Shen Chern Mathematics Award, and the first Xiao Shutie Applied Mathematics Prize.

3. Instrumental Variables for Multiple Causal Inference: Old and New

Abstract: Instrumental variable (IV) methods have a rich history and offer arguably the most viable way to control for unobserved confounding in causal inference. Classical IV methods require restrictive validity assumptions that are unlikely to hold in modern machine learning applications. We review recently proposed identification and inference strategies that relax these assumptions, focusing on the cases of multiple causes and high-dimensional instruments. Building on the IV framework, we suggest possible ways to improve the interpretability and explainability of AI algorithms.
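As a minimal sketch of the classical IV idea underlying the talk (all simulated data and coefficients below are made up for illustration), a valid instrument removes the bias that an unobserved confounder induces in ordinary least squares:

```python
import numpy as np

# Simulated data with an unobserved confounder u. The true causal
# effect of x on y is 2.0. OLS is biased because u drives both x and y;
# the instrument z (affects x, independent of u) recovers the effect.
rng = np.random.default_rng(0)
n = 50_000
u = rng.standard_normal(n)                 # unobserved confounder
z = rng.standard_normal(n)                 # instrument
x = z + u + 0.5 * rng.standard_normal(n)   # cause
y = 2.0 * x + 1.5 * u + 0.5 * rng.standard_normal(n)

beta_ols = (x @ y) / (x @ x)   # confounded regression, biased upward
beta_iv = (z @ y) / (z @ x)    # simple IV / 2SLS estimator (one instrument)

print(f"OLS estimate: {beta_ols:.3f}   IV estimate: {beta_iv:.3f}   truth: 2.0")
```

With one instrument and one cause this ratio estimator coincides with two-stage least squares; the multiple-cause, high-dimensional-instrument settings in the abstract are precisely where such classical validity assumptions start to fail.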

    

Speaker: Lin Wei (林伟)

BAAI Principal Investigator; Assistant Professor in the School of Mathematical Sciences and the Center for Statistical Science at Peking University. He obtained his Ph.D. in Applied Mathematics from the University of Southern California in 2011 and was a postdoctoral researcher at the University of Pennsylvania before joining PKU. His research interests include high-dimensional statistics, statistical machine learning, and causal inference. He has published in top journals such as the Journal of the American Statistical Association, Biometrika, Biometrics, IEEE Transactions on Information Theory, and Operations Research.

4. Towards Better Global Landscape of GAN: How Two Lines of Code Change Makes a Difference

Abstract: GANs (generative adversarial networks) have been very popular in data generation and unsupervised learning, but our understanding of GAN training is still very limited. One major reason is that GANs are often formulated as non-convex-concave min-max optimization problems; as a result, most recent studies have focused on analysis in a local region around the equilibrium. In this talk, we discuss how to perform a global analysis of GANs and analyze mode collapse from an optimization perspective. We find that the original GAN has exponentially many bad strict local minima, which are perceived as mode collapse. We show that a simple modification of the original GAN enjoys a better global landscape: it has no bad basins, and its training dynamics (with linear discriminators) admit a Lyapunov function that leads to global convergence. Our experiments on standard datasets such as CIFAR-10 and CelebA show that this simple loss outperforms the original GAN and WGAN-GP.
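The abstract does not spell out the two-line change. One family of modifications with this flavor replaces per-sample discriminator scores with real-fake score differences inside the loss; whether this matches the talk's exact modification is an assumption on my part. A structural difference is easy to check in a few lines: the paired loss depends only on score differences, so it is invariant to shifting all discriminator outputs by a constant, while the original GAN loss is not.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def gan_d_loss(d_real, d_fake):
    """Original GAN discriminator loss on raw (pre-sigmoid) scores."""
    return -np.mean(np.log(sigmoid(d_real)) + np.log(1.0 - sigmoid(d_fake)))

def paired_d_loss(d_real, d_fake):
    """A 'paired' variant scoring real-fake differences: one candidate
    two-line change of this flavor (an assumption, not necessarily the
    talk's exact loss)."""
    return -np.mean(np.log(sigmoid(d_real - d_fake)))

rng = np.random.default_rng(0)
d_real = rng.standard_normal(8)   # hypothetical discriminator scores
d_fake = rng.standard_normal(8)
shift = 5.0

print("original loss changes under a uniform score shift:",
      gan_d_loss(d_real, d_fake), "->", gan_d_loss(d_real + shift, d_fake + shift))
print("paired loss is shift-invariant:",
      paired_d_loss(d_real, d_fake), "==", paired_d_loss(d_real + shift, d_fake + shift))
```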

Speaker: Sun Ruoyu (孙若愚)

Assistant Professor in the Department of Industrial and Enterprise Systems Engineering (ISE), affiliated with the Coordinated Science Lab (CSL) and the Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign (UIUC). Before joining UIUC, he was a visiting research scientist at Facebook AI Research and a postdoctoral researcher at Stanford University. He obtained his Ph.D. in electrical engineering from the University of Minnesota and his B.S. in mathematics from Peking University. He won second place in the INFORMS George Nicholson student paper competition and an honorable mention in the INFORMS Optimization Society student paper competition. His current research interests lie in optimization and machine learning, especially deep learning and large-scale optimization.


