On non-redundant nonlinear dimension reduction
Posted: 2025-04-17 10:57

Time: 2025-03-18, 10:00

Venue: 理四 323
Registration deadline: 2025-03-17, 17:00
About the Speaker
骆威 (Wei Luo) is a research professor at the Center for Data Science, Zhejiang University. He received his Ph.D. from Pennsylvania State University in 2014, worked at Baruch College in the United States, and joined Zhejiang University in 2018. His research interests include sufficient dimension reduction and causal inference. He has published in international statistics and machine learning journals including the Annals of Statistics, Biometrika, JRSSB, and JMLR, and is currently the principal investigator of a Young Scientists Fund (Category B; formerly the Excellent Young Scientists Fund) project of the National Natural Science Foundation of China.
Affiliation: Zhejiang University
Abstract
Kernel principal component analysis (KPCA; Schölkopf et al., 1998), a popular nonlinear dimension reduction technique, aims to find a basis of a presumed low-dimensional function space. This causes a redundancy issue: each kernel principal component can be a measurable function of the preceding components, which harms the effectiveness of dimension reduction and leaves the dimension of the reduced data a heuristic choice. In this paper, we formulate the parameter of interest for nonlinear dimension reduction as a small function set that generates the σ-field of the original data, and, using a novel characterization of near conditional mean independence, we propose two sequential dimension reduction methods that address the redundancy issue, have the same level of computational complexity as KPCA, and rely on more plausible assumptions about the singularity of the original data. Compared with other nonlinear dimension reduction methods, the proposed methods are applicable to various complex cases, with guarantees on both asymptotic consistency and the smoothness and interpretability of the reduced data. By constructing a measure of exhaustiveness of the reduced data, we also provide consistent order determination for these methods. Some supportive numerical studies are presented at the end.
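The redundancy issue above can be seen in a minimal NumPy-only sketch of KPCA: for data on a circle, the intrinsic dimension is one (the angle θ), yet several leading kernel principal components are retained, each of them a deterministic function of θ. The RBF kernel, the bandwidth, and the circle example are illustrative choices for this note, not taken from the talk.

```python
import numpy as np

def kpca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel (Scholkopf et al., 1998); NumPy-only sketch."""
    n = X.shape[0]
    # Pairwise squared distances -> RBF (Gaussian) kernel matrix
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Double-center the kernel matrix (centering in feature space)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecomposition; keep the leading eigenpairs, sorted descending
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Kernel principal component scores: eigenvectors scaled by sqrt(eigenvalue)
    return vecs * np.sqrt(np.maximum(vals, 0)), vals

# Data on the unit circle: intrinsically one-dimensional (parametrized by
# theta), yet KPCA returns several nontrivial components, each a measurable
# function of theta -- the redundancy the proposed methods aim to remove.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])
scores, vals = kpca(X, n_components=4, gamma=2.0)
```

Here the four retained components correspond to harmonics of θ, so the reduced data is four-dimensional even though a single generating function of the data's σ-field would suffice.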

