
Taiji Suzuki

[ICLR2021 (spotlight)] Benefit of deep learning with non-convex noisy gradient descent
[NeurIPS2020 (spotlight)] Generalization bound of globally optimal non-convex neural network training: Transportation map estimation by infinite dimensional Langevin dynamics
Mathematics of deep learning: connections to kernel methods and sparse estimation
[ICLR2020] Compression based bound for non-compressed network: unified generalization error analysis of large compressible deep neural network
Principles of deep learning explained through mathematics
Mathematics of deep learning
Machine learning for beginners
Minimax optimal alternating minimization for kernel nonparametric tensor learning
IBIS2016
Sparse estimation tutorial 2014
Stochastic Alternating Direction Method of Multipliers
Theory of online stochastic optimization in machine learning
PAC-Bayesian Bound for Gaussian Process Regression and Multiple Kernel Additive Model
Statistical learning theory tutorial: from fundamentals to applications (IBIS2012)
Jokyokai