  ➢ Marchenko–Pastur law, Stieltjes transform (a minimal statement is sketched below)
• Evaluation via concentration inequalities
  ➢ Upper bounds on the prediction error at finite sample size (𝑛 < ∞)
  ➢ Convergence rates can be evaluated (assesses the behavior before taking 𝑛 → ∞)
  ◼ Dobriban & Wager: High-dimensional asymptotics of prediction: Ridge regression and classification. The Annals of Statistics, 46(1):247–279, 2018.
  ◼ Hastie et al.: Surprises in High-Dimensional Ridgeless Least Squares Interpolation. arXiv:1903.08560.
  ◼ Mei & Montanari: The generalization error of random features regression: Precise asymptotics and the double descent curve. Communications on Pure and Applied Mathematics. arXiv:1908.05355 (2019).
  ◼ Belkin, Rakhlin & Tsybakov: Does data interpolation contradict statistical optimality? AISTATS 2019.
  ◼ Bartlett, Long, Lugosi & Tsigler: Benign Overfitting in Linear Regression. PNAS, 117(48):30063–30070, 2020.
  ◼ Liang & Rakhlin: Just interpolate: Kernel “Ridgeless” regression can generalize. The Annals of Statistics, 48(3):1329–1347, 2020.
• CGMT (Convex Gaussian Min-max Theorem; statement sketched below)
  ◼ Thrampoulidis, Oymak & Hassibi: Regularized linear regression: A precise analysis of the estimation error. COLT 2015.
  ◼ Thrampoulidis, Abbasi & Hassibi: Precise error analysis of regularized M-estimators in high dimensions. IEEE Transactions on Information Theory, 64(8):5592–5628, 2018.
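For reference, a minimal statement of the first tool named above, in the simplest setting only (i.i.d. mean-zero, unit-variance entries, aspect ratio γ = p/n); the cited papers work with more general versions.

% Marchenko–Pastur law: limiting spectral distribution of the sample covariance
% \hat\Sigma = \frac{1}{n} X X^\top, where X \in \mathbb{R}^{p \times n} has i.i.d.
% mean-zero, unit-variance entries and p/n \to \gamma \in (0, \infty).
\[
  \mu_{\mathrm{MP}}(dx)
  = \Bigl(1 - \tfrac{1}{\gamma}\Bigr)_{+} \delta_0(dx)
  + \frac{\sqrt{(\gamma_{+} - x)(x - \gamma_{-})}}{2\pi \gamma x}\,
    \mathbf{1}_{[\gamma_{-},\,\gamma_{+}]}(x)\, dx,
  \qquad \gamma_{\pm} = (1 \pm \sqrt{\gamma})^{2}.
\]
% Stieltjes transform of a probability measure \mu on \mathbb{R}:
\[
  m_{\mu}(z) = \int_{\mathbb{R}} \frac{\mu(dx)}{x - z}, \qquad z \in \mathbb{C}^{+};
\]
% for \mu_{\mathrm{MP}} it is the root of the quadratic
\[
  \gamma z\, m(z)^{2} + (z + \gamma - 1)\, m(z) + 1 = 0
\]
% on the branch with m(z) \sim -1/z as |z| \to \infty.

And a sketch of the CGMT in the form used by Thrampoulidis et al.: a primary optimization (PO) over a Gaussian matrix G is compared to an auxiliary optimization (AO) over Gaussian vectors g, h; see the cited papers for the exact regularity conditions.

% PO and AO, for compact sets S_w \subset \mathbb{R}^{n}, S_u \subset \mathbb{R}^{m},
% a continuous function \psi, and independent G \in \mathbb{R}^{m \times n},
% g \in \mathbb{R}^{m}, h \in \mathbb{R}^{n} with i.i.d. standard Gaussian entries:
\[
  \Phi(G) = \min_{w \in S_w} \max_{u \in S_u} \; u^\top G w + \psi(w, u),
  \qquad
  \phi(g, h) = \min_{w \in S_w} \max_{u \in S_u} \; \|w\|_2\, g^\top u + \|u\|_2\, h^\top w + \psi(w, u).
\]
% Gordon-type comparison: for every c \in \mathbb{R},
\[
  \mathbb{P}\bigl(\Phi(G) < c\bigr) \le 2\, \mathbb{P}\bigl(\phi(g, h) \le c\bigr);
\]
% if, in addition, S_w and S_u are convex and \psi(w, u) is convex in w and concave in u,
\[
  \mathbb{P}\bigl(\Phi(G) > c\bigr) \le 2\, \mathbb{P}\bigl(\phi(g, h) \ge c\bigr),
\]
% so \Phi(G) concentrates wherever the (simpler) auxiliary value \phi(g, h) does.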