A statistical journey to DNN, the first trip: From regression to deep neural network
Korean J Appl Stat 2024;37(5):541-551
Published online October 31, 2024
© 2024 The Korean Statistical Society.

Hee Ju Kima, In Jun Hwanga, Yu Jin Kima, Yoon Dong Lee1,a


aBusiness School, Sogang University
1Business School, Sogang University, PA 804, BaekBumRo, Mapo, Seoul 04107, Korea. E-mail: widylee@sogang.ac.kr
Received July 31, 2024; Revised August 9, 2024; Accepted August 12, 2024.
Abstract
It has become difficult to discuss statistics without mentioning artificial intelligence and deep neural networks. Although advances in artificial intelligence and deep neural networks build on major research achievements in statistics, modern statistics and artificial intelligence are often perceived as distinctly different approaches. A primary reason appears to be that the statistics curriculum has not evolved to keep pace with the times. In this paper, to establish a framework for expanding and developing statistics education, we examine the relationship between deep neural networks, specifically multi-layer perceptrons, and regression analysis from a statistical perspective, and explore their similarities and differences.
Keywords : deep neural net, classification, multi-layer perceptron, regression, generalized linear model
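The connection between multi-layer perceptrons and regression that the abstract describes can be illustrated with a minimal sketch (an assumption for illustration, not code from the paper): a network with no hidden layer and an identity activation is exactly the linear regression model, so training it by gradient descent on squared error recovers the ordinary least squares fit.

```python
# Minimal sketch (illustrative, not from the paper): a zero-hidden-layer
# perceptron with identity activation, y_hat = X @ w + b, trained by
# gradient descent on mean squared error, converges to the OLS solution.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.5, -2.0, 0.5]) + 1.0 + rng.normal(scale=0.1, size=n)

# Closed-form OLS fit with an intercept column for comparison
Xd = np.column_stack([np.ones(n), X])
beta_ols = np.linalg.lstsq(Xd, y, rcond=None)[0]

# The same model viewed as a neural network, fit by gradient descent
w, b, lr = np.zeros(p), 0.0, 0.05
for _ in range(2000):
    r = X @ w + b - y          # residuals of the current fit
    w -= lr * (X.T @ r) / n    # gradient of MSE w.r.t. weights
    b -= lr * r.mean()         # gradient of MSE w.r.t. intercept

print(np.allclose(beta_ols, np.concatenate([[b], w]), atol=1e-3))
```

Replacing the identity activation at the output with a sigmoid, and squared error with the binomial log-likelihood, turns the same architecture into logistic regression, which is how the multi-layer perceptron links to the generalized linear model named in the keywords.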

