Doctoral dissertation

자기보고식 심리검사에서 심층신경망 모형의 적용 가능성 탐색 = Investigating the Feasibility of Employing a Deep Neural Network Model in Self Report Inventories

이창묵, 2019
Topic-based citation impact of this thesis
Topics
  • MMPI
  • 기계학습 (machine learning)
  • 심층 신경망 (deep neural network)
  • 이상적응답반응
  • 인공신경망 (artificial neural network)
  • 전환학습 (transfer learning)
  • 지도 학습 (supervised learning)
Impact summary
  Papers on the same topics: 1,856
  Total citations: 0
  Average topic impact: 0.0%

References

  • 김동일, 홍성두 (2009). 학습장애 진단을 위한 데이터 마이닝 기법 간 비교: 회귀분석, 의사결정나무분석, 신경망분석을 중심으로. 특수교육연구, 16(1), 321-339.
  • Sugiyama, M. (2016). 통계적 기계 학습 (노영균, 남현하, 김은솔 역). 서울: 서울대학교출판문화원. (원전은 2009년에 출판)
  • 이혜주, 정의현 (2012). 청소년의 컴퓨터 오락추구 행동을 예측하기 위한 신경망 활용. 한국컴퓨터교육학회논문지, 16(2), 39-48.
  • 유신 (2014). 인공지능은 뇌를 닮아 가는가. 서울: 컬처룩.
  • 김의중 (2016). 인공지능, 머신러닝, 딥러닝 입문. 경기: 위키북스.
  • 이영주 (2014). 인공신경망에 근거한 인지진단모형 Q행렬의 타당성 평가. 이화여자대학교 일반대학원 박사학위 논문.
  • 한국경제TV (2016). 이세돌, 최강 인공지능 컴퓨터와 세기의 바둑대결.
  • 한국기원 바둑소식 (2016). 이세돌, 인간 vs 인공지능 첫 대결 주인공 돼 영광이다.
  • 서울신문 (2016). 이세돌 vs AI 컴퓨터 ‘100만달러 대국’.
  • 윤지영 (2016). 요소 간 위계 방식과 인공신경망을 적용한 수학 인지진단 평가 연구. 서울대학교 일반대학원 석사학위 논문.
  • 중앙일보 (2017). 알파고 꺾은 이세돌 ‘신의 한수’ 확률은... 구글도 감탄.
  • 오지영, 이수정 (2008). 신경망을 이용한 초등학생 컴퓨터 활용 능력 예측. 한국정보교육학회, 12(3), 267-274.
  • Rashid, T. (2017). 신경망 첫걸음 (송교석 역). 서울: 한빛미디어. (원전은 2016년에 출판)
  • 이명훈 (2017). 신경망 분석을 활용한 학교폭력의 예측요인 분석 및 해결방안 모색. 학습자중심교과교육연구, 17(22), 537-561.
  • 박영기 (2017). 순환신경망 기반 언어 모델을 활용한 초등 영어 글쓰기 자동 평가. 정보교육학회논문지, 21(2), 161-169.
  • 홍성열 (2001). 사회과학도를 위한 연구방법론. 서울: 시그마프레스.
  • 성태제 (2001). 문항 반응 이론의 이해와 적용. 서울: 교육과학사.
  • 허명회 (2015). 기계학습모형의 시각화. 통계연구, 20(2), 1-30.
  • Yamashita, T. (2017). 그림과 수식으로 배우는 통통 딥러닝 (심효섭 역). 경기: 제이펍. (원전은 2016년에 출판)
  • 김승연, 정용주 (2017). 처음 배우는 머신러닝. 서울: 한빛미디어.
  • 구본용, 유제민 (2003). 중퇴에 관한 위험 및 보호요인의 신경망 모형. 한국심리학회지: 건강, 8(1), 133-146.
  • 변해원 (2017). 인공신경망을 이용한 청소년의 또래 애착 영향 요인 탐색. 한국융합학회논문지, 8(10), 209-214.
  • 조선비즈 (2017). “인간에겐 도라에몽이 필요해”: 감정 로봇 시대의 개화.
  • Xiong, W., Droppo, J., Huang, X., Seide, F., Seltzer, M., Stolcke, A., Yu, D., & Zweig, G. (2016). Achieving Human Parity in Conversational Speech Recognition. arXiv preprint arXiv:1610.05256.
  • 이석준 (2018). XOR 문제와 Neural Network. Business Intelligence Research Center. http://www.birc.co.kr/2018/01/22/xor-문제와-neural-network/ 에서 2018. 11. 9 자료 얻음.
  • Williams, C. K. (1997). Computing with Infinite Networks. In Advances in Neural Information Processing Systems.
  • Widrow, B., & Hoff, M. E. (1960). Adaptive Switching Circuits. IRE WESCON Convention Record, 4, 96-104.
  • Werbos, P. (1974). Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Ph.D. dissertation, Harvard University.
  • Van Rijsbergen, C. J. (1979). Information Retrieval. London: Butterworths.
  • Van Gerven, M., Bohte, S. (Eds.). (2018). Artificial Neural Networks as Models of Neural Information Processing. Frontiers Media SA.
  • Tatsuoka, K. K. (1983). Rule Space: An Approach for Dealing with Misconceptions Based on Item Response Theory. Journal of Educational Measurement, 20(4), 345-354.
  • Stanford Encyclopedia of Philosophy (2015). [On-line encyclopedia].
  • Sun, J. (2017). Feedforward Neural Networks. https://www.cc.gatech.edu/~san37/post/dlhc-fnn/ 에서 2018. 11. 9 자료 얻음.
  • Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: a simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1), 1929-1958.
  • Snoek, J., Larochelle, H., & Adams, R. P. (2012). Practical Bayesian Optimization of Machine Learning Algorithms. In Advances in Neural Information Processing Systems.
  • Shu, Z., Henson, R., & Willse, J. (2013). Using Neural Network Analysis to Define Methods of DINA Model Estimation for Small Sample Sizes. Journal of Classification, 30(2), 173-194.
  • Shang, W., Sohn, K., Almeida, D., & Lee, H. (2016, June). Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units. In International Conference on Machine Learning.
  • Sese, A., Palmer, A. L., & Montano, J. J. (2004). Psychometric Measurement Models and Artificial Neural Networks. International Journal of Testing, 4(3), 253-266.
  • Seni, G., & Elder, J. F. (2010). Ensemble Methods in Data Mining: Improving Accuracy through Combining Predictions. Synthesis Lectures on Data Mining and Knowledge Discovery, 2(1), 1-126.
  • Seide, F., Li, G., & Yu, D. (2011). Conversational Speech Transcription Using Context-dependent Deep Neural Networks. In Twelfth Annual Conference of the International Speech Communication Association.
  • Samuel, A. L. (1959). Some Studies in Machine Learning Using the Game of Checkers. IBM Journal of Research and Development, 3(3), 210-229.
  • R을 이용한 데이터마이닝, 개정판.
  • Russakovsky, O., Deng, J., Su, H., Krause, J., Satheesh, S., Ma, S., Huang, Z., Karpathy, A., Khosla, A., Bernstein, M., Berg, A. C., & Li, F. (2015). ImageNet Large Scale Visual Recognition Challenge. International Journal of Computer Vision, 115(3), 211-252.
  • Rumelhart, D. E., & McClelland, J. L. (1986). Parallel Distributed Processing, Vol. 1: Explorations in the Microstructure of Cognition. MIT Press.
  • Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning Representations by Back-propagating Errors. Nature, 323(October), 533.
  • Rosenblatt, F. (1958). The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review, 65(6), 386-408.
  • Rasmussen, C. E., & Williams, C. K. I. (2006). Gaussian Processes for Machine Learning. MIT Press.
  • Ramachandran, P., Zoph, B., & Le, Q. V. (2017). Swish: a Self-gated Activation Function. arXiv preprint arXiv:1710.05941.
  • R Core Team (2018). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL: https://www.R-project.org/.
  • Neal, R. (1994). Priors for Infinite Networks (Technical Report No. CRG-TR-94-1). University of Toronto.
  • Naver (2018). Hybrid DNN Text-to-Speech: A DNN Techniques for the Text-to-Speech and Voice Synthesis.
  • Nair, V., & Hinton, G. E. (2010). Rectified Linear Units Improve Restricted Boltzmann Machines. In Proceedings of the 27th International Conference on Machine Learning (ICML-10).
  • Murphy, K. P. (2012). Machine Learning: A Probabilistic Perspective. MIT Press.
  • Müller, A. C., & Guido, S. (2016). Introduction to Machine Learning with Python. O’Reilly Media, Inc.
  • Moridis, C. N., & Economides, A. A. (2009). Prediction of Student’s Mood during an Online Test Using Formula-based and Neural Network-based Method. Computers & Education, 53(3), 644-652.
  • Minsky, M. L., & Papert, S. A. (1987). Perceptrons: An Introduction to Computational Geometry. MIT Press.
  • McKinley, J. C., & Hathaway, S. R. (1940). A Multiphasic Personality Schedule (Minnesota): II. A Differential Study of Hypochondriasis. The Journal of Psychology, 10(2), 255-268.
  • McCulloch, W. S., & Pitts, W. (1943). A Logical Calculus of the Ideas Immanent in Nervous Activity. Bulletin of Mathematical Biophysics, 5(4), 115-133.
  • McCullagh, P., & Nelder, J. A. (1992). Generalized Linear Models. Chapman & Hall.
  • McCorduck, P. (2004). Machines Who Think: A Personal Inquiry into the History and Prospects of Artificial Intelligence. AK Peters/CRC Press.
  • McClelland, J., & Rumelhart, D. (1988). Explorations in Parallel Distributed Processing. Cambridge, MA: MIT Press.
  • Maxwell, S. E., Lau, M. Y., & Howard, G. S. (2015). Is Psychology Suffering from a Replication Crisis? American Psychologist, 70, 487-498.
  • Marsland, S. (2014). Machine Learning: An Algorithmic Perspective (2nd Ed.). Chapman and Hall.
  • Maas, A. L., Hannun, A. Y., & Ng, A. Y. (2013, June). Rectifier Non-linearities Improve Neural Network Acoustic Models. In Proceedings of ICML, 30(1), 3.
  • MMPI-2 다면적 인성검사 II 매뉴얼 개정판.
  • 이정균, 정범모, 진위교 (1967). MMPI 다면적 인성검사 검사법 요강. 서울: 코리안테스팅센터.
  • Leighton, J. P., Gierl, M. J., & Hunka, S. M. (2004). The Attribute Hierarchy Method for Cognitive Assessment: A Variation on Tatsuoka's Rule‐Space Approach. Journal of Educational Measurement, 41(3), 205-237.
  • Lee, J., Bahri, Y., Novak, R., Schoenholz, S. S., Pennington, J., & Sohl-Dickstein, J. (2017). Deep Neural Networks as Gaussian Processes. arXiv preprint arXiv:1711.00165.
  • LeCun, Y., Boser, B. E., Denker, J. S., Henderson, D., Howard, R. E., Hubbard, W. E., & Jackel, L. D. (1989). Backpropagation Applied to Handwritten Zip Code Recognition. Neural Computation, 1(4), 541-551.
  • LeCun, Y., Bottou, L., Bengio, Y., & Haffner, P. (1998). Gradient-based Learning Applied to Document Recognition. Proceedings of the IEEE, 86(11), 2278-2324.
  • Lantz, B. (2015). Machine Learning with R (2nd Ed.). Birmingham: Packt Publishing Ltd.
  • Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012). ImageNet Classification with Deep Convolutional Neural Networks. In Advances in Neural Information Processing Systems.
  • Kohonen, T. (1989). Self-organization and Associative Memory (3rd Ed.). New York: Springer-Verlag.
  • Klein, R. A., Ratliff, K. A., Vianello, M., Adams Jr, R. B., Bahník, Š., Bernstein, M. J., ... & Cemalcilar, Z. (2014). Investigating Variation in Replicability: A “Many Labs” Replication Project. Social Psychology, 45, 142-152.
  • Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
  • Johnson, M. (2017). How Much Data is Enough? Predicting How Accuracy Varies with Training Data Size. Sydney, Australia: Macquarie University.
  • 김석호 (2018). Item Analysis. 한국교육평가학회 워크숍, 7월 7일. 서울: 서울대학교 교육정보관.
  • Ioffe, S., & Szegedy, C. (2015, June). Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. In International Conference on Machine Learning.
  • Huebner, A. (2010). An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments. Practical Assessment, 15(3), 1-7.
  • Hopfield, J. J. (1982). Neural Networks and Physical Systems with Emergent Collective Computational Abilities. Proceedings of the National Academy of Sciences, 79(8), 2554-2558.
  • Hinton, G., Deng, L., Yu, D., Dahl, G. E., Mohamed, A. R., Jaitly, N., Senior, A., Vanhoucke, V., Nguyen, P., Sainath, T. N., & Kingsbury, B. (2012). Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups. IEEE Signal Processing Magazine, 29(6), 82-97.
  • Hinton, G. E., Osindero, S., & Teh, Y.-W. (2006). A fast learning algorithm for deep belief nets. Neural Computation, 18(7), 1527-1554.
  • Hinton, G. E., & Salakhutdinov, R. R. (2006). Reducing the Dimensionality of Data with Neural Networks. Science, 313(July), 504-507.
  • He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.
  • He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving Deep into Rectifiers: Surpassing Human-level Performance on ImageNet Classification. In Proceedings of the IEEE International Conference on Computer Vision.
  • Harp, S. A., Samad, T., & Villano, M. (1995). Modeling Student Knowledge with Self-organizing Feature Maps. IEEE Transactions on Systems, Man, and Cybernetics, 25(5), 727-737.
  • Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of Item Response Theory. Sage Publications.
  • Hambleton, R. K., & Swaminathan, H. (2013). Item Response Theory: Principles and Applications. Springer Science & Business Media.
  • Hahnloser, R. H., Sarpeshkar, R., Mahowald, M. A., Douglas, R. J., & Seung, H. S. (2000). Digital Selection and Analogue Amplification Coexist in a Cortex-inspired Silicon Circuit. Nature, 405(June), 947-951.
  • Gregory, R. J. (2004). Psychological Testing: History, Principles, and Applications. Allyn & Bacon.
  • Green, R. A., & Michael, W. B. (1998). Using Neural Network and Traditional Psychometric Procedures in the Analysis of Test Scores: An Exploratory Study. Educational Research Quarterly, 22(2), 52-61.
  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. Cambridge: MIT Press.
  • Glorot, X., & Bengio, Y. (2010, March). Understanding the Difficulty of Training Deep Feedforward Neural Networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics.
  • Gierl, M. J., Wang, C., & Zhou, J. (2008). Using the Attribute Hierarchy Method to Make Diagnostic Inferences about Examinees' Cognitive Skills in Algebra on the SAT. Journal of Technology, Learning, and Assessment, 6(6).
  • Gierl, M. J., Leighton, J. P., & Hunka, S. M. (2007). Using the Attribute Hierarchy Method to Make Diagnostic Inferences about Examinees' Cognitive Skills.
  • Gierl, M. J., Cui, Y., & Hunka, S. (2008). Using Connectionist Models to Evaluate Examinees’ Response Patterns to Achievement Tests. Journal of Modern Applied Statistical Methods, 7(1), 234-245.
  • Gierl, M. J., Cui, Y., & Hunka, S. (2007, April). Using Connectionist Models to Evaluate Examinees’ Response Patterns on Tests: An Application of the Attribute Hierarchy Method to Assessment Engineering. Paper presented at the Annual Meeting of the National Council on Measurement in Education (NCME), Chicago, Il.
  • Gallant, S. I. (1993). Neural Network Learning and Expert Systems. MIT Press.
  • Fukushima, K., & Miyake, S. (1982). Neocognitron: A New Algorithm for Pattern Recognition Tolerant of Deformations and Shifts in Position. Pattern Recognition, 15(6), 455-469.
  • Freund, Y., & Schapire, R. (1997). A Decision-theoretic Generalization of Online Learning and an Application to Boosting. Journal of Computer and System Sciences, 55, 119-139.
  • Flach, P. (2012). Machine Learning: the Art and Science of Algorithms That Make Sense of Data. Cambridge: Cambridge University Press.
  • Erhan, D., Bengio, Y., Courville, A., Manzagol, P. A., Vincent, P., Bengio, S. (2010). Why does unsupervised pre-training help deep learning? Journal of Machine Learning Research, 11(Feb), 625–660.
  • Duchi, J., Hazan, E., & Singer, Y. (2011). Adaptive subgradient methods for online learning and stochastic optimization. Journal of Machine Learning Research, 12(Jul), 2121-2159.
  • Di Nuovo, A. G., Di Nuovo, S., Buono, S., & Catania, V. (2009, June). Feedforward Artificial Neural Network to Estimate IQ of Mental Retarded People from Different Psychometric Instruments. In Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN 2009). IEEE.
  • Di Nuovo, A. G., Di Nuovo, S., & Buono, S. (2012). Intelligent Quotient Estimation of Mental Retarded People from Different Psychometric Instruments Using Artificial Neural Networks. Artificial Intelligence in Medicine, 54(2), 135-145.
  • Saito, G. (2017). Deep Learning from Scratch. 서울: 한빛미디어. (원전은 2016년에 출판)
  • De Vreese, L. P., Gomiero, T., Uberti, M., De Bastiani, E., Weger, E., Mantesso, U., & Marangoni, A. (2015). Functional Abilities and Cognitive Decline in Adult and Aging Intellectual Disabilities. Psychometric Validation of an Italian Version of the Alzheimer's Functional Assessment Tool (AFAST): Analysis of Its Clinical Significance with Linear Statistics and Artificial Neural Networks. Journal of Intellectual Disability Research, 59(4), 370-384.
  • De Ayala, R. J. (2013). The Theory and Practice of Item Response Theory. Guilford Publications.
  • 김인석, 유제민, 현명호 (2001). Data Mining을 이용한 음주 및 음주문제의 위험요인과 취약성요인에 관한 탐색. 한국심리학회지: 건강, 6(2), 75-95.
  • Cui, Y., Gierl, M., & Guo, Q. (2016). Statistical Classification for Cognitive Diagnostic Assessment: An Artificial Neural Network Approach. Educational Psychology, 36(6), 1065-1082.
  • Cohen, R. J., & Swerdlik, M. E. (2002). Psychological Testing and Assessment: An Introduction to Tests and Measurement (5th ed.). New York: McGraw-Hill.
  • Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Clevert, D. A., Unterthiner, T., & Hochreiter, S. (2015). Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs). arXiv preprint arXiv:1511.07289.
  • Camara, W. J., Nathan, J. S., & Puente, A. E. (2000). Psychological Test Usage: Implications in Professional Psychology. Professional Psychology: Research and Practice, 31(2), 141-154.
  • Brown, T. A. (2014). Confirmatory factor analysis for applied research. Guilford Publications.
  • Briggs, D. C., & Circi, R. (2017). Challenges to the Use of Artificial Neural Networks for Diagnostic Classifications with Student Test Data. International Journal of Testing, 17(4), 302-321.
  • Breiman, L. (2001). Random Forests. Machine Learning, 45(1), 5-32.
  • Breiman, L. (1996). Heuristics of Instability and Stabilization in Model Selection. Annals of Statistics, 24, 2350-2383.
  • Boyd, S., & Vandenberghe, L. (2004). Convex Optimization. Cambridge university press.
  • Binet, A., & Simon, T. (1916). The Development of Intelligence in Children: The Binet-Simon Scale. Williams & Wilkins Company.
  • Allen, M. J., & Yen, W. M. (1979). Introduction to Measurement Theory. Monterey, CA: Brooks/Cole.
  • Alessandri, M., Heiden, L. A., & Dunbar-Welter, M. (1995). History and Overview. In Heiden, L. A., & Hersen, M. (eds.), Introduction to Clinical Psychology. New York: Plenum Press.
  • Ahmed, M. (2017). On Improving The Performance And Resource Utilization of Consolidated Virtual Machines: Measurement, Modeling, Analysis, and Prediction. Ph.D. dissertation, Information Technologies, University of Sydney.
  • Agresti, A. (2007). An Introduction to Categorical Data Analysis (2nd Ed.). New York: John Wiley & Sons Inc.