Article Influence by Subject for 'Deep Belief Network에 기반한 Super Learner 앙상블 분류모형 = Classification Model Using Super Learner Ensemble Based on Deep Belief Network'
Article Influence Summary
Subjects: Probability (PROBABILITIES), Applied Mathematics (Statistics); keywords: Super Learner ensemble, dbn, deep belief network, classification model
Total papers on the same subjects: 609
Total citations received: 0
Average article influence across subjects: 0.0%
Article Influence by Subject

Subject                                    Papers  Citations  Influence
Subject classification (KDC/DDC)
  Probability (PROBABILITIES), ...            584          0       0.0%
Keywords
  Super Learner ensemble                        1          0       0.0%
  dbn                                           3          0       0.0%
  deep belief network                          11          0       0.0%
  classification model                         10          0       0.0%
Total                                         609          0       0.0%
* Citations received from papers indexed under other keywords: 0
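The percentages in the table above are simple ratios. As a minimal sketch (the exact formula is not stated on this page, so the influence metric is assumed here to be citations divided by same-subject paper count, and the subject labels are English translations of the Korean headings):

```python
# Hypothetical reconstruction of the per-subject influence figures above.
# Assumption: influence (%) = citations / papers-on-subject * 100.
rows = {
    "Probability (KDC/DDC)": (584, 0),
    "Super Learner ensemble": (1, 0),
    "dbn": (3, 0),
    "deep belief network": (11, 0),
    "classification model": (10, 0),
}

def influence(papers: int, citations: int) -> float:
    """Citations per same-subject paper, expressed as a percentage."""
    return 100.0 * citations / papers if papers else 0.0

total_papers = sum(p for p, _ in rows.values())
total_citations = sum(c for _, c in rows.values())
for subject, (papers, citations) in rows.items():
    print(f"{subject}: {influence(papers, citations):.1f}%")
print(f"Total: {total_papers} papers, {total_citations} citations, "
      f"{influence(total_papers, total_citations):.1f}%")
```

With zero recorded citations, every row evaluates to 0.0%, matching the table.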
References of 'Deep Belief Network에 기반한 Super Learner 앙상블 분류모형 = Classification Model Using Super Learner Ensemble Based on Deep Belief Network'
Zou, H. (2006). The adaptive lasso and its oracle properties, Journal of the American Statistical Association, 101, 1418-1429.
Zhu, L., Chen, L., Zhao, D., Zhou, J., and Zhang, W. (2017). Emotion recognition from Chinese speech for smart affective services using a combination of SVM and DBN, Sensors, 17, 1-14.
Zhang, G. P. (2000). Neural networks for classification: A Survey, IEEE Transactions on Systems, Man and Cybernetics Part C, 30, 451-462.
Yuan, Y. and Shaw, M. (1995). Induction of fuzzy decision trees, Fuzzy Sets and Systems, 69, 125-139.
Yang, X., Lo, D., Xia, X., Zhang, Y., and Sun, J. (2015). Deep learning for just-in-time defect prediction, International Conference on Software Quality, Reliability and Security, 17-26.
Xiao, Y., Wu, J., Lin, Z., and Zhao, X. (2018). A deep learning-based multi-model ensemble method for cancer prediction, Computer Methods and Programs in Biomedicine, 153, 1-9.
Wu, L., Yang, Y., and Liu, H. (2013). Nonnegative-lasso and application in index tracking, Computational Statistics & Data Analysis, 70, 116-126.
Wolpert, D. H. (1992). Stacked generalization, Neural Networks, 5, 241-259.
Wang, X., Zhang, W., Zhao, F., Liu, J., Li, P., and Zhang, Q. (2013). Detection of adulteration of sesame and peanut oils via volatiles by GC×GC-TOF/MS coupled with principal components analysis and cluster analysis, European Journal of Lipid Science and Technology, 115, 337-347.
Vapnik, V. N. (1996). The Nature of Statistical Learning Theory, Springer, New York.
Van der Laan, M. J., Polley, E. C., and Hubbard, A. E. (2007). Super learner, Statistical Applications in Genetics and Molecular Biology, 6, 1-23.
Tibshirani, R. (1996). Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.
Shi, P. (2018). Speech emotion recognition based on deep belief network, International Conference on Networking, Sensing and Control.
Sha, F., Lin, Y., Saul, L. K., and Lee, D. D. (2007). Multiplicative updates for nonnegative quadratic programming, Neural Computation, 19, 2004-2031.
Sewak, M., Vaidya, P., Chan, C. C., and Duan, Z. H. (2007). SVM approach to breast cancer classification, The International Multi-Symposiums on Computer and Computational Sciences, 1, 32-37.
Salunkhe, U. R. and Mali, S. N. (2016). Classifier ensemble design for imbalanced data classification: a hybrid approach, Procedia Computer Science, 85, 725-732.
Rubinstein, R. Y. and Kroese, D. P. (2004). The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation and Machine Learning, Springer, New York.
Ramasubramanian, K. and Singh, A. (2016). Machine Learning Using R, Springer, New York.
R Development Core Team (2010). R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, URL http://www.R-project.org.
Polley, E. C. and van der Laan, M. J. (2010). Super learner in prediction, U.C. Berkeley Division of Biostatistics Working Paper Series. Working Paper 266.
Park, M. Y. and Hastie, T. L. (2007). L1-regularization path algorithm for generalized linear models, Journal of the Royal Statistical Society, Series B, 69, 659-677.
Newman, D. J., Hettich, S., Blake, C. L., and Merz, C. J. (2017). UCI repository of machine learning databases, URL http://archive.ics.uci.edu/ml/.
McDonald, J. W. and Diamond, I. D. (1990). On the fitting of generalized linear models with nonnegativity parameter constraints, Biometrics, 46, 201-206.
McCullagh, P. and Nelder, J. (1989). Generalized Linear Models, second edition, CHAPMAN & HALL/CRC, London.
Mandal, B. N. and Ma, J. (2016b). nnlasso: Non-Negative Lasso and Elastic Net Penalized Generalized Linear Models, R package version 0.3, URL https://cran.r-project.org/web/packages/nnlasso.
Mandal, B. N. and Ma, J. (2016a). l1 regularized multiplicative iterative path algorithm for non-negative generalized linear models, Computational Statistics & Data Analysis, 101, 289-299.
Lin, H. W. and Tegmark, M. (2016). Why does deep and cheap learning work so well?, Journal of Statistical Physics, 168, 1223–1247.
Lim, J. S., Sohn, J. Y., Sohn, J. T., and Lim, D. H. (2013). Breast cancer classification using optimal support vector machine, Journal of The Korean Society of Health Informatics and Statistics, 38, 108-121.
Lim, J. S., Oh Y. S., and Lim, D. H. (2014). Bagging support vector machine for improving breast cancer classification, Journal of the Korean Society of Health Informatics and Statistics, 39, 15-24.
Lawson, C. L. and Hanson, R. J. (1995). Solving Least Squares Problems, Classics in Applied Mathematics, SIAM, Philadelphia.
Kuhn, M. (2018). caret: Classification and Regression Training, R package version 6.0-81, URL https://cran.r-project.org/web/packages/caret.
Kuhn, M. (2008). Building predictive models in R using the caret package, Journal of Statistical Software, 28, 1-26.
Kitbumrungrat, K. (2012). Comparison logistic regression and discriminant analysis in classification groups for breast cancer, International Journal of Computer Science and Network Security, 12, 111-115.
Kim, J. Y. (2017). Comparison of Classical Classification Methods with Deep Learning Method, Master’s Thesis, Ewha Womans University.
Kamei, Y., Shihab, E., Adams, B., Hassan, A. E., Mockus, A., Sinha, A., and Ubayashi, N. (2013). A large-scale empirical study of just-in-time quality assurance, IEEE Transactions on Software Engineering, 39, 757-773.
Igel, C. and Husken, M. (2003). Empirical evaluation of the improved Rprop learning algorithms, Neurocomputing, 50, 105-123.
Huang, C., Gong, W., Fu, W., and Feng, D. (2014). A research of speech emotion recognition based on deep belief network and SVM, Mathematical Problems in Engineering, 2014, 1-7.
Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. R. (2012). Improving neural networks by preventing co-adaptation of feature detectors, Technical Report, arXiv:1207.0580.
Hinton, G. E. and Salakhutdinov, R. R. (2006). Reducing the dimensionality of data with neural networks, Science, 313, 504-507.
Gupta, S., Kumar, D., and Sharma, A. (2011). Data mining classification techniques applied for breast cancer diagnosis and prognosis, Indian Journal of Computer Science and Engineering, 2, 188-195.
Franc, V., Hlavac, V., and Navara, M. (2005). Sequential coordinate-wise algorithm for the non-negative least squares problem, Lecture Notes in Computer Science, 3691, 407-414.
Fiuzy, M., Haddadnia, J., Mollania, N., Hashemian, M., and Hassanpour, K. (2012). Introduction of a new diagnostic method for breast cancer based on fine needle aspiration (FNA) test data and combining intelligent systems, Iranian Journal of Cancer Prevention, 5, 169-177.
Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems, Annals of Eugenics, 7, 111-132.
Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360.
Drees, M. (2013). Implementierung und Analyse von tiefen Architekturen in R, Master's Thesis, Fachhochschule Dortmund.
Cook, N. R. (2008). Statistical evaluation of prognostic versus diagnostic models: beyond the ROC curve, Clinical Chemistry, 54, 17-23.
Choi, D. Y., Jeong, K. M., and Lim, D. H. (2018). Breast cancer classification using deep learning-based ensemble, Journal of Health Informatics and Statistics, 43, 140-147.
Cho, K., Ilin, A., and Raiko, T. (2011). Improved learning of Gaussian-Bernoulli restricted Boltzmann machines, International Conference on Artificial Neural Networks, 6791, 10-17.
Chao, H., Song, C., Lu, B. Y., and Liu, Y. L. (2019). Feature extraction based on DBN-SVM for tone recognition, Journal of Information Processing Systems, 15, 91-99.
Breiman, L., Friedman, J. H., Olshen, R. A., and Stone, C. J. (1984). Classification and Regression Trees, Chapman & Hall/CRC, London.
Breiman, L. (1996b). Bagging predictors, Machine Learning, 24, 123-140.
Breiman, L. (1996a). Stacked regression, Machine Learning, 24, 49-64.
Bengio, Y. (2009). Learning deep architectures for AI, Foundations and Trends in Machine Learning, 2, 1-127.