
Data Science: Prediction and Analysis of Data using Multiple Classifier System

Veena Hosamani, Dr. H S Vimala

Affiliations
1 & 2: Dept. of CSE, University Visvesvaraya College of Engineering, Bengaluru, India
DOI: 10.22362/ijcert/2018/v5/i12/v5i1201


Abstract
With current technology trends, deep neural network algorithms are commonly used for the classification of Big Data. Our experiment, however, was carried out on relatively small data. In this paper, we propose a Multiple Classifier System (MCS), in which different classifiers are ensembled: Logistic Regression (LR), Linear Discriminant Analysis (LDA), k-Nearest Neighbours (KNN), Classification and Regression Trees (CART), Naive Bayes (NB), and Support Vector Machines (SVM). To evaluate the performance of the Multiple Classifier System, we used the Iris flower dataset. When the neural networks and the Multiple Classifier System were compared on performance, the MCS showed a gradual increase in the results.
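The paper itself does not include code; the following is a minimal sketch of the kind of ensemble the abstract describes, assuming scikit-learn and a majority ("hard") voting rule as the combination mechanism. The specific combination rule and hyperparameters used by the authors are not stated here, so these are illustrative choices only.

```python
# Sketch of a Multiple Classifier System (MCS) on the Iris dataset.
# Assumption: scikit-learn's VotingClassifier with hard (majority) voting
# combines the six base classifiers named in the abstract.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier

# Load the Iris flower dataset and hold out a stratified test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

# One estimator per classifier family listed in the abstract.
estimators = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("lda", LinearDiscriminantAnalysis()),
    ("knn", KNeighborsClassifier()),
    ("cart", DecisionTreeClassifier(random_state=42)),
    ("nb", GaussianNB()),
    ("svm", SVC()),
]

# Majority vote over the base classifiers' predictions.
mcs = VotingClassifier(estimators=estimators, voting="hard")
mcs.fit(X_train, y_train)
print("MCS test accuracy:", mcs.score(X_test, y_test))
```

In practice, one would also cross-validate each base classifier individually and compare its accuracy against the ensemble, which is how the performance comparison described in the abstract would typically be carried out.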


Citation
Veena Hosamani, Dr. H S Vimala. "Data Science: Prediction and Analysis of Data using Multiple Classifier System". International Journal of Computer Engineering In Research Trends (IJCERT), ISSN: 2349-7084, Vol. 5, Issue 12, pp. 216-222, December 2018. http://ijcert.org/ems/ijcert_papers/V5I1201.pdf


Keywords: Multiple classifier systems, Ensemble, Data confidence, Machine learning.




