Welcome! This section presents information about my articles and publications:
 Journal articles
 Conference articles
 Book chapters
 Inproceedings
You can filter the view by year and/or article type, or show all types.
2018 

1.  P. D. Rosero-Montalvo; A. C. Umaquinga-Criollo; S. Flores; L. Suarez; J. Pijal; K. L. Ponce-Guevara; D. Nejer; A. Guzman; D. Lugo; K. Moncayo: Neighborhood Criterion Analysis for Prototype Selection Applied in WSN Data. 2017 International Conference on Information Systems and Computer Science (INCISCOS), IEEE, 2018, ISBN: 978-1-5386-2644-3 (Electronic ISBN: 978-1-5386-2644-3; Print on Demand (PoD) ISBN: 978-1-5386-2645-0).

BibTeX:

@conference{roseromontalvo_neighborhood_2017,
  title     = {Neighborhood Criterion Analysis for Prototype Selection Applied in WSN Data},
  author    = {P. D. Rosero-Montalvo and A. C. Umaquinga-Criollo and S. Flores and L. Suarez and J. Pijal and K. L. Ponce-Guevara and D. Nejer and A. Guzman and D. Lugo and K. Moncayo},
  url       = {https://ieeexplore.ieee.org/document/8328096},
  doi       = {10.1109/INCISCOS.2017.47},
  isbn      = {978-1-5386-2644-3},
  year      = {2018},
  date      = {2018-04-02},
  booktitle = {2017 International Conference on Information Systems and Computer Science (INCISCOS)},
  pages     = {128--132},
  publisher = {IEEE},
  abstract  = {This work presents an analysis of the neighborhood criterion for prototype selection (PS) in supervised machine-learning classification algorithms. We use the condensed nearest neighbor (CNN) algorithm to eliminate redundant data, normalizing the distance to the centroid of each data subset, in order to obtain the training matrix of the optimal model. A neighborhood criterion was created by quantifying the balance between classification performance and data-set reduction (CER). As tests, we performed (i) CER evaluation and (ii) real-time tests with the algorithm implemented within the WSN. The result is a data reduction of up to 88% and a kNN classifier performance of 75%. We conclude that the neighborhood criterion with normalized distance must be less than or equal to 0.2, and that kNN with k = 1 obtains the best CER.},
  note      = {Electronic ISBN: 978-1-5386-2644-3; Print on Demand (PoD) ISBN: 978-1-5386-2645-0},
  keywords  = {{WSN} data, classification and the reduction of data set, Computer science, data reduction, data subset criterion, Information systems, learning (artificial intelligence), Machine learning algorithms, neighborhood criterion analysis, normalized distance, pattern classification, prototype selection, Prototypes, redundant data, set theory, Silicon, supervised machine learning classification algorithms, Training, training matrix, wireless sensor networks},
  pubstate  = {published},
  tppubtype = {conference}
}
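The CNN-based prototype selection described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy data, function names, and in particular the exact normalization behind the "normalized distance ≤ 0.2" criterion are assumptions on my part.

```python
import numpy as np

def condensed_nn(X, y):
    """Hart's condensed nearest neighbor (CNN): start from one sample and
    repeatedly add any sample that the current prototype set would
    misclassify under the 1-NN rule, until no additions occur."""
    keep = [0]
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            d = np.linalg.norm(X[keep] - X[i], axis=1)
            if y[keep[int(np.argmin(d))]] != y[i]:
                keep.append(i)
                changed = True
    return sorted(keep)

def centroid_criterion(X, y, threshold=0.2):
    """Keep samples whose distance to their class centroid, normalized by
    the largest such distance within the class, is <= threshold.
    The 0.2 default mirrors the criterion reported in the abstract; the
    per-class max normalization is an assumption."""
    kept = []
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
        dmax = d.max()
        if dmax == 0:  # all class points identical: keep them all
            kept.extend(idx.tolist())
        else:
            kept.extend(idx[d / dmax <= threshold].tolist())
    return sorted(kept)

# Toy WSN-like data: two well-separated 2-D clusters of 20 samples each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
proto = condensed_nn(X, y)
print(f"reduced {len(X)} samples to {len(proto)} prototypes")
```

On separable data CNN retains only a handful of boundary points, which is the data-reduction effect the paper quantifies with its CER balance; by construction the retained prototypes still classify the whole training set correctly under the 1-NN rule.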