Welcome! In this section I present information about my articles and publications:
 Journal articles
 Conference papers
 Book chapters
 Proceedings papers (inproceedings)
You can filter the list by year, by publication type, or show all types.
2018 

2.  P. D. Rosero-Montalvo; A. C. Umaquinga-Criollo; S. Flores; L. Suarez; J. Pijal; K. L. Ponce-Guevara; D. Nejer; A. Guzman; D. Lugo; K. Moncayo: Neighborhood Criterion Analysis for Prototype Selection Applied in WSN Data. In: 2017 International Conference on Information Systems and Computer Science (INCISCOS), pp. 128-132, IEEE, 2018, ISBN: 9781538626443 (Electronic ISBN: 9781538626443; Print on Demand (PoD) ISBN: 9781538626450).

@conference{roseromontalvo_neighborhood_2017,
  title     = {Neighborhood Criterion Analysis for Prototype Selection Applied in WSN Data},
  author    = {P. D. Rosero-Montalvo and A. C. Umaquinga-Criollo and S. Flores and L. Suarez and J. Pijal and K. L. Ponce-Guevara and D. Nejer and A. Guzman and D. Lugo and K. Moncayo},
  url       = {https://ieeexplore.ieee.org/document/8328096},
  doi       = {10.1109/INCISCOS.2017.47},
  isbn      = {9781538626443},
  year      = {2018},
  date      = {2018-04-02},
  booktitle = {2017 International Conference on Information Systems and Computer Science (INCISCOS)},
  pages     = {128--132},
  publisher = {IEEE},
  abstract  = {This work presents an analysis of the neighborhood criterion for prototype selection (PS) in supervised machine-learning classification algorithms. To do this, we use the condensed nearest neighbor (CNN) algorithm to eliminate redundant data, normalizing the distance to the centroid of each data subset. This is done in order to obtain the training matrix of the optimal model. A neighborhood criterion is selected by quantifying the balance between classification performance and data-set reduction (CER). As validation, we performed (i) CER tests and (ii) real-time tests with the algorithm implemented within the WSN. The result is a data reduction of up to 88% and a kNN classifier performance of 75%. It is concluded that the neighborhood criterion with normalized distance must be less than or equal to 0.2, and that kNN with k = 1 obtains the best CER.},
  note      = {Electronic ISBN: 9781538626443. Print on Demand (PoD) ISBN: 9781538626450},
  keywords  = {{WSN} data, classification and the reduction of data set, Computer science, data reduction, data subset criterion, Information systems, learning (artificial intelligence), Machine learning algorithms, neighborhood criterion analysis, normalized distance, pattern classification, prototype selection, Prototypes, redundant data, set theory, Silicon, supervised machine learning classification algorithms, Training, training matrix, wireless sensor networks},
  pubstate  = {published},
  tppubtype = {conference}
}
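The prototype-selection step described in the abstract above can be illustrated with a short sketch. This is not the authors' implementation: it shows Hart's classic condensed nearest neighbor rule in plain Python (the function names and toy data are my own), without the paper's normalized-distance-to-centroid criterion.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_label(point, prototypes):
    """Label of the 1-nearest prototype; prototypes is a list of (point, label)."""
    return min(prototypes, key=lambda p: euclidean(point, p[0]))[1]

def cnn_select(points, labels):
    """Hart's Condensed Nearest Neighbor: grow a prototype store until
    every training point is classified correctly by 1-NN on the store."""
    store = [(points[0], labels[0])]
    changed = True
    while changed:
        changed = False
        for x, y in zip(points, labels):
            if nearest_label(x, store) != y:
                store.append((x, y))  # keep misclassified points as prototypes
                changed = True
    return store

# Two well-separated 1-D classes: CNN keeps only a few prototypes.
points = [(0.0,), (0.1,), (0.2,), (5.0,), (5.1,), (5.2,)]
labels = [0, 0, 0, 1, 1, 1]
store = cnn_select(points, labels)
print(len(store), "prototypes kept out of", len(points))
```

On easy data like this the store stays tiny while the reduced set still classifies every training point correctly with 1-NN, which is the same performance/reduction trade-off the paper quantifies with CER.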
1.  Paul Rosero-Montalvo; Diego H. Peluffo-Ordóñez; Ana Umaquinga; Andrés Anaya; Jorge Serrano; Edwin Rosero; Carlos Vásquez; Luis Suaréz: Prototype reduction algorithms comparison in nearest neighbor classification for sensor data: Empirical study. In: 2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM), IEEE, 2018, ISBN: 9781538638941 (Electronic ISBN: 9781538638941; USB ISBN: 9781509058105; Print on Demand (PoD) ISBN: 9781538638958).

@conference{Rosero2018b,
  title     = {Prototype reduction algorithms comparison in nearest neighbor classification for sensor data: Empirical study},
  author    = {Paul Rosero-Montalvo and Diego H. Peluffo-Ordóñez and Ana Umaquinga and Andrés Anaya and Jorge Serrano and Edwin Rosero and Carlos Vásquez and Luis Suaréz},
  url       = {https://ieeexplore.ieee.org/document/8247530/authors},
  doi       = {10.1109/ETCM.2017.8247530},
  isbn      = {9781538638941},
  year      = {2018},
  date      = {2018-01-08},
  booktitle = {2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM)},
  publisher = {IEEE},
  abstract  = {This work presents a comparative study of prototype selection (PS) algorithms. The study is done over sensor data acquired by an embedded system. In particular, five flexometers located inside a glove are used as sensors, aimed at reading sign language. Measures were taken to quantify the balance between classification performance and training-set reduction (QCR), with k = 3 and k = 1 neighbors to push the kNN classifier to its limit. Two tests were used: (a) QCR performance and (b) the embedded system's decisions in real tests. As a result, the Random Mutation Hill Climbing (RMHC) algorithm is considered the best option for this data type, removing 87% of instances with 82% classification performance in software tests; the kNN classifier must use k = 3 to improve classification performance. In a real situation, with the algorithm implemented, the system makes correct decisions 81% of the time, with 5 persons doing sign language in real time.},
  note      = {Electronic ISBN: 9781538638941. USB ISBN: 9781509058105. Print on Demand (PoD) ISBN: 9781538638958},
  keywords  = {knn, prototype selection, sensor data},
  pubstate  = {published},
  tppubtype = {conference}
}
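The RMHC approach the abstract singles out can also be sketched. Again this is an illustration rather than the paper's code: it implements generic random mutation hill climbing over a fixed-size prototype subset, accepting a random swap whenever training accuracy does not decrease; the helper names, parameters, and toy dataset are my own.

```python
import math
import random

def euclidean(a, b):
    """Euclidean distance between two feature tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(x, subset, points, labels, k):
    """Majority label among the k nearest prototypes in `subset` (index list)."""
    nearest = sorted(subset, key=lambda i: euclidean(x, points[i]))[:k]
    votes = {}
    for i in nearest:
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)

def accuracy(subset, points, labels, k):
    """Fraction of training points classified correctly by kNN on the subset."""
    hits = sum(knn_predict(x, subset, points, labels, k) == y
               for x, y in zip(points, labels))
    return hits / len(points)

def rmhc_select(points, labels, n_proto, k=3, iters=200, seed=0):
    """Random Mutation Hill Climbing: keep a fixed-size prototype subset,
    randomly swap one member in, accept if training accuracy does not drop."""
    rng = random.Random(seed)
    current = rng.sample(range(len(points)), n_proto)
    best = accuracy(current, points, labels, k)
    for _ in range(iters):
        cand = list(current)
        cand[rng.randrange(n_proto)] = rng.choice(
            [i for i in range(len(points)) if i not in current])
        acc = accuracy(cand, points, labels, k)
        if acc >= best:  # hill climb: never accept a worse subset
            current, best = cand, acc
    return current, best

# Two separated 2-D clusters of 10 points each; keep only 4 prototypes (80% reduction).
pts = [(i * 0.1, 0.0) for i in range(10)] + [(i * 0.1 + 5.0, 5.0) for i in range(10)]
lbl = [0] * 10 + [1] * 10
subset, acc = rmhc_select(pts, lbl, n_proto=4, k=3)
print("kept", len(subset), "of", len(pts), "training points, accuracy", acc)
```

Because swaps are only accepted when accuracy holds or improves, the subset's training accuracy is monotone non-decreasing over iterations, which is the property that lets RMHC trade a fixed reduction rate (here 80%) against classification performance as in the paper's QCR measure.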