Catalogue Search | MBRL
32 result(s) for "Zamzami, E M"
Analysis of Euclidean Distance and Manhattan Distance in the K-Means Algorithm for Variations Number of Centroid K
2020
K-Means is a partition-based clustering algorithm in which each data point is assigned to exactly one of K clusters; the number of clusters is fixed at the outset and K centroids are defined. The initial choice of cluster centers strongly influences the outcome of the clustering process and thus the quality of the grouping, and better results are often obtained only after several attempts. In the experiments reported here, the Manhattan distance performed better than the Euclidean distance. Tests were run on the Iris dataset with the number of centroids K varied over 2, 3, 4, 5, 6, 7, 8, and 9, and the authors conclude that K = 3 and K = 4 require fewer iterations than either smaller or larger numbers of centroids (a sketch of the two distance measures follows this entry).
Journal Article
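A minimal sketch of the two distance measures and the K-Means steps they drive, assuming NumPy arrays and a toy random dataset; this is an illustration of the technique, not the authors' code.

import numpy as np

def euclidean(a, b):
    # square root of the summed squared coordinate differences
    return np.sqrt(np.sum((a - b) ** 2))

def manhattan(a, b):
    # sum of the absolute coordinate differences
    return np.sum(np.abs(a - b))

def assign(points, centroids, distance):
    # K-Means assignment step: each point joins its nearest centroid
    return np.array([np.argmin([distance(p, c) for c in centroids])
                     for p in points])

def update(points, labels, centroids):
    # K-Means update step: a centroid moves to the mean of its members
    # (an empty cluster keeps its previous centroid)
    return np.array([points[labels == i].mean(axis=0) if np.any(labels == i)
                     else centroids[i] for i in range(len(centroids))])

# toy run with K = 3 on random 2-D data, using the Manhattan distance
points = np.random.rand(150, 2)
centroids = points[np.random.choice(len(points), 3, replace=False)]
for _ in range(10):
    labels = assign(points, centroids, manhattan)
    centroids = update(points, labels, centroids)

Swapping manhattan for euclidean in the loop reproduces the comparison the abstract describes; the iteration count until the centroids stop moving is the quantity being compared across values of K.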
Comparative Analysis of Inter-Centroid K-Means Performance using Euclidean Distance, Canberra Distance and Manhattan Distance
2020
Clustering is a method for grouping data or objects by the degree of similarity between them. K-Means is one of the most widely used clustering methods because it is easy to implement, and several extensions exist concerning both the choice of data centers and the weighting of the distance between data points. Traditionally, the distance between data points in K-Means can be computed using the Euclidean distance, the Canberra distance, or the Manhattan distance. This study analyses the accuracy obtained from each of these measures, combined with Z-score and Min-Max normalization, and tests cluster homogeneity with the Silhouette Coefficient. The results show that the Canberra distance outperforms Euclidean and Manhattan on the Iris dataset, and that combining Canberra with normalization improves the score on the Glass dataset from 37.44% without normalization to 67.46% with the Z-score and 56.52% with Min-Max, together with an increase in the average Silhouette Coefficient (the distance measure and both normalizations are sketched after this entry).
Journal Article
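A brief sketch, assuming NumPy arrays, of the Canberra distance and the two normalizations the abstract names; an illustration of the general technique rather than the paper's implementation.

import numpy as np

def canberra(a, b):
    # Canberra distance: sum over |a_i - b_i| / (|a_i| + |b_i|),
    # with terms where both coordinates are zero counted as zero
    num = np.abs(a - b)
    den = np.abs(a) + np.abs(b)
    return np.sum(np.divide(num, den, out=np.zeros_like(num, dtype=float),
                            where=den > 0))

def zscore(X):
    # Z-score normalization: zero mean and unit standard deviation per column
    return (X - X.mean(axis=0)) / X.std(axis=0)

def min_max(X):
    # Min-Max normalization: rescale every column to the [0, 1] range
    mn, mx = X.min(axis=0), X.max(axis=0)
    return (X - mn) / (mx - mn)

Cluster homogeneity can then be checked with, for example, sklearn.metrics.silhouette_score on the resulting labels.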
Improving The Performance of K-Nearest Neighbor Algorithm by Reducing The Attributes of Dataset Using Gain Ratio
2020
Data with many attributes, or with high dimensionality, affects the performance of the K-NN classification algorithm. In this study, the Gain Ratio is used to select and reduce the dataset attributes, forming a new dataset on which classification is then carried out with K-NN. The datasets used are the Breast Cancer Coimbra dataset and the Hepatitis C Virus dataset, both obtained from the UCI Machine Learning Repository. The results show that on the Breast Cancer Coimbra dataset the Gain Ratio improves the performance of K-NN, with average values TPR = 0.535596, TNR = 1, NPV = 0.608279, FNR = 1, FOR = 0.391721, and Accuracy = 72.85%. On the Hepatitis C Virus dataset it also improves the performance of K-NN, with average values TPR = 0.665596, TNR = 0.876667, NPV = 0.738279, FNR = 0.88, FOR = 0.521721, and Accuracy = 86.25% (a Gain Ratio sketch follows this entry).
Journal Article
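The Gain Ratio used for attribute reduction is the information gain of an attribute divided by its split information. A small sketch, assuming a discrete attribute column and a class-label vector as NumPy arrays; the helper names are illustrative, not taken from the paper.

import numpy as np

def entropy(labels):
    # Shannon entropy of a class-label vector
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(attribute, labels):
    # information gain of the attribute, normalized by its split information
    values, counts = np.unique(attribute, return_counts=True)
    weights = counts / counts.sum()
    cond_entropy = sum(w * entropy(labels[attribute == v])
                       for v, w in zip(values, weights))
    info_gain = entropy(labels) - cond_entropy
    split_info = -np.sum(weights * np.log2(weights))
    return info_gain / split_info if split_info > 0 else 0.0

Attributes whose gain ratio falls below a chosen threshold would then be dropped before the reduced dataset is passed to K-NN.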
An Implementation of Backtracking Algorithm for Solving A Sudoku-Puzzle Based on Android
2020
Sudoku is a popular number game. The goal is to fill a 9x9 grid so that no number is repeated in any row, column, or 3x3 block. This paper proposes a solution to Sudoku using the backtracking algorithm. The algorithm is quite efficient because it does not have to check every possibility: only candidates that can still lead to a solution are processed, and every element that cannot lead to a solution is pruned. The time required is therefore reasonable for a fairly complex number game such as Sudoku. By implementing the backtracking algorithm in the Sudoku game, the complexity of the algorithm can be as large as Θ(n³) (a solver sketch follows this entry).
Journal Article
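A compact sketch of the backtracking idea described above, not the paper's Android implementation: place a digit in the next empty cell, recurse, and undo the choice whenever the partial grid violates a row, column, or 3x3 block constraint.

def is_valid(grid, row, col, digit):
    # the digit must not repeat in the row, the column, or the 3x3 block
    if digit in grid[row]:
        return False
    if any(grid[r][col] == digit for r in range(9)):
        return False
    br, bc = 3 * (row // 3), 3 * (col // 3)
    return all(grid[br + r][bc + c] != digit
               for r in range(3) for c in range(3))

def solve(grid):
    # find the next empty cell (marked 0); if none is left the puzzle is solved
    for row in range(9):
        for col in range(9):
            if grid[row][col] == 0:
                for digit in range(1, 10):
                    if is_valid(grid, row, col, digit):
                        grid[row][col] = digit      # tentative choice
                        if solve(grid):
                            return True
                        grid[row][col] = 0          # backtrack
                return False                        # dead end, prune this branch
    return True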
Analysis of Combination Algorithm Data Encryption Standard (DES) and Blum-Blum-Shub (BBS)
2021
The Data Encryption Standard (DES) has a weakness in its key that makes it vulnerable to security threats, yet it remains popular because its encryption and decryption are fast. Combining the DES algorithm with the Blum-Blum-Shub (BBS) pseudo-random number generator to produce the external key for message encryption and decryption yields a unique key and a good level of security. The longer the chosen key and seed, the higher the security level of the DES encryption and decryption key. Using BBS as an external key generator for DES does not significantly affect the encryption time (generating the key and seed takes about 0.001 – 0.003 seconds for the 2 – 4 digit experiments), while the decryption time is unaffected because the BBS key and seed no longer undergo feasibility or number-requirement testing. The combination of the DES and BBS algorithms can increase security with respect to the encryption and decryption keys, because the random DES key generated by the BBS algorithm is unique, and it relieves the user of choosing the key for the DES algorithm since the key is generated programmatically (a BBS sketch follows this entry).
Journal Article
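A minimal sketch of a Blum-Blum-Shub generator used to derive a DES external key. The primes, seed, and key length below are illustrative assumptions only; a real deployment would use large Blum primes.

def bbs_bits(seed, p, q, n_bits):
    # Blum-Blum-Shub: x_{i+1} = x_i^2 mod M, where M = p*q and
    # p, q are primes congruent to 3 mod 4; the seed must be coprime to M
    m = p * q
    x = seed
    bits = []
    for _ in range(n_bits):
        x = (x * x) % m
        bits.append(x & 1)           # keep the least significant bit of each state
    return bits

# example: derive a 64-bit DES key block (56 effective key bits plus parity)
p, q = 499, 547                      # both congruent to 3 mod 4, illustrative only
key_bits = bbs_bits(seed=159201, p=p, q=q, n_bits=64)
des_key = int(''.join(map(str, key_bits)), 2).to_bytes(8, 'big')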
Comparative Analysis of Eigenface and Learning Vector Quantization (LVQ) to Face Recognition
2020
Face recognition is a widely discussed topic because it can be applied and developed for several needs that are useful in daily life. Face recognition typically relies on learning methods such as Eigenface and Learning Vector Quantization (LVQ). The learning process uses digital face images captured by a camera from five angles per person as the training set, while live camera images are used for testing. In the first step, the face is detected with a Haar cascade and captured by the camera; the image is then preprocessed to obtain the best input for the network. From ten tests with various parameter values, the results show that LVQ identifies faces more accurately than Eigenface, with an average accuracy of 66.29% against 56.67% (a difference of 9.62%), while Eigenface is faster, with an average running time of 4.39 seconds against 7.38 seconds for LVQ (a difference of 2.98 seconds). The Haar-cascade detection step is sketched after this entry.
Journal Article
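A small sketch of the detection step the pipeline starts with, using OpenCV's bundled frontal-face Haar cascade; this illustrates the general approach, not the authors' code, and assumes the opencv-python package is installed.

import cv2

# load OpenCV's pre-trained frontal-face Haar cascade
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

def detect_faces(image_path):
    # read the image, convert it to grayscale, and return face bounding boxes
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# each (x, y, w, h) box would then be cropped, resized, and fed to
# the Eigenface projection or the LVQ network for recognition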
Literature Review: Implementation of Facial Recognition in Society
2020
As one of the most successful applications of image analysis and understanding, face recognition has received significant attention, especially during the past few years. Facial recognition technology (FRT) has emerged as an attractive solution to many contemporary needs for identification and verification of identity claims. It brings together the promise of other biometric systems, which attempt to tie identity to individually distinctive features of the body, and the more familiar functionality of visual surveillance systems. This report develops a socio-political analysis that bridges the technical and social-scientific literature on FRT and addresses the unique challenges and concerns that attend its development, evaluation, and specific operational uses, contexts, and goals. It highlights the potential and limitations of the technology, noting the tasks for which it seems ready for deployment, the areas where performance obstacles may be overcome by future technological developments or sound operating procedures, and still other issues that appear intractable. Its concern with efficacy extends to ethical considerations. Face recognition technology may address the identification problem, since a face is undeniably connected to its owner, except in the case of identical twins, and is non-transferable. The system can then compare scans to records stored in a central or local database, or even on a smart card.
Journal Article
Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)
2018
Limited storage resources are one reason to switch to cloud storage. The confidentiality and security of data stored in the cloud are very important, and one way to preserve them is to use cryptographic techniques. The Data Encryption Standard (DES) is a block-cipher algorithm used as a standard symmetric encryption algorithm. DES produces eight cipher blocks that are combined into one ciphertext, but this ciphertext is weak against brute-force attacks. Therefore, the eight cipher blocks are embedded into eight random images using the Least Significant Bit (LSB) algorithm, which hides the DES cipher output in the images before they are merged into one (an LSB embedding sketch follows this entry).
Journal Article
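A brief sketch of hiding one DES cipher block in an image with LSB substitution, assuming NumPy and an 8-bit grayscale cover image; the helper and the example bytes are hypothetical, not the paper's implementation.

import numpy as np

def embed_lsb(cover, payload):
    # replace the least significant bit of each pixel with one payload bit
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if len(bits) > len(flat):
        raise ValueError('cover image is too small for the payload')
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
    return flat.reshape(cover.shape)

# example: hide an 8-byte DES cipher block in a random 64x64 grayscale cover
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cipher_block = b'\x3a\x91\x5c\x07\xee\x42\xb8\x1d'    # illustrative bytes only
stego = embed_lsb(cover, cipher_block)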
A Classification: using Back Propagation Neural Network Algorithm to Identify Cataract Disease
2020
Artificial neural networks are often used in pattern recognition, speech recognition, and image recognition, where a high level of computation is needed. One method that can be used is the back-propagation neural network, which can be applied to identify several diseases, one of which is cataract. Cataract is one of the most significant diseases that can cause blindness and occurs at three levels: mature, immature, and hyper-mature cataracts. The testing method uses two parameters: the number of epochs, with values of 1000, 5000, and 10000, and the learning rate, with values of 0.01, 0.05, and 0.1. The tests show that the best parameters are an epoch count of 10000 and a learning rate of 0.1 on 80 experimental data points. In experiments identifying normal, mature, immature, and hyper-mature cases, accuracies of 100%, 95%, 85%, and 90% were obtained, and the overall accuracy of the BPN method on the test results is 92.5% (the parameter grid is sketched after this entry).
Journal Article
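A brief sketch of searching the epoch and learning-rate grid from the abstract, using scikit-learn's MLPClassifier as a stand-in for the back-propagation network; the feature matrix X and label vector y are assumed inputs, and this is not the authors' implementation.

from itertools import product
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def grid_search(X, y):
    # try every combination of the epoch counts and learning rates reported
    best = (None, 0.0)
    for epochs, lr in product([1000, 5000, 10000], [0.01, 0.05, 0.1]):
        clf = MLPClassifier(max_iter=epochs, learning_rate_init=lr,
                            solver='sgd', random_state=0)
        score = cross_val_score(clf, X, y, cv=5).mean()
        if score > best[1]:
            best = ((epochs, lr), score)
    return best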
A Crypto Compression System Using ElGamal Public Key Encryption Algorithm and Even-Rodeh Codes
2020
Confidentiality and data size are significant aspects of data exchange. Public-key encryption algorithms such as ElGamal are known to preserve confidentiality; on the other hand, they tend to increase the size of the encrypted data, making it harder to transmit to the other party. This research combines the ElGamal public-key encryption algorithm with Even-Rodeh codes: ElGamal is used to secure the data and Even-Rodeh codes are used to compress it. The parameters tested are the compression ratio and the space savings, measured in an experiment on several text files from the Artificial Corpus. The results show that the crypto-compression system can reduce the size of the transmitted data, and that the transmitted data can be reverted to the original data while still maintaining confidentiality (both metrics are sketched after this entry).
Journal Article
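The two metrics the study reports can be computed directly from byte counts. A small sketch, under the assumption that compression ratio is defined as compressed size over original size and space savings as its complement; definitions vary between papers, and the example sizes are illustrative only.

def compression_ratio(original_size, compressed_size):
    # fraction of the original size that the compressed data occupies
    return compressed_size / original_size

def space_savings(original_size, compressed_size):
    # fraction of the original size that compression removed
    return 1 - compressed_size / original_size

# example: a 10,000-byte ciphertext compressed to 7,250 bytes (illustrative)
print(compression_ratio(10000, 7250))   # 0.725
print(space_savings(10000, 7250))       # 0.275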