Abstract
Clustering is a widely used method for classifying data, and diverse statistical methods exist for dividing data into groups. Cluster analysis partitions a large population into groups of individuals that share common features: observations within a group are similar to one another and differ from observations in other groups, and each observation is ultimately assigned to the group it belongs to. Over the past decades, clustering has been used increasingly in data analysis and data mining, and several methods have been developed for grouping data with common features. Among these, the k-means family (k-means, k-means++, and kernel k-means) comprises some of the most important statistical tools for clustering; these methods are typically applied to linearly separable data. Data can also be classified by assigning underlying distributions to the data. In this paper, different underlying distributions are assigned for clustering; in particular, simulated data are clustered by assigning a normal or a uniform distribution. To assess the improvement of each method, we assigned these two underlying distributions to classify linearly separable simulated data and compared the results with the k-means method and with the ground truth of the data. The study found improved clustering when using the uniform density function, including a lower overlap percentage. A significance test found no significant difference between the estimated cluster mean and the underlying cluster mean. In addition, the proposed methods perform better as the sample size increases.
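As a concrete illustration of the baseline comparison described above, the following is a minimal sketch of plain (Lloyd's) k-means applied to linearly separable simulated data drawn from two normal distributions, with accuracy measured against the ground-truth labels. The simulation design (cluster means, sample sizes) is illustrative only and is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulation: two linearly separable clusters
# drawn from 2-D normal distributions (not the paper's exact design).
a = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
b = rng.normal(loc=6.0, scale=1.0, size=(100, 2))
X = np.vstack([a, b])
truth = np.array([0] * 100 + [1] * 100)

def kmeans(X, k=2, iters=50, seed=0):
    """Plain Lloyd's k-means: assign each point to its nearest
    centroid, then recompute centroids, repeated for `iters` steps."""
    r = np.random.default_rng(seed)
    centroids = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Distance from every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(X)
# Cluster labels are arbitrary, so score against the ground truth
# up to a permutation of the two labels.
accuracy = max(np.mean(labels == truth), np.mean(labels == 1 - truth))
```

On well-separated simulated clusters like these, k-means recovers the ground-truth grouping almost perfectly, which is the baseline the density-based assignments are compared against.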
Keywords
Probability density, k-means algorithm, linearly separable data, normal distribution, uniform distribution.