An Entropy-based Weighted Clustering Algorithm and Its Optimization for Ad hoc Networks

Originally published in the 3rd IEEE International Conference on Wireless and Mobile Computing, Networking and Communications (WiMOB 2007).

Yu-Xuan Wang
Dept. of Computer Science and Engineering, The Ohio State University, Columbus, Ohio
Email: logpie @ gmail.com

Forrest Sheng Bao
Dept. of Computer Science, Texas Tech University, Lubbock, Texas
Email: forrest.bao @ gmail.com

Abstract— As a newly proposed weight-based clustering algorithm, WCA shows improved performance compared with previous clustering algorithms. However, high node mobility leads to frequent reaffiliation, which increases the network overhead. To address this problem, we propose an entropy-based WCA (EWCA) which improves the stability of the network. Meanwhile, in order to better facilitate the optimal operation of the MAC protocol and to further stabilize the network structure, we apply tabu search to EWCA to choose a near-optimal dominant set. Consequently, fewer clusterheads are required to manage the network. A simulation study indicates that the revised algorithm (EWCA-TS) outperforms the original WCA, especially in the number of clusters and the reaffiliation frequency.

I. INTRODUCTION

Unlike traditional wireless networks, an ad hoc network is an infrastructure-less network consisting of autonomous mobile nodes. Every node in the network acts as a router or packet forwarder, and nodes interconnect with each other as peers to keep the network functioning. Although originally designed for military purposes, ad hoc networks have been used in various civilian applications over the last decade. Since ad hoc networks differ from traditional hierarchical networks, their tremendous importance and the lack of rigorous methodologies motivate in-depth research in this area.

Analogous to the partitioning used in traditional cellular networks, clustering is applied in ad hoc networks to avoid the inefficient use of power and bandwidth that would result if every node communicated directly. Each cluster elects one clusterhead, an upper-layer node, to manage the cluster and coordinate with other clusters. The set of clusterheads is known as the dominant set. Several heuristics have been proposed to choose clusterheads in ad hoc networks, such as the Highest-Degree heuristic [1], [2], the Lowest-ID heuristic [3]-[5], and the Node-Weight heuristic [6], [7].

The Weighted Clustering Algorithm (WCA) was first proposed by M. Chatterjee, S. K. Das and D. Turgut [8]. A node is selected to be the clusterhead when it has the minimum weighted sum of four indices: the number of potential members; the sum of the distances to the other nodes within its radio range; the node's average moving speed (less movement is desired); and the time it has spent as a clusterhead (which takes battery life into account). When a node has moved out of its cluster, it first checks whether it can become a member of another cluster. If such a cluster exists, the node detaches from its current cluster and attaches itself to the new one. This process of joining a new cluster is known as reaffiliation. If the reaffiliation fails, the whole network invokes the clusterhead election routine again.

One disadvantage of WCA is its high reaffiliation frequency when the network scenario changes quickly. Frequent reaffiliation increases the communication overhead, so reducing the amount of reaffiliation is necessary in ad hoc networks. In practice, to better facilitate the management of the network, a good dominant set in which each clusterhead handles the maximum possible number of mobile nodes is required. The underlying optimal assignment problem, however, is an NP-hard combinatorial optimization problem [9], essentially the minimum dominating set problem of graph theory. Several approximation algorithms [10]-[12] have been applied to this problem and achieved improved performance compared with the original WCA.

In this paper, we tackle these two problems. First, we replace one term of the original weight formula, the average moving speed of nodes, with the entropy of the local network. By using this more reasonable and comprehensive measure of the mobility of the network, the reaffiliation frequency can be effectively reduced. Second, tabu search is a powerful technique for complex combinatorial optimization and has been shown to be more effective than other approximation algorithms for the minimum dominating set problem [13]. Based on the revised combined weight formula, we therefore use tabu search to optimize the clusterhead election routine, which is beneficial for forming near-optimal dominant sets.

The rest of this paper is organized as follows. In Section II, we briefly introduce the clusterhead election routine of the basic WCA and its simulated-annealing-optimized version. In Section III, we present the proposed algorithms EWCA and EWCA-TS. Simulation results and comparisons are presented in Section IV, and conclusions are offered in Section V.

II. THE WEIGHTED CLUSTERING ALGORITHM

In this section, we briefly describe the basic WCA and its simulated-annealing-optimized version, which serve as both the basis for and the performance gauge of the proposed algorithm. Here we only focus on the algorithmic aspects of the clusterhead election routine; other details can be found in [8].

A. The Basic WCA

After the nodes have obtained enough information and created their neighbor lists, WCA calculates the combined weight of node v as

W_v = w_1 \Delta_v + w_2 D_v + w_3 M_v + w_4 P_v    (1)

where Δ_v is the degree-difference between the number of members of node v and the number it can handle under ideal conditions, D_v is the sum of the distances from the members to node v, M_v is the average speed of the node, and P_v is the cumulative time node v has served as a clusterhead. The weighting factors w_1, w_2, w_3 and w_4 satisfy w_1 + w_2 + w_3 + w_4 = 1. The node with the minimum W_v is chosen to be the clusterhead. Once a node becomes a clusterhead, that node and its members are marked as "considered". The election process then iterates over the "unconsidered" nodes (initially, all nodes are "unconsidered") and terminates once every node has been considered.

B. The Simulated Annealing Optimized WCA (WCA-SA)

To minimize the total weight of the dominant set and thus improve network performance, [10] introduced simulated annealing (SA) into the clusterhead election routine. SA starts from a random solution (dominant set) and a predefined initial temperature. In each iteration, SA performs a random walk over the solution space by Monte Carlo sampling, and the Metropolis criterion determines whether a move is accepted: if the new solution is better than the current one (its total weight is smaller), the move is accepted; otherwise acceptance is decided by a probability that depends on the current temperature and the difference between the two solutions. After the Monte Carlo block of each iteration, the temperature is decreased (annealing), which helps the search escape local minima while balancing the exploration and exploitation abilities of the algorithm.
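Before moving to the proposed algorithm, the basic greedy election of Section II-A (which SA, and later the tabu search of Section III-B, replaces with a search over dominant sets) can be sketched as follows. This is a minimal illustrative sketch, not the implementation from [8]; the `Node` container, the neighbor sets and the default weighting factors are assumptions made only for the example.

```python
# Minimal sketch of the basic WCA clusterhead election (Section II-A).
# Assumes the four indices of every node have already been measured.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: int
    degree_diff: float      # Delta_v: difference from the ideal degree
    dist_sum: float         # D_v: sum of distances to the node's members
    avg_speed: float        # M_v: average moving speed
    ch_time: float          # P_v: cumulative time spent as clusterhead
    neighbors: set = field(default_factory=set)  # ids of one-hop neighbors

def combined_weight(n: Node, w=(0.7, 0.2, 0.05, 0.05)) -> float:
    """Equation (1): W_v = w1*Delta_v + w2*D_v + w3*M_v + w4*P_v."""
    w1, w2, w3, w4 = w
    return w1 * n.degree_diff + w2 * n.dist_sum + w3 * n.avg_speed + w4 * n.ch_time

def elect_clusterheads(nodes: dict) -> set:
    """Greedy election: repeatedly pick the unconsidered node with the
    minimum combined weight; it and its neighbors become 'considered'."""
    unconsidered = set(nodes)
    clusterheads = set()
    while unconsidered:
        v = min(unconsidered, key=lambda i: combined_weight(nodes[i]))
        clusterheads.add(v)
        unconsidered -= {v} | nodes[v].neighbors
    return clusterheads
```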

III. THE PROPOSED ALGORITHM

A. Entropy-based WCA (EWCA)

A higher reaffiliation frequency leads to more recalculations of the cluster assignment and hence increases the communication overhead. This motivates us to look for a better clusterhead election criterion that forms a more "stable" network.

Beongku An and Symeon Papavassiliou [14] introduced an entropy-based model for evaluating route stability in ad hoc networks. Entropy expresses uncertainty and is a measure of the disorder in a system, so we consider it a better way to measure the stability and mobility of an ad hoc network.

Denote the positions of nodes m and n at time t as \vec{p}(m, t) and \vec{p}(n, t) respectively. The positions of the nodes are sampled periodically during a time interval Δt. The relative position between nodes m and n at time t is defined as

\vec{p}(m, n, t) = \vec{p}(m, t) - \vec{p}(n, t)    (2)

The variable feature of the system (network) considered here is the relative position between two nodes m and n, characterized by

a_{m,n} = \frac{1}{N} \sum_{i=1}^{N} |\vec{p}(m, n, t_i)|    (3)

where t_i is the time instant of the i-th calculation and N is the number of discrete times t_i within the interval Δt.

We can now define the entropy. Denoting the entropy of node m as H_m(t, Δt), we have

H_m(t, \Delta t) = \frac{-\sum_{k \in F_m} P_k(t, \Delta t) \log P_k(t, \Delta t)}{\log C(F_m)}    (4)

where P_k(t, \Delta t) = a_{m,k} / \sum_{i \in F_m} a_{m,i}. Here F_m denotes the set (or any subset) of the neighboring nodes of node m, and C(F_m) the cardinality (degree) of the set F_m. In this paper, F_m refers to the local network centered at node m, i.e., the set of all nodes that can reach node m in one hop, so H_m represents the stability of this local network. It should be noted that the entropy, as defined here, is small when the values of the variable in the given region change severely and large when they change little [14].

We replace one term of (1), the average speed of the nodes (M_v), with the entropy defined in (4). The new formula for W_v becomes

W_v = c_1 \Delta_v + c_2 D_v + c_3 (-H_v) + c_4 P_v    (5)

where c_1, c_2, c_3 and c_4 are weighting factors. The simulation study given later indicates that this replacement effectively reduces the reaffiliation frequency, especially for networks consisting of high-speed nodes.
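As a reading aid, the following Python sketch computes the local entropy of (2)-(4) and the revised weight (5). It is our illustrative interpretation rather than the authors' code: the `positions[node]` list of N sampled (x, y) coordinates is an assumed data structure, and the handling of degenerate local networks is our own choice. The default factors match the 0.6, 0.05, 0.3, 0.05 setting used later in Section IV.

```python
import math

def relative_distance_avg(positions, m, n):
    """a_{m,n} of Eq. (3): mean magnitude of the relative position
    p(m,n,t_i) = p(m,t_i) - p(n,t_i) over the N samples in the interval."""
    samples = list(zip(positions[m], positions[n]))
    total = 0.0
    for (xm, ym), (xn, yn) in samples:
        total += math.hypot(xm - xn, ym - yn)
    return total / len(samples)

def local_entropy(positions, m, neighbors):
    """H_m(t, Delta_t) of Eq. (4), normalized by log C(F_m)."""
    a = {k: relative_distance_avg(positions, m, k) for k in neighbors}
    total = sum(a.values())
    if total == 0 or len(neighbors) < 2:
        return 1.0  # degenerate local network: treated here as maximally stable (assumption)
    h = 0.0
    for k in neighbors:
        p_k = a[k] / total
        if p_k > 0:
            h -= p_k * math.log(p_k)
    return h / math.log(len(neighbors))

def ewca_weight(delta_v, d_v, h_v, p_v, c=(0.6, 0.05, 0.3, 0.05)):
    """Equation (5): W_v = c1*Delta_v + c2*D_v + c3*(-H_v) + c4*P_v."""
    c1, c2, c3, c4 = c
    return c1 * delta_v + c2 * d_v + c3 * (-h_v) + c4 * p_v
```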

B. The Tabu Search Optimized EWCA (EWCA-TS)

The idea of tabu search (TS) was first proposed by Fred Glover [15], [16]. Inspired by human problem-solving behavior, TS has achieved great success on combinatorial optimization problems. To avoid entrapment in local minima, TS forbids certain classes of moves: attributes of solutions with good fitness values are marked in a tabu list to prevent cycling, so that the search can explore a larger portion of the solution space. TS maintains an adaptive tabu list and an associated tabu strategy to guarantee the diversity of the search. Meanwhile, to prevent the loss of promising solutions, TS introduces an aspiration criterion: if the fitness of a solution improves on the "best so far" value, its tabu status is ignored and, moreover, it is adopted directly as the current solution. A simple TS proceeds as follows.

Step 1: Randomly generate an initial solution s and empty the tabu list. Set the best fitness value f* = fitness(s) and the optimal solution s* = s.
Step 2: Generate a specified number of neighboring trial solutions s' of s, evaluate their quality with the fitness function, and choose the solutions with high fitness values as candidates.
Step 3: For each candidate s', check whether the aspiration criterion is satisfied. If not, go to Step 4. If it is, set the current solution s and the optimal solution s* to s' and update the best fitness value f* = fitness(s'). Meanwhile, add the associated attribute of s' to the tabu list, set its tenure to the tabu length, and update the tenures of all existing attributes as well.
Step 4: Check the tabu status of the associated attributes of all candidates. Choose the fittest non-tabued solution from the candidate set as the current solution s, tabu its associated attribute, and update the tenures of all tabued attributes in the tabu list.
Step 5: Repeat the above steps until the specified number of iterations is reached.

The neighborhood structure, the candidates, the tabu length, the tabu attributes and the aspiration criterion are the key factors that affect the performance of a tabu search algorithm. The neighborhood function reflects the local search aspect; the purpose of the tabu list is to avoid cyclic search; the aspiration criterion rewards good candidates and prevents the loss of promising solutions. Because TS has adaptive memory, an aspiration criterion and a tolerance of bad solutions, it has a strong "climbing" ability that enhances the probability of reaching the global optimum.

In practice, in order to balance the network load, improve the stability of clusters and prolong the lifetime of the network, we need to choose a good dominant set in which each clusterhead handles the maximum possible number of mobile nodes. This facilitates the optimal operation of the MAC protocol, reduces the number of clusterheads and thereby reduces the network communication overhead. Given the success of TS in optimizing the minimum dominating set problem, we propose a TS approach to optimize EWCA for further performance improvement, especially in terms of reducing the reaffiliation frequency and the number of clusterheads. The pseudo-code of EWCA-TS is given in Algorithm 1.

The goal of TS is to minimize the objective function F(s) = \sum_{v \in s} W_v, which is the total weight of the dominant set. In the pseudo-code of EWCA-TS, s is the current solution (dominant set), s* is the best known solution, and F* is the best fitness value found so far. TL is short for tabu length, which limits the maximum tenure of tabued attributes in the tabu list.

Algorithm 1 EWCA-TS
 1: Randomly generate an initial solution s; s* ← s, F* ← F(s), Tabu_List ← ∅.
 2: repeat
 3:     Generate NS neighboring solutions of s and sort them by fitness value.
 4:     Select the CS best candidate solutions from the sorted neighboring solutions.
 5:     for each solution s_i among the CS sorted candidates do
 6:         if F(s_i) < F* then
 7:             s* ← s_i, F* ← F(s_i). Tabu the attribute associated with s_i and set its tenure to TL.
 8:         else if s_i has not been tabued then
 9:             s ← s_i. Tabu the attribute associated with s_i and set its tenure to TL.
10:         end if
11:     end for
12:     Update Tabu_List.
13: until MAX_IT iterations are reached
14: Dominant set (clusterhead set) ← s*

When the tenure of a tabued attribute reaches 0, the attribute is erased. CS stands for the size of the candidate set, NS is the size of the neighborhood solution set from which the algorithm selects the CS candidates with the highest fitness values, and MAX_IT is the maximum number of iterations.

A good neighborhood structure can effectively improve the quality of solutions and accelerate convergence. We use the following neighborhood solution generating procedure (a sketch combining Algorithm 1 and this procedure is given at the end of this subsection).

Step 1: Randomly discard the clusterhead role of two clusterheads, c1 and c2. The members of c1 and c2 lose their affiliations, and c1 and c2 are prohibited from being clusterheads in the newly generated dominant set.
Step 2: For every node that is neither a clusterhead nor a cluster member and whose degree is less than a predefined maximum degree, make that node a clusterhead.
Step 3: Set all remaining unassigned nodes to be clusterheads.

A new dominant set generated by this procedure contains part of the original dominant set as well as random components, which lets TS better explore the space of potential solutions. Note that the tabu attribute stored in the tabu list is not the solution (dominant set) itself but the pair of discarded clusterheads c1 and c2.
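The following compact Python sketch puts Algorithm 1 and the neighborhood-generation procedure together. It is an illustrative reconstruction under simplifying assumptions, not the code used for the simulations: `weight`, `degree` and `neighbors` are assumed precomputed per-node tables (with W_v taken from Eq. (5)), a dominant set is represented as a frozenset of node ids, and the best candidate is taken greedily in each iteration. The default parameters TL=5, CS=20, NS=50 and MAX_IT=50 match the setting used in Section IV.

```python
import random

def generate_neighbor(dominant_set, nodes, neighbors, degree, max_degree):
    """Neighborhood move of Section III-B: drop two random clusterheads,
    then rebuild a dominant set around the remaining structure."""
    heads = set(dominant_set)
    dropped = random.sample(sorted(heads), 2) if len(heads) >= 2 else sorted(heads)
    heads -= set(dropped)                       # Step 1: discard two clusterheads
    covered = set()
    for h in heads:                             # nodes covered by surviving clusterheads
        covered |= {h} | neighbors[h]
    for v in nodes:                             # Step 2: promote uncovered, low-degree nodes
        if v not in covered and v not in dropped and degree[v] < max_degree:
            heads.add(v)
            covered |= {v} | neighbors[v]
    for v in nodes:                             # Step 3: remaining unassigned nodes become heads
        if v not in covered:
            heads.add(v)
            covered.add(v)
    return frozenset(heads), tuple(sorted(dropped))   # (solution, tabu attribute)

def tabu_search(initial, nodes, neighbors, degree, weight, max_degree,
                TL=5, CS=20, NS=50, MAX_IT=50):
    """Tabu loop of Algorithm 1, minimizing F(s) = sum of W_v over the dominant set."""
    fitness = lambda s: sum(weight[v] for v in s)
    best, current = initial, initial
    best_f = fitness(initial)
    tabu = {}                                    # tabu attribute -> remaining tenure
    for _ in range(MAX_IT):
        cand = [generate_neighbor(current, nodes, neighbors, degree, max_degree)
                for _ in range(NS)]
        cand.sort(key=lambda sa: fitness(sa[0]))
        for s, attr in cand[:CS]:
            if fitness(s) < best_f:              # aspiration criterion overrides tabu status
                best, best_f, current = s, fitness(s), s
                tabu[attr] = TL
                break
            if attr not in tabu:                 # otherwise take the fittest non-tabued candidate
                current = s
                tabu[attr] = TL
                break
        tabu = {a: t - 1 for a, t in tabu.items() if t > 1}   # age and expire tenures
    return best
```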

IV. SIMULATION STUDY

This section contains three parts. The first part evaluates the effect of replacing the average speed with entropy in terms of reducing the reaffiliation frequency. The second part evaluates the effect of introducing tabu search in terms of reducing the number of clusters; comparisons between SA and TS are also made. The last part gives an overall comparison of network performance among five approaches (WCA, EWCA, WCA-TS, WCA-SA and EWCA-TS). A random waypoint mobility model is adopted in all three parts.

A. Evaluation of Introducing Entropy

In order to demonstrate the influence of entropy, we set the weighting factors of the original WCA to w_1 = w_2 = w_4 = 0, w_3 = 1 and those of EWCA to c_1 = c_2 = c_4 = 0, c_3 = 1. In this way, only the effectiveness of the entropy versus the average speed in reducing reaffiliation is contrasted.

We simulate a system of 30 nodes on a 100 x 100 grid. The relationship between reaffiliation frequency and transmission range (tx_range) is illustrated in Fig. 1. The nodes can move in all possible directions with a displacement varying from 0 to a maximum value (max_disp); a simplified sketch of this movement step is given at the end of this subsection. Since the reaffiliation phenomenon is most obvious in networks with high-speed nodes, we choose a maximum displacement of 30. Fig. 1 indicates that the reaffiliation per unit time of EWCA is obviously lower than that of the original WCA for transmission ranges of 20 to 40, while EWCA and WCA have similar reaffiliation rates in the other intervals. The reason is as follows. When the transmission range is small, every clusterhead manages only a few nodes, so the reaffiliation frequency is not high. When the transmission range becomes large, one cluster can cover a large area, so it is not easy for a node to move out of the transmission range of its clusterhead. Hence the reaffiliation phenomenon is not obvious in these intervals, which is also why each curve reaches a peak and then drops.

Fig. 1. Reaffiliation per unit time vs. transmission range, max_disp=30, number of nodes=30

Since EWCA performs better for transmission ranges of 20 to 40, we investigate this scenario more deeply. Keeping the number of nodes at 30 and the transmission range at 30, the relationship between reaffiliation frequency and maximum displacement is illustrated in Fig. 2. We observe that WCA and EWCA have similar reaffiliation frequencies when max_disp is small. The larger max_disp is (i.e., the faster the nodes move), the more obvious the difference between WCA and EWCA becomes: EWCA reduces the reaffiliation frequency of WCA by about 30% when the mobile nodes move fast.

Fig. 2. Reaffiliation per unit time vs. maximum displacement, number of nodes=30, tx_range=30
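For reference, the movement pattern described above (each node displaced in a random direction by a random amount up to max_disp at every step, within the 100 x 100 grid) can be sketched as follows. This is only our reading of the setup; clamping at the grid boundary is an assumption made for the example.

```python
import math
import random

def move_nodes(positions, max_disp, grid=100.0):
    """One mobility step: each node moves in a random direction with a
    displacement drawn uniformly from [0, max_disp], clamped to the grid."""
    new_positions = {}
    for node, (x, y) in positions.items():
        angle = random.uniform(0.0, 2.0 * math.pi)
        step = random.uniform(0.0, max_disp)
        nx = min(max(x + step * math.cos(angle), 0.0), grid)
        ny = min(max(y + step * math.sin(angle), 0.0), grid)
        new_positions[node] = (nx, ny)
    return new_positions
```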

B. Evaluation of Introducing Tabu Search

In this subsection, to demonstrate and fairly compare the effects of the optimization, we apply SA and TS respectively to the clusterhead election routine of the original WCA. First, we compare SA [10] and TS on minimizing the cost function (the objective function defined in Section III-B); the convergence curves on one problem instance are shown in Fig. 3.

Fig. 3. Convergence graph of TS and SA on a problem instance

Fig. 4. Average number of clusters vs. maximum displacement for WCA, WCA-SA and WCA-TS; number of nodes=30, tx_range=30

TABLE I. Mean values of simulation results; max_disp=30, max degree=10, tx_range=30

Method     N    Cost     Clusters   Reaffiliation
WCA        20   118.79   8.21       2.05
WCA        30   166.37   9.33       3.21
WCA        40   212.21   9.96       4.32
WCA-SA     20   106.57   7.04       1.57
WCA-SA     30   148.54   7.41       2.17
WCA-SA     40   192.58   7.83       2.93
WCA-TS     20    97.11   6.45       1.44
WCA-TS     30   134.80   7.25       2.15
WCA-TS     40   175.87   7.88       2.81
EWCA       20    57.12   7.91       1.87
EWCA       30    66.18   9.08       2.94
EWCA       40    72.07   9.67       3.85
EWCA-TS    20    46.33   6.41       1.28
EWCA-TS    30    52.79   7.01       1.88
EWCA-TS    40    59.39   7.62       2.47

In TS, we set TL = 5, CS = 20, NS = 50 and MAX_IT = 50. In SA, the parameters are the same as those in [10]: the maximum number of iterations is 50, the initial temperature is T0 = 50, the constant used to reduce the temperature is α = 0.9, and the number of sample solutions checked before decreasing the temperature is L = 100. From Fig. 3, we can see that both algorithms converge quickly and provide good solutions in the early iterations. However, SA tends to stagnate in the search space in the later iterations and converges prematurely, leaving little chance to improve the final solution quality. In contrast, TS effectively escapes local minima and keeps improving the solution.

Fig. 4 shows the average number of clusters for varying max_disp. Both optimized WCAs reduce the number of clusters, since each clusterhead handles the maximum possible number of nodes; among them, TS achieves the best result.

C. An Overall Comparison

In this subsection, we offer an overall comparison of the five algorithms (WCA, WCA-SA, WCA-TS, EWCA and EWCA-TS) on three metrics (cost, average number of clusters and reaffiliation frequency) for a varying number of mobile nodes. The results are listed in Table I; each entry in the table is an average of 20 runs. N is the number of nodes, varying from 20 to 40. The cost is defined as \sum_{v \in S} W_v, which is also the objective function to be minimized; "Clusters" is the number of clusters and "Reaffiliation" is the reaffiliation frequency. Since we are mainly concerned with the number of clusters and the network stability in this paper, we set the weighting factors to 0.6, 0.05, 0.3, 0.05 in the simulation. In practice, this configuration should be adjusted according to the system requirements.

The performance of the five algorithms can be seen in Table I. It should be noted that the costs of WCA/WCA-SA/WCA-TS and of EWCA/EWCA-TS are not comparable, since they use different combined weight formulas.

EWCA yields a similar number of clusters to WCA, but a lower reaffiliation frequency. Tabu search optimizes the clusterhead election procedure: both the number of clusters and the reaffiliation frequency are significantly reduced by using it. EWCA-TS, WCA-TS and WCA-SA have a much lower reaffiliation frequency and fewer clusters than their non-optimized versions, and among all five approaches, EWCA-TS achieves the best performance. EWCA-TS and WCA-TS only marginally outperform WCA-SA, but we note that for the TS-optimized methods the solutions in Table I are obtained after calling the neighborhood solution generating function 50 x 50 = 2500 times, whereas WCA-SA needs 50 x 100 = 5000 such calls. We therefore consider that, under the same circumstances, TS is more suitable than SA in practice, since it not only gives better results but also requires less computation.

V. CONCLUSION

This paper proposed an entropy-based WCA (EWCA) and its optimization (EWCA-TS). EWCA mainly focuses on reducing the reaffiliation caused by high-speed nodes. To enhance performance in diverse aspects, such as longer battery life and a lower frequency of network reassignment, we used tabu search to optimize EWCA and achieved a significant performance improvement with relatively low computational cost, especially with regard to the average number of clusters and the reaffiliation frequency. Consequently, each clusterhead can maximize the number of its members and the network can keep its structure stable for much longer. The simulation study shows that these goals are achieved.

REFERENCES

[1] M. Gerla and J. Tsai, "Multicluster, mobile, multimedia radio network," Wireless Networks, vol. 1, no. 3, 1995.
[2] A. K. Parekh, "Selecting routers in ad-hoc wireless networks," Proceedings of the SBT/IEEE International Telecommunications Symposium, August 1994.
[3] D. J. Baker and A. Ephremides, "A distributed algorithm for organizing mobile radio telecommunication networks," Proceedings of the 2nd International Conference on Distributed Computer Systems, April 1981.
[4] D. J. Baker and A. Ephremides, "The architectural organization of a mobile radio network via a distributed algorithm," IEEE Transactions on Communications, vol. 29, no. 11, 1981.
[5] A. Ephremides, J. E. Wieselthier, and D. J. Baker, "A design concept for reliable mobile radio networks with frequency hopping signaling," Proceedings of the IEEE, vol. 75, no. 1, 1987.
[6] S. Basagni, "Distributed clustering for ad hoc networks," Proceedings of the International Symposium on Parallel Architectures, Algorithms and Networks, June 1999.
[7] S. Basagni, "Distributed and mobility-adaptive clustering for multimedia support in multi-hop wireless networks," Proceedings of the Vehicular Technology Conference, 1999.
[8] M. Chatterjee, S. K. Das, and D. Turgut, "WCA: A weighted clustering algorithm for mobile ad hoc networks," Cluster Computing, vol. 5, 2002.
[9] S. Basagni, I. Chlamtac, and A. Farago, "A generalized clustering algorithm for peer-to-peer networks," Proceedings of the Workshop on Algorithmic Aspects of Communication (satellite workshop of ICALP), July 1997.
[10] D. Turgut, B. Turgut, R. Elmasri, and T. V. Le, "Optimizing clustering algorithm in mobile ad hoc networks using simulated annealing," Proc. IEEE Wireless Communications and Networking Conference, 2003.

[11] D. Turgut, S. K. Das, R. Elmasri, and B. Turgut, "Optimizing clustering algorithm in mobile ad hoc networks using genetic algorithm approach," Proc. 45th IEEE Global Telecommunications Conference, 2002.
[12] G. S. Ryder and K. G. Ross, "A probability collectives approach to weighted clustering algorithms for ad hoc networks," IASTED Conference on Communication and Computer Networks, 2005.
[13] R. Davies and G. F. Royle, "Graph domination, tabu search and the football pool problem," Discrete Applied Mathematics, vol. 74, no. 3, 1997.
[14] B. An and S. Papavassiliou, "An entropy-based model for supporting and evaluating route stability in mobile ad hoc wireless networks," IEEE Communications Letters, vol. 6, no. 8, August 2002.
[15] F. Glover, "Future paths for integer programming and links to artificial intelligence," Computers & Operations Research, vol. 13, no. 5, 1986.
[16] F. Glover, "Tabu search - Part I," ORSA Journal on Computing, vol. 1, no. 3, 1989.
