Crisp decision trees depend on sharp split points to partition the data based on continuous-valued attributes. This crisp partitioning cannot capture the compatibility of boundary points with the partitions on both sides of a split point, which may lead to misclassification. Such situations are better handled by fuzzy decision trees, which provide blurred boundaries around the fuzzy partitions. Some of the existing learning algorithms for the development of fuzzy decision trees rely on pre-fuzzifying the attributes before constructing the fuzzy decision tree. The drawback of this approach lies in the difficulty of finding the correct number of fuzzy partitions for each attribute, and consequently in poor performance due to rule redundancy.
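As an illustrative sketch (not taken from the paper), the contrast between a crisp split and a fuzzy partition boundary can be shown with a simple membership function; the split point 30 and the overlap width 10 below are hypothetical values.

```python
# Hypothetical sketch: a crisp split assigns a boundary point wholly to one
# side, while a fuzzy partition lets membership fade across the split point.

def crisp_low(x, split=30.0):
    """Crisp test x < split: full membership on one side, none on the other."""
    return 1.0 if x < split else 0.0

def fuzzy_low(x, split=30.0, width=10.0):
    """Fuzzy 'low' partition: membership decreases linearly across a blurred
    boundary of the given width centered on the split point."""
    if x <= split - width / 2:
        return 1.0
    if x >= split + width / 2:
        return 0.0
    return (split + width / 2 - x) / width

# A point just to the right of the split is rejected outright by the crisp
# test but retains partial membership in the fuzzy partition.
print(crisp_low(30.5))  # 0.0
print(fuzzy_low(30.5))  # 0.45
```

This is the "compatibility of boundary points" issue in miniature: 30.5 is nearly indistinguishable from 29.5, yet the crisp split treats them as entirely different cases.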
[...] Development of a fuzzy decision tree using the proposed algorithm is depicted in the figure.

APPLICATION

In this paper we consider the task of predicting possible subscribers for an Online Learning System (OLS) [6]. An OLS is a study community that helps students clear their doubts and solve their homework problems. A student registers as a free member or a paid member to access the OLS. A free member is one who can access the OLS for a stipulated time period without any payment. [...]
[...] For continuous attributes the test condition can be expressed either as a single comparison test, yielding a binary split, or as a range query like vi [...]
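A minimal sketch of the two test-condition forms, with hypothetical threshold and cut-point values:

```python
# Hypothetical sketch of test conditions on a continuous attribute A.

def binary_test(a, threshold=25.0):
    """Single comparison test A < v: produces a two-way (binary) split."""
    return "left" if a < threshold else "right"

def range_test(a, cut_points=(10.0, 25.0, 40.0)):
    """Range query v_i <= A < v_{i+1}: produces a multi-way split, one
    branch per interval between consecutive cut points."""
    for i, v in enumerate(cut_points):
        if a < v:
            return i          # branch index of the interval ending at v
    return len(cut_points)    # last, open-ended interval

print(binary_test(30.0))  # right
print(range_test(30.0))   # 2  (falls in the interval [25.0, 40.0))
```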
[...] The present approach is used to construct a multi-split fuzzy decision tree based classifier to predict the possible monthly/yearly subscribers for the OLS system. The performance of the present approach is found to be satisfactory, as it gives a classification accuracy of [...] on the test data as against [...] given by crisp decision trees. The crisp decision tree constructed by Weka is shown in figure 4. It may be observed that the attribute “activity” appears repeatedly along the paths, and it can better be implemented as a multi-split attribute. [...]
[...] Given the crisp decision tree and the training data set, the overview of the dynamic fuzzification process is as in the figure.

CLASSIFICATION OF UNKNOWN EXAMPLE

An unknown example to be classified traverses multiple paths of the fuzzy decision tree with varying degrees of membership along the branches. A preselected fuzzy inference technique is then used to combine membership grades down the paths within the tree using t-norms, and t-conorms [4] are then used to aggregate the strength in support of a class label over all the paths that end in leaf nodes with that class label. [...]
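The path-combination step above can be sketched with the most common choice of operators, min as the t-norm and max as the t-conorm; the paths, grades, and class labels below are hypothetical:

```python
# Hypothetical sketch: min (t-norm) conjoins membership grades along each
# root-to-leaf path; max (t-conorm) aggregates the strengths of all paths
# whose leaves carry the same class label.

def classify(paths):
    """paths: list of (leaf class label, [branch membership grades])."""
    support = {}
    for label, grades in paths:
        path_strength = min(grades)  # t-norm down one path
        # t-conorm over all paths ending in this class label
        support[label] = max(support.get(label, 0.0), path_strength)
    # predicted label is the one with the greatest aggregated support
    return max(support, key=support.get), support

paths = [
    ("subscriber", [0.8, 0.6]),      # path 1 ends in a 'subscriber' leaf
    ("subscriber", [0.4, 0.9]),      # path 2 ends in a 'subscriber' leaf
    ("non-subscriber", [0.5, 0.5]),  # path 3 ends in a 'non-subscriber' leaf
]
label, support = classify(paths)
print(label, support)  # subscriber {'subscriber': 0.6, 'non-subscriber': 0.5}
```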
[...] Pre-fuzzification involves two design decisions: the number of fuzzy partitions used to represent the continuous-valued attributes while building a fuzzy decision tree, and the size and shape of the membership function for each fuzzy partition. The fuzzy tree methodologies proposed by Sisson and Chong [13] and Umano et al. [12] require the data to have been pre-fuzzified before the fuzzy decision trees are induced. Umano et al. took the help of human experts to fuzzify the attributes before construction of the fuzzy decision tree; they relied on expert intuition to estimate the number of fuzzy regions and the shape and size of the membership function required for each attribute. [...]
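To make the two pre-fuzzification decisions concrete, here is a hypothetical sketch that builds k evenly spaced triangular partitions over an attribute's range; the range [0, 100] and k = 5 are illustrative assumptions, and choosing k poorly is exactly the drawback the text describes.

```python
# Hypothetical sketch of pre-fuzzification: k evenly spaced triangular
# membership functions covering an attribute's observed range.

def triangular_partitions(lo, hi, k):
    """Return the k partition centers and their membership functions."""
    step = (hi - lo) / (k - 1)
    centers = [lo + i * step for i in range(k)]

    def make(c):
        def mu(x):
            # Triangular membership: 1 at the center, falling to 0 at the
            # neighboring centers on either side.
            return max(0.0, 1.0 - abs(x - c) / step)
        return mu

    return centers, [make(c) for c in centers]

centers, mus = triangular_partitions(0.0, 100.0, 5)
print(centers)                 # [0.0, 25.0, 50.0, 75.0, 100.0]
print(round(mus[1](30.0), 2))  # 0.8  (membership in the partition centered at 25)
```

Fixing k and the shapes up front, whether uniformly as here or by expert intuition, is what distinguishes these pre-fuzzification approaches from methods that derive the partitions during tree construction.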