Yet Another Induction Algorithm
Author(s)
An, Jiyuan
Chen, Yi-Ping Phoebe
Griffith University
Year published
2005
Abstract
Inducing general functions from specific training examples is a central problem in machine learning. Sets of if-then rules are among the most expressive and readable representations. To find if-then rules, many induction algorithms, such as ID3, AQ, CN2 and their variants, have been proposed, with sequential covering as their core technique. To avoid testing all possible selectors, entropy gain is used to select the best attribute in ID3, a constraint on the size of the star was introduced in AQ, and beam search was adopted in CN2. These methods speed up induction, but many good selectors are filtered out. In this work, we introduce a new induction algorithm based on the enumeration of all possible selectors. In contrast to previous work, we use pruning power to discard irrelevant selectors while guaranteeing that no good selectors are filtered out. Compared with other techniques, experimental results demonstrate that the rules produced by our induction algorithm have high consistency and simplicity.
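As background to the entropy-gain selection step mentioned in the abstract, the following is a minimal illustrative sketch (in Python) of how an ID3-style learner scores attributes by information gain. It is not the paper's enumeration-and-pruning algorithm; the dataset format and the helper names entropy, entropy_gain and best_attribute are assumptions made for this example.

# Illustrative sketch only: greedy entropy-gain attribute selection as used in
# ID3-style induction. Dataset layout and function names are assumed here.
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels.
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def entropy_gain(examples, attribute, target):
    # Information gain from splitting `examples` (a list of dicts mapping
    # attribute names to values) on `attribute`; `target` is the class attribute.
    base = entropy([e[target] for e in examples])
    remainder = 0.0
    for value in {e[attribute] for e in examples}:
        subset = [e[target] for e in examples if e[attribute] == value]
        remainder += len(subset) / len(examples) * entropy(subset)
    return base - remainder

def best_attribute(examples, attributes, target):
    # Greedy ID3-style choice: the attribute with the highest entropy gain.
    return max(attributes, key=lambda a: entropy_gain(examples, a, target))

# Toy usage: "outlook" separates the classes perfectly, so it is selected.
data = [
    {"outlook": "sunny", "windy": "no",  "play": "no"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "yes"},
]
print(best_attribute(data, ["outlook", "windy"], "play"))  # -> "outlook"

The abstract's point is that such greedy or beam-limited selection can discard useful selectors; the proposed algorithm instead enumerates all selectors and uses pruning power to remove only irrelevant ones.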
Journal Title
Lecture Notes in Computer Science
Volume
3682