Cover Learner using Integer Programming Version 3 (CLIP3)

The pseudocode of the CLIP3 machine learning algorithm is:

Given the POSitive and NEGative training sets and the threshold values, do:

Phase I

1. Do for all NEGative examples:
   1) Generate the BINary matrix by comparing matrix POS with neg_i, where i = 1, ..., N and N is the number of NEGative examples
   2) Solve the corresponding integer programming (IP) model
   3) Split the matrix of POSitive examples based on the features indicated by the IP solution and eliminate redundant matrices

Phase II
1. Generate the BINary template matrix (TM)
2. Solve the TM BINary matrix using the corresponding IP model
3. Backproject the matrices selected in step 2
4. Convert each resulting matrix to an IP model and solve
5. Generate rules based on IP solutions
Phase III
1. Find the best rule
2. Eliminate positive examples covered by the best rule and go to Phase I
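The core operation in Phase I, step 1 can be sketched as follows. This is a minimal illustration, not the CLIP3 implementation: the BINary matrix marks, for each positive example, which attributes differ from the current negative example, and the IP model then asks for a minimal set of attributes that distinguishes every positive example from that negative. The sketch below substitutes a greedy set-cover approximation for the exact IP solver; the function names are illustrative.

```python
# Sketch of Phase I, step 1: build the BINary matrix for one negative
# example and select distinguishing attributes. CLIP3 solves this as an
# exact integer program; the greedy set cover here is an illustrative
# stand-in, not the algorithm's actual solver.

def binary_matrix(pos, neg):
    """BIN[j][k] = 1 iff positive example j differs from neg on attribute k."""
    return [[1 if row[k] != neg[k] else 0 for k in range(len(neg))]
            for row in pos]

def greedy_cover(bin_rows):
    """Greedily pick attributes so every row has a selected 1 (set cover)."""
    uncovered = set(range(len(bin_rows)))
    chosen = []
    n_attrs = len(bin_rows[0])
    while uncovered:
        # attribute that distinguishes the most still-uncovered positives
        best = max(range(n_attrs),
                   key=lambda k: sum(bin_rows[j][k] for j in uncovered))
        chosen.append(best)
        uncovered -= {j for j in uncovered if bin_rows[j][best]}
    return chosen

pos = [[1, 0, 1], [0, 1, 1], [1, 1, 0]]   # three positive examples
neg = [0, 0, 0]                            # one negative example
bm = binary_matrix(pos, neg)
print(bm)                # [[1, 0, 1], [0, 1, 1], [1, 1, 0]]
print(greedy_cover(bm))  # [0, 1]
```

Attributes 0 and 1 together differ from the negative example on every positive row, so a rule built over those attributes separates all positives from this negative.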
CLIP3 incorporates three thresholds:
• Noise Threshold (NT). This determines which branches (possibly containing noisy positive examples) corresponding to the IP solutions in Phase I will be removed. Thus, if NT is 0%, none of the branches are eliminated.
• Best Rule Threshold (BRT). From all the rules generated in Phase II, only those covering more than BRT percent of the (remaining) positive examples are chosen as best rules.
• Stop Threshold (ST). This stops the algorithm when less than ST percent of the positive examples remain uncovered. Either the BRT or the ST threshold can stop the algorithm.
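The three threshold checks can be sketched as simple percentage tests. This is a hedged illustration only: the source does not specify the exact boundary conditions or bookkeeping, so the function names and the strict/non-strict comparisons below are assumptions, not CLIP3's actual code.

```python
# Illustrative threshold checks for NT, BRT, and ST. The comparison
# directions follow the textual description; exact boundary behavior
# in CLIP3 is an assumption here.

def keep_branch(branch_size, total_pos, nt_pct):
    """NT: keep a branch only if it covers more than nt_pct% of positives
    (so NT = 0% removes no branch that covers at least one positive)."""
    return branch_size / total_pos * 100 > nt_pct

def is_best_rule(covered, remaining_pos, brt_pct):
    """BRT: a rule qualifies if it covers more than brt_pct% of the
    remaining positive examples."""
    return covered / remaining_pos * 100 > brt_pct

def should_stop(uncovered, total_pos, st_pct):
    """ST: stop once less than st_pct% of positives remain uncovered."""
    return uncovered / total_pos * 100 < st_pct

print(keep_branch(1, 100, 0))    # True: NT = 0% eliminates nothing
print(is_best_rule(30, 50, 50))  # True: covers 60% > 50% of remaining
print(should_stop(2, 100, 5))    # True: only 2% < 5% still uncovered
```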
Following is an example of the CLIP3 algorithm: [figure not included]

For more information about CLIP3, see the following resources:

Cios K.J. and Liu N. 1995. An algorithm which learns multiple covers via integer linear programming, Part I - the CLILP2 algorithm. Kybernetes (The Norbert Wiener 1997 Outstanding Paper Award), 24(2):29-50

Cios K.J. and Liu N. 1995. An algorithm which learns multiple covers via integer linear programming, Part II - experimental results and conclusions. Kybernetes, 24(3):28-40

Cios K.J., Wedding D.K. and Liu N. 1997. CLIP3: cover learning using integer programming. Kybernetes (invited paper), 26(4-5):513-536

Cios K.J., Pedrycz W. and Swiniarski R. 1998. Data Mining Methods for Knowledge Discovery. Kluwer, ISBN 0-7923-8252-8


Cios K.J. and Kurgan L. 2001. Hybrid Inductive Machine Learning: An Overview of CLIP Algorithms. In: Advances of Machine Learning, Kacprzyk J. (ed.), Springer, in print