You can download the archive with the complete experiment results below under attachments. The archive contains:
 * The compiled version of !CrossPare used to execute the experiments. It was compiled from the source code of revision [26] in the SVN.
 * One folder for each of the three data sets. Each folder contains the following:
   * the data
   * the experiment configurations
   * the results as .csv files
     * US means that only undersampling is applied, i.e., the file contains the configuration/results for undersampling
     * US-ENTITYKNN means that undersampling and the KNN data selection are applied, i.e., the file contains the configuration/results for the KNN data selection
     * The results contain additional classifiers, namely a Random Forest. These results are similar to the results in the SVN and were omitted from the paper so as not to dilute its focus.
     * The results for US-ENTITYKNN also contain the combination of the KNN model with the two local models. These results are interesting, but out of scope for the paper. Moreover, they carry a similar message to the other results and also do not really beat the global model. They were therefore removed from the paper so as not to dilute its focus.
     * The results contain additional metrics and success measures that we did not discuss in the article, but that are evaluated by !CrossPare anyway:
       * the metrics error, recall, precision, f-score, g-score, MCC, AUC, AUCec (after Rahman et al.: Recalling the “imprecision” of cross-project defect prediction), true positive rate (tpr), true negative rate (tnr), number of true positives (tp), number of false negatives (fn), number of true negatives (tn), number of false positives (fp)
       * succHe is 1 if recall>0.7 and precision>0.5 (used in the article)
       * succZi is 1 if recall>0.7 and precision>0.7
       * succG75 is 1 if g-score>0.75
       * succG60 is 1 if g-score>0.6
 * A batch script for executing the experiments.
   * Be aware that the results are overwritten and that the executions will take quite some time. Moreover, !CrossPare will grab up to all of your CPU cores and use them at 100%.
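The success measures listed above can be computed directly from the confusion-matrix counts in the .csv files. The following is an illustrative sketch, not !CrossPare's actual implementation: the function and key names are made up, and the g-score is assumed here to be the harmonic mean of recall and the true negative rate, which is one common definition in defect prediction; check the !CrossPare source if you need the exact formula used in the results.

```python
from math import sqrt

def confusion_metrics(tp, fn, tn, fp):
    """Standard metrics from a binary confusion matrix (guarding against
    division by zero). Definitions of recall, precision, f-score, tpr, tnr,
    and MCC are the textbook ones; the g-score formula is an assumption."""
    recall = tp / (tp + fn) if tp + fn else 0.0        # = tpr
    precision = tp / (tp + fp) if tp + fp else 0.0
    tnr = tn / (tn + fp) if tn + fp else 0.0
    fscore = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
    # Assumed g-score: harmonic mean of recall and tnr.
    gscore = 2 * recall * tnr / (recall + tnr) if recall + tnr else 0.0
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {"recall": recall, "precision": precision, "tnr": tnr,
            "fscore": fscore, "gscore": gscore, "mcc": mcc}

def success_indicators(recall, precision, gscore):
    """The four binary success measures exactly as defined in the list above."""
    return {
        "succHe": int(recall > 0.7 and precision > 0.5),
        "succZi": int(recall > 0.7 and precision > 0.7),
        "succG75": int(gscore > 0.75),
        "succG60": int(gscore > 0.6),
    }

# Example with hypothetical counts: tp=80, fn=20, tn=90, fp=10.
m = confusion_metrics(80, 20, 90, 10)
s = success_indicators(m["recall"], m["precision"], m["gscore"])
```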