This shows you the differences between two versions of the page.
students:phd_2019 [2019/05/10 20:29] blay [Learning variability of Machine Learning Workflows]
students:phd_2019 [2019/05/10 20:31] blay [References]
Line 27:
3- A systematic exploitation of this structure to reduce the number of executions, to drive the workflow compositions, to manage the feedback loop, and to justify choices.
- The number of theoretical experiments needed to study p preprocessing algorithms, n classification algorithms and d data sets is 2^p × n × d. For 10 preprocessing algorithms, 100 classification algorithms and 100 data sets, even if each experiment lasts only one minute, this would take more than 7,000 days of execution time.
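The removed paragraph's estimate can be checked with a short sketch (variable names are mine; the figures of 10 preprocessing algorithms, 100 classifiers, 100 data sets and one minute per run are taken from the text):

```python
# Combinatorial cost of exhaustive workflow evaluation: each of the p
# preprocessing algorithms can be applied or not (2^p subsets), combined
# with n classification algorithms over d data sets.
p, n, d = 10, 100, 100               # values used in the example above
experiments = 2**p * n * d           # 2^p * n * d candidate runs
minutes_per_run = 1                  # assumed duration per experiment
days = experiments * minutes_per_run / (60 * 24)

print(experiments, round(days))      # prints: 10240000 7111
```

At roughly 7,111 days of sequential execution, the "more than 7,000 days" figure in the text holds, which motivates exploiting the structure of the workflow space rather than enumerating it.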
===== References =====
Line 62:
16. Pohl, K., Böckle, G. & van der Linden, F. J. Software Product Line Engineering: Foundations, Principles and Techniques. (Springer-Verlag, 2005).
- 17. Wolpert, D. H. & Macready, W. G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. (1997).
- 18. Bilalli, B., Abelló, A. & Aluja-Banet, T. On the predictive power of meta-features in OpenML. Int. J. Appl. Math. Comput. Sci. 27, (2017).
+ 17. Bilalli, B., Abelló, A. & Aluja-Banet, T. On the predictive power of meta-features in OpenML. Int. J. Appl. Math. Comput. Sci. 27, (2017).