Q1. If you have trained five different models on the exact same training data, and they all achieve 95% precision, is there any chance that you can combine these models to get better results? If so, how? If not, why?
A1: I can try combining them into a voting ensemble. If the models are very different, this ensemble methods will surely be better!
Q2. What is the difference between hard and soft voting classifiers?
A2: A hard voting classifier just counts the votes of each classifier in the ensemble and picks the class that gets the most votes. A soft voting classifier instead averages the estimated class probabilities across classifiers and picks the class with the highest mean probability. Soft voting often performs better because it gives more weight to highly confident votes, but it only works if every classifier in the ensemble can estimate class probabilities.
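A toy numeric sketch of the difference (the probability values are invented for illustration): two classifiers narrowly favor class 0 while one strongly favors class 1, so the majority vote and the averaged probabilities disagree.

```python
import numpy as np

# Hypothetical class probabilities from three classifiers for classes [0, 1].
probas = np.array([
    [0.51, 0.49],   # narrowly favors class 0
    [0.52, 0.48],   # narrowly favors class 0
    [0.10, 0.90],   # strongly favors class 1
])

# Hard voting: each classifier votes for its most likely class; majority wins.
votes = probas.argmax(axis=1)            # [0, 0, 1]
hard_pick = np.bincount(votes).argmax()  # class 0 wins 2-to-1

# Soft voting: average the probabilities, then take the argmax.
soft_pick = probas.mean(axis=0).argmax()  # mean is [0.377, 0.623] -> class 1

print(hard_pick, soft_pick)  # 0 1
```

The strongly confident third classifier sways the soft vote but counts only once in the hard vote.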
Q3. Is it possible to speed up training of a bagging ensemble by distributing it across multiple servers? What about pasting ensembles, boosting ensembles, random forests, or stacking ensembles?
A3: It is possible to speed up training of a bagging ensemble by distributing it across multiple servers, since each predictor in the ensemble is independent of the others. The same is true of pasting ensembles and random forests, for the same reason. However, each predictor in a boosting ensemble is built on the previous predictor, so training is inherently sequential and cannot be distributed. For stacking ensembles, all the predictors within a given layer are independent of one another and can be trained in parallel, but the predictors in one layer can only be trained after all the predictors in the previous layer have been trained.
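A sketch of why bagging parallelizes, assuming scikit-learn: each tree is trained on its own bootstrap sample, so the trees can be fit concurrently. Here `n_jobs=-1` spreads training over all local CPU cores; distributing across servers would instead require a cluster backend (e.g., via joblib), which this sketch does not show.

```python
# Bagging: independent predictors, so training runs in parallel.
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.30, random_state=42)

bag_clf = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=100,
    bootstrap=True,   # sampling with replacement (bagging);
                      # bootstrap=False would give a pasting ensemble
    n_jobs=-1,        # fit the 100 trees in parallel across CPU cores
    random_state=42,
)
bag_clf.fit(X, y)
print(len(bag_clf.estimators_))  # 100 independently trained trees
```

A boosting ensemble offers no such `n_jobs` speedup for its predictors, since predictor k cannot start until predictor k-1 has finished.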