1 min read · Mar 24, 2018
You should have tried LightGBM; even though XGBoost works better in most cases, I found LightGBM worked better than any other gradient boosting algorithm for this problem. I ranked in the top 50 until a week before the deadline just by using LightGBM without any hyperparameter tuning. Unfortunately, I couldn't find enough time to optimise the model or try new models to stack with it. In the end, I finished at rank 96 with a logloss of about 0.151. Using Keras gave me a logloss of about 0.185.
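For reference, here is a minimal sketch of the kind of untuned LightGBM baseline I mean. It assumes a binary classification problem with a generic feature matrix `X` and labels `y` (the actual competition data and preprocessing are not shown, and these placeholder arrays are only for illustration):

```python
import lightgbm as lgb
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import log_loss

# Placeholder feature matrix and binary labels standing in for the
# competition data; the real features/preprocessing are not shown here.
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Default hyperparameters -- only the objective is set, no tuning at all.
model = lgb.LGBMClassifier(objective="binary")
model.fit(X_train, y_train)

# Evaluate with logloss, the metric mentioned above.
valid_pred = model.predict_proba(X_valid)[:, 1]
print("validation logloss:", log_loss(y_valid, valid_pred))
```

Even a plain default model like this can be surprisingly competitive; tuning, stacking, or blending with other models would be the natural next steps I ran out of time for.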