T. Ishida, G. Niu, W. Hu, and M. Sugiyama. Learning from complementary labels. Advances in Neural Information Processing Systems, pp. 5639-5649, 2017. Cited by 89.
T. Ishida, I. Yamane, T. Sakai, G. Niu, and M. Sugiyama. Do We Need Zero Training Loss After Achieving Zero Training Error? International Conference on Machine Learning, 2020. Cited by 53.
T. Ishida, G. Niu, A. K. Menon, and M. Sugiyama. Complementary-label learning for arbitrary losses and models. International Conference on Machine Learning, pp. 2971-2980, 2019. Cited by 47.
T. Ishida, G. Niu, and M. Sugiyama. Binary classification from positive-confidence data. Advances in Neural Information Processing Systems, pp. 5917-5928, 2018. Cited by 42.
Z. Lu, C. Xu, B. Du, T. Ishida, L. Zhang, and M. Sugiyama. LocalDrop: A Hybrid Regularization for Deep Neural Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021. Cited by 7.
H. Ishiguro, T. Ishida, and M. Sugiyama. Learning from Noisy Complementary Labels with Robust Loss Functions. IEICE Transactions on Information and Systems, 105(2), pp. 364-376, 2022.
T. Ishida, I. Yamane, N. Charoenphakdee, G. Niu, and M. Sugiyama. Is the Performance of My Deep Network Too Good to Be True? A Direct Approach to Estimating the Bayes Error in Binary Classification. arXiv preprint arXiv:2202.00395, 2022.