Raphael Tang
Comcast Applied AI
Verified email at comcast.com
Title | Cited by | Year
Docbert: Bert for document classification
A Adhikari, A Ram, R Tang, J Lin
arXiv preprint arXiv:1904.08398, 2019
242 · 2019
Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
R Tang*, Y Lu*, L Liu*, L Mou, O Vechtomova, J Lin
arXiv preprint arXiv:1903.12136, 2019
241 · 2019
Deep Residual Learning for Small-Footprint Keyword Spotting
R Tang, J Lin
2018 IEEE International Conference on Acoustics, Speech and Signal …, 2018
159 · 2018
DeeBERT: Dynamic early exiting for accelerating BERT inference
J Xin, R Tang, J Lee, Y Yu, J Lin
arXiv preprint arXiv:2004.12993, 2020
107 · 2020
Rethinking Complex Neural Network Architectures for Document Classification
A Adhikari*, A Ram*, R Tang, J Lin
Proceedings of the 2019 Conference of the North American Chapter of the …, 2019
74 · 2019
Rapidly Bootstrapping a Question Answering Dataset for COVID-19
R Tang, R Nogueira, E Zhang, N Gupta, P Cam, K Cho, J Lin
arXiv preprint arXiv:2004.11339, 2020
51 · 2020
Covidex: Neural ranking models and keyword search infrastructure for the covid-19 open research dataset
E Zhang, N Gupta, R Tang, X Han, R Pradeep, K Lu, Y Zhang, R Nogueira, ...
arXiv preprint arXiv:2007.07846, 2020
49 · 2020
An Experimental Analysis of the Power Consumption of Convolutional Neural Networks for Keyword Spotting
R Tang, W Wang, Z Tu, J Lin
2018 IEEE International Conference on Acoustics, Speech and Signal …, 2018
38 · 2018
Honk: A PyTorch Reimplementation of Convolutional Neural Networks for Keyword Spotting
R Tang, J Lin
arXiv preprint arXiv:1710.06554, 2017
35 · 2017
What would elsa do? freezing layers during transformer fine-tuning
J Lee, R Tang, J Lin
arXiv preprint arXiv:1911.03090, 2019
31* · 2019
Natural Language Generation for Effective Knowledge Distillation
R Tang, Y Lu, J Lin
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource …, 2019
23 · 2019
BERxiT: Early Exiting for BERT with Better fine-tuning and extension to regression
J Xin, R Tang, Y Yu, J Lin
Proceedings of the 16th conference of the European chapter of the …, 2021
19 · 2021
Exploring the limits of simple learners in knowledge distillation for document classification with DocBERT
A Adhikari, A Ram, R Tang, WL Hamilton, J Lin
Proceedings of the 5th Workshop on Representation Learning for NLP, 72-77, 2020
16 · 2020
Flops as a direct optimization objective for learning sparse neural networks
R Tang, A Adhikari, J Lin
arXiv preprint arXiv:1811.03060, 2018
14 · 2018
Incorporating Contextual and Syntactic Structures Improves Semantic Similarity Modeling
L Liu, W Yang, J Rao, R Tang, J Lin
Proceedings of the 2019 Conference on Empirical Methods in Natural Language …, 2019
9 · 2019
DocBERT: BERT for document classification
A Adhikari, A Ram, R Tang, J Lin
arXiv preprint arXiv:1904.08398, 2019
9 · 2019
Howl: A Deployed, Open-Source Wake Word Detection System
R Tang*, J Lee*, A Razi, J Cambre, I Bicking, J Kaye, J Lin
arXiv preprint arXiv:2008.09606, 2020
7 · 2020
Yelling at Your TV: An Analysis of Speech Recognition Errors and Subsequent User Behavior on Entertainment Systems
R Tang, F Ture, J Lin
Proceedings of the 42nd Annual International ACM SIGIR Conference on …, 2019
7 · 2019
The art of abstention: Selective prediction and error regularization for natural language processing
J Xin, R Tang, Y Yu, J Lin
Proceedings of the 59th Annual Meeting of the Association for Computational …, 2021
6 · 2021
Adaptive pruning of neural language models for mobile devices
R Tang, J Lin
arXiv preprint arXiv:1809.10282, 2018
5 · 2018
Articles 1–20