September: I gave a keynote talk at AMLaP on "Neural networks as a framework for modeling human syntactic processing". There's a video recording of the talk on Twitch (!).
April: The preprint of the review I wrote with Marco Baroni, "Syntactic Structure from Deep Learning", for the Annual Review of Linguistics, is now online.
April: Papers accepted to ACL: cross-linguistic syntactic evaluation of language models, data augmentation for robustness to inference heuristics, constituency vs. dependency tree-LSTMs, and an opinion piece on evaluating 'human-like' linguistic generalization.
Raquel Fernández and I are organizing CoNLL 2020, with a renewed focus on "theoretically, cognitively and scientifically motivated approaches to computational linguistics"!
R. Thomas McCoy, Robert Frank & Tal Linzen (2020). Does syntax need to grow on trees? Sources of hierarchical inductive bias in sequence-to-sequence networks. Transactions of the Association for Computational Linguistics. [arXiv]
Tal Linzen, Emmanuel Dupoux & Yoav Goldberg (2016). Assessing the ability of LSTMs to learn syntax-sensitive dependencies. Transactions of the Association for Computational Linguistics 4, 521–535. [link] [pdf] [bib]
Here's a video of a talk I gave in December 2018 at the Allen Institute for Artificial Intelligence.