Data Skeptic

GPT-2 is yet another in a succession of models like ELMo and BERT which adopt a similar deep learning architecture and train an unsupervised model on a massive text corpus.

As we have been covering recently, these approaches are showing tremendous promise, but how close are they to AGI?  Our guest today, Vazgen Davidyants, wondered exactly that and had conversations with a chatbot running GPT-2.  We discuss his experiences as well as some novel thoughts on artificial intelligence.

Direct download: talking-to-gpt2.mp3
Category:general -- posted at: 12:45pm PST

Rajiv Shah attempted to reproduce an earthquake-predicting deep learning model.  His results exposed some issues with the model.  Kyle and Rajiv discuss the original paper and Rajiv's analysis.

Direct download: reproducing-deep-learning-models.mp3
Category:general -- posted at: 6:15pm PST

Allyson Ettinger joins us to discuss her work in computational linguistics, specifically in exploring some of the ways in which the popular natural language processing approach BERT has limitations.

Direct download: what-bert-is-not.mp3
Category:general -- posted at: 2:02pm PST

Omer Levy joins us to discuss "SpanBERT: Improving Pre-training by Representing and Predicting Spans".

https://arxiv.org/abs/1907.10529

Direct download: spanbert.mp3
Category:general -- posted at: 1:27am PST
