After a six-month unplanned hiatus, Deep Hunt is back — thanks for sticking around! This week's top stories: Google opens up TPUs to everybody; missing data hinder replication of AI studies; an AAAS AMA with Peter Norvig, Yann LeCun and Eric Horvitz; and deep contextualised word representations.
The internet giant developed Tensor Processing Units (TPUs) for its own data centers. Now other companies can use them through its cloud-computing service.
Microsoft Research is offering grants of up to US $25,000 to help a select group of doctoral students cross the finish line and enter the workforce. So, apply before March 30th if you are eligible!
New artificial intelligence systems are using “adversarial networks” to develop creativity and originality by more fluidly mixing and matching real-world information.
Unpublished code and sensitive training conditions aggravate reproducibility crisis in computer science
Tutorials, Tools and Tips
An AMA with Yann LeCun, Peter Norvig and Eric Horvitz! Without further ado, read up on what these leading researchers have to say about the industry, ongoing changes, and the future of the field.
Alex Irpan explains why deep RL often doesn’t work, the cases where it does, and how it could be made to work more reliably in the future.
This research by AllenAI introduces a new type of deep contextualized word representation that models both complex characteristics of word use and how those uses vary across linguistic contexts. The ELMo representations, derived from pre-trained language models, set a new state of the art on six diverse NLP tasks: SQuAD, SNLI, SRL, Coref, NER and Sentiment.
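The core mechanism in ELMo is simple: each token's representation is a task-specific, softmax-weighted sum of the biLM's layer activations, scaled by a learned factor gamma. Here is a minimal toy sketch of that mixing step in NumPy — the layer activations are random stand-ins rather than real biLM outputs, and the function name `elmo_combine` is illustrative, not part of any library:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D weight vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def elmo_combine(layer_activations, s, gamma=1.0):
    """Mix biLM layers into one contextual representation.

    layer_activations: (L, T, D) array of L layer outputs for T tokens.
    s: (L,) task-specific scalar weights (softmax-normalised below).
    gamma: task-specific scale factor.
    Returns a (T, D) array: one contextual vector per token.
    """
    w = softmax(s)                               # normalise layer weights
    return gamma * np.tensordot(w, layer_activations, axes=1)

rng = np.random.default_rng(0)
L, T, D = 3, 5, 8                                # toy sizes: layers, tokens, dims
acts = rng.normal(size=(L, T, D))                # stand-in for biLM activations
s = np.zeros(L)                                  # learnable in practice; uniform here
rep = elmo_combine(acts, s, gamma=0.5)
print(rep.shape)  # (5, 8)
```

With uniform weights (all-zero `s`), the mix reduces to gamma times the layer-wise mean; during training on a downstream task, `s` and `gamma` are learned jointly with the task model.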
If you like what you are reading, please follow and recommend it to your friends, or give a shoutout on Twitter! I’d love to hear your suggestions and recommendations @deephunt_in or in the comments below!
Source: Deep Learning on Medium