NLP News Cypher | 01.12.20

NLP News Cypher 01.12.20

The Truth Hurts… Perhaps you can tell me what’s best. I trained a GPT-2 medium model on 18K+ tweets and connected the model to Twitter’s API. Every 30 minutes it says things – sometimes funny… this is so awesome: #rockets — 21st Century AI Angst (@angst_gpt2) January 12, 2020 In the words of the great rocker,…

Read More

NLP News Cypher | 12.29.19

NLP News Cypher 12.29.19

Over the Hills and Far Away Cue the tumbleweeds. ’Twas a slow week. The holiday slumber has tamed us all. As a result, this week’s Cypher will be on the short end. 😔 First, I want to thank all the readers who have emailed to say how much they enjoy Cypher. We have a great time sharing…

Read More

NLP News Cypher | 12.15.19

NLP News Cypher 12.15.19

Busy Week at NeurIPS 2019! And we’re back! NeurIPS came to a thundering finish, with tons of research discussed throughout the week. Also, several insightful reports were released covering the state of machine learning in private industry, which we’ll cover in depth. As a result, this week’s edition will be slightly longer than…

Read More

NLP News Cypher | 11.17.19

NLP News Cypher 11.17.19

Gearing Up for NeurIPS 2019… While we wind down from the recent EMNLP conference, NeurIPS 2019 is just around the corner, running Dec. 8 through the 14th! For a quick rundown of NeurIPS metadata (authors, topics, etc.), check out this post: NeurIPS 2019 Stats. The Thirty-third Annual Conference on Neural Information Processing Systems…

Read More

NLP News Cypher | 11.10.19

NLP News Cypher 11.10.19

Echoes from EMNLP and GPT-2 Strikes Back Wow, what a week it was. The EMNLP conference gave us many treats to chew on, such as the growing popularity of cross-lingual learning and the continued adoption of knowledge graphs in language models. Because of all this action, this week’s Cypher will be a bit longer than…

Read More

NLP Cypher

NLP Cypher

This Week’s Content – T5, Google’s New Transformer – Facebook’s RoBERTa Distilled by Hugging Face – Multiprocessing vs. Threading – Fine-Tuning BERT, a Tutorial – Microsoft’s UniLM AI Improves Summarization. T5 | The New SOTA Transformer from Google: A new entrant in the transformer school of hard knocks was unveiled yesterday by Google, called T5. This new transformer achieved new…

Read More

Word Representation in Natural Language Processing (Seconda Parte)

Natural Language Processing

Word Vectors and Beyond…🧐 In the first part (prima parte) of our introduction to word representations, we discussed the most basic type of representation: frequency probability. While this approach continues to be used in NLP projects requiring little semantic inference, more often than not, mission-critical projects depend on word meaning. Natural language, as a complex…
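To make the "frequency probability" idea mentioned in the excerpt concrete, here is a minimal sketch of a count-based representation: each document becomes a vector of relative word frequencies over a shared vocabulary. The toy corpus and helper name `frequency_vector` are illustrative assumptions, not from the original post.

```python
from collections import Counter

# Toy corpus (hypothetical example data).
corpus = [
    "natural language is complex",
    "language models interpret language",
]

# Shared vocabulary: every distinct word across the corpus, sorted for a stable order.
vocab = sorted({word for doc in corpus for word in doc.split()})

def frequency_vector(doc):
    """Represent a document as relative word frequencies over `vocab`."""
    counts = Counter(doc.split())
    total = sum(counts.values())
    return [counts[word] / total for word in vocab]

vectors = [frequency_vector(doc) for doc in corpus]
```

Each resulting vector sums to 1.0, and the word "language" gets weight 0.5 in the second document because it appears twice out of four tokens. Representations like this capture how often words occur, but not what they mean, which is the limitation that motivates the word embeddings discussed in this second part.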

Read More

Word Representation in Natural Language Processing (Prima Parte)

Machine Learning Part 1

From Probabilistic Models to Word Embeddings 🧐 In the world of Natural Language Processing (NLP), word representation models are key to understanding how models interpret language. So what is a representation model (RM)? Before answering this, we have to be aware of an important disclaimer: AI models run on math. The consequence of this prerequisite highlights…

Read More