NLP Cypher

This Week’s Content: T5, Google’s New Transformer; Facebook’s RoBERTa Distilled by Hugging Face; Multiprocessing vs. Threading; Fine-Tuning BERT, a Tutorial; Microsoft’s UniLM AI Improves Summarization. T5 | The New SOTA Transformer from Google: a new entrant in the transformer school of hard knocks was unveiled yesterday by Google, called T5. This new transformer achieved new…

Read More

Word Representation in Natural Language Processing (Part Two)

Natural Language Processing

Word Vectors and Beyond…🧐 In the first part of our introduction to word representations, we discussed the most basic type of representation: frequency probability. While this approach is still used in NLP projects requiring little semantic inference, mission-critical projects more often depend on word meaning. Natural language, as a complex…
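As a rough illustration of the frequency-based representation the teaser refers to, here is a minimal count-based (bag-of-words) sketch; the function name and vocabulary are illustrative assumptions, not from the article.

```python
from collections import Counter

def frequency_vector(text, vocabulary):
    """Count-based ("frequency") representation: a document becomes a
    vector of raw term counts over a fixed vocabulary. Note that it
    captures no word meaning, the limitation the article points to."""
    counts = Counter(text.lower().split())
    return [counts[word] for word in vocabulary]

vocab = ["natural", "language", "processing", "model"]
doc = "Natural language processing helps a model process language"
print(frequency_vector(doc, vocab))  # [1, 2, 1, 1]
```

Two documents with similar meanings but different words (e.g. "car" vs. "automobile") get unrelated vectors under this scheme, which is why semantically richer representations like word embeddings are needed.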

Read More

Word Representation in Natural Language Processing (Part One)

Machine Learning Part 1

From Probabilistic Models to Word Embeddings 🧐 In the world of Natural Language Processing (NLP), word representation models are key to understanding how models interpret language. So what is a representation model (RM)? Before answering this, we must be aware of an important disclaimer: AI models run on math. The consequence of this prerequisite highlights…

Read More