Multilingual Natural Language Processing @ Sapienza
Home Page and Blog of the Multilingual NLP course @ Sapienza University of Rome
Roberto Navigli

Lecture 5 (14/03/2024): Word embeddings, word2vec
<p>Word representations. Word embeddings. Word2vec (CBOW and skipgram), PyTorch notebook on word2vec.<a href="https://miro.medium.com/v2/resize:fit:720/format:webp/1*5F4TXdFYwqi-BWTToQPIfg.jpeg" style="display: block; padding: 1em 0px; text-align: center;"><img alt="" border="0" data-original-height="258" data-original-width="678" height="153" src="https://miro.medium.com/v2/resize:fit:720/format:webp/1*5F4TXdFYwqi-BWTToQPIfg.jpeg" width="401" /></a></p>
Metodologie di Programmazione

Lecture 4 (08/03/2024, 3 hours): first hands-on with PyTorch with language detection
<p>Recap of the Supervised Learning framework, hands-on practice
with <b>PyTorch</b> on the Language Detection Model: tensors, gradient
tracking, the <b>Dataset</b> and <b>DataLoader</b> classes, the <b>Module</b> class, the backward step, the training loop, and evaluating a model.</p>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://miro.medium.com/max/676/1*d0JWmF36SUey7aS8bvA-dw.jpeg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="425" data-original-width="676" height="251" src="https://miro.medium.com/max/676/1*d0JWmF36SUey7aS8bvA-dw.jpeg" width="400" /></a></div>
<span face=""arial" , "tahoma" , "helvetica" , "freesans" , sans-serif" style="color: black; font-size: 13.2px;"><br /></span>
Lecture 3 (07/03/2024, 2 hours): Supervised vs. unsupervised vs. reinforcement learning. PyTorch
<p>Introduction to Supervised, Unsupervised and Reinforcement
Learning. The Supervised Learning framework. From real to computational:
feature extraction and feature vectors. Feature engineering and
inferred features. PyTorch. Introduction to Colab notebooks and first part of the PyTorch hands-on.<br /></p>
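The step "from real to computational" can be made concrete with a minimal hand-engineered feature extractor. The choice of letter frequencies as features is an illustrative assumption (a classic trick for language detection), not the course's prescribed feature set:

```python
from collections import Counter
import string

def letter_frequency_features(text: str) -> list[float]:
    """Map raw text to a fixed-size feature vector: the relative
    frequency of each lowercase ASCII letter (a hand-engineered
    feature set in the spirit of classic feature engineering)."""
    letters = [c for c in text.lower() if c in string.ascii_lowercase]
    counts = Counter(letters)
    total = len(letters) or 1  # avoid division by zero on empty input
    return [counts[c] / total for c in string.ascii_lowercase]

vec = letter_frequency_features("The quick brown fox")
```
A learned (inferred) representation would replace this fixed mapping with parameters trained end-to-end.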
<a href="https://static.javatpoint.com/tutorial/pytorch/images/pytorch-perceptron2.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="300" data-original-width="500" height="240" src="https://static.javatpoint.com/tutorial/pytorch/images/pytorch-perceptron2.jpg" width="400" /></a>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-83009183391019742612024-03-07T08:13:00.003+01:002024-03-14T07:14:45.396+01:00Lecture 2 (01/03/2024, 3 hours): Machine Learning for NLP and Logistic Regression<p>Basics of Machine Learning for NLP. Probabilistic classification. Logistic Regression and its use
for classification. Explicit vs. implicit features. The cross-entropy
loss function.<br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjo6On_T1LbEQ3qkJ-R488ZskE8ZaLaRgIcWuY4usVi0HD4byZQKKo1Ah7UcRo5c2g8e0EKFk_kBcmKV--Us23KraX9v-zuOJtExRl4T8iQ7eJLD4ZghuNKOe7oyaf0_Vi8_iuphayV9k33OJMJA43rfAbJ03VfeU2w6vmBBvOW2eG3Ozeri3usTUkI" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="1288" data-original-width="2264" height="313" src="https://blogger.googleusercontent.com/img/a/AVvXsEjo6On_T1LbEQ3qkJ-R488ZskE8ZaLaRgIcWuY4usVi0HD4byZQKKo1Ah7UcRo5c2g8e0EKFk_kBcmKV--Us23KraX9v-zuOJtExRl4T8iQ7eJLD4ZghuNKOe7oyaf0_Vi8_iuphayV9k33OJMJA43rfAbJ03VfeU2w6vmBBvOW2eG3Ozeri3usTUkI=w549-h313" width="549" /></a></div>

Lecture 1 (29/2/2024, 2 hours): Introduction
<p>Introduction to the course. Introduction to Natural Language Processing: <b>understanding </b>and <b>generation</b>. What is NLP? The <b>Turing Test</b>, criticisms and alternatives. Tasks in NLP and their importance (with examples). Key areas and publication venues.<br />
</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEicOOvM42qjMLyGZrLMeaUzrMJNL5sTrRV0isrPjQK7FuaBRpAHZ1ahlTn44K36iIbN_ZB8-EjBEYcdt7fA59RIFxShPGduGmUIpsUGkmO6BkNxvsRFbo675xmJ4ezNTQtEdT1gPhlmj74ht7idTpBrgH9EMvEvI1-rtrm565DO3Lx6o0udO3MGNmL8=s320" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="173" data-original-width="320" height="264" src="https://blogger.googleusercontent.com/img/a/AVvXsEicOOvM42qjMLyGZrLMeaUzrMJNL5sTrRV0isrPjQK7FuaBRpAHZ1ahlTn44K36iIbN_ZB8-EjBEYcdt7fA59RIFxShPGduGmUIpsUGkmO6BkNxvsRFbo675xmJ4ezNTQtEdT1gPhlmj74ht7idTpBrgH9EMvEvI1-rtrm565DO3Lx6o0udO3MGNmL8=w489-h264" width="489" /></a></div>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-2440080170417093102024-02-27T07:15:00.005+01:002024-02-29T16:00:05.070+01:00<p><b>Ready! </b>We are starting this <b>Thursday in A2 at 12</b>! Meanwhile, please register here: https://forms.gle/CY38izJpz7Y6wpnH6</p><ul><li><span><b>Thursday</b> (12.00-14.00), room A2</span><span>, DIAG, via Ariosto 25</span></li><li><span><b>Friday </b>(8.30-12.00), room A2, DIAG, via Ariosto 25</span></li></ul><p> </p><p style="text-align: center;"><img alt="What is ChatGPT? | A Chat with ChatGPT on the Method Behind the Bot | DataCamp" aria-hidden="false" class="sFlh5c pT0Scc iPVvYb" src="https://images.datacamp.com/image/upload/v1684833730/a_chat_with_chat_GPT_b0720c02d3.png" style="margin: 0px; max-width: 1344px; width: 400px;" /></p>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-78495506597814894922024-02-06T13:14:00.003+01:002024-02-06T15:32:03.715+01:00Welcome to Multilingual Natural Language Processing!!!<p> </p><p>Welcome to the Sapienza <b>Multilingual NLP </b>course blog 2024! The course is held at <b>DIAG</b>! 
<u><b>Cool things</b></u> about to happen:<span><br /></span>
</p><ol style="text-align: left;"><li>The course will contain lots of up-to-date content on <b>deep learning</b>, <b>neural networks</b>, <b>Large Language Model, </b>and an improved hands-on with <b>PyTorch</b>!</li><li>For attending students, there will be <u><b>only TWO homeworks</b></u> <b>(and no additional duty</b>),
one of which will be done with delivery by the end of September and
will replace the project. Non-attending students, instead, will have to
work on three homeworks.</li><li>There will be <b>cool challenges throughout the whole course</b>, including the possibility of writing and publishing papers. You will be updated on the most relevant events in the area, including the Italian/Multimodal LLM national endeavor headed by Prof. Navigli.<br /></li><li>We will include the most recent additions (including from 2024) from the world of NLP!<span> <br /></span></li></ol><span>Class hours are: TBD, DIAG, via Ariosto 25</span><p>
</p><p style="text-align: left;"><img alt="1,100+ Chatgpt Stock Illustrations, Royalty-Free Vector Graphics & Clip Art - iStock" aria-hidden="false" class="sFlh5c pT0Scc iPVvYb" src="https://media.istockphoto.com/id/1505518778/vector/chatbot-ai-chat-robot-speech-bubble-technology-talking-chatting-speech-bubble-conversation.jpg?s=612x612&w=0&k=20&c=y4yI07vRB3rD_p6MlDjqueKrBH_vMkHsvduap6PFJow=" style="height: 415px; margin: 0px; max-width: 612px; width: 612px;" /><br />
</p><p><b>IMPORTANT:</b> <u>The current lecture model is in-person attendance.</u> See the <a href="http://naviglinlp.blogspot.com/p/natural-language-processing-basic.html">updated Syllabus</a>.</p><p><b>IMPORTANT (bis):</b> <u>Note that the course has been renamed to <b>Multilingual Natural Language Processing</b></u> (if you have NLP in your plan and want to attend my course, please contact me at [surname]@diag.uniroma1.it).</p>

Lecture 22 (29/05/2023, 2.5 hours): text summarization, open issues in NLP, topics for thesis and more, closing
<p>Introduction to <b>text summarization </b>and <b>evaluation metrics </b>(BLEU, ROUGE, BERTScore, alternatives). <b>Open issues in NLP</b>: <b>superhuman performance </b>in current benchmarks, <b>stochastic parrots</b>, evaluation of text quality. <b>Thesis topics </b>and more. Closing.</p><p style="text-align: center;"> <img height="227px;" id="docs-internal-guid-8c2cd76b-7fff-731f-472a-72979dfa8281" src="https://lh4.googleusercontent.com/Z3acUB5SuJE5N6xOy2suHwHQJW_vBfZilmdFk2L8Px7MKU-mRT8-ZB_ND7upwSzvF5xdNWjTZqtC4em6KR35HFCSyuuPzyGr1v8g_kAoBiN250raj7UrZ3LHQoAR6zDwJdpIGJ_5j8aTmuSFgsPdT9VUxg=s2048" width="267px;" /></p>

Lecture 21 (26/05/2023, 4.5 hours): seq2seq, Machine Translation
<p>Foundations of sequence-to-sequence models and their use within Hugging Face.</p><p>Introduction to machine translation (MT) and history of MT. Overview of statistical MT. <b>Beam search </b>for decoding. Introduction to <b>neural machine translation: the encoder-decoder</b>
neural architecture. The BLEU
evaluation score. Performance and recent improvements. <b>Neural MT</b>: the <b>encoder-decoder</b> architecture; <b>attention</b> in NMT.<br />
</p><div style="text-align: center;">
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-Q-Lxmm72vLY/WSkuovFADEI/AAAAAAAACDQ/rTF_t43lRA4BOatP7LBt6q4ehC-H6eQxACLcB/s1600/s2s.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="309" data-original-width="350" height="282" src="https://4.bp.blogspot.com/-Q-Lxmm72vLY/WSkuovFADEI/AAAAAAAACDQ/rTF_t43lRA4BOatP7LBt6q4ehC-H6eQxACLcB/s320/s2s.jpg" width="320" /></a></div>
</div>

Lecture 20 (22/05/2023, 2.5 hours): More on semantic role labeling; Semantic Parsing
<p>More on Semantic Role Labeling. <b>Semantic Parsing: task, motivation </b>and<b> applications</b>, Abstract Meaning Representation (<b>AMR</b>) and BabelNet Meaning Representation (<b>BMR</b>), Natural Language Generation from semantic parses.</p><div style="text-align: center;"><img alt="Immagine" class="css-9pa8cd" draggable="true" height="278" src="https://pbs.twimg.com/media/FPvAgAoXEAkveS_?format=jpg&name=medium" width="532" /></div>

Lecture 19 (19/05/2023, 4 hours): Semantic Role Labeling
<p><b>Semantic roles. Frame resources: PropBank, FrameNet, VerbAtlas. Semantic Role Labeling</b> (<b>SRL</b>). Multilingual SRL. <b>Cross-inventory</b> approaches to SRL. Topics for thesis or excellence path.<br />
</p><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-Sfm__hXljI8/YKvTAQ657LI/AAAAAAAADvw/8d-GK6YcKe0ISeefTUFXT_N4vHkmzxOAgCNcBGAsYHQ/s1183/Cattura.PNG" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="580" data-original-width="1183" height="249" src="https://1.bp.blogspot.com/-Sfm__hXljI8/YKvTAQ657LI/AAAAAAAADvw/8d-GK6YcKe0ISeefTUFXT_N4vHkmzxOAgCNcBGAsYHQ/w508-h249/Cattura.PNG" width="508" /></a></div>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-36315112621901655402023-05-15T15:27:00.002+02:002023-05-15T15:27:33.230+02:00Lecture 18 (15/05/2023, 2.5 hours): Overview of NLP libraries and tools; HW 3 assignmentOverview of NLP libraries: Hugginface Transformers, datasets and eval. FairSeq, Lightning Transformer, Sentence Transformers, Classy. PyTorch Lightning.
Assignment of homework 3: Relation Extraction.

Lecture 17 (12/05/2023, 4.5 hours): More on sense embeddings; Word Sense Disambiguation
<p><b>Word Sense Disambiguation</b> (WSD): introduction to the task. Purely data-driven and neuro-symbolic approaches. WSD cast as sense comprehension. Issues. <b>Semantic Role Labeling</b>: introduction to the task. Inventories. Neural approaches. Issues.<br />
</p>

Lecture 16 (05/05/2023, 4.5 hours): Homework 2 assignment on Word Sense Disambiguation; sense embeddings
<p>Assignment of homework 2: Word Sense Disambiguation. Introduction to Word Sense Disambiguation. First introduction to explicit and latent sense embeddings. SensEmbed.</p>
</p><div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-o8D6L_Ir1cY/T60y9OH3Z_I/AAAAAAAAABY/xiX3geu48bk/s400/code280.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="287" data-original-width="400" height="286" src="https://4.bp.blogspot.com/-o8D6L_Ir1cY/T60y9OH3Z_I/AAAAAAAAABY/xiX3geu48bk/s400/code280.png" width="400" /></a></div>
Lecture 15 (28/04/2023, 4 hours): review on the Transformer; notebook on the Transformer for NLP
Review on the Transformer; practical session on the Transformer with BERT.

Lecture 14 (21/04/2023, 4.5 hours): more on the Transformer, pre-trained language models; introduction to lexical semantics
<p>More on the <b>Transformer </b>architecture. Pre-trained language models: BERT, GPT, RoBERTa, XLM. Introduction to lexical semantics: meaning representations, WordNet, BabelNet. Neurosymbolic NLP.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://www.staynerd.com/wp-content/uploads/0465731939d703d10b45538ff8e0efbb.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="690" data-original-width="800" height="276" src="https://www.staynerd.com/wp-content/uploads/0465731939d703d10b45538ff8e0efbb.png" width="320" /></a></div>

Lecture 13 (17/04/2023, 2 hours, E): notebook on LSTMs
Q&A on Homework 1, brief introduction to Part-of-Speech tagging, LSTMs recap, notebook on Part-of-Speech tagging with LSTMs, data preprocessing and training-procedure best practices.

Lecture 12 (14/04/2023, 4.5 hours): neural language modeling, the attention mechanism, the Transformer
<p>Neural language modeling. Context2vec. Neural language models with BiLSTMs. Contextualized word representations. Introduction to the attention mechanism. Introduction to the Transformer architecture.</p><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-qogMNu2l7X4/YJPyaJ2Hz0I/AAAAAAAADtY/1JSkygu-5e8O2aUg_IysaijneGUdqUVgwCNcBGAsYHQ/s632/spanish-english.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="569" data-original-width="632" src="https://1.bp.blogspot.com/-qogMNu2l7X4/YJPyaJ2Hz0I/AAAAAAAADtY/1JSkygu-5e8O2aUg_IysaijneGUdqUVgwCNcBGAsYHQ/s320/spanish-english.png" width="320" /></a></div>

Lecture 11 (03/04/2023, 2 hours): computational lexical semantics
<p>Introduction to <b>lexical semantics</b>. <b>Lexicon</b>, <b>lemmas</b> and <b>word forms</b>. Introduction to the notion of concepts, the triangle of meaning, concepts vs. named entities. Word <b>senses</b>: <b>monosemy</b> vs. <b>polysemy</b>.
Key tasks for Natural Language Understanding: Word Sense Disambiguation (within lexical semantics), Semantic Role Labeling and Semantic Parsing (sentence-level).<img border="0" src="https://2.bp.blogspot.com/-Pv7uF7OXLSc/WQNnP5BngTI/AAAAAAAAB9o/oHXlXtN942cXSeZ5YD3DcJ3A5CIzlgmKwCLcB/s320/context-matters.jpg" width="450" /></p>
Lecture 10 (31/03/2023, 4 hours): more on probabilistic language models, introduction to lexical semantics
More on probabilistic language models, introduction to lexical semantics.

Lecture 9 (27/03/2023, 1 hour): probabilistic language modeling
<p>What is a language model? <b>N-gram models</b> (unigrams, bigrams, trigrams), together with their probability modeling and issues. Chain rule and n-gram estimation.</p>
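The chain rule with the bigram (first-order Markov) approximation, together with maximum-likelihood n-gram estimation, can be sketched in a few lines; the two-sentence corpus below is a toy assumption:

```python
from collections import Counter

# Toy corpus with sentence-boundary markers.
corpus = [["<s>", "the", "cat", "sat", "</s>"],
          ["<s>", "the", "dog", "sat", "</s>"]]

# Maximum-likelihood estimates: P(w | prev) = count(prev, w) / count(prev).
bigram = Counter()
unigram = Counter()
for sent in corpus:
    for prev, w in zip(sent, sent[1:]):
        bigram[(prev, w)] += 1
        unigram[prev] += 1

def p(w, prev):
    return bigram[(prev, w)] / unigram[prev] if unigram[prev] else 0.0

def sentence_prob(sent):
    """Chain rule with the bigram approximation:
    P(w1..wn) ~ product of P(w_i | w_{i-1})."""
    prob = 1.0
    for prev, w in zip(sent, sent[1:]):
        prob *= p(w, prev)
    return prob
```
Unseen bigrams get probability zero here, which is exactly the sparsity issue that smoothing techniques address.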
<br />
<div class="separator" style="clear: both; text-align: center;">
<img border="0" src="https://4.bp.blogspot.com/-4wPlGwhSYp0/WMLmAQxXc2I/AAAAAAAAB0s/jMyE3pp39-QMgmBwZ71SgAETF0EYe100wCLcB/s1600/locandina.jpg" /></div>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-29655476563000571672023-03-24T12:43:00.005+01:002023-03-27T14:19:34.827+02:00Lecture 8 (24/03/2023, 4 hours): definition of LSTM; handbook of a real-world classification problem; homework 1More on LSTMs. Notebook on training, dev, test. Notebook on a <b>real-world NLP problem</b>. Assignment of <b>Homework 1!</b>
<div class="separator" style="clear: both;"><a href="https://miro.medium.com/v2/resize:fit:640/format:webp/1*FPTmBD8GY9ZvkdsUBYXC9Q.png" style="display: block; padding: 1em 0; text-align: center; "><img alt="" border="0" width="400" data-original-height="419" data-original-width="504" src="https://miro.medium.com/v2/resize:fit:640/format:webp/1*FPTmBD8GY9ZvkdsUBYXC9Q.png"/></a></div>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-77489822181306404252023-03-24T12:42:00.006+01:002023-03-27T14:18:22.432+02:00Lecture 7 (20/03/2023, 2 hours): more on word embeddings and RNNsMore on word embeddings. Lookup tables. Cooccurence matrices. GloVe. Stopwords. Static vs. contextualized embeddings. Different inputs and outputs for RNNs.
<div class="separator" style="clear: both;"><a href="https://upload.wikimedia.org/wikipedia/commons/thumb/b/b5/Recurrent_neural_network_unfold.svg/1920px-Recurrent_neural_network_unfold.svg.png" style="display: block; padding: 1em 0; text-align: center; "><img alt="" border="0" width="400" data-original-height="267" data-original-width="800" src="https://upload.wikimedia.org/wikipedia/commons/thumb/b/b5/Recurrent_neural_network_unfold.svg/1920px-Recurrent_neural_network_unfold.svg.png"/></a></div>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-13492711969829263322023-03-20T12:31:00.003+01:002023-03-27T14:17:32.368+02:00Lecture 6 (17/03/2023, 3 hours, E): word2vec, recurrent neural networks (RNNs), Long-Short Term Memory networks (LSMTs)word2vec (CBOW and skipgram), PyTorch notebook on word2vec, recurrent neural networks, optimization for RNNs, Long-Short Term Memory (LSMT) networks.
<div class="separator" style="clear: both;"><a href="https://miro.medium.com/v2/resize:fit:720/format:webp/1*5F4TXdFYwqi-BWTToQPIfg.jpeg" style="display: block; padding: 1em 0; text-align: center; "><img alt="" border="0" width="320" data-original-height="258" data-original-width="678" src="https://miro.medium.com/v2/resize:fit:720/format:webp/1*5F4TXdFYwqi-BWTToQPIfg.jpeg"/></a></div>Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0tag:blogger.com,1999:blog-4955963679674650302.post-10651432956082501122023-03-15T16:26:00.003+01:002023-03-15T16:28:36.761+01:00Lecture 5 (13/03/2023, 2 hours, E): first hands-on with PyTorch with language detection<p>Recap of the Supervised Learning framework, hands on practice
with <b>PyTorch</b> on the Language Detection Model: tensors, gradient
tracking, the <b>Dataset</b> class, the <b>Module</b> class, the backward step, the training loop, and evaluating a model.</p>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://miro.medium.com/max/676/1*d0JWmF36SUey7aS8bvA-dw.jpeg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="425" data-original-width="676" height="251" src="https://miro.medium.com/max/676/1*d0JWmF36SUey7aS8bvA-dw.jpeg" width="400" /></a></div>
<span style="color: black; font-family: "arial" , "tahoma" , "helvetica" , "freesans" , sans-serif; font-size: 13.2px;"><br /></span></div>
Metodologie di Programmazionehttp://www.blogger.com/profile/09686060534570846811noreply@blogger.com0