*Monthly online ILFC Seminar: interactions between formal and computational
linguistics*
https://gdr-lift.loria.fr/monthy-online-ilfc-seminar/

GdR LIFT is happy to announce the three forthcoming sessions of the ILFC
seminar on the interactions between formal and computational linguistics:

   - 2022/12/14 16:00-17:00 UTC+1: *Guy Emerson* (University of Cambridge;
   15:00-16:00 UTC+0)
   Title: *Learning meaning in a logically structured model: An
   introduction to Functional Distributional Semantics*
   Abstract:
*The aim of distributional semantics is to design computational techniques
   that can automatically learn the meanings of words based on the contexts in
   which they are observed. The mainstream approach is to represent meanings
   as vectors (such as Word2Vec embeddings, or contextualised BERT
   embeddings). However, vectors do not provide a natural way to talk about
   basic concepts in logic and formal semantics, such as truth and reference.
   While there have been many attempts to extend vector space models to
   support such concepts, there does not seem to be a clear solution. In this
   talk, I will instead go back to fundamentals, questioning whether we should
   represent meaning as a vector. I will present the framework of Functional
   Distributional Semantics, which makes a clear distinction between words and
   the entities they refer to. The meaning of a word is represented as a
   binary classifier over entities, identifying whether the word could refer
   to the entity – in formal semantic terms, whether the word is true of the
   entity. The structure of the model provides a natural way to model logical
   inference, semantic composition, and context-dependent meanings, where
   Bayesian inference plays a crucial role. The same kind of model can also be
   applied to different kinds of data, including both grounded data such as
   labelled images (where entities are observed) and also text data (where
   entities are latent). I will discuss results on semantic evaluation
   datasets, indicating that the model can learn information not captured by
   vector space models like Word2Vec and BERT. I will conclude with an outlook
   for future work, including challenges and opportunities of joint learning
   from different data sources.*
   - 2023/01/18 17:00-18:00 UTC+1: *Carolyn Anderson* (Wellesley College;
   11:00-12:00 UTC-5)
   Title: [TBA]
   Abstract: [TBA]
   - 2023/02/15: *Steven T. Piantadosi* (UC Berkeley)
   Title: [TBA]
   Abstract: [TBA]
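
As a rough illustrative sketch of the idea in the first abstract (not the speaker's actual model, whose formalisation uses probabilistic graphical models and Bayesian inference), one can picture the meaning of a word as a binary classifier over entities: given an entity's features, it returns the probability that the word is true of that entity. All feature names and weights below are invented for illustration.

```python
import math

# Illustrative sketch only (NOT the actual Functional Distributional
# Semantics implementation): the meaning of a word is a binary
# classifier over entities. An entity is a feature vector; the
# classifier returns P(word is true of entity).

def sigmoid(x):
    """Squash a real-valued score into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def make_word_classifier(weights, bias):
    """Return a truth-conditional function: entity -> P(word true of entity)."""
    def classifier(entity):
        score = sum(w * f for w, f in zip(weights, entity)) + bias
        return sigmoid(score)
    return classifier

# Hypothetical 2-dimensional entity features: [has_fur, barks]
dog = make_word_classifier(weights=[2.0, 4.0], bias=-3.0)
cat = make_word_classifier(weights=[2.0, -4.0], bias=-1.0)

fido = [1.0, 1.0]   # a furry, barking entity
p_dog = dog(fido)   # high: "dog" is likely true of fido
p_cat = cat(fido)   # low: "cat" is likely false of fido
```

In this toy picture, truth and reference fall out naturally: a word "refers to" an entity to the degree its classifier assigns that entity a high probability, which is the kind of logical structure the abstract argues plain vector representations lack.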


The seminar is held on Zoom. To attend the seminar and get updates, please
subscribe to our mailing list (we now only rarely communicate through other
mailing lists): https://sympa.inria.fr/sympa/subscribe/seminaire_ilfc
_______________________________________________
Corpora mailing list -- [email protected]
https://list.elra.info/mailman3/postorius/lists/corpora.list.elra.info/
To unsubscribe send an email to [email protected]