In this session, you learnt the basic ideas used to represent meaning: entities, entity types, arity, reification and the various types of semantic association that can exist between entities. You also studied the idea of aboutness: text is always about something, and there are techniques to infer the topics a text is about (you'll study those in Session 3).
You also saw that associations between a wide range of entities are stored in a structured way in gigantic knowledge graphs or schemas such as schema.org.
You also learnt two families of techniques that can be used to disambiguate the meaning of a word: supervised and unsupervised. The 'correct' meaning of an ambiguous word depends upon the words in its context.
In supervised techniques, such as Naive Bayes (or any classifier for that matter), you take context-sense pairs as the training data. The label is the 'sense' and the input is the context words.
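As a minimal sketch of this idea, the following toy Naive Bayes classifier disambiguates the word "bank" from its context words. The training examples, sense labels and context words are all invented for illustration; a real system would train on a sense-annotated corpus such as SemCor.

```python
from collections import Counter, defaultdict
import math

# Toy context-sense training data for the ambiguous word "bank".
# Each example pairs the context words (input) with a sense (label).
# These examples are made up for illustration.
train = [
    (["money", "deposit", "loan"], "finance"),
    (["cash", "account", "loan"], "finance"),
    (["river", "water", "fish"], "river"),
    (["water", "shore", "mud"], "river"),
]

# Count sense priors and per-sense word frequencies.
sense_counts = Counter(sense for _, sense in train)
word_counts = defaultdict(Counter)
vocab = set()
for words, sense in train:
    word_counts[sense].update(words)
    vocab.update(words)

def classify(context):
    """Pick the sense maximising log P(sense) + sum log P(word | sense),
    using add-one (Laplace) smoothing for unseen words."""
    best, best_score = None, float("-inf")
    total = sum(sense_counts.values())
    for sense in sense_counts:
        score = math.log(sense_counts[sense] / total)
        denom = sum(word_counts[sense].values()) + len(vocab)
        for w in context:
            score += math.log((word_counts[sense][w] + 1) / denom)
        if score > best_score:
            best, best_score = sense, score
    return best

print(classify(["loan", "deposit"]))   # → finance
print(classify(["river", "shore"]))    # → river
```

Add-one smoothing matters here: without it, a single context word never seen with a sense would drive that sense's probability to zero.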
In unsupervised techniques, such as the Lesk algorithm, you assign to the ambiguous word the sense whose dictionary definition overlaps maximally with the surrounding words.
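The overlap idea can be sketched as a simplified Lesk in a few lines. The sense glosses below are invented stand-ins; in practice you would pull them from a dictionary such as WordNet.

```python
# Toy sense inventory for "bank": each sense maps to a short gloss.
# These glosses are invented for illustration, not taken from a real dictionary.
glosses = {
    "finance": "an institution that accepts deposits and lends money",
    "river": "the sloping land beside a body of water",
}

def simplified_lesk(context_words, glosses):
    """Return the sense whose gloss shares the most words with the context."""
    context = {w.lower() for w in context_words}
    best, best_overlap = None, -1
    for sense, gloss in glosses.items():
        overlap = len(context & set(gloss.lower().split()))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

sentence = "he sat on the land beside the water".split()
print(simplified_lesk(sentence, glosses))   # → river
```

Note that plain word overlap is crude: real implementations usually remove stopwords and may extend glosses with examples and related senses, since short definitions often share few words with the context.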