In this session, you learnt the basic ideas used to represent meaning: entities, entity types, arity and reification, and the various types of semantic associations that can exist between entities. You also studied the idea of aboutness: a text is always about something, and there are techniques to infer the topics the text is about.
You also saw that associations between a wide range of entities are stored in a structured way in gigantic knowledge graphs and schemas such as schema.org.
You also learnt supervised and unsupervised techniques for disambiguating the meaning of a word. The 'correct' meaning of an ambiguous word depends on the surrounding context words.
In supervised techniques, such as naive Bayes (or any classifier, for that matter), you take context-sense pairs as the training data: the label is the sense and the input is the set of context words.
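For instance, here is a minimal sketch of this idea in Python, assuming scikit-learn is available; the tiny set of 'bank' contexts and the FINANCE/RIVER sense labels are invented purely for illustration. The context words become bag-of-words features and the sense is the class label.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training data: each context sentence is labelled with the sense
# of the ambiguous word "bank" that it expresses.
train_contexts = [
    "he deposited the cheque at the bank counter",
    "the bank approved the loan application",
    "they had a picnic on the bank of the river",
    "fish were swimming near the muddy bank of the stream",
]
train_senses = ["FINANCE", "FINANCE", "RIVER", "RIVER"]

# Bag-of-words features over the context words, naive Bayes over the senses.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_contexts, train_senses)

# Disambiguate "bank" in new contexts.
print(model.predict(["she withdrew cash from the bank"]))        # ['FINANCE']
print(model.predict(["the boat drifted along the river bank"]))  # ['RIVER']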
In unsupervised techniques, such as the Lesk algorithm, you assign to the ambiguous word the sense whose dictionary definition overlaps maximally with the surrounding words.
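A minimal sketch of the simplified Lesk idea is shown below; the two-sense dictionary for 'bank' is invented for illustration, and a real implementation would typically use WordNet glosses and remove stop words before counting the overlap.

# Simplified Lesk: choose the sense whose definition shares the most words
# with the words surrounding the ambiguous word.
def simplified_lesk(context, sense_definitions):
    context_words = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, definition in sense_definitions.items():
        overlap = len(context_words & set(definition.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

# Invented sense inventory for the ambiguous word "bank".
senses_of_bank = {
    "FINANCE": "a financial institution that accepts deposits and gives loans",
    "RIVER": "the sloping land beside a body of water such as a river",
}

print(simplified_lesk("I need a loan from a bank that accepts deposits", senses_of_bank))        # FINANCE
print(simplified_lesk("we sat on the bank of the river and watched the water", senses_of_bank))  # RIVER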