Discuss, Learn and be Happy: Question Discussion

Explain the intuition behind distributional methods: why do we believe that solving the task of predicting a word given its context yields embeddings that capture lexical semantics?
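
A minimal sketch of the distributional intuition (my addition, not from the course material; the toy corpus and all names are illustrative): words that occur in similar contexts end up with similar context-count vectors, which is the signal that prediction-based embeddings also exploit.

```python
from collections import Counter
import math

# Toy corpus; assumption: whitespace-tokenized, lowercase sentences.
corpus = [
    "the cat drinks milk",
    "the dog drinks water",
    "the cat chases the dog",
]

window = 1  # context = one word on each side

# Count the context words observed around each target word.
contexts = {}
for sent in corpus:
    tokens = sent.split()
    for i, w in enumerate(tokens):
        ctx = contexts.setdefault(w, Counter())
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                ctx[tokens[j]] += 1

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# "cat" and "dog" share contexts ("the", "drinks"), so their similarity is high.
print(cosine(contexts["cat"], contexts["dog"]))   # high
print(cosine(contexts["cat"], contexts["milk"]))  # lower
```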

Describe the Continuous Bag of Words (CBoW) model for learning word embeddings. List two ways in which the task modeled by CBoW differs from learning an n-gram language model.
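
A rough sketch of the CBoW prediction step (my addition; the tiny vocabulary, dimensions, and names are made up): the context embeddings are averaged, so word order inside the window is ignored, and the centre word is predicted with a softmax over the whole vocabulary, unlike an n-gram language model which conditions on an ordered left context only.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "on", "mat"]
V, d = len(vocab), 8                     # vocabulary size, embedding dimension (illustrative)
word2id = {w: i for i, w in enumerate(vocab)}

E = rng.normal(size=(V, d))              # input (context) embedding matrix
W = rng.normal(size=(d, V))              # output projection to vocabulary scores

def cbow_probs(context_words):
    # Average the context embeddings: order inside the window does not matter.
    h = E[[word2id[w] for w in context_words]].mean(axis=0)
    scores = h @ W
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()               # softmax distribution over the centre word

# Predict the centre word of "the cat ___ on the" from a symmetric window.
p = cbow_probs(["the", "cat", "on", "the"])
print(vocab[int(p.argmax())], p.max())
```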

Describe three ways in which pre-trained Transformer models differ from neural language models.

Given the universal POS tagset (ADJ, ADP, ADV, CCONJ, SCONJ, NOUN, PROPN, PRON, DET, NUM, VERB, AUX, PART, PUNCT, SYM, INTJ, X), what would be the perplexity of the POS tagging task under a uniform distribution over the tagset?
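
As a worked example (my addition, counting the 17 tags listed above), the perplexity of a uniform distribution equals the number of outcomes:

```latex
% Perplexity of a uniform distribution over N = 17 universal POS tags
\begin{aligned}
H(p) &= -\sum_{i=1}^{N} \frac{1}{N}\log_2 \frac{1}{N} = \log_2 N \\
\mathrm{PP}(p) &= 2^{H(p)} = 2^{\log_2 17} = 17
\end{aligned}
```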

List three types of knowledge sources that can help in performing part-of-speech tagging.

Give an example word for each of the following classes: ADJ, AUX, PRON, CCONJ, ADV. (My addition: think of an example for each of the classes ADJ, ADP, ADV, CCONJ, SCONJ, NOUN, PROPN, PRON, DET, NUM, VERB, AUX, PART, PUNCT, SYM, INTJ, X.)

Tag the following sentence in the format "John/PROPN eats/VERB an/DET apple/NOUN ./PUNCT": County officials in Maryland miscalculated how many ballots they would need on Election Day -- and quickly ran out in more than a dozen precincts .

Give an example of a word that can be tagged using morphological clues.

Give an example of a sentence in which an ambiguous word can be tagged using syntactic clues.

Consider a distribution over two discrete variables x and y, displayed as a matrix of counts (the original question includes a table of x/y values, not reproduced here). Provide formulas to compute: the joint probability, the marginal probability for x, and the conditional probability of x given y.
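
A sketch of the standard definitions the question is after (my addition; since the original table is not reproduced, n_ij is assumed to be the count in cell (i, j) and N the total count):

```latex
% Standard definitions for discrete random variables x, y
\begin{aligned}
\text{Joint:}       \quad & p(x = x_i, y = y_j) = \frac{n_{ij}}{N} \\
\text{Marginal:}    \quad & p(x = x_i) = \sum_{j} p(x = x_i, y = y_j) \\
\text{Conditional:} \quad & p(x = x_i \mid y = y_j) = \frac{p(x = x_i, y = y_j)}{p(y = y_j)}
\end{aligned}
```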
