NLP GUIDE
Which of the following was the first to use the Transformer architecture?
a. GloVe
b. BERT
c. OpenAI's GPT
d. ULMFit
Answer: c
Which model trains two independent LSTM language models, left-to-right and right-to-left, and shallowly concatenates them?
a. GPT
b. BERT
c. ULMFit
d. ELMo
Answer: d
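The "shallow concatenation" in the answer above can be sketched in a toy way. This is a hypothetical illustration, not ELMo's real implementation: the two stand-in functions mimic independently run left-to-right and right-to-left recurrent states, and the per-token pairing shows that the directions are only joined at the end, never trained jointly.

```python
# Toy sketch (hypothetical): ELMo-style shallow concatenation of two
# independently run directional "language models". Each state here is just
# the tuple of tokens seen so far in that direction -- a stand-in for a
# real LSTM hidden state.
def forward_states(tokens):
    states, seen = [], []
    for t in tokens:
        seen.append(t)
        states.append(tuple(seen))
    return states

def backward_states(tokens):
    # Run the same "model" right-to-left, then re-align to token order.
    return list(reversed(forward_states(list(reversed(tokens)))))

def elmo_concat(tokens):
    # Shallow concatenation: pair the forward and backward states per token;
    # the two directions share no parameters and no joint objective.
    return list(zip(forward_states(tokens), backward_states(tokens)))

reps = elmo_concat(["the", "cat", "sat"])
```

Each token's representation contains a left-context part and a right-context part, but the two were computed independently, which is exactly why ELMo is described as only shallowly bidirectional (in contrast to BERT's jointly conditioned masked LM).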
Which model uses a unidirectional language model to produce word embeddings?
a. BERT
b. GPT
c. ELMo
d. Word2Vec
Answer: b
In which model is a token's input representation the sum of its token, segment, and position embeddings?
a. ELMo
b. GPT
c. BERT
d. ULMFit
Answer: c
Which of the following is the better choice for NLP use cases such as semantic similarity, reading comprehension, and common-sense reasoning?
a. ELMo
b. OpenAI's GPT
c. ULMFit
Answer: b
The same word can have multiple word embeddings with ____________?
a. GloVe
b. Word2Vec
c. ELMo
d. nltk
Answer: c
In this architecture, the relationship between all words in a sentence is modelled
irrespective of their position. Which architecture is this?
a. OpenAI GPT