1. What is the name of the machine learning architecture that can be used to translate text from one language to another?
Encoder-decoder
That's correct!
2. What is the advantage of using the attention mechanism over a traditional recurrent neural network (RNN) encoder-decoder?
The attention mechanism lets the decoder focus on specific parts of the input sequence, which can improve the accuracy of the translation.
That's correct!
3. How does an attention model differ from a traditional model?
Attention models pass a lot more information to the decoder.
That's correct!
4. What are the two main steps of the attention mechanism?
Calculating the attention weights and generating the context vector
That's correct!
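The two steps named above can be sketched in a few lines of NumPy. This is a minimal illustration with toy numbers, assuming a simple dot-product scoring function; it is not any specific library's implementation.

```python
import numpy as np

def attention(query, keys, values):
    # Step 1: calculate the attention weights.
    # Score each encoder state (key) against the decoder state (query),
    # then normalize the scores with a softmax so they sum to 1.
    scores = keys @ query                      # one score per input position
    weights = np.exp(scores - scores.max())    # subtract max for stability
    weights = weights / weights.sum()

    # Step 2: generate the context vector as the weighted sum of the values.
    context = weights @ values
    return weights, context

# Toy example: three encoder states of dimension 4 (illustrative values).
keys = values = np.array([[1.0, 0.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0, 0.0],
                          [0.0, 0.0, 1.0, 0.0]])
query = np.array([10.0, 0.0, 0.0, 0.0])       # closely matches the first state

weights, context = attention(query, keys, values)
print(weights)   # the first input position receives almost all of the weight
```

Because the query aligns with the first encoder state, the softmax assigns nearly all of the weight to that position, and the context vector is dominated by the first value vector. This is the "focusing" behavior the quiz answers describe.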
5. What is the name of the machine learning technique that allows a neural network to focus on specific parts of an input sequence?
Attention mechanism
That's correct!
6. What is the purpose of the attention weights?
To assign weights to different parts of the input sequence, with the most important parts receiving the highest weights.
That's correct!
7. What is the advantage of using the attention mechanism over a traditional sequence-to-sequence model?
The attention mechanism lets the model focus on specific parts of the input sequence.
That's correct!