Peyman Asks: What is the positional encoding in the transformer model?
I'm trying to read and understand the paper "Attention Is All You Need", and in it there is a picture:

[figure from the paper]

I don't know what positional encoding is. From watching some YouTube videos, I've learned that it is an embedding that carries both the meaning and the position of a word, and that it has something to do with $\sin(x)$ or $\cos(x)$,
but I couldn't understand what exactly it is or how exactly it does that. So I'm here for some help. Thanks in advance.
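For context, the sinusoidal positional encoding the paper defines is $PE_{(pos, 2i)} = \sin(pos / 10000^{2i/d_{model}})$ and $PE_{(pos, 2i+1)} = \cos(pos / 10000^{2i/d_{model}})$, where $pos$ is the token position and $i$ indexes the embedding dimensions. A minimal NumPy sketch of that formula (not the authors' code) looks like this:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding from "Attention Is All You Need".

    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]       # shape (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]      # shape (1, d_model/2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)   # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)   # odd dimensions get cosine
    return pe

# The resulting matrix is simply added to the word embeddings,
# so each row mixes the word's meaning with a unique position signal.
pe = positional_encoding(seq_len=50, d_model=512)
```

Each position gets a distinct pattern of sines and cosines at different frequencies, which is how a single vector can encode "where in the sentence" a word sits.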