Attention in transformers
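
The core operation behind attention in transformers is scaled dot-product attention: queries are compared against keys, the similarity scores are normalized with a softmax, and the result weights a sum of the values. As a minimal illustrative sketch (a single head, NumPy only, random toy inputs — not any particular library's implementation):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for a single attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V                            # convex combination of value vectors

# Toy example: 3 query/key/value vectors of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per query
```

One sanity check on the behavior: if every score is equal (e.g. all-zero queries), the softmax weights are uniform and each output row is simply the mean of the value vectors.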