# Positional Encoding (PE)

- [A Simple and Effective Positional Encoding for Transformers](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/001.md)
- [DeBERTa: Decoding-enhanced BERT with Disentangled Attention](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/002.md)
- [DecBERT: Enhancing the Language Understanding of BERT with Causal Attention Masks](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/003.md)
- [Encoding word order in complex embeddings](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/004.md)
- [Improve Transformer Models with Better Relative Position Embeddings](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/005.md)
- [KERPLE: Kernelized Relative Positional Embedding for Length Extrapolation](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/006.md)
- [PermuteFormer: Efficient Relative Position Encoding for Long Sequences](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/007.md)
- [Rethinking Positional Encoding in Language Pre-training](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/008.md)
- [Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/009.md)
- [Translational Equivariance in Kernelizable Attention](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/010.md)
- [Transformer Language Models without Positional Encodings Still Learn Positional Information](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/011.md)
- [Stable, Fast and Accurate: Kernelized Attention with Relative Positional Encoding](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/012.md)
- [Randomized Positional Encodings Boost Length Generalization of Transformers](https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe/013.md)


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question, along with relevant excerpts and sources from the documentation.
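
For example, here is a minimal Python sketch of such a query using the `requests` library. The helper name `ask_docs` and the treatment of the response body as plain text are illustrative assumptions, not part of the documented API:

```python
import requests

# Minimal sketch of the ask mechanism described above.
# The endpoint URL and `ask` parameter come from this page; the helper
# name `ask_docs` and the plain-text response handling are assumptions.
def ask_docs(question: str) -> str:
    url = "https://doraemonzzz.gitbook.io/transformer_evolution_paper/pe.md"
    # requests URL-encodes the question into the `ask` query parameter.
    response = requests.get(url, params={"ask": question}, timeout=30)
    response.raise_for_status()
    return response.text

print(ask_docs("Which papers in this section cover relative positional encoding?"))
```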

Use this mechanism when the answer is not explicitly present on the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
