< Table of Contents >
- tmp
- tmp
- References
  - Papers
    - LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens
  - Others
    - NTK-Aware Scaled RoPE allows LLaMA models to have extended (8k+) context size without any fine-tuning and minimal perplexity degradation