Embedding Representations and Isotropy

• [3] For dynamic (contextualized) word embeddings: whitening also removes some other form of bias
• Contrastive learning is emerging as a method for learning representations while improving isotropy

[1] Mu et al., All-but-the-Top: Simple and Effective Postprocessing for Word Representations, arXiv 2017
[2] Ethayarajh, How Contextual are Contextualized Word Representations? Comparing the Geometry of BERT, ELMo, and GPT-2 Embeddings, arXiv 2019
[3] Sasaki et al., Examining the effect of whitening on static and contextualized word embeddings, Information Processing & Management 2023
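As a minimal sketch of the kind of whitening postprocessing examined in [3] (the function name and the toy data are illustrative, not from the paper): centering an embedding matrix and rescaling its principal components to unit variance makes the empirical covariance (approximately) the identity, i.e. the embedding space becomes isotropic.

```python
import numpy as np

def whiten(X, eps=1e-8):
    """PCA-whiten an embedding matrix X (n_words x dim):
    center, then rescale each principal component to unit variance,
    so the empirical covariance becomes (approximately) the identity."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(Xc)
    eigvals, eigvecs = np.linalg.eigh(cov)
    W = eigvecs / np.sqrt(eigvals + eps)  # dim x dim whitening transform
    return Xc @ W

# Toy anisotropic "embeddings": Gaussian vectors pushed through a random mixing matrix
rng = np.random.default_rng(0)
E = rng.normal(size=(1000, 16)) @ rng.normal(size=(16, 16))
Ew = whiten(E)  # covariance of Ew is close to the 16x16 identity
```

Static and contextualized embeddings would be handled the same way here; the papers above differ in *which* embeddings are whitened and what downstream effect that has, not in the transform itself.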