PhiloBERTA: A Transformer-Based Cross-Lingual Analysis of Greek and Latin Lexicons
Rumi A. Allbert, Makai L. Allbert
2025-03-11
Summary
This paper introduces PhiloBERTA, a transformer-based model that compares ancient Greek and Latin words to quantify how closely their meanings align, especially for hard-to-translate philosophical terms like 'knowledge' or 'justice'.
What's the problem?
It’s hard to track how ancient words changed meaning across languages and time, especially for complex philosophical ideas that don’t translate directly between Greek and Latin.
What's the solution?
PhiloBERTA uses contextual embeddings and angle-based similarity measurements to quantify how closely Greek and Latin words align in ancient texts, showing that etymologically related pairs (like Greek 'epistēmē' and Latin 'scientia') kept similar meanings over time.
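The angle-based comparison mentioned above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes contextual embeddings have already been extracted, and the example vectors and the function name `angular_similarity` are hypothetical.

```python
import numpy as np

def angular_similarity(u, v):
    # Cosine similarity between the two embedding vectors,
    # clipped to [-1, 1] for numerical safety before arccos.
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    cos = np.clip(cos, -1.0, 1.0)
    # Convert the angle between the vectors into a [0, 1] score:
    # 1.0 means identical direction, 0.0 means opposite direction.
    return 1.0 - np.arccos(cos) / np.pi

# Hypothetical embeddings for a Greek/Latin term pair.
episteme = np.array([0.8, 0.1, 0.3])
scientia = np.array([0.7, 0.2, 0.4])
print(angular_similarity(episteme, scientia))
```

Unlike raw cosine similarity, the angular form is a proper distance-derived score, which makes comparisons between term pairs more interpretable.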
Why does it matter?
This helps historians and linguists study how ancient ideas spread between cultures and languages, making it easier to analyze classical philosophical texts and trace the history of individual words.
Abstract
We present PhiloBERTA, a cross-lingual transformer model that measures semantic relationships between ancient Greek and Latin lexicons. Through analysis of selected term pairs from classical texts, we use contextual embeddings and angular similarity metrics to identify precise semantic alignments. Our results show that etymologically related pairs demonstrate significantly higher similarity scores, particularly for abstract philosophical concepts such as epistēmē (scientia) and dikaiosynē (iustitia). Statistical analysis reveals consistent patterns in these relationships (p = 0.012), with etymologically related pairs showing remarkably stable semantic preservation compared to control pairs. These findings establish a quantitative framework for examining how philosophical concepts moved between Greek and Latin traditions, offering new methods for classical philological research.