Understanding humanity's oldest information technology
Deep Time Lab is an independent research program applying statistical and AI-assisted methods to questions about long-duration cultural knowledge — what survives, what's accurate, and why.
For at least 50,000 years before the invention of writing, human societies maintained and transmitted complex knowledge through oral traditions — songs, stories, ceremonies, and kinship systems. How accurate was this transmission? Under what conditions does oral knowledge preserve genuine information versus accumulate cultural noise?
These questions matter because they determine how we interpret the deep human past. If oral traditions are reliable under specific conditions, they represent an untapped empirical resource. If they are not, centuries of ethnographic data must be treated differently.
Deep Time Lab applies quantitative methods — statistical testing, pre-registered predictions, cross-domain meta-analysis — to distinguish signal from noise in traditional knowledge systems.
Every claim is tested statistically. We use pre-registration, Monte Carlo baselines, replication across independent databases, and adversarial robustness checks.
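As an illustration of what a Monte Carlo baseline means in this context, here is a minimal sketch of a permutation-style null test: it asks how often a random pairing of traditional claims to independent records would match as well as the observed pairing. The function and data names are hypothetical, not taken from Deep Time Lab's actual codebase.

```python
import random

def monte_carlo_baseline(observed_matches, claims, records,
                         n_sims=10_000, seed=42):
    """Permutation p-value: fraction of random claim-to-record
    pairings that match at least as well as the observed pairing."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sims):
        shuffled = records[:]
        rng.shuffle(shuffled)
        # Count position-wise matches under a random assignment
        null_matches = sum(c == r for c, r in zip(claims, shuffled))
        if null_matches >= observed_matches:
            exceed += 1
    # +1 correction keeps the estimate strictly above zero
    return (exceed + 1) / (n_sims + 1)

# Toy example: ten claims that each match their paired record exactly
claims = [f"event_{i}" for i in range(10)]
records = list(claims)
p_value = monte_carlo_baseline(10, claims, records)
```

A small p-value here means the observed agreement is very unlikely to arise from chance pairing alone, which is the core logic behind using simulated baselines rather than intuition to judge whether a tradition carries real signal.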
All data, code, and preprints are publicly available before publication. We explicitly invite adversarial re-analysis. Every dataset has a permanent DOI.
Research involving Indigenous cultural knowledge requires cultural consultation before publication. We work with communities, not about them. Some research is paused pending consultation.
Every year, languages die, oral traditions break, and archaeological sites are destroyed by development, erosion, and conflict. The knowledge encoded in these systems took tens of thousands of years to accumulate. Once lost, it cannot be recovered. This is not an abstract concern. It is a civilization-scale loss of information happening in real time.
The oral traditions disappearing right now may contain empirical observations about climate, ecology, navigation, and geology that modern science has not yet independently discovered. We already have evidence that some traditions encode accurate physical information across 7,000+ years. We have no idea how much is vanishing before anyone thinks to check.
Traditional academic timelines — multi-year grant cycles, slow peer review, incremental publication — are not fast enough. Deep Time Lab exists to accelerate the rate of discovery. We use bespoke AI-driven research engines, large-scale computational analysis, and open-access publication to compress what would traditionally take decades of fieldwork and analysis into months. Not because speed is a virtue in itself, but because the information we are trying to study is actively disappearing.
Deep Time Lab combines cross-domain expertise in artificial intelligence, statistical analysis, and product development with a lifelong engagement with archaeology and human prehistory. We are a small, independent operation, but the methods work, the data are open, and the results are replicable.
Explore the Observability Gradient — our most distinctive finding →