During European prehistory, hilltop enclosures built from polydisperse particle-and-block stone walling were exposed to temperatures sufficient to partially melt the constituent stonework, leaving glassy walls preserved at sites known as ‘vitrified forts’. During vitrification, the granular wall rocks partially melt, sinter viscously and densify, reducing inter-particle porosity. This process depends strongly on the solidus temperature, the particle sizes, the temperature-dependence of the viscosity of the evolving liquid phase, and the distribution and longevity of the heat supply. Examination of the sintering behaviour of 45 European examples reveals that it is the raw building material that governs the vitrification efficiency. As Iron Age forts were commonly constructed from local stone, we conclude that local geology directly influenced the degree to which buildings were vitrified in the Iron Age. Additionally, we find that vitrification is accompanied by bulk strengthening of aggregates of small particle sizes, and partial weakening of larger blocks. We discuss these findings in the context of the debate surrounding the motives of the wall-builders. We conclude that if wall stability by bulk strengthening was the desired effect, then vitrification represents an Iron Age technology that failed to be effective in regions of refractory local geology.
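The controls on sintering named above (particle size and the temperature-dependent viscosity of the liquid phase) can be illustrated with a minimal Frenkel-style viscous sintering timescale, t ≈ ηR/Γ, where η is melt viscosity, R is particle radius and Γ is surface tension. This is a generic scaling sketch, not the study's model; the Vogel–Fulcher–Tammann (VFT) viscosity constants and the surface tension value below are illustrative assumptions only:

```python
# Sketch: Frenkel viscous sintering timescale for granular wall rock.
# All parameter values are illustrative assumptions, not data from the study.
import math


def vft_viscosity(T_kelvin, A=-4.55, B=5000.0, C=600.0):
    """Melt viscosity in Pa·s from a Vogel-Fulcher-Tammann law
    (log10 eta = A + B / (T - C)); constants are hypothetical."""
    return 10.0 ** (A + B / (T_kelvin - C))


def sintering_timescale(radius_m, T_kelvin, surface_tension=0.3):
    """Frenkel scaling t ~ eta * R / Gamma: the characteristic time for
    viscous coalescence of particles of radius R (surface tension in N/m)."""
    return vft_viscosity(T_kelvin) * radius_m / surface_tension


# At a fixed firing temperature, fine aggregate sinters much faster
# than coarse blocks, consistent with size-dependent densification:
t_fine = sintering_timescale(1e-3, 1300.0)    # ~1 mm grains
t_coarse = sintering_timescale(1e-1, 1300.0)  # ~10 cm blocks
print(t_coarse / t_fine)  # ratio scales linearly with particle radius
```

Because the timescale is linear in R and the VFT viscosity rises steeply as temperature falls toward the solidus, small particles in a long-lived fire can densify fully while large blocks of the same rock barely sinter, consistent with the contrast between strengthened fine aggregate and weakened larger blocks described above.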