Nearly a century ago it was recognized that radiation absorption by stellar matter controls the internal temperature profiles within stars. Laboratory opacity measurements, however, have never been performed at stellar interior conditions, introducing uncertainties into stellar models. A particular problem arose when refined photosphere spectral analysis led to reductions of 30–50 per cent in the inferred amounts of carbon, nitrogen and oxygen in the Sun. Standard solar models using the revised element abundances disagree with helioseismic observations that determine the internal solar structure using acoustic oscillations. This disagreement could be resolved if the true mean opacity of the solar interior matter were roughly 15 per cent higher than predicted, because increased opacity compensates for the decreased element abundances. Iron accounts for a quarter of the total opacity at the boundary between the solar radiation and convection zones. Here we report measurements of wavelength-resolved iron opacity at electron temperatures of 1.9–2.3 million kelvin and electron densities of (0.7–4.0) × 10^22 per cubic centimetre, conditions very similar to those in the solar region that most affects the discrepancy: the radiation/convection zone boundary. The measured wavelength-dependent opacity is 30–400 per cent higher than predicted. This represents roughly half the change in the mean opacity needed to resolve the solar discrepancy, even though iron is only one of many elements that contribute to opacity.
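A rough consistency check (a sketch, not part of the reported analysis) shows how these numbers fit together. The ~30 per cent effective increase in iron's Rosseland mean opacity used below is an assumed illustrative value; only the 30–400 per cent wavelength-dependent range is quoted above.

```latex
% Sketch only: f_Fe = 0.25 is iron's stated share of the total opacity
% at the radiation/convection zone boundary; the factor 0.30 is an
% assumed effective increase in iron's mean opacity, not a reported result.
\[
  \frac{\Delta\kappa}{\kappa}
  \;\approx\;
  f_{\mathrm{Fe}}\,\frac{\Delta\kappa_{\mathrm{Fe}}}{\kappa_{\mathrm{Fe}}}
  \;\approx\; 0.25 \times 0.30 \;=\; 0.075,
\]
```

that is, about half of the roughly 15 per cent mean-opacity increase required, consistent with the closing statement above.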