In 2025, OpenAI secured nearly $1 trillion in compute infrastructure commitments, representing more than 20 gigawatts of capacity, roughly the output of 20 nuclear reactors. The scale of these deals underscores OpenAI's strategy of maintaining leadership in AI model training and deployment. Despite heavy capital expenditure and ongoing annual losses, executives view the investments as essential for advancing frontier AI research, supporting next-generation large language models, and meeting the computational demands of artificial general intelligence development. These commitments, they argue, position the company at the forefront of global AI innovation.