GPT-5 Architecture Leak Reveals Mixture-of-Experts with 1.8 Trillion Parameters
Leaked documents suggest GPT-5 employs a sparse Mixture-of-Experts architecture with 1.8 trillion total parameters, activating only 200B per forward pass. OpenAI has neither confirmed nor denied the reports.
Documents circulating in AI research communities suggest that OpenAI's upcoming GPT-5 model employs a sophisticated sparse Mixture-of-Experts (MoE) architecture. According to these materials, the model contains approximately 1.8 trillion total parameters but activates only around 200 billion during any single forward pass.
This architectural choice follows the trend established by models like Mixtral and Google's Gemini, where computational efficiency is achieved by routing each token to a small subset of specialized expert networks. Because only the selected experts run, inference cost scales with the roughly 200 billion active parameters rather than the full 1.8 trillion, keeping serving costs manageable despite the enormous total parameter count.
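To make the routing idea concrete, here is a minimal, self-contained sketch of top-k expert routing. All specifics (8 experts, top-2 routing, toy dimensions, random weights) are illustrative assumptions for the technique in general, not details of GPT-5's actual design:

```python
# Toy sketch of sparse Mixture-of-Experts routing.
# Expert count, top-k value, and sizes are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # hypothetical number of expert networks
TOP_K = 2         # experts activated per token
D_MODEL = 16      # toy hidden dimension

# Each "expert" here is just a linear map; real experts are feed-forward blocks.
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
# The router scores every expert for every token.
router_w = rng.normal(size=(D_MODEL, NUM_EXPERTS))

def moe_forward(x):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router_w                             # (tokens, experts)
    top_k = np.argsort(logits, axis=-1)[:, -TOP_K:]   # best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        # Softmax over only the selected experts' logits.
        sel = logits[t, top_k[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()
        # Only TOP_K of NUM_EXPERTS experts actually compute anything.
        for w, e in zip(weights, top_k[t]):
            out[t] += w * (x[t] @ experts[e])
    return out, top_k

tokens = rng.normal(size=(4, D_MODEL))
y, routed = moe_forward(tokens)
print(y.shape)       # (4, 16): one output vector per token
print(routed.shape)  # (4, 2): only 2 of the 8 experts run per token
```

The key property, and the reason sparse MoE keeps inference affordable, is that compute per token is proportional to `TOP_K`, not `NUM_EXPERTS`, even though the total parameter count grows with the number of experts.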
The leaked materials also suggest significant improvements in multimodal capabilities, with native support for video understanding and real-time audio processing. The model reportedly achieves near-human performance on several professional certification exams.
OpenAI has maintained silence on the reports, consistent with their practice of not commenting on pre-release information. Industry analysts expect an official announcement within the next quarter.
