Chinese chip industry leaders admit the country lags five to ten years behind in AI data center chips — AI demand is straining equipment and talent supply
Senior Chinese semiconductor executives said AI-driven demand is creating bottlenecks across equipment, passive components, and workforce capacity, according to a DigiTimes report from the SEMI Industry Innovation Investment Forum at SEMICON China 2026, which ran March 25-27 in Shanghai.
David Wang, CEO of ACM Research, said the AI surge has been propelled by chip advances but argued that future progress depends on semiconductor equipment. Next-generation manufacturing tools haven't yet been developed, Wang said, and will likely define the trajectory of computing performance going forward.
Wei Li, standing vice president of National Silicon Industry Group, pointed to rising demand for memory, data center power management ICs, and optoelectronic technologies, with data transmission and 6G emerging as key focus areas.
Daniel Yuan, EVP of Sino IC Leasing, said multilayer ceramic capacitors are facing shortages as data center construction accelerates. Sino IC Leasing is a state-backed financial leasing company focused exclusively on the integrated circuit industry, and Yuan's comments reflect broader supply pressure across passive components that underpin AI server builds.
Lee Haiming, SVP of Chongqing Xinlian Microelectronics, said AI growth is forcing Chinese foundries to scale up faster while talent retention and equipment utilization remain key constraints. Lee added that China remains competitive in consumer chips but lags five to ten years behind in automotive and data center semiconductors. He cited AI adoption in manufacturing as one path to narrowing that gap.
Chongqing Xinlian is a state-owned specialty foundry backed by China's Big Fund Phase II. The company is building Chongqing's first 12-inch wafer fab in the Xiyong Microelectronics Industrial Park, with an initial capacity target of 20,000 wafers per month focused on automotive-grade chip production.
Panelists also discussed international expansion. Wang said sustained investment and market scale are critical to global competitiveness, with differentiated technologies forming the foundation of both. Li acknowledged that geopolitical constraints persist but said companies can still reach overseas customers by delivering value, with domestic competition increasingly pushing firms toward export markets.
The panelists agreed that AI will continue to drive capital expenditure growth, and that sustained investment and AI-driven manufacturing upgrades are essential to maintaining competitiveness.
Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.
tomshardware.com
https://www.tomshardware.com/tech-industry/semiconductors/chinese-chip-industry-leaders-say-ai-demand-is-straining-equipment-and-talent-supply