
TrafficMoE: Heterogeneity-aware Mixture of Experts for Encrypted Traffic Classification

arXiv cs.CR · by Qing He, Xiaowei Fu, Lei Zhang · April 1, 2026



Abstract: Encrypted traffic classification is a critical task for network security. While deep learning has advanced this field, the occlusion of payload semantics by encryption severely challenges standard modeling approaches. Most existing frameworks rely on static, homogeneous pipelines that apply uniform parameter sharing and static fusion strategies across all inputs. This one-size-fits-all design is inherently flawed: by forcing structured headers and randomized payloads through a unified processing pipeline, it inevitably entangles raw protocol signals with stochastic encryption noise, degrading fine-grained discriminative features. In this paper, we propose TrafficMoE, a framework that breaks through the bottleneck of static modeling by establishing a Disentangle-Filter-Aggregate (DFA) paradigm. Specifically, to resolve the structural conflict between components, the architecture disentangles headers and payloads using a dual-branch sparse Mixture-of-Experts (MoE), enabling modality-specific modeling. To mitigate the impact of stochastic noise, an uncertainty-aware filtering mechanism is introduced to quantify reliability and selectively suppress high-variance representations. Finally, to overcome the limitations of static fusion, a routing-guided strategy aggregates cross-modality features dynamically, adaptively weighing contributions based on traffic context. With this DFA paradigm, TrafficMoE maximizes representational efficiency by focusing solely on the most discriminative traffic features. Extensive experiments on six datasets demonstrate that TrafficMoE consistently outperforms state-of-the-art methods, validating the necessity of heterogeneity-aware modeling in encrypted traffic analysis. The source code is publicly available at this https URL.
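The Disentangle-Filter-Aggregate paradigm described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (their code is at the linked URL); all function names, the variance-based shrinkage rule, and the gate-confidence fusion weights below are illustrative assumptions chosen to mirror the three stages: separate sparse-MoE branches for headers and payloads, suppression of high-variance representations, and routing-guided fusion.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts = 8, 4

# Each expert is a simple linear map; a real model would use deeper networks.
experts = [lambda x, W=rng.normal(size=(d, d)): x @ W for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))

def sparse_moe(x, top_k=2):
    """Route input x to its top-k experts and mix their outputs (illustrative)."""
    logits = x @ gate_w                      # gating scores, one per expert
    top = np.argsort(logits)[-top_k:]        # indices of the top-k experts
    w = np.exp(logits[top]); w /= w.sum()    # softmax over selected experts only
    return sum(wi * experts[i](x) for wi, i in zip(w, top)), logits

def filter_by_variance(feat, tau=1.0):
    """Shrink a representation in proportion to its variance (assumed rule)."""
    return feat * (tau / (tau + feat.var()))

# Toy embeddings standing in for structured headers and encrypted payloads.
header, payload = rng.normal(size=d), rng.normal(size=d)

# Disentangle: modality-specific sparse-MoE branches.
h_feat, h_logits = sparse_moe(header)
p_feat, p_logits = sparse_moe(payload)

# Filter: suppress high-variance (unreliable) representations.
h_feat, p_feat = filter_by_variance(h_feat), filter_by_variance(p_feat)

# Aggregate: fusion weights derived from gate confidence (routing-guided).
conf = np.array([h_logits.max(), p_logits.max()])
alpha = np.exp(conf - conf.max()); alpha /= alpha.sum()
fused = alpha[0] * h_feat + alpha[1] * p_feat
print(fused.shape)
```

The key design point the sketch captures is that the fusion weights `alpha` are not fixed hyperparameters: they are recomputed per input from the routing signal, so a flow whose payload branch routes with low confidence contributes less to the fused feature.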

Comments: Project page: this https URL

Subjects: Cryptography and Security (cs.CR); Artificial Intelligence (cs.AI); Multimedia (cs.MM); Networking and Internet Architecture (cs.NI)

Cite as: arXiv:2603.29520 [cs.CR]

(or arXiv:2603.29520v1 [cs.CR] for this version)

https://doi.org/10.48550/arXiv.2603.29520

arXiv-issued DOI via DataCite (pending registration)

Submission history

From: Lei Zhang [view email] [v1] Tue, 31 Mar 2026 10:05:54 UTC (1,338 KB)
