Qwen3.5 vs Gemma 4: Benchmarks vs real world use?
Just tested Gemma 4 2B locally on an old RTX 2060 with 6 GB VRAM, and I have used Qwen3.5 in all sizes intensively in customer projects before. First impression of Gemma 4 2B: it's better, faster, and uses less memory than Qwen3.5 2B. More agentic, better Mermaid charts, better chat output, better structured output. It seems like either the Qwen3.5 models are benchmaxed (although they really were much better than the competition) or Google is playing it down. Gemma 4 2B "seems" / "feels" more like Qwen3.5 9B to me. submitted by /u/AppealSame4367
Read on Reddit r/LocalLLaMA: https://www.reddit.com/r/LocalLLaMA/comments/1sbec70/qwen35_vs_gemma_4_benchmarks_vs_real_world_use/

Canonical
Thinking here about grammar and writing style in what we do in software. In software development, writing to a standard has always been important, and I would say nearly mandatory for a team with long-term thinking. This applies in different respects: not only the formatting of the source code, but also how we name variables and the various named structures in a codebase, such as automated tests and the data model. In software development today, we should think about how we do discovery of a given problem and how we do construction. In the case of discovery, what inputs exist? Documentation, client meetings (transcripts and sketches), some kind of supporting manual, reports, and other business rules that can be used as a basis. And in the case of delivery, all

Why OpenAI Buying TBPN Matters More Than It Looks
OpenAI’s acquisition of TBPN, the fast-rising tech talk show founded by John Coogan and Jordi Hays, looks odd at first glance. This is the company behind ChatGPT and frontier model research, not a legacy media group trying to add another audience property. But the move is more interesting than a simple brand play. It signals that the next phase of the AI race will not be won on model quality alone. It will also be fought on narrative, trust, distribution, and who gets to frame the future of AI for everyone else. According to Reuters, OpenAI bought TBPN after the show built a loyal Silicon Valley following through interviews with major industry leaders. The founders are joining OpenAI, and the company says the goal is to communicate its plans better and help shape the conversation around th
More in Self-Evolving AI

Google's Gemma 4 AI can run on smartphones, no Internet required
The two largest Gemma 4 models – 26B Mixture of Experts and 31B Dense – require an 80GB Nvidia H100 GPU to run unquantized in bfloat16 format. Google claims these models deliver "frontier intelligence on personal computers" for students, researchers, and developers, providing advanced reasoning capabilities for IDEs, coding assistants, and agentic workflows.

The Mapless Revolution and How Li Auto’s MindVLA and NVIDIA’s AI Infrastructure Just Rendered Traditional Autonomous Driving Obsolete - Torque News


