75% of What a Neural Network Learns Is Noise. So Is 75% of What You Learned in School.
Quantization asks how much of a neural network you can throw away before it breaks. Education asks the same question about the human mind.

(Source: image by the author)

I was talking about TurboQuant with a tech colleague at an AI company last week: quantization, model compression, how you squeeze a 70-billion-parameter model into something that runs on hardware a fraction of the size. Standard conversation for people who build these models.

A sales colleague overheard us and walked over. “Why does that matter? Isn’t data just data? It’s not like there’s a storage problem.”

She was not being dismissive; she was genuinely asking. And the question stopped me, because she was right about something she did not realize she was right about. Data is just data. That part she nailed. The part she missed is th
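To make the "throw away before it breaks" idea concrete, here is a minimal sketch of symmetric per-tensor quantization in NumPy, the textbook version of what compression schemes build on. This is an illustration under simple assumptions (a single Gaussian weight tensor, one scale for the whole tensor, hypothetical `quantize`/`dequantize` helpers), not TurboQuant's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for one layer's float32 weights.
weights = rng.normal(0.0, 0.02, size=4096).astype(np.float32)

def quantize(w, bits=4):
    """Map float weights onto a small signed integer grid (one scale per tensor)."""
    qmax = 2 ** (bits - 1) - 1            # e.g. 7 for 4-bit
    scale = np.abs(w).max() / qmax        # single scale for the whole tensor
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Reconstruct approximate float weights from the integer grid."""
    return q.astype(np.float32) * scale

q, scale = quantize(weights, bits=4)
recon = dequantize(q, scale)

# 4-bit codes take 1/8 the space of float32, yet the weights come back
# close to where they started -- most of the discarded precision was noise.
rel_err = np.linalg.norm(weights - recon) / np.linalg.norm(weights)
print(f"relative reconstruction error at 4 bits: {rel_err:.3f}")
```

The point of the sketch: dropping from 32 bits per weight to 4 discards the large majority of the stored bits, and the relative reconstruction error stays small, which is the intuition the title is playing with.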