Gemma 4 Uncensored (autoresearch results)
Gemma 4 Uncensored — all 4 models, MoE expert abliteration, automated research loop

Released uncensored versions of all four Gemma 4 models. bf16 + GGUF for each.

Collection: https://huggingface.co/collections/TrevorJS/gemma-4-uncensored-69d2885d6e4fc0581f492698
Code: https://github.com/TrevorS/gemma-4-abliteration

Results

Model        Baseline  After  KL Div
E2B (2.3B)   98%       0.4%   0.346
E4B (4.5B)   99%       0.7%   0.068
26B MoE      98%       0.7%   0.090
31B          100%      3.2%   0.124

Refusal rates from 686 prompts across 4 datasets (JailbreakBench, tulu-harmbench, NousResearch, mlabonne). Manually audited — most flagged refusals are actually the model complying with a disclaimer attached.

26B MoE

Standard abliteration only touches dense layers, which gets you from 98% → 29% on the MoE. The remaining refusals are in the expert w
Could not retrieve the full article text.
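The linked repo isn't quoted here, so as a rough illustration only: abliteration is usually described as removing a "refusal direction" from model weights by orthogonalizing them against it. A minimal numpy sketch, assuming the direction has already been extracted (e.g. from activation differences on refusal-triggering vs. benign prompts); the function name and shapes are hypothetical, not taken from the linked code:

```python
import numpy as np

def abliterate(W: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Project the refusal direction r out of weight matrix W.

    Hypothetical shapes: W is (d_out, d_in) and writes to the residual
    stream along its rows; r is (d_out,). After the projection, W's
    output has no component along r, so the layer can no longer
    "write" the refusal direction.
    """
    r = r / np.linalg.norm(r)          # unit refusal direction
    return W - np.outer(r, r @ W)      # subtract W's component along r

# Toy check on random data:
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
r = rng.standard_normal(8)
W_abl = abliterate(W, r)
print(np.allclose((r / np.linalg.norm(r)) @ W_abl, 0.0))  # True
```

Per the post, extending this to the 26B MoE would mean applying the same projection to the expert weight matrices as well, since hitting only the dense layers leaves refusals behind.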
Read on Reddit r/LocalLLaMA →
https://www.reddit.com/r/LocalLLaMA/comments/1sd8c59/gemma_4_uncensored_autoresearch_results/

More about: llama, model release
Building a Node.js document intelligence pipeline for under $10/day
You've got 10,000 support tickets, blog posts, or product reviews to process. You need summaries and keywords for each. What does that actually cost? This post walks through a real Node.js pipeline that processes documents in parallel with rate limiting, error handling, and retry logic — and calculates exactly what you'll pay.

The economics first

Using a pay-per-use API (1 USDC = 1,000 credits):

Operation   Credits  Cost per call  10,000 docs
Summarize   10       $0.01          $100
Keywords    5        $0.005         $50
Both        15       $0.015         $150

No monthly fee. No minimum. Idle months cost $0.

Setting up

npm init -y
npm install node-fetch p-limit

Get a free API key (100 credits, no card needed):

curl -s -X POST https://textai-api.overtek.deno.net/keys/create \
  -H "Content-Type: application/json" \
  -d '{"label":"node-pipeline"}'
# {
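The arithmetic behind the pricing table is easy to reproduce. A quick sketch in Python (the article's pipeline itself is Node.js), assuming only the stated rate of 1 USDC = 1,000 credits, i.e. $0.001 per credit:

```python
# Reproduce the pricing table: 1 USDC = 1,000 credits => $0.001/credit.
CREDIT_USD = 0.001

def batch_cost(credits_per_call: int, num_docs: int) -> float:
    """Total cost in USD to process num_docs documents."""
    return credits_per_call * CREDIT_USD * num_docs

for op, credits in [("Summarize", 10), ("Keywords", 5), ("Both", 15)]:
    per_call = credits * CREDIT_USD
    print(f"{op}: ${per_call:.3f}/call, ${batch_cost(credits, 10_000):.0f} for 10,000 docs")
```

This matches the table: summarizing 10,000 docs at 10 credits each costs $100, keywords $50, and both together $150.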

Claude Code in Kenya: How Nairobi developers are using AI at KSh260/month
If you're a developer in Kenya, you've probably done the math on ChatGPT. $20/month. At current exchange rates, that's KSh2,600 every single month. For a junior developer in Nairobi earning KSh45,000–65,000/month, that's 4–6% of your take-home pay. Just for an AI tool. Before rent, before food, before M-Pesa.

There's a better option.

Claude Code at KSh260/month

SimplyLouie gives you full Claude API access — the same Sonnet model powering Claude.ai — for KSh260/month. That's 10x cheaper than ChatGPT Plus. The difference pays for a week of lunch at a Westlands restaurant.

How Kenyan developers are using it

1. USSD/SMS API integrations

Africa's mobile money ecosystem runs on USSD and SMS. Building Safaricom integratio
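The quoted 4–6% share of take-home pay checks out arithmetically; a quick Python verification using only the figures from the article:

```python
# KSh2,600/month (ChatGPT Plus) as a share of junior salaries in Nairobi.
chatgpt_ksh = 2_600
for salary in (45_000, 65_000):
    share = chatgpt_ksh / salary
    print(f"KSh{salary:,}: {share:.1%}")
# KSh45,000: 5.8%
# KSh65,000: 4.0%
```

So KSh2,600 is 4.0–5.8% of a KSh45,000–65,000 salary, and KSh260 is exactly one tenth of it, matching the "10x cheaper" claim.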
More in Open Source AI

Running a local LLM on Android with Termux and llama.cpp
What I used

- Samsung S21 Ultra
- Termux
- llama-cpp-cli
- llama-cpp-server
- Qwen3.5-0.8B with Q5_K_M quantization from huggingface (I also tried Bonsai-8B-GGUF-1bit from huggingface. Although this is a newer model and required a different setup, which I might write about at a later time, it produced 2-3 TPS and I did not find that to be usable)

Installation

I downloaded the "Termux" app from the Google Play store and installed the needed tools in Termux:

pkg update
pkg upgrade -y
pkg install llama-cpp -y

Downloading a model

I downloaded Qwen3.5-0.8B-Q5_K_M.gguf in my phone browser and saved it to my device. Then I opened the download folder shortcut in the browser, selected the GGUF file -> open with: Termux. Now the file is accessible in Termux.

Running it in the terminal

After that, I loaded the
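The excerpt cuts off before the actual run command. As a hedged sketch only: a typical invocation of the llama-cpp-cli tool the author lists would look something like the following, where the model path, prompt, token count, and thread count are illustrative and the flag names follow upstream llama.cpp:

```shell
# Illustrative only: generate 64 tokens from the downloaded GGUF model.
# -m model path, -p prompt, -n tokens to generate, -t CPU threads
llama-cpp-cli -m Qwen3.5-0.8B-Q5_K_M.gguf \
  -p "Hello from Termux" -n 64 -t 4
```

On a phone, thread count and quantization level are the main levers for tokens-per-second, which is presumably why the author settled on Q5_K_M for the 0.8B model.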
🔥 teng-lin/notebooklm-py
Unofficial Python API and agentic skill for Google NotebookLM. Full programmatic access to NotebookLM's features—including capabilities the web UI doesn't expose—via Python, CLI, and AI agents like Claude Code, Codex, and OpenClaw. — Trending on GitHub today with 138 new stars.


