💫 spaCy v3.8.0
Memory management for persistent services, numpy 2.0 support
Optional memory management for persistent services
Added a new context manager method, Language.memory_zone(), which lets long-running services avoid growing memory usage from cached entries in the Vocab or StringStore. Once the memory zone block ends, spaCy evicts the Vocab and StringStore entries that were added during the block, freeing the memory. Doc objects created inside a memory zone block should not be accessed outside the block.
The current implementation disables population of the tokenizer cache inside the memory zone, which has some performance impact. The difference will likely be negligible if you're running a full pipeline, but if you're only running the tokenizer, it will be much slower. If this is a problem, you can mitigate it by warming the cache first: process the first few batches of text without creating a memory zone. Support for memory zones in the tokenizer will be added in a future update.
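For instance, a minimal warm-up sketch (assuming en_core_web_sm is installed; the corpus here is a placeholder):

```python
import spacy

nlp = spacy.load("en_core_web_sm")

# Placeholder corpus; in practice these texts would stream from disk.
texts = ["The quick brown fox jumps over the lazy dog."] * 500

# Warm the tokenizer cache outside any memory zone, so frequent
# tokens are cached before zoned processing begins.
for _ in nlp.pipe(texts[:100]):
    pass

# Later batches inside memory zones still benefit from the warmed
# cache, since its entries were added outside the zone.
with nlp.memory_zone():
    for doc in nlp.pipe(texts[100:]):
        pass  # work with each doc inside the block only
```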
The Language.memory_zone() context manager also checks for a memory_zone() method on pipeline components, so that components can perform similar memory management if necessary. None of the built-in components currently require this.
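As a rough sketch of what such a component could look like — the exact signature spaCy uses when calling a component's memory_zone() isn't specified here, so the context-manager shape, the *args/**kwargs passthrough, and the cached_component factory name are all assumptions:

```python
from contextlib import contextmanager

from spacy.language import Language
from spacy.tokens import Doc


class CachedComponent:
    """Toy component that caches a value per text (illustrative only)."""

    def __init__(self):
        self._cache = {}

    def __call__(self, doc: Doc) -> Doc:
        # Placeholder computation, cached per text.
        self._cache.setdefault(doc.text, len(doc))
        return doc

    @contextmanager
    def memory_zone(self, *args, **kwargs):
        # Accept whatever Language.memory_zone() passes through;
        # this signature is an assumption, not documented API.
        before = set(self._cache)
        try:
            yield
        finally:
            # Evict cache entries added while the zone was active.
            for key in set(self._cache) - before:
                del self._cache[key]


@Language.factory("cached_component")  # hypothetical name for illustration
def create_cached_component(nlp: Language, name: str) -> CachedComponent:
    return CachedComponent()
```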
If your component needs to add non-transient entries to the StringStore or Vocab, you can pass the allow_transient=False flag to the Vocab.add() or StringStore.add() methods.
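For example, via the StringStore at nlp.vocab.strings (a minimal sketch; the entry strings are placeholders):

```python
import spacy

nlp = spacy.load("en_core_web_sm")

with nlp.memory_zone():
    # Added normally inside a zone, this entry is evicted when the block ends.
    nlp.vocab.strings.add("transient-entry")
    # Marked non-transient, this entry survives the end of the zone.
    key = nlp.vocab.strings.add("permanent-entry", allow_transient=False)

# After the block, the non-transient entry can still be resolved.
assert nlp.vocab.strings[key] == "permanent-entry"
```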
Example usage:
```python
import spacy
import json
from pathlib import Path
from typing import Iterator
from collections import Counter
import typer
from spacy.util import minibatch


def texts(path: Path) -> Iterator[str]:
    # Stream texts from a JSONL file with one {"text": ...} object per line.
    with path.open("r", encoding="utf8") as file_:
        for line in file_:
            yield json.loads(line)["text"]


def main(jsonl_path: Path) -> None:
    nlp = spacy.load("en_core_web_sm")
    counts = Counter()
    batches = minibatch(texts(jsonl_path), 1000)
    for i, batch in enumerate(batches):
        print("Batch", i)
        # Strings cached while processing this batch are evicted when
        # the memory zone block ends, keeping memory usage flat.
        with nlp.memory_zone():
            for doc in nlp.pipe(batch):
                for token in doc:
                    counts[token.text] += 1
    for word, count in counts.most_common(100):
        print(count, word)


if __name__ == "__main__":
    typer.run(main)
```
Numpy v2 compatibility
Numpy 2.0 isn't binary-compatible with numpy v1, so we need to build against one or the other. This release isolates the dependency change and makes no other changes, so that any problems the new dependency causes are easier to track down.
This dependency change was previously attempted in version 3.7.6, but dependencies within the v3.7 family of models resulted in some conflicts, and some packages depending on numpy v1 were incompatible with v3.7.6. I've therefore removed the 3.7.6 release and replaced it with this one, which increments the minor version.
Model packages no longer list spacy as a requirement
I've also made a change to the way models are packaged to make it easier to release more quickly. Previously spaCy models specified a versioned requirement on spacy itself. This meant that there was no way to increment the spaCy version and have it work with the existing models, because the models would specify they were only compatible with spacy>=3.7.0,<3.8.0. We have a compatibility table that allows spacy to see which models are compatible, but the models themselves can't know which future versions of spaCy they work with.
I've therefore added a flag --require-parent/--no-require-parent to the spacy package CLI, which controls whether the parent package (e.g. spaCy) should be listed as a requirement of the model. --require-parent is the default for v3.8, but this will change to --no-require-parent by default in v4. I've set --no-require-parent for the v3.8 models, so that further spaCy changes can be published without impacting the models, retraining them, or forcing users to redownload them.
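For instance, packaging a trained pipeline without the parent requirement might look like this (the input and output paths are placeholders):

```
python -m spacy package ./my_pipeline ./packages --no-require-parent
```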