
Time-Warping Recurrent Neural Networks for Transfer Learning

arXiv stat.ML · Submitted on 2 Apr 2026 · Published April 6, 2026


Abstract: Dynamical systems describe how a physical system evolves over time. Physical processes can evolve faster or slower under different environmental conditions. We define time-warping as rescaling time in a model of a physical system. This thesis proposes a new method of transfer learning for Recurrent Neural Networks (RNNs) based on time-warping. We prove that, for a class of linear, first-order differential equations known as time-lag models, an LSTM can approximate these systems to any desired accuracy, and that the model can be time-warped while maintaining the approximation accuracy. The Time-Warping method of transfer learning is then evaluated on an applied problem: predicting fuel moisture content (FMC), an important quantity in wildfire modeling. An RNN with LSTM recurrent layers is pretrained on fuels with a characteristic time scale of 10 hours, for which large quantities of training data are available. The RNN is then modified with transfer learning to generate predictions for fuels with characteristic time scales of 1 hour, 100 hours, and 1000 hours. The Time-Warping method is evaluated against several known methods of transfer learning and produces predictions of comparable accuracy, despite modifying only a small fraction of the parameters that the other methods modify.
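To make the rescaling idea concrete, the sketch below simulates a time-lag model dm/dt = (m_eq − m)/T with explicit Euler steps and checks that scaling both the characteristic time T and the step size dt by the same factor leaves the trajectory unchanged — the invariance that time-warped transfer learning exploits. This is an illustrative toy (the function names and Euler discretization are assumptions here); the thesis's actual LSTM construction and FMC models are not reproduced.

```python
import numpy as np

def time_lag_step(m, m_eq, T, dt):
    """One explicit-Euler step of the time-lag model dm/dt = (m_eq - m) / T."""
    return m + dt * (m_eq - m) / T

def simulate(T, dt, n_steps, m0=0.0, m_eq=1.0):
    """Integrate the time-lag model and return the trajectory as an array."""
    m, traj = m0, []
    for _ in range(n_steps):
        m = time_lag_step(m, m_eq, T, dt)
        traj.append(m)
    return np.array(traj)

# A 10-hour fuel sampled hourly, and a 100-hour fuel sampled every 10 hours:
# warping time by c = 10 maps one onto the other, so the trajectories agree.
base = simulate(T=10.0, dt=1.0, n_steps=50)
warped = simulate(T=100.0, dt=10.0, n_steps=50)
assert np.allclose(base, warped)
```

In this toy, the warp factor c = T_new / T_old is the only quantity that changes between fuels, which is why a transfer method that only rescales time can plausibly touch far fewer parameters than full fine-tuning.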

Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)

Cite as: arXiv:2604.02474 [cs.LG]

(or arXiv:2604.02474v1 [cs.LG] for this version)

https://doi.org/10.48550/arXiv.2604.02474

arXiv-issued DOI via DataCite (pending registration)

Submission history

From: Jonathon Hirschi [view email] [v1] Thu, 2 Apr 2026 19:10:08 UTC (5,154 KB)
