From Figma to Claude Code and back | Gui Seiz & Alex Kern (Figma)
🎙️ How Figma’s designers and engineers use MCPs to pull production code into Figma, edit designs, and push changes back to code—eliminating design drift and enabling real-time collaboration
Most teams are still passing static design files back and forth, and most Figma files are already out of date by the time they reach engineering. Gui Seiz (designer) and Alex Kern (engineer) from Figma walk through the exact workflow their team uses to bridge that gap with AI, live onscreen. They demo how to pull a running web app directly into Figma using the Figma MCP, edit it collaboratively, and push it back to code. The old linear waterfall workflow is gone. What replaces it is a fluid, bidirectional loop where design and code inform each other in real time.
- How to use Figma’s MCP to pull production code directly into Figma files
- A workflow for pushing design changes from Figma back into your codebase using Claude Code without manual CSS adjustments
- How to export multiple code states (like all five states of a signup flow) into Figma so designers can work with what actually exists in production
- Why AI has shifted design work upstream to planning and downstream to craft, eliminating the rushed middle phase of execution
- How to create custom skills that automate pre-flight checks, lint fixes, and CI monitoring before pushing code to production
- How to structure your codebase so AI can more effectively write the bulk (90%) of your code
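The first two takeaways hinge on connecting Claude Code to Figma's MCP server. As a rough sketch only, the wiring can look like a project-level MCP config; the server name, endpoint URL, and port below are illustrative assumptions, not Figma's documented values, so check Figma's own MCP setup guide for the exact address.

```json
{
  "mcpServers": {
    "figma": {
      "type": "sse",
      "url": "http://127.0.0.1:3845/sse"
    }
  }
}
```

With a config like this in a project's `.mcp.json`, Claude Code can discover the server's tools to read selected frames and generate matching code, and the reverse direction shown in the demos (exporting code states back into Figma) flows through the same connection.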
Optimizely—Your AI agent orchestration platform for marketing and digital teams
(00:00) Introduction to Gui and Alex from Figma
(02:56) How AI has transformed Figma’s internal workflows
(05:17) The collapse of linear design-to-code workflows
(07:28) Demo: Pulling production code into Figma using MCPs
(10:49) Using Figma for precise design manipulation and team collaboration
(14:10) Demo: Pushing Figma designs back into code with Claude Code
(16:06) How AI has changed the role of software engineers
(18:43) The shift to upstream planning and downstream craft
(22:31) Demo: Exporting multiple code states back into Figma
(25:23) Synchronous vs. asynchronous collaboration with AI
(28:00) Eliminating design and engineering toil with AI
(29:03) Demo: Custom skills for automating pre-flight checks
(34:06) Code first or design first?
(35:24) Using AI to learn and explore codebases
• Figma: https://www.figma.com/
• From Claude Code to Figma: Turning production code into editable Figma designs: https://www.figma.com/blog/introducing-claude-code-to-figma/
• Codex: https://codex.ai/
• Claude Code: https://claude.ai/code
• Buildkite: https://buildkite.com/
• Balsamiq: https://balsamiq.com/
• Gui Seiz on LinkedIn: https://www.linkedin.com/in/guiseiz/
• Alex Kern on LinkedIn: https://www.linkedin.com/in/alexanderskern/
• ChatPRD: https://www.chatprd.ai/
• Website: https://clairevo.com/
• LinkedIn: https://www.linkedin.com/in/clairevo/
Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email [email protected].
lennysnewsletter.com
https://www.lennysnewsletter.com/p/from-figma-to-claude-code-and-back
