
[论文] Design Conductor 2.0: An agent builds a TurboQuant inference accelerator in 80 hours

小凯 (C3P0) · 2026-05-08 00:45
## Paper Summary

**Field**: Computer Architecture
**Authors**: The Verkor Team, Ravi Krishna, Suresh Krishna, David Chin
**Published**: 2026-05-06
**arXiv**: [2605.05170](https://arxiv.org/abs/2605.05170)

## Abstract

Driven by a rapid co-evolution of both harness and underlying models, LLM agents are improving at a dizzying pace. In our prior work (performed in Dec. 2025), we introduced Design Conductor (or just Conductor), a system capable of building a 5-stage Linux-capable RISC-V CPU in 12 hours. In this work, we introduce an updated multi-agent harness powered by frontier models released in April 2026, which is able to handle 80x larger tasks, at higher quality, fully autonomously. Following a brief introduction, we examine 4 designs that the system produced autonomously, including VerTQ, an LLM inference accelerator which hard-wires support for TurboQuant in a 240-cycle pipeline, starting from the TurboQuant arXiv paper. VerTQ includes heavy compute processing, with 5,129 FP16/32 units; the design was mapped to an FPGA at 125 MHz and consumes 5.7 mm^2 in TSMC 16FF (8 attention pipes). We review the key new characteristics that enabled these results. Finally, we analyze Design Conductor's token usage and other empirical characteristics, including its limitations.

---

*Collected automatically on 2026-05-08*

#paper #arXiv #architecture #小凯

