## Paper Summary
**Research Area**: ML
**作者**: Seungju Han, Konwoo Kim, Chanwoo Park, Benjamin Newman, Suhas Kotha, Jaehun Jung, James Zou, Yejin Choi
**Published**: 2026-03-26
**arXiv**: [2603.23562](https://arxiv.org/abs/2603.23562)
## Summary
This paper studies synthetic data augmentation for teaching language models new knowledge in data-constrained domains. The authors (Seungju Han, Konwoo Kim, and colleagues) observe that naively scaling existing synthetic-data methods plateaus below RAG performance, and propose Synthetic Mixed Training, which combines synthetic QAs with synthetic documents to break that ceiling.
## Original Abstract
Synthetic data augmentation helps language models learn new knowledge in data-constrained domains. However, naively scaling existing synthetic data methods by training on more synthetic tokens or using stronger generators yields diminishing returns below the performance of RAG. To break the RAG ceiling, we introduce Synthetic Mixed Training, which combines synthetic QAs and synthetic documents.
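The abstract describes the method only at a high level: Synthetic Mixed Training trains on a mixture of synthetic QA pairs and synthetic documents. Below is a minimal, hypothetical Python sketch of what such a data-mixing step might look like. The function name `build_mixed_training_set`, the text formatting, and the `qa_ratio` default are all assumptions; the abstract does not specify the paper's actual recipe or mixing ratio.

```python
import random


def build_mixed_training_set(synthetic_qas, synthetic_docs, qa_ratio=0.5, seed=0):
    """Combine synthetic QA pairs and synthetic documents into one
    shuffled training corpus (an illustrative sketch, not the paper's
    actual pipeline).

    synthetic_qas:  list of {"question": str, "answer": str}
    synthetic_docs: list of {"text": str}
    qa_ratio:       target fraction of QA examples in the mix
                    (assumed; the abstract gives no ratio)
    """
    assert 0.0 < qa_ratio < 1.0, "qa_ratio must be strictly between 0 and 1"
    rng = random.Random(seed)

    # Render QA pairs in a simple prompt/answer text format (assumed format).
    qa_examples = [
        f"Question: {qa['question']}\nAnswer: {qa['answer']}"
        for qa in synthetic_qas
    ]
    # Synthetic documents are used as plain continued-pretraining text.
    doc_examples = [doc["text"] for doc in synthetic_docs]

    # Downsample the QA pool so the final mix approaches the target ratio.
    n_docs = len(doc_examples)
    n_qa_target = int(n_docs * qa_ratio / (1.0 - qa_ratio))
    qa_sample = rng.sample(qa_examples, min(n_qa_target, len(qa_examples)))

    # Interleave both sources into a single shuffled training set.
    mixed = qa_sample + doc_examples
    rng.shuffle(mixed)
    return mixed
```

The key design point the abstract emphasizes is the combination itself: QA-style examples and document-style text are trained on together rather than scaling either source alone.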
---
*Automatically collected on 2026-03-27*
#Paper #arXiv #ML #小凯