Poster

DistFlow: A Fully Distributed RL Framework for Scalable and Efficient LLM Post-Training

Zhixin Wang ⋅ Jiaming Xu ⋅ Tianyi Zhou ⋅ Mingjun Zhang ⋅ Liming Liu ⋅ Jiarui Hu ⋅ Dian Yang ⋅ Tongyu Wang ⋅ Ping Zhang ⋅ Jinlong Hou ⋅ Siyuan Feng ⋅ Yuan Cheng ⋅ Yuan Qi

Abstract
