Qwen Team Hit by Major Departures After Releasing Exceptional Qwen 3.5 Models
Hangzhou, China — Alibaba’s Qwen AI team is reeling from the sudden resignation of its technical lead and several core researchers, just days after releasing a highly regarded family of open-weight models. Junyang Lin, the key figure behind Qwen’s open-source efforts since 2024, announced his departure on March 4, triggering an emergency all-hands meeting led by Alibaba Group CEO Wu Yongming.
The exits include multiple senior contributors responsible for critical areas of the Qwen models, raising questions about the future of one of China’s most competitive open AI initiatives. According to a report from Chinese tech outlet 36Kr, the departures occurred amid an internal reorganization at Alibaba’s Tongyi Lab.
Lin posted simply on X (formerly Twitter): “me stepping down. bye my beloved qwen.” Hours later, he followed up on WeChat Moments with a message to the team: “Brothers of Qwen, continue as originally planned, no problem,” without confirming whether he would stay or return. Multiple Qwen team members told 36Kr that Lin’s leadership had been essential to the team’s success despite receiving fewer resources than competitors.
Key Departures Span Core Technical Areas
In addition to Lin, several other senior researchers announced their exits on the same day. They include:
- Binyuan Hui, who led Qwen code development and the Qwen-Coder series, managed the full agent training pipeline, and had recently contributed to robotics research.
- Bowen Yu, who spearheaded post-training research and led development of the Qwen-Instruct series.
- Kaixin Li, a core contributor to Qwen 3.5, vision, and coding models.
The 36Kr article notes that many younger researchers also resigned the same day. The report, considered credible by industry observers, suggests the departures stem from a reorganization that placed a new researcher, hired from Google’s Gemini team, in charge of Qwen, although Alibaba has not confirmed that detail.
Alibaba has not issued an official statement on the departures as of the latest reporting.
Qwen 3.5 Models Earn Strong Praise
The timing of the upheaval is notable because the Qwen 3.5 family has drawn significant acclaim. Released in stages beginning February 17 with the massive Qwen3.5-397B-A17B (an 807GB model), the lineup quickly expanded to include more accessible sizes: 122B, 35B, 27B, 9B, 4B, 2B, and 0.8B variants.
Independent testers have been particularly impressed by the mid-sized models. The 27B and 35B versions are reportedly excellent for coding and fit within the memory constraints of consumer hardware such as 32GB or 64GB Macs. Smaller models have also exceeded expectations: the 2B variant, which quantizes down to as little as 1.27GB, delivers full reasoning capabilities and multi-modal (vision) support.
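The 1.27GB figure for the 2B model is consistent with typical quantization schemes. A quick back-of-envelope check (assuming roughly 2 billion parameters and decimal gigabytes; the exact parameter count may differ slightly):

```python
# Back-of-envelope: effective bits per parameter for the quantized 2B model.
# Assumptions: ~2e9 parameters, 1.27 GB meaning 1.27e9 bytes on disk.
params = 2e9
size_bytes = 1.27e9
bits_per_param = size_bytes * 8 / params
print(f"{bits_per_param:.2f} bits per parameter")  # ≈ 5.08
```

Roughly 5 bits per weight lands in the range of common 4- to 5-bit quantization formats, so the reported file size is plausible rather than a typo.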
Simon Willison, a prominent AI developer and observer, described the Qwen 3.5 releases as “exceptional” and noted the team’s track record of delivering high-quality results from increasingly compact models. “It would be a real tragedy if the Qwen team were to disband now,” he wrote.
Impact on Open-Source AI and Alibaba’s Strategy
The departures represent a significant brain drain for Alibaba’s AI ambitions. Qwen has emerged as one of the strongest open-weight competitors to Meta’s Llama series and other Western open models, particularly in coding, instruction following, and efficient inference.
For developers and the broader open-source community, the uncertainty creates short-term risk. Many have begun integrating Qwen 3.5 models into applications ranging from agentic coding tools to multimodal systems. A weakening of the team could slow future releases and improvements to the open-weight lineup.
The situation also highlights growing tensions within Chinese tech giants as they reorganize AI efforts to compete with both domestic rivals and international leaders. Alibaba’s decision to have CEO Wu Yongming personally address the Tongyi Lab team underscores the strategic importance of the Qwen project.
What’s Next
It remains unclear whether Alibaba will succeed in retaining some of the departing talent or if the researchers will join new ventures or competing labs. That the company convened an emergency all-hands meeting suggests it is actively managing the situation.
Further updates are expected in the coming days as more details emerge from Alibaba or the individuals involved. The open-weight Qwen 3.5 models remain available on Hugging Face, and their strong early benchmark results suggest they will continue to see wide adoption in the immediate term.
The episode serves as a reminder of how fragile top AI research teams can be, even at well-resourced companies like Alibaba, in an intensely competitive global talent market.