Shaokun Zhang
Title
Cited by
Year
AutoGen: Enabling next-gen LLM applications via multi-agent conversation framework
Q Wu, G Bansal, J Zhang, Y Wu, S Zhang, E Zhu, B Li, L Jiang, X Zhang, ...
ICLR 2024 Workshop on LLMAgents, 2023
Cited by 179 · 2023
An empirical study on challenging math problem solving with GPT-4
Y Wu, F Jia, S Zhang, Q Wu, H Li, E Zhu, Y Wang, YT Lee, R Peng, ...
ICLR 2024 Workshop on LLMAgents, 2023
Cited by 28 · 2023
You only compress once: towards effective and elastic BERT compression via exploit-explore stochastic nature gradient
S Zhang, X Zheng, C Yang, Y Li, Y Wang, F Chao, M Wang, S Li, J Yang, ...
arXiv preprint arXiv:2106.02435, 2021
Cited by 21 · 2021
Targeted hyperparameter optimization with lexicographic preferences over multiple objectives
S Zhang, F Jia, C Wang, Q Wu
ICLR 2023, 2022
Cited by 19 · 2022
DDPNAS: Efficient neural architecture search via dynamic distribution pruning
X Zheng, C Yang, S Zhang, Y Wang, B Zhang, Y Wu, Y Wu, L Shao, R Ji
IJCV 131 (5), 1234-1249, 2023
Cited by 18 · 2023
IDEAL: Influence-Driven Selective Annotations Empower In-Context Learners in Large Language Models
S Zhang, X Xia, Z Wang, LH Chen, J Liu, Q Wu, T Liu
ICLR 2024, 2023
Cited by 6 · 2023
HyperTime: Hyperparameter optimization for combating temporal distribution shifts
S Zhang, Y Wu, Z Zheng, Q Wu, C Wang
arXiv preprint arXiv:2305.18421, 2023
Cited by 5 · 2023
Coreset selection with prioritized multiple objectives
X Xia, J Liu, S Zhang, Q Wu, T Liu
arXiv preprint arXiv:2311.08675, 2023
Cited by 4 · 2023
StateFlow: Enhancing LLM Task-Solving through State-Driven Workflows
Y Wu, T Yue, S Zhang, C Wang, Q Wu
arXiv preprint arXiv:2403.11322, 2024
2024
Training Language Model Agents without Modifying Language Models
S Zhang, J Zhang, J Liu, L Song, C Wang, R Krishna, Q Wu
arXiv preprint arXiv:2402.11359, 2024
2024
Articles 1–10