Mark Chen
Research Scientist, OpenAI
Verified email at openai.com
Title
Cited by
Year
Language models are few-shot learners
T Brown, B Mann, N Ryder, M Subbiah, JD Kaplan, P Dhariwal, ...
Advances in neural information processing systems 33, 1877-1901, 2020
Cited by 24375 · 2020
Hierarchical text-conditional image generation with CLIP latents
A Ramesh, P Dhariwal, A Nichol, C Chu, M Chen
arXiv preprint arXiv:2204.06125 1 (2), 3, 2022
Cited by 4147 · 2022
Zero-shot text-to-image generation
A Ramesh, M Pavlov, G Goh, S Gray, C Voss, A Radford, M Chen, ...
International conference on machine learning, 8821-8831, 2021
Cited by 3573 · 2021
Evaluating large language models trained on code
M Chen, J Tworek, H Jun, Q Yuan, HPO Pinto, J Kaplan, H Edwards, ...
arXiv preprint arXiv:2107.03374, 2021
Cited by 2108* · 2021
GLIDE: Towards photorealistic image generation and editing with text-guided diffusion models
A Nichol, P Dhariwal, A Ramesh, P Shyam, P Mishkin, B McGrew, ...
arXiv preprint arXiv:2112.10741, 2021
Cited by 2066 · 2021
Generative pretraining from pixels
M Chen, A Radford, R Child, J Wu, H Jun, D Luan, I Sutskever
International conference on machine learning, 1691-1703, 2020
Cited by 1447 · 2020
Training verifiers to solve math word problems
K Cobbe, V Kosaraju, M Bavarian, M Chen, H Jun, L Kaiser, M Plappert, ...
arXiv preprint arXiv:2110.14168, 2021
Cited by 1059 · 2021
GPT-4 technical report
J Achiam, S Adler, S Agarwal, L Ahmad, I Akkaya, FL Aleman, D Almeida, ...
arXiv preprint arXiv:2303.08774, 2023
Cited by 719 · 2023
Consistency models
Y Song, P Dhariwal, M Chen, I Sutskever
arXiv preprint arXiv:2303.01469, 2023
Cited by 294 · 2023
Point-E: A system for generating 3D point clouds from complex prompts
A Nichol, H Jun, P Dhariwal, P Mishkin, M Chen
arXiv preprint arXiv:2212.08751, 2022
Cited by 254 · 2022
Scaling laws for autoregressive generative modeling
T Henighan, J Kaplan, M Katz, M Chen, C Hesse, J Jackson, H Jun, ...
arXiv preprint arXiv:2010.14701, 2020
Cited by 231 · 2020
Hierarchical text-conditional image generation with CLIP latents. arXiv 2022
A Ramesh, P Dhariwal, A Nichol, C Chu, M Chen
arXiv preprint arXiv:2204.06125, 2022
Cited by 117 · 2022
Efficient training of language models to fill in the middle
M Bavarian, H Jun, N Tezak, J Schulman, C McLeavey, J Tworek, M Chen
arXiv preprint arXiv:2207.14255, 2022
Cited by 83 · 2022
DALL·E: Creating images from text
A Ramesh, M Pavlov, G Goh, S Gray, M Chen, R Child, V Misra, P Mishkin, ...
OpenAI blog. https://openai.com/blog/dall-e, 2021
Cited by 80 · 2021
Distribution augmentation for generative modeling
H Jun, R Child, M Chen, J Schulman, A Ramesh, A Radford, I Sutskever
International Conference on Machine Learning, 5006-5019, 2020
Cited by 51 · 2020
Language models are few-shot learners
B Mann, N Ryder, M Subbiah, J Kaplan, P Dhariwal, A Neelakantan, ...
arXiv preprint arXiv:2005.14165, 2020
Cited by 47 · 2020
Using temporal correlations and full distributions to separate intrinsic and extrinsic fluctuations in biological systems
A Hilfinger, M Chen, J Paulsson
Physical review letters 109 (24), 248104, 2012
Cited by 21 · 2012
Systems and methods for hierarchical text-conditional image generation
A Ramesh, P Dhariwal, A Nichol, C Chu, M Chen
US Patent 11,922,550, 2024
2024
Systems and methods for generating natural language using language models trained on computer code
M Chen, J Tworek, I Sutskever, W Zaremba, H Jun, HPO Pinto
US Patent App. 18/321,921, 2024
2024
Systems and methods for generating code using language models trained on computer code
M Chen, J Tworek, I Sutskever, W Zaremba, H Jun, HPO Pinto
US Patent App. 18/321,852, 2024
2024