0:00:00 Suraj Patil & Patrick von Platen (Hugging Face): How to use JAX/Flax with Transformers
0:26:16 Sabrina J. Mielke (Johns Hopkins University & Hugging Face): From stateful code to purified JAX: how to build your neural net framework
1:01:35 Mostafa Dehghani (Google Brain): Long Range Arena: Benchmarking Efficient Transformers
1:28:05 Rohan Anil (Google Brain): Scalable Second Order Optimization for Deep Learning
Find more information about the speakers and their talks here: https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md#thursday-july-1st