
Directions in ML: "Neural architecture search: Coming of age" 

Microsoft Research

Neural Architecture Search (NAS) is a very promising but still young field. I will start this talk by discussing various works aiming to build a scientific community around NAS, including benchmarks, best practices, and open source frameworks. Then, I will discuss several exciting directions for the field: (1) a broad range of possible speedup techniques for NAS; (2) joint NAS + hyperparameter optimization in Auto-PyTorch to allow off-the-shelf AutoML; and (3) the extended problem definition of neural ensemble search (NES) that searches for a set of complementary architectures rather than a single one as in NAS.
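To make point (1) concrete, one widely used multi-fidelity speedup is successive halving: evaluate many candidate architectures at a small budget, keep only the best fraction, and re-evaluate the survivors at a larger budget. The sketch below is illustrative, not code from the talk; the `evaluate(candidate, budget)` callback and all parameters are hypothetical.

```python
# Illustrative sketch (assumption, not code from the talk): successive
# halving, a multi-fidelity speedup for NAS. Candidates are evaluated
# cheaply first; only the top 1/eta survive to the next, larger budget.

def successive_halving(candidates, evaluate, min_budget=1, eta=2, rounds=3):
    """`evaluate(candidate, budget)` returns a validation loss (lower is
    better); `candidates` is any iterable of configurations."""
    budget = min_budget
    pool = list(candidates)
    for _ in range(rounds):
        scored = sorted(pool, key=lambda c: evaluate(c, budget))
        pool = scored[:max(1, len(scored) // eta)]  # keep the top 1/eta
        budget *= eta                               # survivors get more budget
    return pool[0]

# Toy usage: candidates are integers, the (made-up) loss ignores the
# budget and is minimized at 3, so 3 should survive every round.
best = successive_halving(range(8), lambda c, b: abs(c - 3))
```
The point of the budget schedule is that most of the total compute goes to the few candidates that survived the cheap early rounds.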
Slides for this talk are available: www.automl.org/talks
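Point (3), neural ensemble search, replaces the single-best-architecture objective with selecting a complementary set of networks. A common baseline for the selection step is greedy forward selection over a pool of already-trained candidates; the toy sketch below is illustrative only (all names and numbers are made up) and scores each candidate by the validation loss of the ensemble it would join.

```python
# Illustrative sketch (not from the talk): greedy forward-selection
# ensemble search in the spirit of neural ensemble search (NES). Each
# candidate network is represented by its per-example validation class
# probabilities; the data below is hypothetical.

def ensemble_loss(members, labels):
    """0/1 error of the averaged prediction of the selected members."""
    errors = 0
    for i, y in enumerate(labels):
        avg = [sum(m[i][c] for m in members) / len(members)
               for c in range(len(members[0][i]))]
        errors += avg.index(max(avg)) != y
    return errors / len(labels)

def forward_select(pool, labels, size):
    """Greedily add (with replacement) the candidate that most improves
    the validation loss of the growing ensemble."""
    ensemble = []
    for _ in range(size):
        best = min(pool, key=lambda m: ensemble_loss(ensemble + [m], labels))
        ensemble.append(best)
    return ensemble

# Two hypothetical nets that err on different examples: each is wrong
# on one of the four validation points, but their average is right on all.
labels = [0, 1, 0, 1]
net_a = [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.6, 0.4]]
net_b = [[0.4, 0.6], [0.1, 0.9], [0.8, 0.2], [0.3, 0.7]]
ensemble = forward_select([net_a, net_b], labels, size=2)
```
The selection criterion rewards complementary mistakes: an architecture that is individually weaker can still be the best addition if it disagrees with the current ensemble in the right places.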
Frank Hutter is a Full Professor for Machine Learning at the Computer Science Department of the University of Freiburg (Germany), as well as Chief Expert AutoML at the Bosch Center for Artificial Intelligence. Frank holds a PhD from the University of British Columbia (2009) and an MSc from TU Darmstadt (2004). He received the 2010 CAIAC doctoral dissertation award for the best thesis in AI in Canada, and, with his coauthors, several best paper awards and prizes in international competitions on machine learning, SAT solving, and AI planning. He is the recipient of a 2013 Emmy Noether Fellowship, a 2016 ERC Starting Grant, a 2018 Google Faculty Research Award, and a 2020 ERC PoC Award. He is also a Fellow of ELLIS and Program Chair at ECML 2020.
In the field of AutoML, Frank co-founded the ICML workshop series on AutoML in 2014 and has co-organized it every year since, co-authored the prominent AutoML tools Auto-WEKA and Auto-sklearn, won the first two AutoML challenges with his team, co-authored the first book on AutoML, worked extensively on efficient hyperparameter optimization and neural architecture search, and gave a NeurIPS 2018 tutorial with over 3000 attendees.
Learn more about the 2020-2021 Directions in ML: AutoML and Automating Algorithms virtual speaker series: aka.ms/diml

Science

Published: 5 Aug 2024

Comments: 4
@leixun · 3 years ago
*My takeaways:*
1. Problems with current Neural Architecture Search (NAS) methods 3:10
2. Benchmarks and best practices 6:26
3. Speedup techniques for NAS 24:46
3.1 Weight inheritance and network morphisms 25:00
3.2 Weight sharing and one-shot models 30:10
3.3 Meta-learning 38:12
3.4 Multi-fidelity optimization 40:58
4. Auto-PyTorch: Joint NAS and hyperparameter optimization 43:55
5. Extended problem formulation: neural ensemble search 49:43
6. Q&A 54:10
@rainerzufall1868 · 3 years ago
Very nice talk and work!!!! Thank you!
@giannagiavelli5098 · 2 years ago
Guy knows NOTHING. NOTHING. I repeat .. .NOTHING
@toxicore1190 · 1 year ago
why do you think so