Alexander Tyurin, Ph.D.

Postdoctoral Fellow, KAUST
alexander.tyurin@kaust.edu.sa
alexandertiurin@gmail.com

Hi! I am a postdoctoral fellow at the Visual Computing Center at KAUST, where I work on modern optimization problems with Professor Peter Richtárik. Previously, I defended my Ph.D. thesis at the Higher School of Economics under the supervision of Professor Alexander Gasnikov.

I also worked on the Perception team at Yandex Self-Driving Cars, where I developed real-time computer vision algorithms for autonomous vehicles.

Education

Ph.D. in Computer Science
Higher School of Economics, Faculty of Computer Science
2017 - 2020
M.Sc. in Computer Science
Higher School of Economics, Faculty of Computer Science
2015 - 2017
Bachelor of Computer Science
Lomonosov Moscow State University, Faculty of Computational Mathematics and Cybernetics
2011 - 2015

Work experience

Postdoctoral fellow
KAUST, Visual Computing Center
2021 - present
Research and development engineer
Yandex Self-Driving Cars
2018 - 2021
Junior research fellow
Higher School of Economics, HDI LAB
2017 - 2021
Research engineer
Alterra.ai
2018
Research engineer
VisionLabs
2015 - 2018

Review Duties: ICML 2022*, NeurIPS 2022*, Machine Learning (journal), ICLR 2023, ICML 2023, NeurIPS 2023
(*) = Best Reviewer Award

Publications

[19] Fatkhullin I., Tyurin A., Richtárik P. Momentum Provably Improves Error Feedback! // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[18] Tyurin A., Richtárik P. Optimal Time Complexities of Parallel Stochastic Optimization Methods Under a Fixed Computation Model // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[17] Tyurin A., Richtárik P. 2Direction: Theoretically Faster Distributed Training with Bidirectional Communication Compression // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[16] Gruntkowska K., Tyurin A., Richtárik P. EF21-P and Friends: Improved Theoretical Communication Complexity for Distributed Optimization with Bidirectional Compression // In International Conference on Machine Learning. 2023. (ICML 2023)
[15] Tyurin A., Sun L., Burlachenko K., Richtárik P. Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling // Transactions on Machine Learning Research. 2023. (TMLR 2023)
[14] Tyurin A., Richtárik P. A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[13] Tyurin A., Richtárik P. DASHA: Distributed Nonconvex Optimization with Communication Compression, Optimal Oracle Complexity, and No Client Synchronization // In International Conference on Learning Representations. 2023. (ICLR 2023) (notable-top-25%)
[12] Dvurechensky P., Gasnikov A., Tyurin A., Zholobov V. Unifying Framework for Accelerated Randomized Methods in Convex Optimization // In Foundations of Modern Statistics, 2023
[11] Szlendak R., Tyurin A., Richtárik P. Permutation Compressors for Provably Faster Distributed Nonconvex Optimization // In International Conference on Learning Representations. 2022. (ICLR 2022)
[10] Ivanova A., Dvurechensky P., Vorontsova E., Pasechnyuk D., Gasnikov A., Dvinskikh D., Tyurin A. Oracle complexity separation in convex optimization // Journal of Optimization Theory and Applications. 2022.
[9] Stonyakin F., Tyurin A., Gasnikov A., Dvurechensky P., Agafonov A., Dvinskikh D., Alkousa M., Pasechnyuk D., Artamonov S., Piskunova V. Inexact model: a framework for optimization and variational inequalities // Optimization Methods and Software. 2021. P. 1-47.
[8] Dvurechensky P., Gasnikov A., Omelchenko A., Tyurin A. A stable alternative to Sinkhorn’s algorithm for regularized optimal transport // Lecture Notes in Computer Science. 2020. V. 12095. P. 406-423.
[7] Dvinskikh D., Omelchenko A., Gasnikov A., Tyurin A. Accelerated gradient sliding for minimizing the sum of functions // Doklady Mathematics. 2020. V. 101. N. 3. P. 244-246.
[6] Tyurin A. Primal-dual fast gradient method with a model // Computer Research and Modeling. 2020. V. 12. N. 2. P. 263-274. (in Russian)
[5] Dvinskikh D., Tyurin A., Gasnikov A., Omelchenko S. Accelerated and nonaccelerated stochastic gradient descent with model conception // Mathematical Notes. 2020. V. 108. N. 4. P. 511-522 (main co-author).
[4] Gasnikov A., Tyurin A. Fast gradient descent for convex minimization problems with an oracle producing a (δ, L)-model of function at the requested point // Computational Mathematics and Mathematical Physics. 2019. V. 59. N. 7. P. 1085-1097. (main co-author; alphabetical order)
[3] Stonyakin F., Dvinskikh D., Dvurechensky P., Kroshnin A., Kuznetsova O., Agafonov A., Gasnikov A., Tyurin A., Uribe C., Pasechnyuk D., Artamonov S. Gradient methods for problems with inexact model of the objective // Lecture Notes in Computer Science. 2019. V. 11548. P. 97-114.
[2] Ogaltsov A., Tyurin A. A heuristic adaptive fast gradient method in stochastic optimization problems // Computational Mathematics and Mathematical Physics. 2020. V. 60. N. 7. P. 1108-1115 (main co-author; alphabetical order)
[1] Anikin A., Gasnikov A., Dvurechensky P., Tyurin A., Chernov A. Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints // Computational Mathematics and Mathematical Physics. 2017. V. 57. N. 8. P. 1262-1276.