[23]
Tyurin A., Richtárik P.
On the Optimal Time Complexities in Decentralized Stochastic Asynchronous Optimization // arXiv:2405.16218
[22]
Tyurin A., Gruntkowska K., Richtárik P.
Freya PAGE: First Optimal Time Complexity for Large-Scale Nonconvex Finite-Sum Optimization with Heterogeneous Asynchronous Computations // arXiv:2405.15545
[21]
Gruntkowska K., Tyurin A., Richtárik P.
Improving the Worst-Case Bidirectional Communication Complexity for Nonconvex Distributed Optimization under Function Similarity // arXiv:2402.06412
[20]
Tyurin A., Pozzi M., Ilin I., Richtárik P.
Shadowheart SGD: Distributed Asynchronous SGD with Optimal Time Complexity Under Arbitrary Computation and Communication Heterogeneity // arXiv:2402.04785
[19]
Fatkhullin I., Tyurin A., Richtárik P.
Momentum Provably Improves Error Feedback! // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[18]
Tyurin A., Richtárik P.
Optimal Time Complexities of Parallel Stochastic Optimization Methods Under a Fixed Computation Model // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[17]
Tyurin A., Richtárik P.
2Direction: Theoretically Faster Distributed Training with Bidirectional Communication Compression // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[16]
Gruntkowska K., Tyurin A., Richtárik P.
EF21-P and Friends: Improved Theoretical Communication Complexity for Distributed Optimization with Bidirectional Compression // In International Conference on Machine Learning. 2023. (ICML 2023)
[15]
Tyurin A., Sun L., Burlachenko K., Richtárik P.
Sharper Rates and Flexible Framework for Nonconvex SGD with Client and Data Sampling // Transactions on Machine Learning Research. 2023. (TMLR 2023)
[14]
Tyurin A., Richtárik P.
A Computation and Communication Efficient Method for Distributed Nonconvex Problems in the Partial Participation Setting // In Advances in Neural Information Processing Systems 36 (NeurIPS 2023)
[13]
Tyurin A., Richtárik P.
DASHA: Distributed Nonconvex Optimization with Communication Compression, Optimal Oracle Complexity, and No Client Synchronization // In International Conference on Learning Representations. 2023. (ICLR 2023) (notable-top-25%)
[12]
Dvurechensky P., Gasnikov A., Tyurin A., Zholobov V.
Unifying Framework for Accelerated Randomized Methods in Convex Optimization // In Foundations of Modern Statistics, 2023
[11]
Szlendak R., Tyurin A., Richtárik P.
Permutation Compressors for Provably Faster Distributed Nonconvex Optimization // In International Conference on Learning Representations. 2022. (ICLR 2022)
[10]
Ivanova A., Dvurechensky P., Vorontsova E., Pasechnyuk D., Gasnikov A., Dvinskikh D., Tyurin A.
Oracle complexity separation in convex optimization // Journal of Optimization Theory and Applications. 2022.
[9]
Stonyakin F., Tyurin A., Gasnikov A., Dvurechensky P., Agafonov A., Dvinskikh D., Alkousa M., Pasechnyuk D., Artamonov S., Piskunova V.
Inexact model: a framework for optimization and variational inequalities // Optimization Methods and Software. 2021. P. 1-47.
[8]
Dvurechensky P., Gasnikov A., Omelchenko A., Tyurin A.
A stable alternative to Sinkhorn’s algorithm for regularized optimal transport // Lecture Notes in Computer Science. 2020. V. 12095. P. 406-423.
[7]
Dvinskikh D., Omelchenko A., Gasnikov A., Tyurin A.
Accelerated gradient sliding for minimizing the sum of functions // Doklady Mathematics. 2020. V. 101. N. 3. P. 244-246.
[6]
Tyurin A.
Primal-dual fast gradient method with a model // Computer Research and Modeling. 2020. V. 12. N. 2. P. 263-274. (in Russian)
[5]
Dvinskikh D., Tyurin A., Gasnikov A., Omelchenko S.
Accelerated and nonaccelerated stochastic gradient descent with model conception // Mathematical Notes. 2020. V. 108. N. 4. P. 511-522. (main co-author)
[4]
Gasnikov A., Tyurin A.
Fast gradient descent for convex minimization problems with an oracle producing a (δ, L)-model of function at the requested point // Computational Mathematics and Mathematical Physics. 2019. V. 59. N. 7. P. 1085-1097. (main co-author; alphabetical order)
[3]
Stonyakin F., Dvinskikh D., Dvurechensky P., Kroshnin A., Kuznetsova O., Agafonov A., Gasnikov A., Tyurin A., Uribe C., Pasechnyuk D., Artamonov S.
Gradient methods for problems with inexact model of the objective // Lecture Notes in Computer Science. 2019. V. 11548. P. 97-114.
[2]
Ogaltsov A., Tyurin A.
A heuristic adaptive fast gradient method in stochastic optimization problems // Computational Mathematics and Mathematical Physics. 2020. V. 60. N. 7. P. 1108-1115. (main co-author; alphabetical order)
[1]
Anikin A., Gasnikov A., Dvurechensky P., Tyurin A., Chernov A.
Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints // Computational Mathematics and Mathematical Physics. 2017. V. 57. N. 8. P. 1262-1276.