Yandex Research is presenting 6 accepted papers at the 42nd International Conference on Machine Learning (ICML 2025), which is being held in Vancouver, Canada, from July 13–19. Recognized by Google Scholar as one of the world's top three AI conferences, ICML is a premier venue for cutting-edge machine learning research, bringing together global leaders in artificial intelligence, statistics, and data science. This year's conference received a record 12,107 valid submissions, of which 3,260 were accepted — an acceptance rate of 26.9%.
The accepted papers showcase Yandex's advancements in critical areas of machine learning, including algorithmic reasoning in neural networks, measuring diversity in AI systems, and optimizing memory usage for large language models (LLMs):
- “Discrete Neural Algorithmic Reasoning”: Explores the challenges neural models face in generalizing on algorithmic tasks and proposes architectural changes that enhance generalization capabilities.
- “Measuring Diversity: Axioms and Challenges”: Identifies three key properties that diversity metrics should satisfy, illustrates how existing metrics fall short, and raises the question of whether efficient metrics satisfying all requirements currently exist.
- “Inverse Bridge Matching Distillation”: Proposes a method that speeds up image-to-image translation models by 4–100 times and enables the student model to outperform the teacher model in specific tasks.
- “Cache Me If You Must: Adaptive Key-Value Quantization for Large Language Models”: Proposes an efficient data compression method for large language models that minimizes quality loss, even under extreme quantization.
- “FRUGAL: Memory-Efficient Optimization by Reducing State Overhead for Scalable Training”: Reduces memory overhead when training large models by splitting gradients, improving performance in resource-constrained environments.
- “EvoPress: Towards Optimal Dynamic Model Compression via Evolutionary Search”: Introduces an approach that uses evolutionary algorithms for dynamic compression and achieves superior quality compared to uniform compression methods across the Llama, Mistral, and Phi model families.
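To give a feel for the kind of KV-cache compression the “Cache Me If You Must” paper targets, here is a minimal sketch of uniform per-channel min-max quantization of a cached key tensor. This is a generic illustration under our own assumptions, not the adaptive scheme proposed in the paper:

```python
import numpy as np

def quantize_kv(x, bits=4):
    """Uniformly quantize a KV-cache tensor per channel.

    Generic min-max quantization for illustration only; the paper's
    method adapts quantization to the data rather than using a
    fixed uniform grid.
    """
    levels = 2 ** bits - 1
    lo = x.min(axis=0, keepdims=True)          # per-channel minimum
    hi = x.max(axis=0, keepdims=True)          # per-channel maximum
    scale = (hi - lo) / levels
    scale = np.where(scale == 0, 1.0, scale)   # avoid divide-by-zero
    q = np.round((x - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize_kv(q, scale, lo):
    """Reconstruct an approximate float tensor from quantized values."""
    return q.astype(np.float32) * scale + lo

# Example: compress a mock key cache of 128 tokens x 64 channels.
rng = np.random.default_rng(0)
keys = rng.normal(size=(128, 64)).astype(np.float32)
q, scale, lo = quantize_kv(keys, bits=4)
recon = dequantize_kv(q, scale, lo)
max_err = float(np.abs(recon - keys).max())
```

At 4 bits the cache shrinks roughly 8x relative to float32 storage (plus small per-channel scale/offset overhead), at the cost of a bounded reconstruction error per element.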
You can learn more about the research papers on the Yandex Research blog, and presentations will take place at the Vancouver Convention Center during ICML 2025. Yandex Research continues to collaborate with leading global academic and industry partners to advance machine learning worldwide.
About Yandex Research
Yandex Research is a team focused on exploring fundamental questions in artificial intelligence. Research engineers specialize in natural language processing, computer vision, neural networks, and more. The Yandex Research team develops solutions integrated into the company's products, bringing tangible benefits to people. Thanks to their work, Yandex has become one of the leading tech companies in scientific publications at NeurIPS, ICML, and other major international machine learning conferences.