News

  1. Dr. Aibek Alanov

    We are happy to announce that Aibek Alanov got his PhD on "Exploring Efficient Parameterizations for GANs in Image and Speech Generation" on October 10!

    Generative Adversarial Networks (GANs) have excelled in generating high-quality data, with applications in computer vision and signal processing. However, their training typically requires large datasets, which can be impractical to obtain. This thesis addresses the challenge of training GANs on small datasets using domain adaptation techniques. It introduces efficient StyleGAN parameterizations and compact architectures for speech enhancement. The proposed domain modulation technique significantly reduces the number of parameters needed for StyleGAN training, enabling the HyperDomainNet model for multi-domain adaptation. Further developments led to efficient parameterizations such as StyleSpace and Affine+. Additionally, the work explores which components of StyleGAN are crucial for effective domain adaptation and examines the properties of StyleSpace directions. In speech enhancement, the HiFi++ and FFC-SE models are presented, offering superior performance with fewer parameters. These contributions enhance the efficiency and applicability of GANs in data-limited scenarios.
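    To give a flavor of the domain modulation idea, here is a minimal sketch (ours, not the thesis code; the vector size and the elementwise form below are assumptions for illustration): the pretrained generator stays frozen, and adapting to a new domain trains only a single vector that modulates the style codes channelwise, shrinking the trainable parameter count from millions to a few thousand.

        import numpy as np

        rng = np.random.default_rng(0)
        STYLE_DIM = 6144  # hypothetical total size of the style space

        # Frozen: style codes produced by the pretrained mapping network.
        styles = rng.normal(size=(4, STYLE_DIM))  # a batch of 4 style codes

        # Trainable: one domain vector d -- the only parameters updated
        # during adaptation to the new domain.
        d = np.zeros(STYLE_DIM)

        def modulate(styles, d):
            # Channelwise domain modulation (illustrative form): s' = s * (1 + d)
            return styles * (1.0 + d)

        adapted = modulate(styles, d)
        print(adapted.shape)                # (4, 6144)
        print("trainable params:", d.size)  # 6144 instead of tens of millions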

  2. Dr. Kirill Struminsky

    Congrats to Kirill Struminsky, who got his PhD on "Learning Guarantees and Efficient Inference for Structured Prediction" on February 13!

    This work studies calibration functions for the structured prediction problem, which make it possible to obtain learning guarantees without assuming consistency of the surrogate loss. In addition, within the probabilistic framework, gradient estimators are developed for latent structured variables in the form of permutations or subsets, and a number of practical applications are considered.
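    As a rough illustration of the latent structured variables involved (a sketch of the standard Gumbel-max family of tricks, not the estimators developed in the thesis), perturbing logits with Gumbel noise and sorting yields a random permutation, while keeping the top-k indices yields a random subset; relaxed gradient estimators typically start from such reparameterizations.

        import numpy as np

        rng = np.random.default_rng(0)

        def gumbel_perturb(logits):
            # Add i.i.d. Gumbel(0, 1) noise to the logits.
            u = rng.uniform(size=logits.shape)
            return logits - np.log(-np.log(u))

        logits = np.array([1.0, 0.5, -0.2, 2.0, 0.0])

        # Sorting perturbed logits samples a permutation from the
        # Plackett-Luce distribution defined by the logits.
        perm = np.argsort(-gumbel_perturb(logits))
        print("sampled permutation:", perm)

        # Keeping only the top-k indices samples a size-k subset
        # without replacement (the Gumbel-top-k trick).
        k = 2
        subset = np.argsort(-gumbel_perturb(logits))[:k]
        print("sampled subset:", subset)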

  3. Dr. Maxim Kodryan

    On January 23, the defense of Maxim Kodryan's PhD thesis, "Training Dynamics and Loss Landscape of Neural Networks with Scale-Invariant Parameters", took place.

    Scale invariance is a key property of the parameters of most modern neural network architectures. Induced by the now-ubiquitous normalization layers applied to intermediate activations and/or weights, it means, as the name implies, that the function implemented by the network is unchanged when its parameters are multiplied by an arbitrary positive scalar. In his work, Maxim investigates how this property affects the training dynamics of neural network models and the intrinsic structure of the loss landscape.
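    A toy demonstration of the property (our sketch, not code from the thesis): with a normalization step after a linear map, rescaling the weights by any positive factor leaves the output unchanged.

        import numpy as np

        rng = np.random.default_rng(0)

        def network(x, W, eps=1e-8):
            # A linear layer followed by layer normalization of the activations.
            h = x @ W
            return (h - h.mean(-1, keepdims=True)) / (h.std(-1, keepdims=True) + eps)

        x = rng.normal(size=(3, 16))
        W = rng.normal(size=(16, 8))

        # Scale invariance: f(x; lam * W) == f(x; W) for any lam > 0.
        print(np.allclose(network(x, W), network(x, 5.0 * W)))  # True (up to eps)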

  4. CVPR'23: Outstanding Reviewing

    This June, Dmitry Vetrov was named one of the CVPR 2023 Outstanding Reviewers. Out of the more than 7,000 individuals who completed at least one review for CVPR 2023, the Program Chairs designated 232 whose reviews stood out for their exceptional quality and helpfulness, based on nominations and ratings submitted by the Area Chairs.

    Eduard Pockonechnyy, Maksim Nakhodnov, Semion Elistratov, Aibek Alanov, and Viacheslav Meshchaninov also helped review CVPR papers. Congratulations to our team!

  5. PhD Parade 2022

    We're happy to announce that several members of our lab have completed their PhD studies and successfully defended their theses:

    Our congratulations to the newly minted doctors!

  6. Dr. Alexander Novikov

    On December 16, our alumnus Alexander Novikov got his PhD on "Tensor Methods for Machine Learning". The thesis focuses on using low-rank tensor decomposition algorithms to develop faster methods for training models, methods for compressing and accelerating models, and less resource-intensive machine learning models built from scratch. The dissertation was supervised by Ivan Oseledets and Dmitry Vetrov. Congrats to Dr. Novikov!
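    As a taste of the approach, here is a simplified stand-in (truncated SVD on a synthetic matrix rather than the tensor-train decomposition used in the thesis): a dense layer's weight matrix is replaced by two thin factors, cutting both parameters and compute.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic 1024 x 1024 weights with approximately low-rank structure,
        # as is often observed in trained networks; a fully random matrix
        # would not compress well.
        W = rng.normal(size=(1024, 32)) @ rng.normal(size=(32, 1024))
        W += 0.01 * rng.normal(size=(1024, 1024))

        # Keep only the top-r singular triplets.
        r = 32
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        A, B = U[:, :r] * s[:r], Vt[:r, :]   # 1024 x r and r x 1024 factors

        x = rng.normal(size=1024)
        err = np.linalg.norm(W @ x - A @ (B @ x)) / np.linalg.norm(W @ x)
        print("params:", W.size, "->", A.size + B.size)  # 1,048,576 -> 65,536
        print("relative error:", err)  # small, since W is near low-rank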

  7. AI Super Grant

    HSE University, where most of our group is based, has received a major grant from the Russian Government to support AI research in 2021-2024. The funding, totaling over 1 billion rubles, will come from the budget of the Russian Federation and from our partners: SberBank, Yandex, and MTS AI. The goal of the project is to develop new AI technologies that expand the scope of AI applications, overcome existing limitations in applied tasks, and optimize AI models. Dmitry Vetrov will provide scientific guidance for the project.

  8. Sergey Troshin to receive a scholarship

    Sergey Troshin, a graduate student at BayesGroup, was awarded a personal scholarship for his contribution to Computer Science. The scholarship was recently established by the Faculty of Computer Science of HSE University for its students.

  9. Dr. Daniil Polykovskiy

    On May 20, Daniil Polykovskiy defended his thesis on "Generative Models for Drug Discovery". The dissertation was prepared at HSE and supervised by Dmitry Vetrov. We congratulate Daniil, our new Doctor of Philosophy in Computer Science!
