Yandex Blog

Introducing Yandex’s Machine Intelligence and Research Division

Yandex proudly announces the creation of our new Machine Intelligence and Research (MIR) Division. The MIR division will function as a centralized, cross-functional unit to accelerate innovation and unify our core machine learning technologies. The MIR division will also transfer cutting-edge research from our various research teams into Yandex products and services. Yandex has tapped Misha Bilenko to head the new division, which brings together a mix of teams focusing on AI-centered technologies including:

  • MatrixNet and DaNet – Machine learning has always been at the core of Yandex consumer products and information services. In 2009, we launched MatrixNet, our proprietary machine learning platform. Today, MatrixNet is used in nearly every product and service Yandex offers. One important feature of MatrixNet is its resistance to overfitting, which allows it to take into account a very large number of factors when ranking the relevancy of search results. DaNet is the deep neural network (DNN) framework developed at Yandex that provides state-of-the-art runtime performance for many tasks that rely on deep learning.
  • Computer Vision – People learn to recognize objects at a very young age. Machines, on the other hand, must be trained to recognize objects using vast amounts of labeled and unlabeled data. Yandex’s market-leading image recognition technology uses machine learning to detect similar images in visual search results as well as perform a number of high-end vision tasks, from automotive photo analysis to predicting weather patterns using satellite imagery.
  • Speech – Yandex’s SpeechKit voice recognition technology uses machine learning to help people better communicate with devices and be more productive on the go. SpeechKit technology powers voice commands for Yandex search and is also used in Yandex’s traffic information app, Yandex.Navigator, offering motorists voice-activated control. The SpeechKit SDK enables businesses to easily integrate Yandex’s speech technologies into their productivity tools and virtual assistants.
  • Translation – With more than 90 languages in production, Yandex is one of very few companies in the world that has access to enough data to meet today’s high machine translation standards. Yandex.Translate uses machine learning throughout its stack, including unique technology for translating rare languages that don’t have enough written data to use classical methods, instead relying on linguistic structures from related popular languages to fill in the gaps.

From speech-to-speech translation to virtual assistants that chat with people and use cameras to see, the MIR division offers amazing opportunities for synthesis and cross-pollination within Yandex’s machine learning, computer vision, speech and translation technologies. By bringing team members from these core technologies together, the MIR division will improve Yandex’s machine learning and natural language processing capabilities, enhancing its products and services and ultimately delivering consumers and businesses a better experience.

Under Misha Bilenko’s guidance, the unified division will be able to integrate its top research findings across all of Yandex’s products and services. Misha joins Yandex after 10 years at Microsoft, where he led the Machine Learning Algorithms team in the Cloud and Enterprise division, following a career in the Machine Learning Group at Microsoft Research. Misha brings a unique blend of leadership skills, research expertise and machine learning knowledge to Yandex. His leadership will be instrumental as the MIR division expands Yandex’s research efforts, experimenting with new projects and pursuing longer-term goals to build the next generation of intelligent products and services.

New Yandex Service Uses Machine Learning for Hyperlocal Weather Forecast

Machine learning is Yandex's core technology. We’ve long been using it in almost all of our services — to answer users’ search queries, for machine translation, ad targeting, personal recommendations, and plotting routes on maps, among others. Since last year, our MatrixNet machine learning algorithm has been utilised for the optimisation of business processes in real enterprises — we opened Yandex Data Factory for this purpose.

Today we announce yet another application of machine learning in a new field for us — weather forecasting. For this we have developed our own forecasting technology Meteum, which will now be used in the web service and mobile application Yandex.Weather available for iOS and Android.

Basic weather forecasts are traditionally constructed using the Navier-Stokes equations. Models for describing weather are extremely complex, as they depend on a multitude of factors. Programs for their calculation consist of hundreds of thousands of lines of code and run on huge supercomputers. Nonetheless, they still make mistakes, so their forecasts need to be fine-tuned. Besides that, the complexity and resource-intensiveness of traditional calculations means that forecasts are made for relatively large regions and cities. Constructing a precise forecast for, say, a small village would require taking into account a large number of local factors – such as solar radiation, phase transitions of water vapour, or thermal radiation from the soil. Performing this task using traditional methods is not much less resource-intensive than for a large city, while the number of people using such a forecast is much lower.

Machine learning makes it possible to collate a large volume of historical data about forecasts and actual weather, identify regularities in forecasting errors, and correct them. This is quicker and easier, as it doesn’t require modelling the laws of nature anew for each forecast; it simply corrects traditional mathematical models and localises the forecast down to a specific latitude and longitude. That’s exactly what Meteum does.
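The idea of learning a correction on top of a physical model can be sketched in a few lines. This is an illustrative toy, not Meteum itself: the data below are synthetic, and scikit-learn’s gradient boosting stands in for the proprietary MatrixNet.

```python
# Sketch: correcting a physical model's forecast with gradient boosting.
# Synthetic data stands in for real NWP output and station observations.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 2000

# Features: the raw model forecast plus local factors (latitude, hour of day).
raw_forecast = rng.uniform(-10, 25, n)   # model temperature forecast, deg C
latitude = rng.uniform(50, 60, n)
hour = rng.integers(0, 24, n)

# Pretend the true weather deviates from the model in a location- and
# time-dependent way, plus irreducible noise.
observed = (raw_forecast
            + 0.3 * (latitude - 55)
            - 1.5 * np.cos(2 * np.pi * hour / 24)
            + rng.normal(0, 0.5, n))

X = np.column_stack([raw_forecast, latitude, hour])
model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X[:1500], observed[:1500])

# The corrected forecast should beat the raw model output on held-out data.
raw_err = np.mean(np.abs(raw_forecast[1500:] - observed[1500:]))
ml_err = np.mean(np.abs(model.predict(X[1500:]) - observed[1500:]))
print(f"raw model MAE: {raw_err:.2f}, corrected MAE: {ml_err:.2f}")
```

The key point is that the learner never solves the physics; it only learns where the physical model is systematically wrong for a given location and time, which is far cheaper than refining the simulation itself.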


Our new technology uses traditional meteorological models to process the initial data, and works with the intermediate results using Yandex’s machine learning technology MatrixNet. To calculate the weather, Meteum constantly compares forecasts with actual weather conditions — more than 140,000 times a day. To learn about current weather conditions, we use meteorological station data, as well as weather information from other sources that indirectly indicate the situation — about 9 terabytes of data every day. One of these sources is our users, who can let us know about discrepancies between forecasts and real weather conditions via the app. The more data we receive from them, the more precise Meteum’s forecasts will become.


Meteum calculates a new forecast each time a user consults Yandex.Weather on their desktop or mobile device. It locates a person and shows them a fresh forecast for precisely that location. The user can choose another place and time for the forecast to see what the weather will be like around their office in an hour or if it might rain when they go out of town in the evening.

Meteum currently works in 36 regions of Russia, with a possibility to expand to other regions or countries.

Yandex’s School of Data Analysis Joins LHCb Collaboration

The Yandex School of Data Analysis has joined in collaboration with CERN’s Large Hadron Collider beauty (LHCb) experiment. The project is one of four large particle detector experiments at the Large Hadron Collider, and collects data to study the interactions of heavy particles, called b-hadrons.

As a result of this collaboration, the LHCb researchers will receive continuous support for existing applications (EventIndex, EventFilter), along with new services developed for the LHCb by the Yandex School of Data Analysis. YSDA will contribute its data processing skills and capabilities, and perform interdisciplinary research and development at the intersection of physics and data science that will serve the aims and needs of the LHCb experiment.


LHCb experiment. Photo by Tim Parchikov.

The researchers at the LHCb experiment are seeking, among other things, to explain the imbalance of matter and antimatter in the observable universe. This programme requires collecting, processing and analysing a very large amount of data. Yandex has already been contributing its search technologies, computing capabilities and machine-learning methods to the LHCb experiment since 2011, helping the physicists gain quick access to the data they need. Since January 2013, Yandex has been providing its core machine-learning technology MatrixNet for the needs of particle physics as an associate member of CERN openlab, CERN’s collaboration with industrial partners.

The Yandex School of Data Analysis is now part of the game, with its exceptional talent, a strong tradition in hard-core mathematics, and proven experience of converting new theoretical knowledge into practical solutions. The YSDA is the only member of the LHCb collaboration that does not specialise in physics. Other collaborators in the project include such prestigious institutions as MIT (USA), EPFL (Switzerland), and the University of Oxford and Imperial College London (UK).

The Yandex School of Data Analysis is a free Master’s-level program in computer science and data analysis, offered by Yandex since 2007 to graduates in engineering, mathematics, computer science or related fields. It trains specialists in data analysis and information retrieval. The school’s program includes courses in machine learning, data structures and algorithms, computational linguistics and other related subjects. It runs a number of joint programs, both at Master’s and PhD levels, with leading education and research institutions including the Moscow Institute of Physics and Technology, the National Research University Higher School of Economics (HSE), and the Department of Mechanics and Mathematics of Moscow State University. In seven years, the Yandex School of Data Analysis has prepared more than 320 specialists.

Yandex Data Factory Opens for Business

As far as the laws of mathematics refer to reality, they are not certain,

and as far as they are certain, they do not refer to reality.

Albert Einstein

A search engine is all about very big data and very advanced mathematics. What we have been doing here at Yandex for more than 17 years is developing and implementing technologies and algorithms that, out of billions of pages on the internet, pick the one that answers a web user’s question or solves their problem.

The technologies that power our search are based on machine learning – an approach that automates decision-making. Our core machine learning technology, MatrixNet, not only decides on its own, based on previous experience, whether a certain piece of information is a good answer to a user’s question, but does so even from relatively limited experience.

Now that it is clear our technologies can be put to use in spheres other than internet search, we are ready to offer them for a wider range of applications.

Today, at the LeWeb innovation conference in Paris, we’re cutting the red ribbon for Yandex Data Factory, our new B2B service for corporate and enterprise clients who would like to use our machine learning technologies to turn the large volumes of data they possess into hands-on business tools, and, by doing so, increase sales, cut costs, optimise processes, prevent losses, forecast demand, and develop new or improve existing methods of audience targeting.

We first branched out of our natural realm with our collaboration with CERN on their Large Hadron Collider beauty (LHCb) experiment. For this project we trained our MatrixNet to search for specific types of particle collisions, or events, among thousands of terabytes of information about these events registered by the detector in the LHCb. Yandex provided the LHCb researchers with instant access to the details of any specific event.

The success of this project gave us reason to believe it can be repeated in other areas. Any industry producing large amounts of data and focused on business goals could benefit from our expertise and our MatrixNet-based technologies: personalisation of search suggestions, recommendations or search results; image or speech recognition; road traffic monitoring and prediction; word form prediction and ranking for machine translation; demographic profiling for audience targeting.

Prior to today's announcement, we ran pilot projects for about a year, designing experimental custom-made solutions for clients all over the world. Most of these projects involved taking data that already existed and using it to train a MatrixNet-based model, which was then applied to new data. Depending on the client’s goal, the model could generate suggestions for buying a specific product, or predict with a high degree of accuracy, based on the behaviour of thousands or millions of shoppers with similar behaviour patterns, exactly which product would be bought.

Using this machine-learning technique, we helped one of the leading European banks increase their sales by matching each of their products that needed upselling with the best communication channel for each customer. By applying MatrixNet to behaviour data on a few million of the bank’s clients, we created a model that could predict the net present value of communicating a product to a specific client via a specific channel. This model was then applied to the bank’s new data to generate personalised product recommendations for each client, paired with a communication channel and ranked by potential net profit value. Preliminary results of the first wave of the bank’s marketing campaign, run on three million clients, were used to fine-tune the original model, which in turn was used in the second wave on a much larger number of the bank’s customers. The resulting sales increase beat the increase forecast by the bank’s own analysts by 13%.
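The ranking step of such a campaign can be sketched as follows. Everything here is hypothetical: `predict_npv` is a toy stand-in for the trained model, and the product names, channel names, and numbers are invented purely for illustration.

```python
# Hypothetical sketch: rank (product, channel) pairs for one client by the
# predicted net present value (NPV) of making that offer via that channel.
from itertools import product as cartesian

def predict_npv(client_features, product, channel):
    # Toy scoring function; a real system would call a trained model here.
    base = {"credit_card": 120.0, "deposit": 80.0, "insurance": 60.0}[product]
    channel_cost = {"email": 1.0, "sms": 2.0, "call": 15.0}[channel]
    # Pretend this client responds better to phone calls.
    affinity = client_features["activity"] * (1.5 if channel == "call" else 1.0)
    return base * affinity - channel_cost

def rank_offers(client_features, products, channels):
    scored = [
        (p, c, predict_npv(client_features, p, c))
        for p, c in cartesian(products, channels)
    ]
    # Highest predicted NPV first; the top pair is what gets sent to the client.
    return sorted(scored, key=lambda t: t[2], reverse=True)

client = {"activity": 0.4}
offers = rank_offers(client,
                     ["credit_card", "deposit", "insurance"],
                     ["email", "sms", "call"])
print(offers[0])  # best (product, channel) pair for this client
```

The design point is that the model scores every product-channel combination per client, so the campaign picks not just what to offer but how to offer it.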

The same machine-learning approach, together with our own data and expertise in geolocation, helped a road and traffic management agency make their accident predictions 30 times more accurate. To enable the agency to take measures to prevent road traffic accidents, we provided them with one-hour forecasts for traffic jams, as well as real-time alerts for high-risk traffic conditions, and visualised potential congestion on interactive maps. Using MatrixNet, we first trained predictive formulas on our own UGC information about almost 40,000 road accidents and 5bn speed tracks mined over 2.5 years, complemented by information provided by the agency: traffic data (i.e. the number of cars passing through a given segment of road at any given time), information about road conditions (type of surface, number of lanes, gradient etc.), and weather information. These formulas were then applied to larger data sets, and a predictive system for road traffic accidents was developed and deployed in the agency’s situation rooms.

Currently, we’re continuing to work on about 20 projects in various stages of completion across the globe. In essence, we're continuing to experiment, but this time we know in which direction, or rather in which directions, we are to move. While the majority of our potential partners, as well as data, come from finance, telecommunications, retail, logistics, utilities, and even the new-fangled 'smart cities', anyone who has data and a business goal can discover new opportunities brought about by mathematics. No matter what industry your business is in, mathematics will work for you. Despite what Einstein said.


MatrixNet helps CERN physicists find what they are looking for

One of the four big experiments at the world’s largest and most powerful particle accelerator, the Large Hadron Collider, is now testing Yandex’s machine learning technology, MatrixNet, on their data on B-meson decay.

This is a new stage in a long-term collaboration between the European Organization for Nuclear Research (CERN) and Yandex, which began in 2011 when the LHCb experiment started using our servers for some of their data simulation and continued in 2012 with Yandex supplying a prototype of a custom-built search tool for the LHC events. Now, Yandex’s machine learning technology is expected to help the CERN physicists boost precision levels in identifying extremely rare particle decays in the vast amount of data collected by LHCb. Comparing the number of observed events against predictions, scientists can confirm or refute their theories.


Bs0->mumu decay candidate observed at the LHCb experiment (photo by CERN)


In November last year, the LHCb researchers reported that they had observed the decay of a Bs meson into a muon-antimuon pair for the first time. The statistical significance of this observation, however, was not enough to unequivocally qualify it as a discovery. But the absence of a statistically significant number of decays in the LHCb data does not mean that they are not there. It only means that a better tool, or more data, is needed to observe them with confidence. With MatrixNet, which makes decisions about data relevancy based on a very large number of factors, the statistical significance of particle decay detection might turn out to be dramatically different. And this is one of the reasons why CERN liked the idea of using MatrixNet.
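The gain from combining many factors at once, rather than cutting on a single feature, is easy to demonstrate on toy data. In the sketch below, synthetic "events" and scikit-learn's gradient boosting stand in for real detector data and MatrixNet; the feature names and signal shifts are invented for illustration.

```python
# Sketch: separating rare signal decays from background. Each event has
# several reconstructed features; signal and background overlap in any one
# feature but separate much better when all features are combined.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 4000

# Label half the events as signal; shift its feature distributions slightly
# (think invariant mass, vertex quality, transverse momentum, isolation).
is_signal = rng.integers(0, 2, n)
shift = is_signal[:, None] * np.array([0.8, 0.7, 0.6, 0.5])
X = rng.normal(0, 1, (n, 4)) + shift
y = is_signal

clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X[:3000], y[:3000])

# Compare a cut on the single best feature against the multi-feature classifier.
auc_single = roc_auc_score(y[3000:], X[3000:, 0])
auc_multi = roc_auc_score(y[3000:], clf.predict_proba(X[3000:])[:, 1])
print(f"single-feature AUC: {auc_single:.2f}, classifier AUC: {auc_multi:.2f}")
```

With a sharper signal-versus-background separation, fewer true decays are lost and fewer background events leak in, which is exactly what raising the statistical significance of a rare decay requires.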

CERN is a very large-scale international laboratory where hypotheses, theories and models in theoretical physics are tested by running experiments and accumulating data, which can then be analysed and interpreted. Since the LHC was started in 2008, the LHCb experiment has been collecting data about over 10 billion particle events per year. When the LHC stops for an upgrade in spring this year, scientists will enter the analytical phase of their research.

MatrixNet is a high-precision tool that can make a difference in the quality of results obtained during data analysis. By joining CERN openlab, a framework for testing and validating cutting-edge information technologies and services in partnership with industry at CERN, we will work this year on helping scientists find what they are looking for. As a CERN openlab Associate, our objective is to develop a service that allows the CERN researchers to use MatrixNet for their purposes without additional assistance from our engineers, as is currently required. The launch of the MatrixNet service at CERN, scheduled for May 2013, will give the physicists an opportunity to detect particle decays more precisely, while we will be able to improve our machine learning technology by running it on a very large dataset. What MatrixNet does when applied to CERN’s event data is much like what it does to build a ranking formula for Yandex’s search engine. CERN’s use of MatrixNet on their data gives us an opportunity to expand the application range for our machine learning technology beyond web search into a new field – particle physics.