Science at Bravo Systems: Our Machine Learning use cases

As big data continues to grow, machine learning is being applied in almost every field of human activity. Machine learning sits at the intersection of mathematics, computer science, and artificial intelligence: models automatically extract knowledge from the vast amounts of data available and use it to predict future trends. The most common application is solving complex problems for which a solution is difficult or infeasible to specify by hand and which cannot be solved with classical programming.

Bravo Systems uses machine learning methods to enhance different products within several business areas.

Online advertising

The most common area of machine learning application within the company is online advertising. The essential goal is to find the best match between users and ads. Given the volume of data and the dynamics of the advertising ecosystem, finding an optimal solution for the key performance parameters of advertising campaigns is a major challenge. So far, different types of algorithms with a large number of features have been tested for predicting a user's response to an ad. Furthermore, the complexity of the advertising ecosystem requires a real-time bidding system that increases the probability of winning the auction for a particular ad slot. Many supporting machine learning solutions have also been explored and developed: user clustering, data visualization, automated anomaly detection, and optimization across external platforms such as Facebook, AdWords, etc.
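To make the link between response prediction and bidding concrete, here is the textbook expected-value baseline: bid roughly what a click is worth, scaled by the predicted click probability. This is a generic illustration, not necessarily the bidding strategy used in production, and the margin parameter is an assumption for the example.

```python
def compute_bid(p_click, value_per_click, margin=0.2):
    """Expected-value bidding baseline: the predicted click probability
    times the value of a click, reduced by a target profit margin.
    (Illustrative only; real RTB strategies add pacing, shading, etc.)"""
    return p_click * value_per_click * (1.0 - margin)
```

For example, a 2% predicted click rate on a click worth $2.00 with a 20% margin yields a bid of about $0.032 per impression.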

Despite many exciting advances, practical deployment of these solutions in a real environment brings its own challenges. Given the huge amount of data available for each advertising campaign, the system must be able to work with data sets containing hundreds of millions of examples. To achieve the required scalability, efficient algorithms were chosen, implemented, and executed in a distributed environment. Prediction speed is also critical: currently, the average processing and prediction time for an individual request is about 2 ms, with several thousand predictions served per second.
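A minimal sketch of the kind of scalable click-prediction model such requirements lead to: logistic regression over hashed categorical features, trained online with SGD. The hashing trick keeps the weight vector at a fixed size no matter how many distinct feature values appear, and prediction is a handful of array lookups, which is what makes millisecond latencies feasible. The field names and toy data below are illustrative, not the production feature set.

```python
import hashlib
import numpy as np

def _bucket(s, n):
    """Deterministically map a feature string to a bucket index."""
    return int(hashlib.md5(s.encode()).hexdigest(), 16) % n

def hash_features(event, n_buckets=2**20):
    """Hashing trick: map arbitrary categorical request fields
    (site, ad, device, ...) into a fixed-size index space."""
    return sorted({_bucket(f"{k}={v}", n_buckets) for k, v in event.items()})

class HashedLogReg:
    """Logistic regression over hashed features, trained with online SGD."""
    def __init__(self, n_buckets=2**20, lr=0.1):
        self.w = np.zeros(n_buckets)
        self.b = 0.0
        self.lr = lr

    def predict(self, idx):
        """Predicted click probability for one hashed example."""
        z = self.w[idx].sum() + self.b
        return 1.0 / (1.0 + np.exp(-z))

    def update(self, idx, y):
        """One SGD step on the log loss for a single (example, label) pair."""
        g = self.predict(idx) - y
        self.w[idx] -= self.lr * g
        self.b -= self.lr * g
```

Because each update touches only the few weights an example hashes to, the model can be trained by streaming over data too large to hold in memory, or sharded across machines.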

Text Analytics

The growth of the amount of data that must be processed automatically also drives the development of techniques for analyzing human language. It is very difficult for a machine to truly understand (interpret) human language. Due to the hyperproduction of textual content on the web, more and more research is focused on Natural Language Processing (NLP) techniques, whose goal is to enable computers to understand language (text as well as speech) in much the same way human beings can. Within the company, NLP products have been developed to extract the keywords that describe a text, the named entities the text refers to, the sentiment that predominates in it, and the category it belongs to. Machine processing of natural language is extremely demanding due to the great complexity of language: frequent ambiguity in expression, the use of irony, sarcasm, etc.
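As one self-contained illustration of keyword extraction, the classical TF-IDF ranking scores a word highly when it is frequent in one document but rare across the corpus. The toy corpus and function names below are ours for the example; the company's actual pipeline is not shown here.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split a text into alphabetic word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def top_keywords(docs, doc_index, k=3):
    """Rank the words of one document by TF-IDF against the whole corpus
    and return the k highest-scoring words."""
    token_lists = [tokenize(d) for d in docs]
    n = len(docs)
    df = Counter()                       # document frequency of each word
    for toks in token_lists:
        df.update(set(toks))
    tf = Counter(token_lists[doc_index]) # term frequency in the target doc
    total = len(token_lists[doc_index])
    scores = {w: (c / total) * math.log(n / df[w]) for w, c in tf.items()}
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:k]]
```

On a three-document toy corpus, common words like "the" get an IDF of zero and drop out, while words unique to one document rise to the top.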


Cybersecurity

With the rapid development of information and communication technologies and the interconnection of huge numbers of computers and other devices such as smartphones, tablets, and various smart devices, providing security and protecting users from malicious activity on the Internet is becoming one of the most important challenges. Any Internet user, in any sphere of private life or business, can be the target of malicious cyber activity. Such activities include many illegal, criminal acts: causing financial damage to users, identity theft, data breaches, espionage, etc. Given the variety of attack types, the constant emergence of new attack techniques, and the many contexts in which attacks can occur, building a robust system for detecting malicious activity on the web is difficult. Our machine learning solution for detecting malicious websites achieves an accuracy of 96 to 99% and can be used to minimize such attacks on end users.
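To give a flavor of how such a detector can be built, here is a small sketch using lexical URL features and a logistic-regression classifier. The feature set, token list, and toy training URLs are illustrative assumptions for the example; the article does not specify which features or model the production system uses.

```python
import re
import numpy as np

# Tokens often over-represented in phishing URLs (illustrative list).
SUSPICIOUS_TOKENS = ("login", "verify", "update", "free", "secure")

def url_features(url):
    """A few simple lexical features of a URL (illustrative set)."""
    host = re.sub(r"^https?://", "", url).split("/")[0]
    return np.array([
        len(url) / 100.0,                           # normalized URL length
        sum(c.isdigit() for c in url) / len(url),   # digit ratio
        float(host.count(".")),                     # subdomain depth
        float(any(t in url.lower() for t in SUSPICIOUS_TOKENS)),
        float("@" in url or "-" in host),           # common obfuscation marks
    ])

def train_detector(urls, labels, epochs=500, lr=0.5):
    """Fit a tiny logistic-regression detector with batch gradient descent;
    returns a function mapping a URL to a maliciousness score in [0, 1]."""
    X = np.stack([url_features(u) for u in urls])
    y = np.asarray(labels, dtype=float)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return lambda u: 1.0 / (1.0 + np.exp(-(url_features(u) @ w + b)))
```

A real system would train on millions of labeled URLs with far richer features (host reputation, page content, certificates); the structure, however, is the same.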


E-commerce

The field of e-commerce has seen tremendous expansion over the past decade, especially lately due to the Covid-19 pandemic. One of the problems we solve with machine learning is product recommendation within different e-commerce platforms. The main task of a recommendation engine is to recommend the most relevant products to each user, based on the historical behavior of that user as well as the behavior of other users shopping online. Through this collective experience, recommendation engines assist the user in selecting products to purchase. In addition, dynamic product pricing and the use of discounts, coupons, and special offers driven by predictive models in the background are becoming increasingly popular.
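Combining a user's own history with other users' behavior is the idea behind collaborative filtering. A minimal item-based sketch, assuming an implicit-feedback matrix of purchases (the data and function are ours for illustration, not the production engine): score each unseen item by its cosine similarity to the items the user has already bought.

```python
import numpy as np

def recommend(ratings, user, k=2):
    """Item-based collaborative filtering.
    ratings: (n_users, n_items) matrix, 1.0 = user interacted with item.
    Returns the indices of the k unseen items most similar to the
    user's purchase history."""
    norms = np.linalg.norm(ratings, axis=0, keepdims=True)
    norms[norms == 0] = 1.0                           # avoid division by zero
    sim = (ratings.T @ ratings) / (norms.T * norms)   # item-item cosine similarity
    scores = sim @ ratings[user]                      # similarity to user's items
    scores[ratings[user] > 0] = -np.inf               # never re-recommend seen items
    return np.argsort(-scores)[:k]
```

For instance, if two users share two purchases and one of them also bought a third item, that third item is the top recommendation for the other user.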

Written by

Jelena Jokić

Data Science Engineer