What are recommendation systems?
Recommendation systems are algorithms that select relevant products and services based on user data.
The technology is a subfield of machine learning.
The goal of collaborative filtering is to find users who have rated a given object and compute the correlation between the vectors of their ratings across all objects in the database. The k-nearest neighbors method is often used for this.
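To make this concrete, here is a minimal sketch of user-based collaborative filtering with k-nearest neighbors. The ratings matrix, user indices, and the choice of Pearson correlation as the similarity measure are illustrative assumptions; a real system would handle missing ratings and scale with a dedicated library.

```python
# Minimal sketch: user-based collaborative filtering with k-nearest
# neighbors. Ratings and names below are invented for illustration.
import numpy as np

# Rows: users, columns: items; 0 means "not rated".
ratings = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

def pearson(u, v):
    """Correlation between two users over items both have rated."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask], v[mask]
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def predict(user_idx, item_idx, k=2):
    """Similarity-weighted average rating over the k most similar
    users who actually rated the target item."""
    sims = [(pearson(ratings[user_idx], ratings[j]), j)
            for j in range(len(ratings))
            if j != user_idx and ratings[j, item_idx] > 0]
    top = sorted(sims, reverse=True)[:k]
    num = sum(s * ratings[j, item_idx] for s, j in top)
    den = sum(abs(s) for s, _ in top)
    return num / den if den else 0.0

print(predict(user_idx=0, item_idx=2))  # estimate user 0's rating of item 2
```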
At the center of a content-based model is the object itself. The algorithm does not need user ratings to work; it only needs properties that characterize the object: author, genre, country of origin, manufacturer, and so on. At the same time, not all of these properties are relevant to the consumer, so it is worth limiting the model to the main attributes.
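A sketch of that idea, assuming items are described by one-hot attribute vectors and similarity is measured with cosine distance (both are illustrative choices, not prescribed by the article): items closest to what the user already liked are recommended.

```python
# Content-based sketch: rank catalog items by attribute similarity
# to an item the user liked. The catalog below is made up.
import numpy as np

# Attribute columns: [fiction, non-fiction, domestic, imported]
items = {
    "book_a": np.array([1, 0, 1, 0], dtype=float),
    "book_b": np.array([1, 0, 0, 1], dtype=float),
    "book_c": np.array([0, 1, 1, 0], dtype=float),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

liked = "book_a"
scores = {name: cosine(items[liked], vec)
          for name, vec in items.items() if name != liked}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 2))
```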
Content-based models have recently become very popular. They do not require long training, so developers can begin recommending goods to users right away.
However, this method has disadvantages. Many users have noticed that after searching for a certain product on Google, ads offering to buy that product in some online store begin to "pursue" them. To reduce the number of complaints about the irrelevance of such ads, developers supplement the algorithms with knowledge-based models. These also do not rely on ratings, taking into account only the profiles of the user and the product.
How do recommendation systems collect data?
Data for recommendation algorithms can be collected explicitly or implicitly.
Explicit methods include asking the user to rate objects on a scale, rank them from best to worst, compare two similar products, or compile a list of favorite objects. The key point is that the user understands their data is used by the algorithms and consents to its processing.
With implicit methods, site visitors do not always realize that their actions can be used by recommendation systems. These include cookies, Google or Facebook trackers, detailed analysis of interaction with videos, and so on.
As a rule, governments in many countries require sites to notify visitors about the collection of such data. However, users do not always have the option to opt out.
Where are recommendation systems used?
As already mentioned, recommendation systems are widely used in e-commerce. With their help, online stores can suggest relevant goods to customers in a "You may also like" block or offer complementary products directly in the cart. If a product is out of stock, the algorithms can also find analogues.
Personalized recommendations are also often used in email newsletters.
Similar algorithms are used by retailers like Amazon, Ozon or Wildberries.
Large streaming services also use recommendation systems, among them Netflix, Spotify, Apple Music, Yandex.Music, YouTube, Megogo and others.
Recommendation algorithms are also widely used on social networks. Facebook, Twitter, Instagram, VKontakte and others have for years shown users content selected by algorithms. Only a few of them allow switching to a chronological feed.
What are the problems of recommendation systems?
Recommendation systems have a number of limitations. One of them is the cold start problem: the algorithm has not yet accumulated enough data to work. This is a typical situation for a new or unpopular object rated by only a few users, or for an atypical consumer whose preferences differ greatly from those of the average user.
In such cases, ratings are adjusted artificially. For example, the score is calculated not as a simple arithmetic mean but as a smoothed average. With a small number of reviews, the object's rating gravitates toward a certain "safe" average, and once enough real ratings have accumulated, the artificial averaging is switched off.
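One common way to implement such a smoothed average is Bayesian-style shrinkage, sketched below. The global mean and the weight m (how many "pseudo-votes" the prior is worth) are tunable assumptions, not values from the article.

```python
# Smoothed ("safe average") rating: with few reviews the score is
# pulled toward a global mean; the pull fades as real ratings accumulate.
def smoothed_rating(ratings, global_mean=3.5, m=10):
    n = len(ratings)
    if n == 0:
        return global_mean          # cold start: no data at all
    item_mean = sum(ratings) / n
    # m acts as the number of pseudo-votes cast at the global mean.
    return (m * global_mean + n * item_mean) / (m + n)

print(smoothed_rating([5, 5]))      # few reviews: stays near 3.5
print(smoothed_rating([5] * 200))   # many reviews: approaches 5.0
```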
Another problem of recommendation algorithms is bias. Stereotypes embedded in the algorithms, as well as user actions, can affect how information is ranked.
In 2021, Facebook's advertising algorithms showed job ads disproportionately to men and women. Twitter's automatic image-cropping tool in most cases focused on young, slim women.
In both cases the developers quickly corrected the errors, but that does not always happen. Google constantly faces criticism of how its recommendation algorithms work.
For example, search results for male and female athletes differ greatly. For men, the algorithms show articles about professional athletic achievements. For women, however, the system returns various rankings of "attractiveness" and "sexiness".
Google search results for queries about male and female athletes. Data: Google.
Not only users but also bots can influence search results. In 2018, Reddit users deliberately manipulated Google's algorithms so that a photo of former US President Donald Trump appeared for the query "idiot".
During the congressional hearing on the incident, Google CEO Sundar Pichai said that the company's employees do not interfere in the ranking of information. According to him, the algorithms do it on their own, scanning millions of search queries and ranking them by more than 200 parameters.
The shortcomings of algorithms can also be exploited by the developers of recommendation systems themselves. In October 2021, a former Facebook employee published documents indicating the deliberate use of "harmful" tools on the site. According to her, top management knew that the algorithms showed intolerance toward vulnerable groups, but the company was in no hurry to fix the errors, since such content engaged users more and increased the company's advertising revenue.