A combined team of researchers from the University of Copenhagen and the University of Helsinki has demonstrated that it is possible to predict individual preferences based on how a person's brain responds to images placed before them.
In the experiment, participants were shown images of human faces while wearing EEG electrodes on their heads. The results could potentially alter the way media content is made available to individuals, with predictions that more precisely match their preferences. This differs from online algorithms, which guess our preferences in movies, music and even shopping habits based on what we've looked at, our search history or what we've listened to.
This discovery could also help us learn more about ourselves, making it easier for individuals to make choices based on what they actually want rather than on impulses they may later regret.
This technique can be used to provide individually tailored media. The technique, called collaborative filtering, makes use of hidden patterns in our behaviour and the behaviour of others to predict which things we may be interested in or find appealing.
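As a rough illustration, conventional user-based collaborative filtering can be sketched in a few lines. The ratings matrix, similarity measure and all values below are hypothetical stand-ins for the behavioural signals the article describes, not the researchers' actual method:

```python
import numpy as np

# Hypothetical ratings matrix: rows are users, columns are items (e.g. films).
# 0 means "not yet rated". All values here are purely illustrative.
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity between two users' rating vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def predict(ratings, user, item):
    """Predict a user's rating of an item as a similarity-weighted
    average of the other users' ratings of that item."""
    sims, vals = [], []
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        sims.append(cosine_sim(ratings[user], ratings[other]))
        vals.append(ratings[other, item])
    if not sims:
        return 0.0
    sims = np.array(sims)
    return float(sims @ np.array(vals) / sims.sum())

print(round(predict(ratings, 0, 2), 2))
```

Because user 0's tastes resemble user 1's far more than the others', the prediction for item 2 leans towards user 1's low rating of it. Brain-based collaborative filtering replaces the explicit ratings in this sketch with signals decoded from EEG.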
This idea, which could pass for a scene from a science-fiction film, combines computer science and cognitive neuroscience, and it has shown that brain-based collaborative filtering is actually possible: the results of the research could predict whether an individual would be attracted to someone even before they had seen the person.
In earlier work, the researchers placed EEG electrodes on the heads of study participants and showed them images of various faces, demonstrating that machine learning can use electrical activity from the brain to detect which faces a subject finds most attractive. This could help dating apps make better-informed choices when matching potential partners.
“Through comparing the brain activity of others, we’ve now also found it possible to predict faces each participant would find appealing prior to seeing them. In this way, we can make reliable recommendations for users—just as streaming services suggest new films or series based on the history of the users,” explains senior author Dr. Tuukka Ruotsalo of the University of Copenhagen’s Department of Computer Science.
Today, in our fast-paced world, most people do not have time to browse catalogues; customers expect service providers to make personalized recommendations that precisely match their specific needs. This has made many researchers and developers interested in developing more accurate techniques to satisfy that demand. However, the collaborative filtering techniques currently available are based on explicit behaviour, such as ratings, click behaviour, content sharing and searches, which is not a reliable way of revealing our real, underlying preferences.
“Due to social norms or other factors, users may not reveal their actual preferences through their behaviour online. Therefore, explicit behaviour may be biased. The brain signals we investigated were picked up very early after viewing, so they are more related to immediate impressions than carefully considered behaviour,” explains co-author Dr. Michiel Spapé.
“The electrical activity in our brains is an alternative and rather untapped source of information. In the longer term, the method can probably be used to provide much more nuanced information about people’s preferences than is possible today. This could be to decode the underlying reasons for a person’s liking of certain songs—which could be related to the emotions that they evoke,” explains Tuukka Ruotsalo.
In the experiment, the participants were shown a large number of images of human faces and asked to look for those that they found attractive. While doing so, their brain signals were recorded. This data was then used to train a machine learning model to distinguish between the brain activity when the participant saw a face that they found attractive versus when they saw a face that they did not find attractive.
With a different machine learning model, the brain-based data from a larger number of participants was used to calculate which new facial images each participant would find attractive. The prediction was thus based partly on the individual participant's own brain signals and partly on how other participants responded to the images.
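The two-stage setup described above can be sketched as follows. Everything here is simulated and heavily simplified: random vectors stand in for EEG feature windows, a nearest-centroid rule stands in for the trained classifier, and the weighting scheme is an assumption, since the study's actual models are not detailed in the article:

```python
import numpy as np

rng = np.random.default_rng(0)
n_participants, n_images, n_features = 5, 20, 8

# Synthetic stage-1 data: per participant, simulated EEG features for each
# viewed face, plus a label (1 = found attractive, 0 = not).
labels = rng.integers(0, 2, size=(n_participants, n_images))
signal = rng.normal(size=(n_participants, n_images, n_features))
signal += labels[:, :, None] * 1.5  # attractive faces shift the response

# Stage 1: a per-participant nearest-centroid "classifier" on brain responses.
centroids = np.stack([
    np.stack([signal[p][labels[p] == c].mean(axis=0) for c in (0, 1)])
    for p in range(n_participants)
])

def scores(p):
    """Per-image attractiveness score for participant p: distance to the
    'unattractive' centroid minus distance to the 'attractive' one."""
    d0 = np.linalg.norm(signal[p] - centroids[p, 0], axis=1)
    d1 = np.linalg.norm(signal[p] - centroids[p, 1], axis=1)
    return d0 - d1

S = np.stack([scores(p) for p in range(n_participants)])

# Stage 2: collaborative filtering over the brain-derived scores. Predict the
# target's reaction to faces they have NOT seen from other participants'
# scores, weighted by similarity on the faces the target HAS seen.
seen, unseen = slice(0, 15), slice(15, 20)

def predict_unseen(target):
    sims, rows = [], []
    for o in range(n_participants):
        if o == target:
            continue
        a, b = S[target, seen], S[o, seen]
        sims.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        rows.append(S[o, unseen])
    w = np.exp(np.array(sims))  # softmax-style weights stay positive
    w /= w.sum()
    return w @ np.stack(rows)

print(predict_unseen(0))
```

On this toy data the per-participant scores separate the two classes, and the second stage produces a score for each unseen face without using the target's own response to it, which mirrors the article's description of predicting appeal "prior to seeing" the faces.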
However, the researchers do not believe the new method's main value lies in helping advertisers and streaming services sell products or retain customers, as lead author Keith Davis points out:
“I consider our study as a step towards an era that some refer to as ‘mindful computing’, in which, by using a combination of computers and neuroscience techniques, users will be able to access unique information about themselves. Indeed, Brain-Computer Interfacing as it is known, could become a tool for understanding oneself better.”
It should be noted, however, that there is still a long way to go before this technique can be applied in real life beyond the laboratory: the researchers believe the devices must become cheaper and easier to use before they can be released to the general public for casual use. They believe this could be possible within 20 years if the current trajectory is followed.
The research team also warns that such technology more often than not comes with a significant challenge: protecting brain-based data from misuse. It is therefore important for the research community to put user data privacy, ownership of that data and the ethical use of raw EEG data on the front burner, and perhaps push for regulation that would address these challenges.