Just how reliable are search terms for SEM and SEO results? Companies and organizations spend billions of dollars annually on SEO (search engine optimization) and SEM (search engine marketing). The power of search terms, say digital marketers, cannot be overstated.
However, many Internet marketing professionals have become increasingly frustrated with search terms. Specifically, they are frustrated by the limits of how much one can assume and predict from the search terms themselves. Internet marketing, or digital marketing, refers to marketing that occurs exclusively online.
Search terms can have several different meanings; the same term used in five separate searches can represent five different intents. SEM and SEO professionals must therefore make educated guesses about which search terms will be most effective for a given marketing initiative or campaign.
Jia Liu and Olivier Toubia carried out a study on this problem. They wrote about their study and findings in the journal Marketing Science (citation below).
Digital marketers’ challenge
The study revealed that a different approach might provide the context necessary to improve SEM and SEO programs and projects significantly.
The authors focused on digital marketers’ challenge of inferring content preferences in a more nuanced, quantified, and detailed manner.
If marketers could infer content preferences, they could plan, implement, and evaluate SEO and SEM efforts with greater precision, predictability, and effectiveness.
Search terms are often ambiguous
Prof. Liu said:
“Because of the nature of textual data in online search, inferring content preferences from search queries presents several challenges.”
“A first challenge is that search terms tend to be ambiguous; that is, consumers might use the same term in different ways.”
“A second challenge is that the number of possible keywords or queries that consumers can use is vast; and a third challenge is the sparsity of search queries. Most search queries contain only up to five words.”
A new approach for individual search terms
The authors now believe that a different approach could better provide context for individual search terms.
They used a ‘topic model.’ The model helped combine data from several different search queries and their associated search results. It then quantified the mapping between the search terms and the results.
A learning algorithm powered the model. It extracted ‘topics’ from text based on word frequency. The model was designed to establish context by relating terms that are semantically similar, which gives the system context for how each term is being used.
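As a rough illustration of the general idea (this is not the authors’ actual model, and the queries, result snippets, and topic word lists below are all hypothetical), pairing an ambiguous query with the text of its search results supplies the context needed to tell its meanings apart. A simple bag-of-words cosine comparison is enough to show the principle:

```python
from collections import Counter
import math

def context_vector(texts):
    """Bag-of-words counts over a query plus its result snippets."""
    counts = Counter()
    for text in texts:
        counts.update(text.lower().split())
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# The same ambiguous term, "jaguar", in two searches whose
# (hypothetical) result snippets supply different contexts.
search_a = context_vector([
    "jaguar",
    "jaguar luxury car dealership new models",
    "jaguar sports car price review",
])
search_b = context_vector([
    "jaguar",
    "jaguar big cat habitat rainforest",
    "jaguar wildlife conservation species",
])

# Hypothetical reference "topics" as word lists.
topic_cars = context_vector(["car dealership price models review luxury"])
topic_wildlife = context_vector(["wildlife habitat species rainforest conservation cat"])

# The result context, not the term itself, resolves the ambiguity.
print(cosine(search_a, topic_cars) > cosine(search_a, topic_wildlife))   # True
print(cosine(search_b, topic_wildlife) > cosine(search_b, topic_cars))   # True
```

The study’s actual model is far richer, but the design choice it shares with this sketch is the same: the mapping between a query and its associated results, rather than the query alone, is what carries the semantic signal.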
Monitoring search terms used on ‘Hoogle’
Professors Liu and Toubia tested various content by observing study volunteers’ search behavior in a controlled environment.
The researchers created ‘Hoogle,’ their own search engine, which served as a filter between Google and the participants. It ran all of the volunteers’ search queries and revealed how the learning algorithm might work in a real-world environment.
Prof. Toubia said:
“We were able to show that our model may be used to explain and predict consumer click-through rates in online search advertising based on the degree of alignment between the search ad copy shown on the search engine results page, and the content preferences estimated by our model.”
“In the end, what this enables digital marketers to do is better match actual search results with what users mean or intend when they key in specific search terms.”
“A Semantic Approach for Estimating Consumer Content Preferences from Online Search Queries,” Jia Liu and Olivier Toubia. Marketing Science, Copyright © 2018, INFORMS. DOI: https://doi.org/10.1287/mksc.2018.1112.