Non-standard SEO growth point: on-site search analytics

In this article, I will demonstrate how to build an additional SEO strategy using the example of one of our client’s websites.

Through data analysis, you can find out what users are searching for and satisfy their demands by creating missing landing pages.

We have been working on this project for a long time and decided to use on-site search analytics to find new growth points. On-site search is a good source of real search queries, because it shows what a user was looking for but failed to find. It also reveals which sections of the website are poorly structured or poorly presented. You can analyze on-site searches with Google Analytics and then apply the collected data to improve SEO performance.

Figure 1. Website visibility on Yandex


With a report from Google Analytics, we can find out:

  • how users perform searches on your site
  • what search queries they enter
  • whether search results help increase engagement
  • how users behave after searching
  • how to apply the received data to improve SEO

Go to Google Analytics. Then select Behavior > Site Content > All Pages.

Figure 2. All Pages

Next, in the Advanced field, specify the parameter that is used for on-site search.

Figure 3. Advanced field

To find out which pattern your site uses, go to the site and enter a query in the search box. The search parameter will appear in the URL (in our case, Search), and that is the expression to filter by. The query parameter may be a single letter or one or more words (such as term, search, or query); it all depends on the CMS or the developer's imagination.
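If you want to sanity-check the parameter programmatically, Python's standard library can split a copied URL into its query parameters. The URL and parameter names below are made-up examples, not taken from the client's site:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URL copied from the browser after an on-site search.
url = "https://example.com/catalog/?search=red+sneakers&page=1"

# Break the query string into a {parameter: [values]} dictionary.
params = parse_qs(urlparse(url).query)
print(params)        # {'search': ['red sneakers'], 'page': ['1']}

# Pick out the parameter that looks like a search parameter.
search_param = next(k for k in params
                    if k.lower() in {"q", "s", "term", "search", "query"})
print(search_param)  # search
```

Whatever name comes out is the value to enter in the Advanced filter field in Google Analytics.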

Figure 4. Parameter

Now you can see a list of all queries and pages visited by users.

Figure 5. List

The received report also includes important quantitative metrics, such as Average Time on Page and Bounce Rate.

After analyzing the above indicators, you can understand:

  • whether users are satisfied with on-site search results
  • what users find or fail to find on the site

Based on the collected data, you can create landing pages, change navigation, rename categories, etc. But this should be done wisely, taking into account demand, relevance, and positions.

Let’s take a practical example:

We have several search queries from a user. Now we want to understand if the user managed to find answers to their queries.

This will help us check whether the search works correctly, whether product characteristics and descriptions are sufficient, and more, depending on what you want to verify. But to make this clear, let's first define what parsing is.


In simple words, parsing is the automated collection of information from a site, with subsequent analysis, transformation, and presentation in a structured form, most often as a table with a data set.

To parse a site, you need a parser.

Figure 6. Meme

A website parser is any program or service that automatically collects information. Parsers crawl sites and collect data that match a given condition.
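As a toy illustration of what a parser does (not the tool we use later in this article), here is a few-line Python sketch that walks a piece of HTML and collects every link that matches a condition:

```python
from html.parser import HTMLParser

# Minimal illustration of a parser: walk HTML and collect data
# matching a condition (here, every link's href attribute).
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

html = '<ul><li><a href="/page-1">One</a></li><li><a href="/page-2">Two</a></li></ul>'
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # ['/page-1', '/page-2']
```

Real parsers add fetching, queuing, and error handling on top of this core loop, but the principle is the same.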


Above, we obtained a list of on-site search URLs from the GA report. But how many products did visitors actually find in those search results? To find out, we will use a parser. There are many such programs available on the Internet, but in this case, we will use Screaming Frog SEO Spider.


So, we will perform link parsing with the help of Screaming Frog.

Parsing steps

  • define parameters or patterns for data parsing
  • prepare XPath queries and specify them in the Screaming Frog SEO Spider settings
  • start the crawler

As you remember, we have N pages of results that users search for using on-site functionality. All these pages contain certain products. To parse the number of products from the specified pages, we need XPath.  The combination of XPath and crawler will help us quickly collect all necessary information and present it in a convenient form.
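To see how the pieces fit together before opening Screaming Frog, here is a minimal Python sketch of the same idea: one XPath query applied to each (already downloaded) results page. The URLs, markup, and class name are hypothetical stand-ins, not the client's real pages:

```python
import re
import xml.etree.ElementTree as ET

# Stand-in for the crawler stage: pretend these pages were already
# fetched. Each one shows a "Found N products" counter in its markup.
pages = {
    "https://example.com/search?q=sneakers":
        "<div><div class='sort-left'><div><span>Found 214 products</span></div></div></div>",
    "https://example.com/search?q=insoles":
        "<div><div class='sort-left'><div><span>Found 0 products</span></div></div></div>",
}

# Run one XPath query per page and record the product count.
counts = {}
for url, html in pages.items():
    span = ET.fromstring(html).find(".//div[@class='sort-left']/div/span")
    counts[url] = int(re.search(r"\d+", span.text).group())
print(counts)
```

Screaming Frog does exactly this at scale: it fetches each URL from the list and applies the extraction query to the response.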

Figure 7. Meme


Parsing is based on XPath queries. XPath is a query language that addresses a certain part of the page code (DOM) and collects the specified information from it.

To make the crawler collect the necessary information, you need to feed it URLs and generate XPath queries.


First, install an XPath extension in your browser.

Then open any link from the GA report and look for a unique element on the page. This element can be in the product card or, as in our case, on the page itself.

Figure 8. Element

Launch the extension and press the SHIFT key to select this unique element.

Figure 9. Select the element

But the parsing query is rather long. So, let’s reduce it to a readable form.

Figure 10. Query
Figure 11. Meme

Now we have //div[@class='sort-left']/div/span

// – selects matching elements anywhere in the document;

div – the tag name of the element to match;

[@class='sort-left'] – a predicate in square brackets; @ addresses an attribute, and here we require the class attribute to equal a certain value.

Be sure to check whether the query works correctly.
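One way to check the query outside the browser is to run it against a saved fragment of the page with Python's standard library. The markup below is a simplified, hypothetical version of the real page:

```python
import xml.etree.ElementTree as ET

# Stripped-down, hypothetical fragment of the page markup;
# the class name comes from your own site.
html = """
<div>
  <div class="sort-left">
    <div><span>Found 17 products</span></div>
  </div>
</div>
"""

root = ET.fromstring(html)
# The same query we will later paste into Screaming Frog.
matches = root.findall(".//div[@class='sort-left']/div/span")
print([m.text for m in matches])  # ['Found 17 products']
```

If the list comes back empty, the query does not match the markup and needs to be corrected before you start the crawl. (Note that ElementTree supports only a limited XPath subset and requires well-formed markup; for checking against real, messy HTML, the browser extension is the more reliable test.)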

Figure 12. Query check


Let’s move on to Screaming Frog. Load the list of pages to be crawled (links obtained from the GA report). To do this, click Mode > List on the main panel.

Figure 13. List

Then click Upload, and Enter Manually.

Figure 14. Enter Manually

Paste the links into the window that opens and click Next.

Figure 15. Window

Now add the obtained XPath query (//div[@class='sort-left']/div/span). Select Configuration > Custom > Extraction.

Figure 16. Extraction

In the window that opens, click Add, select XPath, paste the query, click OK, and then press Start.

Figure 17. Code

As a result of parsing with an XPath query, we received a list of data from the site pages that is convenient for further processing (the Extractor 5.1 column in the screenshot below).

Figure 18. List of pages

Then export the received data to xlsx or csv table by clicking the Export button. Save the file.

Figure 19. Export

When opening the document (xlsx or csv), you may experience problems with encoding. An example of gibberish text from the character set is shown in the screenshot.

Figure 20. Example of gibberish text

For the document to open correctly, and all characters to be readable, open the saved document in Excel, go to the Data tab, select Get external data > From text, and import your file.

Figure 21. Excel

Then select Text to Columns > Delimited.

Figure 22. Delimited

Select Comma, then click Next > Finish.

Figure 23. Finish

As a result, we get a readable document.

Figure 24. Readable document
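If you would rather skip the Excel import wizard, the same issue can often be handled in Python. A common cause of such gibberish is a UTF-8 byte order mark (BOM) at the start of the file; reading with the utf-8-sig codec strips it. The file name and column values below are illustrative:

```python
import csv

# Simulate a crawler CSV export saved as UTF-8 with a BOM -
# the BOM is what often shows up as gibberish in Excel.
with open("export.csv", "w", encoding="utf-8-sig", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Address", "Extractor 1"])
    writer.writerow(["https://example.com/search?q=shoes", "Found 2 products"])

# Reading with utf-8-sig strips the BOM, so headers come out clean.
with open("export.csv", newline="", encoding="utf-8-sig") as f:
    rows = list(csv.DictReader(f))
print(rows[0]["Address"])
```

From here the rows can be sorted and filtered directly, without Excel.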


By sorting the table of received data, we get a selection of URLs that contain only a few products. Next, check positions and keyword demand for these products. Positions can be easily checked through the service

Select Organic search > Organic keywords

Figure 25. Organic keywords
Figure 26. Organic keywords
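The sorting step described above, picking out URLs with only a few products, can be sketched like this (the threshold, URLs, and extracted text are illustrative):

```python
import re

# Hypothetical parsed rows: (URL, extracted product-count text).
rows = [
    ("https://example.com/search?q=sneakers", "Found 214 products"),
    ("https://example.com/search?q=red+laces", "Found 2 products"),
    ("https://example.com/search?q=insoles",  "Found 0 products"),
]

THRESHOLD = 3  # "few products" cut-off; pick what fits your catalog
thin_pages = [url for url, text in rows
              if int(re.search(r"\d+", text).group()) < THRESHOLD]
print(thin_pages)
```

These are the queries worth cross-checking against demand: if people search for them but land on near-empty pages, that is a candidate for a new landing page.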

If there is demand but no landing page, then you can create a new one. The created pages’ performance can be tracked using the SEOWORK tool.

Figure 27. SEOWORK

To summarize, using on-site search and tools like Google Analytics, XPath, and Screaming Frog, you can create and optimize pages to expand the product range. Why do visitors need a page where they will see only two products or none? They will not remember your site and will go where the product range is larger. So, it is important to track visitors’ behavior on your site.

Authors: Maria Pavlenko and Dmitry Fedoseev