Q&A with SEMrush CEO / Cofounder, Oleg Shchegolev

Oleg Shchegolev, CEO and co-founder of SEMrush, also created SEOquake, which was released in 2006. SEOquake was the inspiration for creating something more complex, and SEMrush was born with the help of his partner, Dmitri Melnikov.

Today, SEMrush has been in the market for 10 years, with 500 employees, revenues close to $100 million, and about 2 million users worldwide.

I had the pleasure of interviewing Oleg, including on some questions beyond search.

KT: What type of inspiration, vision, and loyalty did you see in Dmitri Melnikov that made you want to go into business with him?

OS: First and foremost, Dima is my friend. Second, he is a SEMrush co-founder; he’s been here right from the very beginning. We have always believed in the product that we’ve been working on and I totally admire him as my friend, my colleague, and co-founder.

We make most decisions together as CEO & co-CEO. Our temperaments are mutually reinforcing, which goes a long way in allowing us to make balanced and informed decisions.

KT: How is your relationship with your partner Dmitri Melnikov?

OS: I’ve known Dmitri for more than 30 years. We grew up in the same neighborhood; we started programming together and over time our friendship expanded into a business partnership. We’ve gone through lots of ups and downs together.

KT: To be successful in the tech industry, do you think a person has to go to college, or can they be self-taught?

OS: Formal education greatly enhances discipline and concentration, helps you socialize and find the right and important people — to network, if you like.

But a lot of leaders I know are self-taught. Back when I was a student, people had pretty much no idea what SEO was, or how important (and complicated) it would become over the years. Students should be aware that their knowledge gets outdated fast. They need to be prepared to dedicate a lot of time to continuing self-education.

The IT industry is developing extremely quickly. That’s not to say that college education is not needed at all (it definitely doesn’t hurt to learn some basic stuff in college), but other than that…like I said, I don’t deem formal training to be absolutely essential. With all the online courses and workshops, it’s possible to study everything remotely.

KT: Does SEMrush have corporate social responsibility initiatives, such as scholarships for people who dream of tech careers, or anything else?

OS: At the moment, we work closely with American and European universities and give students the opportunity to explore SEMrush. While learning digital marketing, they familiarize themselves with our tools and work on their projects with the help of SEMrush. Representatives of our company give a lot of master classes to students, providing them with insights about digital marketing.

We organize a lot of meetups, roundtables and conferences on digital marketing and agile. We always welcome other IT companies to take part in such events. Lots of lectures, a great deal of useful insight, plenty of opportunities for networking — these are just a few reasons to come by our events.

On a different note, we certainly care about ecological sustainability. We are going to implement waste separation in all our offices, and we encourage our employees to cycle to work to reduce CO2 emissions.

It goes without saying that we give people the opportunity to work with us on paid internships. Pretty often, former interns become our full-time employees. We have plans for a large project on how to help novice specialists find work in the digital sphere, but we won’t reveal the details yet.

KT: Where is SEMrush headed in the next 5 years?

OS: We are definitely going to add more features to SEMrush, while enhancing our content tools, along with local SEO and traffic analytics (Competitive Intelligence 2.0). We want to ensure that we remain the leading digital marketing software.

As for strategic plans, we are going to strengthen our global brand. At the moment we’re working diligently to enter emerging markets such as China — right now, we are updating our databases to cover Baidu data.

We’ll also introduce tools that help figure out how to rank better on Amazon and optimize for voice search.

This is just a minor part of what is coming. There are a lot of other things we are working on, but we’ll keep them quiet for a while! Hint: 2019 is going to be a big year for us.

KT: What are your biggest pain points as an SEO thought leader? What additional support or buy-in do you think most companies struggle with to get on board? Does this result in limiting their growth opportunities?

OS: SEO is multidimensional and its development is extremely rapid. Five years ago we couldn’t even imagine that image search or voice search would be everywhere. Such technological growth involves non-stop education and creative thinking, both from my side as a thought leader and from the side of companies trying to get on board.

One more indispensable thing to get on board successfully is to have some unique feature, to understand your uniqueness and, crucially, to communicate that knowledge to your audience. Such an environment of extrinsic value, created for your customers, will also help build strong and long-term relationships with them and will directly affect customer retention.

KT: How will voice search devices impact traditional SEO in the coming years?

OS: The share of voice searches is growing, along with the number of voice-activated smart speakers. Naturally, voice searches are different from typed searches: the former are longer, and the wording is more conversational.

Voice search is about questions, prepositions, and comparisons – the same as with featured snippets. If it’s not an informational query, people are likely to search for location-based info.

Backlinko’s study claims that 40.7% of answers come from the featured snippet. Our SEO clients are putting more and more emphasis on this feature. Questions, prepositions, and comparisons dominate featured snippet results. A whopping 52% of questions have featured snippets.

KT: Can we get a dashboard or report on voice search?

OS: There’s no single dashboard or report on voice search in SEMrush (at least for now), but some of our tools help find solutions for voice search optimization. For instance, featured snippet checks or mobile optimization, both of which contribute to voice search rankings.

KT: What’s the most popular SEMrush feature and why?

OS: Everyone loves us for our Keyword Research tools, but there is so much more to SEMrush.

KT: What’s a feature of SEMrush that is less utilized and why?

OS: The features that are available in the Admin Mode only =)

KT: What is diversity like in the tech industry? Do you think there is gender bias in the tech industry?

OS: Judging by what the media says, there are indeed a lot of problems with diversity in the tech industry. At SEMrush we want our company to be equally welcoming to people of any race or gender. About half of SEMrush C-level employees are women, and we believe they are awesome specialists who help make SEMrush an industry-leading company.

Closing thoughts

My favorite answer is the last one because I am an advocate for equality and inclusivity.

In my conversation with Oleg I also learned that SEMrush’s headquarters is now in Boston.

To summarize, the success of SEMrush is built on great communication between its talented partners, a strong culture of inclusivity, and the amazing people who work there.

Source: https://searchenginewatch.com/2019/02/15/qa-with-semrush-ceo-cofounder-oleg-shchegolev/

How Google Ads is fighting click fraud

Click fraud is one of the most talked about issues affecting advertisers on Google and other Pay Per Click (PPC) platforms.

According to Click Guardian, $7.2 billion was lost to click fraud between 2016 and 2018. That’s a staggering amount that millions of advertisers are losing to fraudsters and click errors.

So what is click fraud? According to Google, it is an illegitimate action, such as an unintentional click or a click resulting from malicious software.

In fact, Google chooses not to call it click fraud and calls it ‘invalid clicks’ instead. That’s understandable considering the confusion surrounding this topic and the various reasons why some clicks may be legitimate or an error.

Google uses a number of methods to fight click fraud. These include manual reviewers, automated filters, deep research and a global team of scientists and engineers.

In this article we will look at some common types of click fraud; examples of legitimate clicks that can be mistaken for fraud; what Google is doing to fight it and what you can do to prevent and report it.

What are the common types of click fraud?

1. Manual clicks intended to increase your advertising costs

This is the type of click fraud advertisers fear most. And some estimates report that it’s the most prevalent type of click fraud.

This is when other businesses that compete on your keywords deliberately click on your ads to drive up your costs.

Whatever keywords you’re bidding on, it’s almost certain you’re not the only one bidding on them. That can often turn into a serious battle for clicks, customers, and traffic.

One of the negative effects is that everyone’s costs go up, especially when competition is high. So, with a limited budget, the funds can run out and your ads stop receiving any more clicks. Because of this, competitors have a strong incentive to click your ads and drive up your costs.

Some industries with high click fraud include insurance, personal finance services like mortgages, paid advertising, and others where cost-per-click rates can be as high as $10.

2. Manual clicks intended to increase profits for website owners hosting your ads

This type of click fraud is only applicable to advertisers that use the Google Display Network. If you run ads on this network, your ads will appear on third-party websites owned by webmasters.

For a webmaster to be able to display your ads on their website, they need to first join the Google AdSense program. And for every person that clicks your ad on their website, the webmaster earns 68% of the amount paid to Google.

So, if the cost per click is $3, then the webmaster will earn $2.04. Multiply that by a hundred clicks and that could be a healthy sum of $204 for a webmaster.
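To make the arithmetic concrete, here is a minimal Python sketch of the revenue-share calculation described above. The function name is just illustrative; the 68% default is the share figure quoted earlier.

def publisher_earnings(cost_per_click, clicks, revenue_share=0.68):
    """Estimate what a webmaster earns from clicks on ads shown via AdSense."""
    return round(cost_per_click * revenue_share * clicks, 2)

print(publisher_earnings(3.00, 1))    # 2.04 per click
print(publisher_earnings(3.00, 100))  # 204.0 for a hundred clicks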

The reason for this is that many webmasters have trouble acquiring quality traffic and will often resort to such tactics to increase their revenue, and in some cases will invite and encourage others to do the clicking.

Although this is a violation of Google’s strict policies, there’s no doubt that it happens.

Google has partnered with millions of websites and apps that provide ad results through their Google AdSense program. Before any website is accepted in the program, it is thoroughly checked to ensure it meets Google’s high third-party policy requirements.

Any placement violating these rules is immediately banned from the program. This includes any suspicious clicks that don’t provide any value for advertisers – especially when it’s used to increase revenue for the website owner.

Google will also ban publishers that generate an excessive amount of invalid traffic. If, on the other hand, this seems to be accidental, they may suspend the publisher until the problem is fixed.

3. Clicks by automated clicking tools, robots or other deceptive software

These are automated programs that run on internet servers or hijacked computers that are used for click fraud. They are programmed to create a large number of invalid clicks, impressions and traffic and are made to appear like real users.

Google uses automated filters to capture such activity, but its manual reviews are usually more effective. They have a dedicated team of specialists who hunt down and stop botnets from harming advertisers, publishers, and searchers.

Google has also partnered with the Trustworthy Accountability Group (TAG, https://www.tagtoday.net/) to create a glossary of common click fraud tactics, covering actions such as cookie-stuffing, crawler traffic, and more.

4. Accidental clicks that provide no value to the advertiser, such as the second click of a double-click

This also includes unintentional clicks on mobile, when a user tries to tap a link but taps an ad instead. Engagement for these clicks tends to be very low, and bounce rates often approach 100%.

These provide no value for advertisers and Google employs its tools to stamp out these clicks.

Google takes a proactive approach to click fraud. Any click that is deemed as invalid is automatically filtered from your reports and billing schedules – so you’re not charged for them. And if any clicks have escaped their detection filters, you may be eligible to receive credit for them. These are known as ‘invalid activity’ credit adjustments.

screenshot from Google Ads of "invalid activity"

Each click on your ad is examined by the Google system.

Examples of legitimate clicks

There are legitimate clicks that may appear as fraud. It’s important to be aware of these and not mistakenly affect the performance of your ads by making changes.

Competitors doing competitor research – it’s inevitable that competitors will click on your ads. They will do this as they carry out competitor research and it’s likely they’ll be clicking on other advertisers’ ads too. So, there’s little you can do about this.

Multiple clicks from the same IP address – if your reports show many clicks from the same IP address, it’s likely that this is because of ISP allocation. ISPs will often provide the same IP address in similar locations to a large number of users.

Returning visitors – many searchers click ads multiple times as they research products and services, whether they are comparison shopping or returning to your website for more information.

What to do if you suspect click fraud

The first step is to optimize your ads and keywords to ensure you only target relevant searches. Conversion rate is one of the best indicators of success, and investigating a low conversion rate can help surface potential invalid activity.

Set up Google Analytics to monitor any suspicious activity. Analytics provides powerful reporting and helps you track the performance of your keywords and assess any suspicious clicks from specific locations.
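As a rough illustration of what that monitoring can look like, here is a sketch, not a Google feature, assuming you have exported paid-click data to a hypothetical CSV with "city" and "clicks" columns (for example from an Analytics custom report). The idea is simply to flag locations with unusually high click volumes for manual review.

import pandas as pd

# Hypothetical export of paid clicks by location
clicks = pd.read_csv("paid_clicks_by_city.csv")

by_city = clicks.groupby("city")["clicks"].sum().sort_values(ascending=False)

# Flag cities whose click volume is more than 3x the median as worth a closer look
threshold = by_city.median() * 3
print(by_city[by_city > threshold])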

Check your Google Ads account for any invalid interactions that you have been credited for. You can do this by logging into your account and then clicking the Tools link in the top right section.

Next, under ‘Setup’ click the Billing & payments link. Finally click ‘View Transactions and Documents’.

screenshot from Google Ads about how to see if you've been credited for invalid clicks

Any invalid credits that you’ve received will be available here and will be labelled ‘invalid activity’ on the transaction history page.

Monitor invalid interactions in your account statistics. You can customize your columns by adding the invalid interaction data columns to quickly see the number and percentage of invalid clicks. You aren’t charged for these, but it helps you to quickly see the percentages.

screenshot from Google Ads, showing summary of invalid click activity

Reporting click fraud

If you’ve updated your ads, investigated your reports, and checked your billing settings, and still suspect that you have indeed been affected by click fraud, then the next step is to contact Google.

As the first step, share as much information as possible. Send a description of the invalid activity, details of why you suspect the activity to be malicious, and what you’ve done to counteract it.

Their team of specialists will use a range of click and impression tools to identify invalid activity. This can take several days to complete because of the amount of data that they have to deal with.

If this is proven to have been invalid and you’ve been charged, then you’ll receive a credit in your Google Ads account.

Summary

Google continues to improve its fraud detection capabilities and with help from advertisers this problem can be minimized.

Source: https://searchenginewatch.com/2019/02/14/how-google-ads-fighting-click-fraud/

Mobile search and video in 2019: How visible are you?

Here at Search Engine Watch we know that video content is a great way to achieve and maintain visibility online, as well as being a successful means for providing engaging content for followers and prospective customers.

A quick look at our Twitter feed is testament to that.

Last month I argued that YouTube channels could – and should – be optimized in much the same way as our videos and websites.

And Ann Smarty’s 5 YouTube optimization tips to improve your video rankings is worth a look if you want to ensure your videos are really sticking out from the crowd.

But in 2019 are there any other considerations for ensuring that our videos are visible?

We know that Google frequently tweaks its algorithm and we should assume that YouTube does too. We also know the habits of searchers and viewers change as time passes.

Today I want to turn our attention to video search visibility in the mobile context. After all, most of our search activity is mobile and most of our video viewing activity is as well. How should this affect the way we approach SEO for our video content?

How much search activity is mobile?

US-centric statistics from Statista show just how much the mobile share of organic searches has grown in the past few years.

During Q3 2013, 27% of searches were made on mobile compared to 73% on desktop, but by Q3 2018 things were firmly weighted the other way.

Now, at least 56% of searches are being accounted for by mobile.

graph showing how much the mobile share of organic searches has grown in the last couple years

How much video viewing is mobile?

With more search activity happening on mobile, can we expect the same of video activity? The short answer is: yes.

According to eMarketer, more than 80% of video viewing is expected to be on mobile in 2019.

This is up 10% in just three years and it looks like it will keep growing.

mobile video viewers and penetration worldwide, from 2016 to 2021

YouTube videos on Google – differences between desktop and mobile

So with more people searching for, and viewing, video on their handheld devices, do we need to think a little differently about how we optimize this content?

It seems logical that Google might present video differently on mobile compared to desktop. But initial differences appear to be very small – at least for the search phrase I use as an example here, ‘top youtube videos of 2018.’

search for "top youtube videos of 2018" on desktop

search for "top youtube videos of 2018" on mobile

On both desktop and mobile, the video results for this keyphrase appear presented in a carousel in around position 3 of the SERPs. Position 1 is given over to an infobox taken from an AdWeek ‘top 10’ style article and position 2 is a ‘People also ask’ box.

The first three videos that are viewable in this carousel are the same across devices – the leftmost being an embedded video in a VentureBeat article and the others being from YouTube. So Google might not be ranking things particularly differently depending on whether we search from desktop or mobile. But one instantly noticeable difference is the need to click (to scroll through the carousel) once on mobile in order to view the rightmost result and even half of the central result. In this case, the VentureBeat video certainly wins on instant visibility.

Another recent feature Google is including for video in its mobile SERPs is the preview.

If a user scrolls to the video carousel and stops momentarily, the video begins to play silently, showing selected moments. Compared to desktop – where the videos are static until played – a mobile video result may be more or less likely to be clicked depending on its preview rather than its thumbnail alone.

The lesson for content creators here is to ensure the visual quality of your video is good throughout – you can’t simply depend on your colorful custom thumbnail to get these kinds of clicks.

While the video carousels in the Google SERPs are very similar on desktop and mobile, there are other ways video is presented which are notable.

mobile search results showing a video carousel in line with text results

This mobile video listing appears in-line with the text results much the same as a conventional desktop listing.

Note that it doesn’t include the video’s description, rather opting to show who has uploaded the content and when it was posted. Subsequently, the title and thumbnail are massively important for indicating to the user that this content is relevant to them. But the name of the channel and the freshness of content will also help that decision be made.

In this case, ‘Top Trending’ as a brandname is itself very relevant for our query – and we can also see how fresh the content is.

YouTube videos on YouTube – differences between desktop and mobile

So how do things fare when searches are made on YouTube?

youtube search for "top youtube videos of 2018" on desktop

youtube search for "top youtube videos of 2018" on mobile

Again, differences are subtle. But they are there. Ranking-wise, we still see the videos we are familiar with from the earlier Google search, and they are much the same across devices.

As we might expect, though, the mobile display of YouTube.com eschews video descriptions, choosing instead to show the thumbnail alongside the title, channel name, number of views, and the age of the content.

Titles are also truncated if they exceed a certain length.

In this example, our second result’s title is cut from 57 characters down to 34, losing nearly half its length.

App vs. mobile web differences

With the growth of mobile viewing habits, it also pays to see how things potentially differ on mobile apps too.

When it comes to the SERPs as a whole, we can see the YouTube app does a better job at displaying more results above the fold – showing five full thumbnails.

In our example we can also see the app does a better job at utilizing the space next to the image – with bigger fonts and less severe editing of longer titles. The YouTubers React… video still loses some of its overlong title, but only around 5 characters.

Takeaways

It’s important to remember this isn’t a comprehensive study. Different searches may produce different results. And allowances need to be made for the diversity of size and display of mobile devices (my examples were viewed on iPhone 7).

That said, I do think certain elements of video optimization deserve more consideration in the mobile context than they do in desktop.

Here’s my hotlist:

1. Titles

Always an important aspect of video SEO. But on mobile, we need to be wary that titles are more likely to be truncated in results across mobile web and app. Try to keep them concise and with the most important aspect of the title within the first 30 characters.
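If you manage a lot of videos, a small script can flag titles that risk being cut off. This is only a sketch: the 34-character cut-off is an assumption based on the truncation observed in the example earlier, not a documented limit.

# Flag video titles that may be truncated in mobile results
MOBILE_TITLE_LIMIT = 34  # assumed cut-off, based on the example above

titles = [
    "Top 10 YouTube Videos of 2018",
    "YouTubers React to the Most Popular Videos of the Year So Far",
]

for title in titles:
    if len(title) > MOBILE_TITLE_LIMIT:
        print(f"May be truncated ({len(title)} chars): {title[:MOBILE_TITLE_LIMIT]}...")
    else:
        print(f"OK ({len(title)} chars): {title}")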

2. Previews

Thumbnails are still massively important, but the use of previews on mobile is an extra thing for video content producers to consider. Is your lighting good and consistent throughout? Simply put, does your whole video look good? On mobile, users may or may not click depending on this preview.

3. Channel brand

While mobile SERPs often tend to lose video descriptions, the channel name on which the video is published is always included – and often fairly prominently. Even if your channel name is not massively well known, is it clearly relevant for the words and phrases you want your video(s) to rank for?

4. Freshness

Similar point to the above. The date or age of the content is always visible across Google, YouTube, desktop, mobile web, and app. Ensure you are adding fresh content to your channel. Mobile users can clearly see if things are out of date.

Source: https://searchenginewatch.com/2019/01/28/mobile-search-and-video-in-2019-how-visible-are-you/

Using Python to recover SEO site traffic (Part one)

Helping a client recover from a bad redesign or site migration is probably one of the most critical jobs you can face as an SEO.

The traditional approach of conducting a full forensic SEO audit works well most of the time, but what if there was a way to speed things up? You could potentially save your client a lot of money in opportunity cost.

Last November, I spoke at TechSEO Boost and presented a technique my team and I regularly use to analyze traffic drops. It allows us to pinpoint this painful problem quickly and with surgical precision. As far as I know, there are no tools that currently implement this technique. I coded this solution using Python.

This is the first part of a three-part series. In part two, we will manually group the pages using regular expressions and in part three we will group them automatically using machine learning techniques. Let’s walk over part one and have some fun!

Winners vs losers

SEO traffic after a switch to shopify, traffic takes a hit

Last June we signed up a client that moved from Ecommerce V3 to Shopify and the SEO traffic took a big hit. The owner set up 301 redirects between the old and new sites but made a number of unwise changes like merging a large number of categories and rewriting titles during the move.

When traffic drops, some parts of the site underperform while others don’t. I like to isolate them in order to 1) focus all efforts on the underperforming parts, and 2) learn from the parts that are doing well.

I call this analysis the “Winners vs Losers” analysis. Here, winners are the parts that do well, and losers the ones that do badly.

visual analysis of winners and losers to figure out why traffic changed

A visualization of the analysis looks like the chart above. I was able to narrow down the issue to the category pages (Collection pages) and found that the main issue was caused by the site owner merging and eliminating too many categories during the move.

Let’s walk over the steps to put this kind of analysis together in Python.

You can reference my carefully documented Google Colab notebook here.

Getting the data

We want to programmatically compare two separate time frames in Google Analytics (before and after the traffic drop), and we’re going to use the Google Analytics API to do it.

Google Analytics Query Explorer provides the simplest approach to do this in Python.

  1. Head on over to the Google Analytics Query Explorer
  2. Click on the button at the top that says “Click here to Authorize” and follow the steps provided.
  3. Use the dropdown menu to select the website you want to get data from.
  4. Fill in the “metrics” parameter with “ga:newUsers” in order to track new visits.
  5. Complete the “dimensions” parameter with “ga:landingPagePath” in order to get the page URLs.
  6. Fill in the “segment” parameter with “gaid::-5” in order to track organic search visits.
  7. Hit “Run Query” and let it run
  8. Scroll down to the bottom of the page and look for the text box that says “API Query URI.”
    1. Check the box underneath it that says “Include current access_token in the Query URI (will expire in ~60 minutes).”
    2. At the end of the URL in the text box you should now see access_token=string-of-text-here. You will use this string of text in the code snippet below as  the variable called token (make sure to paste it inside the quotes)
  9. Now, scroll back up to where we built the query, and look for the parameter that was filled in for you called “ids.” You will use this in the code snippet below as the variable called “gaid.” Again, it should go inside the quotes.
  10. Run the cell once you’ve filled in the gaid and token variables to instantiate them, and we’re good to go!

First, let’s define placeholder variables to pass to the API:

metrics = ",".join(["ga:users", "ga:newUsers"])
dimensions = ",".join(["ga:landingPagePath", "ga:date"])
segment = "gaid::-5"

# Required, please fill in with your own GA information example: ga:23322342
gaid = "ga:23322342"

# Example: string-of-text-here from step 8.2
token = ""

# Example https://www.example.com or http://example.org
base_site_url = ""

# You can change the start and end dates as you like
start = "2017-06-01"
end = "2018-06-30"

The first function combines the placeholder variables we filled in above with an API URL to get Google Analytics data. We make additional API requests and merge them in case the results exceed the 10,000 limit.

import requests

def GAData(gaid, start, end, metrics, dimensions,
           segment, token, max_results=10000):
  """Creates a generator that yields GA API data
     in chunks of size `max_results`"""
  # build uri w/ params
  api_uri = "https://www.googleapis.com/analytics/v3/data/ga?ids={gaid}&"\
            "start-date={start}&end-date={end}&metrics={metrics}&"\
            "dimensions={dimensions}&segment={segment}&access_token={token}&"\
            "max-results={max_results}"
  # insert uri params
  api_uri = api_uri.format(
      gaid=gaid,
      start=start,
      end=end,
      metrics=metrics,
      dimensions=dimensions,
      segment=segment,
      token=token,
      max_results=max_results
  )
  # Using yield to make a generator in an
  # attempt to be memory efficient, since data is downloaded in chunks
  r = requests.get(api_uri)
  data = r.json()
  yield data
  if data.get("nextLink", None):
    while data.get("nextLink"):
      new_uri = data.get("nextLink")
      new_uri += "&access_token={token}".format(token=token)
      r = requests.get(new_uri)
      data = r.json()
      yield data

In the second function, we load the Google Analytics API response into a pandas DataFrame to simplify our analysis.

import pandas as pd

def to_df(gadata):
  """Takes in a generator from GAData()
     creates a dataframe from the rows"""
  df = None
  for data in gadata:
    if df is None:
      df = pd.DataFrame(
          data['rows'],
          columns=[x['name'] for x in data['columnHeaders']]
      )
    else:
      newdf = pd.DataFrame(
          data['rows'],
          columns=[x['name'] for x in data['columnHeaders']]
      )
      df = df.append(newdf)
    print("Gathered {} rows".format(len(df)))
  return df

Now, we can call the functions to load the Google Analytics data.

data = GAData(gaid=gaid, metrics=metrics, start=start,
              end=end, dimensions=dimensions, segment=segment,
              token=token)

data = to_df(data)

Analyzing the data

Let’s start by just getting a look at the data. We’ll use the .head() method of DataFrames to take a look at the first few rows. Think of this as glancing at only the top few rows of an Excel spreadsheet.

data.head(5)

This displays the first five rows of the data frame.

Most of the data is not in the right format for proper analysis, so let’s perform some data transformations.

First, let’s convert the date to a datetime object and the metrics to numeric values.

data['ga:date'] = pd.to_datetime(data['ga:date'])
data['ga:users'] = pd.to_numeric(data['ga:users'])
data['ga:newUsers'] = pd.to_numeric(data['ga:newUsers'])

Next, we will need the landing page URLs, which are relative and include URL parameters, in two additional formats: 1) as absolute URLs, and 2) as relative paths (without the URL parameters).

from urllib.parse import urlparse, urljoin

data['path'] = data['ga:landingPagePath'].apply(lambda x: urlparse(x).path)
data['url'] = data['path'].apply(lambda x: urljoin(base_site_url, x))

Now the fun part begins.

The goal of our analysis is to see which pages lost traffic after a particular date, compared to the period before that date, and which pages gained traffic after it.

The example date chosen below corresponds to the exact midpoint of our start and end variables used above to gather the data, so that the data both before and after the date is similarly sized.

We begin the analysis by grouping each URL together by their path and adding up the newUsers for each URL. We do this with the built-in pandas method: .groupby(), which takes a column name as an input and groups together each unique value in that column.

The .sum() method then takes the sum of every other column in the data frame within each group.

For more information on these methods please see the Pandas documentation for groupby.

For those who might be familiar with SQL, this is analogous to a GROUP BY clause with a SUM in the select clause.

# Change this depending on your needs
MIDPOINT_DATE = "2017-12-15"

before = data[data['ga:date'] < pd.to_datetime(MIDPOINT_DATE)]
after = data[data['ga:date'] >= pd.to_datetime(MIDPOINT_DATE)]

# Traffic totals before Shopify switch
totals_before = before[["ga:landingPagePath", "ga:newUsers"]]\
                .groupby("ga:landingPagePath").sum()
totals_before = totals_before.reset_index()\
                .sort_values("ga:newUsers", ascending=False)

# Traffic totals after Shopify switch
totals_after = after[["ga:landingPagePath", "ga:newUsers"]]\
               .groupby("ga:landingPagePath").sum()
totals_after = totals_after.reset_index()\
               .sort_values("ga:newUsers", ascending=False)

You can check the totals before and after with this code and double check with the Google Analytics numbers.

print("Traffic Totals Before: ")
print("Row count: ", len(totals_before))

print("Traffic Totals After: ")
print("Row count: ", len(totals_after))

Next up we merge the two data frames, so that we have a single column corresponding to the URL, and two columns corresponding to the totals before and after the date.

We have several options when merging data frames. Here, we use an “outer” merge, because even if a URL didn’t show up in the “before” period, we still want it to be a part of this merged dataframe. We’ll fill in the blanks with zeros after the merge.

# Comparing pages from before and after the switch
change = totals_after.merge(totals_before,
                            left_on="ga:landingPagePath",
                            right_on="ga:landingPagePath",
                            suffixes=["_after", "_before"],
                            how="outer")

change.fillna(0, inplace=True)

Difference and percentage change

Pandas dataframes make simple calculations on whole columns easy. We can take the difference of two columns or divide one column by another, and pandas will perform that operation on every row for us. We will take the difference of the two totals columns and divide by the “before” column to get the percent change before and after our midpoint date.

Using this percent_change column we can then filter our dataframe to get the winners, the losers and those URLs with no change.

change['difference'] = change['ga:newUsers_after'] - change['ga:newUsers_before']
change['percent_change'] = change['difference'] / change['ga:newUsers_before']

winners = change[change['percent_change'] > 0]
losers = change[change['percent_change'] < 0]
no_change = change[change['percent_change'] == 0]
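One caveat worth noting: after the fillna above, any URL with zero “before” traffic produces an infinite percent change, since it only existed in the “after” period. If you would rather look at those brand-new pages separately, a small filter like this sketch does the job.

import numpy as np

# URLs that only received traffic after the midpoint date show up as inf
new_pages = change[change['ga:newUsers_before'] == 0]
established_winners = winners[~np.isinf(winners['percent_change'])]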

Sanity check

Finally, we do a quick sanity check to make sure that all the traffic from the original data frame is still accounted for after all of our analysis. To do this, we simply take the sum of all traffic for both the original data frame and the two columns of our change dataframe.

# Checking that the total traffic adds up
data['ga:newUsers'].sum() == change[['ga:newUsers_after', 'ga:newUsers_before']].sum().sum()

It should be True.

Results

Sorting by the difference in our losers data frame, and taking the .head(10), we can see the top 10 losers in our analysis. In other words, these pages lost the most total traffic between the two periods before and after the midpoint date.

losers.sort_values("difference").head(10)

You can do the same to review the winners and try to learn from them.

winners.sort_values("difference", ascending=False).head(10)

You can export the losing pages to a CSV or Excel using this.

losers.to_csv("./losing-pages.csv")

This seems like a lot of work to analyze just one site – and it is!

The magic happens when you reuse this code on new clients and simply need to replace the placeholder variables at the top of the script.

In part two, we will make the output more useful by grouping the losing (and winning) pages by their types to get the chart I included above.

Source: https://searchenginewatch.com/2019/02/06/using-python-to-recover-seo-site-traffic-part-one/

Facebook is a local search engine. Are you treating it like one?

As soon as Facebook launched its Graph Search in 2013, it was only a matter of time before it became a big player in the search engine game.

But Graph Search was limited as far as optimization was concerned. The results it returned focused on providing an answer about the relationships between people, places, and things on Facebook, rather than a link that likely contained the answer to your query, as Google does.

It was an IFTTT statement on steroids that never missed a workout.

It wasn’t until 2015 that Facebook became a major player in keyword search.

A brief history on Facebook Search

2011, August: Facebook beta tests sponsored results in search feature

Since Facebook’s search was in its infancy at this time, testing sponsored results may have been too soon, as search functionality was limited to page titles, profiles, games, and so on.

So you couldn’t target based on keywords quite yet, nor could you take advantage of a searcher’s affinity. That meant the only way to target was by targeting another page or product.

facebook beta tests sponsored results in its search box, 2011

Sponsored results were short-lived, as the feature turned into competitors relentlessly targeting one another to steal traffic. In June 2013, less than a year after Sponsored Results’ formal release, they were terminated.

2013, January: Facebook launches Graph Search

Prior to 2013, Facebook Search had only returned results based on existing pages. There wasn’t much logic in the algorithm beyond matching the name of the person or page you were looking for in your query. That changed with Graph Search.

Graph Search took pages, categories, and people and put them into a revolving wheel of filters that was heavily personalized to each searcher and returned different results for everyone. It even allowed users to use natural language like “friends near me that like DC Comics” and get results for just that.

At least that’s what you would have had to search to find friends you should stay away from…

The two major flaws in this were:

  1. You had to already know what and whom you were looking for.
  2. You also had to hope that what you were looking for had a page or tag associated with it.

2014, December: Facebook adds keyword search functionality to Graph Search

This was the obvious next move by Facebook. They took what was already seemingly amazing at the time and added the ability to pull results based on keywords alone from your friends, people your friends have interacted with, or pages you like.

As far as a search engine was concerned, Facebook still had a long way to go to compete with Google.

2015, October: Facebook expands search to all 2 trillion public posts

Less than a year later, Facebook opened the floodgates of its search engine, expanding from inner-circle results to global, real-time, public news and posts. You could now search for any public post among the over 2 trillion indexed posts. Facebook had become a search engine that competed with the likes of Google for industry news and trending topics.

Viewing Facebook as a search engine

Prior to any real search functionality in Facebook, social media platforms were merely viewed as potential ranking signals and traffic boosters for your website.

Despite Matt Cutts denouncing the claim that Google uses social signals in its algorithm, Bing has gone in the opposite direction regarding how it uses engagement metrics from social media.

Facebook has been a solid directory site for local businesses to host their business pages since 2007, when the number of businesses listed on the platform was only about 100,000. It allowed businesses to be in front of one of the fastest growing social media networks in the world, but the discoverability of your business was limited to mutual connections and brand awareness.

It wasn’t until 2014 when Facebook launched a new and improved version of the 2011 Places directory that Local Business Search became a focal point for Facebook to compete with Yelp and FourSquare.

Now, when searching for a company in Facebook’s Search that’s near you, you’ll get results that are eerily similar to the local 3-pack on Google. If we know anything about local 3-packs, it’s that there’s definitely an algorithm behind them that determines which businesses get to show up and in which order.

Facebook sees over 1.5 billion searches every day, and over 600 million users visit business pages every day. It still has a way to go to reach Google’s 3.5 billion searches per day. That said, handling search queries at just over 40% of the volume of the search engine giant, as a social media platform, isn’t anything to scoff at.

Why Facebook Search is important for local businesses

Facebook has provided a different means for customers to engage with brands than a typical search engine. But now the search engine has come to Facebook and the data shows people are using it. Not only to stalk their ex but also to find businesses.

  1. Facebook has a large user base using their search engine
  2. Local businesses can be discovered using search
  3. Local businesses are ranked in search

So I guess that means we should be optimizing our local business pages to rank higher in Facebook’s search for specific queries…

Sounds a lot like local SEO. But this time, it’s not about your website or Google.

The whole reason we SEOs are obsessed with SEO is the value and opportunity it holds when 74% of buying journeys start with search engines during the consideration stage. If no one used search engines, SEO wouldn’t be much of a big deal. But they do and it is.

graph, "which of the following ways do you typically discover or find out about new products, brands, or services?"

According to a survey by Square, 52% of consumers discovered a brand through Facebook. That’s a pretty significant number to pass up.

And with the launch of Facebook Local in late 2017, the network is getting more discovery-friendly.

Optimizing for local Facebook SEO

Facebook has caught up with typical search engines and started implementing keywords in its algorithm and database. Bundle that knowledge with the fact that 600 million users visit business pages and that Facebook alone handles a whopping 1.5 billion searches every day, and it doesn’t take a genius to understand that Facebook SEO is valuable for local business discoverability on the platform.

All we have to do is crack the code of optimizing Facebook business pages. Unfortunately, it seems Facebook is a bit more secretive than Google about what is and isn’t a ranking signal in its local business Graph Search.

It’s a matter of finding out why Facebook chose to rank Lightning Landscape & Irrigation over Dynamic Earth Lawn & Landscape, LLC when Dynamic Earth is both verified and closer.

 

In most of my research, Facebook tends to heavily weight posts and pages based on user engagement. But that doesn’t mean other ranking factors don’t exist in search. We’re likely looking at around 200 ranking signals, similar to Google, but also vastly different ones.

Trying to crack this code has led many, myself included, to the idea of “optimizing your Facebook business page”. But most seem to be focused on optimizing Facebook pages to rank in other search engines rather than in Facebook itself.

While it is definitely a good idea to follow SEO best practices for the former reason, why not do both?

Facebook testing search ads

Coming into 2019, Facebook has started beta-testing search ads. They’re not keyword-based yet. Rather, they serve as extensions of news feed ads and only on supported ad formats. It’s quite an improvement from the original search ads that were abandoned in 2015.

It’s the type of subtle testing that could definitely produce some useful analytics in pursuing a full-blown search ad feature with keywords.

Related: “Facebook is expanding into Search Ads. What will this mean?”

Facebook knows two things:

1) Ad space on news feeds is decreasing.

2) More and more people are using their search feature.

The fact that Facebook is testing this doesn’t really tell me anything about local SEO signals on the platform. But it does tell me even Facebook sees a real opportunity in their own search engine. And their analytics are probably better than what we have right now.

Summary

Without any solid advice from Facebook, I think it’s time for the SEO community to start thinking about organic brand exposure through the social media platform itself. We should start viewing it as an enhanced search engine as it continues to grow and improve its search features.

What’s more, without search analytics from Facebook, there really isn’t a lot we can do with regard to optimizing for placement. At least not right now.

I’d love to see their new search ads beta really get traction and prompt Zuckerberg to consider a more SEO-friendly approach to search marketing on his own platform.

Of course, this is going to give “social media gurus” another reason to clog up my news feed with ads.

Jake Hundley is the founder of Evergrow Marketing.

Source: https://searchenginewatch.com/2019/02/11/facebook-local-search-engine/

17 top plugins and extensions for SEO

There are so many great plugins available, and it’s difficult to choose which are the best for you.

To help you decide which tools will make your work easier and more productive, I’ve asked SEO experts to share what they use.

Based on their recommendations, I compiled this list of 17 of the best plugins and extensions for SEO — and they’re all free.

Here are the top plugins and extensions recommended by experienced SEOs

1. SEO TextOptimizer

Free extension

This plugin is perfect for those who deal with content. SEO TextOptimizer lets you measure the quality of texts you create for your website based on how search engines would evaluate it.

The tool shows you topics you should develop, as well as those you’d better eliminate, so that search robots understand the text is relevant to specific queries. The plugin also suggests a list of words you could add to improve your content. The best thing is that you don’t need great SEO expertise to use it.

2. SEOquake

Free extension

With the SEOquake plugin, you can easily analyze your key SEO metrics. Moreover, the tool provides an SEO audit, backlink analysis, and other useful functions.

One of the reasons SEO professionals choose this tool is that you can get a comprehensive analysis of a SERP and even export the results. A bar appears below each search result, providing key metrics such as traffic, Alexa rank, social media shares, etc.

3. BuiltWith

Free extension

This extension lets you find what a website you are visiting at the moment is built with. It’s created to help developers, designers, and researchers to discover the technologies other pages are implementing and choose those they want to use for their sites.

The plugin tracks:

  • Widgets
  • Frameworks
  • Advertising
  • Publishing
  • Hosting
  • Analytics
  • Content Delivery Network
  • Document Standards

Experts also say it’s great that you can easily see global trends in the use of specific technologies.

4. Serpstat Plugin

Free extension

It’s an extension which helps you conduct SEO analysis of a page. Serpstat Plugin provides the most critical information on keywords, traffic, and page visibility. You can also get the report on the top 10 keywords for which your page ranks at the top of search results.

Serpstat SEO & Website Analysis Plugin now has three tabs: Page Analysis, On-page SEO parameters, and Domain Analysis. Here are the most crucial parameters you’ll get with the plugin:

  • Domain’s traffic.
  • Domain’s visibility trend for a year.
  • The number of results on Google, Bing, and Baidu.
  • The number of images on Google Image Search.
  • Alexa Rank.
  • Page speed.
  • Site start date.
  • Meta tags.
  • The number of shares on social media networks (Facebook and Pinterest).

The plugin is free, but to use it you need to create a Serpstat account if you don’t have one yet.

5. WordPress SEO by Yoast

WordPress plugin

This incredibly popular plugin by Yoast helps experts with on-site SEO needs. The tool will let you:

  • Add meta keywords, title, and description to your posts.
  • Provide clear site navigation for crawlers and users.
  • Analyze your on-page SEO. You can check your content, descriptions, and keywords.
  • See what your snippets will look like.
  • Create SEO-friendly Facebook Open Graph.

This WordPress plugin has a very quick and easy-to-use interface.

6. WAVE Evaluation Tool

Free extension

This tool evaluates web content accessibility within the Chrome and Firefox browsers. WAVE provides 100% secure and private accessibility reporting. The plugin can check password-protected, intranet, sensitive, or dynamically generated web pages.

7. Spark Content Optimizer

Free extension

Spark Content Optimizer is a tool designed to help you develop your site’s search experience. The plugin provides you with easy access to crucial data such as:

  • Monthly traffic.
  • The performance of your site for all the keywords.
  • A technical audit that analyzes more than 40 hard-to-find issues.
  • Information on backlink authority.

8. Link Redirect Trace

Free extension

It’s a great tool for tracking redirect paths. The tool analyzes HTTP headers, rel-canonicals, robots.txt, link power, and more. You can use the Link Redirect Trace extension to analyze your competitors, your on-page and off-page SEO, and other critical factors.

Here are the main tasks this plugin can help you cope with:

  • Identify and fix problems in your on-page/off-page SEO.
  • Analyze your competitors’ links.
  • See the redirect chain and fix problems to make your load time faster.
  • Check your links after your site has been redesigned or migrated.
  • Check links from affiliate and advertising networks.

9. Ap – Data Layer Inspector+

Free extension

This plugin is a perfect toolkit for digital analysts. The add-on lets you monitor, debug, and get detailed data without having to switch between the page, the code, and the developer console.

With this tool, you can inspect the dataLayer in real time, insert code into the page, analyze GA hits, ignore hits to individual properties, etc.

10. User-Agent Switcher

Free extension

The tool will help you switch quickly between user-agent strings. If you want to test how your page responds to different browsers, this plugin will let you do it. With User-Agent Switcher, you can browse with predefined user-agents or add your own.
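If you prefer to script that kind of check rather than click through a browser, here is a minimal sketch using Python’s requests library; the URL and the user-agent strings are just placeholders.

import requests

URL = "https://www.example.com"  # page you want to test

USER_AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, user_agent in USER_AGENTS.items():
    # Send the same request with different User-Agent headers and compare responses
    response = requests.get(URL, headers={"User-Agent": user_agent}, timeout=10)
    print(name, response.status_code, len(response.content))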

11. Open SEO Stats

Free extension

This extension provides quick access to the most important SEO stats. The tool will show you:

  • Traffic stats. Graphs from Alexa Rank, Quantcast Rank, Compete Rank.
  • Information on your backlinks.
  • Cached pages.
  • Indexed pages. You’ll see the number of pages indexed in Google, Bing, Yahoo, Baidu, Yandex, etc.
  • Geolocation information, such as country, city, and IP address.
  • The shares on social websites.
  • Meta information, such as title, meta keywords, description, canonical tags, internal links, external links, and more.

12. Velvet Blues

Free WordPress plugin

This plugin will be handy for those who move their WordPress website to another domain and need to update internal links and references to pages. The plugin helps you fix the problem and change old links on your website. Experts say it’s great that you can find and replace any URL in your WordPress database without having to use phpMyAdmin directly.

With Velvet Blues Plugin, you can:

  • Update links which are embedded in excerpts, content, or custom fields.
  • Choose whether you want to update links for attachments or not.
  • View the number of items updated.

Install it only when you need to fix something and then uninstall it, since the plugin updates every match it finds.

13. WP Rocket

Free WordPress plugin

Experts consider this plugin to be one of the best caching tools. When you use WP Rocket to cache pages, your page load time decreases and indexing improves. Moreover, the tool lets users reduce the weight of HTML, CSS, and JavaScript files.

With WP Rocket, you can also optimize your images so that they only load when visitors scroll down the page, which helps improve page speed.

14. All In One Schema.org Rich Snippets

Free WordPress plugin

This tool will be useful for those who want to get rich snippets for their web pages. The plugin is created to help you make your page stand out in Google, Bing, and Yahoo search results.

All In One Schema.org Rich Snippets supports most content types released by Schema.org. Here are eight different content types for which you can add schema:

  • Review
  • Event
  • People
  • Product
  • Recipe
  • Software Application
  • Video
  • Articles

15. Cloudflare’s plugin for WordPress

Free WordPress plugin

This free plugin helps to accelerate page loading time, improve your SEO, and protect against DDoS attacks.

The Cloudflare plugin adds value for small and medium-sized businesses, making it very easy to set up CDNs and DDoS protection, and allowing them to utilize edge SEO technologies like service workers.

16. WhatRuns

Free extension

This extension lets you find out what runs any website. You’ll get all the technologies used on websites you visit:

  • CMS
  • WordPress plugins
  • Themes
  • Analytics tools
  • Frameworks

Moreover, you can even get notified when websites start using new tools and services if you follow them.

17. Grammarly

Free extension

Both free and paid versions of this plugin are available. The tool underlines grammar, spelling, and punctuation errors so you can correct them. It also suggests synonyms for overused words and gives you tips on how to improve your texts. To get the most out of the plugin, use the paid version, as it gives you access to the most critical issues.

Choose the best for you

Remember that the more extensions you download, the slower your browser becomes. That’s why it’s essential to know which ones exactly are perfect for you.

Free WordPress plugins and Chrome extensions will help make your work easier, but you may spend a significant amount of time looking for the ones that are really useful to you. Hopefully, this list has helped you narrow down the tools you’ll try to implement in your workflow.

Source: https://searchenginewatch.com/2019/02/08/17-top-plugins-extensions-seo/

Cybersecurity in SEO: How website security affects SEO performance

Website security — or lack thereof — can directly impact your SEO performance.

Search specialists can grow complacent. Marketers often get locked into a perception of what SEO is and begin to overlook what SEO should be.

The industry has long questioned the permanent impact a website hack can have on organic performance.

And many are beginning to question the role preventative security measures might play in Google’s evaluation of a given domain.

Thanks to the introduction of the GDPR and its accompanying regulations, questions of cybersecurity and data privacy have returned to the fray.

The debate rages on. What is the true cost of an attack? To what extent will site security affect my ranking?

The truth is, a lot of businesses have yet to grasp the importance of securing their digital assets. Until now, identifying on-site vulnerabilities has been considered a different skill set from SEO. But it shouldn’t be.

Being a leader – both in thought and search performance – is about being proactive and covering the bases your competition has not.

Website security is often neglected when discussing long-term digital marketing plans. But in reality, it could be the signal that sets you apart.

When was the last time cybersecurity was discussed during your SEO site audit or strategy meeting?

How does website security affect SEO?

HTTPS was named as a ranking factor back in 2014 and has been pushed publicly through updates to the Chrome browser. Since then, it has, for the most part, become the ‘poster child’ of cybersecurity in SEO.

But as most of us know, security doesn’t stop at HTTPS. And HTTPS certainly does not mean you have a secure website.

Regardless of HTTPS certification, research shows that most websites will experience an average of 58 attacks per day. What’s more, as much as 61 percent of all internet traffic is automated — which means these attacks do not discriminate based on the size or popularity of the website in question.

No site is too small or too insignificant to attack. Unfortunately, these numbers are only rising. And attacks are becoming increasingly difficult to detect.

1. Blacklisting

If – or when – you’re targeted for an attack, direct financial loss is not the only cause for concern. A compromised website can distort SERPs and be subject to a range of manual penalties from Google.

That being said, search engines are blacklisting only a fraction of the total number of websites infected with malware.

GoDaddy’s recent report found that in 90 percent of cases, infected websites were not flagged at all.

This means the operator could be continually targeted without their knowledge – eventually increasing the severity of sanctions imposed.

Even without being blacklisted, a website’s rankings can still suffer from an attack. The addition of malware or spam to a website can only have a negative outcome.

It’s clear that those continuing to rely on outward-facing symptoms or warnings from Google might be overlooking malware that is affecting their visitors.

This creates a paradox. Being flagged or blacklisted for malware essentially terminates your website and obliterates your rankings, at least until the site is cleaned and the penalties are rescinded.

Not getting flagged when your site contains malware leads to greater susceptibility to hackers and stricter penalties.

Prevention is the only solution.

This is especially alarming considering that 9 percent, or as many as 1.7 million websites, have a major vulnerability that could allow for the deployment of malware.

If you’re invested in your long-term search visibility, operating in a highly competitive market, or heavily reliant on organic traffic, then vigilance in preventing a compromise is crucial.

2. Crawling errors

Bots will inevitably represent a significant portion of your website and application traffic.

But not all bots are benign. At least 19 percent of bots crawl websites for more nefarious purposes like content scraping, vulnerability identification, or data theft.

Even if their attempts are unsuccessful, constant attacks from automated software can prevent Googlebot from adequately crawling your site.

Malicious bots use the same bandwidth and server resources as a legitimate bot or normal visitor would.

However, if your server is subject to repetitive, automated tasks from multiple bots over a long period of time, it can begin to throttle your web traffic. In response, your server could potentially stop serving pages altogether.

If you notice strange 404 or 503 errors in Search Console for pages that aren’t missing at all, it’s possible Google tried crawling them but your server reported them as missing.

This kind of error can happen if your server is overextended.
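
If you want to confirm what your server is actually returning, a quick spot check of the flagged URLs can help. Below is a minimal sketch using only Python’s standard library; the URLs are placeholders for whatever pages Search Console reported.

```python
import urllib.error
import urllib.request

# Minimal sketch: spot-check URLs that Search Console flags as 404/503 to see
# what the server actually returns right now. The URLs here are placeholders.
urls = [
    "https://example.com/page-reported-missing",
    "https://example.com/another-flagged-page",
]

for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            print(url, response.status)
    except urllib.error.HTTPError as err:
        print(url, err.code)   # repeated 503s point to an overloaded server
    except urllib.error.URLError as err:
        print(url, "no response:", err.reason)
```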

Though their activity is usually manageable, sometimes even legitimate bots can consume resources at an unsustainable rate. If you add lots of new content, aggressive crawling in an attempt to index it may strain your server.

Similarly, it’s possible that legitimate bots may encounter a fault in your website, triggering a resource intensive operation or an infinite loop.

To combat this, most sites use server-side caching to serve pre-built versions of their site rather than repeatedly generating the same page on every request, which is far more resource intensive. This has the added benefit of reducing load times for your real visitors, which Google will approve of.
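As a rough illustration of the idea (not any particular caching plugin’s implementation), here is a minimal in-memory cache sketch: a page is rebuilt at most once per TTL, and every other request is served the stored copy. The names and TTL value are hypothetical.

```python
import time

# Minimal sketch of server-side page caching: rebuild a page at most once per
# CACHE_TTL seconds and serve the stored copy to every other request.
CACHE_TTL = 300  # seconds; tune to how often your content changes
_cache = {}      # path -> (rendered_html, timestamp)

def render_page(path):
    # Placeholder for the expensive work (database queries, templating, ...)
    return f"<html><body>Content for {path}</body></html>"

def get_page(path):
    cached = _cache.get(path)
    if cached and time.time() - cached[1] < CACHE_TTL:
        return cached[0]            # cheap: serve the pre-built copy
    html = render_page(path)        # expensive: only on a miss or expiry
    _cache[path] = (html, time.time())
    return html
```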

Most major search engines also provide a way to control the rate at which their bots crawl your site, so as not to overwhelm your servers’ capabilities.

This does not control how often a bot will crawl your site, but the level of resources consumed when they do.

To optimize effectively, you must recognize the threat against you or your client’s specific business model.

Appreciate the need to build systems that can differentiate between bad bot traffic, good bot traffic, and human activity. Done poorly, you could reduce the effectiveness of your SEO, or even block valuable visitors from your services completely.

In the second section, we’ll cover more on identifying malicious bot traffic and how to best mitigate the problem.

3. SEO spam

Over 73 percent of hacked sites in GoDaddy’s study were attacked strictly for SEO spam purposes.

This could be an act of deliberate sabotage, or an indiscriminate attempt to scrape, deface, or capitalize upon an authoritative website.

Generally, malicious actors load sites with spam to discourage legitimate visits, turn them into link farms, and bait unsuspecting visitors with malware or phishing links.

In many cases, hackers take advantage of existing vulnerabilities and get administrative access using an SQL injection.
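
To make the mechanism concrete, here is a minimal sketch of how an injection works and why parameterized queries shut it down. It uses Python’s sqlite3 purely as a stand-in for whatever database a real site runs on; the table and payload are illustrative.

```python
import sqlite3

# Minimal sketch of the flaw and the fix, using sqlite3 as a stand-in for the
# site's real database. Assume user_input arrives from a login or search form.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query's logic,
# so the WHERE clause matches every row instead of a single user.
unsafe_query = "SELECT * FROM users WHERE name = '" + user_input + "'"

# Safer: a parameterized query treats the payload as a plain literal value.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(unsafe_query, rows)
```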

This type of targeted attack can be devastating. Your site will be overrun with spam and potentially blacklisted. Your customers will be manipulated. The reputation damages can be irreparable.

Other than blacklisting, there is no direct SEO penalty for website defacements. However, the way your website appears in the SERP changes. The final damages depend on the alterations made.

But it’s likely your website won’t be relevant for the queries it used to be, at least for a while.

Say an attacker gets access and implants a rogue process on your server that operates outside of the hosting directory.

They could potentially have unfettered backdoor access to the server and all of the content hosted therein, even after a file clean-up.

Using this, they could run and store thousands of files – including pirated content – on your server.

If this became popular, your server resources would be used mainly for delivering this content. This will massively reduce your site speed, not only losing the attention of your visitors, but potentially demoting your rankings.

Other SEO spam techniques include the use of scraper bots to steal and duplicate content, email addresses, and personal information. Whether you’re aware of this activity or not, your website could eventually be hit by penalties for duplicate content.

How to mitigate SEO risks by improving website security

Though the prospect of these attacks can be alarming, there are steps that website owners and agencies can take to protect themselves and their clients. Here, proactivity and training are key in protecting sites from successful attacks and safeguarding organic performance in the long-run.

1. Malicious bots 

Unfortunately, most malicious bots do not follow standard protocols when it comes to web crawlers. This obviously makes them harder to deter. Ultimately, the solution is dependent on the type of bot you’re dealing with.

If you’re concerned about content scrapers, you can manually look at your backlinks or trackbacks to see which sites are using your links. If you find that your content has been posted without your permission on a spam site, file a DMCA complaint with Google.

In general, your best defense is to identify the source of your malicious traffic and block access from these sources.

The traditional way of doing this is to routinely analyze your log files through a tool like AWStats. This produces a report listing every bot that has crawled your website, the bandwidth consumed, total number of hits, and more.

Normal bot bandwidth usage should not surpass a few megabytes per month.

If this doesn’t give you the data you need, you can always go through your site or server log files. Using this, specifically the ‘Source IP address’ and ‘User Agent’ data, you can easily distinguish bots from normal users.
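
As one way to do that analysis, here is a minimal Python sketch that tallies hits and bandwidth per IP address and User Agent from a combined-format access log. The log path and format assumptions are hypothetical; adjust them to your server.

```python
import re
from collections import Counter

# Minimal sketch, assuming a combined-format access log at this (hypothetical) path.
LOG_PATH = "/var/log/nginx/access.log"

# combined log format: ip - - [time] "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[.*?\] ".*?" \d{3} (\d+|-) ".*?" "(.*?)"')

hits = Counter()
bandwidth = Counter()

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.match(line)
        if not match:
            continue
        ip, size, user_agent = match.groups()
        hits[(ip, user_agent)] += 1
        bandwidth[(ip, user_agent)] += 0 if size == "-" else int(size)

# The heaviest consumers are the first candidates to investigate or block.
for (ip, agent), count in hits.most_common(10):
    print(f"{count:>7} hits  {bandwidth[(ip, agent)]:>12} bytes  {ip}  {agent}")
```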

Malicious bots might be more difficult to identify as they often mimic legitimate crawlers by using the same or similar User Agent.

If you’re suspicious, you can do a reverse DNS lookup on the source IP address to get the hostname of the bot in question.

The IP addresses of major search engine bots should resolve to recognizable host names like ‘*.googlebot.com’ or ‘*.search.msn.com’ for Bing.
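
Here is a minimal sketch of that check in Python: a reverse DNS lookup on the source IP, followed by a forward lookup to confirm the hostname resolves back to the same address (a spoofed User Agent won’t pass both steps). The example IP and the hostname suffixes are illustrative.

```python
import socket

# Minimal sketch: verify that a hit claiming to be Googlebot or Bingbot really
# comes from the search engine. Do a reverse DNS lookup on the source IP, then
# a forward lookup to confirm the hostname resolves back to the same IP.
def verify_search_bot(ip_address):
    try:
        hostname = socket.gethostbyaddr(ip_address)[0]
    except socket.herror:
        return False  # no reverse record at all is already suspicious
    if not hostname.endswith((".googlebot.com", ".google.com", ".search.msn.com")):
        return False
    try:
        return ip_address in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False

# Example IP from a published Googlebot range; a real check should use the
# source IP taken from your own log files.
print(verify_search_bot("66.249.66.1"))
```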

Additionally, malicious bots tend to ignore the robots exclusion standard. If you have bots visiting pages that are supposed to be excluded, this indicates the bot might be malicious.

2. WordPress plugins and extensions 

A huge number of compromised sites involve outdated software on the most commonly used platform – the WordPress CMS and its plugins.

WordPress security is a mixed bag. The bad news is, hackers look specifically for sites using outdated plugins in order to exploit known vulnerabilities. What’s more, they’re constantly looking for new vulnerabilities to exploit.

This can lead to a multitude of problems. If you are hacked and directory listing has not been disabled, the index pages of theme- and plugin-related directories can get into Google’s index. Even if these pages are set to 404 and the rest of the site is cleaned up, they can make your site an easy target for further bulk platform- or plugin-based hacking.

Hackers have been known to exploit this method to take control of a site’s SMTP services and send spam emails. This can lead to your domain being blacklisted by email spam databases.

If your website’s core function has any legitimate need for bulk email – whether it’s newsletters, outreach, or contacting event participants – this can be disastrous.

How to prevent this

Closing these pages from indexing via robots.txt would still leave a telling footprint. Many sites are left removing them from Google’s index manually via the URL removal request form. Along with removal from email spam databases, this can take multiple attempts and long correspondences, leaving lasting damages.

On the bright side, there are plenty of security plugins which, if kept updated, can help you in your efforts to monitor and protect your site.

Popular examples include All In One WP Security & Firewall and Sucuri Security. These can monitor and scan for potential hacking events, and they have firewall features that block suspicious visitors permanently.

Review, research, and update each plugin and script that you use. It’s better to invest the time in keeping your plugins updated than make yourself an easy target.

3. System monitoring and identifying hacks 

Many practitioners don’t try to actively determine whether a site has been hacked when accepting prospective clients. Aside from Google’s notifications and the client being transparent about their history, it can be difficult to determine.

This process should play a key role in your appraisal of existing and future business. Your findings here – both in terms of historic and current security – should factor into the strategy you choose to apply.

With 16 months of Search Console data available, it is often possible to identify past attacks such as spam injection by tracking historical impression data.
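
One rough way to do this, sketched below under the assumption that you have exported a Search Console queries report to CSV, is to scan historical queries for a spam-injection footprint. The file name, column names, and term list are all hypothetical.

```python
import csv

# Minimal sketch, assuming a Search Console "Queries" report exported as CSV
# with "Query" and "Impressions" columns. It flags queries carrying a typical
# spam-injection footprint; the term list is purely illustrative.
SPAM_TERMS = ("viagra", "casino", "payday loan", "replica watches")

with open("search_console_queries.csv", newline="") as report:
    for row in csv.DictReader(report):
        query = row["Query"].lower()
        if any(term in query for term in SPAM_TERMS):
            print(f'{row["Impressions"]:>8} impressions  {row["Query"]}')
```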

That being said, not all attacks take this form. And certain verticals naturally experience extreme traffic variations due to seasonality. Ask your client directly and be thorough in your research.

How to prevent this

To stand your best chance of identifying current hacks early, you’ll need dedicated tools to help diagnose things like crypto-mining software, phishing, and malware.

There are paid services like WebsitePulse or SiteLock that provide a single platform solution for monitoring your site, servers, and applications. Thus, if a plugin goes rogue, adds links to existing pages, or creates new pages altogether, the monitoring software will alert you within minutes.

You can also use a source code analysis tool to detect if a site has been compromised.

These inspect your PHP and other source code for signatures and patterns that match known malware code. Advanced versions of this software compare your code against ‘correct’ versions of the same files rather than scanning for external signatures. This helps catch new malware for which a detection signature may not exist.
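
A stripped-down version of that “compare against correct files” approach might look like the following sketch, which hashes each live file and compares it with a known-good baseline copy. The paths are placeholders, and real tools do considerably more than this.

```python
import hashlib
from pathlib import Path

# Minimal sketch of integrity monitoring: compare each file's hash against a
# baseline taken when the site was known to be clean. Paths are placeholders.
BASELINE_DIR = Path("/backups/known_good_site")
LIVE_DIR = Path("/var/www/site")

def digest(path):
    return hashlib.sha256(path.read_bytes()).hexdigest()

for live_file in LIVE_DIR.rglob("*.php"):
    baseline_file = BASELINE_DIR / live_file.relative_to(LIVE_DIR)
    if not baseline_file.exists():
        print("NEW FILE (inspect):", live_file)       # attackers often add files
    elif digest(baseline_file) != digest(live_file):
        print("MODIFIED (inspect):", live_file)       # contents drifted from baseline
```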

Most good monitoring services can also scan your site from multiple locations. Hacked sites often don’t serve malware to every user.

Instead, they include code that only displays it to certain users based on location, time of day, traffic source, and other criteria. By using a remote scanner that monitors multiple locations, you avoid the risk of missing an infection.

4. Local network security

Managing your local security is just as important as securing the website you’re working on. Layering security software on the site is of little use if access control is vulnerable elsewhere.

Tightening your network security is paramount, whether you’re working independently, remotely, or in a large office. The larger the network, the higher the risk of human error, and the risks of public networks cannot be overstated.

Ensure you’re adhering to standard security procedures like limiting the number of login attempts possible in a specific time-frame, automatically ending expired sessions, and eliminating form auto-fills.
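
As an example of the first of those procedures, here is a minimal sketch of login throttling: no more than a fixed number of failed attempts per IP address within a time window. The thresholds are arbitrary, and a production setup would persist the counters in a shared store rather than process memory.

```python
import time
from collections import defaultdict, deque

# Minimal sketch of login throttling: allow at most MAX_ATTEMPTS failed logins
# per IP within WINDOW_SECONDS before refusing further attempts.
MAX_ATTEMPTS = 5
WINDOW_SECONDS = 300

_failed = defaultdict(deque)  # ip -> timestamps of recent failed attempts

def login_allowed(ip_address):
    now = time.time()
    attempts = _failed[ip_address]
    while attempts and now - attempts[0] > WINDOW_SECONDS:
        attempts.popleft()          # drop attempts outside the window
    return len(attempts) < MAX_ATTEMPTS

def record_failed_login(ip_address):
    _failed[ip_address].append(time.time())
```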

Wherever you’re working, encrypt your connection with a reliable VPN.

It’s also wise to filter your traffic with a Web Application Firewall (WAF). This will filter, monitor, and block traffic to and from an application to protect against attempts at compromise or data exfiltration.

In the same way as VPN software, this can come in the form of an appliance, software, or as-a-service, and contains policies customized to specific applications. These custom policies will need to be maintained and updated as you modify your applications.

Conclusion

Web security affects everyone. If the correct preventative measures aren’t taken and the worst should happen, it will have clear, lasting consequences for the site from a search perspective and beyond.

When working intimately with a website, client, or strategy, you need to be able to contribute to the security discussion or initiate it if it hasn’t begun.

If you’re invested in a site’s SEO success, part of your responsibility is to ensure a proactive and preventative strategy is in place, and that this strategy is kept current.

The problem isn’t going away any time soon. In the future, the best SEO talent – agency, independent, or in-house – will have a working understanding of cybersecurity.

As an industry, it’s vital we help educate clients about the potential risks – not only to their SEO, but to their business as a whole.

The post Cybersecurity in SEO: How website security affects SEO performance appeared first on Search Engine Watch.

Source: https://searchenginewatch.com/2019/02/07/how-website-security-affects-seo/