How to Work Remotely Without Going Crazy

A woman shaking her head so that her long hair covers her face. Looks a bit crazy.

This is a contribution by Ronald Wolf. Ron got most of his experience consulting for web design and SEO companies like GWM.

I added line breaks, text formatting and clarified some points, especially the last paragraph. Ron approved the changes.

Many of us regard 9 to 5 office jobs as pure hell and dream of working remotely. It seems like a dream come true –

  • you can establish your own working hours
  • you can design your own working surroundings
  • and – most importantly – you can work from anywhere in the world.

But is it really that simple? Working without the strains of office cubicles and the need to punch the clock requires self-discipline, and we all know that this isn’t the most common characteristic.

Working remotely doesn’t mean you don’t have a boss anymore, and there are some challenges you’ll have to face along the way.

Taking this path also has its rules. They seem quite simple, but they are not as easy to follow as they seem.

When you are someone who is used to having a boss breathing down your neck, getting work done remotely could drive you crazy.

 

Whenever

This is the number one common mistake: the fact that you were productive in the office doesn’t mean you’re gonna stay productive at home.

Now that you don’t have strict working hours it may seem that you can work whenever you want. This is where the problem starts – when do we actually want to work?

The honest answer is usually “later”. This is how your productivity decreases, and in the end you’ll be forcing yourself to get the job done, which makes the task even harder.

It will seem to you that you have even less time for yourself than you had while working in the office.

It’s a contradiction that will certainly make you crazy after some time. To avoid that, you need to make your own working schedule, and you need to stick to it.

At the beginning, it’s good to keep the same working hours you had in the office and then to slowly adapt them to your own needs. Contrary to popular belief, this is not an easy transition!

You will need time to make it function. You need to find the time of the day when you feel most productive and make the most of it.

Only after you establish your own individual schedule will you be able to make exceptions and enjoy your fully deserved freedom.

 

Wherever

This second most common mistake affects your productivity through the lack of concentration. Even when you’re alone it can be hard to focus.

Everybody is trying to escape from depressing office cubicles, but working from your comfortable couch at home is going to make it hard for you to focus on the tasks at hand.

There are various distractions inside everybody’s home – family obligations (especially when you have kids), pets, the temptation to turn on the TV, etc.

The constant lure of distractions is why you need to set up a working space in your home that will become your new office.

You need a place where you can shut the door behind you to fully dedicate yourself to the work that needs to be done.

You should carefully consider which part of your home is most suitable to become an office.

Soft furniture that you can ‛sink into’ is not the best solution because you don’t want to fall asleep in the middle of your work. Also, your back will hurt after a few hours.

You need an organized desk and a firm chair (which doesn’t mean it shouldn’t be a comfortable one). Get rid of all distractions, including the TV set (unless your line of work requires it).

Having a broad window and a lot of natural light is always a plus, but a view can sometimes be distracting as well.

If you happen to work in a seaside town at the peak of summer, it would be a smart move to pull the shades down.

And the last thing – dress up for the occasion. You’re not in an office so you don’t need a suit and a tie, but you’re not supposed to spend an entire day in your pajamas, either.

Get up and get dressed properly. This way you won’t fall into the temptation of staying in bed with a laptop in your lap.

 

Whatever

This is not even a mistake, this is a pure delusion. There are some jobs that simply cannot be done remotely. Trying to get them done this way can only drive you completely crazy.

It’s understandable that everyone would like to travel the world and work while doing it, but first, you need to make sure you’ve got the right job for that.

The second thing is that you can’t just sit at home (or a beach) and do everything that falls into your hands. Even when you’re working in the office you need to ‛fight’ for your assignments.

It’s no secret that there is competition between co-workers, especially today when no job position is safe and it’s becoming more difficult to stand out.

When you’re working remotely you have no insight into the activities of your colleagues and you’re practically left in the dark in many cases.

The further you are from the office, the more your professional visibility decreases. People might even completely forget about you and your skills.

Every established company is well advised to highlight the personal strengths of every employee on its website and to update them frequently.

Potential customers may not be able to meet you in the office so they need to be able to meet you online – out of sight is out of mind.

 

Thank you to Chris King, Bill Marshal and Andrew Akesson for helping me with the crazy headline!

 


More Articles

How to Make Your Website Less Scary and Intrusive to Gain More Business

Creepy-looking guy wearing a red hoody squats in the dark. His eyes are light graffiti crosses, his mouth a light line. It seems there is a car in the background – we see the headlights.

When it comes to online business: are you a creepy stalker invading people’s privacy for profit?

  • Do you have dozens of trackers installed on your site?
  • Do you ask visitors to allow notifications up front, on their first visit?
  • Do you let your ads follow people around the Web in a creepy way?

It’s not just tech-savvy users who hate and block intrusive marketing tactics. Such tactics are also unethical and harm the perception of your website.

You effectively scare people away! They won’t be doing business with you whether you are

  1. just a publisher
  2. running an online store
  3. or selling services on the Web.

It’s not hard to fathom. Being creepy is actually a bad habit in private but is downright self-sabotage when you ask people to spend time, effort or money on your site.

Are You Stalking Your Visitors, You Creep?

Yeah, I know. Many marketers will advise you to “retarget” your visitors and the like to increase profits, but in doing so you actually stalk your users or let other stalkers do it.

Most business sites also use Google Analytics because it’s free and everybody loves Google. But does everybody really?

Privacy Badger blocks Facebook and Google on a website.

There is a growing unease about Google and Facebook tracking our every step online. Protecting your online privacy seems to be common sense by now.

Yet most website owners still treat their visitors’ privacy lightly. They trade user data for some free tools or scripts. Is it worth it?

Protect Users or Wreck Yourself

By now there are many tools to protect your privacy online that you can use directly in your browser – ideally Firefox, as Google, which builds Chrome, makes money off your “big data”.

Personally I use Privacy Badger by the EFF (Electronic Frontier Foundation) – an American non-profit fighting for our online rights – and Findx Privacy Control.

There is also a similar extension by the DuckDuckGo team which is probably a good start for beginners. I like the way Privacy Badger works though.

I have been using Privacy Badger for over a year. At first many sites literally broke when I visited them with privacy protection.

Many sites use literally hundreds of potential trackers on their unsuspecting users! Consider Wired.com – the website of the popular technology magazine.

239 tracking items blocked on Wired.com

I can’t link Wired.com here for security reasons! The Wired website is spyware! The FindX Privacy Control extension blocked 239 suspicious items! And guess what? The site still worked.

What does that mean? Wired used up to 239 redundant tracking scripts that you didn’t even need to load to make the website work or view the content!

Of course it’s mostly third party scripts and images. Google ads and analytics by themselves were responsible for dozens of trackers. Every image Google shows reports back on you!

Wired can get away with it, so why can’t I? Well, Wired has existed for almost 20 years and has a faithful audience. Even I visit it despite all the surveillance.

Imagine, though, a site you don’t even know that x-rays you on entry. That’s a no-no. It’s like a stranger looking under your skirt or opening your zipper.

Yet many sites add dozens or even hundreds of cookies to your local machine. Due to the European privacy law they have to ask for consent. It sometimes looks like this:

A website asks for permission to use numerous cookies for all kinds of purposes. It’s a dialog provided by Cookiebot.

This site is even one of the better examples – it just adds 60 cookies and only 34 of them “for marketing purposes”. I have seen worse ones. I’d rather decline in such cases.

Most of the cookies are used for tracking and are “unnecessary”.

By itself the idea and implementation by Cookiebot is a good one – it is also compliant with the EU privacy law. You just need to make sure to limit the number of cookies!

Lack of Privacy May Cost You Money

Some people by now think I’m one of those paranoid “tinfoil hat” nerds. I’m not advocating hiding in the woods with a bunch of survivalists though.

We need to use technology in the information age or we’ll get left behind. I still want to be able to use the Web but I don’t want to be exploited by “big data” while at it.

Even if you don’t care about privacy, you will surely admit that some of the ramifications are unsettling. You surely care about money, don’t you?

Based on your Internet activity or the data you share, you may see different pricing online. In simple terms: lack of privacy will cost you more money!

Based on your profile some products won’t even get shown to you while others may be overtly promoted.

For example, inner-city Afro-American youths are much more likely to see ads for alcohol while not being shown real estate ads.

Who are you on the Web? Are you yourself with your

  • ethnicity and skin color
  • religion or lack thereof
  • sexual orientation and gender
  • age and birth date
  • political bias and affiliations

or a carefully crafted persona made to be as likeable as possible?

Each of those very common “data points” may have some negative impact on your online and real life. For example I can’t see a lot of online content because I’m in Germany.

A lot of video content and music is limited to the United States or at least blocked in Germany as Google’s YouTube fails to pay German copyright holders.

  • To see videos or listen to music that is “not available in your country” you have to use a so-called VPN, or Virtual Private Network, like Proton VPN that hides your actual whereabouts.
  • Muslims are not only subject to discrimination on the street but also on the Web. Are you sure you want to disclose that you believe in Allah?
  • Studies show that women are much more likely to be harassed online than men. Homosexuals or transgender people are exposed to even more hate speech.
  • Some online stores show different pricing depending on your background and browsing history. You may pay more than others without realizing it.
  • Age is clearly often used to decide whether you can access some online content. It’s not only about adult topics though.

Ageism also becomes apparent when you have to pay more because you’re older. Just think of insurance policies.

When you follow the news online you might have noticed over recent years that more and more people seem to agree with you. That’s the so-called filter bubble.

Algorithms notice what you like and only show you items based on your political preferences. In the US this led to the completely unexpected presidency of Donald Trump.

Many websites collect data like age or gender routinely. Just to sign up somewhere or to buy something you need to give away vital information on yourself.

Yet an increasing number of people – potential customers – are not fond of such indiscriminate data collection, even if you have a – or rather despite your – huge privacy policy.

Thus it’s not only the Internet users who end up on your site who may lose money. You – the website owner – may lose money too when you neglect actual data protection.

I’m not even referring to the actual threat of getting sued when you don’t comply with local privacy laws. I mean losing customers because of a lack of – or downright disregard for – privacy.

You Have a Privacy Policy? Awesome! Can I Read it?

Excerpt from the Ecosia privacy policy explaining clearly that they use no third-party trackers like Google Analytics

A privacy policy – some people already regard it as a “profiling policy” – may actually backfire. Most such policies are written in undecipherable legalese only lawyers can understand after many hours of study.

Even just skimming such a wall of text written in alien language scares many people off. You only share the data with your partners, advertisers and everybody else? Back off!

There are some examples of actually human-readable privacy policies out there that you can not only understand but that also don’t scare you with their message. Sadly, they are few and far between.

It seems the more complicated a privacy policy is, the more dubious data-sharing activity it is hiding. Just think of Facebook.

While Facebook may get away with an egregious tracking record because it is too big to fail and indispensable for most people, your website may not.

Now with the new European privacy law every website serving visitors from the EU needs a plain language privacy policy. I have created one for myself. Yes, you can read it!

Know When to Ask for Private Information and Permission or if at All

One creepy yet widespread practice many business websites adhere to is asking strangers for private information or access to their mailboxes.

This is the infamous “I fcuk on the first date” mentality. You enter a site and have to close a pop-up asking for your mail address, a notification permission dialog, and a cookie consent notice.

Some sites also ask for permission to access location data – that is, where you are or where you live. No, thank you!

When I asked my colleagues on Twitter most of them mentioned these issues. Even as Web professionals with technical know-how, they are put off by such sites.

Thank you Dean Cruddace, Zack Neary-Hayes, Andrew Akesson for feedback and additional insights. Click their names for their feedback!

The privacy-oriented Firefox browser already allows you to block most of these requests altogether, out of the box:

Firefox browser permissions that allow blocking notification requests by default, along with pop-ups.

It’s a shame! These features can be very useful when used responsibly. They are not just tools for stalkers and creepy marketers!

Personally I’m trying my best to reconcile website optimization and privacy needs. You need analytics to know how your website works and whether people really view it.

What you don’t really need – except for selfish reasons – is tracking across domains and similar privacy breaches. Learn from the Facebook debacle!

Stop treating website visitors like easy prey. The goal of a website is not to make money off unsuspecting visitors by tricking them into giving away their data.

You need to convince people that you offer value. That’s marketing. Attract privacy oriented visitors to gain more business!

Stalking and selling private data is a crime, even if the laws aren’t applicable everywhere yet. Just because some sites still can get away with it does not mean it’s OK.



The Complete Anchor Text Guide for 2019 (NEW)

What is anchor text and how do you optimize it for maximum SEO performance?

That’s what this guide is all about.

You’re going to learn:

  1. How to optimize your anchor text so you get better SEO results with fewer backlinks
  2. Why modeling your competitors’ anchor text is dangerous
  3. My advanced anchor text optimization strategy

Ready to become an anchor text optimization pro?

Let’s jump in.

Need more backlinks? Get access to 7 untapped link building techniques.

What is Anchor Text?

Anchor text is the visible and clickable text in a link.

Here’s how it looks in HTML:

<a href="http://www.mywebsite.com/">my cool website</a>

The phrase “anchor text” applies to all hyperlinks, including internal and external links.
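
If you want to see the anchor text a page uses, you can pull it straight out of the HTML. Here is a minimal Python sketch – the requests and beautifulsoup4 packages are my own assumptions, not tools this guide prescribes:

# Minimal sketch: list the anchor text and href of every link on a page.
# Assumes the third-party packages requests and beautifulsoup4 are installed.
import requests
from bs4 import BeautifulSoup

def extract_anchors(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # get_text() returns the visible, clickable text – the anchor text
    return [(a.get_text(strip=True), a["href"]) for a in soup.find_all("a", href=True)]

for text, href in extract_anchors("https://www.gotchseo.com/"):
    print(repr(text), "->", href)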

This guide you’re going to read is about external anchor text.

Why?

Because it alone can make or break your SEO performance.

But before you can learn how to optimize anchor text, you need to know the history.

A Short History of Anchor Text

Let’s rewind the clock back to 2011.

In those days, keyword-rich anchor text was all the rage.

anchor text in 2011

All you had to do was build links with keyword-rich anchor text and you would rank fast.

Then Google decided to launch the first Penguin update on April 24, 2012.

In short:

Your rankings got wrecked if you were using keyword-rich anchor text.

Google Penguin Penalty

When I say “your”, I mean “my” rankings.

Many of my sites got wrecked with the first Penguin update. That’s because I was nothing more than an algorithm manipulator back then.

But in hindsight:

I’m grateful for that update because it forced me to actually learn SEO.

It also motivated me to figure out how to optimize anchor text in a safe and effective way.

But before I can show you the methods, you need to know the basics.

The first thing you need to know are the different types of anchor text you can use.

9 Different Types of Anchor Text

Here are nine different types of anchor text you can use from the safest to least safe:

1. Branded Anchors

“Branded” anchors are any anchor that uses your brand name. Here are some examples:

  • Gotch SEO
  • Nathan Gotch
  • Nathan Gotch SEO

Sentence sample: “You can learn search engine optimization at Gotch SEO.”

Branded anchors are the safest type of anchor text if you’re using a branded domain.

If you have an exact or partial match domain, you need to be careful.

More on this in a later section.

To see the power of “branded” anchors, look at any big brand’s link profile.

Here are some examples for you:

Nordstrom Anchor Text

Best Buy Anchor Text

WebMd anchor text

2. Generic Anchors

“Generic” anchors are often calls-to-action (CTAs) like:

  • click here
  • go here
  • this website

In a sentence: “Go here if you are looking for SEO information.” – “Go here” is the generic anchor text.

3. Naked Link Anchors

Any anchor that uses a raw URL is considered a “naked” link.

Here are some examples:

  • https://www.gotchseo.com
  • www.gotchseo.com
  • gotchseo.com

4. No Anchor Trick

This is a tricky little strategy I see big brands doing.

Whether purposefully or not, it’s a good idea. Here’s what it looks like:

no text anchor text

The easiest way to build “noText” anchors is through images. You can also “forget” to include an anchor within an article.

5. Image Anchors

Google uses an image’s ALT text as the anchor text for a linked image.

6. Brand + Keyword Anchor

You can diversify your anchor text profile by combining your brand name and your target keyword.

For example:

  • Gotch SEO Ahrefs
  • anchor text by Gotch SEO
  • Gotch SEO link building tactics

7. Keyword Variations

Keyword variations are perfect for diversifying your anchor text profile. They can also help drive more topical relevance to your page.

Here are some examples if my target keyword is “backlinks”:

  • what are backlinks
  • where to get backlinks
  • how do you build backlinks

8. Partial Match Anchors

Partial match anchors are similar to keyword variations. The key difference is that you’re adding generic words around the primary keyword phrase.

Here are some examples for the target keyword “anchor text”:

  • this anchor text guide
  • cool anchor text article
  • read this anchor text post

9. Exact Match Anchors

Exact match anchors are the king of all anchor text.

They have the power to increase your rankings, but also have the power to get your site penalized.

An exact match anchor is an exact match of whatever your target keyword is for the target page.

Example: if “buy backlinks” is my target keyword, then my exact match anchor would be “buy backlinks”.

Those are all the anchor text variations I recommend using.
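
To make those categories concrete, here is a rough Python sketch that buckets an anchor string into one of them. The brand name, target keyword, and matching rules are illustrative assumptions only – adapt them to your own site:

# Rough sketch: bucket an anchor string into the anchor text types above.
# The brand, keyword, and rules are illustrative assumptions, not a standard.
BRAND = "gotch seo"
TARGET_KEYWORD = "anchor text"
GENERIC = {"click here", "go here", "this website", "here", "read more"}

def classify_anchor(anchor):
    a = anchor.lower().strip()
    if not a:
        return "noText / image"
    if a.startswith(("http://", "https://", "www.")) or a.endswith((".com", ".net", ".org")):
        return "naked link"
    if BRAND in a and TARGET_KEYWORD in a:
        return "brand + keyword"
    if BRAND in a:
        return "branded"
    if a == TARGET_KEYWORD:
        return "exact match"
    if TARGET_KEYWORD in a:
        return "partial match"
    if a in GENERIC:
        return "generic"
    return "keyword variation / other"

print(classify_anchor("Gotch SEO"))               # branded
print(classify_anchor("this anchor text guide"))  # partial match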

Now let’s talk about something that will make your anchor text even more powerful.

How to Build Relevance Without Exact Match Anchors

Google recently filed a patent about: “anchor tag indexing in a web crawler system”.

Google Anchor Text Patent

Don’t worry:

I’m not going to bore you to death.

Once you get past all the technical language, there’s one big idea in this patent:

Google uses the text around your link (“annotation text“) to assign its topical relevance. It will also use the anchor text of the link to accomplish that goal as well.

So what does that mean for you?

The good news is that it’s simple and makes perfect sense.

Here’s what you need to do:

  1. Find relevant websites in your industry
  2. Get backlinks within content that’s relevant to your target page
  3. Try to place your primary keyword close to your link
  4. Use intelligent anchor text
  5. Understand that relevance is the key to link building success

Here are some examples for the target keyword “anchor text” (link placement in red):

“If you are looking for more information about anchor text go here right away.”

“Anchor text is the visible and clickable text in a link. For more in-depth information you should read this article from Gotch SEO.”

“For more in-depth information about anchor text I highly recommend this article: http://www.gotchseo.com/anchor-text/.”

Here’s the big takeaway:

Place your links in relevant content and place your primary keyword close to your link.
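
As a sanity check, you can test whether your primary keyword actually sits near the link in the surrounding copy. A small Python sketch – the ten-word window is an arbitrary assumption, since Google does not publish a distance:

# Sketch: does the primary keyword appear within a few words of the link text
# inside the surrounding "annotation text"? The 10-word window is an assumption.
import re

def keyword_near_link(paragraph, keyword, link_text, window=10):
    words = re.findall(r"\S+", paragraph.lower())
    def first_word_positions(phrase):
        first = phrase.lower().split()[0]
        return [i for i, w in enumerate(words) if first in w]
    return any(abs(k - l) <= window
               for k in first_word_positions(keyword)
               for l in first_word_positions(link_text))

para = "If you are looking for more information about anchor text go here right away."
print(keyword_near_link(para, "anchor text", "go here"))  # True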

Now the question is:

What anchor text should you use?

The Right Anchor Text to Use

There’s a common trend among the hundreds of penalized websites I’ve audited.

They almost always have aggressive anchor text percentages.

Over-Optimized Anchor Text

In fact:

It’s the first place I look when someone needs help with a penalized website.

You can analyze your anchor text distribution right now with Ahrefs.

Open up their Site Explorer tool and enter your domain.

Ahrefs site explorer

Then click on “Anchors”.

Ahrefs Anchor Text Section

Now before I go any further, I need to cover an important question:

Should You Copy Your Competitors’ Anchor Text?

My friend and fellow SEO Matt Diggity recommends this strategy.

Matt Diggity Anchor Text

In short:

You should look at the anchor text percentages of the ranking competitors and model them.

I agree with the philosophy, but there are some issues.

1. Modeling the anchor text percentages of an authoritative website is risky.

Authoritative websites have built a lot of trust. Therefore, it’s more “acceptable” for them to have high percentages of keyword-rich anchors.

If you copy them, you’ll likely get wrecked.

Why?

Because your site doesn’t have the authority and trust to do so.

2. It doesn’t take into account site-wide anchor text percentages.

How do some websites get away with aggressive exact match anchor text?

It’s because they:

  1. Have the authority and trust to do so
  2. Have a high percentage of unoptimized anchors in their site-wide profile

That’s why you can’t model them on a page-by-page level.

You have to model their entire anchor text profile.

Let’s take Moz.com for example.

If you examine their site-wide anchor text, you’ll see that most of it is branded or generic.

Moz Anchor Text

This gives them the leeway they need to be more aggressive on the page level.

Here’s their anchor text for their “anchor text” page:

Moz anchor text percentage

One could argue that this is “aggressive”. But it’s acceptable because they have authority, trust, and unoptimized anchor text across the site as a whole.

59% is huge on its own, but it’s small relative to their entire site:


So what’s the big takeaway?

You shouldn’t model your competitor’s anchor text if you don’t have authority, trust, and unoptimized anchor text across your site.

Then what should you do?

Follow these percentages and you’ll never need to worry about penalties and you’ll still get awesome results.

Safe Anchor Text Percentages That Work

These percentages are not a law. Do what’s best for your situation. However, these ratios have helped me A) avoid getting penalized and B) still drive huge results (without being risky).

  • 70% = Branded Anchors
  • 20% = Naked Link Anchors
  • 5% = Generic Anchors
  • < 5% = Partial Match Anchors
  • < 1% = Exact Match Anchors
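
Here is a small Python sketch that checks an existing anchor profile against those ratios. The example profile is made up; in practice the list would come from a backlink export (an Ahrefs CSV, a spreadsheet, etc.):

# Sketch: compare a site's current anchor distribution against the target ratios above.
from collections import Counter

TARGETS = {"branded": 0.70, "naked": 0.20, "generic": 0.05,
           "partial match": 0.05, "exact match": 0.01}

def distribution(anchor_types):
    counts = Counter(anchor_types)
    total = sum(counts.values()) or 1
    return {t: counts.get(t, 0) / total for t in TARGETS}

profile = ["branded"] * 14 + ["naked"] * 4 + ["generic", "exact match"]  # made-up data
for anchor_type, share in distribution(profile).items():
    status = "over target" if share > TARGETS[anchor_type] else "ok"
    print(f"{anchor_type:>13}: {share:.0%} (target {TARGETS[anchor_type]:.0%}) – {status}")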

Now let me introduce you to a strategy I developed for building a natural anchor text profile.

How to Rank With Fewer Backlinks

My strategy is “Anchor Text Cycling” and it works like this:

Step 1: Hit your target page with an exact match anchor

You might be wondering:

“Isn’t it dangerous to hit a brand new website with an exact match anchor?”

Nope.

Sites get penalized for their link profiles as a whole.

Not one or two links.

It’s like saying eating McDonald’s one time is the reason someone is overweight.

We know that it’s the combined effect of a bad diet over a period of time that leads to obesity.

The same goes for your link profile!

Now that I got the weird analogy out of the way…

Why do I use an exact match anchor for my first backlink?

A) I want to see how the site reacts

B) I want to establish what my site or the target page is about right away

The next step is to:

Step 2: Hit your site with unoptimized anchor text variations

Use branded, naked link, generic, and keyword variations at this stage.

Step 3: Track your rankings and watch the progress

You can get a decent read on your performance within 1-3 months. If your page isn’t moving, then you need to reassess. The answer is rarely to use more exact match anchor text.

Oftentimes pages aren’t performing well because:

  • The site isn’t strong enough
  • The page doesn’t have enough backlinks
  • The backlinks you do have are low-quality
  • The page is poorly built

If you feel you have a 10/10 on those facets, then:

Step 4: Hit your site with another exact match anchor (if necessary)

Repeat this process over and over until you rank.

The entire point of using anchor cycling is to build a diverse and natural anchor profile.

Do you want to know the secret to having a “natural” anchor profile?

The key is to be random and avoid patterns.
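
One way to keep the selection random is to draw each new anchor from the categories with weights that roughly follow the safe ratios. A Python sketch – the pool of anchors and the weights are illustrative assumptions:

# Sketch: pick the next anchors to build at random, weighted roughly by the safe
# ratios, so the profile stays diverse instead of following an obvious pattern.
import random

ANCHOR_POOL = {
    "branded":       (["Gotch SEO", "Nathan Gotch"], 70),
    "naked":         (["https://www.gotchseo.com", "gotchseo.com"], 20),
    "generic":       (["click here", "this website"], 5),
    "partial match": (["this anchor text guide", "cool anchor text article"], 4),
    "exact match":   (["anchor text"], 1),
}

def next_anchors(n=5):
    types = list(ANCHOR_POOL)
    weights = [ANCHOR_POOL[t][1] for t in types]
    return [(t, random.choice(ANCHOR_POOL[t][0]))
            for t in random.choices(types, weights=weights, k=n)]

for anchor_type, anchor in next_anchors():
    print(anchor_type, "->", anchor)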

Take a look at these two examples:

Site #1 is the typical link profile you’ll see when someone is spamming anchor text.

Google’s algorithm can easily conclude that this site is building artificial links. A manual reviewer wouldn’t even be necessary.

Site #2 has a natural and diverse anchor profile.

It will outrank Site #1 with fewer backlinks and less keyword-rich anchor text.

It doesn’t matter whether you’re doing black, grey, or white hat SEO, this concept still applies to you.

Now that you understand how to cycle your anchors, let me show you WHERE to place your anchors.

Where to Place Your Anchor Text

Every link opportunity requires a unique anchor text strategy.

This is by far the biggest misstep I see people make.

They apply anchor text ratios and use cycling, but they place their anchor text the wrong way.

I would argue that this section is the most important part for you to understand.

Here is a list of every link type and exact anchor text you should use:

Exact & Partial Match Anchors

Concentrate your exact and partial match anchor text on your best link opportunities. “Best” link opportunities are often those that are difficult and expensive to get.

Here are some examples:

1. Niche Relevant Guest Posts

If you can score a link in the body of the content, then use a keyword-rich anchor. But if you can only get an author bio link, then use branded (or unoptimized) anchor text.

guest posting dead

Google cracked down on spammy guest posting practices a long time ago. One consistent footprint is when people jam keyword-rich anchors in author bios. Don’t do this.

2. Resource Pages

Resource pages are great opportunities to place exact or partial match anchors.

resource page

Using the title of your resource is an effective (and safe) route as well.

3. Private Blog Networks (PBNs)

I don’t mess around with PBNs anymore, but if you use them, you should use exact and partial match anchors. You invested money to buy the expired domain, so you should try to get the most out of it.

Go here to get 7 of my favorite untapped link building techniques.

Where to Place Unoptimized Anchors

All foundational (non-editorial) links should use unoptimized anchor text.

Some examples include:

  • Paid Directories
  • Traditional Directories
  • Business Citations
  • Press Releases
  • Niche Relevant Blog Comments
  • Web 2.0s
  • Forum Signatures
  • Site-Wide Sidebar or Footer Links
  • Profile Links
  • Social Bookmarks
  • Donations/Sponsorships

Only use branded, naked, or generic anchor text on these link types.

Now let me show you how to handle anchor text whenever you’re using 301 redirects.

How to Handle Anchor Text from a 301 Redirect

A 301 redirect is a permanent redirect. All of the anchor text from the page or website being redirected gets transferred to the new page.
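
For reference, a 301 can be issued at the server or application level. Here is a minimal Python sketch using Flask – the framework and the URLs are assumptions for illustration, not something this guide prescribes:

# Minimal sketch: issue a 301 (permanent) redirect from an old URL to a new one.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-anchor-text-guide/")
def old_guide():
    # A 301 tells crawlers the move is permanent, so links – and the anchor
    # text pointing at the old URL – get attributed to the new page.
    return redirect("/anchor-text/", code=301)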

Why does this matter?

It matters because spammy links and aggressive anchor text will transfer to your new page or domain.

Here’s what you need to do:

  1. Avoid redirecting any garbage to your website
  2. Build unoptimized anchors to the new website

This will help combat anchor text over-optimization issues when you redirect.

How to Optimize Tier Two and Tier Three Anchor Text

You can be more liberal with your keyword-rich anchors on tier two and three.

Just don’t go crazy.

Here are tier two ratios that I stick to:

  • Naked links: 40%
  • Generic: 30%
  • LSI, Partial-Match: 25%
  • Exact Match: 5%

The same principles explained above apply here.

Concentrate your keyword-rich anchors on your best opportunities.

Here are my tier three ratios:

  • Naked links: 10%
  • Generic: 10%
  • LSI, Partial-Match: 50%
  • Exact Match: 30%

How to Optimize Anchor Text for Exact & Partial Match Domains

I always tell beginners to avoid exact or partial match domains. Why? Because they are super easy to over-optimize.

But if you already have one, let me show you how to optimize your anchor text the right way.

Here are the ratios I aim for with an exact match domain:

  • Naked Links: 70%
  • Generic: 20%
  • LSI: 5%
  • Partial Match: 1-5%
  • Branded / Exact Match Anchors: 1-5%

You’ll notice a few different things here.

First, I reduce the “Branded” anchor text percentages.

Why?

Because EMDs aren’t actually branded. They’re just keywords in a domain.

I also increase the amount of naked and generic anchor text. This helps combat over-optimization.

But there’s one thing you can do to make your life a lot easier.

Even if your domain is “exactmatchanchortext.com”, you can still create a brand name. So although your domain is “exactmatchanchortext.com”, your brand name could be “Growth Crew”.

Then you can build branded anchor text (“Growth Crew”) without any worries.

How to Fix Over-Optimized Anchor Text

Websites rarely get penalized for just having over-optimized anchor text.

That’s because aggressive anchor text strategies are often accompanied by other poor practices. The truth is that websites get penalized because they’re doing a lot of stuff wrong.

Google doesn’t have “Penguin” or “Panda” updates anymore.

However, these concepts still apply. You could theoretically have over-optimized anchor text and not get penalized if you’re doing everything else right.

But like I said:

This is rare.

Most websites with over-optimized anchors also have low-quality links, low-quality content, poor UX, and will often be too aggressive with on-page SEO.

You’ll need to run an SEO audit to tackle these issues.

But let’s just focus on over-optimized anchor text.

The first question is:

Should You Disavow?

I wanted to clear the air before I explain the strategies:

Disavowing is an absolute LAST resort.

There are two situations when it’s warranted:

  1. Your site is getting hit with negative SEO or it’s been hacked
  2. It’s impossible to remove the links you built

That said:

You can recover from ALL algorithmic penalties without ever needing to use this tool.

If you have a manual penalty, then it may be necessary (I’ll be addressing manual penalties after this section).

Here are 3 ways to fix over-optimized anchor text without disavowing:

3 Ways to Fix Over-Optimized Anchor Text

1. Remove Links With Commercial Anchor Text (From Spammy Sites)

Before you go buck-wild removing links, listen carefully:

If you remove any link from your profile, your site’s authority will decrease.

EVEN IF THE LINK SUCKS.

You must replace the low-quality links you removed with high-quality links.

Just removing links won’t recover your rankings.

In fact:

It may hurt even more because you’re decreasing your site’s authority.

As I discussed in the section about anchor text placement, you should only use commercial / keyword-rich anchor text on “power” link sources.

If you made the unfortunate mistake of building backlinks on low-quality sources with keyword-rich anchors, then you have two options:

  1. Go back and delete the links.
  2. If you can’t delete the links, then disavow.

After you’ve done all you can do to remove keyword-rich anchors from spammy sources, then it’s time to jump into anchor text dilution.

Please notice that I said “spammy” sources.

Don’t go on a link-deleting spree because you will end up deleting links that are actually helping you.

2. Dilute Your Anchor Text

This is the most common technique and it does work in many cases.

All you are going to do is build unoptimized backlinks to your website with nothing but branded, generic, and naked link anchors.

Absolutely no keyword-rich anchors!

Use the “foundational” links I explained in the previous section to dilute your anchor text profile.

Before Penguin 3.0, you could counter over-optimized anchors by using the dilution technique.

Diluting works, but you need to consider this:

Websites aren’t just penalized because of the existence of low-quality links.

They’re penalized because the ratio of low-quality to high-quality links in their profile is off.

In other words:

You need more quality links to offset the low-quality links. That’s why some websites can “get away” with low-quality links in their profile.

Here are some link types to help you dilute your anchor text profile:

1.) Strong, Relevant Backlinks

These are the most costly, but are also the best for improving your overall link profile. Get as many as you can. Relevancy is king with or without a penalty.

2.) Business Directories / Local Citations

Take the time to create business listings because it’s a perfect way to send quality, unoptimized anchors to your site. Just make sure your NAP-W information is consistent.

3.) LEGIT Social Profiles

Go out and build REAL social profiles for your website. Populate the profiles with your information, content, etc.

Only use the best sites: Twitter, Facebook, Google+, Tumblr, etc.

These sites will give you a nice mix of NoFollow and Follow unoptimized anchors and will build trust for your website.

4.) High-Quality Press Release Distribution

Create a quality press release and distribute it through a quality channel. Press releases are great for quickly getting unoptimized anchors from many different IPs. You will also build diversity in your link profile because of the NoFollow / Follow mix.

3. Use the 301 Penalty Recovery Trick

Remember in an earlier section when I said that anchor text travels through a 301 to the new website?

You are going to use this to your advantage to recover from an algorithmic penalty.

There are two variations of the 301 penalty recovery trick.

Variation #1: Expired Domain > Penalized Domain

For the first variation, you will need to find quality expired domains through a service like Freshdrop. Make sure the anchor profile is clean and has very few keyword-rich anchors.

Look for a domain with branded, generic, and naked link anchors or what some people may refer to as a “natural” anchor profile.

Although you are not necessarily using this expired domain for ranking purposes, it’s not a bad idea to find one with solid metrics.

Preferably DR 20+, DA 20+, and a Trust Flow of 10+.

If you can find a domain that is relevant to yours, it will work even better.

Then just 301 redirect the expired domain to your penalized site and track the results.

Variation #2: Links > Penalized Site > New Website

For variation two, you are going to start fresh with a new website, but you are going to piggyback off the authority of your penalized domain (hopefully it has some).

Step 1: Buy a new BRANDED website (avoid EMD, PMD)

Step 2: Build high-quality branded backlinks to your new website
Use business directories, quality paid directories, niche relevant blog comments, press releases, etc. Only use branded anchor text.

Step 3: Build unoptimized backlinks to your penalized domain
The goal is to decrease the percentage of keyword-rich anchors. The percentage all depends on the severity of your particular situation.

If you have 70% keyword-rich anchor text, then you will need to get that down to at least 30% or less. If you have 30% keyword-rich anchors, then you will want to get it down to 10% or less.

Step 4: Check your anchor profile with Ahrefs, Majestic, or Open Site Explorer.

Step 5: If you have cut your keyword-rich anchors in half, then it’s time to redirect your penalized site to the new domain.

This works because, A) you are using a new branded website with an established branded text anchor profile, B) you have improved the anchor profile of the penalized website, and C) you have transferred authority to a new domain.

Frequently Asked Questions About Anchor Text

What About Manual Penalties?

It’s less of a headache and much more cost-effective to just start a new website than to try to get out of a manual penalty.

Like I always tell my clients, getting a manual penalty is like going to prison for a felony.

Although you may get out of prison one day, you are still always going to have the felony on your record.

Do you really think Google wipes the slate clean for a website that was previously given a manual penalty?

Even if you do get the penalty lifted, ranking your site will never be easy and it’s always going to feel like “something is holding you back”.

Changing Anchor Text: Red Flag?

I’ve heard this question a lot and I’ve actually done this many times.

The answer is: sometimes.

I know it’s an annoying answer.

If you want to raise a red flag, then change a non-keyword-rich anchor to a keyword-rich anchor. Google may or may not devalue a link when this happens, but it’s definitely not worth it.

Just leave the link how it is, and go acquire a link somewhere else.

Situations that won’t throw up a red flag:

1. Changing a keyword-rich anchor to a non-optimized anchor – going back and decreasing your amount of commercial anchor text can often increase your rankings. If your exact match anchors or keyword-rich anchors are above 25%, then you may want to consider unoptimizing some of those.

2. Deleting an anchor and placing it within a different part of the article – if you decide to change an anchor, you should always place the new one in a different part of the article. When you do this, it makes the anchor / link “new” in Google’s eyes.

You will be losing an aged link, but in theory, starting with a fresh link.

IMPORTANT: You should only change anchor text under extreme circumstances.

Most over-optimized anchor text issues can be solved with the techniques I listed in the penalty recovery section.

Anchor Text Tracking: Don’t Shoot Blindly at the Target

Tracking your anchor text is absolutely critical if you are building backlinks.

If you aren’t, you are basically shooting at a target blindfolded.

There are two ways to track:

1. Manually input every anchor text into a Google Sheet or Excel file

2. Use a tool like Linkio to streamline your anchor text monitoring and optimization.

I used to track my anchor text manually, but Linkio streamlines the entire process.

They pull anchor text from Ahrefs, Moz, and Google Search Console, which makes life a lot easier.
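
If you prefer the manual route (option 1 above), even a simple log works. A Python sketch – the file name, columns, and anchor types are illustrative assumptions:

# Sketch: append each new backlink to a CSV log and recompute the running
# anchor-type distribution. File name and columns are illustrative.
import csv, os
from collections import Counter
from datetime import date

LOG = "anchor_text_log.csv"
FIELDS = ["date", "linking_page", "target_page", "anchor", "anchor_type"]

def log_backlink(linking_page, target_page, anchor, anchor_type):
    new_file = not os.path.exists(LOG)
    with open(LOG, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({"date": date.today().isoformat(),
                         "linking_page": linking_page, "target_page": target_page,
                         "anchor": anchor, "anchor_type": anchor_type})

def current_distribution():
    with open(LOG, newline="") as f:
        counts = Counter(row["anchor_type"] for row in csv.DictReader(f))
    total = sum(counts.values()) or 1
    return {t: round(c / total, 2) for t, c in counts.items()}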

Regardless:

You need to be tracking your anchor text and optimizing throughout the entire link building process.

Last Word About Anchor Text

Anchor text is one very small piece of the SEO puzzle. Most websites are littered with on-site SEO, SEO content, and backlink quality issues.

These areas need to be tackled first. Once you’ve optimized those well, then dive into anchor text optimization. It can give you the edge you need to dominate your competitors.

Now to you:

Do you feel like an anchor text pro now?

Let me know your thoughts and questions below.


Keyword Not Provided, But it Just Clicks

When SEO Was Easy

When I got started on the web over 15 years ago I created an overly broad & shallow website that had little chance of making money because it was utterly undifferentiated and crappy. In spite of my best (worst?) efforts while being a complete newbie, sometimes I would go to the mailbox and see a check for a couple hundred or a couple thousand dollars come in. My old roommate & I went to Coachella & when the trip was over I returned to a bunch of mail to catch up on & realized I had made way more while not working than what I spent on that trip.

What was the secret to a total newbie making decent income by accident?

Horrible spelling.

Back then search engines were not as sophisticated with their spelling correction features & I was one of 3 or 4 people in the search index that misspelled the name of an online casino the same way many searchers did.

The high minded excuse for why I did not scale that would be claiming I knew it was a temporary trick that was somehow beneath me. The more accurate reason would be thinking in part it was a lucky fluke rather than thinking in systems. If I were clever at the time I would have created the misspeller’s guide to online gambling, though I think I was just so excited to make anything from the web that I perhaps lacked the ambition & foresight to scale things back then.

In the decade that followed I had a number of other lucky breaks like that. One time one of the original internet bubble companies that managed to stay around put up a sitewide footer link targeting the concept that one of my sites made decent money from. This was just before the great recession, before Panda existed. The concept they targeted had 3 or 4 ways to describe it. 2 of them were very profitable & if they targeted either of the most profitable versions with that page the targeting would have sort of carried over to both. They would have outranked me if they targeted the correct version, but they didn’t so their mistargeting was a huge win for me.

Search Gets Complex

Search today is much more complex. In the years since those easy-n-cheesy wins, Google has rolled out many updates which aim to feature sought-after destination sites while diminishing the sites which rely on “one simple trick” to rank.

Arguably the quality of the search results has improved significantly as search has become more powerful, more feature rich & has layered in more relevancy signals.

Many quality small web publishers have gone away due to some combination of increased competition, algorithmic shifts & uncertainty, and reduced monetization as more ad spend was redirected toward Google & Facebook. But the impact as felt by any given publisher is not the impact as felt by the ecosystem as a whole. Many terrible websites have also gone away, while some formerly obscure though higher-quality sites rose to prominence.

There was the Vince update in 2009, which boosted the rankings of many branded websites.

Then in 2011 there was Panda as an extension of Vince, which tanked the rankings of many sites that published hundreds of thousands or millions of thin content pages while boosting the rankings of trusted branded destinations.

Then there was Penguin, which was a penalty that hit many websites which had heavily manipulated or otherwise aggressive appearing link profiles. Google felt there was a lot of noise in the link graph, which was their justification for the Penguin.

There were updates which lowered the rankings of many exact match domains. And then increased ad load in the search results along with the other above ranking shifts further lowered the ability to rank keyword-driven domain names. If your domain is generically descriptive then there is a limit to how differentiated & memorable you can make it if you are targeting the core market the keywords are aligned with.

There is a reason eBay is more popular than auction.com, Google is more popular than search.com, Yahoo is more popular than portal.com & Amazon is more popular than a store.com or a shop.com. When that winner-take-most impact of many online markets is coupled with the move away from using classic relevancy signals, the economics shift to where it makes a lot more sense to carry the heavy overhead of establishing a strong brand.

Branded and navigational search queries could be used in the relevancy algorithm stack to confirm the quality of a site & verify (or dispute) the veracity of other signals.

Historically relevant algo shortcuts become less appealing as they become less relevant to the current ecosystem & even less aligned with the future trends of the market. Add in negative incentives for pushing on a string (penalties on top of wasting the capital outlay) and a more holistic approach certainly makes sense.

Modeling Web Users & Modeling Language

PageRank was an attempt to model the random surfer.

When Google is pervasively monitoring most users across the web they can shift to directly measuring their behaviors instead of using indirect signals.

Years ago Bill Slawski wrote about the long click in which he opened by quoting Steven Levy’s In the Plex: How Google Thinks, Works, and Shapes our Lives

“On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the “Long Click” — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query.”

Of course, there’s a patent for that. In Modifying search result ranking based on implicit user feedback they state:

user reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking. The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.

If you are a known brand you are more likely to get clicked on than a random unknown entity in the same market.

And if you are something people are specifically seeking out, they are likely to stay on your website for an extended period of time.

One aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query. The first number can include a number of the longer views of the document result, the second number can include a total number of views of the document result, and the determining can include dividing the number of longer views by the total number of views.
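
Stripped of the legalese, the measure described above is simply long views divided by total views. A tiny Python sketch – the 30-second threshold is my own assumption, since the patent only distinguishes “longer” from “shorter” views:

# Sketch: the relevance measure boils down to long views / total views.
# The 30-second cutoff is an illustrative assumption, not from the patent.
def long_click_ratio(view_durations_seconds, threshold=30):
    total_views = len(view_durations_seconds)
    long_views = sum(1 for d in view_durations_seconds if d >= threshold)
    return long_views / total_views if total_views else 0.0

print(long_click_ratio([4, 95, 210, 12, 63]))  # 3 of 5 views were "long" -> 0.6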

Attempts to manipulate such data may not work.

safeguards against spammers (users who generate fraudulent clicks in an attempt to boost certain search results) can be taken to help ensure that the user selection data is meaningful, even when very little data is available for a given (rare) query. These safeguards can include employing a user model that describes how a user should behave over time, and if a user doesn’t conform to this model, their click data can be disregarded. The safeguards can be designed to accomplish two main objectives: (1) ensure democracy in the votes (e.g., one single vote per cookie and/or IP for a given query-URL pair), and (2) entirely remove the information coming from cookies or IP addresses that do not look natural in their browsing behavior (e.g., abnormal distribution of click positions, click durations, clicks_per_minute/hour/day, etc.). Suspicious clicks can be removed, and the click signals for queries that appear to be spammed need not be used (e.g., queries for which the clicks feature a distribution of user agents, cookie ages, etc. that do not look normal).

And just like Google can make a matrix of documents & queries, they could also choose to put more weight on search accounts associated with topical expert users based on their historical click patterns.

Moreover, the weighting can be adjusted based on the determined type of the user both in terms of how click duration is translated into good clicks versus not-so-good clicks, and in terms of how much weight to give to the good clicks from a particular user group versus another user group. Some user’s implicit feedback may be more valuable than other users due to the details of a user’s review process. For example, a user that almost always clicks on the highest ranked result can have his good clicks assigned lower weights than a user who more often clicks results lower in the ranking first (since the second user is likely more discriminating in his assessment of what constitutes a good result). In addition, a user can be classified based on his or her query stream. Users that issue many queries on (or related to) a given topic T (e.g., queries related to law) can be presumed to have a high degree of expertise with respect to the given topic T, and their click data can be weighted accordingly for other queries by them on (or related to) the given topic T.

Google was using click data to drive their search rankings as far back as 2009. David Naylor was perhaps the first person who publicly spotted this. Google was ranking Australian websites for [tennis court hire] in the UK & Ireland, in part because that is where most of the click signal came from. That phrase was most widely searched for in Australia. In the years since Google has done a better job of geographically isolating clicks to prevent things like the problem David Naylor noticed, where almost all search results in one geographic region came from a different country.

Whenever SEOs mention using click data to search engineers, the search engineers quickly respond about how they might consider any signal but clicks would be a noisy signal. But if a signal has noise an engineer would work around the noise by finding ways to filter the noise out or combine multiple signals. To this day Google states they are still working to filter noise from the link graph: “We continued to protect the value of authoritative and relevant links as an important ranking signal for Search.”

The site with millions of inbound links, few intentional visits & those who do visit quickly click the back button (due to a heavy ad load, poor user experience, low quality content, shallow content, outdated content, or some other bait-n-switch approach)…that’s an outlier. Preventing those sorts of sites from ranking well would be another way of protecting the value of authoritative & relevant links.

Best Practices Vary Across Time & By Market + Category

Along the way, concurrent with the above sorts of updates, Google also improved their spelling auto-correct features, auto-completed search queries for many years through a feature called Google Instant (though they later undid forced query auto-completion while retaining automated search suggestions), and then they rolled out a few other algorithms that further allowed them to model language & user behavior.

Today it would be much harder to get paid above median wages explicitly for sucking at basic spelling or scaling some other individual shortcut to the moon, like pouring millions of low quality articles into a (formerly!) trusted domain.

Nearly a decade after Panda, eHow’s rankings still haven’t recovered.

Back when I got started with SEO the phrase Indian SEO company was associated with cut-rate work where people were buying exclusively based on price. Sort of like a “I got a $500 budget for link building, but can not under any circumstance invest more than $5 in any individual link.” Part of how my wife met me was she hired a hack SEO from San Diego who outsourced all the work to India and marked the price up about 100-fold while claiming it was all done in the United States. He created reciprocal links pages that got her site penalized & it didn’t rank until after she took her reciprocal links page down.

With that sort of behavior widespread (hack US firm teaching people working in an emerging market poor practices), it likely meant many SEO “best practices” which were learned in an emerging market (particularly where the web was also underdeveloped) would be more inclined to being spammy. Considering how far ahead many Western markets were on the early Internet & how India has so many languages & how most web usage in India is based on mobile devices where it is hard for users to create links, it only makes sense that Google would want to place more weight on end user data in such a market.

If you set your computer location to India Bing’s search box lists 9 different languages to choose from.

The above is not to state anything derogatory about any emerging market, but rather that various signals are stronger in some markets than others. And competition is stronger in some markets than others.

Search engines can only rank what exists.

“In a lot of Eastern European – but not just Eastern European markets – I think it is an issue for the majority of the [bream? muffled] countries, for the Arabic-speaking world, there just isn’t enough content as compared to the percentage of the Internet population that those regions represent. I don’t have up to date data, I know that a couple years ago we looked at Arabic for example and then the disparity was enormous. so if I’m not mistaken the Arabic speaking population of the world is maybe 5 to 6%, maybe more, correct me if I am wrong. But very definitely the amount of Arabic content in our index is several orders below that. So that means we do not have enough Arabic content to give to our Arabic users even if we wanted to. And you can exploit that amazingly easily and if you create a bit of content in Arabic, whatever it looks like we’re gonna go you know we don’t have anything else to serve this and it ends up being horrible. and people will say you know this works. I keyword stuffed the hell out of this page, bought some links, and there it is number one. There is nothing else to show, so yeah you’re number one. the moment somebody actually goes out and creates high quality content that’s there for the long haul, you’ll be out and that there will be one.” – Andrey Lipattsev – Search Quality Senior Strategist at Google Ireland, on Mar 23, 2016

Impacting the Economics of Publishing

Now search engines can certainly influence the economics of various types of media. At one point some otherwise credible media outlets were pitching the Demand Media IPO narrative that Demand Media was the publisher of the future & what other media outlets would come to look like. Years later, after heavily squeezing the partner network & promoting programmatic advertising that reduces CPMs by the day, Google is funding partnerships with multiple news publishers like McClatchy & Gatehouse to try to revive the news dead zones even Facebook is struggling with.

“Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn’t enough local news in vast swaths of the country. … more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all.”

As mainstream newspapers continue laying off journalists, Facebook’s news efforts are likely to continue failing unless they include direct economic incentives, as Google’s programmatic ad push broke the banner ad:

“Thanks to the convoluted machinery of Internet advertising, the advertising world went from being about content publishers and advertising context—The Times unilaterally declaring, via its ‘rate card’, that ads in the Times Style section cost $30 per thousand impressions—to the users themselves and the data that targets them—Zappo’s saying it wants to show this specific shoe ad to this specific user (or type of user), regardless of publisher context. Flipping the script from a historically publisher-controlled mediascape to an advertiser (and advertiser intermediary) controlled one was really Google’s doing. Facebook merely rode the now-cresting wave, borrowing outside media’s content via its own users’ sharing, while undermining media’s ability to monetize via Facebook’s own user-data-centric advertising machinery. Conventional media lost both distribution and monetization at once, a mortal blow.”

Google is offering news publishers audience development & business development tools.

Heavy Investment in Emerging Markets Quickly Evolves the Markets

As the web grows rapidly in India, they’ll have a thousand flowers bloom. In 5 years the competition in India & other emerging markets will be much tougher as those markets continue to grow rapidly. Media is much cheaper to produce in India than it is in the United States. Labor costs are lower & they never had the economic albatross that is the ACA adversely impact their economy. At some point the level of investment & increased competition will mean early techniques stop having as much efficacy. Chinese companies are aggressively investing in India.

“If you break India into a pyramid, the top 100 million (urban) consumers who think and behave more like Americans are well-served,” says Amit Jangir, who leads India investments at 01VC, a Chinese venture capital firm based in Shanghai. The early stage venture firm has invested in the India-based micro-lending firms FlashCash and SmartCoin. The new target is the next 200 million to 600 million consumers, who do not have a go-to entertainment, payment or ecommerce platform yet. There is going to be a unicorn in each of these verticals, says Jangir, adding that it will not be as easy for a player to win this market considering the diversity and low ticket sizes.

RankBrain

RankBrain appears to be based on using user clickpaths on head keywords to help bleed rankings across into related searches which are searched less frequently. A Googler didn’t state this specifically, but it is how they would be able to use models of searcher behavior to refine search results for keywords which are rarely searched for.

In a recent interview in Scientific American a Google engineer stated: “By design, search engines have learned to associate short queries with the targets of those searches by tracking pages that are visited as a result of the query, making the results returned both faster and more accurate than they otherwise would have been.”
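To make that idea concrete, here is a toy Python sketch of how click-derived scores on frequently searched head terms could be spread to nearby long-tail queries. It is only an illustration of the general mechanism described above, not Google's implementation; the query vectors and click scores are invented.

```python
import numpy as np

# Toy illustration: click-derived quality scores observed on head queries are
# spread to rare queries that sit nearby in an embedding space. The vectors
# and scores are made up; Google's actual signals and models are not public.

head_queries = {
    # query: (embedding vector, click-derived score for a given page)
    "cheap flights": (np.array([0.9, 0.1, 0.0]), 0.82),
    "flight deals":  (np.array([0.8, 0.2, 0.1]), 0.78),
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def inferred_score(rare_query_vec, head):
    """Weight each head query's click score by its similarity to the rare query."""
    sims = [(cosine(rare_query_vec, vec), score) for vec, score in head.values()]
    total = sum(s for s, _ in sims)
    return sum(s * score for s, score in sims) / total if total else 0.0

# A long-tail query with almost no click history of its own.
rare_vec = np.array([0.85, 0.15, 0.05])
print(round(inferred_score(rare_vec, head_queries), 3))
```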

Now a person might go out and try to search for something a bunch of times or pay other people to search for a topic and click a specific listing, but some of the related Google patents on using click data (which keep getting updated) mentioned how they can discount or turn off the signal if there is an unnatural spike of traffic on a specific keyword, or if there is an unnatural spike of traffic heading to a particular website or web page.
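The patents don't publish thresholds, but the basic discounting idea can be sketched in a few lines: compare today's click volume on a keyword against its recent baseline and ignore implausible spikes. The cutoff and numbers below are hypothetical.

```python
from statistics import mean, stdev

# Minimal sketch of spike discounting: if clicks on a keyword jump far above
# their recent baseline, fall back to the baseline instead of trusting the spike.

def discounted_clicks(daily_clicks, z_threshold=3.0):
    baseline, today = daily_clicks[:-1], daily_clicks[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    z = (today - mu) / sigma if sigma else 0.0
    return mu if z > z_threshold else today  # discount implausible spikes

print(discounted_clicks([120, 131, 118, 125, 122, 4900]))  # the 4,900 spike is discounted
```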

And, since Google is tracking the behavior of end users on their own website, anomalous behavior is easier to detect than it would be across the broader web, where signals are more indirect. Google can take advantage of the wide distribution of Chrome & Android, where users are regularly logged into Google & pervasively tracked, to place more weight on users with credit card data on file, a long account history with regular, normal search behavior, heavy Gmail usage, etc.

Plus there is a huge gap between the cost of traffic & the ability to monetize it. You might have to pay someone a dime or a quarter to search for something, & there is no guarantee it will work on a sustainable basis even if you paid hundreds or thousands of people to do it. Those experimental searches will have no lasting value unless they influence rank, and even if they do influence rankings it might only last temporarily. If you bought a bunch of traffic into something genuine Google searchers didn't like, then even if it started to rank better temporarily, the rankings would quickly fall back as real searchers chose other sites which already rank.

This is part of the reason why so many SEO blogs mention brand, brand, brand. If people are specifically looking for you in volume & Google can see that thousands or millions of people specifically want to access your site then that can impact how you rank elsewhere.

Even how long a user looks at something inside the search results (dwell time), or how quickly they skip past it and scroll deeper, can be a ranking signal. Some Google patents mention how they can use mouse pointer location on desktop or scroll data from the viewport on mobile devices as a quality signal.
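As a purely hypothetical illustration, dwell time and scroll depth could be folded into a single bounded engagement score along these lines; the weights and saturation curve are invented, since the patents only say such signals can be used.

```python
import math

# Hypothetical engagement score combining dwell time and scroll depth.
def engagement_score(dwell_seconds, scroll_depth_fraction):
    # Saturate dwell time so 30s vs 300s matters less than 2s vs 30s.
    dwell_component = 1 - math.exp(-dwell_seconds / 30.0)
    return 0.7 * dwell_component + 0.3 * min(max(scroll_depth_fraction, 0.0), 1.0)

print(round(engagement_score(5, 0.1), 2))    # quick skip, shallow scroll
print(round(engagement_score(90, 0.8), 2))   # long read, deep scroll
```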

Neural Matching

Last year Danny Sullivan mentioned how Google rolled out neural matching to better understand the intent behind a search query.

Sullivan's Tweets capture what the neural matching technology intends to do. Google also stated:

we’ve now reached the point where neural networks can help us take a major leap forward from understanding words to understanding concepts. Neural embeddings, an approach developed in the field of neural networks, allow us to transform words to fuzzier representations of the underlying concepts, and then match the concepts in the query with the concepts in the document. We call this technique neural matching.

To help people understand the difference between neural matching & RankBrain, Google told SEL: “RankBrain helps Google better relate pages to concepts. Neural matching helps Google better relate words to searches.”
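A minimal sketch of what matching concepts rather than words can look like: map both the query and the document into a shared embedding space and compare them there, so a match can happen even with zero overlapping words. The tiny hand-made vectors below stand in for real learned embeddings.

```python
import numpy as np

# Hand-made 2-dimensional "embeddings" standing in for learned word vectors.
embeddings = {
    "repair": np.array([0.9, 0.1]), "fix":  np.array([0.88, 0.12]),
    "flat":   np.array([0.2, 0.8]), "tire": np.array([0.25, 0.75]),
}

def concept_vector(words):
    # Average the word vectors into one fuzzy "concept" representation.
    return np.mean([embeddings[w] for w in words if w in embeddings], axis=0)

def concept_match(query_words, doc_words):
    q, d = concept_vector(query_words), concept_vector(doc_words)
    return float(np.dot(q, d) / (np.linalg.norm(q) * np.linalg.norm(d)))

# No shared words, but the underlying concepts line up.
print(round(concept_match(["fix", "flat"], ["repair", "tire"]), 3))
```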

There are a couple research papers on neural matching.

The first one was titled A Deep Relevance Matching Model for Ad-hoc Retrieval. It mentioned using Word2vec; here are a few quotes from the research paper:

  • “Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements.”
  • “the interaction-focused model, which first builds local level interactions (i.e., local matching signals) between two pieces of text, and then uses deep neural networks to learn hierarchical interaction patterns for matching.”
  • “according to the diverse matching requirement, relevance matching is not position related since it could happen in any position in a long document.”
  • “Most NLP tasks concern semantic matching, i.e., identifying the semantic meaning and inferring the semantic relations between two pieces of text, while the ad-hoc retrieval task is mainly about relevance matching, i.e., identifying whether a document is relevant to a given query.”
  • “Since the ad-hoc retrieval task is fundamentally a ranking problem, we employ a pairwise ranking loss such as hinge loss to train our deep relevance matching model.”

The paper mentions how semantic matching falls down when compared against relevancy matching because:

  • semantic matching relies on similarity matching signals (some words or phrases with the same meaning might be semantically distant), compositional meanings (matching sentences more than meaning) & a global matching requirement (comparing things in their entirety instead of looking at the best matching part of a longer document); whereas,
  • relevance matching can put significant weight on exact matching signals (weighting an exact match higher than a near match), adjust the weighting on query term importance (one word or phrase in a search query might have a far higher discrimination value & deserve far more weight than the next) & leverage diverse matching requirements (allowing the relevancy match to happen in any part of a longer document). A simplified sketch of those ideas follows this list.
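Here is that simplified sketch (not the DRMM model itself): exact matches get extra weight, each query term is gated by an IDF-style importance value, and the best-matching part of the document is allowed to carry the score. The embeddings and IDF values are made up for illustration.

```python
import numpy as np

embeddings = {
    "hotel": np.array([0.9, 0.2]), "inn":    np.array([0.85, 0.3]),
    "cheap": np.array([0.1, 0.9]), "budget": np.array([0.15, 0.85]),
}
idf = {"hotel": 1.2, "cheap": 2.5}   # "cheap" is the more discriminative term

def sim(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def relevance_score(query_terms, doc_terms, exact_bonus=0.5):
    score = 0.0
    for q in query_terms:
        # Diverse matching: take the best local match anywhere in the document.
        best = max(sim(embeddings[q], embeddings[d]) for d in doc_terms)
        if q in doc_terms:                  # exact matching signal
            best += exact_bonus
        score += idf.get(q, 1.0) * best     # query term importance
    return score

print(round(relevance_score(["cheap", "hotel"], ["budget", "inn"]), 2))   # near matches
print(round(relevance_score(["cheap", "hotel"], ["cheap", "hotel"]), 2))  # exact matches score higher
```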

(The research paper includes a couple of illustrative figures.)

And then the second research paper is

Deep Relevance Ranking Using Enhanced Document-Query Interactions
“interaction-based models are less efficient, since one cannot index a document representation independently of the query. This is less important, though, when relevancy ranking methods rerank the top documents returned by a conventional IR engine, which is the scenario we consider here.”

That same sort of re-ranking concept is being better understood across the industry. There are ranking signals that earn some base level ranking, and then results get re-ranked based on other factors like how well a result matches the user intent.
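A rough sketch of that two-stage setup, assuming a cheap lexical first pass and a hypothetical intent model for the re-rank step; both scoring functions are stand-ins rather than anything a real engine exposes.

```python
# Stage 1: cheap scoring over the whole index. Stage 2: expensive re-ranking
# over only the short candidate list returned by stage 1.

def first_pass_retrieve(query, index, k=100):
    scored = [(sum(t in doc["text"] for t in query.split()), doc) for doc in index]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]

def rerank(query, candidates, intent_score):
    return sorted(candidates, key=lambda doc: intent_score(query, doc), reverse=True)

index = [
    {"url": "a.example", "text": "cheap hotel rooms"},
    {"url": "b.example", "text": "hotel history and architecture"},
]
# Hypothetical intent model: favors pages users tend to engage with for this query.
results = rerank("cheap hotel", first_pass_retrieve("cheap hotel", index),
                 intent_score=lambda q, d: 1.0 if "cheap" in d["text"] else 0.2)
print([d["url"] for d in results])
```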

(The second research paper also includes a couple of illustrative figures.)

For those who hate the idea of reading research papers or patent applications, Martinibuster also wrote about the technology here. About the only part of his post I would debate is this one:

“Does this mean publishers should use more synonyms? Adding synonyms has always seemed to me to be a variation of keyword spamming. I have always considered it a naive suggestion. The purpose of Google understanding synonyms is simply to understand the context and meaning of a page. Communicating clearly and consistently is, in my opinion, more important than spamming a page with keywords and synonyms.”

I think one should always consider user experience over other factors; however, a person could still use variations throughout the copy & pick up a bit more traffic without coming across as spammy. Danny Sullivan mentioned the super synonym concept was impacting 30% of search queries, so there are still a lot which may only be available to those who use a specific phrase on their page.

Martinibuster also wrote another blog post tying more research papers & patents to the above. You could probably spend a month reading all the related patents & research papers.

The above sort of language modeling & end user click feedback complement links-based ranking signals in a way that makes it much harder to luck one's way into any form of success by being a terrible speller or just bombing away at link manipulation without much concern toward any other aspect of the user experience or the market you operate in.

Pre-penalized Shortcuts

Google was even issued a patent for predicting site quality based upon the N-grams used on the site & comparing those against the N-grams used on other established sites where quality has already been scored via other methods: “The phrase model can be used to predict a site quality score for a new site; in particular, this can be done in the absence of other information. The goal is to predict a score that is comparable to the baseline site quality scores of the previously-scored sites.”
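As a rough illustration of the patent's idea (not its actual method), you could build an N-gram profile for a new site and predict its quality from already-rated sites with similar profiles; the sample text and quality scores below are invented.

```python
from collections import Counter

def ngram_profile(text, n=2):
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def similarity(p, q):
    # Overlap of n-gram counts, normalized by the smaller profile.
    shared = sum(min(p[g], q[g]) for g in p if g in q)
    return shared / max(1, min(sum(p.values()), sum(q.values())))

def predict_quality(new_text, scored_sites):
    profile = ngram_profile(new_text)
    weights = [(similarity(profile, ngram_profile(t)), s) for t, s in scored_sites]
    total = sum(w for w, _ in weights)
    return sum(w * s for w, s in weights) / total if total else None

scored_sites = [
    ("click here to buy cheap pills online today", 0.1),   # low-quality phrasing
    ("our lab tested each product over six months", 0.9),  # higher-quality phrasing
]
print(predict_quality("buy cheap pills online click here now", scored_sites))
```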

Have you considered using a PLR (private label rights) package to generate the shell of your site's content? Good luck with that, as sites trying that shortcut might be pre-penalized from birth.

Navigating the Maze

When I started in SEO one of my friends had a dad who is vastly smarter than I am. He advised me that Google engineers were smarter, had more capital, had more exposure, had more data, etc etc etc … and thus SEO was ultimately going to be a malinvestment.

Back then he was at least partially wrong because influencing search was so easy.

But in the current market, 16 years later, we are near the inflection point where he would finally be right.

At some point the shortcuts stop working & it makes sense to try a different approach.

The flip side of all the above changes is that as the algorithms have become more complex they have gone from being a headwind to people ignorant about SEO to being a tailwind to those who do not focus excessively on SEO in isolation.

If one is a dominant voice in a particular market, if they break industry news, if they have key exclusives, if they spot & name the industry trends, if their site becomes a must read & is what amounts to a habit … then they perhaps become viewed as an entity. Entity-related signals then help them, & the same signals that work against people who merely lucked into a bit of success become a tailwind rather than a headwind.

If your work defines your industry, then any efforts to model entities, user behavior or the language of your industry are going to boost your work on a relative basis.

This requires sites to publish frequently enough to be a habit, or publish highly differentiated content which is strong enough that it is worth the wait.

Those which publish frequently without being particularly differentiated are almost guaranteed to eventually walk into a penalty of some sort. And each additional person who reads marginal, undifferentiated content (particularly if it has an ad-heavy layout) brings that site one visitor closer to eventually getting whacked. Success becomes self-regulating. Any short-term success becomes self-defeating if one has a highly opportunistic short-term focus.

Those who write content that only they could write are more likely to have sustained success.
