
AMP'd Up for Recaptcha

Beyond search, Google controls the leading distributed ad network, the leading mobile OS, the leading web browser, the leading email client, the leading web analytics platform, the leading mapping platform, and the leading free video hosting site.

They win a lot.

And they take winnings from one market & leverage them into manipulating adjacent markets.

Embrace. Extend. Extinguish.

AMP is an utterly unnecessary invention designed to further shift power to Google while disenfranchising publishers. From the very start it had many issues with basic things like supporting JavaScript, double counting unique users (no reason to fix broken stats if they drive adoption!), not supporting third party ad networks, not showing publisher domain names, and just generally being a useless layer of sunk cost technical overhead that provides literally no real value.

Over time they have corrected some of these catastrophic deficiencies, but if it provided real value, they wouldn’t have needed to force adoption with preferential placement in their search results. They force the bundling because AMP sucks.

Absurdity knows no bounds. Googlers suggest: “AMP isn’t another ‘channel’ or ‘format’ that’s somehow not the web. It’s not a SEO thing. It’s not a replacement for HTML. It’s a web component framework that can power your whole site. … We, the AMP team, want AMP to become a natural choice for modern web development of content websites, and for you to choose AMP as framework because it genuinely makes you more productive.”

Meanwhile some newspapers have about a dozen employees who work on re-formatting content for AMP:

The AMP development team now keeps track of whether AMP traffic drops suddenly, which might indicate pages are invalid, and it can react quickly.

All this adds expense, though. There are setup, development and maintenance costs associated with AMP, mostly in the form of time. After implementing AMP, the Guardian realized the project needed dedicated staff, so it created an 11-person team that works on AMP and other aspects of the site, drawing mostly from existing staff.

Feeeeeel the productivity!

Some content types (particularly user generated content) can be unpredictable & circuitous. For many years forum websites would use keywords embedded in the search referral to highlight relevant parts of the page. Keyword (not provided) largely destroyed that & then it became a competitive feature for AMP: “If the Featured Snippet links to an AMP article, Google will sometimes automatically scroll users to that section and highlight the answer in orange.”

That would perhaps be a single area where AMP was more efficient than the alternative. But it is only so because Google destroyed the alternative by stripping keyword referrers from search queries.
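For context, the alternative Google destroyed was trivial to implement. A minimal sketch of the old referrer-keyword highlighting trick (assuming the pre-(not provided) referrer format, where the query rode along in a q= parameter):

```python
import re
from urllib.parse import urlparse, parse_qs

def highlight_search_terms(html: str, referrer: str) -> str:
    """Wrap each term from the referrer's q= parameter in <mark> tags."""
    query = parse_qs(urlparse(referrer).query).get("q", [""])[0]
    for term in query.split():
        # Crude case-insensitive whole-word match, much like the old forum mods.
        html = re.sub(rf"\b({re.escape(term)})\b", r"<mark>\1</mark>",
                      html, flags=re.IGNORECASE)
    return html

print(highlight_search_terms(
    "<p>Tips on keyword research and link building.</p>",
    "http://www.google.com/search?q=keyword+research",
))
# -> <p>Tips on <mark>keyword</mark> <mark>research</mark> and link building.</p>
```

A dozen lines, no AMP required; all it needed was a referrer Google no longer sends.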

The power dynamics of AMP are ugly:

“I see them as part of the effort to normalise the use of the AMP Carousel, which is an anti-competitive land-grab for the web by an organisation that seems to have an insatiable appetite for consuming the web, probably ultimately to its own detriment. … This enables Google to continue to exist after the destination site (eg the New York Times) has been navigated to. Essentially it flips the parent-child relationship to be the other way around. … As soon as a publisher blesses a piece of content by packaging it (they have to opt in to this, but see coercion below), they totally lose control of its distribution. … I’m not that smart, so it’s surely possible to figure out other ways of making a preload possible without cutting off the content creator from the people consuming their content. … The web is open and decentralised. We spend a lot of time valuing the first of these concepts, but almost none trying to defend the second. Google knows, perhaps better than anyone, how being in control of the user is the most monetisable position, and having the deepest pockets and the most powerful platform to do so, they have very successfully inserted themselves into my relationship with millions of other websites. … In AMP, the support for paywalls is based on a recommendation that the premium content be included in the source of the page regardless of the user’s authorisation state. … These policies demonstrate contempt for others’ right to freely operate their businesses.”

After enough publishers adopted AMP, Google was able to turn their mobile app’s homepage into an interactive news feed below the search box. And inside that news feed Google gets to distribute MOAR ads while 0% of the revenue from those ads finds its way to the publishers whose content is used to make up the feed.

Appropriate appropriation. 😀

Thank you for your content!!!

The mainstream media is waking up to AMP being a trap, but its neck is already in it:

European and American tech, media and publishing companies, including some that originally embraced AMP, are complaining that the Google-backed technology, which loads article pages in the blink of an eye on smartphones, is cementing the search giant’s dominance on the mobile web.

Each additional layer of technical cruft is another cost center. Things that sound appealing at first blush may not be:

The way you verify your identity to Let’s Encrypt is the same as with other certificate authorities: you don’t really. You place a file somewhere on your website, and they access that file over plain HTTP to verify that you own the website. The one attack that signed certificates are meant to prevent is a man-in-the-middle attack. But if someone is able to perform a man-in-the-middle attack against your website, then he can intercept the certificate verification, too. In other words, Let’s Encrypt certificates don’t stop the one thing they’re supposed to stop. And, as always with the certificate authorities, a thousand murderous theocracies, advertising companies, and international spy organizations are allowed to impersonate you by design.
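To make the quoted critique concrete, here is roughly what that verification amounts to. A toy sketch of an HTTP-01 style challenge responder (the token and key authorization are hypothetical placeholders; real ACME clients such as Certbot automate all of this):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical values: in the real protocol the CA issues the token and the
# client derives the key authorization from it plus its account key.
TOKEN = "example-token"
KEY_AUTHORIZATION = b"example-token.account-key-thumbprint"

class ChallengeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The CA fetches this well-known path over plain, unauthenticated HTTP.
        if self.path == f"/.well-known/acme-challenge/{TOKEN}":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(KEY_AUTHORIZATION)
        else:
            self.send_error(404)

# Anyone who can intercept port 80 traffic to your domain could answer this
# request themselves -- which is exactly the man-in-the-middle point above.
# (Binding port 80 requires elevated privileges.)
HTTPServer(("", 80), ChallengeHandler).serve_forever()
```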

Anything that is easy to implement & widely marketed often has costs added to it in the future as the entity moves to monetize the service.

This is a private equity firm buying up multiple hosting control panels & then adjusting prices.

This is Google Maps drastically changing their API terms.

This is Facebook charging you for likes to build an audience, giving your competitors access to those likes as an addressable audience to advertise against, and then charging you once more to boost the reach of your posts.

This is Grubhub creating shadow websites on your behalf and charging you for every transaction created by the gravity of your brand.

Shivane believes GrubHub purchased her restaurant’s web domain to prevent her from building her own online presence. She also believes the company may have had a special interest in owning her name because she processes a high volume of orders. … it appears GrubHub has set up several generic, templated pages that look like real restaurant websites but in fact link only to GrubHub. These pages also display phone numbers that GrubHub controls. The calls are forwarded to the restaurant, but the platform records each one and charges the restaurant a commission fee for every order

Settling for the easiest option drives a lack of differentiation, embeds additional risk & once the dominant player has enough marketshare they’ll change the terms on you.

Small gains in short term margins for massive increases in fragility.

“Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell

The other big issue is platforms that run out of growth space in their core market may break integrations with adjacent service providers as each want to grow by eating the other’s market.

Those who look at SaaS business models through the eyes of a seasoned investor will better understand how markets are likely to change:

“I’d argue that many of today’s anointed tech “disruptors” are doing little in the way of true disruption. … When investors used to get excited about a SAAS company, they typically would be describing a hosted multi-tenant subscription-billed piece of software that was replacing a ‘legacy’ on-premise perpetual license solution in the same target market (i.e. ERP, HCM, CRM, etc.). Today, the terms SAAS and Cloud essentially describe the business models of every single public software company.

Most platform companies are initially required to operate at low margins in order to buy growth of their category & own their category. Then when they are valued on that, they quickly need to jump across to adjacent markets to grow into the valuation:

Twilio has no choice but to climb up the application stack. This is a company whose ‘disruption’ is essentially great API documentation and gangbuster SEO spend built on top of a highly commoditized telephony aggregation API. They have won by marketing to DevOps engineers. With all the hype around them, you’d think Twilio invented the telephony API, when in reality what they did was turn it into a product company. Nobody had thought of doing this let alone that this could turn into a $17 billion company because simply put the economics don’t work. And to be clear they still don’t. But Twilio’s genius CEO clearly gets this. If the market is going to value robocalls, emergency sms notifications, on-call pages, and carrier fee passed through related revenue growth in the same way it does ‘subscription’ revenue from Atlassian or ServiceNow, then take advantage of it while it lasts.

Large platforms offering temporary subsidies to ensure they dominate their categories & companies like SoftBank spraying capital across the markets is causing massive shifts in valuations:

I also think if you look closely at what is celebrated today as innovation you often find models built on hidden subsidies. … I’d argue the very distributed nature of microservices architecture and API-first product companies means addressable market sizes and unit economics assumptions should be even more carefully scrutinized. … How hard would it be to create an Alibaba today if someone like SoftBank was raining money into such a greenfield space? Excess capital would lead to destruction and likely subpar returns. If capital was the solution, the 1.5 trillion that went into telcos in late ’90s wouldn’t have led to a massive bust. Would a Netflix be what it is today if a SoftBank was pouring billions into streaming content startups right as the experiment was starting? Obviously not. Scarcity of capital is another often underappreciated part of the disruption equation. Knowing resources are finite leads to more robust models. … This convergence is starting to manifest itself in performance. Disney is up 30% over the last 12 months while Netflix is basically flat. This may not feel like a bubble sign to most investors, but from my standpoint, it’s a clear evidence of the fact that we are approaching a something has got to give moment for the way certain businesses are valued.”

Circling back to Google’s AMP, it has a cousin called Recaptcha.

Recaptcha is another AMP-like trojan horse:

According to tech statistics website Built With, more than 650,000 websites are already using reCaptcha v3; overall, there are at least 4.5 million websites using reCaptcha, including 25% of the top 10,000 sites. Google is also now testing an enterprise version of reCaptcha v3, where Google creates a customized reCaptcha for enterprises that are looking for more granular data about users’ risk levels to protect their site algorithms from malicious users and bots. … According to two security researchers who’ve studied reCaptcha, one of the ways that Google determines whether you’re a malicious user or not is whether you already have a Google cookie installed on your browser. … To make this risk-score system work accurately, website administrators are supposed to embed reCaptcha v3 code on all of the pages of their website, not just on forms or log-in pages.
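For anyone who has not wired this up: the page-side script hands your server a token, and your server asks Google for a verdict on the user. A minimal sketch of the server half (the siteverify endpoint and response fields are Google's documented v3 API; the secret key here is a placeholder, and 0.5 is Google's suggested default threshold):

```python
import requests

RECAPTCHA_SECRET = "your-secret-key"  # placeholder

def recaptcha_score(token: str) -> float:
    """Ask Google to score a reCaptcha v3 token: 1.0 ~ human, 0.0 ~ bot."""
    resp = requests.post(
        "https://www.google.com/recaptcha/api/siteverify",
        data={"secret": RECAPTCHA_SECRET, "response": token},
        timeout=5,
    ).json()
    return resp.get("score", 0.0) if resp.get("success") else 0.0

# Every call like this also tells Google who is doing what on your site --
# the data flow the researchers quoted above are describing.
if recaptcha_score("token-from-client") < 0.5:
    print("Treat as suspicious: step-up verification or block.")
```

Note what the integration implies: to get accurate scores you are asked to put Google's JavaScript on every page, and your server consults Google on every sensitive action.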

About a month ago when logging into Bing Ads I saw recaptcha on the login page & couldn’t believe they’d give Google control at that access point. I think they got rid of that, but lots of companies are perhaps shooting themselves in the foot through a combination of over-reliance on Google infrastructure AND sloppy implementation.

Today when making a purchase on Fiverr, after converting, I got some of this action.

Hmm. Maybe I will enable JavaScript and try again.

Oooops.

That is called snatching defeat from the jaws of victory.

My account is many years old. My payment type on record has been used for years. I have ordered from the particular seller about a dozen times over the years. And suddenly because my web browser had JavaScript turned off I was deemed a security risk of some sort for making an utterly ordinary transaction I have already completed about a dozen times.

On AMP, JavaScript was the devil. And on desktop, the absence of JavaScript was the devil.

Pro tip: Ecommerce websites that see substandard conversion rates from using Recaptcha can boost their overall ecommerce revenue by buying more Google AdWords ads.

As more of the infrastructure stack is driven by AI software there is going to be a very real opportunity for many people to become deplatformed across the web on an utterly arbitrary basis. That tech companies like Facebook also want to create digital currencies on top of the leverage they already have only makes the proposition that much scarier.

If the tech platforms host copies of our sites, process the transactions & even create their own currencies, how will we know what level of value they are adding versus what they are extracting?

Who measures the measurer?

And when the economics turn negative, what will we do if we are hooked into an ecosystem we can’t spend additional capital to get out of when things head south?


More Articles

Why Isn’t My Website Ranking?

Reaching the top spots of the search engine results pages can improve your company’s exposure, establish you as an industry authority, and deliver the kind of ROI you want to see.

So why isn’t your website ranking at the top yet?

That’s the million-dollar question that every company has asked itself at one point or another. Unfortunately, there isn’t a million-dollar answer.

Or, at least, there isn’t one single million-dollar answer.

Your website might be struggling to reach the top for any number of reasons. So, if it feels like you’ve been at it for a long time without getting the best results, consider these possibilities:

You Haven’t Given It Enough Time

According to this Google Webmaster video, you need to be patient when it comes to SEO.

There’s no way around it. SEO is not an overnight process.

Things need to be done, and they need to be done in order. (It will do you no good to build a bazillion links if they all go to a website that isn’t able to convert the traffic.)

It takes time to research, create, and implement a strategy and begin producing content.

Then it takes more time for Google to realize changes have been made, and then you have to wait for the search engine to determine if you are really providing new value.

We’ve linked the above video before, but we like to back up our claims like this whenever possible.

In it, she states that, in general, it takes four months to a year to first implement improvements and then for you to start seeing results.

In other words, time is something that you can’t avoid. It’s hard, but SEO requires patience.


Your Keyword May Not Mean What You Think It Should

This is our experience with the term “SEO.”

That keyword is obviously important for us (which we discussed in our blog about increasing traffic more than 200%). At one point, though, our homepage couldn’t be found for that term at all. We didn’t do anything to the page to make it drop out of the rankings; it was just gone one day.

So, we really started looking at the environment of the search results page for that term.

We had been ranking on the second page for a long time, only ever able to crack #10 on occasion. And then it disappeared.

Our blog page, however, didn’t.

Turns out, when you really look at that first page, it’s easy to see that Google does not believe people searching for the term “SEO” are looking for a company to do the SEO for them.

Instead, Google is trying to provide as much information as possible about what SEO is and how it is done.

That’s why (not including paid results) nearly every result on the first page is a guide to SEO, a discourse on what SEO is, or a discussion of whether you need it.

Google itself is currently hogging at least 2 spots on the front page.

So, for a while, we were of the opinion that we simply couldn’t rank our homepage on the first page anymore. It’s simply not what Google considers an appropriate answer to the query of “SEO”.

[Screenshot: our homepage ranked #5 for “SEO”]

(Of course, just to prove us wrong, Google began ranking our homepage again. We’re currently the only agency site that ranks on the first page.)

The point of all this is that you may want to rank a certain page for a certain keyword, and despite all the good SEO you do, it never quite seems to break for you because the word means something different to you than it does to Google.

Take a closer look at the first page and see if maybe the types of results Google wants to show are different from the kind you want to provide.

Your Website May Look Great, but Its Beauty Is Only Skin Deep

You’ve paid a lot for a well-designed and very modern website. Everything about it looks great. You check it out on a daily basis just to appreciate the design a little more.

Why doesn’t Google appreciate it the way you do?

It’s possible that your design may look great but hasn’t accounted for every SEO angle.

There could be any number of things holding you back, including:

  • Duplicate content
  • Insufficient content
  • Old, untouched, stale content
  • Confusing navigation
  • Split keyword focus, so there are multiple pages that could rank for a given word
  • Incomplete basics, such as metas, alt tags, and schema markup (see the audit sketch after this list)
  • No blog or other way to continually refresh your content
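Most of these are detectable with a quick crawl of your own pages. A rough audit sketch (assuming the requests and beautifulsoup4 packages are installed; these checks mirror a few of the items above and are nowhere near exhaustive):

```python
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> list[str]:
    """Flag a few on-page basics: title, meta description, alt tags, thin content."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    problems = []
    if not soup.title or not soup.title.get_text(strip=True):
        problems.append("missing <title>")
    if not soup.find("meta", attrs={"name": "description"}):
        problems.append("missing meta description")
    if any(not img.get("alt") for img in soup.find_all("img")):
        problems.append("images missing alt text")
    if len(soup.get_text(" ", strip=True).split()) < 300:
        problems.append("thin content (under ~300 words, a crude heuristic)")
    return problems

for issue in audit_page("https://example.com/"):  # placeholder URL
    print("-", issue)
```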

You Have Gone Unnoticed by the Web at Large

Links are still a thing, and probably always will be. And while there’s a bajillion ways to get them, not all of them are worth having, and some could be detrimental.

You need a good portfolio of links from various sources. Some should be no-follow, some should be from really good sites, and some should be from just plain normal sites.

Buying links is out of the question. Link schemes, also bad. It’s important to find natural ways to increase the good links and avoid the ones that may raise flags for Google.

Of course, links could be a problem in another way, too.

For example:

You’ve Been Noticed by the Wrong Part of the Web

Some people may start using unethical tactics against your website.

They don’t actually need a reason to do this. They may simply choose your website as a target to inject malicious code. Or they may start scraping your blog and republishing all your content. Or they could start building countless links to your site from questionable websites.

Google is pretty good at spotting a negative SEO attack, but you don’t want to risk the kind of penalty that could result from it, so stay on guard.

And this brings us to our next entry:

Google May Have Put You in the Penalty Box

A Manual Action penalty can completely remove your website from Google’s search results.

If you’ve previously ranked really well and then dropped significantly (if not completely out of the rankings), you may be on the wrong side of a penalty.

The only thing you can do is check the Google Manual Actions report and start correcting the issues.

If you have received one of these reports, it means a human reviewer has determined that your website is no longer compliant with Google’s guidelines.

What could cause a Manual Action? According to Google, you could be penalized if the reviewer determines that you have:

  • A hacked site – Someone has uploaded and hidden malicious content on your site.
  • User-generated spam – Spam comments on forums or blogs.
  • Spammy freehosts – A significant portion of the pages hosted on a service are spammy.
  • Spammy structured markup – Markup on the page is outside the guidelines, like making some content invisible to users.
  • Unnatural links to your site – If you have a lot of links deemed artificial, deceptive, or manipulative (including buying links or participating in link schemes), you may be penalized.
  • Unnatural links from the site – Same as above, but now they’re coming from your site.
  • Thin content with little or no added value – Your pages need to offer some real value to users.
  • Cloaking or sneaky redirects – I.e., showing different pages to users and to Google.
  • Pure Spam – This includes most of the stuff already mentioned, just more aggressive and overt.
  • Cloaked images – Manipulative use of images in order to get more clicks.
  • Hidden text and keyword stuffing – These are oldies but goodies, and apparently it’s still enough of a problem for Google to list it here.

You’re Treating Your Website Like It Exists in a Vacuum

SEO does not exist in a vacuum. It lives right here with its neighbors: content marketing, social media, PPC, and many other online endeavors.


We’re not just trying to sell you on our other services, here. Online marketing is simply a far more holistic strategy than it once was.

Elements like time on site, number of clickthroughs, number of mentions around the internet, and engagement on social media all figure into your rankings.

Granted, some of them affect your rankings more indirectly than others, but they all play an important role.

Google is looking at more signals than just those you’re putting out on your website.

We’re not saying that posting regularly on Facebook is directly connected with better rankings. We’re saying that building a community on social media will lead to more people visiting your site, clicking your links, and reading your content.

And all of those things can lead to more than just better rankings.

These days, though, the most common reason your website isn’t ranking is probably:

Your Competition Is Doing More Than You

You’re not doing SEO in a vacuum.

Search engine optimization is no longer a secret technique that your competition has never heard of. It’s an integral part of modern marketing, and for every link you’re not building and every blog you’re not publishing, your competition is.

So, if you start to think you don’t need it, or if you start to think you’ve done enough, then there is someone working really hard to show you how it really should be done.

If you’ve been dipping your toes into SEO, you may start to see a little movement up the rankings. However, the simple fact is that those who wade out into the deeper end of SEO are going to see more results than you.

Remember, when you start doing SEO, it isn’t you against Google. It’s you against all your regular competition. And you’re all aiming to set up shop in a very limited space.

Even Small Changes Can Make a Difference

You may be thinking that there is a lot to do to start climbing to the top of the search engine rankings.

And you’d be right.

However, that’s just more of a reason to get started now. You can begin by making some small and simple changes to your website, even before you start thinking about whether you should hire an agency or go in-house for your SEO.

If you’re not ranking yet, you may just need to give it a little more time.

Or…

You may need to dive into a serious overhaul of your website.

Either way, examine your current situation, start small, and begin making the changes you can.

You may be surprised how much they help.

 


The post Why Isn’t My Website Ranking? appeared first on SEO.com.


Google Florida 2.0 Algorithm Update: Early Observations

It has been a while since Google has had a major algorithm update.

They recently announced one which began on the 12th of March.

What changed?

It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust to the algorithms.

In the most recent algorithm update some sites which were penalized in prior “quality” updates have recovered.

Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts like the first one without publishing that second chart showing the broader context.

The first penalty any website receives might be the first of a series of penalties.

If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

“In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions.” – Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse – a pile of algorithmic debt which must be dug out of before the bleeding stops.

Further, many recoveries may be nothing more than a fleeting invitation to false hope. To pour more resources into a site that is struggling in an apparent death loop.

The above site, which had its first positive algorithmic response in a couple of years, achieved that in part by heavily de-monetizing. After the algorithm updates had already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while), but what exactly is that traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google’s algorithms will be in how they treat it.

Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don’t add value to the ecosystem. Doorway pages don’t either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

This is the purpose of the knowledge graph & featured snippets: to allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.

Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of which typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.

Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

  • the shift to mobile, which both offers publishers lower ad yields while making the central ad networks more ad heavy in a way that reduces traffic to third party sites
  • the rise of the knowledge graph & featured snippets which often mean publishers remain uncompensated for their work
  • higher ad loads which also lower organic reach (on both search & social channels)
  • the rise of programmatic advertising, which further gutted display ad CPMs
  • the rise of ad blockers
  • increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site’s revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.
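The compounding is worth spelling out, because the headwinds multiply rather than add. A back-of-the-envelope sketch (15% per factor is an illustrative assumption, not a measurement):

```python
# Six independent headwinds, each assumed to shave 15% off revenue.
remaining = 1.0
for _ in range(6):
    remaining *= 1 - 0.15
print(f"revenue remaining: {remaining:.0%}")         # 38%, i.e. a ~62% decline
# Now mix in a penalty that cuts search traffic 90% on top of that:
print(f"with a penalty:    {remaining * 0.10:.1%}")  # ~3.8% of the original
```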

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don’t necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what’s going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else – like Facebook in its time – this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we’ve entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they’ve pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They’ve recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:

“Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals’ product pages, an experiment that shows the e-commerce giant’s aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon’s mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. … When a customer using Amazon’s mobile app searched for “AAA batteries,” for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries.”

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

As the market caps of big tech companies climb, they need to be more predatory to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They’ve created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

“It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. … Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg’s charitable arm.”

The above sort of dynamics have some claiming peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. … Since startups raise the variance within whatever industry they’re started in, the natural constituency for them is someone who doesn’t have capital deployed in the industry. If you’re an asset owner, you want low volatility. … Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. … As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they’ll earn market salaries and continue to push up rents. And one of the things they’ll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can’t really control the algorithms or the ecosystem.

All you can really control is your mindset & ensuring you have optionality baked into your business model.

  • If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.
  • If you operate a single website that is heavily reliant on a third party for distribution, then you have little to no optionality. Having multiple projects enables you to shift your attention toward whatever is going up and to the right while letting anything that is failing sit, without becoming overly reliant on something you can’t change. This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update ensues Google will collect more data with how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. … The “training” process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it’s doing. Rinse and repeat.
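Strip away the scale and the loop Dubut describes is ordinary iterative optimization. A toy sketch of the idea (two made-up features and hand-labeled ratings; nothing here is Bing's actual model):

```python
# Tweak each feature weight in the direction that decreases error against
# known query/URL relevance ratings, re-measure, rinse and repeat.
examples = [
    ([0.9, 0.2], 1.0),  # (feature values for a query/URL pair, human rating)
    ([0.1, 0.8], 0.0),
    ([0.7, 0.7], 1.0),
]
weights = [0.0, 0.0]
learning_rate = 0.1

for _ in range(1000):
    for features, rating in examples:
        predicted = sum(w * f for w, f in zip(weights, features))
        error = predicted - rating
        # Gradient step on squared error for a linear scoring model.
        weights = [w - learning_rate * error * f
                   for w, f in zip(weights, features)]

print("learned weights:", [round(w, 2) for w in weights])
```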

That same process is ongoing with Google now & in the coming weeks there’ll be the next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there’ll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.

Update: It appears a major reverberation of this update occurred on April 7th. From early analysis, Google is mixing in results for related midtail concepts on core industry search terms & in some cases pushing more aggressively on internal site-level relevancy, ranking a more relevant internal page for a query where the homepage might have ranked in the past.


SEO Basics: How to start with SEO?

You’ve had this great idea. You’ve built this amazing website. And then, you want that website to attract visitors! You want to be found! What to do? How do you get started with SEO? How do you start with SEO on a brand new site? In this blog post, I’ll talk you through the 7 steps you need to take in order to get your SEO strategy up and running. 

So, you’ve started your first site and you want it to be found, so you can share your thoughts and views with the world. What to do? Let’s go through the steps of starting with SEO!

  1. Install Yoast SEO

    Provided that your website is on WordPress, installing Yoast SEO should be the first step in your SEO strategy. Our Yoast SEO plugin will help you to make sure your website is crawlable and findable. Yoast SEO will immediately take care of some technical SEO issues, just by being installed on your website. Besides that, our plugin will help you to construct your website in such a way that Google will understand and rank it. We offer a free and a premium plugin. If you’re just starting out, you probably won’t need our premium version yet, although it can already save you some valuable time.

  2. Get that first link

    Google needs to know your website exists. And, in order for Google to know about your awesome new site, you need at least one external link towards your site. The reason for this: Google crawls the web. It follows links and saves all the webpages it finds in a very large database called the index. So, if you want to get into that index, you need (at least) one external link. So make sure to get that link from an external website!

  3. What do you want to rank for?

    Make sure to attract the right audience to your website. Who are your customers? For whom did you build this website? What terms do your customers use when searching on Google? Find out as much as you can about your audience.

    SEOs refer to this stage as doing your keyword research. This is a hard and important phase. There are a lot of helpful tools that make doing keyword research easier. Some of these tools are free, others are rather expensive. While these tools will make the difficult phase of keyword research easier, you should remember that you can’t outsource your keyword research to a tool. You really need to think about your audience and about the search terms they are using. Take your time for this phase. It is crucial. If you do your keyword research correctly, you’ll come up with a long list of keywords you want to rank for.

  4. Set realistic goals

    For a new site, it is rather hard to rank high in the beginning. Older sites already have a history, established their authority and a lot of links pointing towards them. That means that Google’s crawlers come by more often at older sites. For a new site to rank, you’ll always need to be a little patient. And remember: some search terms will be out of reach for a new site because there’s too much competition. Trying to rank for [WordPress SEO] will be rather hard for any new blog, because of some fierce competition on that term from Yoast.com.

    If you’re just starting with your site, try to aim at ranking for long-tail keywords. Long-tail keywords are keywords that are longer and more specific and have far less competition than the popular head keywords. After a while, when your site starts to rank for the long-tail keywords, you could try to go after the more competitive head keywords.

  5. Internal linking

    As I already mentioned in step 2, Google follows links. Google also follows the links on your website, your internal linking structure. It crawls through your website following the internal linking structure of your site. That structure is like a guide to Google. Make sure your internal linking structure is flawless. That’ll help with your ranking. 

    If you start with a brand new website, you probably won’t have much content yet. This is the perfect time to think about structure. Now it is relatively easy. It’s like having a new closet before you’ve started buying clothes. Now is the time to think about the things you want to put on the top shelf and which items you want to hide in the back of your closet. So, decide which pages are most important to you. What are the pages you want to rank with? Make sure that these pages have the most internal links pointing towards them (see the sketch after this list).

  6. Start writing

    In order to get ranked, you need to have content. A very important step in how to start with your SEO is to write amazing content for all these search terms you want to be found for. The content analysis in the Yoast SEO plugin will help you to write that content. Our analysis will help you to write a text that is both readable and SEO friendly.

    While you’re writing, make sure to use the words you want to be found for. Use them in headings and in the introduction and conclusion of your text. After writing your text, you should optimize your SEO title and your meta description. The Yoast SEO plugin will help you to do all these things.

  7. Get those links!

    External links are important for getting your site into high positions in the search engines. But gathering those external links can be a hard process. Make sure to write content people want to share and link to. Original ideas and great, valuable content make it much more likely that people will want to share and link to your work.

    Of course, reaching out to people and making them aware of your awesome website and product can be a good strategy to get those external links too. Read more about a successful link building strategy or find out what link building is first.
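Steps 2 and 5 both rest on the same mechanic: crawlers discover and weigh pages by following links. A rough sketch of auditing your internal linking the way a bot sees it (assuming the requests and beautifulsoup4 packages; example.com is a placeholder):

```python
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def internal_inlink_counts(start_url: str, max_pages: int = 50) -> dict[str, int]:
    """Crawl a site by following links and count internal links into each page."""
    site = urlparse(start_url).netloc
    to_visit, seen, inlinks = [start_url], set(), {}
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        except requests.RequestException:
            continue
        for a in soup.find_all("a", href=True):
            target = urljoin(url, a["href"]).split("#")[0]
            if urlparse(target).netloc == site:  # internal links only
                inlinks[target] = inlinks.get(target, 0) + 1
                to_visit.append(target)
    return inlinks

# The pages you most want to rank should sit near the top of this list.
counts = internal_inlink_counts("https://example.com/")
for page, count in sorted(counts.items(), key=lambda kv: -kv[1])[:10]:
    print(count, page)
```

If the pages you care about most are nowhere near the top of that list, your structure is telling Google the wrong story.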

And then what?

The truth is that SEO is more than these 7 steps. This is only the very beginning, the steps you take to start with SEO. In order to get long-term high rankings in the search engines, you need to do the hard work. Your content has to be amazing, your site structure has to remain flawless (and that’s challenging when your site is growing), and you’ll have to keep earning those external links. The only way to really do that, in the long run, is to make sure that your audience enjoys visiting your website. If you want to rank the highest, make sure your site is the very best. Good luck!

Read more: WordPress SEO: the definitive guide »

The post SEO Basics: How to start with SEO? appeared first on Yoast.
