
Google Florida 2.0 Algorithm Update: Early Observations

It has been a while since Google has had a major algorithm update.

They recently announced one which began on the 12th of March.

What changed?

It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust to the algorithms.

In the most recent algorithm update some sites which were penalized in prior “quality” updates have recovered.

Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update, showing a chart of a recovery without the second chart that reveals the broader context.

The first penalty any website receives might be the first of a series of penalties.

If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

“In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions.” – Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse – a pile of algorithmic debt which must be dug out of before the bleeding stops.

Further, many recoveries may be nothing more than a fleeting invitation to false hope. To pour more resources into a site that is struggling in an apparent death loop.

The above site, which had its first positive algorithmic response in a couple of years, achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while) but then what exactly is the traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google’s algorithms will be in how they receive it.

Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don’t add value to the ecosystem. Doorway pages don’t either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

This is the purpose of the knowledge graph & featured snippets. To allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.

Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of which typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.

Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

  • the shift to mobile, which offers publishers lower ad yields while making the central ad networks more ad heavy in a way that reduces traffic to third party sites
  • the rise of the knowledge graph & featured snippets which often mean publishers remain uncompensated for their work
  • higher ad loads which also lower organic reach (on both search & social channels)
  • the rise of programmatic advertising, which further gutted display ad CPMs
  • the rise of ad blockers
  • increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site’s revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.
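To make the compounding concrete, here is a toy sketch (the 15% hit per change is an assumed figure for illustration, not from the article) showing how six independent double-digit declines multiply out to a 60%+ drop:

```typescript
// revenue-decline.ts - toy sketch of how several independent double-digit hits compound.
// The 15% figure per change is an assumption for illustration only.
const hits = [0.15, 0.15, 0.15, 0.15, 0.15, 0.15]; // one hit per change listed above

const remainingShare = hits.reduce((share, hit) => share * (1 - hit), 1);
// 0.85^6 ≈ 0.377, i.e. roughly a 62% revenue decline with no penalty involved
console.log(`Remaining revenue: ${(remainingShare * 100).toFixed(1)}%`);
console.log(`Decline: ${((1 - remainingShare) * 100).toFixed(1)}%`);
```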

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don’t necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what’s going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else – like Facebook in its time – this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we’ve entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they’ve pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They’ve recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:

Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals’ product pages, an experiment that shows the e-commerce giant’s aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon’s mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. … When a customer using Amazon’s mobile app searched for “AAA batteries,” for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries.”

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

As the market caps of big tech companies climb they need to be more predatory to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They’ve created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

“It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. … Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg’s charitable arm.”

The above sort of dynamics have some claiming peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. … Since startups raise the variance within whatever industry they’re started in, the natural constituency for them is someone who doesn’t have capital deployed in the industry. If you’re an asset owner, you want low volatility. … Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. … As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they’ll earn market salaries and continue to push up rents. And one of the things they’ll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can’t really control the algorithms or the ecosystem.

All you can really control is your mindset & ensuring you have optionality baked into your business model.

  • If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.
  • If you operate a single website that is heavily reliant on a third party for distribution then you have little to no optionality. Having multiple projects enables you to shift your attention toward whatever is going up and to the right, while letting whatever is failing sit for a while, without becoming overly reliant on something you can’t change. This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update ensues Google will collect more data on how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. … The “training” process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it’s doing. Rinse and repeat.
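As a rough, hedged illustration of the iterative process the quote describes (not Bing’s or Google’s actual system), a toy linear relevance scorer trained by gradient descent might look like this; the features, ratings, and learning rate are all made-up values:

```typescript
// relevance-training.ts - toy sketch of "tweak weights, re-measure error, repeat".
// All numbers are invented; real ranking models are far more complex.
type Example = { features: number[]; rating: number }; // human-rated URL/query pairs

const data: Example[] = [
  { features: [0.9, 0.2], rating: 1.0 },
  { features: [0.1, 0.8], rating: 0.3 },
  { features: [0.5, 0.5], rating: 0.7 },
];

let weights = [0, 0];
const learningRate = 0.1;

for (let step = 0; step < 1000; step++) {
  // Nudge each weight in the direction that reduces squared error the most
  const gradient = [0, 0];
  for (const { features, rating } of data) {
    const predicted = features.reduce((sum, f, i) => sum + f * weights[i], 0);
    const error = predicted - rating;
    features.forEach((f, i) => (gradient[i] += error * f));
  }
  weights = weights.map((w, i) => w - (learningRate * gradient[i]) / data.length);
}

console.log('Learned feature weights:', weights);
```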

That same process is ongoing with Google now & in the coming weeks there’ll be the next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there’ll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.

Update: It appears a major reverberation of this update occurred on April 7th. From early analysis, Google is mixing in results for related midtail concepts on core industry search terms & in some cases they are also pushing more aggressively on doing internal site-level searches to rank a more relevant internal page for a query where the homepage might have ranked in the past.


More Articles

How to Learn SEO in 2019 (According to 130 Experts)

What’s the fastest way to learn SEO?

I asked 130 real SEO experts to find out.

Here is the question we asked 130 different experts:

“If you had to start over, what steps would you take to learn SEO?”

Out of the 130 answers, there were many common recommendations for learning SEO.

Top 8 Ways to Learn SEO

  1. Take Action
  2. Learn the Fundamentals
  3. Get a Mentor
  4. Focus on Your Strengths
  5. Invest in an SEO Course
  6. Work at an SEO Agency
  7. Don’t Chase Algorithms
  8. Go to Conferences

1. Take Action

Getting your “hands dirty” and getting more real life experience is the best way to learn SEO (and any skill).

2. Learn the Fundamentals

Follow top SEO blogs, SEO news websites, join SEO communities, and go to SEO conferences to get exposure to all facets of the industry.

3. Get a Mentor

Having an experienced mentor/SEO expert guide you through the process can save you years of trial and error.

4. Focus on Your Strengths

For example, if you are a good writer, then focus on the content-side of SEO. Or, if you are a social butterfly, then focus on the relationship building and PR-side of SEO/link acquisition.

5. Invest in an SEO Course

Quality courses are structured and easy to follow. This is important when you’re starting out because there is an unlimited amount of SEO information online. That makes it challenging to put all the moving parts together.

A proven course trusted by over 700 students can help (hint, hint… Gotch SEO Academy)

6. Work at an SEO Agency

Working at an agency will help you grow as a professional at an accelerated rate. Not only will you be forced to learn SEO quickly, but you will also get exposure to how agencies operate.

7. Don’t Chase Algorithms

Focus on developing timeless skills such as relationship building, persuasion, sales, SEO content creation, copywriting, and marketing in general.

8. Go to Conferences

SEO and marketing conferences are the single best way to network with other like-minded individuals. Also, you get to be around people who are more accomplished than you are. This forces you to elevate your game and to learn.

Start Learning Now With These SEO Resources

You can start learning SEO right now by diving into these resources.

General SEO

What is SEO? – You probably already know that SEO is an acronym for “Search Engine Optimization”. That’s important, but there’s a lot more to know about this industry than you think. This guide is a good place to start.

SEO Strategy – Every successful SEO campaign starts with a well-designed strategy. This guide will show you a simple 4-step strategy that has worked across nearly every vertical.

SEO Audit – The best way to start an SEO campaign is with a detailed audit. Properly performing an SEO audit will help you identify what to focus on. This is critical because not all SEO actions are created equal.

SEO Mistakes – I’ve made many mistakes throughout my SEO career, but you can avoid all of them by reading this guide.

Best CMS for SEO – Believe it or not, WordPress isn’t the #1 CMS for SEO. Read our data-driven case study to find out which one is.

On-Site SEO

On-Page SEO – Most people think on-page SEO is just placing keywords on a page. Wrong! There’s so much more you need to do to achieve perfect on-page SEO. This guide will show you our 80-point checklist.

Title Tags – Understanding how to optimize title tags is a fundamental SEO skill. Real SEO pros know that it’s much more than just jamming keywords into the title. Use this guide to learn the nuances of optimizing title tags.

301 Redirects – 301 redirects can be used to exponentially grow your site’s authority. They can also destroy your website at the same time. Read this guide to learn how to use them to your advantage.

404 Errors – Not all 404 errors are “bad”. You just need to know how to handle them. This guide will show you the right way to design 404 pages. Then, I’ll show you how to find and fix them.

HTTP vs. HTTPS – Your website should be using an SSL certificate. This guide explains why.

Redirect Chains – Redirect chains can rob your pages of precious link equity. The good news is that you can avoid them by reading this guide.

Content

How to Write a Blog Post – Your blog can be used as a catalyst to grow your company. Don’t take it lightly!

Link Building

Backlinks – It’s nearly impossible to rank and achieve long-term SEO performance without quality links. This epic guide will give you the baseline knowledge you need to succeed with link building.

Expired Domains for SEO – Most people use expired domains to do shady, grey hat SEO. This guide teaches you how to leverage them in a safer way.

Anchor Text – Understanding how to optimize anchor text is a fundamental link building skill. This guide will show you how to do it the right way, so you get killer results and avoid getting penalized.

Link Building Services – Finding quality link vendors is tough. That’s why I teamed up with Chris Dreyer and analyzed over $1,000,000 worth of link orders from the most popular vendors. Our goal was simple: to find out which vendors produce the highest quality work. The #1 vendor might surprise you.

PBNs – Every experienced SEO has used or been tempted to use PBNs in their career. The question is: are they worth it? Find out when you read this article.

SEO Tools

Ahrefs – Ahrefs is the best SEO tool on the market. This guide will show you how to use it the right way to drive more organic search traffic to your website.

Google Search Console – There aren’t many free SEO tools that can compete with Google Search Console. It’s extremely powerful and can be used to explode your organic search traffic.

Making Money

How to Make Money Online – Client SEO is the single fastest way to grow your income online. This is the only guide you’ll ever need to get your first clients.

How to Build a Niche Website – Building a niche website is the single best way to learn SEO and to build a new income stream. This is literally the only guide you’ll ever need.

Conversion Rate Optimization (CRO)

Squeeze Pages – Getting more organic search traffic is great, but converting that traffic is even more important. Did you know that ~98% of your website visitors are not ready to buy? That’s why it’s critical that you get them on your email list so you can nurture them. The first step is to build high-converting lead capture pages (also known as squeeze pages).

What’s Possible When You Learn SEO

I know it’s super internet marketer-like for me to say this, but learning SEO literally changed my life.

Here’s my story.


Site images and performance: optimization tips

Tips for optimizing images and performance

At the risk of sounding repetitive, our blog returns today to the topic of SEO optimization work and, in particular, to the techniques and tools that can help us optimize the images and multimedia content published on-page without weighing too heavily on site performance.

Restating the value of images for SEO

The topic comes up fairly often on these pages, and it could hardly be otherwise, given the attention Mountain View asks us to pay both to site speed and SEO (most recently, with the video from Martin Splitt and John Mueller a few weeks ago) and to the value of Google Images as an alternative, strategic source of organic traffic. Moreover, a quick look at how image search is evolving is enough to understand how hot this front is and how much the American company is investing in it.

The 7 tips for optimizing images and performance

The new guide published on web.dev offers a 7-step path to reaching the goal (or rather, the goals) of optimizing images so they have the smallest possible impact on site performance, using a handful of tools and techniques that turn out to be fairly simple even for the less experienced.

  1. Use Imagemin to compress images
  2. Replace animated GIFs with video for faster page loads
  3. Use lazysizes to lazy-load images
  4. Serve responsive images
  5. Serve images with the correct dimensions
  6. Use WebP images
  7. Use image CDNs to optimize resources

1. Compress images with dedicated tools

Katie Hempenius, a software engineer at Google, describes the process to follow to use Imagemin effectively, a tool available both as a CLI and as an npm module that compresses images without losing quality. The starting point is fairly clear: uncompressed images bloat pages with unnecessary bytes, especially since the average user won’t notice the difference in quality.

A pass through Google Lighthouse lets us check for opportunities to improve page loading by compressing images, while Imagemin proves to be “an excellent choice” because the software supports a wide variety of image formats and is easy to integrate with build scripts and build tools.

Without going into too much detail, the tool lets you decide whether compression should be “lossy” or “lossless”, that is, how much data to discard; naturally, lossy compression reduces file size at the cost of a possible reduction in image quality, while the other type involves no loss at all. In the Googler’s experience, however, the lossy mode is generally the better one, because it significantly reduces file sizes and lets you tailor compression levels to your needs.
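As a minimal sketch of the kind of build-time workflow the guide describes (the paths, plugin choice, and quality value are assumptions for the example), a Node/TypeScript script using the imagemin npm module might look like this:

```typescript
// compress-images.ts - minimal sketch, assuming imagemin and imagemin-mozjpeg are installed
import imagemin from 'imagemin';
import imageminMozjpeg from 'imagemin-mozjpeg';

async function compressImages(): Promise<void> {
  // Lossy-compress every JPEG in src/images into dist/images at quality 75
  const files = await imagemin(['src/images/*.jpg'], {
    destination: 'dist/images',
    plugins: [imageminMozjpeg({ quality: 75 })],
  });
  console.log(`Compressed ${files.length} images`);
}

compressImages().catch(console.error);
```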

2. Convert heavy GIFs into video formats

The second point is handled by Houssein Djirdeh, another member of the Google team, who offers some technical advice on adding animations to a page: to make loading faster, he says, it’s better to use video instead of GIFs. The reasons are many, starting with the fact that animated GIFs can be truly enormous.

Djirdeh reassures us, though: converting large GIFs into video is a relatively quick job that delivers excellent results in terms of saving users’ bandwidth. Here too, the first step is to check in Lighthouse whether there actually are GIFs that can be converted, while the suggested software for the job is FFmpeg, which turns the animation into an MP4 or WebM video (the latter format is not supported by all browsers).

The weight savings between a GIF and a video are obvious: in the example provided, the original 3.7 MB animation drops to 551 KB in the MP4 version and down to 341 KB in the WebM version. With a few commands and elements (such as <video>) you can set up videos that behave just like GIFs: they play automatically, loop continuously (though you can also choose to stop the continuous playback), and are silent.
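As a minimal sketch (not taken from the article), this is how such a GIF-like <video> element could be assembled with the DOM API; the file names are placeholders:

```typescript
// gif-like-video.ts - minimal sketch of a <video> that mimics an animated GIF.
// The source file names are placeholders.
const video = document.createElement('video');
video.autoplay = true;  // start playing automatically, like a GIF
video.loop = true;      // keep looping
video.muted = true;     // GIFs have no audio; muting also allows autoplay
video.setAttribute('playsinline', ''); // avoid fullscreen takeover on iOS

// Offer WebM first (smaller), fall back to MP4 for browsers without WebM support
for (const [src, type] of [
  ['my-animation.webm', 'video/webm'],
  ['my-animation.mp4', 'video/mp4'],
] as const) {
  const source = document.createElement('source');
  source.src = src;
  source.type = type;
  video.appendChild(source);
}

document.body.appendChild(video);
```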

3. Use lazysizes for lazy-loading images

It is again Katie Hempenius who explains how to use lazy loading for images, that is, the “deferred loading” that schedules on-page resources to load when they are needed rather than up front, so you avoid waiting for a full load of resources that aren’t even necessary. Images that are off-screen during the initial page load are ideal candidates for this technique, and using lazysizes makes the strategy very simple to implement.

lazysizes, the main library for the job, is a script that intelligently loads images as the user moves through the page and prioritizes the resources the user will encounter first. It is considered a good choice because it is highly performant at detecting the visibility of page elements.
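A minimal sketch of the lazysizes convention (the file name and container selector are placeholders): the library loads any <img class="lazyload"> only as it approaches the viewport, reading the real URL from data-src:

```typescript
// lazy-images.ts - minimal sketch of the lazysizes convention (file names are placeholders)
import 'lazysizes'; // registers itself and watches for .lazyload elements

// lazysizes swaps data-src into src only when the image nears the viewport
const img = document.createElement('img');
img.className = 'lazyload';
img.setAttribute('data-src', 'photos/large-hero.jpg');
img.alt = 'Hero image, loaded lazily';
document.querySelector('main')?.appendChild(img);
```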

4. Use responsive images

The next tip is easy to understand: Hempenius again points out how using responsive images can solve slow-loading problems, because “serving desktop-sized images to mobile devices can use 2-4 times more data than necessary”, while a different approach to media content, going beyond a generally responsive site, lets you serve images of different sizes to different devices.

Besides recommending a few tools for the job (the sharp npm package and the ImageMagick CLI tool, but also Thumbor and Cloudinary), the Googler also answers a frequent question: “How many image versions should I create?” There is obviously no single correct answer, but “it’s common to serve 3-5 different sizes of an image: serving more image sizes is better for performance, but will take up more space on your servers and require writing a bit more HTML”.

Image attributes

It’s also important to know which attributes of the <img> tag to use to achieve the desired result (a brief sketch follows the list):

  • srcset – a comma-separated list of image file names and their width or density descriptors; the width descriptor saves the browser from having to download the image to determine its dimensions.
  • sizes – the sizes attribute tells the browser how wide the image will be when displayed, but it has no effect on the display size itself (you still need CSS for that). To decide which image to load, the browser uses this information together with what it knows about the user’s device, namely its dimensions and pixel density. All else being equal, a high-pixel-density display will look sharper than a low-density one, so you need multiple versions if you want to give users images that are as sharp as possible regardless of the device’s pixels.
  • src – the src attribute makes this code work for browsers that don’t support the previous attributes, letting them load the resource specified by src.
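Here is the promised sketch, building a responsive <img> with those three attributes via the DOM API; the file names, widths, and breakpoint are placeholder values:

```typescript
// responsive-image.ts - minimal sketch of srcset/sizes/src (file names and widths are placeholders)
const img = document.createElement('img');

// Three pre-generated widths; the "480w"-style width descriptors let the browser
// pick a candidate without downloading it first.
img.srcset = [
  'flower-480w.jpg 480w',
  'flower-800w.jpg 800w',
  'flower-1200w.jpg 1200w',
].join(', ');

// Tell the browser how wide the image will render: full viewport width on small
// screens, otherwise an 800px slot (CSS still controls the actual layout).
img.sizes = '(max-width: 600px) 100vw, 800px';

// Fallback for browsers that ignore srcset/sizes.
img.src = 'flower-800w.jpg';
img.alt = 'A flower, served at a size appropriate to the device';

document.body.appendChild(img);
```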

5. Serve correctly sized images

We continue with very practical advice, starting from a fairly common oversight: not resizing an image before adding it to the page, which wastes users’ data and hurts page performance. Katie Hempenius again suggests running a check in Lighthouse to identify images with the wrong dimensions, but she also explains how to determine the right sizes.

According to the Googler, this topic can be “deceptively complicated” and there are two kinds of approach, a good one and a better one: both use CSS units and improve performance, but the second takes more effort and time to understand and implement, in exchange for better results.

  • The good approach relies on relative and absolute units: the former let you resize the image to a size that works across all devices, while the latter indicate an exact match to the display dimensions. The DevTools Elements panel can be used to determine the size at which an image is displayed.
  • The better approach takes longer: first set absolute units with the image’s srcset and sizes attributes, and then relative ones using responsive images. The starting point is always that an image that works on all devices will be needlessly large for the smallest devices, so you can set sizes better suited to each device.

There are also tools that can support this work, such as ImageMagick to resize images down to 25% of the original or to scale one to fit “200px wide by 100px tall”.
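As a minimal sketch of the same resize-to-fit idea, using the sharp npm package mentioned earlier in the article rather than ImageMagick (file names are placeholders):

```typescript
// resize-image.ts - minimal sketch, assuming the sharp npm package is installed
import sharp from 'sharp';

async function resizeToFit(): Promise<void> {
  // Scale the image down so it fits within 200x100 while keeping its aspect ratio
  await sharp('photos/original.jpg')
    .resize({ width: 200, height: 100, fit: 'inside' })
    .toFile('photos/original-200x100.jpg');
}

resizeToFit().catch(console.error);
```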

6. Take advantage of the WebP format

The sixth tip is even more straightforward: use the WebP file format instead of the other types. The reasons are clear: WebP images are smaller than their JPEG and PNG counterparts, with an average 25-35% reduction in file size, which in turn affects page size and improves performance.

Two giants like YouTube and Facebook also use these files: on YouTube, switching to WebP thumbnails produced 10% faster page loads; by moving to the new format, Facebook recorded savings of 25-35% compared to JPEG file sizes and 80% savings on PNG files.
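A minimal sketch of converting an existing image to WebP, again assuming the sharp npm package (file names and the quality value are placeholders):

```typescript
// convert-to-webp.ts - minimal sketch, assuming the sharp npm package is installed
import sharp from 'sharp';

async function toWebP(): Promise<void> {
  // Re-encode a PNG as WebP; quality 75 is an arbitrary example value
  await sharp('photos/screenshot.png')
    .webp({ quality: 75 })
    .toFile('photos/screenshot.webp');
}

toWebP().catch(console.error);
```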

7. Use image CDNs to optimize resources

Katie Hempenius’s final tip concerns image CDNs (image content delivery networks), which, according to the Googler, are excellent for optimization because they allow savings of 40-80% on file size.

Image CDNs are systems specialized in transforming, optimizing, and delivering images. For images served through them, the resource URL indicates not only which image to load but also parameters such as size, format, and quality, making it simple to create variants of an image for different use cases. They differ from build-time image optimization scripts because they create new versions of images when they are needed, so they are generally better suited than build scripts at creating images heavily customized for each individual client.
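As a toy sketch of what such a parameterized URL might look like: the host and the query parameter names (w, h, fmt, q) are entirely hypothetical, since every image CDN defines its own URL scheme:

```typescript
// image-cdn-url.ts - toy sketch of a CDN-style image URL; the host and
// parameter names are hypothetical, each CDN defines its own scheme.
function cdnImageUrl(
  path: string,
  opts: { width?: number; height?: number; format?: string; quality?: number } = {},
): string {
  const url = new URL(path, 'https://images.example-cdn.com');
  if (opts.width) url.searchParams.set('w', String(opts.width));
  if (opts.height) url.searchParams.set('h', String(opts.height));
  if (opts.format) url.searchParams.set('fmt', opts.format);
  if (opts.quality) url.searchParams.set('q', String(opts.quality));
  return url.toString();
}

// e.g. https://images.example-cdn.com/products/shoe.jpg?w=400&fmt=webp&q=70
console.log(cdnImageUrl('/products/shoe.jpg', { width: 400, format: 'webp', quality: 70 }));
```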

How to choose an image CDN

There are many good image CDN options: some providers have more features than others, but all of them will probably let you save bytes on your images and therefore load your pages faster. Beyond feature sets, other factors to consider when choosing a network are cost, support, documentation, and ease of setup or migration.

The article Site images and performance: optimization tips appeared first on SEOZoom.


User research: the ultimate guide

When you want to make some considerable improvements to your website, what’s the best place to start? At Yoast, we feel that research is always one of the most important things to do. It’ll help you find out what needs work, why it needs work, and of course, what you have to do to make things better.

Looking at our website data in Google Analytics, Google Search Console, and other SEO tools is already part of our weekly activities. But, in order to dive deeper than just having a look at the plain data, we love to do user research. This is the part where you truly get to know your customers and where you’ll discover your blind spots when it comes to your own website. Within this ultimate guide, we’ll show you what types of user research could be valuable for your own website or company.

Table of contents

The top task survey

What kind of user research fits and complements the existing data always depends on the type of project you run. However, we believe that running a so-called ‘top task survey’ should always be the first step when you start doing user research, as you’ll be able to use the outcomes of a top task survey within all future projects.

So, what is a top task survey exactly? To get to know why your customers visit your website, you’ll need to talk to your customers. And, how do you get to talk to your customers without actually having conversations with lots of customers? You could set up an online top task survey, which pops up on your visitor’s screen as soon as you like, either immediately after entering the website or after a couple of minutes. 
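As a rough illustration of how such a popup could be triggered (this is not Yoast’s or any survey tool’s implementation; the delay, markup, and question wording are placeholders):

```typescript
// top-task-survey.ts - minimal sketch of a timed top task survey popup.
// The delay and question wording are placeholders.
const SURVEY_DELAY_MS = 2 * 60 * 1000; // show the survey a couple of minutes after arrival

function showTopTaskSurvey(): void {
  const box = document.createElement('div');
  box.innerHTML = `
    <form id="top-task-survey">
      <label>
        What is the purpose of your visit to this website? Please be as specific as possible.
        <textarea name="purpose" rows="4"></textarea>
      </label>
      <button type="submit">Send</button>
    </form>`;
  document.body.appendChild(box);

  box.querySelector('form')?.addEventListener('submit', (event) => {
    event.preventDefault();
    const purpose = new FormData(event.target as HTMLFormElement).get('purpose');
    console.log('Top task answer:', purpose); // in practice, send this to your survey backend
    box.remove();
  });
}

window.setTimeout(showTopTaskSurvey, SURVEY_DELAY_MS);
```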

Questions in a top task survey

The popup is set up for one simple reason: to find out the purpose of your visitors’ visit to your website.

To make sure you’ll get valuable data out of your top task survey, it’s important to ask the right question. We recommend asking one open question: ‘What is the purpose of your visit to this website? Please be as specific as possible.’

With this open question, you give your customers the chance to truly say what they think. Closed questions make this harder, as you’ve already drawn up certain answers and then you risk missing other important thoughts or opinions your customers may have. And we know, analyzing the answers will take a lot more time, but when you do this right you’ll get the most valuable results.

Next to this one open question, it is possible to add one or two closed questions to take a closer look at your respondents. You might want to know their age, or the type of customer they are. This data can be valuable to combine with the answers to the open question. In the top task surveys at Yoast, the second question is: ‘Do you have the Yoast SEO plugin?’. This is valuable information for us, because we can see the difference between what free users and what Premium users are looking for on our website.

How often should you do a top task survey?

We recommend running your top task survey once a year. If you have a small website, you can choose to run the survey once every two years. The market you work in is always changing and customers always change, so every time you run the survey, you’ll receive new, valuable information to work with and improve on.

The exit survey

The following two types of research we’ll discuss are more specific. And, which one you should perform at what time depends on the type of project you’re about to run. 

For example, you’ve noticed in your Google Analytics data that your most visited page has a very high bounce rate. This means that you need to know why visitors are leaving this fast. Couldn’t they find what they were looking for? Or did they find what they were looking for and are they already satisfied? You can get answers to these questions by running an exit survey on a specific page.

What is an exit survey?

An exit survey pops up when a visitor is about to leave the page. When a visitor moves their mouse cursor towards their browser bar, they are usually about to leave your website. So, this is the right moment to ask your visitor one or more questions. 
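A minimal sketch of that exit-intent trigger (not a specific survey tool’s implementation; the question and the prompt-based UI are placeholders):

```typescript
// exit-survey.ts - minimal sketch of an exit-intent trigger.
// Shows the survey once, when the cursor moves up toward the browser bar.
let exitSurveyShown = false;

document.addEventListener('mouseout', (event: MouseEvent) => {
  // relatedTarget is null when the pointer leaves the document entirely;
  // clientY <= 0 means it left through the top edge, i.e. toward the browser bar.
  if (!exitSurveyShown && event.relatedTarget === null && event.clientY <= 0) {
    exitSurveyShown = true;
    showExitSurvey();
  }
});

function showExitSurvey(): void {
  // Placeholder UI: in practice this would open your survey widget (Hotjar, a custom modal, etc.)
  const answer = window.prompt('What information were you looking for today on our website?');
  if (answer) {
    console.log('Exit survey answer:', answer); // send to your survey backend
  }
}
```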

Questions in an exit survey

So, your visitor is about to leave, what do you want to know before they’re gone? We recommend keeping the survey short and simple: people are already leaving, so if you want them to fill out your survey, it needs to be short.

The question you ask depends on the page and the problem you want to solve. When you have a specific blog post with a high bounce rate, you might want to know if visitors have found the information they were looking for. A simple open question you could ask is: ‘What information were you looking for today on our website?’. You could add a second closed question to see if the page fulfills your visitor’s needs: ‘Have you found what you’ve been looking for?’. A simple ‘yes’ or ‘no’ is enough to get this overview.

Within our post ‘What is an exit survey and why should you use it?’ we’ve added some more examples of questions you could ask depending on the page that needs attention.

User testing

The third type of research is ‘user testing’. User testing is the type of research in which you get ‘live’ feedback from your clients because you actually see people using your website or product. At the beginning of this guide, we already mentioned ‘blind spots’ and user testing is the best way to find these blind spots. For example, you know exactly where to find what information or what product on your website, but visitors might not. Seeing testers struggle with finding the right page on your website can be embarrassing, but the good news is, when you know, you can improve!

Why should you do user testing?

User testing can give you very valuable insights during every stage of your process. When you’re creating a new product, it’s valuable to see what potential customers think of it, but it’s just as valuable to see what your customers think of your product that has already existed for over three years. Every test will give you new insights to work with!

User testing also guarantees that the test results are ‘real’. You can see for yourself how your customers use your website or product. The customer can’t ‘lie’ about things. And that’s a big difference from survey respondents, who can say things that differ from what they actually experience.

How to get started

There are three main types of user testing which you could use for your own website or product:

  • Live, moderated user testing: your testers will test with a moderator in the same room.
  • Remote, moderated user testing: your testers will test with a moderator while they’re in contact through a video call.
  • Remote user testing without a moderator: your testers will test without a moderator in their own time and space. They will record the test so you can watch it later.

Within our specific post ‘What is user testing and why should you do it?’, we explain more thoroughly what type should be used in what situation. 

After picking the method you want to use, it’s important to set up a clear plan with goals and a test scenario. This way, you make sure the testers follow the right path and give you the insights you need. After that, it’s time to recruit your testers. Decide what types of testers you’ll need to get the best test results. We recommend recruiting different types: young people, older people, experienced people, inexperienced people, etc. Think of all the types of people that might use your website or product now and in the future.

Then it’s time to get started! Create a plan and start testing with your recruited people. Make sure you record all tests; this makes it easier to analyze the results, as it’s nearly impossible to remember everything that happened during them.

Analyzing user research results

Analyzing the results of surveys, such as the top task survey and the exit survey, differs somewhat from analyzing user testing results.

When analyzing an online survey, we recommend exporting all data to a sheet and creating categories for all answers. Place every answer into a specific category to get a clear overview of what the biggest problems are. After that, you can easily see which problems need to be prioritized and you can start thinking of improvements. Set up an action plan and start improving!

For the user tests, it can be more difficult to create a couple of categories that fit the test results. Here, it’s easier to write a summary for every user test and to combine those at the end. Can you discover similarities? Can you combine some issues to improve more at once? It’s important to look at the bigger picture so you can make improvements that will have a big impact on the future user experience of your website or product!

User research tools

There are several tools in which you can create a top task survey or an exit survey (or other surveys!). We’re currently using Hotjar, but we’re planning to create our own design and implement it with Google Tag Manager. Tools we know for setting up online surveys are:

  • Hotjar
  • SurveyMonkey
  • Mopinion

On their sites, they have a clear explanation of how to use these tools to perform an exit survey.

For user testing, your needs are different. To test a website or a product, you’ll need a testing environment for your testers or a test product they are allowed to use. Besides that, you’ll need recording equipment: for testing a website, you can easily record a screen session, but for testing a product, you’ll need to think of a recording setup. Do you have a good camera and a tripod, for example? Then you can get started! When you’re doing user tests more often, you can also use an eye tracker to get more insight into how people look at your website or product, but it’s not necessary!

Are you already doing user research as well? Or have we convinced you to start doing user research? Let us know in the comments below!

Read more: Panel research for your business: Benefits and tips »

The post User research: the ultimate guide appeared first on Yoast.
