Use the hierarchy of your site's technical SEO needs to understand its technical growth and actualization — and uncover what to do next to improve it.

What makes a site become the best site it can be? Healthy, functional sites that have reached their full SEO potential have been optimized based on market and keyword research, E-A-T, content relevance to search intent, backlink profiles, and more. But they all have one thing in common: their technical SEO needs are met.

Your site's technical SEO needs form a hierarchy. If needs lower in the hierarchy aren't met, needs on the next level are difficult to fulfill. Each level responds to a different requirement in the world of search engines: crawlability, indexability, accessibility, rankability, and clickability.

Understanding what each level of the pyramid involves helps make technical SEO look less intimidating without oversimplifying its role in making a website great.

The foundations of technical SEO: crawlability

At the foundation of the pyramid of technical SEO needs is a URL's crawlability.

Crawlability concerns a URL's ability to be discovered by search engine bots. URLs that are not crawlable might still be accessible to users navigating your website, but because they are invisible to bots, they can't appear in search results.

Crawlable URLs, therefore, are:

  • Known to search engines. Search engines discover URLs by crawling links and reading sitemaps.
  • Not forbidden to bots. Most search engine bots will respect meta robots instructions and directives in a robots.txt file that ask them not to crawl certain pages or directories (see the sketch after this list).
  • Covered by the website's crawl budget. Less commonly, the “budget” accorded by Google's algorithms is spent on other parts of a site, causing delays or problems in getting a specific URL crawled.
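To make the robots directives and sitemap discovery concrete, here is a minimal, hypothetical robots.txt (the paths and sitemap URL are placeholders):

    # Hypothetical robots.txt served at https://www.example.com/robots.txt
    User-agent: *                  # applies to all bots
    Disallow: /internal-search/    # ask bots not to crawl internal search results
    Disallow: /cart/               # keep crawl budget off transactional pages

    # Declare the sitemap so search engines can discover URLs
    Sitemap: https://www.example.com/sitemap.xml

Note that robots.txt governs crawling, not indexing: a URL disallowed here can still end up indexed if other sites link to it.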

The first step in a technical SEO audit, for example, is to uncover pages that can't be crawled, and why. Sometimes this is intentional, and sometimes it's an error and a quick win for SEO.

Similarly, while crawl budget may seem esoteric and difficult to quantify, the basic principle is that when the cost of crawling is optimized and priority pages are presented first, more traffic can be gained through search engines. Technical SEO uses how pages are discovered and prioritized to promote better crawling; it leverages historical data on crawl frequency, and on past situations that provoked increased crawling activity, to improve current crawl rates.
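As a sketch of how crawl activity can be quantified, the following Python snippet counts Googlebot hits per day and site section from a server access log. The log path and combined log format are assumptions, and the user-agent filter is naive (real audits verify Googlebot via reverse DNS):

    import re
    from collections import Counter

    LOG_PATH = "access.log"  # hypothetical combined-format access log
    line_re = re.compile(r'\S+ \S+ \S+ \[(?P<day>[^:]+)[^\]]*\] "\w+ (?P<path>\S+)')

    hits = Counter()
    with open(LOG_PATH, encoding="utf-8") as log:
        for line in log:
            if "Googlebot" not in line:  # naive filter; verify by reverse DNS in practice
                continue
            match = line_re.match(line)
            if match:
                # Count hits per (day, top-level section) to see where crawl budget goes.
                section = "/" + match.group("path").lstrip("/").split("/")[0]
                hits[(match.group("day"), section)] += 1

    for (day, section), count in sorted(hits.items()):
        print(f"{day}  {section:<20} {count}")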

Indexability

Just above crawlability in the hierarchy of technical SEO needs is indexability.

Indexable URLs are URLs that a search engine can include in its catalog of pages available to be presented in search results. Even when a URL has been crawled, various properties can prevent it from being added to the index.

In the most straightforward situations, pages can be prevented from being indexed by meta robots and robots.txt directives.
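For example, a page can ask search engines to leave it out of the index with a meta robots tag, or with the equivalent X-Robots-Tag HTTP header for non-HTML resources:

    <!-- In the page's <head>: keep the page out of the index but let bots follow its links -->
    <meta name="robots" content="noindex, follow">

    # Equivalent HTTP response header (useful for PDFs and other non-HTML files)
    X-Robots-Tag: noindex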

But Google also chooses not to index pages when a more authoritative version of the same content exists. This is the case when a bot discovers the following elements:

  • Duplicate content.
  • Canonical declarations.
  • Alternate versions such as printable pages or mobile pages. (In the current move to a mobile-first index, mobile versions are indexed instead of desktop versions.)
  • Redirections.

To ensure that the right pages can be indexed, technical SEO verifies that these elements are correctly set up and that they apply to the correct pages.
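A canonical declaration, for instance, is a single link element on each variant of a page that points to the version that should be indexed (the URLs here are placeholders):

    <!-- On https://www.example.com/shoes?sort=price and other parameter variants -->
    <link rel="canonical" href="https://www.example.com/shoes">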

Accessibility and website performance

An accessible URL is easy to display or render.

A URL that is both crawlable and indexable might still be inaccessible at the moment when a search engine's bot attempts to crawl it. Pages and sites that rank but have persistent accessibility problems are often penalized in the search results.

Accessibility for bots — and for users — covers a broad range of related topics:

  • Server performance.
  • HTTP status.
  • Load time/page size.
  • JavaScript rendering.
  • Page depth in the site architecture.
  • Orphan pages.
  • Website resistance to spam and hacking.

The goal is to discover the threshold at which accessibility and performance metrics negatively impact SEO performance, and to ensure that all pages of a website meet at least that minimum level. Technical SEO, therefore, uses tools to measure everything from server downtime or the HTTP status served to bots and users, to the size of resources (CSS, JS, images…) transferred when a page is requested, to load time metrics such as TTFB (time to first byte), FCP (first contentful paint), or TTLB (time to last byte).
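As a minimal sketch of this kind of measurement, the snippet below uses Python's requests library to record the HTTP status and an approximation of TTFB for a list of placeholder URLs (response.elapsed stops at the response headers, so it only approximates time to first byte):

    import requests

    URLS = [  # hypothetical priority URLs to monitor
        "https://www.example.com/",
        "https://www.example.com/products/",
    ]

    for url in URLS:
        # stream=True returns as soon as the headers arrive, so elapsed ~ TTFB.
        response = requests.get(url, stream=True, timeout=10)
        ttfb_ms = response.elapsed.total_seconds() * 1000
        print(f"{url}  status={response.status_code}  ~TTFB={ttfb_ms:.0f} ms")
        response.close()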

Technical SEO audits that conclude you need links to certain pages are often working to eliminate underperforming orphan pages and URLs with excessive page depth. Some audits also include accessibility for users; a page that does not work with a screen reader cannot be used by many users, no matter how great its content or keyword optimization.
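To make page depth and orphan pages concrete, here is a small sketch that walks an internal link graph breadth-first from the homepage; any known page that is never reached is an orphan. The graph is a hand-built stand-in for real crawl data:

    from collections import deque

    # Toy internal link graph: page -> pages it links to (stand-in for crawl data).
    links = {
        "/": ["/blog/", "/products/"],
        "/blog/": ["/blog/post-1", "/"],
        "/products/": ["/products/shoes"],
        "/blog/post-1": [],
        "/products/shoes": [],
        "/old-landing-page": [],  # exists, but nothing links to it
    }

    depth = {"/": 0}
    queue = deque(["/"])
    while queue:  # breadth-first walk from the homepage
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)

    orphans = set(links) - set(depth)  # known pages no link path reaches
    print("Max depth:", max(depth.values()))
    print("Orphan pages:", sorted(orphans))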

Once accessibility issues have been addressed, we can say that the basic technical SEO needs of a page are met. Without them, page and website SEO suffer. As we continue to move further up the hierarchy of needs, we pass from blocking factors to factors of improvement.

Rankability: the role of technical SEO in improving positions

Rankability is the first of the two top levels of the pyramid that deal with optimizations. Rather than forming the foundations of SEO, these levels are sometimes considered advanced technical SEO.

Crawlable, indexable, and accessible URLs can clearly rank. Some can even rank well. However, the average URL will rank better with a little help.

Using links to boost rankings

Linking, whether internal or external, transfers page importance (and traffic!) from popular pages to less popular pages, and the less popular pages profit. Technical SEO strategies will, therefore, examine backlinks to determine the most advantageous profile, or use internal linking structures to promote pages.

Not only can internal links improve crawl rate (by reinforcing freshness when linking from new or updated content) and conversion (by funneling users towards high-converting and goal pages), but they also transfer page importance and help build content silos, two strategies for improving page rank.
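One way to see how links redistribute importance is a simplified PageRank-style iteration over the internal link graph. This is a toy model under standard damping assumptions, not Google's actual computation:

    # Toy internal link graph: page -> pages it links to.
    links = {
        "/": ["/blog/", "/products/"],
        "/blog/": ["/", "/products/shoes"],
        "/products/": ["/products/shoes"],
        "/products/shoes": ["/"],
    }

    damping = 0.85
    rank = {page: 1.0 / len(links) for page in links}

    for _ in range(50):  # power iteration until the scores stabilize
        new_rank = {page: (1 - damping) / len(links) for page in links}
        for page, targets in links.items():
            share = damping * rank[page] / max(len(targets), 1)
            for target in targets:
                new_rank[target] += share  # each link passes on a share of importance
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {page}")

In this toy model, pages that receive a larger share of internal links end up with higher scores, which is why internal linking structures can be used to promote priority pages.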

Improving positions with semantic optimization

Content silos, created by interlinking semantically related content, help groups of pages rank better than a single page could. They build both depth and expertise while expanding keyword reach with pages that focus on long-tail keywords and semantically related concepts.

In some cases, it can also be worthwhile to look at the pertinence of a page with regard to the rest of the site, and to examine keyword density, word count, text-to-code ratio, and other factors that can be either red flags or content quality indicators for a given keyword group.
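These factors are straightforward to approximate. Here is a rough sketch using only Python's standard library; the HTML and target keyword are placeholders, and real content audits are more nuanced:

    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects visible text, skipping script and style contents."""
        def __init__(self):
            super().__init__()
            self.chunks, self._skip = [], 0
        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self._skip += 1
        def handle_endtag(self, tag):
            if tag in ("script", "style") and self._skip:
                self._skip -= 1
        def handle_data(self, data):
            if not self._skip:
                self.chunks.append(data)

    html = "<html><body><h1>Trail shoes</h1><p>Our trail shoes grip wet rock.</p></body></html>"
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(" ".join(parser.chunks).split())

    words = text.lower().split()
    density = words.count("shoes") / len(words)  # naive keyword density
    ratio = len(text) / len(html)                # rough text-to-code ratio
    print(f"words={len(words)}  density(shoes)={density:.1%}  text-to-code={ratio:.1%}")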

Clickability: the link between SEO and user behavior

The final level of technical SEO optimization concerns technical elements that make it more likely for a user to click on your results.

Because of how search engines present results, this can include earning coveted SERP locations outside of the normal organic results order and enriching your URL listings.

Content structures such as lists, tables, and headings help search engines understand your content and facilitate the dynamic creation of featured results, carousels, and more.

Similarly, formal structured data, including Schema.org markup, enhances search listings with rich elements:

  • Breadcrumbs.
  • Star ratings.
  • Product information (price, stock…).
  • Event information (date, location…).
  • Recipe information (thumbnail, rating, preparation time, calories…).
  • Site links to key pages on the same site.
  • Site search from the SERP.
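As an illustration, a minimal Schema.org Product markup in JSON-LD might look like this (all values are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Trail Running Shoe",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128"
      },
      "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>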

Likewise, videos and images with appropriate markup have an advantage in image and video search.

Relevance to search intent and content uniqueness draw users. While these remain abstract concepts, the technical tools to analyze and improve them are emerging. Techniques such as machine learning can be applied to search intent and user click behavior, while content creation aids such as AI are intended to facilitate the creation of new content.

In the meantime, technical SEO aims to use technical means to spot and signal potential discrepancies in search intent or duplicate content through similarity analysis.
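A common sketch of such similarity analysis compares TF-IDF vectors of page texts with cosine similarity. This example uses scikit-learn; the page texts and the 0.8 threshold are illustrative assumptions:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Stand-in page texts; in practice, extracted from crawled HTML.
    pages = {
        "/blue-widgets": "Buy blue widgets online. Our blue widgets ship fast.",
        "/blue-widgets-2": "Buy blue widgets online today. Blue widgets ship fast.",
        "/contact": "Contact our support team by phone or email.",
    }

    vectors = TfidfVectorizer().fit_transform(pages.values())
    sims = cosine_similarity(vectors)

    urls = list(pages)
    for i in range(len(urls)):
        for j in range(i + 1, len(urls)):
            flag = "  <- near-duplicate?" if sims[i, j] > 0.8 else ""
            print(f"{urls[i]} vs {urls[j]}: {sims[i, j]:.2f}{flag}")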

Finally, technical SEO analyzes user behavior data combined with website characteristics in order to discover correlations. The objective is to create more of the situations in which your website draws users. This strategy can uncover surprising correlations between page or website structure and user-based metrics like bounce rate, time on site, or CTR.
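A sketch of that kind of analysis with pandas, assuming a hypothetical export that joins crawl metrics and analytics per URL (the file name and column names are placeholders):

    import pandas as pd

    # Hypothetical export: one row per URL with crawl and user metrics.
    df = pd.read_csv("crawl_and_analytics.csv")

    # Pairwise correlations between site characteristics and user behavior.
    columns = ["depth", "word_count", "load_ms", "bounce_rate", "ctr"]
    print(df[columns].corr().round(2))

Correlation is not causation: strong pairs are hypotheses to investigate, not conclusions.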

Implementing technical improvements

You don't need a technical background to understand or to meet the critical needs at the bottom of the technical SEO hierarchy.

If there are issues that keep your site from being crawled, indexed, ranked, or clicked, SEO efforts in other areas won't be as effective. Spotting and resolving these issues is the role of technical SEO.

Where does your site fall on the hierarchy of technical SEO needs?
