Site Audit (Details)
XIRI.TN | Your trusted e-commerce platform in Tunisia
Errors: 585 | Warnings: 3,513 | Info: 9,967 | Pages: 1,018 | Date: Mar 26, 2024
Some of your resources return 4xx status codes.
4xx errors often point to a problem on a website. For example, if you have a broken link on a page and visitors click it, they may see a 4xx error. It's important to regularly monitor and fix these errors, because they may have a negative impact and lower your site's authority in users' eyes.
Some of your resources return 5xx status codes.
5xx error messages are sent when the server has a problem or error. It's important to regularly monitor these errors and investigate their causes, because they may have a negative impact and lower the site's authority in search engines' eyes.
Good job! Your site's 404 error page is set up correctly.
A custom 404 error page can help you keep users on the website. In a perfect world, it should inform users that the page they are looking for doesn't exist, and feature such elements as your HTML sitemap, the navigation bar and a search field. But more importantly, a 404 error page should return the 404 response code. This may sound obvious, but unfortunately it's rarely so.
According to Google Search Console:
"Returning a code other than 404 or 410 for a non-existent page... can be problematic. Firstly, it tells search engines that there's a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site's crawl coverage may be impacted. We recommend that you always return a 404 (Not found) or a 410 (Gone) response code in response to a request for a non-existing page."
Well done! A robots.txt file is available on your website.
The robots.txt file is automatically crawled by robots when they arrive at your website. This file should contain commands for robots, such as which pages should or should not be indexed. If you want to disallow indexing of some content (for example, pages with private or duplicate content), just use an appropriate rule in the robots.txt file. For more information on such rules, check out http://www.robotstxt.org/robotstxt.html.
Please note that directives placed in the robots.txt file are suggestions rather than absolute rules for robots to follow. There's no guarantee that a robot will not check the content that you have disallowed.
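For illustration, a minimal robots.txt that blocks a hypothetical private directory while leaving the rest of the site crawlable might look like this (the /private/ path and sitemap URL are placeholders, not your actual setup):
  User-agent: *
  Disallow: /private/
  Sitemap: https://www.example.com/sitemap.xml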
Well done! An .xml sitemap is present on your website. Remember to resubmit it to search engines each time you make changes to it.
An XML sitemap should contain all of the website pages that you want to be indexed and should be located in the site's root directory (e.g. http://www.site.com/sitemap.xml). In general, it serves to aid indexing. You should update it each time you add new pages to your website. The sitemap must also follow a particular syntax.
The sitemap allows you to set the priority of each page, telling search engines which pages they are supposed to crawl more often (i.e. the ones that are updated more frequently). Learn how to create an .xml sitemap at http://www.sitemaps.org/.
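For illustration, a minimal sitemap.xml with a single entry might look like this (the URL and values are placeholders):
  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/some-page/</loc>
      <lastmod>2024-03-26</lastmod>
      <changefreq>weekly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>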
None of your site's resources are restricted from indexing.
A resource can be restricted from indexing in several ways, for example with a disallow rule in robots.txt, a noindex meta tag, or a noindex X-Robots-Tag HTTP header.
Good job! www and non-www versions on your website have been merged.
Usually websites are available with and without "www" in the domain name. Merging both URLs will help you prevent search engines from indexing two versions of a website.
Although the indexing of both versions won't cause a penalty, setting one of them as a priority is a best practice, in part because it helps funnel the SEO value from links to one common version. You can look up or change your current primary version in the .htaccess file. Also, it is recommended to set the preferred domain in Google Search Console.
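As a rough sketch, on an Apache server the non-www version could be 301-redirected to the www version with .htaccess rules along these lines (example.com is a placeholder domain; adapt to your actual configuration):
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
  RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]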
There are no HTTP/HTTPS content duplication issues on your website.
There are 302 redirects found on your website. Please make sure that the use of these redirects is justified.
302 redirects are temporary, so they don't pass any link juice. If you use them instead of 301s, search engines may continue to index the old URLs, and disregard the new ones as duplicates. Or they may divide the link popularity between the two versions, thus hurting search rankings. That's why it is not recommended to use 302 redirects if you are permanently moving a page or a website. Stick to a 301 redirect instead to preserve link juice and avoid duplicate content issues.
There are 301 redirects found on your website. Check all your 301 redirects and make sure they point to relevant pages and are set up correctly.
301 redirects are permanent and are usually used to solve problems with duplicate content or to redirect certain URLs that are no longer necessary. The use of 301 redirects is absolutely legitimate, and it's good for SEO because a 301 redirect will funnel link juice from the old page to the new one. Just make sure you redirect old URLs to the most relevant pages.
There are pages with long redirect chains (longer than 2 redirects) found on your website. It is strongly recommended to avoid more than 2 redirects in a redirect chain to make sure that your page is indexed properly by search engines.
In certain cases, either due to a bad .htaccess file setup or due to deliberately taken measures, a page may end up with two or more redirects. It is strongly recommended to avoid redirect chains longer than 2 redirects, since they may cause multiple issues.
Well done! No Meta refresh redirects were found on your website.
Basically, Meta refresh may be seen as a violation of Google's Quality Guidelines and therefore is not recommended from the SEO point of view. As one of Google's representatives points out: "In general, we recommend not using meta-refresh type redirects, as this can cause confusion with users (and search engine crawlers, who might mistake that for an attempted redirect)... This is currently not causing any problems with regards to crawling, indexing, or ranking, but it would still be a good idea to remove that." So stick to the permanent 301 redirect instead.
No pages with rel="canonical" tag or rel="canonical" HTTP header were found on your website.
In most cases, duplicate URLs are handled via 301 redirects. However, sometimes - for example, when the same product appears in two categories with two different URLs and both need to stay live - you can specify which page should be considered a priority with the help of rel="canonical" tags. The tag should be correctly implemented within the <head> of the page and point to the main page version that you want to rank in search engines. Alternatively, if you can configure your server, you can indicate the canonical URL using rel="canonical" HTTP headers.
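For instance, a secondary URL could point to its main version with a tag like this in its <head>, or with an equivalent HTTP header if you can configure the server (the URL is a placeholder):
  <link rel="canonical" href="https://www.example.com/category/main-product/">
  Link: <https://www.example.com/category/main-product/>; rel="canonical"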
Well done! None of your HTTPS pages have mixed content issues.
Your site's homepage is not mobile-friendly. Review the problems spotted by WebSite Auditor and fix them where possible.
Mobile-friendliness is one of the ranking factors used by Google for mobile search engine results. If lots of your traffic comes from mobile devices, you should make your site mobile-friendly to get higher rankings and more traffic.
According to Google, the mobile-friendly algorithm affects mobile searches in all languages worldwide and has a significant impact on Google's search results. The algorithm works on a page-by-page basis: it is not about how mobile-friendly your site is overall; each page is simply either mobile-friendly or not.
The algorithm is based on criteria such as font sizes, tap targets/links, content readability, viewport configuration, etc.
Good job! None of your website's pages have multiple canonical URLs.
In the case of multiple rel="canonical" declarations, Google will likely ignore all the rel=canonical hints, so your effort to avoid duplicate content issues may be wasted.
Well done! Your website pages are free from Frames.
Frames allow displaying more than one HTML document in the same browser window. As a result, text and hyperlinks (the most important signals for search engines) appear missing from such documents. If you use Frames, search engines will fail to properly index your valuable content.
Some pages on your site have errors in HTML markup.
Search engine spiders find it easier to crawl semantically correct markup, which is why a site's HTML markup should be valid and free of errors. If, for example, a tag is left unclosed, spiders may miss an entire chunk of the page, reducing its value.
Validation is usually performed with the W3C Markup Validation Service. And although compliance with W3C standards is not obligatory and has no direct SEO effect, bad code may be the reason Google does not index your important content properly. It's recommended to fix your pages' broken code to avoid issues with search engine spiders.
Some pages on your site have errors in CSS markup. Please analyze the CSS issues on those pages and fix the most critical ones.
The validation is usually performed via the W3C Markup Validation Service (W3C stands for World Wide Web Consortium).
CSS styles are used to control the design and formatting of the page, and to separate styles from the structure, which ultimately makes the page load faster.
Errors in CSS may not be that important to search engines, but they can lead to your page being displayed incorrectly to visitors, which, in turn, may affect your conversion and bounce rates. So make sure the page is displayed as intended across all browsers (including mobile ones) that matter to you.
There are dofollow links to other sites on the website.
Please revise your followed links and make sure they point to high-quality, relevant pages. It's recommended to remove any links to pages of questionable quality or to accompany them with rel="nofollow". To add the nofollow attribute to a link, simply write rel="nofollow" within the <a href> tag.
For instance: <a rel="nofollow" href="example.com">Example</a>.
Simply speaking, dofollow links are links missing the rel="nofollow" attribute. Such links are followed by search engines and pass PageRank (please note that links can also be restricted from following in bulk via the nofollow <meta> tag).
While there is nothing wrong with linking to other sites via dofollow links, if you link extensively to irrelevant or low-quality sites, search engines may conclude your site sells links or participates in other link schemes, and it can get penalized.
There are broken images found on the website.
An image is considered broken if it returns a 4xx or 5xx status code, if the image URL is not specified in the <img> tag, if its URL leads to non-image content, or if a DNS error is detected.
To fix the problem, make sure that the correct image URL is specified in the HTML code, and amend it if needed. Second, check if the image itself is still available on the server and restore it if possible. And finally, if there's no way to restore the broken image, simply replace it with another one or remove it from the content altogether.
While broken images on the website don't influence its search engine rankings directly, they definitely deserve to be fixed for two reasons.
First and foremost, broken images are a crucial factor for user experience and may result in visitors bouncing away from the site without completing their goals.
And second, missing images may impede the site's crawling and indexation, wasting its crawl budget and making it hard for search engine bots to crawl some of the site's important content.
There are empty image alt attributes on your website. Create alternative texts that best describe your image's content and, if relevant, include target keywords.
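For instance, instead of an empty alt attribute, describe what the image actually shows (the file name and text are placeholders):
  <img src="/images/red-running-shoes.jpg" alt="Red running shoes with white soles">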
Some pages on your site are larger than 3 MB. Review these pages and reduce their size if possible.
If you have pages that are too big, this can affect user experience and even search engine rankings, so think about reducing the size of such pages to make them load faster.
Naturally, there's a direct correlation between the size of a page and its loading speed, which, in turn, is one of the numerous ranking factors. Basically, heavy pages take longer to load. That's why the general rule of thumb is to keep your page size under 3 MB. Of course, it's not always possible: for example, if you have an e-commerce website with a large number of images, the size may exceed this limit, but that can significantly increase page loading time for users on a slow connection.
Dynamic URLs have been found on your website. Please see if you can fix them.
URLs that contain dynamic characters like "?", "_" and parameters are not user-friendly because they are not descriptive and are harder to memorize. To increase your pages' chances to rank, it's best to set up URLs so that they are descriptive and include keywords rather than numbers or parameters. As Google Webmaster Guidelines state, "URLs should be clean coded for best practice, and not contain dynamic characters."
Overly long URLs were found on your website. Please review the pages with long URLs and consider whether to shorten them.
URLs shorter than 115 characters are easier for end users and search engines to read, and help keep the website user-friendly.
There are some broken outgoing links on your website. This may result in poor user experience and signal to search engines that your site is neglected. Look through those links and fix them.
Broken outgoing links can be a bad quality signal to search engines and users. If a site has many broken links, they conclude that it has not been updated for some time. As a result, the site's rankings may be downgraded.
Although 1-2 broken links won't cause a Google penalty, try to regularly check your website, fix broken links (if any), and make sure their number doesn't go up. Besides, users will like your website more if it doesn't show them broken links pointing to non-existing pages.
There are pages on your site with more than 100 outgoing links. Check these pages and, if possible, decrease the number of outgoing links.
According to Matt Cutts (former head of Google's Webspam team), "...there's still a good reason to recommend keeping to under a hundred links or so: the user experience. If you're showing well over 100 links per page, you could be overwhelming your users and giving them a bad experience. A page might look good to you until you put on your 'user hat' and see what it looks like to a new visitor." Although Google keeps talking about user experience, too many links on a page can also hurt your rankings. So the rule is simple: the fewer links on a page, the fewer problems with its rankings. Try to stick to the best practices and keep the number of outgoing links (internal and external) under 100.
Some of your website pages are missing titles. Please review the problematic pages and create unique titles for them.
If a page doesn't have a title, or the title tag is empty (i.e. it just looks like this in the code: <title></title>), Google and other search engines will decide for themselves which text to show as your page title in their SERP snippets. Thus, you'll have no control over what people see on Google when they find your page.
Therefore, every time you are creating a webpage, don't forget to add a meaningful title that would also be attractive to users.
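For instance, a unique, descriptive title is placed within the page's <head> (the text is a placeholder):
  <title>Red Running Shoes for Men | Example Store</title>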
Some of your website pages have identical (or duplicate) titles. Please review the problematic pages and rewrite their titles to make them unique.
A page title is often treated as the most important on-page element. It is a strong relevancy signal for search engines, because it tells them what the page is really about. It is, of course, important that the title includes your most relevant keyword. More than that, every page should have a unique title to ensure that search engines have no trouble determining which of the website pages is relevant for a query. Pages with duplicate titles have fewer chances to rank high. What's more, if your site has pages with duplicate titles, this may negatively influence other pages' rankings, too.
Some of your titles are longer than 70 characters. Review and rewrite them.
Every page should have a unique, keyword-rich title. At the same time, you should try to keep title tags concise. Titles that are longer than 70 characters get truncated by search engines and will look unappealing in search results. Even if your pages rank on page 1 in search engines, if their titles are shortened or incomplete they won't attract as many clicks as they otherwise could.
Some of your pages do not have meta descriptions. Review those pages and create meta descriptions where necessary.
Although meta descriptions don't have a direct influence on rankings, they are still important because they form the snippet people see in search results. Therefore, a meta description should "sell" the webpage to the searcher and encourage them to click through.
If the meta description is empty, search engines will decide for themselves what to include in a snippet.
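For instance, a meta description is placed within the page's <head> (the text is a placeholder):
  <meta name="description" content="Browse our range of running shoes and order online with fast delivery.">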
Some of your website pages have identical (or duplicate) descriptions. Please review the problematic pages and rewrite their descriptions to make them unique.
According to Matt Cutts, it is better to have unique meta descriptions, or even no meta descriptions at all, than to show duplicate meta descriptions across your pages. Hence, make sure that your most important pages have unique and optimized descriptions.
Some meta descriptions on your website are longer than 160 characters. They will be cut off by search engines and, as a result, will look unappealing to users. Please review the problematic descriptions and rewrite them.
For a meta description, use a maximum of 160 characters, as longer meta descriptions get truncated by search engines.
Although meta descriptions don't have a direct effect on rankings, they are still important because they form the snippet people see in search results. Therefore, descriptions should "sell" the webpage to searchers and encourage them to click through. If a meta description is too long, it'll get cut off by the search engine and may look unappealing to users.
Great job! All the pages in your project pass the Core Web Vitals assessment.
Core Web Vitals are a set of field metrics that measure important aspects of real-world user experience on the web such as loading, interactivity, and visual stability. The assessment is based on three core metrics:
•Largest Contentful Paint (LCP);
•First Input Delay (FID);
•Cumulative Layout Shift (CLS);
To pass the assessment, a page should meet the recommended targets at the 75th percentile for all of the above three metrics.
Great job! All of your pages' Performance Score is high.
Performance score summarizes the main page performance metrics including:
•First Contentful Paint;
•Speed Index;
•Largest Contentful Paint;
•Time to Interactive;
•Total Blocking Time;
•Cumulative Layout Shift;
A score of 90 or above is considered good. A score from 50 to 89 needs improvement, and below 50 is considered poor.
Great job! No render-blocking resources have been found on your pages.
Render-blocking resources are scripts and stylesheets that are blocking the first paint of your page and increase the load time. To reduce the impact of the render-blocking URLs, it's recommended to deliver critical JS/CSS inline, defer all non-critical JS/styles, and remove anything unused.
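As a rough sketch, a non-critical script can be deferred and a non-critical stylesheet loaded asynchronously with markup along these lines (file names are placeholders):
  <script src="/js/app.js" defer></script>
  <link rel="preload" href="/css/non-critical.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/non-critical.css"></noscript>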
Well done! All the images on your website are properly sized.
Ideally, your page should never serve images that are larger than the version rendered on the user's screen - serving appropriately sized images helps save cellular data and improves page load time.
The main strategy is to serve responsive images: generate multiple versions of each image, then specify which version to use in your HTML or CSS using media queries, viewport dimensions, and so on. Using image CDNs and replacing complex icons with SVG can also translate into 40-80% savings in image size.
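For instance, several versions of the same image can be offered via srcset and sizes so that the browser picks the most appropriate one (file names and widths are placeholders):
  <img src="/images/product-800.jpg"
       srcset="/images/product-400.jpg 400w, /images/product-800.jpg 800w, /images/product-1600.jpg 1600w"
       sizes="(max-width: 600px) 100vw, 50vw"
       alt="Product photo">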
Great job! No offscreen/hidden images on your website load above-the-fold.
Loading offscreen/hidden images may affect user experience by downloading data that isn't immediately required, increasing load time and delaying the page's full interactivity.
Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower Time to Interactive.
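For instance, native lazy loading can be enabled with the loading attribute (the file name is a placeholder):
  <img src="/images/below-the-fold-banner.jpg" loading="lazy" alt="Promotional banner">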
Good job! All the images on your website are efficiently encoded.
Proper image optimization makes the pages load faster and reduces data consumption. Image compression can result in significant size savings. Below are some of the steps you can take to optimize images across your website:
•Using image CDNs;
•Compressing images;
•Replacing animated GIFs with video;
•Lazy loading images;
•Serving responsive images;
•Serving images with correct dimensions;
•Using WebP images;
Great job! All images on your website are using next-gen formats.
JPEG 2000, JPEG XR, and WebP are image formats that have superior compression and quality characteristics compared to their older JPEG and PNG counterparts. Encoding your images in these formats rather than JPEG or PNG means that they will load faster and consume less cellular data.
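For instance, a WebP version can be served with a JPEG fallback for older browsers using the <picture> element (file names are placeholders):
  <picture>
    <source srcset="/images/product.webp" type="image/webp">
    <img src="/images/product.jpg" alt="Product photo">
  </picture>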
Well done! There are no CSS files on your website that require minifying.
CSS files are often larger than they need to be - minifying CSS files can reduce network payload and improve your page load performance. CSS minifiers can perform clever optimizations and improve the byte efficiency of your files by removing unnecessary whitespace and comments, reducing values that have shorthand equivalents, etc.
Well done! There are no JavaScript files on your website that require minifying.
Minifying JavaScript files can reduce payload sizes and script parse time. Minification is the process of removing whitespace and any code that is not necessary to create a smaller but perfectly valid code file. You can use a JavaScript compression tool to get the job done.
Great! No unused CSS files have been found on your website.
By default, a browser must download, parse, and process all external stylesheets that it encounters before it can display or render any content to a user's screen. Each external stylesheet must be downloaded from the network. These extra network trips can significantly increase the time that users must wait before they see any content on their screens - this is how unused CSS slows down the performance.
Ideally, you should remove dead rules from stylesheets and defer the loading of CSS not used for above-the-fold content to reduce unnecessary bytes consumed by network activity.
Great! No unused JavaScript files have been found on your website.
Unused JavaScript can slow down your page load speed:
If the JavaScript is render-blocking, the browser must download, parse, compile, and evaluate the script before it can proceed with all of the other work that's needed for rendering the page.
Even if the JavaScript is asynchronous (i.e. not render-blocking), the code competes for bandwidth with other resources while it's downloading, which has significant performance implications. Sending unused code over the network is also wasteful for mobile users who don't have unlimited data plans.
Following best practices, you should detect and remove unused JavaScript to reduce bytes consumed by network activity.
Great job! There are no duplicate modules in JavaScript bundles found on your website.
JavaScript bundles on a majority of webpages are typically built by importing code from popular libraries, dependencies, and packages. This can often result in your page inheriting duplicate modules from multiple sources. Removing duplicate modules in JavaScript bundles ensures you don't ship unnecessary JavaScript code to your visitors.
Well done! None of your pages serve legacy JavaScript to modern browsers.
Unnecessary legacy code is often shipped to modern browsers even though they have native support for modern JavaScript features (i.e., ES6). Ultimately, this increases the size of the JavaScript files being downloaded, parsed, and executed by the browser. It happens because developers often transpile ES6 code to the ES5 standard to account for the small portion of users who may still be using browsers with no or partial support for ES6. Avoid serving legacy JavaScript code (i.e., the ES5 standard) to modern browsers so that you can prevent unnecessarily large JavaScript files from being downloaded by users.
Great job! There are no text-based resources on your website that require compression.
Enabling text compression allows you to serve smaller text-based resources like HTML, CSS, and JavaScript in the interest of faster file downloads. The larger those files are, the longer it takes to download them, and the longer your visitors have to wait to view the content on your page.
Great job! None of the third-party resources used on your website require preconnecting.
As third-party resources (e.g., Facebook or YouTube embeds) do not originate from your domain, their behaviour is sometimes difficult to predict and they may negatively affect page experience for your users. Third-party requests can slow down page loads for several reasons like slow networks, long DNS lookups, multiple redirects, slow servers, poor performing CDN, etc. Establishing early connections to these third-party origins by using a resource hint like preconnect can help reduce the time delay usually associated with these requests.
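For instance, an early connection to a third-party origin can be hinted at like this (the origin shown is just an example):
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>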
Well done! Your website server has passed a response time audit (TTFB).
Server response time (Time to First Byte (TTFB)) is the time it takes for the browser to receive the first byte in response to the browser request. Reducing TTFB is critical to your visitors' page experience as it affects every resource referenced in your HTML, and directly influences how long it takes for your page to load. A slow TTFB may negatively affect your front-end resources as your visitors may only see a blank page while the browser is waiting for a response from the server.
Well done! None of your pages use multiple redirects (two or more redirects).
Redirects slow down your page load speed. When a browser requests a resource that has been redirected, it has to make another HTTP request at the new location to retrieve the resource. This additional trip across the network can delay the loading of the resource by hundreds of milliseconds. That is why, whenever possible, try to minimize using URL redirects. A page fails this audit when it has two or more redirects.
Well done! None of the critical requests used on your pages require preloading.
When a page loads, the browser must download and parse the HTML in order to fetch the resources needed for the page content. Some late-loading resources (i.e. third-level requests) may be cascaded or called from within other requests while others may simply be large - both instances impact your page performance. Using link rel="preload" can help you prioritize important requests, resulting in a faster page load.
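For instance, a critical late-discovered resource such as a web font or hero image can be prioritized like this (file names are placeholders):
  <link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
  <link rel="preload" href="/images/hero.jpg" as="image">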
Great job! There are no large GIF-files on your website.
Large GIFs are inefficient for delivering animated content. By converting large GIFs to videos, you can save big on users' bandwidth. Consider using MPEG4/WebM videos for animations and PNG/WebP for static images instead of GIF to save network bytes.
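For instance, an animated GIF can be replaced with a short, silent, looping video (file names are placeholders):
  <video autoplay loop muted playsinline>
    <source src="/media/animation.webm" type="video/webm">
    <source src="/media/animation.mp4" type="video/mp4">
  </video>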
Great job! Your Largest Contentful Paint image loads within the recommended time range.
Largest Contentful Paint (LCP) is a Core Web Vitals metric that measures when the largest content element in the viewport becomes visible. It can be used to determine when the main content of the page has finished rendering on the screen. For better user experience and page performance, it is recommended to preload the Largest Contentful Paint image so that it starts loading as soon as the browser sends the first request for the page.
Well done! There are no excessively large pages on your website.
Every time your page loads, the browser requests the server for your page resources. The total size of all these resources determines your network payload.
The higher the network payload, the larger the page; thus, the longer it takes to download the resources and load the page. Large payloads may also cost your visitors more money; for example, users may have to pay for more cellular data.
You can avoid enormous network payloads using the following strategies:
•Defer non-critical resources;
•Minimize the size of your resources;
•Cache relevant requests;
Great job! No uncached resources have been found on your website.
HTTP caching can speed up your page load time on repeat visits.
When a browser requests a resource, the server providing the resource can tell the browser how long it should temporarily store or cache the resource. For any subsequent request for that resource, the browser uses its local copy rather than getting it from the network.
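For instance, a server can tell browsers to cache a static asset for a year with a response header like this (the values are a common choice, not a requirement):
  Cache-Control: public, max-age=31536000, immutable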
Well done! All of your website pages meet the recommendations on the number of DOM elements.
Whenever a page loads, the browser downloads and parses the HTML before it begins building the Document Object Model (DOM) tree. This DOM tree contains all the HTML elements comprising the structure and content of the webpage.
A large DOM tree can negatively affect your page performance in the following ways:
•Unnecessarily increase the number of bytes transferred;
•Drastically slow down the rendering of your page;
•Overwhelm the memory capabilities of your users' devices.
Great job! No processes keep the main-thread busy for too long.
The browser's renderer process is what turns your code into a web page that your users can interact with. By default, the main-thread of the renderer process typically handles most code: it parses the HTML and builds the DOM, parses the CSS and applies the specified styles, and parses, evaluates, and executes the JavaScript.
The main thread also processes user events. So, any time the main-thread is busy doing something else, your web page may not respond to user interactions, leading to a bad experience.
Consider reducing the time spent parsing, compiling and executing JS - this could be done by delivering smaller JS payloads.
Great job! No long main-thread tasks are maximizing the main-thread work on your pages.
Events like HTML/CSS parsing and JavaScript parsing/execution, among others, are "tasks" that run on the main thread (by default). When any one of these tasks runs for longer than 50 ms (becoming a "long task"), it can delay both First Paint and the time it takes for your page to become fully interactive.
Decreasing the number of long main-thread tasks improves your overall page performance by minimizing main-thread work.
Great job! There's no JavaScript on your website whose execution time needs to be reduced.
Avoiding large JavaScript libraries can help prevent a large JavaScript payload for your page. This, in turn, reduces the time needed by the browser to download, parse, and execute JavaScript files. It is always preferable to use smaller yet functionally equivalent JavaScript libraries to prevent a large JavaScript bundle size. Smaller JavaScript bundles can help you avoid a long main-thread blocking time.
Great job! There are no images on your website that lack explicit width and height.
Images and/or videos that do not have explicit width and height attributes can cause large layout shifts as your page loads. They are usually resized using CSS (either on the image itself or on the parent container). When this happens, the browser can only determine their dimensions and allocate space for them once it starts downloading the unsized images and/or videos.
Specify both the width and height for your webpage's image and video elements to ensure that the correct spacing is used for images and videos.
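For instance (the dimensions are placeholders that should match the image's actual aspect ratio):
  <img src="/images/hero.jpg" width="1200" height="600" alt="Hero image">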
Great job! No fonts on your website cause a flash of invisible text during font load.
Fonts are often large files that take a while to load. Some browsers hide text until the font loads, causing a flash of invisible text (FOIT).
The easiest way to avoid showing invisible text while custom fonts load is to temporarily show a system font. By including font-display: swap in your @font-face style, you can avoid FOIT - it tells the browser that text using the font should be displayed immediately using a system font. Once the custom font is ready, it replaces the system font.
Another option is using link rel="preload" as="font" to fetch your font files earlier.
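For instance, in your CSS (the font name and file path are placeholders):
  @font-face {
    font-family: "BrandFont";
    src: url("/fonts/brandfont.woff2") format("woff2");
    font-display: swap;
  }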
Well done! No scripts on your website are injected via 'document.write()'.
Using 'document.write()' can delay the display of page content by tens of seconds and is particularly problematic for users on slow connections. Chrome therefore blocks the execution of 'document.write()' in many cases, meaning you can't rely on it.
Remove all uses of 'document.write()' in your code. If it's being used to inject third-party scripts, try using asynchronous loading instead.
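For instance, a third-party script can be loaded asynchronously instead of being injected via 'document.write()' (the URL is a placeholder):
  <script src="https://third-party.example.com/widget.js" async></script>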
Good job! No third-party resources have been found on your website that impact your pages' performance.
Third-party resources are often responsible for poor web performance, as they may contain scripts that block your main-thread and prevent other tasks from being performed.
They may also slow down your page for several other reasons, including slow server response times, slow DNS lookups, server response errors, among others, which can impact your page performance.
It is vital to limit the number of redundant third-party providers - at the very least, try loading third-party code after your page has mostly finished loading.
Great job! No third party embeds are slowing down your pages' load.
Third-party resources are often used for displaying ads or videos and integrating with social media. The default approach is to load third-party resources as soon as the page loads, but this can unnecessarily slow the page load.
If the third-party content is not critical, this performance cost can be reduced by lazy-loading it. In that case, a facade is used in place of the third-party content until the user interacts with it.
Great job! No elements on your pages cause large layout shifts.
Large layout shifts can create a frustrating experience for your visitors as they make your page appear visually jarring, as page elements appear suddenly, move around, and affect how your visitors interact with the page. Avoiding large layout shifts is essential in creating a smooth and streamlined experience for your visitors.
Best practices to avoid large layout shifts:
•Specifying image dimensions;
•Reducing layout shifts caused by ads, embeds, and iframes;
•Avoiding inserting new content above existing content;
•Preventing the Flash of Invisible Text (FOIT);
•Avoiding non-composited animations;
Great! No non-composited animations have been found on your website.
Animations which are not composited can be janky and increase CLS which affects your overall Performance Score.
A non-composited animation refers to any animation in which CSS or JavaScript modifications would trigger re-painting of pixels to your page, which increases main-thread work.
Avoiding non-composited animations can speed up your page load and prevent page jank, i.e., your page stuttering or appearing visually unstable as it loads.
The website does not have any language- or region-specific pages.
If you have a multi-language website with different regional versions of a page, there is a good way to tell search engines about these localized variations: hreflang elements.
In this section you can review the list of all hreflang values that have been found on pages of the project domain. You may also check Google guidelines (https://support.google.com/webmasters/answer/189077) for multi-language websites about how to use hreflang elements properly.
The website does not have any language- or region-specific pages.
This factor checks and verifies the hreflang elements used on the project domain's pages. For every page, WebSite Auditor provides a detailed report on the incorrect usage of hreflang elements and their values.
The website does not have any language- or region-specific pages.
This factor analyses and verifies the values of all hreflang elements found on your pages. One of the most common mistakes is using incorrect language-country values in hreflang annotations, for example 'en-UK' instead of 'en-GB'.
All hreflang values should comply with ISO 639-1 standards (https://en.wikipedia.org/wiki/List_of_ISO_639-1_codes) if only language versions of a page are specified or with ISO 3166-1 Alpha 2 (https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2) if both language and country versions are indicated. You can also use the "x-default" value for pages that do not have other better suited language/region versions.
The website does not have any language- or region-specific pages.
This factor validates all URLs that are used in hreflang attributes on your website. Make sure to use fully-qualified URLs for each language/region version of a page, including the "http://" or "https://" protocol. Relative URLs are not allowed in hreflang attributes.
The website does not have any language- or region-specific pages.
Each language version of a page should have hreflang attributes that point to the page itself and also to other language/region versions of a page. It is also required that all language/region versions should point to each other in their hreflang attributes.
Example: there is a page in English (en) with 2 other language variants - Russian (ru) and German (de). The head section of the English version (en) should contain 3 hreflang elements: one pointing to the English version itself and 2 others pointing to the Russian (ru) and German (de) pages. Both the German (de) and Russian (ru) variants should also have 3 hreflang elements that point back to the English version, to each other, and to themselves. Otherwise, if pages do not point back to each other, their hreflang attributes will be ignored by search engines.
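For the example above, each of the three versions would carry the same set of annotations in its <head> (URLs are placeholders):
  <link rel="alternate" hreflang="en" href="https://www.example.com/en/page/">
  <link rel="alternate" hreflang="ru" href="https://www.example.com/ru/page/">
  <link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">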
The website does not have any language- or region-specific pages.
This factor checks for conflicting hreflang values: a page language variant should have only one hreflang language attribute assigned (e.g. a page should not have both "en" and "de" hreflang values). However, it is possible to assign the same language with different regions: en-US and en-GB.
The website does not have any language- or region-specific pages.
This factor lists all pages that have hreflang attributes but at the same time use a canonical element which points to some other page. Such a combination of elements can confuse search engines because it suggests different URLs for indexation. The best practice is to either remove the canonical tag or to link the canonical element to the page itself.
The website does not have any language- or region-specific pages.
This factor checks for pages that use hreflang elements without the "x-default" value. Using "x-default" is not obligatory, but it is a good way to tell search engines which page version they should use for languages and regions that have not been defined through your hreflang attributes.
The website does not have any language- or region-specific pages.
This factor analyzes all pages that use language-region hreflang attributes and verifies that all these pages also have the "x-default" value and provide a generic URL for geographically unspecified users of the same language. Example: a page may have specific variants for English-speaking users from Canada (en-CA) and Australia (en-AU), but you also need to specify a generic URL that will be used for English-speaking users from other countries.
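For instance, the Canadian and Australian variants from the example above could be accompanied by a generic English URL and an x-default fallback (URLs are placeholders):
  <link rel="alternate" hreflang="en-ca" href="https://www.example.com/en-ca/">
  <link rel="alternate" hreflang="en-au" href="https://www.example.com/en-au/">
  <link rel="alternate" hreflang="en" href="https://www.example.com/en/">
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/">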
Report created: Mar 26, 2024 by SEO PowerSuite