What are some of the most common technical errors on a website that search engines don't like at all?
From a technical point of view, many web pages have errors, but the most important ones are those related to web page setup and indexing. The most common are:
- the website exists in duplicate versions (the page opens both with and without www, and both with and without a trailing slash);
- low-value pages (thin content) get crawled and indexed, wasting the crawl resources devoted to the website;
- important pages cannot be found and indexed (noindex, pages without incoming links, incorrectly configured URLs);
- robots.txt and sitemap files (sitemap.xml) are missing;
- the website responds with 4xx errors or unnecessary 3xx redirects;
- the website uses too many design resources and pages do not load fast enough;
- no HTTPS;
- the website is not mobile-friendly.
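To illustrate the robots.txt and sitemap bullet above, a minimal robots.txt (the domain and paths are placeholders) looks like this:

```
# Served at https://example.com/robots.txt
User-agent: *
Disallow: /admin/        # keep private sections out of the crawl
Sitemap: https://example.com/sitemap.xml
```

The Sitemap line points crawlers at the sitemap.xml file so new and updated pages are discovered faster.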
Even if a website is well optimized for search terms and content, these technical errors can significantly hurt its discoverability. When search engines cannot get at the optimized content and see that users are running into problems as well, they tend to favor pages in the results where users do not encounter such issues.
Mobile-friendliness and fast page loading have become especially important lately.
What does it mean if a website is not mobile-friendly and how does it affect search results?
It means that when viewed on a mobile device, users get the same layout as on desktop. The text and all the content are therefore extremely small, and the website itself is difficult to use: you have to zoom in to view content, and clicking links and buttons is especially fiddly. Because mobile internet is not always the fastest, the page also takes longer to load, since it fetches the same resources as the desktop version, meaning both the number and the size of the items are large. This affects mobile search results in particular, because ranking such a website highly would make life harder for users: they would have to wait longer for it to load and would also find it difficult to use.
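The "same structure as the desktop" behaviour described above usually comes down to a missing viewport declaration; a responsive page includes one in its `<head>`:

```html
<!-- Without this tag, mobile browsers render the full desktop layout
     and shrink it to fit the screen, producing the tiny text and
     hard-to-tap links described above. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```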
That is why mobile-friendly websites, and especially their simplified versions (such as AMP pages), which are designed to load even faster while offering the same content and functionality, rank more prominently.
Do search engines also look at how well a website is compatible with different browsers?
Search engines look at a great many factors for each website, and compatibility is one of them. But keep in mind that each search engine first checks compatibility with its own rendering engine, which for Google is Chrome (Googlebot renders pages with it).
Therefore, a website should be updated from time to time so that it does not fall behind the technologies used by modern web browsers. Unmaintained pages may, in addition to outdated code, carry security risks that make them more attractive to hackers.
How can this compatibility be best verified by myself?
You should definitely verify compatibility together with the web developer during development, and once the website has been handed over, it could be checked at least annually. A number of free and paid web-based tools are available for this purpose.
How fast should a website open, and how do I monitor it to be fast enough for search engines?
Ideally, a website should load fully within three seconds. The longer it takes to load, the greater the chance of losing a customer.
Currently, many websites have room for improvement in this regard, as most do not even get close to three seconds.
Various tools can measure your website's load time. Some of the more popular are WebPageTest.org (also recommended by Google), Google's own PageSpeed Insights (and web.dev), and GTmetrix.
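As a rough do-it-yourself check alongside those tools, load time can be measured with a short script. This sketch uses only the Python standard library and spins up a throwaway local server so it runs anywhere; point `url` at your own site to measure it for real:

```python
import http.server
import threading
import time
import urllib.request

# Throwaway local server standing in for a real website.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"  # replace with your site

start = time.perf_counter()
with urllib.request.urlopen(url) as resp:
    ttfb = time.perf_counter() - start   # time to first byte (headers read)
    resp.read()                          # download the full body
    total = time.perf_counter() - start  # full load time

print(f"TTFB: {ttfb * 1000:.1f} ms, total: {total * 1000:.1f} ms")
server.shutdown()
```

Note this only measures the HTML document itself; browser-level tools like the ones above also account for images, scripts, and rendering.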
What to do to make your website fast enough for search engines?
The main thing to keep in mind is not to use unnecessary resources. This means images should be no larger than necessary, .css and .js files should be minified, static images could be used instead of animations, and so on. A website can certainly look very beautiful when animated and stylized, but then you also have to invest effort in optimizing those resources rather than leaving them to chance.
The same tools mentioned above are also helpful in finding speed-related shortcomings. Many people use PageSpeed Insights scores as benchmarks when developing websites (but 100/100 is not always necessary).
How do I find broken links and what can I do with them so my search results aren't badly affected?
Finding broken links manually is very time-consuming, and you may not find all of them. That is why I would recommend using either web-based tools like Broken Link Checker, the browser plugin of the same name, or standalone programs like Screaming Frog SEO Spider (the limited free version will suffice for smaller sites). Once found, broken links should either be repaired to point at working pages or removed. It often happens that a page's URL is changed but the links in the content are not updated (or redirected), resulting in broken links that no longer point to the page you need and leak authority away from it. Keep checking for broken links regularly: it is very easy to do, and if only a few are found, easy to repair.
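For very small pages, the core of what those tools do can be sketched in a few lines of standard-library Python: extract every link from a page and send a HEAD request to each. The demo server and URLs here are placeholders; real checkers add crawling, retries, rate limiting, and redirect handling:

```python
import threading
import urllib.error
import urllib.request
from html.parser import HTMLParser
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_broken_links(base_url, html):
    """Return (url, status) pairs for links answering 4xx/5xx or failing."""
    extractor = LinkExtractor()
    extractor.feed(html)
    broken = []
    for href in extractor.links:
        target = urljoin(base_url, href)
        try:
            req = urllib.request.Request(target, method="HEAD")
            urllib.request.urlopen(req, timeout=10)
        except urllib.error.HTTPError as err:
            broken.append((target, err.code))  # 4xx / 5xx response
        except urllib.error.URLError:
            broken.append((target, None))      # DNS/connection failure
    return broken

# Throwaway local server so the demo runs without touching the network:
# /ok exists, everything else answers 404.
class DemoHandler(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(200 if self.path == "/ok" else 404)
        self.end_headers()

    def log_message(self, *args):
        pass

server = HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}/"

page = '<a href="/ok">good</a> <a href="/missing">dead</a>'
broken = find_broken_links(base, page)
print(broken)  # only the /missing link is reported, with status 404
server.shutdown()
```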
What are some typical configuration errors for content management solutions such as Wordpress that are immediately reflected in poor SEO results?
In recent years, WordPress and other content management systems have made great strides in out-of-the-box optimization. While in the past even the default URLs were far from user- and search-engine-friendly, nowadays the situation is much better.
But speaking of typical configuration errors, one of the most common is forgetting to uncheck WordPress's "Discourage search engines from indexing this site" setting after receiving a new website from the developer and replacing the old one. The setting tells search engines not to show the site in search, so existing results start to disappear; a brand-new site simply never appears in search at all.
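A quick way to verify you are not affected is to look in the page's HTML for the robots meta tag that this setting emits. A minimal standard-library sketch:

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Report whether a page carries a robots "noindex" directive -
    the meta tag WordPress emits while the "discourage search engines"
    setting is checked."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    detector = NoindexDetector()
    detector.feed(html)
    return detector.noindex

print(has_noindex('<head><meta name="robots" content="noindex, nofollow"></head>'))  # True
```

If this returns True on your live site (view source, or fetch the page and feed it in), the setting is still on.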
Another common mistake is incorrect handling of language parameters on multilingual sites. Specifically, many use the default ?lang=en and similar parameters, which could be replaced in the URL with a language subfolder, making life easier for themselves, for search engines, and for users, since such addresses are easier to use, manage, and remember. Don't make life complicated for search engines; they already have trillions of pages to visit and manage.
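The difference between the two URL styles can be shown with a small standard-library helper that maps a ?lang= parameter onto a subfolder (the domain and parameter name are illustrative; a real site would do this with rewrite rules or its CMS's multilingual settings):

```python
from urllib.parse import parse_qs, urlparse, urlunparse

def lang_param_to_subfolder(url):
    """Rewrite a ?lang=xx parameter URL into the cleaner /xx/ subfolder form."""
    parts = urlparse(url)
    params = parse_qs(parts.query)
    lang_values = params.pop("lang", None)
    if not lang_values:
        return url  # no language parameter, nothing to rewrite
    new_path = f"/{lang_values[0]}{parts.path}"
    new_query = "&".join(f"{k}={v[0]}" for k, v in params.items())
    return urlunparse(parts._replace(path=new_path, query=new_query))

print(lang_param_to_subfolder("https://example.com/about?lang=en"))
# https://example.com/en/about
```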
What are some technical errors in uploading images that can be penalized by search engines?
The biggest problem with images is their size. Search engines do not penalize this directly, since large, high-resolution images can be useful to some audiences (photographers, artists, NASA), but they usually earn no bonus points either. Heavy, unoptimized images slow down page loading, which in turn may keep the site from reaching higher positions in search results.
Modern tools can optimize images so that two images of the same dimensions and apparent quality can differ in file size by a factor of tens. The difference comes from removing background noise and shades that are not really visible to the naked eye. For a visitor, however, it matters greatly whether an image weighs 2 MB or 50 KB, especially on mobile internet, which is not always fast.
Therefore, keep in mind that images should not be made larger than necessary, and optimize them every time you upload them to the website. In most cases this saves a few megabytes, which means valuable seconds when visitors load your site.
If the address of a website or of an important page changes, how can it be redirected painlessly so that the new address also works in search engines?
The correct answer is a 301 (permanent) redirect. Whenever there is a need - namely a need, not just a desire - to change a page's address, remember that it is 99% likely to cause at least a temporary change in search results and visitor numbers. I have been through a number of such migrations: at best the effect was almost imperceptible, and at worst there was a temporary drop of about 30% in visitors (though it should be mentioned that once the results recovered, visitor numbers even increased).
For the best result when changing your website's address, follow the search engines' recommendations (for example, Google's guide). While moving a whole website requires quite a few steps, changing a single page's address may need nothing more than a 301 redirect. For certainty, both versions of the page could be run through URL Inspection in Google Search Console to alert the crawler to the change.
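What the 301 setup looks like depends on the server; as one common illustration (the domains are placeholders), an Nginx block that permanently redirects the old address to the new one:

```nginx
# Permanently (301) redirect every request on the old domain
# to the same path on the new one.
server {
    listen 80;
    server_name old-example.com www.old-example.com;
    return 301 https://new-example.com$request_uri;
}
```

On Apache the equivalent is a `Redirect permanent` or `RewriteRule ... [R=301]` directive; many CMSs and hosts also offer redirect settings in their admin panels.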