We continue with the guide dedicated to Bing, Microsoft's search engine and the main alternative in a market dominated by Google.
It's worth remembering that the results indexed by Bing also power Yahoo. Together they account for around 5% of the market, compared with Google's 83%, according to the latest statistics from StatCounter Global Stats.
Bing Webmaster Guidelines
These guidelines are intended to help you understand how Bing detects, indexes and ranks websites.
Following these guidelines will help you with indexing your site on Bing. It will also help you optimise your site to increase its chance of ranking for relevant queries in Bing's search results. Please pay particular attention to the guidelines in the ‘Abuse’ section and the examples in the ‘Things to avoid’ section. By following the guidelines you will ensure your site complies with the rules and is not considered spam, which could lead to the site being downgraded or even removed from Bing search results.
How Bing searches and indexes your site
Helping Bing find all pages
Sitemap: Sitemaps are an essential way for Bing to locate the URLs and content of your website. A sitemap is a file that provides information about the URLs, other files and content such as images and videos on your website. The sitemap tells the crawler which pages and files you consider important within your site. It also provides additional information, e.g. when a page was last updated. We strongly recommend using an XML sitemap file to enable Bing to locate all relevant URLs and content within your website. Keep your sitemap files as up to date as possible; update them in real time or at least once a day. This enables the timely removal of old URLs and dead links once content is removed from your website.
Make your sitemap available to Bing via:
Submitting it via the Bing Webmaster Tools Sitemaps feature
Adding the following line to your robots.txt file, specifying the path to the sitemap:
Sitemap: http://example.com/sitemap_location.xml
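As an illustration, a minimal XML sitemap in the standard sitemaps.org format might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/page-one</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>http://example.com/page-two</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```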
Once Bing knows your sitemap, Bing will scan it regularly. It is not necessary to submit it again, except in case of significant changes on the site.
General Sitemap Guidelines:
Bing supports several sitemap formats, including XML, RSS, MRSS, Atom 1.0 and a text file.
Use consistent URLs. Bing will crawl and index URLs exactly as they are listed.
Please only list canonical URLs in your sitemaps.
If your website has multiple versions (HTTP vs. HTTPS, or mobile vs. desktop), we recommend that you only point to a single version in your sitemap. If you decide to have a unique mobile vs. desktop URL, please annotate it with the rel=‘alternate’ attribute.
If you have multiple pages for different languages or regions, please use hreflang annotations, either in your sitemaps or in HTML link tags, to identify the alternate URLs.
Use the lastmod attribute to indicate the date and time when the content was last modified.
The maximum size of the sitemap is 50,000 URLs/50 MB uncompressed. If your site is large, consider splitting the large sitemap into smaller sitemaps and then use a Sitemap Index file to list all the individual sitemaps.
Refer to your sitemap in the robots.txt file.
If you have not changed your sitemap since Bing crawled it, there is no need or benefit to resubmit it.
Using a sitemap does not guarantee that all elements within a sitemap will be crawled and indexed; however, in most cases you will benefit from having a sitemap as it provides a recommendation and guidance to the crawler.
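For sites that exceed the 50,000-URL limit, a sitemap index file simply lists the individual sitemaps. A sketch with placeholder file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap-products.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-articles.xml</loc>
  </sitemap>
</sitemapindex>
```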
Use the IndexNow API or the Bing URL or content submission API to instantly reflect website changes. If you can't adopt the API, we recommend submitting updated URLs directly via Bing Webmaster Tools or including them in your Sitemap.
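As a minimal sketch of an IndexNow batch submission, assuming the public api.indexnow.org endpoint: the host, key and URLs below are placeholders, and the key must match the key file you host on your own site.

```python
import json
from urllib import request

def build_indexnow_payload(host, key, urls):
    """Build the JSON body for an IndexNow batch submission.

    The key must match the contents of the key file hosted at
    https://<host>/<key>.txt so the search engine can verify ownership.
    """
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": list(urls),
    }

def submit(payload, endpoint="https://api.indexnow.org/indexnow"):
    # POST the payload as JSON; a 200 or 202 response means it was accepted.
    req = request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    with request.urlopen(req) as resp:
        return resp.status

# Placeholder host, key and URLs; submit(payload) would send the request.
payload = build_indexnow_payload(
    "www.example.com",
    "0123456789abcdef",
    ["https://www.example.com/new-page", "https://www.example.com/updated-page"],
)
```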
Links: Links are traditionally considered a signal of website popularity. The best way to convince other sites to link to your site is to create high-quality, unique content. Bing's crawler (Bingbot) follows links within your website (internal links) and from other sites (external links) to help Bing discover new content and new pages.
Bing recommends that all pages on a site link to at least one other discoverable and crawlable page.
Crawlable links are anchor tags with an href attribute. The referring link should include text, or an image alt attribute, that is relevant to the target page.
Limit the number of links on a page to a reasonable amount, no more than a few thousand links per page.
Make a reasonable effort to ensure that any paid or advertising links on your site use rel=‘nofollow’ or rel=‘sponsored’ or rel=‘ugc’ to prevent links from being followed by a crawler and potentially affecting search rankings.
Bing rewards links that have grown organically; links added over time by content creators on other reputable and relevant websites that drive real users from their site to yours. Plan link building both internally and externally, organically.
Abusive tactics that aim to inflate the number and nature of incoming links, e.g. link buying participation in link schemes (link farms, link spamming and excessive link manipulation) can lead to your site being penalised and removed from the Bing index.
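For example, paid placements, user-submitted links and generic untrusted links can be annotated like this (URLs are placeholders):

```html
<!-- Paid or advertising link: not followed for ranking purposes -->
<a href="https://advertiser.example.com/offer" rel="sponsored">Special offer</a>

<!-- User-generated content, e.g. in comments or forums -->
<a href="https://example.org/user-site" rel="ugc">My homepage</a>

<!-- Generic "do not follow" hint -->
<a href="https://example.net/untrusted" rel="nofollow">External resource</a>
```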
Limit the number of web pages: Keep the number of pages on your website to a reasonable number and avoid duplicate content within your site. To help Bing deduplicate content:
Avoid publishing the same content under different URLs; use the canonical tag (rel="canonical") to indicate the preferred version.
Configure your website and its URL parameters to reduce multiple variations of the same URL pointing to the same content, improving crawling and indexing efficiency.
Avoid mobile-specific URLs where possible. Try to use the same URL for desktop and mobile users.
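As an illustration, the canonical tag goes in the head of each duplicate variant and points at the preferred URL (placeholder domain):

```html
<!-- On http://example.com/shoes?sort=price and similar parameterised variants -->
<head>
  <link rel="canonical" href="http://example.com/shoes">
</head>
```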
Use redirects as appropriate: If you move your website content to another location, use a permanent HTTP 301 redirect and keep it in place for at least three months. If the move is temporary, i.e. less than a day, use a temporary 302 redirect. Avoid using a rel="canonical" tag in place of a proper redirect when site content has been moved from one location to another.
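As an illustration, assuming an nginx server (paths and URLs are placeholders), the two kinds of redirect might be configured as follows:

```nginx
# Permanent move: the old URL should be replaced in the index
location /old-page {
    return 301 http://example.com/new-page;
}

# Temporary move (e.g. less than a day): the original URL stays indexed
location /maintenance-page {
    return 302 http://example.com/temporary-page;
}
```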
Allow Bing to crawl for better indexing: The Crawl Control feature in Bing Webmaster Tools allows you to manage how Bingbot crawls your content, including when and at what rate. We encourage webmasters to allow Bingbot to crawl sites quickly and thoroughly to ensure as much content as possible is discovered and indexed.
JavaScript: Bing can process JavaScript; however, there are limitations to processing JavaScript at scale while minimising the number of HTTP requests. Bing recommends dynamic rendering, which switches between client-side rendered and pre-rendered content for specific user agents such as Bingbot, especially for large websites.
Remove content by returning a 404 "Not Found" HTTP code. Expedite content removal by using the Bing Content Removal and Page Removal tools. Content removal requests last for a maximum of 90 days, after which you need to renew them or the content may reappear in search results.
robots.txt: A robots.txt file informs search engine crawlers, such as Bingbot, which pages and files the crawler can or cannot access on your site. robots.txt is primarily used to manage crawler traffic; for example, you can tell Bingbot not to crawl less useful content such as search result pages or login pages.
Place the robots.txt in your root directory (the topmost directory) of your website. Do not place it in a subdirectory.
Blocking Bing from crawling a page will likely remove the page from the index. However, using Disallow does not guarantee that a page will not appear within the index or search results. If you would like to block a specific page from getting crawled or indexed, you should use the noindex robots meta tag instead of disallowing it in the robots.txt.
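To illustrate the difference with placeholder paths: a robots.txt Disallow rule blocks crawling but not necessarily indexing, while a noindex meta tag (which requires the page to remain crawlable) blocks indexing.

```
# robots.txt (at the site root): prevents crawling, but the URL
# may still appear in the index if other sites link to it
User-agent: bingbot
Disallow: /internal-search/
```

```html
<!-- In the page's <head>: crawlable, but excluded from the index -->
<meta name="robots" content="noindex">
```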
Review your robots.txt often to ensure it is up to date. Check the URLs disallowed by robots.txt in Bing Webmaster Tools to make sure the file remains accurate.
Learn more by reading how to create a robots.txt text file.
Save resources: Use HTTP compression and conditional GET requests to reduce the bandwidth used by crawlers and your clients while improving page load speed.
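A conditional GET exchange looks roughly like this: the crawler sends the validator from a previous response, and the server answers 304 with no body when nothing has changed (illustrative headers and dates):

```
GET /page.html HTTP/1.1
Host: example.com
If-Modified-Since: Mon, 15 Jan 2024 10:00:00 GMT
Accept-Encoding: gzip

HTTP/1.1 304 Not Modified
```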
Help Bing understand pages
Bing looks for rich, valuable and engaging content created for users, not search engines. By creating clear, unique, high-quality, relevant and easy-to-find content on your website, you will increase the likelihood that Bing will index and display your content in search results.
Content: Websites that are poor in content, display mainly ads or affiliate links, or quickly redirect visitors to other sites tend to drop in Bing's rankings. In some cases, such sites may not be indexed at all. Your content should be easy to navigate, rich and engaging for the website visitor, and provide the information they are looking for.
Create content for search users, not for search engines: Develop content based on keyword research that shows what information search users are looking for.
Create enough content to fully satisfy visitors' expectations. There are no hard and fast rules on the number of words per page, but providing more relevant content is usually better.
Make it unique. Do not reuse content from other sources. Content on your page must be unique in its final form. If you choose to host content from a third party, use the canonical tag (rel="canonical") to identify the original source, or use the alternate tag (rel="alternate").
Images and videos: Use unique, original images and videos relevant to the topic of the page. Bing can extract information from images, captions, structured data, titles, and related text such as transcripts.
Do not embed text or important information within images or videos; optical character recognition is less reliable than HTML text and is not accessible. Alt text improves accessibility for people and devices that cannot see the images on a page. When choosing alt text, focus on creating information-rich content that uses keywords appropriately to contextualise the image in relation to the page content.
Include titles, file names and descriptive text for images and videos.
Videos should be in a supported format and not blocked behind a paywall or access restrictions.
Subtitles and captions can be used for videos to make your content available to a wider audience and provide search engines with a textual representation of the content in video and audio files.
Choose high-quality photos and videos; they are more attractive to users than blurred or out-of-focus images.
Optimise images and videos to improve page load times. Images are often the most significant factor in page size and slower page loading.
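For instance, a descriptive file name combined with information-rich alt text might look like this (placeholder values):

```html
<img src="/images/red-trail-running-shoes.jpg"
     alt="Red trail running shoes with reinforced toe cap, side view"
     width="800" height="600">
```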