Internal Website Optimization

Internal Website Optimization: Tools, Technical Factors, Content, Microdata 

Internal website optimization involves a great many details. To get a website fully indexed, push it to the top of search results, and win over users, you need to pay attention to many aspects of its performance, from analytics services and technical settings to the quality of its content and design.

 

It’s simply impossible to cover every detail of website operation and promotion in a single article. However, we’ve put together recommendations on all the aspects mentioned above, so that you get an overview of the problem and can approach it efficiently.

 

Tools

 

To ensure optimal performance of your site and to analyze the results of the work completed, you need various useful tools. They provide analytical data, help you detect and correct errors, add new metrics, and more. Read on to learn about the services every website should be equipped with and the specifics of setting them up.

 

Analytics

 

1. Install the Google Analytics system

 

Google Analytics is a very useful free service that lets you monitor your website traffic, visitor behavior on its pages, advertising efficiency, and technical failures.

 


 

 

To register, simply go to the service’s official website (http://www.google.com/analytics/) and complete the registration procedure.

  

Webmaster Toolbar

  

1. Use capabilities of Google Webmaster Tools.

 


 

 

Adding your website to this service allows you to:

 

  • detect crawling errors;
  • find duplicate pages via duplicate titles and meta tags;
  • assign the website to a specific region;
  • submit and verify the sitemap.xml file;
  • verify the robots.txt file;
  • select the primary mirror for your website (with or without “www”);
  • and much more.

To take advantage of all these benefits, register on the service’s website (https://www.google.com/webmasters), add your website to the list of analyzed resources, and confirm your ownership of it.

 

 

2. Register in the Bing Webmaster Tools system.

 


 

 

To make the data you receive as diverse and complete as possible, it won’t go amiss to also connect Bing Webmaster Tools, an alternative to Google Webmaster Tools. Here your data comes not from the Google search robot, but from its “colleague” at Bing.

 

After registering on the official website (http://www.bing.com/toolbox/webmaster), add your website to the list of resources analyzed by the service.

 

Below we’ll discuss the issues you shall pay attention to while optimizing your website internally.

 

 

Technical Factors

 

It’s simply impossible to reach the top of search results without strengthening your technical base. Below we’ll discuss the main factors you shall pay special attention to.

 

1. Create and properly set up the Robots.txt file.

 

The robots.txt file lets you block access to duplicate pages or, for example, pages that you want to exclude from indexing.

 

This file is created as a simple text document and then placed in the top-level directory of your web server. Like this, for instance: 

 

User-agent: *

Disallow: /admin/
Disallow: /private/
Sitemap: http://www.site.com/sitemap.xml

 

 

It is very important to make sure no necessary pages of your website end up in this file by mistake. Otherwise, their indexing will be forbidden and they will remain invisible to search users.

 

You can verify that the robots.txt file works correctly in the Google Webmaster Tools service.
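As a quick local check before relying on Webmaster Tools, the rules can be parsed with Python’s standard urllib.robotparser module. A minimal sketch, using the example robots.txt above:

```python
from urllib.robotparser import RobotFileParser

# The rules from the example robots.txt above
rules = """User-agent: *
Disallow: /admin/
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The blocked sections are not fetchable; public pages are
print(parser.can_fetch("*", "http://www.site.com/admin/panel"))
print(parser.can_fetch("*", "http://www.site.com/blog/post"))
```

Running a few of your most important URLs through such a check helps catch an accidental Disallow before it costs you indexed pages.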

 

2. Create your website map using the Sitemap.xml file.

 

The Sitemap file informs search engines about the way data is organized on your website. Existence of this file allows search robots to index your pages with higher precision.

 

When creating the Sitemap.xml file it’s important to include all essential URLs of your website (without redirects and errors, and it’s best to use canonical URLs). This will allow search engines to clearly understand your source’s structure and index it better.

 

When a website is frequently updated, it’s important to keep the Sitemap file up to date as well. Many CMS systems do this automatically. However, if you need to create the file manually, you can use specialized software or a cloud service such as www.xml-sitemaps.com/.
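A basic sitemap can also be generated with a short script. Here is a sketch using Python’s standard XML library (the URL list is illustrative):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of canonical URLs."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(entry, "{%s}loc" % SITEMAP_NS).text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "http://www.site.com/",
    "http://www.site.com/blog/",
])
```

The output can then be written to sitemap.xml in the site root; real sitemaps often also carry lastmod and changefreq elements, which this sketch omits.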

 

After your sitemap is created, upload it to the root folder of your website. Then be sure to add the file in Webmaster Tools and check for possible indexing errors.

 

What other technical factors shall one pay attention to?

 

 

Correct response codes of pages

 

When a server works correctly, it responds to users’ requests with “2xx” codes. Users never see these codes: a “2xx” response simply means the server performed the action the visitor requested.

 

Search engines only index pages that consistently respond with codes of this class. Therefore it is important to make sure all pages of your site answer with “2xx” codes.
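The status-code classes discussed here can be captured in a tiny helper, useful when scripting a crawl of your own pages (a sketch; names are illustrative):

```python
def classify_status(code: int) -> str:
    """Map an HTTP status code to its response class."""
    if 200 <= code < 300:
        return "success"       # the server fulfilled the request
    if 300 <= code < 400:
        return "redirect"      # e.g. 301/302
    if 400 <= code < 500:
        return "client error"  # e.g. 404
    if 500 <= code < 600:
        return "server error"  # e.g. 500
    return "other"

def is_indexable(code: int) -> bool:
    # Search engines index pages that consistently return a 2xx code
    return classify_status(code) == "success"
```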

 

 

Proper use of 301/302 redirects

 

301 and 302 redirects are used to redirect users to proper pages.

 

301 redirect

The 301 code (“Moved Permanently”) means that the requested page has been permanently moved to a new location (for example, as a result of a redesign or a global restructuring of the site). The server will automatically redirect users to the new address.

 

When a site moves to a new domain name, or when changing the CMS platform, the 301 redirect is the best way to preserve the site’s ranking in search engines. When the 301 redirect is set up, the old and new website addresses are merged, and 85 to 90% of link weight is transferred from the old page to the new one.

 

When using redirects, it is important to follow these rules:

 

  • Don’t link to a page that you know redirects elsewhere;
  • Reach the target page through a single redirect; avoid redirect chains and loops.

 

If a redirect is missing or misconfigured, the user will see a 404 error page.

 

There are several ways to set up a 301 redirect:

 

  • Using the .htaccess file on an Apache server.
  • Using the nginx server configuration.
  • Using a server-side script: PHP, ASP, ASP.NET, ColdFusion, JSP, CGI Perl, or Ruby on Rails.
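The same can be done from application code. As a Python illustration, here is a minimal WSGI app that issues 301 responses for moved paths (the path mapping is hypothetical):

```python
# Hypothetical mapping of old paths to their new permanent locations
MOVED = {"/old-page": "http://domain.com/new-page"}

def app(environ, start_response):
    """Tiny WSGI app that 301-redirects moved paths and 404s everything else."""
    path = environ.get("PATH_INFO", "/")
    if path in MOVED:
        start_response("301 Moved Permanently", [("Location", MOVED[path])])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not Found"]
```

For a quick local test the app can be served with the standard wsgiref.simple_server module.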

 

302 redirect

The 302 code (“Found”) means that the requested page’s address has changed temporarily, for instance during page restructuring. Unlike with the 301 code, link weight is not transferred to the new location, and the page itself is not removed from the index.

 

To check that redirects are configured correctly, you can use a tool such as LiveHTTPHeaders or a site crawler.

 

 

Proper display of the 404 page

 

Users see a 404 error when they try to view a non-existent page on the website, for example one that was deleted, or when they enter a wrong URL. Such a page can easily be created with HTML.

 

The 404 page should be designed in line with the overall site design. However, you may also use original style ideas or some humor (if acceptable for the site’s topic) to ease the user’s frustration at seeing a server error. Besides, it is important to help visitors find the page they need, for example by providing useful links or suggesting the search bar.

 


 

 

If this error takes you by surprise, the first thing to do is find the source of the broken link. If it’s located on your website, correct the link or remove it. If the link comes from an external website, set up a 301 redirect in order to avoid losing traffic.

 

 

Handling the 500 error

 

The 500 error (“Internal Server Error”) is displayed when a server failure prevents it from executing the user’s request.

 

Such errors often occur due to syntax errors in the .htaccess file, or when that file contains unsupported directives. To resolve the issue, it is usually sufficient to comment out the offending directive (for example, Options) by adding a “#” symbol at the beginning of the line.

 

 

Setting up the primary mirror

 

Search engines see www.site.com and site.com as two different pages that completely duplicate each other. It is essential to select the primary, so-called canonical, domain for your website.

 

After you decide on the primary domain, it is important to set up a 301 redirect to it. One way to do so is a directive in the .htaccess file. Here’s an example of a redirect from http://www.domain.com to http://domain.com:

 

Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]
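The same www-to-bare-domain normalization can also be expressed in code, e.g. when auditing a list of URLs for mirror consistency. A Python sketch (“domain.com” is a placeholder):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, primary: str = "domain.com") -> str:
    """Rewrite a www.domain.com URL to its bare-domain primary mirror."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host == "www." + primary:
        host = primary  # collapse the secondary mirror onto the primary one
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_url("http://www.domain.com/page?a=1"))
```

Running internal links through such a function helps confirm they all point at the chosen primary mirror rather than relying on the redirect.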

 

 

Page loading speed

 

Loading speed is one of the key factors in how search engines rank a site, because modern users expect to get the information they need as fast as possible.

 

The ideal page load time is considered to be 2 seconds on desktop and 1 second on mobile devices. If your pages load slower, it will be hard to earn a high spot in search results.

 


 

 

To increase your website’s loading speed:

  • Monitor your site’s loading speed, for example with PageSpeed Insights (https://developers.google.com/speed/pagespeed/insights/).
  • Monitor hosting performance and make adjustments or change the provider if necessary.
  • Identify the scripts or files that overload your site.

 

 

Size of the page code

 

Google recommends that the page code shall not exceed 100 KB. It doesn’t affect relevance that much, but significantly affects the page loading speed and the number of aborted requests. Many professionals also recommend decreasing the share of code on a page in favor of content, which is especially important for large websites.

 

 

Using external JS and CSS files

 

To optimize your site’s performance and increase its page loading speed, move JavaScript and CSS code into external files.

 

Also make sure these files are not blocked from crawling by search bots in robots.txt or elsewhere.

 

 

Existence of the mobile version of the website

 

Google is increasingly strict about websites having mobile versions, because the share of mobile devices grows every year.

 

Check how your website is displayed on mobile devices, and optimize it if necessary following recommendations by Google. You can do so using the Google Webmaster Tools service in the Mobile Friendly Test section.

 


 

 

 

Use of the attributes rel="alternate" hreflang="x"

 

If you’re creating an international site with multiple language versions and content aimed at users from different regions, pay attention to the attributes rel="alternate" and hreflang="x".

 

It’s best to use these attributes, when:

 

  • Website content is fully translated.
  • Content is offered in different languages to different regions. For instance, English language content may be aimed at users from the USA, Canada and Australia.
  • Only the web-page template is translated (navigation menu, etc.), while core content remains in the original language. This often happens when a website has user-generated content.

 

How to do it in practice?

 

Let’s say you have a website in English and Russian. Using these attributes you can indicate that the available versions (say, “site.com” and “site.com/ru/”) are different language versions of the same resource. This makes it easier for each user to find the version they need. There are several ways to do so:

 

 

1. Use the element rel="alternate" on the home page

 

Add the element rel="alternate" to the <head> section of the “site.com” homepage. It tells search engines that the Russian version of the website is available at “site.com/ru/”.

<link rel="alternate" hreflang="ru" href="http://site.com/ru/" />

 

 

2. Use HTTP header

 

If you publish files in a format other than HTML, such as PDF or DOC, use the HTTP header to specify the URL of the version in a different language.

Link: <http://site.com/ru/>; rel="alternate"; hreflang="ru"

 

You can also list several alternate versions in one Link header, separated by commas:

Link: <http://site.com/ru/>; rel="alternate"; hreflang="ru",
<http://site.com/en/>; rel="alternate"; hreflang="en"

 

 

3. Use the Sitemap file

 

If you have several language versions of the website, make sure the code of each version specifies attributes for all versions. For example, if your website offers content to users speaking Russian, English, and German, each page must include its own rel="alternate" hreflang="x" attribute as well as links to the other versions of the content. This rule applies to pages in every language.

 

Let’s say you have three language versions for your site contents:

 

  • Primary version in English language (www.site.com/english/)
  • French language version (www.site.com/french/)
  • German language version (www.site.com/deutsch/)

 

The Sitemap file informs search engines that the page www.site.com/english/ has versions in German (http://www.site.com/deutsch/) and French (http://www.site.com/french/).

 

Write the primary and alternative versions of the site for each language in the code of the Sitemap file:

 

<url>
  <loc>http://www.site.com/english/</loc>
  <xhtml:link rel="alternate" hreflang="de"
              href="http://www.site.com/deutsch/" />
  <xhtml:link rel="alternate" hreflang="de-ch"
              href="http://www.site.com/schweiz-deutsch/" />
  <xhtml:link rel="alternate" hreflang="en"
              href="http://www.site.com/english/" />
</url>

 

 

Number of outgoing links on a page

 

Try not to have too many outgoing links on a page. According to Google recommendations, there should be no more than 100 of them.

 


  

 

Attributes nofollow and noindex for external links

 

Using the nofollow attribute forbids search engines from transferring the weight of specific links, or of all links on a page. This lets you create a link without passing link weight. If you are not going to sell links, we recommend closing all links to external resources with the “nofollow, noindex” attributes.

 

Normally, the “nofollow, noindex” attribute is used in the following cases:

 

  • Paid links are used.
  • Untrusted content is linked.
  • Crawl priority needs to be managed.

 

You can implement this using the following code:

<a href="http://www.site.com" rel="nofollow, noindex">link text</a>

 

 

Properly constructed URL address

 

Proper construction of a website’s URLs is an important aspect of search engine optimization. There are a number of aspects to creating good URLs that every webmaster should pay attention to.

 

 

1. Use keywords in the beginning of the URL. 

 

Google keeps reminding webmasters that the first 3 to 5 words of a URL carry the most weight, so it’s recommended to place keywords at the beginning of the URL.

 


 

 

 

2. Length of the URL shall not exceed 100 characters.

 

This also affects the level of indexing, as it’s easier for users to remember and enter shorter addresses.
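Both rules above (keywords first, under 100 characters) can be enforced with a small slug helper. A sketch (the title is illustrative):

```python
import re

def make_slug(title: str, max_len: int = 100) -> str:
    """Turn a page title into a short, keyword-first URL slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return slug[:max_len].rstrip("-")  # keep the slug under the length limit

print(make_slug("Italian Cuisine Restaurant in Toronto!"))
```

Keeping the key phrase at the front of the title means it also lands at the front of the generated slug.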

  


Prevent duplicate pages 

 

Duplicate pages are a common problem that can undermine promotion results, so it is important to monitor your site for them. If you detect such pages, eliminate them with a 301 redirect from the duplicate page to the canonical (primary) one.

 

There are special services for finding duplicates, such as Screaming Frog. You may also use the “HTML Optimization” section of Google Webmaster Tools to detect duplicate pages.
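Alongside such tools, a basic duplicate check can be scripted by grouping URLs under a normalized key. A sketch; the normalization rules (scheme, “www.” prefix, trailing slash, query string) are illustrative:

```python
from urllib.parse import urlsplit

def find_duplicates(urls):
    """Group URLs that differ only by scheme, 'www.' prefix, trailing slash, or query."""
    groups = {}
    for url in urls:
        parts = urlsplit(url)
        host = parts.netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        key = (host, parts.path.rstrip("/") or "/")
        groups.setdefault(key, []).append(url)
    # Keep only the keys that collected more than one URL
    return {key: dupes for key, dupes in groups.items() if len(dupes) > 1}

dupes = find_duplicates([
    "http://site.com/blog/",
    "http://www.site.com/blog",
    "http://site.com/about",
])
```

Each surviving group is a candidate for a 301 redirect to the canonical URL.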

  

 

Use the Canonical tag.

 

The Canonical tag is part of the <head> block in the page’s HTML code. It looks as follows: <link rel="canonical" href="https://site.com/blog" />

 

This code tells search engines to treat the page containing it as a copy of "https://site.com/blog" and to assign link weight and other metrics to the primary page specified in the tag.

 

This tag allows you to:

 

  • Establish the primary domain;
  • Specify the primary URL;
  • Specify primary canonical URLs in the Sitemap file;

 

The Canonical attribute is quite similar to the 301 redirect in its nature, however there are several differences between the two:

 

  • The Canonical tag does not redirect a user to a page with a different URL.
  • The Canonical tag is not applicable if you use several domains. The 301 redirect allows you to redirect a user from domain.com to domain2.com.

 

 

Pagination errors

 

Pagination of web pages is sequential numbering of website pages, which is displayed in the bottom, top or side part of a page.

 

Pagination setup affects two significant aspects of search engine optimization:

  • The number of duplicate pages;
  • The depth of page crawling by search robots.

 

Pagination errors result in at least two major problems:

 

1. Besides partial duplication of the content, pagination pages may repeat meta tags, which significantly decreases website’s internal optimization level and relevancy of primary landing pages to users’ search queries. Therefore, such sites are ranked lower in the search results.

 

2. Depth and number of pages scanned by search engines is limited and depends on the source’s trustworthiness and proper interlinking. If a website has many pagination pages, the chance the robot indexes the proper ones is significantly lower, because a number of transfers between pages will be spent on partial duplicates.

 

Therefore it is obvious that proper setup of website’s pagination is essential. In particular, you can avoid problems as follows:

 

 

1. Pagination using the tags rel="prev"/"next".

 

This is the most complicated method, but also the most universal one as far as Google is concerned.

 

Implementation is tricky, so it’s worth having a clear understanding of its mechanics.

 

Say we have 4 pagination pages on our website. Use rel="prev"/"next" to create a chain of the existing pagination pages, starting from the first page. Add the following to its <head> block:

<link rel="next" href="http://site.com/page2.html">

 

The next page in the chain is the second pagination page. Add the following code to its <head> block:

 

<link rel="prev" href="http://site.com/page1.html">

<link rel="next" href="http://site.com/page3.html">

 

Amend the third page similarly.

The fourth page is the last one in our chain, so its <head> block only needs:

<link rel="prev" href="http://site.com/page3.html">

 

Properly placed tags tell Google that pages page1 through page4 should be merged into a single element in the index, with only the first page treated as the relevant one.
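The chain above can also be generated programmatically. A Python sketch that emits the rel tags for any page in a series (the URL pattern follows the example above):

```python
def pagination_links(base: str, page: int, total: int) -> list:
    """Return the rel="prev"/"next" <link> tags for page `page` of `total`."""
    tags = []
    if page > 1:  # every page except the first points back
        tags.append('<link rel="prev" href="%spage%d.html">' % (base, page - 1))
    if page < total:  # every page except the last points forward
        tags.append('<link rel="next" href="%spage%d.html">' % (base, page + 1))
    return tags

for tag in pagination_links("http://site.com/", 2, 4):
    print(tag)
```

Emitting the tags from the template guarantees the first and last pages of the chain get only one tag each, as required.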

 

The major drawback of this method is that only Google honors these tags; Yandex will still treat your pagination pages as duplicates.

 

 

2. Set up pagination using the tag rel="canonical".

 

Google recommends this method for pagination setup. The rel="canonical" tag points the search engine to the canonical page and helps avoid indexing irrelevant pages. With this tag, the search engine disregards non-canonical pages and the duplicate content they contain, so this method works best against partial duplicates.

 

The tag looks as follows:

<link rel="canonical" href="http://site.com/canonical-page" />

 

The drawback of this method is that it’s impossible to implement such setup in most turnkey CMS systems.

 

 

3. Use the AJAX pagination.

 

AJAX pagination is optimal from an SEO standpoint. When moving to the second and subsequent pages of a catalogue, the page is not reloaded; instead, the list of displayed products or articles is updated via AJAX. Pages with URLs like http://site.com/page2 simply don’t exist, so they cannot be indexed by search engines and will never appear in their index.

 

The drawback of this method is that such pagination can only be implemented by experienced programmers.

 

 

4. Close pagination pages from being indexed by search engines.

 

To do so, you can either block pagination pages in the robots.txt file with a mask directive such as Disallow: *page, or close them with the meta tag <meta name="robots" content="noindex, follow" /> placed in the <head> section.

 

However, this method also has a drawback: without an XML sitemap, it will take a long time for products or pages listed on pagination pages to get indexed.

 

  

On-page Optimization of Website Pages

 

Content

 

The content of a page determines its relevance to the user’s query, so it’s of the utmost importance to properly optimize all important pages.

 

Having quality content is the most important thing in achieving good results in search engine optimization.

 

To do so, a webmaster shall follow the rules below.

 

 

1. Publish unique content.

 

Ideally, you should only publish original professional articles on a website. If that’s impossible, high-quality rewriting and copywriting can also work. Either way, avoid automatic text generation, even for websites with a large number of pages. Resources with carefully selected content are indexed much better, and are therefore easier to promote.

 

 

2. Don’t write articles that are too short.

 

Preferably, publications should have at least 500 words. The more competitive the topic, the more content the page should have.

 

 

3. Keywords shall be placed in the beginning of articles.

 

Using keywords within the first 50 to 100 words is most efficient for page indexing, because it helps search robots easily determine the topic of the page.

 

 

 

4. Know when to stop when using keywords.

 

There are no hard numbers, but usually a page should contain no more than 10 keywords. Otherwise search engines will consider your content “overspammed”, with significant negative consequences for its ranking. It is also important to balance high-, medium- and low-frequency queries in your articles.

 

 

Page tags that are important for optimization

 

Besides the optimized content, it is critically important to properly use primary page tags. They allow search robots to better determine the page relevancy.

  

 

1. The Title tag shall be unique and under 65 characters long.

 

The <title> tag is one of the most important places for keywords. It must reflect the topic of the entire page. It is important to have a unique title for each page and to include several keywords in it.

 

Besides, the <title> affects the click-through ratio of your website in search results, because it is displayed as a header in search results:

 


 

 

Hence we recommend following the advice below when creating titles:

 

  • Include keywords in them.
  • Place keywords at the beginning of the title.
  • Watch the title’s length – it should not exceed 65 characters.
  • Give preference to long key phrases in titles.
  • Experiment with titles while monitoring their click-through rates in analytics.
  • Use separators to divide the title from the descriptive part.

Like this, for example: Product | Simple description | Site name. This doesn’t affect relevance much, but significantly improves the readability of the title.
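The separator pattern and the 65-character limit can be combined in a small helper. A sketch (names and inputs are illustrative):

```python
def build_title(product: str, description: str, site: str, max_len: int = 65) -> str:
    """Compose "Product | Description | Site" and trim it to the length limit."""
    title = " | ".join([product, description, site])
    if len(title) > max_len:
        # Cut at a word boundary rather than mid-word, then tidy the edge
        title = title[:max_len].rsplit(" ", 1)[0].rstrip(" |")
    return title

print(build_title("Red Leather Shoes", "Handmade in Italy", "ShoeShop"))
```

Generating titles from the template keeps them unique per page while enforcing the length rule automatically.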

 

      2. The H1 tag shall be unique and contain keywords.

 

The first-level header, the <H1> tag, is the second most important element of a page. Essentially it is the page header, and there must be only one per page.

 

The <H1> tag shall be visually larger than other headers and, preferably, precede the remaining content of the page.

 

This is what we recommend:

 

  • Don’t use more than one H1 tag per page.
  • Use the header not for decoration purposes, but rather to underline the important text.
  • Use keywords in the header.
  • Maintain the header hierarchy.

 

       3. Using sub-headers H2-H6.

 

Don’t forget to use tags <H2 – H6> for sub-headers on a page. This will not only create a structure that is convenient to the user, but also show search robots that these keywords are really important.

 

We recommend:

 

  • Use these tags to distinguish sub-headers.
  • Maintain the hierarchy.
  • Use keywords.

 

4. Meta descriptions shall be unique and not exceed 140 characters.

 

The meta description is not a mandatory tag for page optimization, however it’s still better to use it.

 

Search engines often use this tag as a snippet in search results, therefore good meta description will gain more user attention.

 

<head>
<meta name="description" content="Here is a description of the applicable page">
</head>

 

Key phrases contained in the meta description are shown in bold, which attracts more visual attention to it and increases the click-through ratio.

 


 

 

We recommend:

 

  • Be brief and concise.

 

Don’t use very long meta descriptions. They shall not exceed 135 characters for Google, 200 characters for Bing, or 165 characters for Yahoo!.
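Those per-engine limits are easy to check in bulk. A sketch using the figures quoted above:

```python
# Per-engine snippet length limits quoted in the text above
LIMITS = {"google": 135, "bing": 200, "yahoo": 165}

def description_fits(text: str) -> dict:
    """Check a meta description against each engine's length limit."""
    return {engine: len(text) <= limit for engine, limit in LIMITS.items()}

# A 150-character description passes Bing and Yahoo! but not Google
result = description_fits("x" * 150)
```

Running every page's description through such a check makes it easy to spot the ones that will be truncated in snippets.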

 

  • Write attractive descriptions.

 

Write descriptions as if you were writing an ad. Descriptions shall be noticeable, convincing and as informative as possible.

 

  • Use keywords.
  • Honestly describe your content.

 

Users will be very disappointed if they don’t find information advertised in the snippet on your page. Therefore they will probably not return to your web source again. Moreover, being enraged by your dishonesty, they can even spoil your reputation by leaving negative reviews about your site or company on relevant message boards.

 

5. Use of the Keywords meta tag.

 

After Google announced that the keywords meta tag does not affect site ranking, many webmasters stopped using it completely. Based on general optimization practice and our own experience, we still recommend filling it out.

 

As an example, you can place the following code alongside the meta description:

 

<head>

<meta name="keywords" content="JavaScript, HTML, CSS, XML">

</head>

 

So, this is what we recommend:

 

  • Try to use unique, targeted keywords. It’s best to list longer key phrases first, followed by shorter ones, separated by commas.
  • Make sure the keywords correspond to the page’s content.
  • Include common misspellings (people often mistype search queries).
  • Don’t limit yourself to high-frequency keywords – also use low- and medium-frequency queries.

 

 

6. Include keywords in the Alt tag for images.

 

Many projects gain lots of traffic from image search. Since search engines can’t yet reliably determine image content, it is important to describe images with the alt attribute.

 

To help search engines better understand the images published on your website, use keywords in alt attributes.

 

These are our recommendations:

 

  • Use brief and informative image file names and Alt text. For example, “green-car.jpg” looks and sounds much better than “1235454.jpg”.

 <img src="green-car.jpg" alt="green car">

  • Use Alt text for graphic links: <a href="http://car.com/"><img src="green-car.jpg" alt="green car"></a>
  • Create a Sitemap for your graphic files.
  • Only insert images under 100 KB.

 

To ensure that your page loads fast, it’s essential to use the lightest possible images under 100 KB.

 

If images on your website are larger than 100 KB, they must be optimized. It’s best to use a graphics editor to shrink them as much as possible without losing quality.
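The resizing itself needs a graphics editor or imaging library, but target dimensions can be estimated first. A rough heuristic sketch, assuming file size scales roughly with pixel count (not a guarantee for every format):

```python
def scale_for_target(width: int, height: int, current_kb: float,
                     target_kb: float = 100) -> tuple:
    """Estimate new dimensions that bring an image near the target file size."""
    if current_kb <= target_kb:
        return width, height  # already small enough
    # File size ~ width * height, so scale each side by the square root
    factor = (target_kb / current_kb) ** 0.5
    return int(width * factor), int(height * factor)

print(scale_for_target(1000, 800, 400))
```

The estimate only gives a starting point; actual output size still depends on format and compression settings.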

 

 

Microdata and Rich Snippet

 

If you want your site to be clicked on more often from search results, it’s important to have attractive and clear snippets.

 


 

 

Using microdata will help you make your snippets more informative and noticeable on the search results page, which increases their click-through rate. There are plenty of microdata options available.

 

 

1. Quick additional links - Sitelinks.

 

We recommend using quick links located under the snippet title in search results. This helps users to go directly to the section they’re interested in.

 


 

 

 

2. Contacts – for local SEO (NAP).

 

Put contact information in snippets. This improves user trust and makes it easier for them to understand how and where to contact you.

 

 

3. Reviews and Rating.

 

Reviews and ratings are used to evaluate the content, product, or service on a site, typically on a 5-point scale. Based on reviews from other visitors, new potential customers can judge how valuable and interesting the information on your website is, as well as your company itself and its products.

 

 

4. Price.

 

It’s worth placing the price of your product or service in microdata so that users see price levels for their query right on the search results page.

 

 

5. Use geo-tagging SEO.

 

Geo-tags help to assign your website content to a particular geographic location. They’re usually used for locally operated businesses.

 

Geo-tags tie a physical address to local search results. Say your site is for a restaurant: if a user in Toronto enters “Italian cuisine restaurant” into the search bar, he or she will first see restaurants located in Toronto.

 


 

 

To make sure your company appears in local search results, add the necessary geographical references to your code. Generally, these are the latitude and longitude coordinates that search engines and users use to locate your company.

 

An example of the code with the geo-tag:

 

  • Geo microformat (coordinates are shown on the website).

 

<span class="geo">
<span class="latitude">40.693889</span>;
<span class="longitude">-74.043611</span>
</span>

 

  • RDF (defined by the W3C).

 

<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#">
<geo:Point>
<geo:lat>40.693889</geo:lat>
<geo:long>-74.043611</geo:long>
</geo:Point>
</rdf:RDF>

 

  • Geo-tag metadata format (coordinates are not shown on the website).

 

<meta name="geo.position" content="40.693889;-74.043611">
<meta name="geo.placename" content="New York, USA">
<meta name="geo.region" content="US-NY">

 

 

Conclusion

 

The list of methods and tools in this article is not exhaustive, but if every website owner or webmaster pays attention to at least these issues, optimization and promotion will become much faster and more efficient.

 

It is especially important to fine-tune the technical parameters that let your website operate properly, ensuring proper indexing, which in turn brings you closer to the top of search results.