Thursday 11 February 2016

Open Graph Protocol Meta Tags

Open Graph Meta Tags

Open Graph meta tags are very important for Facebook and other social media accounts. Using these property tags you can give your post its best look: you can change the post image, title, and description to whatever you want. By making your post unique and distinct from competitors, you can improve your website traffic, because visitors are attracted to it and more likely to click. There are different types of Open Graph meta tags, such as the title tag, description tag, and image tag.


These tags are placed inside the head tag of the page; a sample set of Open Graph tags is shown below.
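
Here is a minimal sketch of such tags: the title, description, and image tags mentioned above, plus the og:url and og:type tags that usually accompany them. Every content value is a placeholder to replace with your own:

<head>
<meta property="og:title" content="Your Post Title" />
<meta property="og:description" content="A short description of your post." />
<meta property="og:image" content="http://www.example.com/images/post-image.jpg" />
<meta property="og:url" content="http://www.example.com/your-post.html" />
<meta property="og:type" content="article" />
</head>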

Thursday 4 February 2016

How to Add Google Analytics

An analytics tool gives you knowledge about your website's visitors, and from that you can improve traffic to your site and its pages. Google Analytics tells you about organic visits, paid visits, and visitors in general; you can then segment that information, analyse your website, and take appropriate action to improve its performance. Below is a step-by-step implementation of the Google Analytics tracking code, from which you can also learn your estimated unique monthly traffic.





Log in to your Google Analytics account, go to Admin, and click Tracking Info, then get the tracking script and copy it.
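
The script you copy will look roughly like the standard analytics.js snippet below; "UA-XXXXX-Y" is a placeholder for the tracking ID Google assigns to your property:

<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');  // replace with your own tracking ID
ga('send', 'pageview');              // record a pageview for the current page
</script>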





After copying the code from Google Analytics, go to your website's source and paste it just above the closing </head> tag.
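
For example, in a minimal page the snippet sits here (the title and body content are placeholders):

<!DOCTYPE html>
<html>
<head>
<title>Your Page Title</title>
<!-- paste the Google Analytics snippet copied above, right here -->
</head>
<body>
<!-- page content -->
</body>
</html>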

Wednesday 3 February 2016

How to Create a Robots.txt File?



Where do you put the robots.txt file?


The short answer: put it in the top-level directory of your web server.


When a robot looks for the "/robots.txt" file for a URL, it strips the path component from the URL (everything from the first single slash) and puts "/robots.txt" in its place.

For example, for "http://www.example.com/shop/index.html", it will remove "/shop/index.html", replace it with "/robots.txt", and end up with "http://www.example.com/robots.txt".



What to put in it

The "/robots.txt" file is a text file, with one or more records. It usually contains a single record looking like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /~joe/
In this example, three directories are excluded.


Note: you need a separate "Disallow" line for every URL prefix you want to exclude -- you cannot write "Disallow: /cgi-bin/ /tmp/" on a single line.
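
For example, the first form below will not work; use the second:

# wrong -- two prefixes on one line
User-agent: *
Disallow: /cgi-bin/ /tmp/

# right -- one Disallow line per prefix
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/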


Note also that globbing and regular expressions are not supported in either the User-agent or Disallow lines. The '*' in the User-agent field is a special value meaning "any robot". Specifically, you cannot have lines like "User-agent: *bot*", "Disallow: /tmp/*" or "Disallow: *.gif".

What you want to exclude depends on your server. Everything not explicitly disallowed is considered fair game to retrieve. Here follow some examples:

To exclude all robots from the entire server

User-agent: *
Disallow: /

To allow all robots complete access

User-agent: *
Disallow:
(or just create an empty "/robots.txt" file, or don't use one at all)

To exclude all robots from part of the server

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/
To exclude a single robot

User-agent: BadBot
Disallow: /
To allow a single robot

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
To exclude all files except one

This is currently a bit awkward, as there is no "Allow" field. The easy way is to put all files to be disallowed into a separate directory, say "stuff", and leave the one file in the level above this directory:
User-agent: *
Disallow: /~joe/stuff/
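
For instance, with a layout like the one below (file names are made up for illustration), everything under "stuff" is blocked while the file one level above stays crawlable:

/~joe/index.html        <- still crawlable
/~joe/stuff/foo.html    <- blocked by the rule above
/~joe/stuff/bar.html    <- blocked by the rule above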

Alternatively you can explicitly disallow all disallowed pages:
User-agent: *
Disallow: /~joe/junk.html
Disallow: /~joe/foo.html
Disallow: /~joe/bar.html