I am Sandeep Mishra, an SEO expert in Lucknow.


Thursday 14 August 2014

Five Ways You Might Accidentally Keep Google From Listing Your Website


It might be that you're blocking Google without knowing it. That means that Google won't index all the pages of your website. In this article, you'll learn how to block Google and how to make sure that you do not block Google inadvertently.

1. Errors in the robots.txt file of your website will keep Google away

The disallow directive of the robots.txt file is an easy way to exclude single files or whole directories from indexing. To exclude individual files, add this to your robots.txt file:

User-agent: *
Disallow: /directory/name-of-file.html

To exclude whole directories, use this:

User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/

If your website has a robots.txt file, double-check it to make sure that you do not exclude directories that you want to see in Google's search results.
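
If you want to test your rules programmatically, Python's standard library ships a small robots.txt parser. Here is a minimal sketch; example.com and the page URL are placeholders, so substitute your own domain and the paths from your robots.txt file.

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()  # fetches and parses the live robots.txt

# Check a URL against the rules that apply to all user agents ("*").
url = "https://example.com/first-directory/page.html"
if parser.can_fetch("*", url):
    print(url, "is crawlable")
else:
    print(url, "is blocked by robots.txt")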

2. Use the meta robots noindex tag and Google will go away

The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code in the <head> section of a web page:

<meta name="robots" content="noindex, nofollow">

In this case, search engines won't index the page and they also won't follow the links on the page. If you want search engines to follow the links on the page, use this tag:

<meta name="robots" content="noindex, follow">

The page won't appear on Google's result pages, but the links on it will be followed. If you want to make sure that Google indexes all pages, remove this tag.

The meta robots noindex tag only influences search engine robots. Regular visitors of your website still can see the pages.
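
To audit a site for stray noindex directives, a rough check like the following can help. It assumes the directive appears literally in the HTML source; the URL list is a placeholder, and a real audit would parse the HTML properly rather than searching for a substring.

from urllib.request import urlopen

pages = ["https://example.com/", "https://example.com/about.html"]

for page in pages:
    html = urlopen(page).read().decode("utf-8", errors="replace").lower()
    if "noindex" in html:
        print(page, "may carry a noindex directive - check its <head> section")
    else:
        print(page, "has no noindex directive in its source")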

3. Wrong server status codes will keep search engines away
The server header status code tells browsers and search engine robots how to handle a request, and it can send them to a different place on your website. A web page usually has a "200 OK" status code. For example, you can use these server status codes:

    301 moved permanently: this request and all future requests should be sent to a new URL.
    403 forbidden: the server refuses to respond to the request.

For search engine optimization purposes, a 301 redirect should be used if you want to make sure that visitors of old pages get redirected to the new pages on your website.
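
To confirm which status codes your pages actually return, a short check like this one works; the URL is a placeholder. Note that Python's urlopen follows redirects automatically, so a 301 shows up as a changed final URL rather than as the code itself, while 4xx and 5xx responses are raised as errors.

from urllib.request import urlopen
from urllib.error import HTTPError

for url in ["https://example.com/old-page.html"]:  # placeholder URL
    try:
        response = urlopen(url)
        if response.geturl() != url:
            print(url, "redirects to", response.geturl())
        else:
            print(url, "returned", response.getcode())
    except HTTPError as error:
        print(url, "returned", error.code)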

4. Google won't index password protected pages
If you password protect your pages, only visitors who know the password will be able to view the content.

Search engine robots won't be able to access the pages. Password-protected pages can have a negative influence on the user experience, so you should test this thoroughly.

5. If your pages require cookies or JavaScript, Google might not be able to index your pages

Cookies and JavaScript can also keep search engine robots away from your door. For example, you can hide content by making it only accessible to user agents that accept cookies.

You can also deliver your content only through complex JavaScript. Most search engine robots do not execute complex JavaScript code, so they won't be able to read your pages.
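
A quick way to approximate what a non-JavaScript crawler sees is to fetch the raw HTML and check whether an important phrase is present. The URL and phrase below are placeholders for your own page and content.

from urllib.request import urlopen

html = urlopen("https://example.com/").read().decode("utf-8", errors="replace")

phrase = "our opening hours"  # placeholder for content visitors should see
if phrase in html:
    print("The phrase is in the raw HTML; simple crawlers can read it.")
else:
    print("The phrase is missing; it is probably injected by JavaScript.")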

In general, you want Google to index your pages. Use the techniques above only for content that you deliberately want to keep out of the search results.

Friday 27 June 2014

SandeepSEOExpert: Domain Analysis Is the Best Way to Establish a Business


Strategy #1: Optimize sites for SEO. Select keyword-rich domains. Develop a 5-10 page site with about 250 words per page and build links.

Strategy #2: Buy domains with existing PR. Sometimes people are sitting on domains. Find one that fits your business strategy.

Aftermarket sites: Bido.com, Moniker, Snap Names, Grand Names, etc. Check trademark availability before buying an expensive domain.

Strategy #3: Domains as an investment. Investors are looking at domains for several reasons: they are a global commodity, they are not based on debt, and the Internet is only growing.

Strategy #4: Develop a business on a domain. If you build it, they won't just come. The real value in a domain is developing it into a business, unless it has huge type-in traffic. Build content, links, and functionality.

Strategy #5: Protect the domain by owning variations and the extensions. Countries, spellings, misspellings, etc.

Strategy #6: Stay current. DN Journal, ICANN, T.R.A.F.F.I.C, Domain Masters podcast on WebmasterRadio.fm.

Victor Pitts:
Why do domains matter? Registrations are growing to 168 million worldwide. Aftermarket sales have been increasing by 20% year after year. Even in down markets, domains are increasing in value.

Domains are both collectibles and revenue producers, and each is unique: no two are alike. Direct navigation traffic accounts for 10-15% of revenue at Yahoo and Google. Your domain is your first impression online and the primary way people locate your site. It tells customers what your business is about, helps define and reposition brand promises, and improves SEO and SMO.

Protect your brand: register typos and other TLDs.
Case studies:

ToddlerToys.com = Fisher Price.
CreamCheese.com = Kraft.
Underwear.com = Calvin Klein.
All other things being equal, your domain name can be the tie-breaker in SEO. Case study: TropicalBirds.com is a new site, less than six months old, that ranks above highly competitive sites.

Aftermarket domains offer additional benefits: age, PR, and links. Properly redirected, they can give strength to other domains.

Do your research: check whether the domain has shady links pointing at it, because they can get you hurt.

A domain also has an impact on your CTR in the SERPs; it expands your ad message.

Thursday 24 April 2014

SandeepSEOExpert: Choosing The Right Domain Name For SEO


A few tips on what to do and which steps to take when researching a domain name purchase. Choose a good domain name for your store; having your own domain rather than using the subdomain supplied with your store adds credibility and professionalism.
(1) Do a site: search on the domain; if no content is found, that can be a bad thing. Of course, if the domain name was never registered, is new, or is parked, then it is likely no content will be found. But if you are buying someone else's domain name, then no content found is a red flag.

(2) Search for the domain name without the .com and see what people wrote about it. See if there are bad stories about it, or if someone used it to do a lot of spamming.

(3) Use archive.org to see the site before you owned it. Did it look spammy? (A quick way to check whether snapshots exist at all is sketched after this list.)

(4) Ask to see Google Webmaster Tools and look for messages and stats there.

(5) Ask to see Google Analytics or other analytics they may use.
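
For step (3), the Wayback Machine has a public availability endpoint that tells you whether any snapshot of a domain exists. This sketch assumes that endpoint's current JSON format; the domain is a placeholder, so treat the result as a starting point for manual review, not a verdict.

import json
from urllib.request import urlopen

domain = "example.com"  # placeholder for the domain you are researching
with urlopen("https://archive.org/wayback/available?url=" + domain) as r:
    data = json.load(r)

snapshot = data.get("archived_snapshots", {}).get("closest")
if snapshot:
    print("Closest snapshot:", snapshot["url"], "from", snapshot["timestamp"])
else:
    print("No archived snapshot found for", domain)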

Below, I've listed 12 tips I find indispensable when helping people select a great domain name.

Brainstorm 5 Top Keywords

When you first begin your domain name search, it helps to have 5 terms or phrases in mind that best describe the domain you're seeking. Once you have this list, you can start to pair them or add prefixes and suffixes to create good domain ideas. For example, if you're launching a mortgage-related domain, you might start with words like "mortgage, finance, home equity, interest rate, house payment" and then play around until you find a good match.
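
The pairing step is easy to mechanize. Here is a tiny sketch of the idea: combine seed terms with each other and with a few prefixes and suffixes to produce candidate names. The seed words and affixes are only illustrations.

from itertools import permutations

seeds = ["mortgage", "finance", "home", "rate", "payment"]
affixes = ["my", "go", "hub", "pro"]  # hypothetical prefixes/suffixes

candidates = set()
for a, b in permutations(seeds, 2):      # pair the seed terms
    candidates.add(a + b + ".com")
for seed in seeds:                       # add prefixes and suffixes
    for affix in affixes:
        candidates.add(affix + seed + ".com")
        candidates.add(seed + affix + ".com")

for name in sorted(candidates)[:10]:     # show a small sample
    print(name)
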
Make the Domain Unique
Having your website confused with a popular site already owned by someone else is a recipe for disaster. Thus, I never choose domains that are simply the plural, hyphenated or misspelled version of an already established domain. I still believe that Flickr desperately needs to buy Flicker.com - I hear kids in their 20's tell parents in their 40's and 50's to see photos on Flickr and always envision that traffic going straight to the wrong domain.

Only Choose Dot-Com Available Domains
If you're not concerned with type-in traffic, branding or name recognition, you don't need to worry about this one. However, if you're at all serious about building a successful website over the long-term, you should be worried about all of these elements, and while directing traffic to a .net or .org (as SEOmoz does) is fine, owning and 301'ing the .com is critical. With the exception of the very tech-savvy, most people who use the web still make the automatic assumption that .com is all that's out there - don't make the mistake of locking out or losing traffic to these folks.

Make it Easy to Type
If a domain name requires considerable attention to type correctly, due to spelling, length or the use of un-memorable words or sounds, you've lost a good portion of your branding and marketing value. I've even heard usability folks tout the value of having the name include easy-to-type letters (which I interpret as avoiding "q," "z," "x," "c," and "p").

Make it Easy to Remember
Remember that word-of-mouth and SERPs dominance marketing (where your domain consistently comes up for industry-related searches) both rely on the ease with which the domain can be called to mind. You don't want to be the company with the terrific website that no one can ever remember to tell their friends about because they can't remember the domain name.

Keep the Name as Short as Possible
Short names are easy to type and easy to remember (the previous two rules). They also allow for more characters in the URL in the SERPs and a better fit on business cards and other offline media.

Create and Fulfill Expectations
When someone hears about your domain name for the first time, they should be able to instantly and accurately guess at the type of content that might be found there. That's why I love domain names like Hotmail.com, CareerBuilder.com, AutoTrader.com and WebMD.com. Domains like Monster.com, Amazon.com and Zillow.com (whom I usually praise) required far more branding because of their un-intuitive names.
Avoid Copyright Infringement

This is a mistake that isn't made too often, but can kill a great domain and a great company when it does. To be sure you're not infringing on anyone's copyright with your site's name, visit copyright.gov and search before you buy.

Set Yourself Apart with a Brand
Using a unique moniker is a great way to build additional value with your domain name. A "brand" is more than just a combination of words, which is why names like mortgageforyourhome.com or shoesandboots.com aren't as compelling as branded names like bankrate.com or lendingtree.com. SEOmoz itself is a good example - "SEO" does a good job of explaining the industry we're in and creating expectations, while "moz" gives a web association, and an association with being free, open, and community-driven.

Reject Hyphens and Numbers
Both hyphens and numbers make it hard to give your domain name verbally and fall down on being easy to remember or type. I'd suggest not using spelled-out or Roman numerals in domains, as both can be confusing and mistaken for the other.

Don't Follow the Latest Trends
Website names that rely on odd mis-spellings (like many Web 2.0 style sites), multiple hyphens (like the SEO-optimized domains of the early 2000's), or uninspiring short adjectives (like "top...x," "best...x," "hot...x") aren't always the best choice. This isn't a hard and fast rule, but in the world of naming conventions in general, if everyone else is doing it, that doesn't mean it's a surefire strategy. Just look at all the people who named their businesses "AAA... x" over the last 50 years to be first in the phone book; how many Fortune 2000's are named "AAA company?"

Use an Ajax Domain Selection Tool
Websites like Domjax make it easy to determine availability of a domain name - just remember that you don't have to buy through these services. You can find a name you like that's available, then go to your registrar of choice.
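
As a rough first pass before visiting a registrar, note that registered domains usually resolve in DNS, so a failed lookup is a hint (not proof) that a name may be free. This sketch uses placeholder names; always confirm availability with a registrar.

import socket

for name in ["example.com", "some-unlikely-name-12345.com"]:  # placeholders
    try:
        socket.gethostbyname(name)
        print(name, "resolves - almost certainly taken")
    except socket.gaierror:
        print(name, "does not resolve - possibly available, check a registrar")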



Saturday 8 March 2014

SandeepSEOExpert: Content Is Useless Without Traffic


A HighRankings Forum thread has an interesting post by a person who wants more traffic to his site.

He made one statement that stood out to me (actually a few, but one more than the others). The statement is:

    All the design and content in the world are useless without traffic, and thus the real world need for SEO.

In today's world of SEO, this thinking seems backwards to me. All the design and content in the world leads to more traffic. Content and site design are part of SEO; they are SEO. Without quality content, without an awesome user interface and site architecture, how do you get quality traffic from Google or Bing?

For the SEO of eight years ago, yes, this made sense. But nowadays, in most established niches, including insurance lead generation, you need the content first to get the traffic.

Ammon Johns, aka BlackKnight, an old-time SEO who has been relatively quiet in the forums for the past few years, chimed in on this one, adding:

    First there seem to be the fact that you created a website without actually knowing what it is for. 'Possibly more' than a test site? It leaves me wondering if it really looks as professional as you hope, when yours apparently has no clear reason/purpose for existence other than to test some SEO ideas.

    Secondly there's that whole thing of being overly concerned with what other sites are doing - researching the design, content and keywords of the top 20 websites in what field? The top 20 SEO test sites? How did you know they were top 20 before knowing what keywords they'd be ranked in the top 20 for? Do you see what I mean? There's a lot of logical fallacies involved here.

Monday 3 February 2014

Google Updates: Latest SEO News and Techniques


Get the latest SEO news and techniques: search engine optimization updates at Google, Penguin updates, Matt Cutts on the Google algorithm, the latest Panda update, what white hat SEO is, black hat SEO tactics, how to do social media marketing, link building tips, on-page SEO, what SEO is, and off-page SEO tricks.

Matt Cutts, Google's head of search spam, announced on Twitter that action had been taken on another link network, the French service Buzzea.

In a follow-up to that tweet, Matt added in a reply: “today France; Germany soon.” So expect news in the near future of a German link network being taken down.
That makes two more link networks to fall victim to Google's ongoing crusade against websites that openly violate Google's guidelines.

Buzzea is one of the few to really take offence at being called a link network and attempt to defend themselves. In a translated statement posted on their website, they state that they “oppose this declaration since we never stopped wanting to keep the ethical side of sponsored articles focusing on quality and natural links created.”

However, in the same statement they also decided it was time to leave their business behind: “This marks the end of an adventure for our team but also the end of our collaboration with thousands of publisher sites, agencies and advertisers who have given us their trust.”


That trust Buzzea speaks of will likely result in a warning in Google Webmaster Tools very shortly, as well as a dip in rankings. It would be wise for anyone who participated in the link network to disavow the links immediately and stay far away from anything resembling a link network in the future.

Thursday 30 January 2014

Google EMD Update: Exact Match Domains

What is Google’s EMD Algorithmic Update?


Google is going after "low quality" exact match domains (EMD) to ensure they do not rank well in the Google search results. 

It seems like many sites were hit, as many webmasters have reported being hurt by this update. A WebmasterWorld thread has several webmasters claiming to be victims. I will do a poll on this in about a week; I don't want to poll our readers until they have had time to investigate whether they were impacted. But it seems pretty significant, especially for SEOs and domainers.

It’s still early, but it seems that it’s not intended to wipe the search results entirely clean of sites with spammy domain names. Rather, it’s intended to keep the search results in check for anything that could ruin the user experience.

Furthermore, Danny Sullivan of SearchEngineLand wrote that Google confirmed that the EMD algorithm is going to be run periodically, so that those that have been hit stay filtered or have a chance to escape the filter, and so that Google catches what it might have missed during the last update.
It's clear that Google wants its search results to be natural and free of manipulation. What used to be one of the industry's most powerful ranking tactics is now something that could get a site filtered out of the search results entirely.


Thursday 16 January 2014

Changes in SEO: What You Need to Know in 2014


In the spirit of a New Year's Eve edition, I'm presenting some of the most popular (according to the US government) New Year's resolutions through an SEO lens, in the hope of spreading some good cheer and great strategy, along with a very happy new year to you and yours!

The 2013 Updates
So what exactly changed in 2013? Most of what occurred came as a result of Google algorithm updates.

Panda was first introduced in 2011 to reduce the amount of content farming that was going on within websites. In 2013, Google updated Panda on multiple occasions, although this was not anything that webmasters weren’t already used to.

Penguin 2.0 (May 2013) and 2.1 (October 2013) brought more to the algorithm to fight against spam links and dish out more penalties for sites that broke the rules.


However, the biggest and most significant change Google made in years wasn't an update at all: it was a massive algorithm overhaul released in August 2013.

Hummingbird completely changed the way sites would look at their content, keywords, and SEO strategies. It gave a heavier focus on semantic search and placed greater emphasis on the power of the Google Knowledge Graph.

Preparations For 2014
The rules, guidelines, and even major players in SEO seem to shift and adjust on a regular basis. While it is impossible to predict exactly what will happen next, it is possible to make assumptions and form strategies from there.

To have effective SEO in 2014, make sure your site is optimized for mobile users, provides quality and informative content, and has a high level of authority.


Continue to stay up to date and follow guidelines that Google lays out as you work to improve your site’s traffic and rankings.

Wednesday 8 January 2014

Google Webmaster Tools Now Shows Exact Number of Clicks per Keyword by URL

You know the search queries report in Google Webmaster Tools? Of course you do. But did you know you can show your top pages by impressions and clicks? And did you know that you can now see the top keywords that led to impressions and clicks for those pages?

Friday 3 January 2014

Google Not Indexing Your Sitemap URLs? Might Be A Canonical Issue

                                           
A Google Webmaster Help thread has a webmaster all upset that Google shows that they have indexed none of the URLs they submitted via the XML Sitemap file. Obviously, this can be concerning to any webmaster.

The thing is, you need to be careful about which Sitemap file you submit. If you verify the non-www version with Google Webmaster Tools and submit a www version of your sitemap, or vice versa, Google may be very literal and show you that it didn't index any of the non-www versions of your URLs.

Google's Zineb said in the thread:

    All the URLs listed in your XML Sitemap are non www. and they all permanently redirect to the www. version (which is your preferred domain). That explains why these specific URLs are not indexed. In order to fix that, you'll need to specify the right URLs with www., resubmit your Sitemap and wait for it to be processed again.

So technically, it is an "easy fix" and the site is indeed being indexed. But a report like this can be scary to see in Google Webmaster Tools.
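
A quick way to catch this before submitting is to fetch every URL in the sitemap and flag any whose final address differs from the listed one, which is exactly the non-www vs. www redirect problem described above. This sketch assumes a standard XML Sitemap with <loc> entries; the sitemap URL is a placeholder.

import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

tree = ET.parse(urlopen(SITEMAP))
for loc in tree.iter(NS + "loc"):
    url = loc.text.strip()
    final = urlopen(url).geturl()  # urlopen follows redirects
    if final != url:
        print("Redirects:", url, "->", final)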

                                                           



SEO 2014 Update: Google Cannot Crawl Your Site


If Googlebot Cannot Crawl Your robots.txt File, It Will Stop Crawling Your Site

Did you know that if Google cannot crawl your robots.txt file, it will stop crawling your whole site?

This doesn't mean you need to have a robots.txt file; you can simply not have one. But if you do have one, and Google knows you do but cannot access it, then Google will stop crawling your site.

Google's Eric Kuan said this in a Google Webmaster Help thread. He wrote:

If Google is having trouble crawling your robots.txt file, it will stop crawling the rest of your site to prevent it from crawling pages that have been blocked by the robots.txt file. If this isn't happening frequently, then it's probably a one off issue you won't need to worry about. If it's happening frequently or if you're worried, you should consider contacting your hosting or service provider to see if they encountered any issues on the date that you saw the crawl error.
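
A simple periodic check of how your robots.txt responds can catch this early. A 200 (readable) or a clean 404 (no file) is fine; a 5xx error or a timeout is the situation Eric Kuan describes, where Googlebot may pause crawling. The domain below is a placeholder.

from urllib.request import urlopen
from urllib.error import HTTPError, URLError

try:
    response = urlopen("https://example.com/robots.txt", timeout=10)
    print("robots.txt returned", response.getcode(), "- crawlable")
except HTTPError as error:
    if error.code == 404:
        print("No robots.txt (404) - fine, everything may be crawled")
    else:
        print("robots.txt returned", error.code, "- this can halt crawling")
except URLError as error:
    print("Could not reach robots.txt:", error.reason)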


 