How can I get Google to index more of my Sitemap URLs?

Bill from Stuart, FL asks: “The Sitemap.xml file states there are 10,000 URLs, but only 1,500 have been indexed. After numerous crawls, it does not appear Google is going to index these additional detail pages. What can I do to get Google to index my unique & current detail pages?”
Video Rating: 4 / 5
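A first step with a mismatch like Bill's (10,000 URLs submitted, 1,500 indexed) is verifying what the sitemap actually contains. Below is a minimal sketch of counting `<url>` entries with Python's standard library; the example file content and URLs are hypothetical, but the XML namespace is the standard one from the sitemaps.org protocol.

```python
import xml.etree.ElementTree as ET

# The standard sitemap namespace defined by the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(xml_text: str) -> int:
    """Count the <url><loc> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url/sm:loc", NS))

# A minimal example sitemap (hypothetical URLs).
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/detail?product_id=1</loc></url>
  <url><loc>http://example.com/detail?product_id=2</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 2
```

Comparing this count against the “indexed” figure in Webmaster Tools tells you whether the gap is in the sitemap itself or in Google's crawling.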

14 Replies to “How can I get Google to index more of my Sitemap URLs?”

  1. @wintogreen1 Well, it probably won’t work well for a blog, since “dynamic” for a blog doesn’t mean an ID parameter the way it does for a product site (i.e. product_id=xxxx). For a blog I would make all the pages static and use keywords in the file names. If your blog is 1,000+ pages and all of them are dynamic, Google most likely isn’t going to run parameter queries to grab all the pages.

  2. @infiltrator7777 – I’m a beginner, pardon me if it’s a silly question; I’m trying to understand you, infiltrator7777.
    So I have a blog which is dynamic (i.e. new content and new posts) and have hyperlinks in the blog to static pages like the home page – that helps trap the Google bot so it crawls better? Does that sound correct? Am I understanding it correctly?

  3. So, it’s the backlinks.
    But backlinks to what? The homepage? Inner pages? Or the inner-inner pages which are not yet indexed?
    We all know it’s almost impossible to get backlinks for every page of a website that has 10,000+ URLs in its sitemap.

  4. Wait a minute, what in the world would conversion rates have to do with 70% of that guy’s URLs not being indexed? That is a new one to me, Cutts.

    Also, why do you keep saying these people need more links? It’s not like it’s even possible to get deep links to pages that are not indexed, right? So I guess you mean more backlinks to the home page and other pages, while having good site architecture?

  5. I tried changing the order of my pages and Google indexed all my pages… I kind of changed the priority and where the actual file was located…

  6. And BILL in Stuart, FL –

    To answer your question, here’s a little SEO secret –

    YOU NEED TO TRAP THE BOT.

    Create ONE page that has ONE link to your dynamic content and keep the bot from exiting to other static site pages. Once the bot is on this dynamic content, it will continue to grab as many pages as your server can handle.

  7. Links. What a great answer Matt.

    “Well Allow Me To Retort” –

    If a 10,000+ page site isn’t being indexed and Google relies on “links” to point to the site’s content – what makes NEW and FRESH content part of Google?

    Answer: nothing. It’s your way or the highway, Google.

  8. Matt, what do you mean when referring to visitors that “convert well”? Are these visitors who stay on your site for extended amounts of time, or possibly exit your site through sponsor links?

    Please clarify.

  9. In addition, I would check Webmaster Tools carefully – any redirect errors? Any duplicate titles or meta information? Do you have a good structure within your site to get to all those pages? I have 300,000 pages in sitemaps and have improved my position dramatically by paying attention to this. Grahame
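The duplicate-title check Grahame describes can be automated once you have a crawl of your own site. A minimal sketch, assuming you already have a mapping from URL to page title (the URLs and titles below are hypothetical examples):

```python
from collections import defaultdict

def find_duplicate_titles(pages: dict) -> dict:
    """Group URLs by normalized <title>, keeping only titles shared by 2+ URLs."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl results: URL -> page title.
pages = {
    "/detail?product_id=1": "Blue Widget",
    "/detail?product_id=2": "Blue Widget",
    "/detail?product_id=3": "Red Widget",
}

print(find_duplicate_titles(pages))
# {'blue widget': ['/detail?product_id=1', '/detail?product_id=2']}
```

Any title that maps to more than one URL is a candidate for rewriting, since pages that look identical in their metadata give the crawler less reason to index each one separately.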

Leave a Reply

Your email address will not be published. Required fields are marked *