  • #1
    New Coder
    Join Date
    Mar 2005
    Posts
    11
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Sites not being indexed by Google

    Hi,

    I am a self-taught web designer, so there are some definite holes in my knowledge.

    Weird thing... the last two websites I designed are not showing up on Google at all. Just not indexed. And one of them has been up and running for over a year.

    No, neither of the sites is 100% validated, but they have fewer errors than earlier sites I designed, and those earlier sites ARE showing up in Google.

    Can anyone help me figure out how this could be?

    Sites not showing up:

    http://mercuryfreect.org (up for 1+ year)
    http://wbc-ct.com (up for 4+ weeks)

    Older sites with MORE errors that ARE showing up:

    http://oxygenesis.org
    http://flexflicks.com

    Maybe this has nothing to do with validation, but something else altogether? And by the way, I know it was not wise to use frames for the wbc site, but the client insisted. It is the only site I ever made with frames, so frames don't explain why mercuryfreect is not showing up.

    I know I should validate, but I am not good at understanding and fixing all of the problems the validator points out. Still have a lot to learn.

  • #2
    Supreme Master coder! _Aerospace_Eng_'s Avatar
    Join Date
    Dec 2004
    Location
    In a place far, far away...
    Posts
    19,291
    Thanks
    2
    Thanked 1,043 Times in 1,019 Posts
    This likely has a lot to do with validation as well as the use of tables for page layout. Read the link in my sig about tables (there's also a quick sketch at the end of this post). Another issue could be content not being updated often enough: search engines like sites whose content gets updated regularly. You should also try submitting your site to the search engines; Google allows you to do this. Another thing Google likes to see is how many backlinks you have, meaning other sites that link to your site from theirs. I suggest installing Analytics on your sites.
    http://www.google.com/analytics/
    Also look into Google Webmaster Tools.
    https://www.google.com/webmasters/
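
    To give a rough idea of what the tables link is getting at, here is a quick sketch of the same two-column page done both ways. The filenames, IDs and widths are just placeholders, not taken from your sites:

        <!-- Table-based layout: the structure says nothing about what the content is -->
        <table width="100%">
          <tr>
            <td width="25%">navigation links</td>
            <td>page content</td>
          </tr>
        </table>

        <!-- Roughly equivalent CSS layout: headings and lists give a crawler real structure -->
        <div id="nav">
          <ul>
            <li><a href="index.html">Home</a></li>
            <li><a href="contact.html">Contact</a></li>
          </ul>
        </div>
        <div id="content">
          <h1>Main page heading</h1>
          <p>Page content goes here.</p>
        </div>

    The CSS that floats #nav left and gives #content a left margin lives in your stylesheet; the point is that the second version separates structure from presentation.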
    Last edited by _Aerospace_Eng_; 06-23-2007 at 05:45 PM.
    ||||If you are getting paid to do a job, don't ask for help on it!||||

  • #3
    Regular Coder
    Join Date
    Feb 2007
    Posts
    196
    Thanks
    9
    Thanked 0 Times in 0 Posts
    Yep. Try editing your keywords (see the sketch at the bottom of this post), or submit your URLs to Google's crawler over at /webmasters/.

    It'll take some time before the bot sniffs out your site, but I've never encountered a bot refusing to index a site because of bad code.

    If you've only got a few minor errors like "ALT not specified" or whatever, that's most likely not the reason. If you put up a website and don't tell anybody about it, how the heck will Google ever find it?
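
    By "editing keywords" I mostly mean making sure each page's title and meta description actually describe that page; something along these lines (the wording here is only an example, not pulled from the actual site):

        <title>Mercury-Free Dentistry Resources for Connecticut</title>
        <meta name="description" content="Information on mercury-free fillings and a list of Connecticut dentists who offer them.">
        <meta name="keywords" content="mercury free, dentist, Connecticut">

    Keep in mind the keywords tag carries little or no weight with Google; the title and description are what the bot reads first and what tends to show up in the results.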

  • #4
    Senior Coder
    Join Date
    Jul 2005
    Location
    UK
    Posts
    1,051
    Thanks
    6
    Thanked 13 Times in 13 Posts
    This likely has a lot to do with validation as well as the use of tables for page layout.
    This is totally untrue. No matter how much you'd like it to be the case, search engines simply don't care about markup.

    Also submitting your site to the search engines is a waste of time.

    Anyway, the first site is indexed.

    The other site isn't indexed because it has no links.

    If you put up a website and don't tell anybody about it, how the heck will Google ever find it?
    Yep, now there's a man thinking logically.

  • #5
    Supreme Master coder! _Aerospace_Eng_'s Avatar
    Join Date
    Dec 2004
    Location
    In a place far, far away...
    Posts
    19,291
    Thanks
    2
    Thanked 1,043 Times in 1,019 Posts
    Quote Originally Posted by Pennimus View Post
    This is totally untrue. No matter how much you'd like it to be the case, search engines simply don't care about markup.
    What proof do you have of that? Clearly a site coded using semantic HTML will be easier for a search engine to index. Are you saying search engines don't care if a site is coded in frames either?
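
    For what it's worth, if the OP is stuck with the frameset, a noframes block at least gives a crawler some text and links to follow. A rough sketch only; the filenames and wording are made up, not taken from the actual site:

        <frameset cols="200,*">
          <frame src="menu.html" name="menu">
          <frame src="main.html" name="main">
          <noframes>
            <body>
              <h1>Site name</h1>
              <p>A sentence or two describing the site, plus ordinary links the bot can crawl:</p>
              <p><a href="main.html">Main content</a> | <a href="menu.html">Site menu</a></p>
            </body>
          </noframes>
        </frameset>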
    Last edited by _Aerospace_Eng_; 06-23-2007 at 11:48 PM.
    ||||If you are getting paid to do a job, don't ask for help on it!||||

  • #6
    New Coder
    Join Date
    Mar 2005
    Posts
    11
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Thanks everyone. I will take what you said into account.

  • #7
    Master Coder
    Join Date
    Apr 2003
    Location
    in my house
    Posts
    5,211
    Thanks
    39
    Thanked 201 Times in 197 Posts
    Quote Originally Posted by Pennimus View Post
    This is totally untrue. No matter how much you'd like it to be the case, search engines simply don't care about markup.
    I recall an expression of my grandmother's... Balderdash! Effective, valid mark-up can make a huge difference, because concise, well-structured content means the search engines can find the content that matters for individual keywords/guesswords and so rank you more highly than sites that are a load of 'tag soup'.
    How old is that web page? I have to ask: how well was it read by the poster of its link? Reading it again may explain why, as a small business, submission of your website is necessary. And of course, he was promoting his own book too! (icon for: despair at promotion of self-interest)

    And forgive me, but I have to highlight the contradiction in Pennimus's post (4th down from the top of page 1):

    'Submission to search engines is a waste of time'

    How does this statement marry up with his quote of another poster's statement,
    "If you put up a website and don't tell anybody about it, how the heck will Google ever find it?"

    1. From his self-appointed knowledgeable standpoint, he contradicts himself in the one post; and also,
    2. If links were the only criterion for good search engine rankings, this world of SEO would be a much easier place to operate in.

    My suggestion is that one should listen to those who have a vast knowledge of the subject. Check out the post count of _Aerospace_Eng_ and check out his own links. Then you should understand the validity of his comments.

    Sorry if my post sounds/reads very harsh, but I despair at the pronouncements of those who profess knowledge when it clearly doesn't stand up to scrutiny.

    bazz

    In summary: _Aerospace_Eng_ knows what he is talking about.
    And yes, I am a bit cross tonight, but that only makes a difference to how I 'say' it, not to what I believe.

    Last edited by bazz; 06-24-2007 at 02:04 AM.

  • #8
    Regular Coder kewlceo's Avatar
    Join Date
    Mar 2006
    Location
    California, US
    Posts
    484
    Thanks
    1
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Pennimus View Post
    This is totally untrue. No matter how much you'd like it to be the case, search engines simply don't care about markup.
    BS. A plethora of validation errors will screw up a bot's ability to read a site, plain and simple. If you have doubts, read Matt Cutts's posts on the Google Webmaster Central blog.
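
    To be clear, it's the structural errors that bite, not the cosmetic ones. A couple of made-up examples of the kind of markup that can trip a parser up, and their fixes:

        <!-- Broken: missing closing quote and an unescaped ampersand -->
        <a href="services.html>Our services</a>
        <a href="page.php?id=1&cat=2">Category page</a>

        <!-- Fixed -->
        <a href="services.html">Our services</a>
        <a href="page.php?id=1&amp;cat=2">Category page</a>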
    UBERHOST.NET
    Shared, reseller, semidedicated hosting and dedicated server plans.
    DirectAdmin • Installatron • Money-Back Guarantee • 24/7 Support
    Providing "Service Above All Else" since 2005.

  • #9
    Master Coder
    Join Date
    Apr 2003
    Location
    in my house
    Posts
    5,211
    Thanks
    39
    Thanked 201 Times in 197 Posts
    I really must remind myself of the word 'concise'

  • #10
    Senior Coder
    Join Date
    Jul 2005
    Location
    UK
    Posts
    1,051
    Thanks
    6
    Thanked 13 Times in 13 Posts
    What proof do you have of that?
    Gee I dunno, why not pick any competitive keyword and see how many of the top ranking sites will validate? I'm willing to bet it will be about the same proportion of sites that would validate overall.

    Clearly a site coded using semantic HTML will be easier for a search engine to index.
    I'm not sure how that is so clear. All the evidence from almost any search results page will show you that sites get indexed regardless of whether they are valid or not.

    Are you saying search engines don't care if a site is coded in frames either?
    Frames are one of the few exceptions, arguably outside the realm of markup (and in any case, they don't automatically invalidate a page). Your original post made the case for validation and tableless design in connection with getting a site indexed, which is the point I disagreed with.

    How old is that web page? I have to ask: how well was it read by the poster of its link? Reading it again may explain why, as a small business, submission of your website is necessary. And of course, he was promoting his own book too! (icon for: despair at promotion of self-interest)
    It's my page, and it's about a year old (it's still relevant, hence my posting of it). I wrote it, and the book, to have a place to direct people to, because submitting your site to the search engines is one of the most oft-given bits of bad advice and I got tired of typing out the same thing over and over again. Yes, it's also my book.

    Shame on me for giving away useful information for free

    How does this statement marry up with his quote of another poster's statement,
    "If you put up a website and don't tell anybody about it, how the heck will Google ever find it?"
    You must not have read the article properly, or have otherwise missed the point, which is that getting links to the site is better in every respect than submitting it to the search engines: it does the same job, but better, and it helps your rankings. Therefore submission = a waste of time... unless your idea of fun is filling in forms.

    My suggestion is that one should listen to those who have a vast knowledge of the subject. Check out the post count of _Aerospace_Eng_ and check out his own links. Then you should understand the validity of his comments.
    It's true AE has a lot of posts, but that certainly doesn't equate to him having a "vast knowledge of the subject". AE is primarily a developer, as attested to by the fact that the majority of his posts are about HTML or CSS and actually have nothing to do with SEO, and his site (or his company's site, whichever) makes no mention of SEO at all.

    I'm really not against valid and semantic markup; I think it's a brilliant thing. What I do object to is disinformation within my areas of expertise, in much the same way as an HTML expert would rightly object if I replied to a question saying tables were the best way to lay out a site.

    Anyway, I'll leave you with this. If search engines actually cared about validation wouldn't it seem reasonable for, say, Google to make the effort to validate?

  • #11
    Regular Coder kewlceo's Avatar
    Join Date
    Mar 2006
    Location
    California, US
    Posts
    484
    Thanks
    1
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Pennimus View Post
    I'm really not against valid and semantic markup, I think it's a brilliant thing. What I do object to is disinformation within my areas of expertise...
    Agreed, there is a lot of SEO debunking to be done. Googlebot's job is made easier by virtue of sitemaps and valid code, so Google encourages webmasters to write clean pages and have a sitemap. Webmasters then get the idea that doing these things will get them 'in' better with the search engines, and they spend too much time being anal about semantic code and learning about sitemaps instead of attending to the more important things, like getting high-quality backlinks.

    Still, semantic and valid markup never hurt anyway, especially the small guy. Amazon can get away with 1,400 page errors because valid inbound links beat valid markup every day. In any case, though, it's a good idea to help the bots do their job as efficiently as possible.
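
    Since sitemaps keep coming up: a minimal one following the sitemaps.org protocol is only a few lines (the URL and date below are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/</loc>
            <lastmod>2007-06-01</lastmod>
            <changefreq>monthly</changefreq>
          </url>
        </urlset>

    Upload it as sitemap.xml in the site root and point Webmaster Tools at it, or reference it from robots.txt with a Sitemap: line.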
    UBERHOST.NET
    Shared, reseller, semidedicated hosting and dedicated server plans.
    DirectAdmin • Installatron • Money-Back Guarantee • 24/7 Support
    Providing "Service Above All Else" since 2005.

  • #12
    Supreme Master coder! _Aerospace_Eng_'s Avatar
    Join Date
    Dec 2004
    Location
    In a place far, far away...
    Posts
    19,291
    Thanks
    2
    Thanked 1,043 Times in 1,019 Posts
    Quote Originally Posted by Pennimus View Post
    Anyway, I'll leave you with this. If search engines actually cared about validation wouldn't it seem reasonable for, say, Google to make the effort to validate?
    Why? They aren't trying to index themselves.
    ||||If you are getting paid to do a job, don't ask for help on it!||||

  • #13
    Senior Coder
    Join Date
    Jul 2005
    Location
    UK
    Posts
    1,051
    Thanks
    6
    Thanked 13 Times in 13 Posts
    Why? They aren't trying to index themselves.
    Firstly, yes they are, if you expand your horizon beyond just the search interface. Here's a short list of Google properties that they want indexed.

    - Webmaster Central (11 validation errors)
    - Base (139 validation errors)
    - Maps (147 validation errors)
    - YouTube (186 validation errors)
    - Google Video (245 validation errors)

    Needless to say these sites are indexed and rank for a range of appropriate keywords.

    I could go on, but at this stage you're either going to be convinced that having an invalid, badly marked-up page makes no difference to whether a site is indexed, or you're not, and no amount of proof will change your mind.

    Secondly, even if Google for some reason didn't want to index its own sites, if they cared about valid markup I would still expect them to ensure their sites validated as a matter of principle: in much the same way that I would expect your sites to be valid, given that you are a proponent of valid code.

    Still, semantic and valid markup never hurt anyway
    I quite agree, just not for anything to do with search engines. When sites with hundreds of errors are being indexed without any problems, I can't see how anyone can think otherwise.
    And yes, I've focussed on the "big guys", but if I had the motivation I could easily find small websites with an equally large number of errors that are indexed.
    Last edited by Pennimus; 06-24-2007 at 09:21 PM.

  • #14
    Supreme Master coder! _Aerospace_Eng_'s Avatar
    Join Date
    Dec 2004
    Location
    In a place far, far away...
    Posts
    19,291
    Thanks
    2
    Thanked 1,043 Times in 1,019 Posts
    I never said that sites couldn't get indexed because of invalid code. I said it's easier for them to be indexed when they have valid code. I'm done with this; neither of us is going to change the other's mind.
    ||||If you are getting paid to do a job, don't ask for help on it!||||

  • #15
    Senior Coder
    Join Date
    Jun 2002
    Location
    The Netherlands, Baarn, Ut.
    Posts
    4,252
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Unicity

    In the case of the Mercury Free site, I'm afraid it's more a problem of standing out from the crowd, or rather not doing so enough; using the search terms "mercury free", I get a plethora of sites about the subject: a tough market to compete in.

    Edit:
    By the way: if you use "mercury free ct", you rank on top! It's also a matter of using the right search terms...
    Last edited by ronaldb66; 06-25-2007 at 12:52 PM.
    Regards,
    Ronald.
    ronaldvanderwijden.com

