#1 Keleth (Senior Coder)

    Selecting object not in DOM

I'm sure this is an oft-asked question; I just can't seem to word it in a way that finds said answer.

Common practice says to minimize the number of CSS and JS files you load. However, I've always assumed that running a selector that doesn't match anything (especially a class selector) was much slower than loading a separate file for that page.

For example, I wrote a private messaging system on one website. I use AJAX to send and delete messages. The selectors used for this aren't used anywhere else. Am I better off:

a) Keeping scripts specific to a page in a separate JS file, loaded only on each such page.
b) Keeping the code in a single file and letting every selector run as is.
c) Keeping the code in a single file, but wrapping the page-specific selectors in if statements so they only run on the right page (each page has a unique ID, so I can test for that ID in an if/else chain; a rough sketch is below).
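
For reference, here's roughly what I mean by (c). The body ID, selectors, and handler bodies are just placeholders, and this assumes jQuery, which the site already loads:

Code:
// only run the messaging bindings when the page's unique body ID matches
if (document.body.id === 'private-messages') { // hypothetical page ID
    $('#send-message').on('click', function () {
        // AJAX call to send a message would go here
    });
    $('.delete-message').on('click', function () {
        // AJAX call to delete a message would go here
    });
}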

#2 rnd me (Senior Coder)
Quote Originally Posted by Keleth
    interesting and extremely complex question. Browsers have really changed the way they load resources in the past year or two, so any advice from before 2010 is moot.


    unused selectors do cause work, which slows page rendering.
    unused selectors also waste http bandwidth, which slows page loading, though usually it's a trivial amount, unless the site has a lot of repetitive rules from several years of redesign and enhancement.
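
you can get a rough feel for that cost in your browser console; here's a crude micro-benchmark (not rigorous, just illustrative):

Code:
// a class selector that matches nothing still scans the whole tree every time
console.time('unmatched selector x1000');
for (var i = 0; i < 1000; i++) {
    document.querySelectorAll('.no-such-class'); // finds nothing, does the work anyway
}
console.timeEnd('unmatched selector x1000');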

in general, the best performance comes from re-using the same assets again and again. if every page asks for different scripts or stylesheets, those uncached files must be fetched in full before the page can finish loading.

you need to balance it all out. if 60 of 100 pages use some CSS, put it in a file that all 100 pages see. if only 5 pages need it, and it's a lot of rules (>15kb), put it in a separate file.


it's not quite true that one js file is faster than many. browsers will open up to about 8 HTTP connections at once, which means that 4 scripts of 250kb each, loaded in parallel, will arrive and parse ~3X faster than a single 1mb file would. if each of the 4 250kb scripts is on a different domain, and thus needs its own DNS lookup, then the single file might win.

    browsers now grab all the <script src> tags at once, not one-at-a-time like they had done since 1995. So, the performance penalties of page-load-killing <script src> tags are nowhere near as severe as they were not too long ago.


here's a strategy to get good perf across the widest range of circumstances:

    from the page's point of view, a page should grab:

    html:
    1 file, at the url

    css:
    1 master file used in all pages
    1 section-specific file (if needed)
    1 page-specific file (if needed)

    js:
    1 cdn jquery
    1 cdn jquery ui

    1 all jquery plugins concatenated to one file locally
1 site-wide js file for configuring plugins, monkey-patching, etc.
    1 section-specific script file for the functionality of the page (if needed)

    that uses 4 http connections to get through the <head>
    those are all closed by the time the next local batch of 3 js files is asked for
    even while loading the three scripts, you have 5 channels open to fetch images...
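
to make that concrete, the <head> might look something like this (the local file names are made up; the CDN URLs are google's hosted jquery/jquery ui):

Code:
<link rel="stylesheet" href="/css/site.base.css">     <!-- master, all pages -->
<link rel="stylesheet" href="/css/mail.section.css">  <!-- section-specific -->
<link rel="stylesheet" href="/css/inbox.page.css">    <!-- page-specific -->

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script src="//ajax.googleapis.com/ajax/libs/jqueryui/1.10.2/jquery-ui.min.js"></script>

<script src="/js/plugins.packed.js"></script>
<script src="/js/site.js"></script>
<script src="/js/mail.section.js"></script>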

in general, i would err on the side of over-fetching unused-but-cacheable stuff, rather than fetching exactly what each page needs without the aid of caching.

    does that make sense?
    Last edited by rnd me; 04-12-2013 at 02:19 AM.

#3 Keleth (Senior Coder)
I was under the impression that browsers auto-cached? When you say cached there, do you mean something additional? I followed you right up to the very end.

How I have my stuff set up now is similar to what you presented:

    1 sitewide CSS
    1 page CSS

I make use of jQuery through Google's CDN.
I actually don't merge my plugins, but I see the benefit there; I hadn't thought of that, so I'll start doing it.
    1 sitewide JS
    1 page JS

So I knew that multiple resources get pulled at once; I guess I wasn't sure the load was worth it. From what you're saying, I'm more or less on the right track, though I could use some extra sorting (for some reason, I never considered section-specific files... this is so obvious now that I'm gonna go force myself to read a paragraph of Twilight as punishment).

#4 rnd me (Senior Coder)
Quote Originally Posted by Keleth
    I was under the impression that browsers auto-cached? When you say cached there, do you mean something additional?
they can/should cache automatically, depending on your server's header config. the yslow extension for firebug is great at spelling these out in detail in the context of your (or any) site.

what i'm saying is that once you make sure your server is set up correctly (expires headers, versioning, etc.), you should avoid first-time url pulls as much as possible in the HTML, and your site will be fast.

    i bundle the plugins because i don't ever modify them, so i can simply copy new script files into the folder when they are released, and rebuild the pack without touching the code.

    on windows, you can concat a whole folder of plug-ins on the command line in one step:

    Code:
    copy *.js plugins.packed.js

this works for CSS files as well; we like to concat several files into a site-wide base pack (reset, layout, typog, form, theme, etc).
the one slight catch with CSS is that you need to mind your order when bundling. a priority number in front of the version number and name works well (0.14.reset.css, 1.11.layout.css, etc).
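
if you'd rather not rely on the wildcard's sort order, copy will also concatenate an explicit list joined with "+", in exactly the order given (using the priority-numbered names above; the typog file name is made up to match):

Code:
copy 0.14.reset.css+1.11.layout.css+2.03.typog.css site.base.css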

i've anecdotally found that anything larger than 30kb for a single CSS file/url makes performance go down a fair amount, especially on mobile.
perhaps because browsers progressively parse and render CSS, hogging the mic for that long stalls out other loading activity, but who knows...

    anyway, sounds like you're on the right track.


for the ultimate performance, a manifest file works WAY better than HTTP headers, but it does require editing two places to make each change. not sure if it's worth it for your site, but it can be a godsend for heavy applications, and it's worth knowing about, if not using.

#5 Keleth (Senior Coder)
    What do you mean by manifest file?

    And coincidentally... here I was, thinking I understood some things about best practice and how to speed up sites...

So I'm not good with server stuff, unfortunately... I need to learn how to set the headers you're talking about for caching. Also, in what context are you talking about versioning here? Do you mean giving file names versions? Or is this another header item I need to learn about?
    Last edited by Keleth; 04-12-2013 at 07:55 PM.

#6 rnd me (Senior Coder)
Quote Originally Posted by Keleth
see the "Javascript on tablets and smartphones ..." thread for info on "my" manifests.

versioning is done within the manifest, or, without one, by using distinct URLs in conjunction with the Expires header to make sure any given url is only downloaded once.
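
for reference, an html5 cache manifest is just a text file served as text/cache-manifest and pointed to by <html manifest="...">. a minimal sketch (file names made up); bumping the version comment is what forces the browser to re-download everything listed:

Code:
CACHE MANIFEST
# v123 -- change this comment to invalidate every cached file below
CACHE:
/css/site.base.css
/js/plugins.packed.js
/js/site.js
NETWORK:
*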
Most assets don't ever change, especially images, so at the very least add "Expires" http headers for "image/*" in your server config.
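
for example, on apache with mod_expires enabled (nginx has equivalent directives; adjust the types and lifetimes to taste):

Code:
<IfModule mod_expires.c>
    ExpiresActive On
    # far-future expiry for assets that never change at a given url
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/gif  "access plus 1 year"
</IfModule>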

Since a given url is only downloaded once, you need to hand the browser a new url to push out a change to any external content like a script, image, or stylesheet.


a filename numbering schema is best for this (site.typo.123.css), but you can simply append "?v=123" to the end of any <script src> or <link href> url. this forgoes caching on some old proxy servers, but those are a tiny minority anyway, and it just makes things slightly slower for them, not broken. you then need to edit the link url whenever you edit the file, which is kind of a pain, but worth it if you don't want to get into manifests. while you are developing a site, use links that don't emit "Expires" headers, so you can just F5 instead of changing two things and then F5...
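
both flavors look like this in the page (the numbers are made up; bump them whenever the file changes):

Code:
<link rel="stylesheet" href="site.typo.123.css">
<!-- or the query-string flavor: -->
<script src="site.js?v=123"></script>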

