  1. #1
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post

    Avoiding document.getElementById() or similar DOM instantiation

    Having many calls to document.getElementById() may be detrimental to script performance. I developed a noSQL database for web browsers in JavaScript that stores references to DOM elements in a short array (its length equals the number of elements on the user's screen). This element reference cache makes it possible to reference DOM elements without instantiating them again, i.e. without using getElementById(), like so:

    Code:
    var doc = dbInstance.select({col: 'id', val: 'idOfElement'});
    or by any other element property, like class:
    Code:
    var doc = dbInstance.select({col: 'className', val: 'classOfElement'});
    The noSQL database takes care of indexing and retrieval of elements. The index is built once, at the time the web page is rendered. More about this tech at noSQL database for Web


    Gonki

  2. #2
    Senior Coder deathshadow's Avatar
    Join Date
    Feb 2016
    Location
    Keene, NH
    Posts
    2,169
    Thanks
    2
    Thanked 314 Times in 304 Posts
    Seeing that it would introduce the overhead of a database, having to build the database in the first place, etc, etc... I fail to see how this would be any "faster".

    USUALLY the best bet for avoiding the overhead is to only call it once and save it in a variable or object property, instead of calling it every blasted time you need it. GOUM -- get once, use mostly. That would skip the overhead of your DB, which I can't see ACTUALLY being any faster; quite the opposite in fact. If anything it should be many, many, MANY times slower -- more so with what you'd have to be doing for startup code.
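
    Something along these lines -- a minimal sketch of "get once, use mostly", where 'status' is just a placeholder id:

    Code:
    // Look the element up ONCE, keep the reference, and reuse it everywhere.
    var statusEl = document.getElementById('status');

    function showMessage(text) {
        // no repeated getElementById() call here; the cached reference is reused
        statusEl.textContent = text;
    }

    showMessage('Loaded.');
    showMessage('Saved.');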

    You're taking something simple, and making an over-complicated mess out of it.
    I would rather have questions that can't be answered, than answers that can't be questioned.
    http://www.cutcodedown.com

  3. #3
    Regular Coder
    Join Date
    Oct 2014
    Location
    California
    Posts
    399
    Thanks
    1
    Thanked 84 Times in 83 Posts
    Well, a few points I feel the need to make:

    Having many calls to document.getElementById() may be detrimental to script performance.
    That simply isn't true in practice. Even on a 3-year-old mobile device, JavaScript performance is just really good. It's quite a challenge to code a fairly normal website and actually tax the user's browser. Even the most inefficient tactics cause only small delays most of the time. I'm not saying that's an excuse to use bad coding practices but for the majority of tasks a front-end developer faces, there's very little opportunity to create noticeable differences in performance.

    Aside from that, if you have "many calls to document.getElementById()" in your script, then there's a good chance you're simply doing things inefficiently. As Deathshadow recommends, if you need to reference a DOM object multiple times, you should store it in a variable for later use. You should also bind events to the document.body object as a delegate and then cross-reference their ID/ClassName/Etc. rather than binding events to a bunch of individual objects. There are all kinds of small tricks you can use to increase efficiency but your main priorities should be collaboration and simplicity: is your code well-documented and simple enough that you could invite any random developer to collaborate on your project and have them reasonably understand your code without having to organize an exhaustive training process?
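
    For example, here's a rough sketch of that delegation pattern (the 'delete-btn' class and data attribute are just placeholders):

    Code:
    // One listener on document.body handles clicks for every current and future
    // matching element, instead of binding a handler to each one individually.
    document.body.addEventListener('click', function (event) {
        var target = event.target;
        if (target.classList.contains('delete-btn')) {
            // act based on the clicked element's id/class/data attributes
            console.log('delete requested for row', target.dataset.rowId);
        }
    });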

    Code that can be shared and understood by other people is more valuable than code that shaves off a few milliseconds of execution time.
    I almost always format my code examples with [php][/php] tags because it looks nicer than the regular [code] tag. That does not mean the code example is in PHP.

  4. #4
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post
    The database takes care of saving DOM element references in an index (array). The index is created only once, during page rendering. If the same page is called more than once, the index is not re-created.

    In a single-page application environment, which is part of 2DX's functionality, there may be tens of thousands of elements in the DOM tree. I ran benchmark tests loading thousands of DIV elements across dozens of web pages and saw very good performance. I'll try to put the benchmark test results up on the website.

    Obviously, avoiding document.getElementById() is only one small feature of 2DX. Moreover, it's a single-page application environment that I hope will one day compete with the better-known platforms of that kind.
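
    To illustrate the general idea -- a plain-JavaScript sketch of building the reference index in a single pass, not the actual 2DX code; the id and class names are placeholders:

    Code:
    // Build the index once, right after the page renders.
    var index = { byId: {}, byClass: {} };

    Array.prototype.forEach.call(document.querySelectorAll('*'), function (el) {
        if (el.id) {
            index.byId[el.id] = el;
        }
        Array.prototype.forEach.call(el.classList, function (cls) {
            (index.byClass[cls] = index.byClass[cls] || []).push(el);
        });
    });

    // Later lookups read from the index instead of re-querying the document.
    var doc = index.byId['idOfElement'];
    var docs = index.byClass['classOfElement'] || [];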


    Take a look at how pages are defined



    Code:
    // given the following DOM tree:
    
    [{div:{innerHTML:'A'}},
     {div:{innerHTML:'B',
           nodes:[{div:{innerHTML:'B_1'}},
                  {div:{innerHTML:'B_2'}}]}},
     {div:{innerHTML:'C',
           nodes:[{div:{innerHTML:'C_1',
                        nodes:[{div:{innerHTML:'C_1_1'}}]}},
                  {div:{innerHTML:'C_2'}}]}}]
    Code:
    // pages with elements (A), (B, B_1, B_2) and (C, C_1, C_1_1, C_2)
    // are referenced using these pseudo "href" values:
    
    [0]                  // el. A
    [{1:[0,1]}]          // el. B, B_1, B_2
    [{2:[{0:[0]}, 1]}]   // el. C, C_1, C_1_1, C_2

    Such a schema for dynamic page referencing in a single-page app will automate the definition of page element structures on websites with large numbers of HTML tags.

  5. #5
    Moderator
    Join Date
    May 2002
    Location
    Hayward, CA
    Posts
    1,493
    Thanks
    1
    Thanked 24 Times in 22 Posts
    I'm inclined to agree with deathshadow and sagebrushfire. But I also need to point out that you probably don't even need NoSQL with modern JavaScript. Just use the Map, WeakMap, Set or WeakSet built-ins...
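
    For instance, a quick sketch of hanging data off DOM elements with a WeakMap (the element id is just a placeholder):

    Code:
    // WeakMap keys are the elements themselves, so entries can be garbage-collected
    // automatically once the elements are gone -- no manual index bookkeeping.
    var elementData = new WeakMap();

    var panel = document.getElementById('panel');
    elementData.set(panel, { openedAt: Date.now(), dirty: false });

    // ...later, anywhere you still hold the element reference:
    var info = elementData.get(panel);
    console.log(info.dirty);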
    "The first step to confirming there is a bug in someone else's work is confirming there are no bugs in your own."
    June 30, 2001
    author, ES7-Membrane project (Github Pages site)
    author, Verbosio prototype XML Editor
    author, JavaScript Developer's Dictionary
    https://alexvincent.us/blog

  6. #6
    Senior Coder deathshadow's Avatar
    Join Date
    Feb 2016
    Location
    Keene, NH
    Posts
    2,169
    Thanks
    2
    Thanked 314 Times in 304 Posts
    Quote Originally Posted by Gonki View Post
    The database takes care of saving DOM element references in an index (array). The index is created only once, during page rendering. If the same page is called more than once, the index is not re-created.
    Since a new page would be a new DOM, then how in blazes is the relationship between said index and the DOM references maintained -- for that matter, how are they established in the first place, since last I checked you can't store DOM references in noSQL?

    I was thinking about it, and your entire system does'nae make a lick of sense, since the relationship between noSQL and the DOM needed for the functionality you're describing... it doesn't exist, last I knew! Even if it did, it wouldn't hold across page-loads!

    Quote Originally Posted by Gonki View Post
    In a single-page application environment
    Ah, buggy slow battery wasting crapplets. Explains why you're doing stuff I'd never do on a website.

    Quote Originally Posted by Gonki View Post
    there may be tens of thousands of elements in the DOM tree.
    Making "tens of thousands of elements on the DOM" is an epic fail. For how much content? Admittedly, the "slop everything into the DOM at once" garbage has never been a favorite of mine as development techniques go -- but if you have "tens of thousands" of elements... well. There's something horrifyingly and terrifyingly WRONG with whatever it is you're working on.

    Nothing you're saying makes a lick of sense... though it SOUNDS like you might be screwing around making things more complex than they need to be -- though with people slopping out five to twelve dozen K of markup to do the job a tenth that much could handle, I can see how you might come to such a conclusion. Much less the broken bloated idiotic techniques that are still commonplace like innerHTML, or the halfwit disaster of developer ineptitude that is jQuery... Maybe if you're slopping in six div and four HTML 5 tags with seven dozen classes to do the job of one semantic tag and inheritance it MIGHT be worth using, but in that case just fix the bloody markup!

    More than anything, I'm just not seeing how all this extra code and startup time would actually provide any speedups or benefits "real world" on... any sanely written page.

    Unless... wait, are you looking for a way to serialize and de-serialize ALL the markup into scripttardery? Oh, that's going to be efficient... NOT. I think I get it -- you want to send markup as an object to be constructed... Something I would NEVER do since I build WEBSITES with accessibility in mind... and delivering content via scripting is telling users to go **** themselves on websites.

    Even on crapplets, the overhead of saying what the markup is really has no business being vomited up into the transmission between server and client. You're just making extra overhead for nothing... but I can see where you might think you found savings. You're optimizing for a sloppy inefficient technique that shouldn't be done in the first place.
    I would rather have questions that can't be answered, than answers that can't be questioned.
    http://www.cutcodedown.com

  7. #7
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post
    Quote Originally Posted by deathshadow View Post
    Since a new page would be a new DOM, then how in blazes is the relationship between said index and the DOM references maintained -- for that matter, how are they established in the first place, since last I checked you can't store DOM references in noSQL?

    All elements in noSQL (nodes + children + all properties) are assigned a unique integer primary key. So whether it's a DIV, an onClick() event or a className -- each gets a unique key in the database. The stack of DOM references has the structure "parent DOM PK" + "DOM object PK" + "reference to the DOM element in the page". The stack is not part of the noSQL database but a standalone in-memory table, because the noSQL layer is abstracted from the DOM stack.
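
    As an illustration only of the record layout described -- not the actual 2DX structures; the ids and key values are made up:

    Code:
    // Each entry ties the database primary keys to a live element reference.
    var domStack = [
        { parentPk: 0,  pk: 17, el: document.getElementById('header') },
        { parentPk: 17, pk: 42, el: document.getElementById('title') }
    ];

    // Lookup by primary key returns the cached element, no getElementById() needed.
    function findByPk(pk) {
        for (var i = 0; i < domStack.length; i++) {
            if (domStack[i].pk === pk) { return domStack[i].el; }
        }
        return null;
    }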

    When the same page gets rendered a second time, its DOM is taken from the stack rather than re-created. This is also true if the same DOM object is part of different pages. I ran benchmark tests which show good rendering speeds -- less than 10ms per page with 100 objects; I'll put up the results soon. This technology has been online for one month and many aspects of it are still in development. I've been programming for the web since 1998 and see many good things 2DX can do that I would previously have needed to write piles of code for.

    Serialization will encode the DOM tree along with user input into a JS object, save it on the server, then restore it after a reload or on a different device. Accessibility will remain up to styling/CSS; I don't see how serialization would affect that.
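
    A rough sketch of that kind of round trip in plain JS -- not the 2DX implementation, and the endpoint URL is a placeholder:

    Code:
    // Capture the current values of all form fields into a plain object...
    function captureState() {
        var state = {};
        Array.prototype.forEach.call(
            document.querySelectorAll('input, textarea, select'),
            function (field) {
                if (field.name) { state[field.name] = field.value; }
            }
        );
        return state;
    }

    // ...send it to the server as JSON...
    fetch('/save-state', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(captureState())
    });

    // ...and restore it later by writing the values back.
    function restoreState(state) {
        Object.keys(state).forEach(function (name) {
            var field = document.querySelector('[name="' + name + '"]');
            if (field) { field.value = state[name]; }
        });
    }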

    2DX will also serialize/de-serialize graphics. A user can display and edit graphs, then save them in serialized format just by dumping the noSQL database to the server, the same as with a plain UI. It took many years to implement the algorithms behind the noSQL database, with a unique PK for each element (node or property); now it's the phase of implementing features that run off it. I couldn't find another in-memory noSQL database for web browsers, so I'm not sure what to compare it with.

    I haven't been able to measure the startup delay yet; the noSQL database itself is 10k. I'll need to benchmark how long it takes to load the DOM tree into memory. For the js2dx.com website it's not noticeable to me as a user, granted the DOM tree there is not large.


    2DX


    PS: I've loaded pages with 20-50k DOM objects; I don't think that's too unusual -- for example, when scrolling through user activity, news updates, etc.

  8. #8
    Senior Coder deathshadow's Avatar
    Join Date
    Feb 2016
    Location
    Keene, NH
    Posts
    2,169
    Thanks
    2
    Thanked 314 Times in 304 Posts
    What I really think I missed is that you're not manipulating existing HTML sent as HTML by the server. You're building the DOM and all its content FROM the JavaScript, or serializing it on first load after...

    That first load is a battery draining pig, so that's not something I'd be doing if the project needs to care about mobile. The rest of it is a cute idea, but the only practical reason to do it that way would be multiple loads or that STUPID MALFING endless scrolling garbage -- the reason that if you accidentally leave Facebook open overnight or try to (Joe forbid) scroll down more than ~20 pages, the browser blows up in your face from running out of memory.

    Either way, it's the "scripting only content delivery" that's a GIANT middle finger to usability and accessibility. The only real way I could see it having a practical application is for local applets where you KNOW JS is active -- like when working with nw.js or electron -- and really if you're running it locally you're just introducing more overhead.

    In some ways, your "10,000+ DOM Element" page sounds like the type of thing that leaves me screaming at the display "OH FOR **** SAKE, JUST PAGINATE THE BLASTED THING!!!"

    It sounds like it would fill a very narrow niche of the type of things I'm usually trying to convince people NOT to do with web technologies in the first place -- since basically it sounds like the stuff that tells users wholesale to sod off from a lack of graceful degradation and the exact opposite of why HTML even exists.
    I would rather have questions that can't be answered, than answers that can't be questioned.
    http://www.cutcodedown.com

  9. #9
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post
    Quote Originally Posted by deathshadow View Post
    In some ways, your "10,000+ DOM Element" page sounds like the type of thing that leaves me screaming at the display "OH FOR **** SAKE, JUST PAGINATE THE BLASTED THING!!!"
    The main page of Browser noSQL, if you click on "demonstration of 2DX noSQL performance", contains 11k+ DOM elements, loads snappily and supplies a paging box. I've loaded more than 1 million elements and my browser kept working. All in all, 20-100 thousand DOM elements is a good workable range for 2DX.

  10. #10
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post
    I'm just curious how you would enable DOM element text paging with plain HTML. On the 2DX website you can see an input box that pages in real time over the innerHTML of 10k elements. I have tested it with 1 million records and it works.

    Please let us know how the same is possible with plain HTML. Also, don't forget to separate pageable text from pageable numbers in innerHTML, and from the non-pageable text on the web page, the way it works on the website. Also comment on the JavaScript animation function of 2DX; there is a demo if you click on the page logo.


    Thank you.

  11. #11
    Senior Coder deathshadow's Avatar
    Join Date
    Feb 2016
    Location
    Keene, NH
    Posts
    2,169
    Thanks
    2
    Thanked 314 Times in 304 Posts
    Quote Originally Posted by Gonki View Post
    The main page of Browser noSQL, if you click on "demonstration of 2DX noSQL performance", contains 11k+ DOM elements, loads snappily and supplies a paging box.
    If by snappy you mean an empty page... oh look, Ghostery is blocking something in your scripting. Ok, make an exception to let it through...

    ... and if by snappy you mean a CPU spike that cranks my laptop fan up to high; I'm SO sure mobile users will thank you for sucking down half their battery...

    But more than that, WHAT'S THE APPLICATION for this?!? Ok, you can do it. WHY would you want to shove a wad of meat that big in the user's face? Much less if this were REAL data and not just a randomly generated series of four-digit numbers, how "peppy" would it be then? WHAT business would you have sending that much data client-side at once in the first place?

    I'm just not getting why you would do ANY of this in the first place, unless your entire intent is to tell users to sod off.

    Hence your contentless website, since without scripting there is no site -- at ALL. A big empty page of nothing but -- talking STRICTLY from an HTML and CSS standpoint -- developer ineptitude. You've got some real fancy scripting going on there, impressive code even... but in terms of usability, accessibility, and functionality you're flipping the double bird at users and missing the entire reason the client is separate from the server, why HTML even exists, what it is for, and why content delivery via JavaScript with zero graceful fallbacks is an EPIC FAILURE at web development!

    I'm just not seeing a real-world application scenario for this to serve any purpose other than "gee ain't it neat" scripttardery -- the type of "JS for nothing" that tells users to sod off, leaves search engines scratching their heads with "where's the content", and that takes what should be 4k of markup and 5k of CSS and turns it into 80k of gibberish spanning 20+ files.

    Quote Originally Posted by Gonki View Post
    I'm just curious how you would enable DOM element text paging with plain HTML.
    I have no idea what you mean by "DOM element paging" but it sounds like the same type of asshattery as trying to use something like AJAX or client-side script processing to replicate the behavior of iframes -- missing that iframes and that approach were deprecated in 4 Strict for a REASON -- that reason being they are accessibility TRASH.

    Quote Originally Posted by Gonki View Post
    On the 2DX website you can see an input box that pages in real time over the innerHTML of 10k elements. I have tested it with 1 million records and it works.
    NOT that one should EVER be using innerHTML to write to a page in the first place -- and again, WHY would you send 10K elements to a user at one time? Just TRYING to tell the user to go **** themselves? How is that USEFUL in an accessible manner, or in keeping with what HTML is for -- delivering content to users?

    You're asking:

    Quote Originally Posted by Gonki View Post
    Please let us know how the same is possible with plain HTML.
    I'm asking why in blazes you would do this on a website in the first place?!? How is that USEFUL to USERS? What's the ACTUAL practical usage scenario?!?

    You don't send 10K records client-side and then process them. You process it on the server and then send two dozen at MAXIMUM, IN HTML!!! That's what I meant by paginate -- you do it BEFORE you send it client-side, with REAL page-loads, so you aren't telling users who intentionally block JS, don't have it available due to workplace restrictions, or cannot use it because they are on a screen reader or braille reader to, well... again, using scripting this way is -- from an accessibility and sane/rational web development standpoint -- telling users to go **** themselves!
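
    Something like this server-side sketch -- Node here just for illustration, with an in-memory array standing in for a real data store and a placeholder route and page size; the same logic applies in any back-end language:

    Code:
    // Minimal Node server: pick the requested slice ON THE SERVER and send
    // only that slice, already rendered as HTML, with prev/next links.
    var http = require('http');
    var url = require('url');

    var records = [];                              // stand-in for a real data store
    for (var i = 1; i <= 10000; i++) { records.push('Record #' + i); }

    var PAGE_SIZE = 24;

    http.createServer(function (req, res) {
        var page = parseInt(url.parse(req.url, true).query.page, 10) || 1;
        var start = (page - 1) * PAGE_SIZE;
        var rows = records.slice(start, start + PAGE_SIZE)
            .map(function (r) { return '<li>' + r + '</li>'; })
            .join('');

        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end(
            '<ul>' + rows + '</ul>' +
            (page > 1 ? '<a href="?page=' + (page - 1) + '">Previous</a> ' : '') +
            (start + PAGE_SIZE < records.length ? '<a href="?page=' + (page + 1) + '">Next</a>' : '')
        );
    }).listen(8080);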

    Which again is why I'd assume this is for a crapplication built with web technologies like nw.js or electron, and NOT for building ACTUAL websites?!?

    Quote Originally Posted by Gonki View Post
    Also comment on the JavaScript animation function of 2DX; there is a demo if you click on the page logo.
    I don't know if I'd call seizure-inducing random flashing boxes in random colours an "animation", but usually that's ALSO the type of garbage I'd NEVER do on a website in the first place, nor would I be doing it in a web crapplet.

    I'm trying to grasp the point of it, since the only real objective your little project seems to have is to tell non-scripting users, users with accessibility needs, and search engines to **** off! Your existing bodyless HTML is accessibility trash, what HTML you do have is outdated nonsense, and what you generate is a sloppy mess. Honestly given what you have, I question if you know enough about HTML, CSS, accessibility, UI design or UX to be playing with JavaScript yet.

    Now, you've written some REALLY impressive scripting, but WHY?!?
    I would rather have questions that can't be answered, than answers that can't be questioned.
    http://www.cutcodedown.com

  12. #12
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post
    Quote Originally Posted by deathshadow View Post
    but WHY?!?
    Flight sim and other toys.

  13. #13
    Senior Coder deathshadow's Avatar
    Join Date
    Feb 2016
    Location
    Keene, NH
    Posts
    2,169
    Thanks
    2
    Thanked 314 Times in 304 Posts
    Quote Originally Posted by Gonki View Post
    Flight sim and other toys.
    Wouldn't / shouldn't that be handled in Canvas/WebGL and not on the DOM?
    I would rather have questions that can't be answered, than answers that can't be questioned.
    http://www.cutcodedown.com

  14. #14
    New Coder
    Join Date
    Jan 2014
    Posts
    12
    Thanks
    1
    Thanked 1 Time in 1 Post
    Quote Originally Posted by deathshadow View Post
    Wouldn't / shouldn't that be handled in Canvas/WebGL and not on the DOM?
    2DX is a database cluster with a DOM-rendering singleton object in web browsers. Those rendering objects are also database instances that can exchange logs and commit transactions. The system logs all UI I/O into a master database on the server. It is a good idea to look into the 2DX-WebGL API; there is one for Angular.


 
