We are about to completely rebuild a client's website; it currently has over 1000 pages.
There will be a cull, but my idea is to dynamically load assets based on what's on each page, and I wanted to get feedback.
Let's say I have 100 global components (carousel, buttons, videos, etc.). Over time we've just put all the JavaScript for all components into one bundle.js file, and the same with the CSS. However, if a page only uses 3 of those 100 components, it seems redundant to include everything.
So I guess my question is: is it wrong to dynamically request only the components used, at runtime, rather than loading all assets every time?
The big downside I can see is that almost every page will request new files, so caching will be harder, and more HTTP requests would have to be made.
But if someone has a better idea, please let me know.
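For what it's worth, here is a rough sketch of the runtime approach being described, assuming a build tool that supports dynamic import() and that each component ships as its own module. The data-component attribute, file layout, and init function are placeholders of mine, not an existing convention:

    // Hypothetical sketch: load only the component modules that actually
    // appear on the current page. Assumes each component module default-exports
    // an init(element) function and the bundler splits these imports into chunks.
    document.querySelectorAll('[data-component]').forEach(function (el) {
      var name = el.dataset.component; // e.g. "carousel", "video"
      import('./components/' + name + '.js')
        .then(function (mod) { mod.default(el); })
        .catch(function (err) { console.error('Could not load component', name, err); });
    });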
Firstly, I suggest an evidence-based approach. Don't do anything without data to back up the decision.
My thoughts on an overall approach. I'm thinking about React as I write this, but nothing is React-specific.
Server-render your content. It will then display to your users without needing your JavaScript bundle.
Get a good CDN and/or something like Varnish and cache each route/page response. You'll get fast response times no matter how big the site is.
Now, when the user visits a page they'll get it quickly, and then you asynchronously download the JavaScript file that will breathe life into the page.
Because the user is already reading your page, you can take your time loading the JS - up to a second or two. If you think most of your users will have decent internet (e.g. they're all in South Korea) then I'd go as big as a 2 MB JS bundle before bothering to do chunking. Opinions will vary; it's up to you. If your users have bad internet (e.g. they're all in North Korea) then every kilobyte counts and you should aim to make the smallest chunks needed for each page - both for speed and to respect the users' download quota.
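To make the "render first, hydrate later" idea concrete, here is a minimal sketch. The per-page chunk name and the data-page attribute are assumptions of mine, not part of any particular framework:

    // Hypothetical: the server renders <body data-page="home"> and each page
    // has a matching pre-built chunk (home.bundle.js, article.bundle.js, ...).
    window.addEventListener('load', function () {
      var script = document.createElement('script');
      script.src = '/js/' + document.body.dataset.page + '.bundle.js';
      script.async = true;
      document.body.appendChild(script); // downloads without blocking the render
    });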
Related
I'm working in an MVC application that has about 10 BIG JavaScript libraries (jQuery, Modernizr, Knockout, Flot, Bootstrap...), about 30 jQuery plugins, and each view (hundreds of them) has its own corresponding JavaScript file.
The default MVC4 bundling is used, but all these JavaScript files are packaged in two bundles: one containing all the plugins and libraries, and one for the view-specific files. And these two bundles are loaded on EVERY page, whether they're needed or not.
Now, they're loaded only the first time the user opens the application, but even minified the two are about 300 KB (way more raw), and the bulk of that code is very specific to certain pages.
So, is it better for the browser to have 2 giant bundles, or to create "smarter" and lighter bundles specific to pages, but have more of them? The browser will cache them regardless the first time they're requested, but is there any performance benefit to loading less JavaScript per page versus having all of it loaded on every page?
If there's any chance of not needing them for a session then it would make sense to split them into smaller bundles. Obviously any bytes that you don't have to send are good bytes ;)
You're right that caching somewhat eliminates this problem - once a bundle has been needed, it can be cached - but if, for example, you have a calendar page and a news page, it's conceivable that someone could not care at all about the calendar, and so it wouldn't make sense to load it.
That said, you can probably go overboard on splitting things up and the overhead caused by creating each new request will add up to more than you save by not loading larger libraries all at once.
The size of an individual file is irrelevant to the browser on its own; the size of the page as a whole is what matters to the user's machine, as it impacts the processor, network, and memory (and how each of those performs will depend somewhat on the browser used).
Many small files will probably give a better-feeling response on slow clients, because each file downloads and executes straight away, versus waiting for one big file to download (and for memory to be allocated to read it) and then executing the scripts.
People will probably suggest going easy on the scripts and plugins if you want a leaner web application.
Concepts like image sprites and JS bundling were invented to minimise the number of HTTP requests. Each HTTP request has overhead and can become a bottleneck, so it's better to have one big bundle than many small bundles.
Having said that, as Grdaneault said, you don't want users to load JS that they won't use.
So the best approach would be to bundle all the common stuff into one, then do separate bundles for uncommon stuff. Possibly bundle per view, depends on your structure. But don't let your bundles overlap (e.g. bundle A has file A & B, bundle B has file A & C), as this will result in duplicate loading.
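The question is about MVC4's bundler, but purely as an illustration of the "common bundle plus per-view bundles, no overlap" layout, here is roughly what it looks like in webpack. Webpack and the folder names are my assumptions, not something the question uses; MVC4's BundleConfig can express the same split:

    // webpack.config.js - illustrative only
    var path = require('path');

    module.exports = {
      entry: {
        calendar: './Scripts/views/calendar.js', // one entry per view
        news: './Scripts/views/news.js',
      },
      output: { filename: '[name].bundle.js', path: path.resolve(__dirname, 'dist') },
      optimization: {
        splitChunks: {
          cacheGroups: {
            // Everything from node_modules (the libraries and plugins) goes into
            // a single shared bundle, so no view bundle duplicates it.
            vendors: {
              test: /[\\/]node_modules[\\/]/,
              name: 'vendors',
              chunks: 'all',
            },
          },
        },
      },
    };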
Though 30 plugins - wow. The initial load is just one of the many issues to sort out. Think carefully about whether you need them all - not everyone will have an environment as performant as yours hopefully is!
So I was pretty interested in the idea of head.js, but the idea doesn't play nicely with the way I have developed my backend.
Basically what I'm trying to do is decide how I should serve my scripts (JS and CSS) to give my service the best performance and the least traffic. I feel like I can replicate a more integrated version of the head.js idea using more of my backend. I'm using Node.js, without any frameworks around it, to serve everything.
Moreover, I have several JavaScript and CSS files. Basically the idea is there's one for the entire site - the header, footer, and reused methods and styles - and then one for each page.
The original idea was to have the page send a GET request to "/js?src=" with a single name like "index"; the server would then send back all the JavaScript needed for that page as one big concatenated script in a single response. The problem is that I'm using Closure Compiler, and it seems like I would run into a lot of issues with this approach, except for the CSS.
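Just to picture that first idea, here is a rough sketch only; the filenames and directory layout are made up, and it skips the Closure Compiler step entirely:

    // Hypothetical: GET /js?src=index responds with the site-wide scripts
    // plus the page's own script, concatenated into one response.
    var http = require('http');
    var fs = require('fs');
    var path = require('path');
    var url = require('url');

    var SHARED = ['header.js', 'footer.js', 'common.js']; // assumed shared files

    http.createServer(function (req, res) {
      var parsed = url.parse(req.url, true);
      if (parsed.pathname !== '/js') { res.writeHead(404); return res.end(); }
      var page = path.basename(String(parsed.query.src || 'index')); // basename() blocks ../ tricks
      var body = SHARED.concat(page + '.js')
        .map(function (f) { return fs.readFileSync(path.join(__dirname, 'js', f), 'utf8'); })
        .join(';\n');
      res.writeHead(200, { 'Content-Type': 'application/javascript' });
      res.end(body);
    }).listen(3000);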
The second idea is to create a loop in the template that generates separate requests for each script - utilizing more of the head.js idea, but moving it to the backend.
The third idea - maybe I'm overthinking at this point - is to create a Closure-compiled version of the scripts for each page, which would include the JavaScript not only for the page but also for the header and footer, so the scripts never conflict. This creates redundant data in each of the files, and poses the issue of not pipelining my assets.
The basic idea of my service is a website that will serve social media content, images, and music in real time. So initial page loading isn't extremely important; however, I want the server to be able to handle a large number of requests quickly. So I'm more focused on the big picture of serving many users than I am on the individual user's experience. What's my best approach?
I'm currently building a portfolio website for an architect that has a whole lot of images on its pages.
Navigation is done with history.js (=AJAX). In order to save loading time and make the whole thing more "snappy", I wrote a script that crawls the page body for links to other pages and fetches these automatically in the background. So far, it works like a charm.
It basically keeps a queue array that holds all the links. A setTimeout() loop works through them and fetches each page using jQuery's $.ajax(). The resulting HTML is stored in a JavaScript object.
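For context, a minimal sketch of what such a prefetcher might look like. This is my reconstruction, not the actual script; the link selector, delay, and cache shape are assumptions:

    // Collect internal links, then fetch them one at a time in the background.
    var queue = [];
    var cache = {};

    $('body a[href^="/"]').each(function () {
      var href = $(this).attr('href');
      if (!cache[href] && queue.indexOf(href) === -1) queue.push(href);
    });

    function fetchNext() {
      if (queue.length === 0) return;
      var href = queue.shift();
      $.ajax({ url: href, dataType: 'html' })
        .done(function (html) { cache[href] = html; })        // keep the fetched HTML
        .always(function () { setTimeout(fetchNext, 250); }); // small pause between requests
    }

    setTimeout(fetchNext, 0);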
Now, here's my question:
What are possible problems that might occur when using this on different machines/browsers/operation systems?
I'm thinking about:
max. JavaScript object/variable size (the fetched HTML is stored in a JavaScript object)
possible performance problems
max. number of asynchronous requests?
… anything you can think of?
Thanks a lot in advance,
a hobby programmer
Although it might be a good idea to cache the whole website on the client side, there are a lot of things that can cause issues:
Memory
Unnecessary load on the webserver
Loading unneeded pages into memory
Some users have a data cap on their internet connection, so loading the entire website is not smart in those cases
Once the user navigates away or refreshes, the entire "cache" is gone
What I would do is first try to optimize the server side.
Add caching mechanisms at every layer from the database to the user; the "Expires" header can really help you.
And if that doesn't help, I would then think about caching some pages (which ones are for you to decide) in the offline cache - see the HTML5 offline features.
That way you are safe even on page reload, keep memory to a minimum, and only load what you need.
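As a rough illustration of the "Expires" suggestion (plain Node is shown only because it needs no framework; the paths and the one-year lifetime are my assumptions):

    // Serve a static asset with far-future caching headers so the browser
    // keeps it and repeat visits don't hit the server at all.
    var http = require('http');
    var fs = require('fs');

    http.createServer(function (req, res) {
      var body = fs.readFileSync('./public/app.js');
      res.writeHead(200, {
        'Content-Type': 'application/javascript',
        'Cache-Control': 'public, max-age=31536000',                     // one year
        'Expires': new Date(Date.now() + 31536000 * 1000).toUTCString(),
      });
      res.end(body);
    }).listen(8080);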
PS: Don't try to reinvent stuff that the browser already has :P
You should queue the async requests, and only launch one at a time.
And since you're storing everything in variables, at some point you (the browser) may consume too much memory, and the whole thing can become very slow. I suggest you limit the size of your cache to a certain number of pages.
You can also try not to store the fetched content at all - just fetch it and throw it away. The browser will still cache the fetched pages and images in its internal storage, so subsequent loads will be much faster (provided, of course, that the AJAX library doesn't forcibly disable the cache, e.g. by using POST).
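A sketch of the suggested cap (the limit of 20 pages is arbitrary, and the variable names are mine):

    // Keep only the N most recently fetched pages; evict the oldest beyond that.
    var MAX_PAGES = 20;
    var cacheOrder = []; // hrefs, oldest first
    var cache = {};

    function storePage(href, html) {
      if (!cache[href]) cacheOrder.push(href);
      cache[href] = html;
      while (cacheOrder.length > MAX_PAGES) {
        delete cache[cacheOrder.shift()]; // drop the oldest entry
      }
    }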
Sites like Facebook use "lazy" loading of js.
You would have to take into consideration that I have one server with heavy traffic.
I'm interested - which one is better?
When I make more HTTP requests at once, page loading is slower (because of the browser's limit of around 2 parallel requests per host).
When I make one HTTP request with all the code, traffic (in GB) goes up and the Apache processes get a little more rest, but we still get slower page loading.
Which is faster in the end?
Fewer requests! It's the reason why we combine JS files, combine CSS files, use image sprites, etc. You see, the problem on the web is not the speed of processing on the server or in the browser. The biggest bottleneck is latency! Look up Steve Souders' talks.
It really depends on the situation, device, audience, and internet connection.
Mobile devices, for example, need as few HTTP requests as possible, as they are on slower connections and every round trip takes longer. You should go as far as inlining (base64) images inside the CSS files.
Generally, I compress the main platform and the JS libs + CSS into one file each, which are cached on a CDN. JavaScript or CSS functionality that's only on one page I'll either inline or include in its own file. JS functionality that isn't important right away I'll move to the bottom of the page. For all files, I set a far-future Expires header so they stay in the browser cache forever (or until I update them or they get bumped out when the cache fills).
Additionally, to get around download limits you can have CNAMEs like images.yourcdn.com and scripts.yourcdn.com so that the user can download more files in parallel. Even if you don't use a CDN, you should host your static media on a separate hostname (it can point to the same box) so that the user isn't sending cookies when they don't need to. This sounds like overkill, but cookies can easily add an extra 4-8 KB to every request.
In a development environment, you should be working with everything uncompressed and as individual files - no need to move every plugin into one script, for example; that's hard to maintain when there are updates. You should have a script that merges files before testing and deployment. This sounds like a lot of work, but it's something you do once and can reuse for all future projects.
TL;DR: It depends, but generally a mixture of both is appropriate. Except for mobile, where fewer HTTP requests are always better.
The problem is a bit more nuanced than that.
If you put your script tags anywhere but at the bottom of the page, you are going to slow down page rendering, since the browser isn't able to do much when it hits a script tag other than download it and execute it. So if the script tag is in the header, that will happen before anything else, which leads to users sitting there staring at a white screen until everything downloads.
The "right" way is to put everything at the bottom. That way, the page renders as assets are downloaded, and the last step is to apply behavior.
But what happens if you have a ton of JavaScript? (In Facebook's case, about a megabyte.) What you get is a page that renders but is completely unusable until the JS comes down.
At that point, you need to look at what you have and start splitting it between vital and non-vital JS. That way you can take a multi-stage approach, quickly bringing in the stuff that is necessary for the page to function at a bare minimum level, and then loading the less essential stuff afterwards, or even on demand.
Generally, you will know when you get there, at that point you need to look at more advanced techniques like script loaders. Before that, the answer is always "less http requests".
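To make the vital/non-vital split concrete, here is one rough way to stage it. The filenames and the two-second delay are placeholders, and a proper script loader gives you much more control than this:

    // Load the essential bundle first; pull in the nice-to-have scripts
    // only after the page is already usable.
    function loadScript(src, onload) {
      var s = document.createElement('script');
      s.src = src;
      s.async = true;
      if (onload) s.onload = onload;
      document.body.appendChild(s);
    }

    loadScript('/js/core.min.js', function () {
      // Core behaviour is in place; fetch the rest a moment later.
      setTimeout(function () {
        loadScript('/js/comments.min.js');
        loadScript('/js/analytics.min.js');
      }, 2000);
    });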
I have a webpage that calls Oracle, then does some processing, and then runs a lot of JavaScript.
The problem is that all of this makes it slow for the user. I have to use Internet Explorer 6, so the JavaScript takes a very long time to run - around 15 seconds.
How can I make my server do all of this every minute, for example, and save the page, so that if a user requests it they are served the version where everything is already calculated?
I'm using a Tomcat server; my webpage is mainly JavaScript and HTML.
Edit:
By the way, I cannot rewrite my webpage; it would have to remain as it is.
I'm looking for something that would give the user a snapshot of the webpage that the server loaded.
YSlow recommendations would tell you that you should put all your CSS in the head of your page and all JavaScript at the bottom, just before the closing body tag. This will allow the page to fully load the DOM and render it.
You should also minify and compress your JavaScript to reduce download size.
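For the minification step, one option looks like this. The terser package is just an example of a minifier I'm picking for illustration, not something the answer prescribes, and the file paths are made up:

    // Minify a source file ahead of deployment.
    var fs = require('fs');
    var terser = require('terser');

    var source = fs.readFileSync('./js/app.js', 'utf8');
    terser.minify(source).then(function (result) {
      fs.writeFileSync('./js/app.min.js', result.code); // serve this, gzipped, in production
    });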
To do that, you'd need to have your server build up the DOM, run the JavaScript in an environment that looks (enough) like a web browser, and then serialize the result as HTML.
There have been various attempts to do that, Jaxer is one of them (it was originally a product from Aptana, now an Apache project). Another related answer here on SO pointed to the jsdom project, which is a DOM implementation in JavaScript (video here).
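As a very rough sketch of that approach with jsdom (using jsdom's current API, which has changed since this was written; the timeout and file names are assumptions):

    // Load the page, let its scripts run server-side, then snapshot the DOM.
    var fs = require('fs');
    var JSDOM = require('jsdom').JSDOM;

    var html = fs.readFileSync('./page.html', 'utf8');
    var dom = new JSDOM(html, {
      runScripts: 'dangerously', // execute the page's own scripts
      resources: 'usable'        // allow external scripts/styles to load
    });

    // Give the scripts a few seconds to finish, then serialize the result.
    setTimeout(function () {
      fs.writeFileSync('./snapshot.html', dom.serialize());
    }, 5000);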
Re
By the way, I cannot rewrite my webpage; it would have to remain as it is.
That's very unlikely to be successful. There is bound to be some modification involved. At the very least, you're going to have to tell your server-side framework what parts it should process and what parts should be left to the client (e.g., user-interaction code).
Edit:
You might also look for "website thumbnail" services like shrinktheweb.com and similar. Their "pro" account allows full-size thumbnails (what I don't know is whether it's an image or HTML). But I'm not specifically suggesting them, just a line you might pursue. If you can find a project that does thumbnails, you may be able to adapt it to do what you want.
But again, take a look at Jaxer, you may find that it does what you need or very similar (and it's open-source, so you can modify it or extract the bits you want).
"How can i make my server do all of this every minute for example"
If you are asking how you can make your database server 'pre-run' a query, then look into materialized views.
If the Oracle query is responsible for (for example) 10 seconds of the delay, there may be other things you can do to speed it up, but we'd need a lot more information on what the query does.