#11453 closed feature request (wontfix)
Use compression for CSS and JS file output
Reported by: | micasuh | Owned by: | |
---|---|---|---|
Milestone: | | Priority: | normal |
Severity: | normal | Version: | 2.9 |
Component: | Optimization | Keywords: | css, js, minify, compression, speed |
Focuses: | | Cc: | |
Description
2010 will bring new awareness to page speed, since Google is going to release its newest version of search, called Caffeine. Google's Matt Cutts says Caffeine will bring a new algorithm and a new focus on page speed that will affect rankings.
I propose WordPress use YUI Compressor or a similar package to automatically compress the CSS and JS files called by a webpage, and cache these files for visitors to use (a sample invocation appears at the end of this description). I'd even propose that it scan CSS and JS files inserted into wp_head and compress those, but only if this is practical.
If nothing else, can we add a dynamic textbox that uses an online YUI Compressor to the Appearance > Editor section of the Dashboard? An on-the-fly compression input/output would allow developers to do everything from within the WordPress Dashboard.
WordPress should be at the forefront of website optimization, and minifying would be a huge benefit to all WordPress users.
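For reference, YUI Compressor is typically run offline as a build step; a sketch of the invocations (the jar version here is illustrative):

```
java -jar yuicompressor-2.4.2.jar --type css --charset utf-8 -o style.min.css style.css
java -jar yuicompressor-2.4.2.jar --type js -o script.min.js script.js
```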
Change History (25)
#2 @ 15 years ago
Awesome! Glad to see the admin-side is already doing this.
Is it possible for client-side output to use the same feature? I realize that theme development can be continuous (thus CSS and JS files can change from time to time), but a built-in or referenced compressor, with accompanying cache files served to a site's visitors, would increase page speed and optimization. Obviously, rounding this out with gzip compression would be optimal.
#3 follow-up: ↓ 4 @ 15 years ago
Being in the middle of rewriting my cache plugin at the moment, a couple of remarks on the idea:
- mod_deflate is, imo, better than a PHP ob_gzhandler, as discussed in #10365 (see the config sketch at the end of this comment)
- Some assets (JS and CSS files) have arguments or are dynamic (e.g., cookies for comments) and should not be concatenated. Retrieving assets that aren't inserted using the dependencies API is messy, to say the least. In addition, they generally have no version and/or are potentially dynamic, so they should not get concatenated either.
- Minifying is overrated, especially when it's done on the fly. The whole idea is to reduce the file size, but a minified, gzipped asset is the same size (give or take) as a non-minified, gzipped asset. Minifying on the fly additionally introduces a lag, which can then lead to concurrent write problems (see below).
- Placing the concatenated assets on a Content Delivery Network (I'm sure someone will bring that up too) is overrated for a similar reason: placing the file on a CDN on the fly is an extremely messy business, and it introduces further lags as the page is rendered -- since it needs to be completed before the script or style tag is output.
- Generating concatenated page assets introduces all sorts of possibilities for concurrent writes, which means we then need to deal with file lock-related problems. This can significantly slow down concurrent page loads. (Sites that are spread over multiple servers are actually easier to manage here, since the assets can be placed in memcached.)
- Removing concatenated page assets that are no longer used is an extremely messy business.
- There needs to be a non-concatenated mode in the event a user is in the process of editing his site's JS and CSS files; it should also be turned on by default, so as to avoid support problems ("why can't I see my changes?").
- Lastly, I suspect that the focus on speed is overhyped. Intuitively, Google wants to eliminate spammy sites that consistently load in several seconds because they're aggregating data from all over the place while they load. I honestly doubt that they're going to penalize WP installs overnight -- especially considering that most of them load faster than Yahoo! Finance.
In short, I suggest we close this as plugin material for now, and revisit the idea in a few years when the dependencies API is rock solid.
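For reference on the mod_deflate point above, a minimal sketch of the Apache configuration involved (assumes mod_deflate is enabled; the MIME type list is illustrative):

```
<IfModule mod_deflate.c>
    # Compress text assets at the server level instead of via PHP's ob_gzhandler.
    AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
</IfModule>
```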
#4 in reply to: ↑ 3 ; follow-up: ↓ 5 @ 15 years ago
Thanks for the great feedback. My knowledge is not as extensive as yours, but I have a few comments.
Replying to Denis-de-Bernardy:
- Minifying is overrated, especially when it's done on the fly. The whole idea is to reduce the file size, but a minified, gzipped asset is the same size (give or take) as a non-minified, gzipped asset. Minifying on the fly additionally introduces a lag, which can then lead to concurrent write problems (see below).
I disagree that minifying is overrated, but I won't sit here and claim it's going to reduce a website's size by 80 or 90%.
Minifying CSS and JS, backed up by a recent report, has a double-digit percentage effect on load time. If we were talking about a 1-9% effect, I would agree with your argument.
When I referred to on-the-fly compression, I was only referring to using the online YUI Compressor. I agree that having a continuous, on-the-fly compressor in WordPress would be hazardous and would introduce a new lag that isn't needed.
CSS and JS compression could be handled in one of two ways (a sketch follows the list):
- As a button you could press in the Editor to create compressed files that are immediately cached. These cached files would then become the client-side files downloaded by visitors.
- When editing or saving CSS/JS in the Editor, you could choose an option to compress and cache a new version of these files and deliver them to client-side requests.
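A minimal sketch of what such a compress-and-cache step might look like (the function name is hypothetical, and the regexes are a crude stand-in for a real minifier such as YUI Compressor):

```php
<?php
// Hypothetical sketch: when a stylesheet is saved in the Editor, write a
// minified copy alongside it; the minified copy is what gets served to
// visitors instead of the raw source file.
function minify_css_on_save( $css_path ) {
	$css = file_get_contents( $css_path );
	$css = preg_replace( '!/\*.*?\*/!s', '', $css ); // strip /* comments */
	$css = preg_replace( '/\s+/', ' ', $css );       // collapse whitespace
	$min_path = preg_replace( '/\.css$/', '.min.css', $css_path );
	file_put_contents( $min_path, trim( $css ) );
	return $min_path;
}
```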
- Placing the concatenated assets on a Content Delivery Network (I'm sure someone will bring that up too) is overrated for a similar reason: placing the file on a CDN on the fly is an extremely messy business, and it introduces further lags as the page is rendered -- since it needs to be completed before the script or style tag is output.
CDNs, while theoretically sound, are too immature as a technology to rely on for offloading requests. I agree with you on this one.
- Lastly, I suspect that the focus on speed is overhyped. Intuitively, Google wants to eliminate spammy sites that consistently load in several seconds because they're aggregating data from all over the place while they load. I honestly doubt that they're going to penalize WP installs overnight -- especially considering that most of them load faster than Yahoo! Finance.
I suspect we won't see Google's Caffeine algorithm (and similar changes on competing search engines) immediately affect search results. But, as I keep hearing from Matt Cutts, Google's official blog, and SEO websites, these changes could have a great impact on results. Thus, WP sites that use tens of plugins are going to suffer lags and slow page loads due to increased HTTP and PHP requests.
While I know WordPress is still a blogging system at heart, it's increasingly being used as a CMS by folks like me, and I realize how much bloat tons of plugins can add, decreasing page speed and rendering times.
WordPress can continue its dominance against even major CMS competitors by taking these issues seriously and promoting best practices, with integration for speed and reliability. It's for these reasons that I hope this doesn't just get dismissed as plugin material or deferred for future consideration.
#5 in reply to: ↑ 4 @ 15 years ago
Replying to micasuh:
I disagree that minifying is overrated, but I won't sit here and claim it's going to reduce a website's size by 80 or 90%.
Minifying CSS and JS, backed up by a recent report, has a double-digit percentage effect on load time. If we were talking about a 1-9% effect, I would agree with your argument.
Right. And based on the above URL, from slowest to fastest: no-minify (100%), minify (~45%), no-minify+gzip (~30%), minify+gzip (~17%). Note that his sample is a rather large file, too.
At any rate, the author achieves a 70% saving with little effort (gzipping alone: 100% down to ~30%). Is the extra ~13-point saving from minifying before gzipping (~30% down to ~17%) truly worth the effort? To me it seems a bit of overkill.
#6 @ 15 years ago
Also, there is this point to keep in mind if we were to use something PHP-based:
http://minify.googlecode.com/svn/tags/release_2.1.3/min/lib/Minify/CSS/Compressor.php
```
 * This is a heavy regex-based removal of whitespace, unnecessary
 * comments and tokens, and some CSS value minimization, where practical.
 * Many steps have been taken to avoid breaking comment-based hacks,
 * including the ie5/mac filter (and its inversion), but expect tricky
 * hacks involving comment tokens in 'content' value strings to break
 * minimization badly. A test suite is available.
```
#8 follow-up: ↓ 9 @ 15 years ago
Denis-de-Bernardy is correct about minification if it consumes a lot of CPU; we have to analyze the cost-benefit. At least on hosts like Dreamhost, lots of minification could push high-volume sites into mandatory private virtual host mode.
Re: concatenation -- will Google's scheme include the number of HTTP requests in the speed factor? If not, then concatenation has zero value for this purpose. Concatenation is only a good idea if you have static JS or CSS files. If your concatenated file includes dynamic JS or CSS content, or changes from page to page, then you've defeated the advantage of concatenation, because the browser cannot cache it (see the sketch at the end of this comment).
A CDN makes a lot of sense for static files on a high-traffic site, but only to relieve load on the primary host. You'd have to get tons and tons of hits for that to matter, and at that point you probably need professional services that put you in a different league.
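On the caching point: the usual way to keep a concatenated file cacheable is to derive its URL from the files and versions it contains, so the URL only changes when the contents do. A hypothetical sketch:

```php
<?php
// Hypothetical sketch: derive the concatenated bundle's URL from its
// contents, so the URL (and the browser's cached copy) only changes
// when a file or version in the bundle changes.
function concat_bundle_url( array $handles, array $versions ) {
	$key = md5( implode( ',', $handles ) . '|' . implode( ',', $versions ) );
	return content_url( "cache/bundle-$key.js" );
}
```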
#9 in reply to: ↑ 8 ; follow-up: ↓ 10 @ 15 years ago
Replying to novasource:
Denis-de-Bernardy is correct about minification if it consumes a lot of CPU; we have to analyze the cost-benefit. At least on hosts like Dreamhost, lots of minification could push high-volume sites into mandatory private virtual host mode.
This is mostly true if visitors are downloading on-the-fly minified files, and I don't think anyone is recommending that. While minifying files can increase CPU load, done right the increase should be very minimal.
For minifying to be a realistic option, client-side website visitors should only be downloading cached CSS and JS files that have been minified/gzipped.
I know there are other factors involved, like someone who edits CSS and JS files in an external or desktop editor. I often use Dreamweaver/Coda/Textmate/Notepad++ to make CSS changes. So when would WordPress minify these files? Maybe there could be a switch to turn off caching and minification while a developer is coding the site, which again would not increase CPU load (a sketch follows).
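The "switch" idea already has a precedent in core: the SCRIPT_DEBUG constant serves unminified copies of bundled scripts when set. A sketch of the same pattern applied to theme assets (the minified-path helper is hypothetical):

```php
<?php
// Sketch: serve the raw file while a developer is editing, the cached
// minified copy otherwise, following core's SCRIPT_DEBUG convention.
// get_minified_stylesheet_uri() is hypothetical.
$developing = defined( 'SCRIPT_DEBUG' ) && SCRIPT_DEBUG;
$style_url  = $developing ? get_stylesheet_uri() : get_minified_stylesheet_uri();
wp_enqueue_style( 'theme-style', $style_url );
```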
#10 in reply to: ↑ 9 @ 15 years ago
Replying to novasource:
Re: concatenation -- will Google's scheme include the number of HTTP requests in the speed factor?
Reducing the number of assets that are downloaded per page load significantly adds to the end user's experience. It makes the page load faster.
(Also, the primary benefit of placing static assets on a CDN that lives on a separate domain name is to place these assets closer to the end user, without the browser sending cookie headers.)
Replying to micasuh:
Replying to novasource:
Denis-de-Bernardy is correct about minification if it consumes a lot of CPU; we have to analyze the cost-benefit. At least on hosts like Dreamhost, lots of minification could push high-volume sites into mandatory private virtual host mode.
This is mostly true if visitors are downloading on-the-fly minified files, and I don't think anyone is recommending that. While minifying files can increase CPU load, done right the increase should be very minimal.
The main point I'd like you to understand is that asset concatenation, followed by conditional gzipping using mod_deflate, is low-hanging fruit: fast to generate and easy to implement. The other two (PHP-level gzipping and minifying) are much less so, and they're prone to introducing more trouble than benefits.
#11 @ 15 years ago
- Cc dalmaatieur added
- Keywords speed added
- Milestone changed from 3.0 to 2.9.1
- Priority changed from normal to highest omg bbq
- Severity changed from normal to critical
I get the impression that the many complaints about the notoriously slow speed of WordPress blogs have always been underestimated. Even with WP cache active, my website hosting blog http://get-website-hosting.com/ is still slow. And now the situation is getting even worse, because Google is starting to "penalize" slow sites. Is this the beginning of the end for WordPress? Wake up, please!
#12 @ 15 years ago
- Milestone changed from 2.9.1 to Future Release
- Priority changed from highest omg bbq to normal
- Severity changed from critical to normal
Priority and Severity reset. Please leave these for core committers to manage on feature requests.
Milestone set to Future Release until a decision has been made about inclusion and a patch is available.
#13 follow-up: ↓ 14 @ 15 years ago
I've never seen Google loading my CSS and JS files, and I strongly doubt a speed-rating mechanism will rate their downloads in the future.
#14 in reply to: ↑ 13 ; follow-up: ↓ 15 @ 15 years ago
Replying to Denis-de-Bernardy:
(PHP-level gzipping and minifying) are much less so, and they're prone to introducing more trouble than benefits.
Good point. Like novasource said, we should analyze the cost-benefit of including it or not. Are there many known problems with using minification scripts like YUI Compressor?
I'd guess that most WordPress websites contain only a small number of static CSS and JS files. Thus, there's no increased CPU load if each is minified once and then cached to serve to end users (a sketch follows). While I can't predict what kind of problems this would introduce, it seems that user error would account for the big majority of them.
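A sketch of the "minify once, then cache" idea; the mtime check is what keeps the steady-state CPU cost at zero (function names are hypothetical):

```php
<?php
// Hypothetical sketch: regenerate the cached minified copy only when the
// source file is newer, so visitor requests never trigger minification.
function maybe_regenerate_minified( $src, $dst ) {
	if ( ! file_exists( $dst ) || filemtime( $dst ) < filemtime( $src ) ) {
		minify_to( $src, $dst ); // hypothetical wrapper around, e.g., YUI Compressor
	}
	return $dst;
}
```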
Replying to hakre:
I've never seen Google loading my CSS and JS files, and I strongly doubt a speed-rating mechanism will rate their downloads in the future.
Google's new algorithm, nicknamed Caffeine, will start rolling out in the new year. Included in this newer, rebuilt search system is attention to page speed, as indicated in the link in the ticket description.
I agree that Google spiders or bots probably won't rate download speeds of individual files. It is, however, very possible and likely that Google will take into account the total load time and rendering speed of the sites it visits when considering rankings. If not this method, it will use a similar one that Matt Cutts hinted at in the video.
And while I love dalmaatieur's enthusiasm, I really do hope we can target WordPress 3.0 in the long term. I'd like to change the milestone back to 3.0 if no one opposes.
#15 in reply to: ↑ 14 ; follow-up: ↓ 18 @ 15 years ago
Replying to micasuh:
Replying to Denis-de-Bernardy:
(PHP-level gzipping and minifying) are much less so, and they're prone to introducing more trouble than benefits.
Good point. Like novasource said, we should analyze the cost-benefit of including it or not. Are there many known problems with using minification scripts like YUI Compressor?
I'd guess that most WordPress websites contain only a small number of static CSS and JS files.
It ultimately depends on the number of plugins you use. My own site contains a dozen such files for each.
Replying to hakre:
I've never seen Google loading my CSS and JS files, and I strongly doubt a speed-rating mechanism will rate their downloads in the future.
Google's new algorithm, nicknamed Caffeine, will start rolling out in the new year. Included in this newer, rebuilt search system is attention to page speed, as indicated in the link in the ticket description.
If you're so worried, here's your chance to use the Semiologic Cache or the Total Cache plugin. Both do a good job at pruning everything and trimming the daylights out of server requests (I'd argue that the first does a better job than the second, but I'm obviously biased).
I agree that Google spiders or bots probably won't rate download speeds of individual files.
This, I think, is where you're wrong: it will be of the utmost importance if speed is the question. In particular your main page, rather than its assets, since that determines how long it takes to display something.
#16 follow-up: ↓ 19 @ 15 years ago
- Resolution set to wontfix
- Status changed from new to closed
Maybe we should introduce the term SEO-kiddies, in the same vein as script kiddies. I've taken the liberty of resolving this non-issue as wontfix.
At least for the moment this is all speculation, and it's a fact that this can be solved as plugin material.
#18 in reply to: ↑ 15 @ 15 years ago
hakre, please don't close this while we're having a big discussion. The problem with dismissing this as a plugin is that it defeats the purpose of the whole idea.
Replying to Denis-de-Bernardy:
Replying to micasuh:
I'd guess that most WordPress websites contain only a small number of static CSS and JS files.
It ultimately depends on the number of plugins you use. My own site contains a dozen such files for each.
Yes, and each site contains a variable number of files. We can't simply blanket everyone in the same category, since it's impossible to know everything about all installations. I run tens of different WordPress installs, and many have only 3-5 such files at most, even with plugins.
Replying to hakre:
I've never seen Google loading my CSS and JS files, and I strongly doubt a speed-rating mechanism will rate their downloads in the future.
Google's new algorithm, nicknamed Caffeine, will start rolling out in the new year. Included in this newer, rebuilt search system is attention to page speed, as indicated in the link in the ticket description.
If you're so worried, here's your chance to use the Semiologic Cache or the Total Cache plugin. Both do a good job at pruning everything and trimming the daylights out of server requests (I'd argue that the first does a better job than the second, but I'm obviously biased).
Using another plugin, which obviously means more external PHP requests, is the antithesis of this idea. The purpose of building these features into WordPress is to bypass the use of plugins, reduce the rendering load, and bring mainstream processes and forward-thinking utilities to the developer. Plugins are great, but essential processes such as minifying and compression should not be dismissed as plugin material or treated as second-rate.
I agree that Google spiders or bots probably won't rate download speeds of individual files.
This, I think, is where you're wrong: it will be of the utmost importance if speed is the question. In particular your main page, rather than its assets, since that determines how long it takes to display something.
Okay, point taken. But this is just another argument for including this fundamental idea in WordPress from the get-go. I think it's naive to undervalue minification and/or compression and what they do for CMSes such as WordPress.
#19 in reply to: ↑ 16 ; follow-up: ↓ 20 @ 15 years ago
Replying to hakre:
Maybe we should introduce the term SEO-kiddies, in the same vein as script kiddies. I've taken the liberty of resolving this non-issue as wontfix.
At least for the moment this is all speculation, and it's a fact that this can be solved as plugin material.
While I am concerned about SEO, the bigger issue here is about making a wonderful system like WordPress even better. As far as I'm concerned, however, what Matt Cutts says isn't speculation; his word is straight from the horse's mouth. What IS speculation is the how, not the what.
The point of fundamentally including this kind of feature is to make the internet a better place -- more efficient, faster, and forward-thinking. Gzipping would be a great start, but minification is the icing on the cake.
I would hope that considering these ideas is a decision of the whole, not of one individual. If there's a larger discussion to be had, I'd be happy to involve myself, not as an SEO-kiddie but as a proponent of good web practices.
BTW, script kiddies use poorly written patches and 3rd party plugins to solve problems. I am against this idea! ;-)
#20 in reply to: ↑ 19 ; follow-up: ↓ 21 @ 15 years ago
Replying to micasuh:
hakre, please don't close this while we're having a big discussion. The problem with dismissing this as a plugin is that it defeats the purpose of the whole idea.
For important discussions, please use the wpdevel mailing list; that's what it was made for. Until you come to concrete conclusions, please do not use Trac as a message board. That only blocks important changes.
Replying to micasuh:
While I am concerned about SEO, the bigger issue here is about making a wonderful system like WordPress even better.
We're all in here to make things better. If you really care and you're in the mood for an eloquent talk, share your thoughts on the wpdevel mailing list. Thanks.
#21 in reply to: ↑ 20 @ 15 years ago
Replying to hakre:
Replying to micasuh:
hakre, please don't close this while we're having a big discussion. The problem with dismissing this as a plugin is that it defeats the purpose of the whole idea.
For important discussions, please use the wpdevel mailing list; that's what it was made for. Until you come to concrete conclusions, please do not use Trac as a message board. That only blocks important changes.
This shouldn't be closed. It's legit to keep this open, especially while this good discussion is happening.
#22 @ 15 years ago
Like hakre said, Trac is not a message board. Either provide a working patch to base discussions on, or start a new thread on wp-hackers.
Will leave this open for now.
#23 @ 15 years ago
After some days have passed, I suggest closing this as wontfix for the moment, until more information is available.
WordPress core JS/CSS files are already minified, and served in a single gzip-compressed stream (if supported by the browser) in 2.9.
See #8628 & #10664
Only core files are supported, due to the implementation (paths are hard-coded in the JS/CSS provider for known types).
I think the output file is cached, but I'm not sure. Oh, and this is only for the admin side, too.
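For reference, the admin-side concatenation is visible in the URLs core emits, something along these lines (handles and version illustrative; c=1 requests the compressed stream):

```
/wp-admin/load-scripts.php?c=1&load=jquery,utils,common&ver=20091224
```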