This repository has been archived by the owner on Apr 21, 2023. It is now read-only.

Response with .gz.css file is Transfer-Encoded twice #646

Closed
GoogleCodeExporter opened this issue Apr 6, 2015 · 30 comments

Comments

@GoogleCodeExporter

What steps will reproduce the problem?
1. Enable filter combine_css,rewrite_css,inline_css
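In Apache terms (using the ModPagespeedEnableFilters directive, as it appears
later in this thread), that corresponds to:

        ModPagespeedEnableFilters combine_css,rewrite_css,inline_css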

What is the expected output? What do you see instead?
The browser reported a content encoding error, and the URL can't be parsed by
pagespeed.

What version of the product are you using (please check X-Mod-Pagespeed header)?
1.3.25.3-2556

On what operating system?
CentOS 6.4 x86_64

Which version of Apache?
Apache/2.2.15

Which MPM?
prefork

URL of broken page:
http://cdn.lintas.me/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.pUN0xmIPte.css
(it cannot be viewed at this time; my pages are broken and show a content
encoding error when I access that URL)

I think there are two commas (,,) with no data between them, and the URL
contains .gz right before the double comma with empty data. IMHO

Original issue reported on code.google.com by dewangg...@xtremenitro.org on 14 Mar 2013 at 5:32

@GoogleCodeExporter

It seems to work for me; is this still producing a content encoding error for
you?

Original comment by sligocki@google.com on 14 Mar 2013 at 6:27

  • Changed state: RequestClarification

@GoogleCodeExporter

Hi sligocki,

Thanks for your response. Yes, I still get the content encoding error; you can
check my sandbox here: http://mps.lintas.me

You can also check my new sandbox URL here:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.AgPUZT9SVx.css


Original comment by dewangg...@xtremenitro.org on 14 Mar 2013 at 11:52


@GoogleCodeExporter

Hello there,

Just an update: the error occurs if I enable these filters:

        ModPagespeedEnableFilters combine_css,rewrite_css
        ModPagespeedEnableFilters inline_css
        ModPagespeedEnableFilters outline_css

Now you can check mps.lintas.me and compare it with www.lintas.me, which
doesn't use the filters above.

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 3:41

@GoogleCodeExporter

Here is my sandbox environment, showing the broken pages ...

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 9:54


@GoogleCodeExporter

I can confirm the following:
1. Fetching the reported URL in wget works fine.
2. Fetching the reported URL in Chrome results in a 330 error.
3. Fetching the reported URL in Firefox works fine.
4. Fetching each of the 3 components of the URL in Chrome works fine.

Ergo it would appear that our combined CSS is faulty in some way that Chrome 
doesn't like.
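For anyone trying to reproduce this, the per-client difference can be
approximated from the command line (a sketch; curl's --raw flag disables its
transfer decoding):

<pre>
# Fetch as wget does, with no Accept-Encoding header (works):
curl -sv -o /dev/null 'http://cdn.lintas.me/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.pUN0xmIPte.css'

# Fetch the raw bytes as a gzip-accepting browser receives them:
curl -s --raw -H 'Accept-Encoding: gzip' 'http://cdn.lintas.me/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.pUN0xmIPte.css' | xxd | head
</pre>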

Original comment by matterb...@google.com on 15 Mar 2013 at 1:53

  • Changed state: New

@GoogleCodeExporter

Hmmm... I'm using Firefox too. If I access the URL directly it looks normal,
but if I access it from my sandbox at http://mps.lintas.me my pages are broken,
and when I access the URL from the page source, the error occurs.

Any hints, guys?

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 4:51

@GoogleCodeExporter

If you haven't already I suggest disabling combine_css for now until we work 
out what's going on. From my tests I expect that to fix it.

Original comment by matterb...@google.com on 15 Mar 2013 at 5:14

@GoogleCodeExporter

Hi,

I've disabled combine_css and still get the error; the new rewritten URL looks
like this:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,qv=1011.pagespeed.cf.77RTnFMsuL.css

I think the error occurs when I enable inline_css + outline_css; let me try
enabling and/or disabling those filters one by one.

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 5:25

@GoogleCodeExporter

Here are the new results from my sandbox, testing a few rules related to CSS
rewriting.

The suspect is the CSS rewriting, because if I don't enable any CSS rewrite
filter, the pages are fine and normal. I've disabled the CSS rewriting filters
on my live pages at www.lintas.me; mps.lintas.me is only for sandbox and
testing.

Filter: rewrite_css
<pre>
Orig URL: http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011
Result: FAIL
URL:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,qv=1011.pagespeed.cf.KdY8IQ67pm.css
</pre>

Filter: rewrite_css, combine_css
<pre>
Orig URL:
1. http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011
2. http://i.brta.in/style/newlintasme_style/basic-jquery-slider.css
3. http://i.brta.in/style/newlintasme_style/home.css

Result: FAIL
URL:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.AgPUZT9SVx.css
</pre>

Filter: rewrite_css, combine_css, inline_css
The requested URL and the resulting URL are the same as with the rewrite_css +
combine_css filters only.

Filter: rewrite_css, combine_css, inline_css, outline_css
The requested URL and the resulting URL are the same as with the rewrite_css +
combine_css filters only.

Now mps.lintas.me is using the rewrite_css, combine_css, inline_css filters.
Thank you for your support, guys.

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 5:44

@GoogleCodeExporter

For comparison: rewrite_css, combine_css and inline_css work normally on my new
version. You can switch the page version by appending /switch to the URL. By
default, the pages show the old version; to switch to the new site, visit
http://mps.lintas.me/switch

On the new version, the filters work normally and the pages are OK. I think
pagespeed can't parse something in my CSS.

On the new version, the rewritten CSS URL looks like this:
http://lmecdn.antituhan.com/style/newlintasme_style/A.global.css+facebox.css,Mcc.YH8VaviiPo.css.pagespeed.cf.Mz79xXIKCa.css

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 5:50

@GoogleCodeExporter

This is very strange; it seems like some sort of Transfer-Encoding: chunked
issue. I've attached a raw netcat pull of your site using "Accept-Encoding:
gzip".

@Matt, I think the reason this works in wget but not in Chrome or Firefox is
that the problem only comes up with "Transfer-Encoding: chunked" or maybe
"Content-Encoding: gzip", which the servers don't send unless you explicitly
send "Accept-Encoding: gzip".

When I send the attached file directly to the browser, it fails with the same
330 error. But if I manually de-chunk it (or fetch with curl, which de-chunks)
and send the result to the browser, that seems to work fine. I'm very confused
about what's going on here.

Original comment by sligocki@google.com on 15 Mar 2013 at 7:39


@GoogleCodeExporter

I noticed in Chrome that "Transfer-Encoding: chunked" appears twice.  I haven't 
seen that before.  Maybe Chrome dislikes it?  I'm surprised we are putting that 
in.
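For illustration, the doubled header would look something like this in the raw
response (a hypothetical reconstruction, not a capture from the site):

<pre>
HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip
Transfer-Encoding: chunked
Transfer-Encoding: chunked
</pre>

A client honoring a single Transfer-Encoding header de-chunks once and then
hands still-chunked bytes to its gzip decoder.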

Original comment by jmara...@google.com on 15 Mar 2013 at 10:18

@GoogleCodeExporter

Original comment by jmara...@google.com on 15 Mar 2013 at 10:18

  • Changed state: Accepted

@GoogleCodeExporter

Haha, that's it: this document has been chunked twice! So when it has been
de-chunked the first time, browsers fail to ungzip it because it needs to be
de-chunked again.
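A minimal sketch of that failure mode in Python (assuming a gzip'd body that
has been chunk-framed twice; this is illustration, not pagespeed code):

<pre>
import gzip
import io

def chunk(body):
    # Apply HTTP/1.1 chunked framing: one data chunk, then the
    # zero-length terminating chunk.
    return b"%x\r\n%s\r\n0\r\n\r\n" % (len(body), body)

def dechunk(body):
    # Remove exactly one layer of chunked framing.
    out, buf = b"", io.BytesIO(body)
    while True:
        size = int(buf.readline().strip(), 16)
        if size == 0:
            return out
        out += buf.read(size)
        buf.read(2)  # consume the CRLF that follows each chunk

payload = chunk(chunk(gzip.compress(b"body { color: red }")))

# What the browser does: de-chunk once, then gunzip. This fails,
# because the bytes are still chunk-framed.
try:
    gzip.decompress(dechunk(payload))
except OSError:
    print("content encoding error")

# De-chunking a second time recovers the CSS.
print(gzip.decompress(dechunk(dechunk(payload))).decode())
</pre>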

dewanggaba, do you know of any reason this might be getting "Transfer-Encoding: 
chunked" applied to it twice? What other modules are you running on Apache?


Original comment by sligocki@google.com on 15 Mar 2013 at 11:13

@GoogleCodeExporter

Slig: Standard Apache modules; I removed some unneeded ones, here's my module
list: http://fpaste.org/hwpO/. Anyway, about the transfer encoding: I don't
know why the result gets chunked; it only happens on my old view.

I suspect pagespeed can't parse the file because of the .gz extension, which is
mapped to "application/x-gzip gz tgz" in the MIME types. CMIIW. So, what should
I do, slig, J and Matt? :D

Original comment by dewangg...@xtremenitro.org on 15 Mar 2013 at 11:53

@GoogleCodeExporter

Hello there,

I'll try to isolate the problem now by disallowing the 3 files that cause the error:

        ModPagespeedDisallow */public.prod.gz.css*
        ModPagespeedDisallow */basic-jquery-slider.css
        ModPagespeedDisallow */home.css

Hang on. I'll update this issue after allowing and/or disallowing the entries
above one by one. I suspect one of the files above causes the content encoding
error.

Original comment by dewangg...@xtremenitro.org on 21 Mar 2013 at 12:37

@GoogleCodeExporter

Here are the results of enabling and/or disabling each suspect one by one.

Enable: public.prod.gz.css
Disable: basic-jquery-slider.css,home.css
Result: FAIL, pages were broken

Enable: basic-jquery-slider.css,home.css
Disable: public.prod.gz.css
Result: OK

Now I've disallowed only public.prod.gz.css, re-enabled both
basic-jquery-slider.css and home.css, and the pages are OK right now. What's
going on with public.prod.gz.css?

You can download the CSS file here:
http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011

Original comment by dewangg...@xtremenitro.org on 21 Mar 2013 at 12:53

@GoogleCodeExporter

Not sure if this is relevant but ...

If I fetch those 3 files with wget --header="Accept-Encoding: gzip" I get:
* home.css: 822 bytes of plain text
* basic-jquery-slider.css: 733 bytes of gzip'd content
* public.prod.gz.css: 32,197 bytes of gzip'd content.

If I fetch public.prod.gz.css without the AE header I get 182,851 bytes of
plain text.
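Concretely, the two fetches compared above were presumably along these lines:

<pre>
wget --header="Accept-Encoding: gzip" "http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011"
wget "http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011"
</pre>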

But I still don't know what's going on here, sorry :(

Original comment by matterb...@google.com on 21 Mar 2013 at 12:50

@GoogleCodeExporter

Yes, if you access the un-rewritten files directly you'll get nothing, because
the files on the i.brta.in domain are served by nginx. I don't know why only
public.prod.gz.css can't be optimized; the other files are normal.

So I work around it using the Disallow directive.

Original comment by dewangg...@xtremenitro.org on 21 Mar 2013 at 3:54

@GoogleCodeExporter

I have some serious problems too. I am trying to use mod_pagespeed to optimize
JPEG images only, on a server which delivers static assets, so I disabled the
core ruleset and enabled only the JPEG filters. All JS/CSS/SWF/PDF and maybe
other files are no longer available due to a content/transfer encoding problem.
I am using the latest beta version of mod_pagespeed. The reason for this setup
is to make use of the new in-place optimization for images, so we don't need to
do it manually before upload.
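A JPEG-only setup of that kind would look something like this (a sketch; the
exact filter list here is my assumption, not joramk's config):

        ModPagespeed on
        ModPagespeedRewriteLevel PassThrough
        ModPagespeedEnableFilters recompress_jpeg,convert_jpeg_to_progressive
        ModPagespeedInPlaceResourceOptimization on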

As a workaround I could use the ModPagespeedDisallow directive, but I would
have to exclude every file extension other than .jp(e)g. It really seems to be
a serious problem with double encoding. For pictures we use chunked encoding
instead of gzip, as an optimized JPEG should not get smaller by zipping it.

I am also using CentOS 6.4 x64 with standard Apache 2.2, like the OP.

Original comment by jor...@gmail.com on 11 Apr 2013 at 7:46


@GoogleCodeExporter

joramk, can you provide a link to your site? I want to see if you are having 
the same problem as dewanggaba or if it's a different issue.

Original comment by sligocki@google.com on 11 Apr 2013 at 2:50

  • Changed title: Response with .gz.css file is Transfer-Encoded twice

@GoogleCodeExporter

I just realized this issue may be the same as ngx_pagespeed issue 482. Does the
gzip compression on Apache break things?

I haven't had any issues since redesigning my server(s): putting ngx_pagespeed
in front of mod_pagespeed, running 2 pagespeed daemons on 1 server, pointed at
the same cache_path.
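A sketch of that layout, with both daemons pointed at one cache directory (the
path here is assumed):

        # nginx (front), ngx_pagespeed:
        pagespeed on;
        pagespeed FileCachePath "/var/cache/pagespeed";

        # Apache (behind nginx), mod_pagespeed, same path:
        ModPagespeedFileCachePath "/var/cache/pagespeed"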

Original comment by dewangg...@xtremenitro.org on 10 Aug 2013 at 12:48

@GoogleCodeExporter

Sorry, I forgot the ngx_pagespeed issue link:
https://github.com/pagespeed/ngx_pagespeed/issues/482

Original comment by dewangg...@xtremenitro.org on 10 Aug 2013 at 12:49

@GoogleCodeExporter

Hello,

We also have this "double-encoding" problem as soon as we activate In-Place
Resource Optimization (ModPagespeedInPlaceResourceOptimization).

We are currently using 1.4.26.3-stable, but we also tried 1.6.29.4-beta with no
luck.

Right after activation, the first request to any optimized .js or .css resource
comes back double-encoded. After that, it happens randomly.

We would really like In-Place Optimization to work, as we use a CDN to deliver
resources gathered from a single mod_pagespeed-optimized server to several
other domains.
Original comment by i...@rimontgo.es on 12 Aug 2013 at 10:52

@GoogleCodeExporter

It's worth trying MPS 1.7, which has a fix that might resolve this issue.

I don't think we were able to reproduce this, so it would be great if you could
confirm it.

The fix is here: 
https://code.google.com/p/modpagespeed/source/detail?r=3480
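For reference, the directive discussed in the next comment is enabled like
this, in Apache and nginx syntax respectively:

        ModPagespeedFetchWithGzip on
        pagespeed FetchWithGzip on;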

Original comment by jmara...@google.com on 12 Nov 2013 at 5:22

  • Changed state: RequestClarification

@GoogleCodeExporter

Yes, the FetchWithGzip directive solves the problem. BUT! The CSS still has an
encoding error. Here is the PoC:

http://unik-aneh.lintas.me/assets/foundation/css/app.css+offcanvas.css.pagespeed.cc.hsDY3_UCma.css

combined from these CSS files:
http://unik-aneh.lintas.me/assets/foundation/css/app.css
http://unik-aneh.lintas.me/assets/foundation/css/offcanvas.css

Original comment by dewangg...@xtremenitro.org on 12 Nov 2013 at 5:26

@GoogleCodeExporter

Edited:

The CSS still has an encoding error, but the pages are fine.

Original comment by dewangg...@xtremenitro.org on 12 Nov 2013 at 5:27

@GoogleCodeExporter

Can you try flushing your cache after setting FetchWithGzip?

It looks like maybe we have captured some gzipped CSS content in the cache.  I 
have not seen the ".gz" syntax in CSS files before, but I guess that makes 
sense in a way; if you put the precompressed files on disk then you don't have 
to compress them when serving to gzip-accepting clients.  The only thing I'm 
unclear on is whether ngx_pagespeed/mod_pagespeed is seeing the 
content-encoding:gzip header when it runs.
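For reference, flushing the cache amounts to touching the cache.flush file
under the configured file cache path (assuming the default
ModPagespeedFileCachePath):

        sudo touch /var/cache/mod_pagespeed/cache.flush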

Original comment by jmara...@google.com on 12 Nov 2013 at 9:17

@GoogleCodeExporter

Hi,

You can close this issue. I've updated to 1.7.30.3 and applied the changes from
this ngx_pagespeed issue: https://github.com/pagespeed/ngx_pagespeed/issues/614

In nginx, I removed

        pagespeed CustomFetchHeader Accept-Encoding gzip;

and enabled

        pagespeed FetchWithGzip on;

I use a reverse proxy and serve the static files from nginx; I don't know
whether that affects the Apache versions :)

Original comment by dewangg...@xtremenitro.org on 8 Feb 2014 at 4:05

@GoogleCodeExporter

Original comment by jmara...@google.com on 18 Sep 2014 at 5:42

  • Changed state: Closed
