On Wed, Dec 10, 2008 at 01:26:43PM +0000, Chris G wrote:
On Wed, Dec 10, 2008 at 11:37:37AM +0000, Richard Lewis wrote:
On Wednesday 10 December 2008 11:15:20 Chris G wrote:
I have some Python scripts which convert reStructuredText to HTML 'on the fly' when xxxx.rst files are browsed.
When I look at the pages locally, performance is acceptable, but I have just tried remotely (from work) and it takes several minutes to display a new xxxx.rst page. Normal HTML and other stuff are shown at normal speed; I have some quite complex PHP (which I didn't write) being served by the same Apache server and it's fine.
When you say "locally", do you meaning executing the Python in its own process? Or making an HTTP GET request from localhost to Apache running on localhost? i.e., is it the fact that the Python is running within Apache that's making it slow? Or is it the fact that you are making the request from elsewhere?
Sorry, wasn't clear. When I look at the xxxx.rst pages using Firefox on the same computer where the Apache server is running, performance is OK (in fact, more than 'OK', it runs as fast as everything else).
Can anyone suggest any reason for remote access to this being so incredibly slow? It's probably some dire misconfiguration somewhere but I don't know where to start looking really.
If the script runs slowly *because* it's being run within Apache, the next question to ask is how are you running it in Apache? CGI? FastCGI? mod_python?
It's using:-
ExtFilterDefine rst-to-html mode=output \
    intype=text/rst outtype=text/html \
    cmd="/home/chris/bin/info2html.py"
with some other additions (of course) to feed .rst files into the filter.
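In case it helps to see what the filter end looks like: a mod_ext_filter script just reads the page on stdin and writes the result on stdout, so stripped right down info2html.py is roughly along these lines (a simplified sketch assuming docutils does the conversion; the real script does rather more):

    #!/usr/bin/env python
    # Minimal mod_ext_filter-style script: read reStructuredText on stdin,
    # write the rendered HTML on stdout. Assumes docutils is installed.
    import sys
    from docutils.core import publish_string

    rst_source = sys.stdin.read()
    html = publish_string(rst_source, writer_name='html')
    sys.stdout.write(html)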
I know it's not going to handle heavy loads, but as I'm the only user that doesn't matter. As I said, it works fine via Firefox on the computer where the Apache server is.
It's not that the above is generating masses of network traffic, is it? That would give the symptoms I'm seeing. It's odd: I have just tried from another remote site using lynx and, while hardly instant, the response times are acceptable. Even odder, using lynx from here (work) the response is OK, but using Firefox it's still truly awful.
One thing I've noticed: the URLs as they appear while being loaded in lynx are of the form xxxxx.html.gz. I'm not explicitly compressing them, so some cleverness somewhere appears to be doing so.
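For anyone following along, one way to confirm it's the server doing the compressing is to request a page and look at the Content-Encoding header, something like this (the URL is made up for illustration; substitute a real .rst page):

    # Quick check of whether the server is gzip-compressing responses.
    import urllib2

    req = urllib2.Request('http://myserver.example/notes/somepage.rst')
    req.add_header('Accept-Encoding', 'gzip')
    resp = urllib2.urlopen(req)
    # Prints 'gzip' if the server (e.g. mod_deflate) compressed the body.
    print(resp.info().get('Content-Encoding'))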
OK, problem solved, or at least moved elsewhere. It appears that Firefox on my work system is the culprit; I suspect that it's taking a very long time to unzip compressed HTML. I tried IE in my VMware guest and that works impressively quickly on the same pages that hang up Firefox.
So the question changes to - does anyone know why Firefox would be very slow with gzipped HTML?