
Varnish Cache #20

Open
Xarex opened this issue Nov 15, 2016 · 7 comments



Xarex commented Nov 15, 2016

It looks like there may be an issue with over-optimizing and over-caching.

I have a website on DreamHost's DreamPress server, an optimized WordPress host that uses Varnish cache. The site receives an average of about 1,000 visitors a day. After I added this code by calling it through a script, the effect was not immediate, but about 2-3 days in the server began to overload. What is normally 20-40 queries soon became 2,000 queries, and pages that normally load in under 1 second took no less than 4 seconds.

No new plugins were installed, and no new code other than CacheP2P was added.

When I contacted DreamHost support, they said the Apache server was overloaded, with 100% of resources in use. This shut down the website, and it happened twice. It was as if the two cache systems, Varnish cache and CacheP2P, were battling it out. Once the 3 lines calling the script were removed and the pending processes ran their course, the site returned to normal functioning. The code has not been added back, and the website has been functioning normally and has not gone down since.

On my shared hosting, I have another website that receives less traffic, but a somewhat steady flow of about 40-70 visitors every day. I added the same 3 lines of code there, and I'm currently storing the CacheP2P scripts on a CDN. There have been absolutely no issues, but that site is also not optimized with Varnish cache. There has been no overloading or anything like that. Improvements in performance? Possibly, but it's too soon to tell; in any case, no issues as far as the site going down.

So I'm just putting it out there: it looks like this code will not always work with other caching systems.

@deckar01

I suspect that it is an issue with the JavaScript on your pages and not the Varnish cache.

Navigating to cached pages replaces the body of the DOM but leaves the previous page's JavaScript running. TurboLinks provides a special event framework to avoid this type of issue.
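For illustration, here's a minimal sketch of that pattern. The names `teardowns` and `onPageChange` are illustrative, not CacheP2P or TurboLinks API: each page registers cleanup callbacks, and the page-change handler runs them before re-initializing, so timers and listeners from the previous page don't pile up across body swaps.

```javascript
// Minimal sketch of a TurboLinks-style lifecycle pattern (illustrative
// names, not actual CacheP2P/TurboLinks API): keep per-page teardown
// callbacks and run them before re-initializing on each body swap.
const teardowns = [];

function onPageChange() {
  // Stop anything the previous page started (timers, observers, sockets).
  while (teardowns.length) teardowns.pop()();

  // (Re)initialize this page's scripts; register how to undo them.
  const timer = setInterval(() => { /* per-page polling work */ }, 5000);
  teardowns.push(() => clearInterval(timer));
}

// In the browser this would hook the page-change event, e.g.:
// document.addEventListener('turbolinks:load', onPageChange);
```

Without this kind of cleanup, each "navigation" on a cached site stacks another copy of every timer and handler, which matches the gradual load increase described above.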

If you email me a link to the site that had the issues I can take a quick peek at the javascript to see if anything stands out.


Xarex commented Nov 18, 2016

The site I tried to install it on was http://www.confessionsoftheprofessions.com/
I did have to remove the script, so you won't find it there anymore.

It is also on my other site: https://mypost.io/
Navigate to a page like the About page or the FAQ page to actually see the script (/post/ must be in the URL).
You can see the script in the footer, and there have been no issues.

@deckar01

I am fairly confident it is the JavaScript now. That first site has an order of magnitude more JavaScript. When I load it without an ad blocker, my laptop fan kicks on and a steady stream of network chatter starts pouring in.

I am surprised that you were able to use CacheP2P at all for a WordPress site, since its pages aren't static.


Xarex commented Nov 18, 2016

So... CacheP2P... no WordPress. Static pages only. MyPost uses a database to load pages, but the pages remain fairly static. Thanks for the insight, deckar01!

@deckar01

How are you generating the page hashes?


Xarex commented Nov 18, 2016

This may have been another issue I overlooked: I copied only the file that was provided on the CacheP2P website and uploaded it to the CDN. I suppose it is working without the security hashes then. There would be no way for me to grab every single page's hashes without doing it automatically and systematically; there are thousands of pages on both websites. I wonder if there is a way to grab each page's security hashes and generate them automatically.

@deckar01

Without a list of hashes it shouldn't do anything. That makes me think the increased load you saw was caused by the peer connection events triggering analytics requests.
