Terrible page load times, is this normal without optimizations?

I'm running Concrete5 8.5.2 on a shared server at DreamHost, with PHP 7.3 via FastCGI.

My website has always been slow; I'm talking 3 to 5 seconds to load a page slow. I'm not sure when it happened, but it got worse: page load times are now between 5 and 10 seconds. I have the Instant Page Speed Magic extension, and without it this would probably be even worse.

Is concrete5 just this bad right out of the gate? I'm thinking the shared server isn't dedicating much processing power to me, and maybe concrete5 just takes an absurd amount of processing to put together a page. I have page caching on, but on GTmetrix the bulk of the time is spent "waiting."

I attached a screenshot from GTmetrix where you can see how much time is spent "waiting." I think that's the time the server spends processing before it can send back the response. If anyone knows more about this than me, correct me if I'm wrong. If this is a problem on the web host's side, then I will contact them again.

I'm thinking, though, that concrete5 is just super hard on the server processor when putting pages together. Does anyone have any idea? My website is basically unusable with 4 to 10 second page load times, and my Google rankings are trash because of it too. No matter what I do, it's going to rank terribly with these horrible load times. Any thoughts on this would be greatly appreciated.

Update: here's a link to a GTmetrix analysis of a page that did exceptionally badly. The page in question had just text: no images, no videos, no complex blocks. Go to the waterfall tab and look at the first request. Does anyone know why it might wait 8 seconds to finish a 10 KB first request?
https://gtmetrix.com/reports/www.videogamedatabank.com/g6yIK5JV...

1 Attachment

JohntheFish replied on at Permalink Best Answer Reply
Most shared hosting can handle concrete5, as long as you are not on the bottom tier of a very budget host such as GoDaddy.

Use Speed Analyzer to identify where the issues are: https://www.concrete5.org/marketplace/addons/speed-analyzer...

If it is not an underlying issue with your hosting, then I expect you will find the culprit is Autonav, or possibly a page list.

Then look carefully at your autonav settings. Does it scan more pages than it needs to?
Can it be replaced with the much more efficient nested manual nav?

The other possibility is if you have a large number of page attributes or Express form fields.
Kibbles replied on at Permalink Reply 1 Attachment
Thank you so much, I did not know about this add-on. I'm one step closer to solving this issue.

For starters, in the diagnosis tab I noticed this information about OPcache:

Enabled
opcache.validate_timestamps should probably be disabled
The OPcache cache is full.
Used memory: 8132 KB.
Free memory: 13 KB.
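
From a bit of searching, those look like they correspond to standard OPcache directives in php.ini. I'm not sure DreamHost lets me override them on shared hosting, but if it does, my guess is the fix is something like this (the values are guesses on my part):

; raise the cache size - mine is full at ~8 MB
opcache.memory_consumption=64
; what the analyzer suggests - PHP stops re-checking files for changes
opcache.validate_timestamps=0

From what I've read, with validate_timestamps off the cache has to be reset manually after changing any PHP files, so I'd want to be careful with that.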

Also, I'm not familiar with this add-on, but without me accessing any pages it has quickly filled up with 75 entries for the same two pages. Are these two pages repeatedly being accessed, or is this somehow normal for Speed Analyzer? In a support chat my web host said there was a high amount of traffic from my IP address and suggested I flush my DNS. I did that just to be safe, but is my computer somehow trying to request these pages every few seconds? Or is this something to do with Speed Analyzer?

Also, on the pages that take the longest to load it is in fact page lists, as you suspected. Is there any way to make page lists faster? I absolutely have to have page lists to build the site I'm trying to build. Surely there's some way to make them not pull hundreds of queries and take 2 to 5 seconds.

Now that I've realized this, I've found other forum posts where you discuss ways to make page lists faster with other users. Have you found anything that might work, and if so, can you point me toward the final verdict on what can be done?

Also, one of my slow pages is slow for reasons I don't understand. I attached a screenshot of the graph.

Lastly, I noticed the URLs of my insanely slow pages have things like this tacked onto the end. Does anyone know what it means? I'm not an expert at this kind of thing.

"ccm_order_by=cv.cvDatePublic&ccm_order_by_b3535=cv.cvDatePublic&ccm_order_by_b6708=cv.cvDatePublic&ccm_order_by_direction=desc&ccm_order_by_direction_b3535=desc&ccm_order_by_direction_b6708=desc&ccm_paging_p=9&ccm_paging_p_b3535=16&ccm_paging_p_b6708=2"
Kibbles replied on at Permalink Reply
I'm also running into the issue I saw in another post. One of my pages loaded in 3 seconds but didn't output for 6 seconds, because it spent three seconds running 600 queries. There was a bunch of garbage in the queries, but the main standout was this query, repeated 148 times with a different ID number at the end. Can anyone help me understand what this means so I can attempt to fix it?

select Pages.cID, Pages.pkgID, Pages.siteTreeID, Pages.cPointerID, Pages.cPointerExternalLink, Pages.cIsDraft, Pages.cIsActive, Pages.cIsSystemPage, Pages.cPointerExternalLinkNewWindow, Pages.cFilename, Pages.ptID, Collections.cDateAdded, Pages.cDisplayOrder, Collections.cDateModified, cInheritPermissionsFromCID, cInheritPermissionsFrom, cOverrideTemplatePermissions, cCheckedOutUID, cIsTemplate, uID, cPath, cParentID, cChildren, cCacheFullPageContent, cCacheFullPageContentOverrideLifetime, cCacheFullPageContentLifetimeCustom from Pages inner join Collections on Pages.cID = Collections.cID left join PagePaths on (Pages.cID = PagePaths.cID and PagePaths.ppIsCanonical = 1) where Pages.cID


EDIT: I think the query insanity on this particular page is from being logged in, and probably has something to do with the editor. My other issues are still relevant, but if this part is slow I can deal with it; that only affects me when I'm editing, and I can ignore it. My users, however, can't ignore insane page load times.
mnakalay replied on at Permalink Reply
Not 100% sure, but that looks like the kind of query the autonav block would use. Do you have an important number of pages and levels?
Kibbles replied on at Permalink Reply 1 Attachment
I'm thinking the query is the page list. Speed Analyzer shows the 600+ queries as part of loading the page list. I'm not sure what you mean by an "important" number of pages and levels, or what you mean by levels. My website is hundreds of pages, and ideally it will eventually be thousands.

Also, I really need someone to help me understand why some of my pages are being requested a few times every minute. I attached a screenshot from Speed Analyzer to show what I mean. Each page request has something like this tacked onto the end of the URL:
ccm_order_by=cv.cvDatePublic&ccm_order_by_b3535=cv.cvDatePublic&ccm_order_by_b5130=cv.cvDatePublic&ccm_order_by_b6707=cv.cvDatePublic&ccm_order_by_b6708=cv.cvDatePublic&ccm_order_by_direction=desc&ccm_order_by_direction_b3535=desc&ccm_order_by_direction_b5130=desc&ccm_order_by_direction_b6707=desc&ccm_order_by_direction_b6708=desc&ccm_paging_p=9&ccm_paging_p_b3535=8&ccm_paging_p_b5130=2&ccm_paging_p_b6707=1&ccm_paging_p_b6708=2


I thought maybe it had something to do with the page list on that page, so I tried deleting the page list, but even after deleting it the page still gets requested from the server multiple times per minute with this tacked onto the end of the URL.
JohntheFish replied on at Permalink Reply
Check the server log for where that comes from. Is it within the site or external? Have you enabled an RSS feed and is an external reader or aggregator thrashing it?

Thousands of pages on a shared hosting site is probably a bit too much. Depends a lot on volume of use and, for page lists, how the site is structured.

For example, listing all pages beneath a specific page is cheap.

Listing pages of a type within a date range with specific tags from anywhere on the site and applying permission checks is expensive.

An autonav listing only top level pages is cheap. An autonav listing all levels is expensive - especially if you only show the top 1 or 2.
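
As a rough sketch of the difference in code terms (this assumes the 8.x PageList API, and $parentID is just a hypothetical page ID), the cheap kind of list is along these lines:

use Concrete\Core\Page\PageList;

$list = new PageList();
$list->filterByParentID($parentID); // one subtree only - cheap
$list->ignorePermissions();         // skips per-page permission checks, if you can accept that
$list->sortByDisplayOrder();        // no joins on attributes or topics
$pagination = $list->getPagination();
$pagination->setMaxPerPage(10);     // small result pages keep query counts down
$pages = $pagination->getCurrentPageResults();

Every sitewide filter, topic join or permission check added on top of that multiplies the work.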
Kibbles replied on at Permalink Reply
Okay, so I learned how to access the server logs, and it looks like crawler bots are hitting my site about once every second on average. Probably 75% or more of those requests are for the pages with the extra code gibberish tacked onto the end of the URL. I'm not an expert at this, but it seems like some of my URLs have extra versions with tons of junk tacked on, and the crawlers are stuck in loops rechecking those messed-up pages a few times every minute. Between all the different crawlers out there, it adds up to those pages being hit about once every second.

Since my pages take 4 to 10 seconds to process, maybe these bots hitting my website once per second are enough to mess everything up? I don't know much about web crawler bots. I'm assuming they use fewer resources when they grab a page, but even if they're using a quarter of the resources, at that volume they could still be maxing out the server.

Also, about page lists: I really need to be able to make pages using topics and then have those pages get categorized into different areas of the website, so I can correctly catalog and organize what I'm working on. If I don't do it in some automatic way like this, then every time I make a page I'd have to go to 3 or 4, sometimes 10+, different pages, manually add it to a list, and then create additional pages for simulated pagination.

Surely there's some way to use topics and page lists without nuking the server. Any thoughts or tips on how to reduce server overhead for topic page lists would be greatly appreciated.
JohntheFish replied on at Permalink Reply
If the bots are well behaved, you can disallow them from crawling those URLs in a robots.txt file:
https://support.google.com/webmasters/answer/6062608?hl=en...
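
For the ccm_ URLs specifically, wildcard rules along these lines should keep well-behaved crawlers off the sort/paging permutations. Check the patterns against your own logs first, and remember robots.txt is advisory only:

User-agent: *
Disallow: /*?ccm_order_by
Disallow: /*&ccm_order_by
Disallow: /*?ccm_paging_p
Disallow: /*&ccm_paging_p

Badly behaved bots will ignore it, in which case blocking at the server level is the fallback.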
jb1 replied on at Permalink Reply
This might be helpful: I've got a heavy C5 site loading pretty fast (based on GTmetrix and webpagetest.org). TTFB: 0.49 sec, first contentful paint: 1.4 sec, largest contentful paint: 2.7 sec.

This is the setup:
1) Enable full page caching in the Concrete5 dashboard.
2) Use LiteSpeed (instead of Apache).
3) Turn on the LiteSpeed cache (requires tweaks to the .htaccess file; see the sketch after this list), and get it to minify and combine CSS and JS.
4) I also set up a subdomain to do the C5 editing, which bypasses the LiteSpeed cache (e.g. mywebsite.com vs admin.mywebsite.com). It took a little fiddling, but it's possible. I had to turn off the C5 redirect-to-base-URL setting. Some minor C5 features don't like this, like viewing previous versions of a page.
5) Used a dynamic robots.txt file to discourage search engines from indexing the admin subdomain, plus some customisations to the header_required.php element (for a noindex meta tag).
6) Enabled Cloudflare. Added rules to support caching on the main domain and disable it on the admin subdomain.
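
For what it's worth, the cache part of my .htaccess boils down to roughly the following. It's simplified from memory, so treat it as a starting point rather than a drop-in (cookie names and lifetimes vary per site):

<IfModule LiteSpeed>
CacheLookup on
RewriteEngine On
# don't serve cached pages to logged-in users (CONCRETE5 is the default session cookie name)
RewriteCond %{HTTP_COOKIE} !CONCRETE5 [NC]
# publicly cache GET requests for 5 minutes
RewriteCond %{REQUEST_METHOD} ^GET$
RewriteRule .* - [E=cache-control:max-age=300]
</IfModule>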

It sounds simple when I condense it like this, but there was a lot of tweaking, testing and fine-tuning (and breaking!).

It might be worth comparing: I also set up a few WordPress sites recently, and straight out of the box their load times sucked. When I used some of the above steps (LiteSpeed + cache + Cloudflare), I got the load time down by over 75%. So this kind of issue happens with all CMSs.

Hope this helps.
stewblack23 replied on at Permalink Reply
Great info in this thread. I used all of these techniques when I had page speed issues a few weeks ago. I also ended up switching to SiteGround hosting, and my page speed is a ton faster there. And stay away from Autonav, as @JohntheFish mentioned; the Nested Manual Nav add-on is a much better option. If you have picture-heavy sites, the Image Optimizer add-on along with the TinyPNG cloud service makes all the difference.

Here is the current GTMetrix page score for our company site.

https://gtmetrix.com/reports/www.sdagency.net/4TcgEIYo...