Is Concrete still Anti-SEO after all these years?

We just migrated our primary website into Concrete.
First let me state that we love Concrete! However, being able to properly optimize a site for search engines is of vital importance to us.

We just discovered that Google was seeing 3 different versions of each page of our site. We searched the forums and discovered a variety of posts regarding this particular issue. We hope that after modifying some of the core files per the posts, we have fixed the problem.

However, the reason for this post is that this problem has been known since 2009, and the suggested fixes should have been included in the releases since that time (such a recommendation was made in those old posts) - but obviously they have not been, six years later.

So, before we migrate more sites into Concrete, is there any way to get the attention of the core team and find out whether we are going to have to manually fix all these issues on other sites?

See these old posts for details of the problems and time frames involved.

View Replies:
goldhat replied on at Permalink Reply
I care about SEO, but not as much as social, because this is the social era: people wake up and check Facebook or Twitter, and most traffic comes from social, not search. But anyway, does Google care about this petty technical stuff anymore? Not really. Google has already said that if you have duplicate pages, they figure out which one is the main one and that's what they evaluate. Google knows social has largely taken over; they mostly gauge whether you have traction in social, and if you don't, who cares about your SEO tactics? And if you do have social traction, your page will rank regardless of various little technical tidbits of pointless obscurity.
mhawke replied on at Permalink Reply
What ended up being the cause of the 3 different versions?
ssnetinc replied on at Permalink Reply
The short story is that when you use the Insert Link to Page function that concrete puts on top of the Content editor, it inserts the link without the trailing slash. The autonav function uses the trailing slash (as it should).

Also, concrete uses the internal index.php/page-name format for internal links.

I discovered this problem in Google Webmaster Tools when it told me I had 45 pages with the same Title and Description tags. It turned out to be 15 pages with 3 different URLs each - with and without the trailing slash, plus the index.php/ version. Google saw them all as different pages with the same content.

I had to change the rewrite rules in .htaccess and also add a file to the core files so everything was rewritten to the full trailing-slash version of the URLs. Googlebot simulators now only see the correct page versions - too soon to tell if the actual Googlebot sees the same.
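To make the duplicate-content problem concrete, here is a rough sketch (plain Python, not concrete5 code, with an example.com placeholder) of the normalization the rewrite-rule fix enforces - all three URL variants collapse to the single trailing-slash form:

```python
def canonicalize(url):
    """Collapse the URL variants concrete5 can emit into the single
    trailing-slash form that the .htaccess fix redirects to."""
    # Strip the internal dispatcher prefix: /index.php/about -> /about
    url = url.replace("/index.php/", "/", 1)
    # Ensure exactly one trailing slash
    if not url.endswith("/"):
        url += "/"
    return url

# The three variants Google was indexing as separate pages:
variants = [
    "http://example.com/about",            # Insert Link to Page (no slash)
    "http://example.com/about/",           # autonav (trailing slash)
    "http://example.com/index.php/about",  # internal dispatcher form
]
assert len({canonicalize(u) for u in variants}) == 1
```

Once every variant 301-redirects to one form like this, Google consolidates them into a single indexed page.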
mesuva replied on at Permalink Reply
What version of concrete5 is this on? Platform? Browser?

I'm fiddling with a site right now - links added via the 'Insert Link to Page' button/link on the content block add a trailing slash.

I also don't see index.php in the paths, unless I turn off pretty urls in the dashboard.

I'd hardly suggest concrete5 is 'Anti-SEO', especially since it includes (without needing add-ons) some very powerful ways to create redirects and edit page titles and metadata. It also has a Bulk SEO Updater built in.

I'm no SEO expert, but I have spent a fair bit of time looking at concrete5 sites with SEO in mind. I've seen cases where duplicate pages pop up, but nearly all are the result of changing page names a lot and Google not picking this up.

Out of interest, the only SEO issue I haven't been able to fix without a code edit is how paginated pages are treated. We're talking something pretty particular here; for this I worked out a potential fix -
(I also would suggest this isn't a problem anymore SEO wise)
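For anyone who does hit the pagination issue: the standard advice at the time was rel="prev"/rel="next" link tags in the head of each paginated page. A rough illustration of the markup, sketched in plain Python rather than concrete5 template code - the /blog path and the query parameter name are just illustrative placeholders:

```python
def pagination_links(base_path, page, last_page):
    """Build the rel=prev/next <link> tags Google recommended (at the
    time) for paginated listings; page 1 is base_path itself."""
    def url(n):
        # Query parameter name is illustrative, not a guaranteed c5 name
        return base_path if n == 1 else "%s?ccm_paging_p=%d" % (base_path, n)
    tags = []
    if page > 1:
        tags.append('<link rel="prev" href="%s">' % url(page - 1))
    if page < last_page:
        tags.append('<link rel="next" href="%s">' % url(page + 1))
    return tags

print(pagination_links("/blog/", 2, 5))
```

The idea is simply that each paginated page points at its neighbours, so the crawler treats the series as one sequence instead of near-duplicate pages.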
ssnetinc replied on at Permalink Reply 1 Attachment
Greetings Mesuva,
As I stated in my original post - I do love Concrete. I'm impressed with its functionality, ease of use, and community of developers and users.

Let me explain the situation and timeline. The site in question used to hold the #1-#3 positions in Google for all our primary keyword phrases - before the Penguin update, at which time it completely disappeared. The site had not been updated in some time. As part of the plan to revitalize the site, we decided to move it into concrete.

We installed and built the site. After the launch of the new site, we started to closely monitor all aspects via Google Webmaster Tools / Analytics and such. That's when we discovered Google was counting different versions of the same page URLs as having duplicate Title and Description tags. I have attached a screenshot, but the below is a perfect example:

Pages with duplicate meta descriptions
Vcheck offers a 30 day satisfaction guarantee. If you are not satisfied with the checks by phone, ch

So, I went to the forums, searched for the problem, and came up with the 2009 topics I referenced in my first post. I implemented the suggested fixes. It must be working, because in a matter of days the number of "duplicates" Google is counting has dropped from 45 to 37. I believe it will drop to 0 once the entire site has been reindexed.

To answer your specific question - I must admit I did not test the Insert Link to Page function by adding a new link and then viewing the HTML. That problem was described in one of the older topics.

However, after reading the posts, I went to the site and moused over a number of the internal links I had created using that function - and sure enough, the status line in Firefox showed the link pointing to the page WITHOUT the trailing slash, whereas the AutoNav link to the same page showed it WITH the trailing slash. Perhaps it's the way concrete translates the internal links?

Anyway, I made the suggested fixes and it appears to be working - so something in concrete was causing the problem. Something that was discussed in 2009 and still seems to be present.

I guess the trauma of having our primary site completely disappear from Google (some time ago), then discovering such a glaring problem (multiple links to the same pages that Google was counting as different pages on the new concrete version of the site), and then seeing the exact same problem being discussed in 2009, made me wonder.

The intent of my post was really to get someone's attention to see:
1. The problem obviously exists - but why?
2. Does this mean I will have to make such manual changes on other concrete sites to prevent it?
3. If you review some of the older posts, the fixes for this were known in 2009 but not included in the core code, even though that was highly suggested - why?

I'm not trying to be anti-concrete - and I may be worried about nothing, but I'm being extremely careful to make sure the new site follows all of Google's guidelines. I am grateful for all the support I've been given via these forums as I learned concrete and apologize if I've wasted people's time with this.
SheldonB replied on at Permalink Reply
... I think there could be better documentation on this. All the SEO how-tos, while they do talk about this, are very general instead of being like "first check this, if that then apply this" ..

As far as editing the .htaccess goes, I see that as being more of a server-side setup (personal) issue than a concrete5 one. concrete5 editing everything in your .htaccess could create a problem, because not everyone has the same server setup.

One thing I noticed for my sites is that with a basic out-of-the-box concrete5 install I would generally score around 70% in SEO tests.

concrete5 has a great, easy workflow to help me fix the optimization - things like image replace and page properties - which usually makes my SEO score jump 5-10%. So in reality, if I as the user optimized my content before putting it into concrete5, I could easily hit 75-80%.

To get higher, I needed to compress my theme and JS files, change or add server settings, and use cron jobs to auto-update sitemap.xml twice a week (Google really likes sitemaps). I usually hit around 85 to 95%, depending on the site test.
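For reference, a cron entry along these lines can run concrete5's sitemap job on a schedule. The URL and auth token below are placeholders, not real values - in concrete5 the actual job URL (with its site-specific token) is shown on the Automated Jobs page in the Dashboard:

```shell
# Regenerate sitemap.xml twice a week (Mon & Thu at 03:15).
# The hostname and ?auth= token are placeholders - copy the real
# job URL from your own Dashboard's Automated Jobs page.
15 3 * * 1,4 curl -s "http://example.com/index.php/tools/required/jobs?auth=YOURTOKEN" > /dev/null
```

Running the job from cron keeps the sitemap fresh without anyone having to remember to trigger it from the Dashboard.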

So I'm pretty happy with concrete5, but I can understand how someone who isn't sure where to start could run an SEO test at random, see a really low score, and blame the CMS.
(I know ssnetinc isn't blaming c5, just wondering why the issues have been brought up with workarounds but not implemented; I'm saying there are users out there who don't know how to search issues and place the blame on the CMS.)

The only thing I think concrete5 might be able to do better is, when it caches a site, to give an option to compress and remove the spaces in the code. (It might already do this; it's just that whenever I tested a site, it seemed to complain about compression first.)

So in general, I see SEO and concrete5 as friends and have no issue.

end thoughts/rant
ssnetinc replied on at Permalink Reply
I agree about the need for a unified section on SEO in the documentation / forums. During the course of figuring out how concrete handles things like page loading speed, pretty URLs and such, I had to research numerous topics scattered throughout the forums - some new posts, some old posts that did not apply to newer versions.

While I also agree the .htaccess stuff is not really concrete's responsibility, in my particular case it was. Concrete writes the mod_rewrite code into the .htaccess if it can. However, it was a modification to that code that helped fix the issue I was having with Google seeing all the index.php/ URLs as separate pages. How hard would it be for the core team to modify that code?

To give you an example of the need for actual documentation on the topic of SEO, below is how my current .htaccess looks. Keep in mind it started out with only some of the RewriteCond lines once Pretty URLs was turned on; all the rest was added (and made significant improvements) by searching around a dozen topics in the forums, some not easy to find.
# -- concrete5 urls start --
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
# If the request is not a real file or directory, is not an index.php
# path, and lacks a trailing slash, 301-redirect to the slashed version
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !index.php
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ /$1/ [L,R=301]
# Then route everything that is not a real file or directory through
# concrete5's dispatcher
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L]
</IfModule>
# -- concrete5 urls end --
# Host-specific handler mapping; may not apply (or work) on other servers
AddType application/x-httpd-php5 .html .htm .php
mhawke replied on at Permalink Reply
All servers are not configured equally. I just swapped your .htaccess file contents in for my own and it killed my site. I agree that a good how-to is needed, so why don't you write one? Members of the C5 community write most of the how-tos, and with your recent experience it should be fresh in your mind.
ssnetinc replied on at Permalink Reply
Yea, I had to play with each section of that .htaccess stuff based on the configuration of my particular server.

It made a world of difference in how the site performs though.

"I agree that a good how-to is needed so why don't you write one?" Yea, should have seen that one coming :-)

I'm not an SEO expert so don't have much to offer - I just know enough to see when something is very wrong and fix it from there. However, when time permits I could put together a How-To that basically points to all these other detailed topics I had to search and search for.

BTW, if something in my .htaccess broke your site, it's probably the #gzip section (not shown in what I posted above) - not all servers run that.
mhawke replied on at Permalink Reply
My point was that a free CMS cannot be expected to detect and adjust its installation and operating behaviour to every server configuration, from PHP 5.2 to nginx to AWS. I'm sure if you hired the core team to optimize your website for your particular server, they would be happy to help and send you a nice fat invoice for their trouble. If they hired a bunch of staff to keep the documentation up to date, then it couldn't be free anymore. The longer I hang around here, the more admiration I have for how Franz and Andrew balance the compromises, i.e. the most good for the most people.
ssnetinc replied on at Permalink Reply
Absolutely agreed, on all counts.
I'm relatively new to Concrete, so I don't have much to offer the community (except questions). I am grateful for all the assistance and will try to give instead of take as my expertise improves.
jincmd replied on at Permalink Reply
We appreciate you, and I'd love to know how things are looking for your SEO hopes. Thanks
jincmd replied on at Permalink Reply
Before finishing your comment, I must disagree with the first statement. c5 has always been proud of being a "CMS built for SEO".
jincmd replied on at Permalink Reply
Sheldon, it's late here now... However, I appreciate your comments, as well as the original poster's comments on this thread.