Hey all, so these elements have been causing me a little grief of late, here's my situation:
I have a website where users browse a bunch of Squid/Matrix/browser-cached pages. At some point they decide to log in. To get around caching, they log in via a subdomain URL, in this case members.website.com/_login. This beats the cache, as the subdomain is excluded from Squid, and a few other tweaks are in place.
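For anyone setting up a similar bypass, the Squid side can be sketched roughly like this (the ACL name is mine and the domain is the example one from above, not my real config):

```
# Illustrative squid.conf fragment: never cache the login subdomain,
# so every request to it goes straight through to the origin server.
acl login_site dstdomain members.website.com
cache deny login_site
```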
My real problem is with cookies. I figure this must be an old problem and that some bright spark might have an age-old solution for this...
So the user crosses over and logs in to members.website.com, but no matter how many times a login completes successfully, stale session cookies stop the user from actually seeing a logged-in version of the page. Deleting all cookies fixes everything, but I can't expect users to do that, and removing cookies with scripts isn't working well (and isn't a clean fix in my opinion).
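To illustrate what I think is going on: each host ends up setting its own copy of the session cookie, and the stale one shadows the new one. The usual remedy is to scope the session cookie to the parent domain so every subdomain shares a single cookie. A minimal sketch (SQ_SYSTEM_SESSION is Matrix's session cookie name as far as I know; the domain and value are placeholders):

```python
from http.cookies import SimpleCookie

# Assumption: scoping the session cookie to ".website.com" means
# members.website.com and www.website.com share ONE cookie, instead of
# each host accumulating its own stale copy.
cookie = SimpleCookie()
cookie["SQ_SYSTEM_SESSION"] = "abc123"          # placeholder session id
cookie["SQ_SYSTEM_SESSION"]["domain"] = ".website.com"
cookie["SQ_SYSTEM_SESSION"]["path"] = "/"

header = cookie.output(header="Set-Cookie:")
print(header)
```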
I've found a few threads related to this, but can't find any solid solutions, can anyone share some advice?
There doesn't appear to be any Squiz-knowledge on this topic so far, so I might add some notes here for anyone struggling with this issue in the future (as I'm sure caching/session/logged in problems aren't likely to be that isolated).
Unfortunately there's no way that I can find to get around this issue, short of completely disabling caching on the website and removing the multiple-URL cache workaround. Disabling caching entirely is far from ideal, but it may be the only option for Squiz Matrix users trying to consistently push session-based content to users.
Tried so far: allowing IP fluctuation (unlikely but worth a try), setting a parent domain, using a site network, and using AJAX to send auth credentials ahead of a request. Using a site network logged the user out every time they visited a page on a different domain, as noted elsewhere on these forums.
I'll keep trying, if anyone has any experiences they'd like to share in the mean time, please feel free! :)
For sites where I've needed both public and logged-in users, we've just disabled the Squid caching altogether. However, you should be able to cache content and send cache headers for public users only, not logged-in users.
On the Send Cacheable Headers screen of the Cache Manager there are options for this: set the HTTP cache control level to Public, and below that, under Root URLs to send cacheable headers for, tick "Public user only" for the URL you are using.
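In effect that setting varies the response headers by login state. A rough sketch of what I'd expect to land on the wire (function name, max-age value, and exact directives are my assumptions, not what Matrix literally emits):

```python
def cache_headers(is_logged_in: bool) -> dict:
    """Illustrative only: what 'cacheable headers for public users
    only' roughly works out to in the HTTP response."""
    if is_logged_in:
        # Logged-in responses must never be stored by Squid or browsers.
        return {"Cache-Control": "private, no-cache, must-revalidate"}
    # Public users get a response Squid may cache and serve to anyone.
    return {"Cache-Control": "public, max-age=300"}

print(cache_headers(False))
print(cache_headers(True))
```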
Thanks Bart, I've got my test URLs excluded from Squid already. I'll try your advice and post back tonight. I don't suppose you have any tips to share regarding sessions causing problems from a top-level URL through to a subdomain?
Case in point: a user browses the public site, then tries to log in at members.publicsite.com - no login succeeds until all cookies are deleted.
Unfortunately there is no easy fix for the subdomain cookie issue, although it has been discussed internally within the company and is on our roadmap. A patch/fix has been suggested, but nothing official has been released or implemented yet, and it doesn't look like anything will be released for a while. :(