It would be interesting to know more about your use case and why you need to rate limit the requests. The obvious reasons include:
- Security: the need to block spurious requests from bots or other nuisance requests.
- Reducing load on the CMS service: large asset listings may cause long-running requests, and when there are no updates to the page it's an unnecessary performance hit.
Using sessions:
You could set a session variable based on an incrementing hit count, then use a condition to manage the visibility of the content or to redirect the user. This has the downside that every request still has to run on the server.
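A minimal sketch of that approach, assuming a Node/Express app with express-session; the hitCount key, the five-view limit, and the /listing route are all hypothetical placeholders:

```typescript
import express from "express";
import session from "express-session";

// Let TypeScript know about our custom session field.
declare module "express-session" {
  interface SessionData {
    hitCount?: number;
  }
}

const app = express();
app.use(session({ secret: "change-me", resave: false, saveUninitialized: true }));

app.get("/listing", (req, res) => {
  // Increment the per-session hit count on every request.
  const hits = (req.session.hitCount ?? 0) + 1;
  req.session.hitCount = hits;

  if (hits > 5) {
    // Over the limit: skip the heavy content and redirect instead.
    return res.redirect("/rate-limited");
  }
  res.send("full asset listing here");
});

app.listen(3000);
```

Note that even the rejected requests still cost a round trip to the server, which is the drawback mentioned above.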
Using client side:
You could load the content using AJAX and simply use JavaScript to check a cookie before fetching it. This has the downside that bots can ignore it easily.
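A minimal sketch of the client-side version; the viewCount cookie name, the five-view limit, the listing element id, and the /api/listing endpoint are hypothetical:

```typescript
// Read the current view count from a cookie (0 if not set yet).
function getViewCount(): number {
  const match = document.cookie.match(/(?:^|; )viewCount=(\d+)/);
  return match ? parseInt(match[1], 10) : 0;
}

async function loadListing(): Promise<void> {
  const views = getViewCount() + 1;
  // Persist the incremented count for a day.
  document.cookie = `viewCount=${views}; path=/; max-age=86400`;

  const target = document.getElementById("listing")!;
  if (views > 5) {
    // Over the limit: show a placeholder instead of fetching the content.
    target.textContent = "You have reached the view limit for today.";
    return;
  }
  const res = await fetch("/api/listing");
  target.innerHTML = await res.text();
}

loadListing();
```

Anything that doesn't honour cookies (most scrapers) walks straight past this, so treat it as a courtesy limit only.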
Realistically, the better solutions will use either:
- Network controls that identify and block bots or other traffic you want to limit (Cloudflare, for example).
- Cache controls: increasing your cache lifetime to hold the page for longer means repeat requests put no strain on the server (see the sketch below).
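For the cache option, a minimal sketch of setting a long cache lifetime on the response, again assuming Express; the one-hour max-age and the /listing route are hypothetical:

```typescript
import express from "express";

const app = express();

app.get("/listing", (req, res) => {
  // Cache publicly for an hour; a browser or CDN in front
  // will absorb repeat hits without touching the CMS.
  res.set("Cache-Control", "public, max-age=3600");
  res.send("full asset listing here");
});

app.listen(3000);
```

With a CDN like Cloudflare in front, that header also lets the edge serve the page, so most repeat views never reach your origin at all.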