This topic came to life again recently while I was working on a very demanding project. A high-load website with heavy calculations needs proper caching beyond Couch's somewhat lightweight built-in caching system.
Since the snippet-based solution (as above) wasn't available, another way had to be found. Let's say we need to show points of interest on a map, related to a place. Let's also add some constraints: the map has a legend which could look like this:
[restaurants] - 300 restaurants in this area
-- [vegetarian r] - 30
-- [mexican r] - 50
-- [chinese food r] - 150
-- [international cuisine r] - 50
-- [other r] - 20
The same goes for
[other local businesses]
The idea is that each main category shows localities filtered by both location and a subcategory. The problem is that the amount of calculation is massive: a page takes 60 seconds to generate with default tags, 10-15 seconds with optimized settings for <cms:pages>, and 5-7 seconds with a direct <cms:query>, which bypasses the default routines and fetches data from the database with a raw SQL request.
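To illustrate why a direct SQL request is so much faster: the per-subcategory counts from the legend above can be computed in one aggregate query over a flat table, instead of generating and counting individual pages. This is a minimal sketch using sqlite3 so it's self-contained; the table and column names (`pois`, `category`, `subcategory`, `lat`, `lng`) are hypothetical, not Couch's actual schema.

```python
import sqlite3

# Hypothetical flat table of points of interest.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pois (
        id INTEGER PRIMARY KEY,
        category TEXT,      -- e.g. 'restaurant'
        subcategory TEXT,   -- e.g. 'vegetarian'
        lat REAL,
        lng REAL
    )
""")

# A few sample points inside and outside the viewport.
rows = [
    ('restaurant', 'vegetarian', 50.45, 30.52),
    ('restaurant', 'mexican',    50.46, 30.53),
    ('restaurant', 'mexican',    50.44, 30.51),
    ('business',   'pharmacy',   50.47, 30.50),
]
conn.executemany(
    "INSERT INTO pois (category, subcategory, lat, lng) VALUES (?, ?, ?, ?)",
    rows,
)

# Legend counts for the current map viewport in a single GROUP BY,
# instead of looping over every fetched page.
legend = conn.execute("""
    SELECT subcategory, COUNT(*) AS cnt
    FROM pois
    WHERE category = 'restaurant'
      AND lat BETWEEN 50.40 AND 50.50
      AND lng BETWEEN 30.40 AND 30.60
    GROUP BY subcategory
    ORDER BY cnt DESC
""").fetchall()
print(legend)  # → [('mexican', 2), ('vegetarian', 1)]
```

The database does the filtering and counting in one pass, which is exactly what the page-generation route cannot do.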
It only gets worse from there, when the map is zoomed out or when the user applies additional filters [currently open; average bill; good reviews; public transportation available].
For such a situation one might expect a multi-server approach with tens of thousands invested in hardware and programming, and a very enterprise-grade CMS to handle the task.
The fact is, this can still be done while keeping ties with Couch, given smart caching mechanisms. One solution could be cached snippets of pregenerated datasets, but it has its cons. Another solution, which I discovered recently and am currently implementing, is caching in the form of pregenerated tables in the MySQL database. Users still send requests the normal way from the front-end with Couch <cms:query>, but the data is served from the caching tables rather than the regular, monstrous couch_pages table (yes, it holds all cloned pages from the whole website).
Dynamic requests to these tables run 10-100x faster than the usual fetching. The salt of this approach is the ability to place triggers in MySQL, so that whenever any info in the admin panel is inserted, updated, or deleted, the relevant info is inserted, updated, or deleted in the generated tables. This makes it a very plug-and-play approach: set everything up once and it just works. In other databases this is called a materialized view. Google it, because it's a really cool feature. MySQL doesn't support it natively, but there are ways to bend it and make it behave like its more established rivals.
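Here is a minimal sketch of the "materialized view via triggers" idea. It uses sqlite3 so the example is self-contained and runnable; MySQL's trigger syntax differs slightly (`DELIMITER`, `FOR EACH ROW`), and all table and column names here (`pois`, `poi_counts`) are hypothetical, not the actual project schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Source table (stand-in for the data normally living in couch_pages).
CREATE TABLE pois (
    id INTEGER PRIMARY KEY,
    subcategory TEXT
);

-- Cached summary table: the hand-made "materialized view".
CREATE TABLE poi_counts (
    subcategory TEXT PRIMARY KEY,
    cnt INTEGER NOT NULL
);

-- Keep the cache in sync on every INSERT into the source table ...
CREATE TRIGGER pois_after_insert AFTER INSERT ON pois
BEGIN
    INSERT OR IGNORE INTO poi_counts (subcategory, cnt)
        VALUES (NEW.subcategory, 0);
    UPDATE poi_counts SET cnt = cnt + 1
        WHERE subcategory = NEW.subcategory;
END;

-- ... and on every DELETE.
CREATE TRIGGER pois_after_delete AFTER DELETE ON pois
BEGIN
    UPDATE poi_counts SET cnt = cnt - 1
        WHERE subcategory = OLD.subcategory;
END;
""")

# Simulate admin-panel edits: the cache table updates itself.
conn.execute("INSERT INTO pois (subcategory) VALUES ('mexican')")
conn.execute("INSERT INTO pois (subcategory) VALUES ('mexican')")
conn.execute("INSERT INTO pois (subcategory) VALUES ('vegetarian')")
conn.execute("DELETE FROM pois WHERE id = 3")

counts = conn.execute(
    "SELECT subcategory, cnt FROM poi_counts ORDER BY subcategory"
).fetchall()
print(counts)  # → [('mexican', 2), ('vegetarian', 0)]
```

The front-end then reads only from the small summary table, while the triggers keep it consistent with the source data on every write, with no cron jobs or manual cache invalidation.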
Hope this helps someone who suffers from very slow operations on the server.
I'd gladly prepare some documentation and a tutorial for this once I'm through and finally have some time.
P.S. Currently the database looks like this. I am working on extending the above approach and will probably also ease sorting, as sorting by page names is incredibly demanding with a million rows.