Here I will try to answer your questions, discuss the subject with you, and share working code involving cURL, cron, background tasks, etc. I don't have recent experience with APIs and tokens, which interests me, because it ties in closely with async tasks. I'm also publishing an addon soon (after feedback from the reviewer) that helps send tasks to the background and run them async/multithreaded. This topic will be used to discuss everything related to these articles. I believe that any website with SEO/marketing aspirations must have something from this list.

Recently, in a discussion with a fellow coucher, I put into words a few thoughts I had never formulated before, though I had felt them constantly throughout my 8 years with Couch. It is also valid for almost any CMS out there. Here is what this is about: we got trapped. We - the designers, the coders, the marketing team, the visitors of the website - are all squeezed very tightly into the 100-300 ms between the request and the browser rendering the page.

Of course, I love Couch, we love Couch, and we can see its beautiful performant tags and very open possibilities, with endless features that can be implemented in PHP on demand to expand this CMS. When I started working with Couch, I felt the same vibe any programmer feels - a command (tag) works and something gets done. Couch can be your personal self-hosted solution for almost any kind of task, from a recipes collector to a smart home manager, given the proper code. But even if we stick to simple one-two-three-page websites and do not expand Couch into other realms, we are lied to about the endless opportunities. Because an opportunity can *not* be "endless" if time is not endless.

Whatever our fantasies are - the marketing team's fantasies about "know-your-client" sort of things, the website owner's fantasies about how reliable everything is, or the visitor's fantasies about how fast and quick everything is - it all ends when the request is fulfilled and the generated page is sent to the browser. Yes, you could do amazing stuff with code and tags, but it is all curtained off from you, because there is no time! Getting the location of the visitor is not an instant thing. Much more time is needed for AI scripts backed by trained neural nets that decide which product to show among related products. How many scripts would a marketing/SEO guy like to put on a website? 5? 20? All of them! But there is no time within the 200-300 ms timeframe.

This is our trap: a *procedure*, repeating itself like the four seasons: request → tags generate HTML → response → die. When do we preprocess/postprocess pictures uploaded to the website? A cron job set for 3 am? The visitor is long gone by then. When do we decide which page to recommend next? NOW! While the visitor (a lead) is *still on the website*. Now - but how? When? HOW??

I am making a huge step towards the new reality, colleagues. Now I think I am stepping beyond the impossible, walking in the forbidden zone for the first time. My first step is physically small, just some code, but the curtain is removed, so the breakout is huge. The beast is gone! And a good feeling persists of having accomplished something important to me. I am giving my fellow couchers all the time in the world, beyond the damned *procedure*. There is no "die" anymore, friends. I mean, the page is still sent to the visitor, and the visitor can even close the browser. But Couch will live on; your code will continue to do amazing things: request outside resources, convert images, create pages, send emails, trigger all those emotional pins on the body of your admin panel and website.

And the code will know where it comes from, who the visitor is, who the neighbor (code) is, what the context is, and, of course, it will know the future. It will 'program' the future, tap-wire it, so when the future comes (e.g. the visitor opens any new page) your code will know it and will be ready for it. And it will be ready for the others too: anonymous visitors, registered users, admin-panel admins, bots, crawlers. You will be able to run a small piece of code for each of them, or run a big piece of code in parallel, split among your visitors on a high-traffic website. We are breaking through; we are not bound anymore. It's the ultimate f.u. to the time constraints and to the death trap. I hope to talk more about it with you.
Fellow couchers, I invite you to test the newly published addon, which caters to removing the last boundaries for your code. It's called "Sling", which means we can 'sling' any task to the background and run it indefinitely if needed - all while keeping the visitors happy with fast page loads.

There is so much more to the Living Code idea. I invite you to join me in the exploration of this adventurous world.

Tweakus-Dilectus » Addon SLING


I have included the following small example in the README, but over time we'll have many more (hopefully also in this thread). It gets the visitor's city in the background. The job is triggered upon the first visit. Once the information is retrieved from the 3rd-party API service, we can safely show the city on the webpage.

- - -

Query external service API

Responses from 3rd-party services may take an unpredictable amount of time. They may even throttle your requests if a hundred visitors suddenly decide to visit your website and clog the usual synchronous execution, with pages of your website not loading for what feels like forever. Let's make the requests async and give our visitors first-class treatment with no waiting time on their page load.

Using the 3rd-party geo service (via the function geoip), we request the visitor's data and show it only when it is ready.

Code:
<!-- try to fetch the visitor's geo data from cache only; no external request is made here -->
<cms:set visitor = "<cms:call 'geoip' cached_only='1' />" is_json='1' />

<cms:if visitor.city>

   <!-- cached data is available - safe to show it -->
   Are you from <cms:show visitor.city />?

<cms:else_if "<cms:not visitor.status />" />

   <!-- nothing cached yet - programming async update of info -->
   <cms:set response = "<cms:call 'geoip' />" is_json='1' />

   <cms:if response.status = '200' && response.city>
      <cms:log "Successfully detected city - <cms:show response.city />" />
   </cms:if>

<cms:else_if visitor.status && visitor.status ne '200' />

   <!-- a response was cached, but it reported an error -->
   <cms:log "Request reported status different from '200'. Review the response:" />
   <cms:log "<cms:show visitor as_json='1' />" />

</cms:if>


In the above snippet (hopefully self-explanatory) there is a lot of 'sugar', i.e. logging some steps in detail. Of course, you don't need most of it. The idea is to not trust the outside API, which may deliver a status different from '200' (e.g. '404'), because servers tend to have some downtime once in a while, or some other error may happen. Therefore, the server's response with the visitor's geo data is first stored in the cache. Subsequent pageviews get the data from the cache (parameter cached_only). And if there was an error in the process, no further requests will be made for this visitor, at least not before you review the log and find out what the issue is.

Logging is important for operations performed completely in the background with no output whatsoever (there is no browser to send the data to). Always prepare for the worst, but enjoy the first-class solutions.
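For instance, the background code can drop progress markers into a dedicated log file via Couch's cms:log tag, so you can reconstruct after the fact what happened while nobody was watching. A minimal sketch - the file name and messages are purely illustrative, and I'm assuming the optional file parameter that redirects output away from the default log file:

Code:
<!-- mark the start of the background job -->
<cms:log "geoip job: request sent" file='geoip.log' />

<!-- ... the job's actual work goes here ... -->

<!-- mark the outcome, so failures leave a trace even with no browser attached -->
<cms:log "geoip job: finished" file='geoip.log' />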

You will find the link to the geoip function in the README with this example.
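Until you open the README, here is a rough idea of the shape such a function could take. This is a hypothetical sketch only, not the actual implementation from the README: it assumes the cms:func/cms:call pair from the tags-collection and uses ipapi.co purely as an illustrative endpoint, returning JSON with the 'status' and 'city' keys the snippet above expects.

Code:
<cms:func 'geoip'>
   <cms:php>
      // Hypothetical sketch - the real 'geoip' function is linked in the README.
      // Queries a geolocation API for the current visitor's IP and prints JSON
      // with the 'status' (HTTP code) and 'city' keys used in the example above.
      $ip = $_SERVER['REMOTE_ADDR'];
      $ch = curl_init( 'https://ipapi.co/' . $ip . '/json/' ); // illustrative endpoint
      curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 1 );
      curl_setopt( $ch, CURLOPT_TIMEOUT, 10 );
      $body   = curl_exec( $ch );
      $status = curl_getinfo( $ch, CURLINFO_HTTP_CODE );
      curl_close( $ch );

      $data = ( $status == 200 && $body ) ? json_decode( $body, true ) : array();
      echo json_encode( array(
         'status' => (string)$status,
         'city'   => isset( $data['city'] ) ? $data['city'] : '',
      ) );
   </cms:php>
</cms:func>

Note that caching and background execution are not this function's job - the cached_only logic of cms:call and Sling itself take care of those, which is why the function can remain a plain synchronous fetch.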