Pretty cool idea that leans into Lemmy’s potential as a rich, federated blogging platform. Essentially, with this tool, a Lemmy post along with its comments can be embedded in a static web page of your choice.

cross-posted from: https://aussie.zone/post/1244281

This is a great way to include comments and discussion on a static site. Take a look at the demo.

  • sugar_in_your_tea · 1 year ago

    Could this help with SEO? I don’t think I’ve seen a lemmy instance show up in a search yet.

    • levi@aussie.zone · 1 year ago

      The short answer is no. It doesn’t create a static page from lemmy comments. It loads dynamic lemmy comments into static pages.

      • sugar_in_your_tea · 1 year ago

        If they include links back to the original lemmy instance, it should help a bit. At least in the past the ranking algorithm liked seeing lots of links from other sites.

        I’m going to need to brush up on my SEO skills. That’s really the last thing I’m looking for.

        • levi@aussie.zone · 1 year ago

          Sadly no, it won’t help in that way either (although that wasn’t the intent).

          The lifecycle of a webpage in a browser is something like:

          • download the page - this includes the text content, plus links to other resources like styling (CSS), logic (JavaScript), and images
          • start downloading those other resources
          • render the text and other resources as they arrive
          • start manipulating the page with JavaScript, which in this case includes a final step:
          • download the lemmy post and render it - the post, its comments, and a link back to the original (see the sketch after this list)
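
          To make that last step concrete, here’s a minimal sketch of the client-side approach, using Lemmy’s public v3 HTTP API rather than LBS’s actual code (the instance URL, post id, and container id are placeholders):

          ```typescript
          // Sketch: fetch a Lemmy post and its comments in the browser, then
          // render them into a placeholder element. This illustrates the
          // approach described above; it is not LBS's actual implementation.
          const INSTANCE = "https://aussie.zone"; // placeholder instance
          const POST_ID = 1244281;                // placeholder post id

          async function loadComments(): Promise<void> {
            const container = document.getElementById("lemmy-comments"); // hypothetical id
            if (!container) return;

            // GET /api/v3/post and GET /api/v3/comment/list are Lemmy's public endpoints.
            const post = await fetch(`${INSTANCE}/api/v3/post?id=${POST_ID}`)
              .then((r) => r.json());
            const comments = await fetch(
              `${INSTANCE}/api/v3/comment/list?post_id=${POST_ID}&max_depth=8`
            ).then((r) => r.json());

            // Render the post title, each comment, and a link back to the original.
            const title = document.createElement("h3");
            title.textContent = post.post_view.post.name;
            container.appendChild(title);

            for (const cv of comments.comments) {
              const p = document.createElement("p");
              p.textContent = `${cv.creator.name}: ${cv.comment.content}`;
              container.appendChild(p);
            }

            const link = document.createElement("a");
            link.href = post.post_view.post.ap_id;
            link.textContent = "View the original post";
            container.appendChild(link);
          }

          loadComments();
          ```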

          When Google and other search engines crawl the web looking for data to include in search, they generally only index content from step 1, so they would only see the parameters passed to LBS (shown as “declaration” in the demo), and nothing rendered by LBS.

          This is the “static” nature of static sites. The page downloaded by the browser is not built on request; rather, it’s only built periodically by the author - usually whenever they have an update or another post. I haven’t posted anything on my blog in months, so the pages wouldn’t have been rebuilt during that time.

          There are benefits to this strategy: it’s very secure, very robust, very fast, and very easy to host. The disadvantage is that any dynamic or “up to date” content (like comments from lemmy) needs to be prepared by the client, and thus cannot be included in step one above and indexed in search.

          There is a best-of-both-worlds approach (SSR) where you render all the comments when a page is originally built, then update them when the client later renders the page. This means there’s at least something for search indexers to see, even if it’s not up to date. The problem is that there’s a plethora of different engines people use to build pages, and I can’t make a plugin for all of them, or even a few.
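
          For illustration, the build-time half of that approach could look something like this (a sketch assuming a Node 18+ build script and the same public API; the file paths and placeholder marker are hypothetical):

          ```typescript
          // Sketch: at build time, fetch comments and bake them into the HTML
          // so crawlers see real content in step 1; the client can still
          // refresh them later. Paths and the placeholder are hypothetical.
          import { readFile, writeFile } from "node:fs/promises";

          const INSTANCE = "https://aussie.zone"; // placeholder
          const POST_ID = 1244281;                // placeholder

          async function prerender(): Promise<void> {
            const { comments } = await fetch(
              `${INSTANCE}/api/v3/comment/list?post_id=${POST_ID}`
            ).then((r) => r.json());

            // Render comments to plain HTML once, at build time.
            // (Real code would escape the content before interpolating it.)
            const staticComments = comments
              .map((cv: any) => `<p>${cv.creator.name}: ${cv.comment.content}</p>`)
              .join("\n");

            // Swap a placeholder in the page template for the baked-in comments.
            const template = await readFile("src/post.html", "utf8");
            await writeFile(
              "dist/post.html",
              template.replace("<!-- LEMMY_COMMENTS -->", staticComments)
            );
          }

          prerender();
          ```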

          With all that in mind, this is fantastic feedback, and it’s why I posted this pre-alpha demo. Lots of commenters have said the same thing. I can refactor to at least make SSR easier.

          • sugar_in_your_tea · 1 year ago

            I’ll have to play with it. I haven’t looked through the code, but I’m pretty sure the default Lemmy UI isn’t SEO-friendly either (though kbin probably is), so it would be cool to have a tool that provides it.

            But maybe it doesn’t make sense for it to be your tool. I think it would be cool to have a lemmy feature where it’ll serve a cached, static page if it detects a bot is crawling it or the client doesn’t support JavaScript, and serve the regular site otherwise. Maybe your tool is the right option, or maybe someone should build a different one. IDK.

            • levi@aussie.zone · 1 year ago

              You’re right, this tool isn’t designed to address this problem and is ill-suited to it.

              Lemmy should definitely render a static page and then “hydrate” it with JavaScript in the client. This is a common problem with modern JS apps; SSR (server-side rendering) is the solution, but it can be very complex. You really need to build the whole app with SSR in mind; it’s not something to bolt on as an additional feature at the end.
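
              In the abstract, the pattern looks something like this (a framework-free sketch; real SSR frameworks automate both halves and keep them in sync):

              ```typescript
              // Sketch of SSR + hydration, framework-free for clarity. The
              // server and client halves are shown together; in practice
              // they'd live in separate bundles.

              // Server side: build the complete HTML response, embedding the
              // state the client will need, so crawlers see real content in
              // the very first response.
              function renderPage(comments: string[]): string {
                const items = comments.map((c) => `<li>${c}</li>`).join("");
                return (
                  `<ul id="comments">${items}</ul>` +
                  `<script id="state" type="application/json">${JSON.stringify(comments)}</script>`
                );
              }

              // Client side: "hydrate" by reading the embedded state and
              // attaching behavior to markup that already exists, instead of
              // re-rendering it from scratch.
              function hydrate(): void {
                const state: string[] = JSON.parse(
                  document.getElementById("state")!.textContent!
                );
                document.getElementById("comments")!.addEventListener("click", () => {
                  console.log(`hydrated ${state.length} comments without re-rendering`);
                });
              }
              ```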

              • sugar_in_your_tea · 1 year ago

                I think we could get away with sniffing the user agent and serving up static content instead if it looks like a bot.

                I’m a full-stack dev by day, but everything I’ve worked on has been a web app that doesn’t need SEO, so I’m not exactly sure how that works in practice. But presumably we could generate and cache a basic webpage periodically for each post, for use by bots and perhaps for accessibility (e.g. very basic HTML that’s just good enough to use with links or something). It would have the same content as the main page, but none of the JS or CSS.

                It shouldn’t be too hard to render with a templating library.
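
                A rough sketch of how that might fit together (Express-flavored; the bot patterns, cache, and renderer are all hypothetical stand-ins, not an existing Lemmy feature):

                ```typescript
                // Sketch: serve a pre-rendered, JS-free page to crawlers and
                // fall through to the regular app otherwise.
                import express from "express";

                const app = express();
                const BOT_UA = /(googlebot|bingbot|duckduckbot|crawler|spider)/i; // illustrative, not exhaustive

                // Hypothetical cache of periodically regenerated pages, keyed by post id.
                const staticCache = new Map<string, string>();

                // Same content as the main page, but plain HTML with no JS or CSS.
                function renderBasicPage(post: { title: string; comments: string[] }): string {
                  return (
                    `<html><body><h1>${post.title}</h1>` +
                    post.comments.map((c) => `<p>${c}</p>`).join("") +
                    `</body></html>`
                  );
                }

                // Periodically regenerate the cached page for a post (triggered elsewhere).
                function refreshCache(id: string, post: { title: string; comments: string[] }): void {
                  staticCache.set(id, renderBasicPage(post));
                }

                app.get("/post/:id", (req, res, next) => {
                  if (BOT_UA.test(req.get("user-agent") ?? "")) {
                    const cached = staticCache.get(req.params.id);
                    if (cached) return res.type("html").send(cached);
                  }
                  next(); // everyone else gets the regular JS-driven UI
                });
                ```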