6 comments

  • meander_water 1 hour ago
    Isn't this just HATEOAS as espoused by libraries like htmx, datastar, hotwire, etc.?

    https://htmx.org/essays/hateoas/

    • sheept 1 hour ago
      No, they're not related. The submission's article is about CSS view transitions. HATEOAS seems to involve interacting with an API, but linking between pages on a blog does not require this.
      • regularfry 55 minutes ago
        HATEOAS does not require interacting with an API.
  • bingemaker 1 hour ago
    I have a question: After clicking on a blog in the listing page ("Collective Speed is..."), the page navigated to that particular blog. What CSS transitions are used to convert that title to a header? I saw some animation which pushed that title to become a header. How does that work? I'm curious
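    For what it's worth, this effect is typically achieved with cross-document view transitions; a minimal sketch, assuming the site opts in (the class name here is hypothetical, not taken from the site):

```css
/* Opt the site into transitions between same-origin page loads. */
@view-transition {
  navigation: auto;
}

/* Give the title the same view-transition-name on the listing page
   and on the article page; the browser then morphs one into the other. */
.post-title {
  view-transition-name: post-title;
}
```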
  • cluckindan 2 hours ago
    This is close to how things used to be, in the time before server-side includes.
  • camillomiller 1 hour ago
    I dunno, it wants to challenge our dependence on JavaScript, and then to make it work it needs to inject a “back” behavior into a normal link?

    JS and fallbacks for menus are a solved issue. This is just another form of LLM Dunning-Kruger derangement, where you think the LLM-suggested solution is novel because you haven’t encountered it before, or because you fundamentally don’t understand the underlying problems that we have already solved.

    • sheept 1 hour ago
      Yeah, I don't think the menu should've been a separate page. It can be made JavaScript-less as a dialog opened by the popover HTML attributes,[0] and the escape key would be able to close it.

      [0]: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
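      A minimal sketch of such a menu using only the popover attributes (the id and links here are illustrative):

```html
<!-- The button toggles the popover; no JavaScript required. -->
<button popovertarget="site-menu">Menu</button>

<!-- popover defaults to "auto": Escape and clicking outside dismiss it. -->
<nav id="site-menu" popover>
  <a href="/">Home</a>
  <a href="/blog">Blog</a>
</nav>
```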

    • CraigJPerry 1 hour ago
      I guess it doesn't have to use JavaScript for the back behavior. It could use a server-side rendered referrer if that hasn't been stripped by the browser?

      You say that JavaScript and fallbacks for menus are a solved issue, but the number of menus that are just an absolute clusterfuck on the web today is ridiculous. They're really not a solved issue. Progressive enhancement is hard to do. Genuinely hard in some cases.

      On balance, while this is not without flaws, it's interesting. Accessibility, deep linking, reduction in cognitive load for the developer. There's some merit here.

    • OuterVale 1 hour ago
      I'm unsure why you think this was an LLM-suggested solution.
    • snowe2010 1 hour ago
      Why do you think this is LLM?
  • jazzypants 2 hours ago
    I just don't see the appeal when it's much easier to just build a nice website using JavaScript.

    Google Search doesn't work without JavaScript.

    Seriously, what's the point? Don't just reflexively downvote me. Try to articulate why this is a good idea. It's not that hard to use your words.

    • OuterVale 1 hour ago
      Your pre-emptive hostility seems rather unwarranted.

      This article is my usual go-to and lists several reasons why JavaScript might not be available, and thus why you shouldn't take it for granted: https://piccalil.li/blog/a-handful-of-reasons-javascript-won...

      • mr_mitm 1 hour ago
        I feel compelled to add:

        - the user explicitly disabled JavaScript

        - the browser does not support JavaScript (I sometimes view websites using elinks)

        AFAIK screen readers also work better without JavaScript, so it's also an accessibility issue.

        • jazzypants 1 hour ago
          JavaScript doesn't affect screen readers at all unless you dynamically add content without the proper ARIA roles. It is trivial to correct.
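          A sketch of the pattern being described, assuming a live region (the id and message text are illustrative):

```html
<!-- An aria-live region: screen readers announce its contents
     whenever they change, so dynamically added text isn't missed. -->
<div id="status" aria-live="polite"></div>
<script>
  document.getElementById("status").textContent = "3 new results loaded";
</script>
```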

          As I just said, users who explicitly disable JavaScript cannot even use Google Search. Why should I accommodate those users when even Google refuses to do so? They are actively choosing to have a limited web experience. The vast majority of the internet is completely broken for them.

      • jazzypants 1 hour ago
        How am I being hostile? I'm just tired of being downvoted every single time I mention that JavaScript is necessary on the modern web, and attempts to avoid it are quixotic at best.

        That link is not nearly as convincing as you seem to think it is. I suppose that I will need to refute the points if I want you to stop sharing it, so here we go:

        A browser extension has interfered with the site - okay? That can be true of literally anything. An extension can interfere with View Transitions too.

        A spotty connection hasn’t loaded the dependencies correctly - Either they load or they don't. How would the dependencies load "incorrectly"? Does this author know how JavaScript works?

        Internal IT policy has blocked dependencies - How? Are they bundled? Does this author still think modern websites load things like jQuery from a CDN? What year is it? (WYII from this point on, for the sake of brevity)

        WIFI network has blocked certain CDNs - WYII

        A user is viewing your site on a train which has just gone into a tunnel - The CSS and HTML won't load either!

        A device doesn’t have enough memory available - WYII???

        There’s an error in your JavaScript - and you don't have any tests? You didn't notice when developing the site? Can you not have errors in your CSS? Sure, an error in JS is worse, but that doesn't mean you should never use it.

        An async fetch request wasn’t fenced off in a try catch and has failed - This usually wouldn't change anything. fetch failures are rarely actual errors (even a 500 response doesn't result in an exception), and it's async so it wouldn't affect the initial load.
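        The point about fetch can be sketched like this (the endpoint and response shape are hypothetical): fetch only rejects on network-level failure, so an HTTP 500 resolves normally and never reaches a catch block unless you check res.ok yourself.

```javascript
// fetch() resolves even for HTTP error statuses; it only rejects on
// network-level failure. So a missing try/catch often changes nothing.
async function load(fetchImpl) {
  const res = await fetchImpl("/api/data"); // hypothetical endpoint
  if (!res.ok) return { error: res.status }; // must be checked explicitly
  return res.json();
}

// Simulated 500 response: no exception is thrown, the promise resolves.
load(() => Promise.resolve({ ok: false, status: 500, json: () => ({}) }))
  .then((r) => console.log(r.error)); // logs 500
```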

        A user has a JavaScript toggle accidentally turned off - The <noscript> tag exists.

        A user uses a JavaScript toggle to prevent ads loading - <noscript>
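        For reference, the tag in question (the message text here is illustrative):

```html
<!-- Rendered only when scripting is disabled or unavailable. -->
<noscript>
  <p>JavaScript is off; some features are unavailable.</p>
</noscript>
```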

        An ad blocker has blocked your JavaScript from loading - Modern ad blockers are URL based. How are they loading literally anything else from my domain?

        A user is using Opera Mini - No, they aren't.

        A user has data saving turned on - Okay... And!??!?!

        Rogue, interfering scripts have been added by Google Tag manager - Do I really need to explain how module scoping works here?

        The browser has locked up trying to parse your JS bundle - This literally doesn't happen.

        • ivan_gammel 1 hour ago
          >A spotty connection hasn’t loaded the dependencies correctly - Either they load or they don't. How would the dependencies load "incorrectly"?

          Let's say you have 5-7 dependencies to load, but 3 of them time out because your train entered a tunnel. Your app ends up in an incorrect state, fails silently, and the UX degrades unpredictably. This is where conversion often drops visibly, and it's the reason SSR is now a go-to solution for any marketing website.

          • jazzypants 59 minutes ago
            Why am I loading dependencies from 5-7 places? Why is my website not using a bundler if it has so many varied dependencies? Why do we not expect the user to understand that they are in a tunnel without internet?

            Regardless, this isn't really restricted to the usage of JavaScript. The website would likely have pretty bad UX if only half of the CSS loaded correctly, but no one programs defensively around it being absent.