9 comments

  • recursivedoubts 9 minutes ago
    I continue to maintain that the best metaphor for the current situation in software development is "The Sorcerer's Apprentice" in Fantasia:

    https://www.youtube.com/watch?v=m-W8vUXRfxU

  • aitchnyu 3 minutes ago
    One dev of a Lovable competitor pointed me to the rules that are supposed to ensure queries are limited to that user's data. This seems like "pretty please?" to my amateur eyes.

    https://github.com/dyad-sh/dyad/blob/de2cc2b48f2c8bfa401608c...
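    To make the "pretty please" point concrete: prompt-level rules can only ask the model nicely, while real tenancy enforcement has to live in the data layer. A minimal sketch (the table and helper are invented for illustration, not taken from dyad):

```python
# Hypothetical sketch: the tenancy filter comes from the verified
# session, never from client input or from prompt "rules".
import sqlite3

def fetch_notes(conn, authed_user_id):
    # Unconditional owner filter: there is no code path that can return
    # another user's rows, no matter what the client asks for.
    return conn.execute(
        "SELECT id, body FROM notes WHERE owner_id = ?",
        (authed_user_id,),
    ).fetchall()

# Demo with an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes (id INTEGER, owner_id INTEGER, body TEXT)")
conn.executemany(
    "INSERT INTO notes VALUES (?, ?, ?)",
    [(1, 1, "alice's note"), (2, 2, "bob's note")],
)
print(fetch_notes(conn, 1))  # only alice's rows come back
```

    With Supabase specifically, the equivalent guardrail is a Postgres row-level-security policy, which holds regardless of what the generated application code does.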

  • firefoxd 1 hour ago
    Lovable is marketed to non-developers, so their core users wouldn't recognize a security flaw if it flashed red. A lot of my non-dev friends were posting their cool new apps they built on LinkedIn last year [0]. Several were made on Lovable. It's not on their users to understand these flaws.

    The apps all look the same with a different color palette, and make for an engaging AI post on LinkedIn. Now they are mostly abandoned, waiting for the subscription to expire... and for their personal data to get exposed, I guess.

    [0]: https://idiallo.com/blog/my-non-programmer-friends-built-app...

    • alfiedotwtf 1 hour ago
      Developers with decades of experience still introduce basic security holes. The general public is screwed once they start hosting their own apps and serving them on the Internet.
      • cube00 55 minutes ago
        There's something so innocent about the early days when even Microsoft thought we'd be running Personal Web Servers and hosting our own websites in a peer-to-peer fashion.

        Although cynically, in 1996 Microsoft would probably tell you anything you wanted to hear if it got you using Internet Explorer.

        > The Personal Web Server is ideal for intranets, homes, schools, small business workgroups and anyone who wants to set up a personal Web server.

        https://news.microsoft.com/source/1996/10/24/microsoft-annou...

        • eddythompson80 3 minutes ago
          I've always held the belief that we (as programmers and as an industry) failed the initial premise of the "distributed internet". On one hand, the core of the internet (whether it's ARPANET or even TCP/IP) was designed to be fully distributed, trustless, self-hostable, etc. The idea was that if you want email you do a `pkg_add email`; want a file server, `pkg_add file-server`; want remote access, `pkg_add openssh`; and you're done. But what we have today is [1].

          Securing all that got very technical and nuanced, with hundreds of complex scenarios, tools, and protocols. Tech companies raced to produce services the mass public can use, hiring hordes of very smart, expensive, technical developers to build and secure them, and they still get it wrong frequently. Meanwhile, the FOSS community adopted the "get good or gtfo" approach, as in [1].

          The average person has no chance. That's why closed, walled-garden platforms like iOS and Android are winning.

          1: https://www.youtube.com/watch?v=40SnEd1RWUU

  • carlgreene 1 hour ago
    The hardest part about this stuff is that as a user, you don't necessarily know if an app is vibe-coded or not. Previously, you were able to have _some_ reasonable expectation of security in that trained engineers were the ones building these things out, but that's no longer the case.

    There's a lot of cool stuff being built, but also as a user, it's a scary time to be trying new things.

    • 627467 5 minutes ago
      The frequency with which I see contemporary apps updating (sometimes multiple times a day) suggests a change in culture that also makes professionals prone to mistakes.

      I get that we'll never ship a perfect release, but if you have to push fixes once a day it seems you've lost perspective.

      Vibe coding sloppiness is more acceptable now because we've lowered our standards.

    • yoyohello13 1 hour ago
      Yeah, my trust for new open source projects is in the toilet. Hopefully we will eventually start taking security seriously again after the vibe code gold rush.
      • dizhn 2 minutes ago
        This applies to all software, not just open source.
      • esseph 1 hour ago
        > Hopefully we will eventually start taking security seriously again after the vibe code gold rush.

        Companies don't take security seriously now (and didn't before vibe coding, either).

    • ctoth 1 hour ago
      I'm sorry, what?

      > Previously, you were able to have _some_ reasonable expectation of security in that trained engineers were the ones building these things

      When was this? What world? Did I skip worldlines? Is this a new Universe?

      The world I remember is that anybody could write a program and put it on the Internet. Is this not the world you remember?

      Further, when those engineers were "trained" ... were there no data breaches before 2022?

      • carlgreene 1 hour ago
        Of course there were. Don't be pedantic. Anybody could write a program and put it on the internet. But to get a reasonably polished version with decent features and an enjoyable enough UX for someone to sign up and even pay money for, it generally took people who kind of knew what they were doing.

        Of course shortcuts were taken. They always were and always will be. But don't try to compare shipping software today to even just 3 years ago.

        • kimixa 1 hour ago
          Yes - AI has completely destroyed the set of "signals" people used to judge the quality of much software. They were never 100% accurate, sure, but they were often pretty good heuristics for "level of care": what the devs considered important (or didn't consider important) and similar.

          And I mean that as both "end user" software signals, and "library" signals for other devs.

          I assume that set of signals will slowly be updated. Whether one of those ends up being "any use of AI at all" is still an open question, depending on whether the promised hype actually ends up meeting capability, as much as anything.

  • melecas 1 hour ago
    Vibe coding democratized shipping without democratizing the accountability. The 18,000 users absorbed the downside of a risk they didn't know they were taking.
    • shimman 28 minutes ago
      I don't think you know what democracy means. Democracy means that users can reject poorly made apps; if you can't reject or destroy something, it's not a democratic process.

      Having someone dump shitty wares onto the public is only democracy if you think being held unaccountable is democratic.

      • SetTheorist 22 minutes ago
        One of the meanings of the word "democratization" is "the action of making something accessible to everyone", which is clearly the sense meant here.
    • andersmurphy 39 minutes ago
      With the power of LLMs anyone can make and sell foot guns.
  • ch4s3 1 hour ago
    I've been thinking a bit about how to do security well with my generated code. I've been using tools that check deps for CVEs, static tools that check for SQL injection and similar problems, and baking some security requirements into the specs I hand Claude. I can't tell yet if this is better than what I did before or just theater. It seems like in this case you'd need/want to specify some tests around access.

    I'm interested to hear how other people approach this.
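    One low-tech version of the "tests around access" idea is to pin the access rule itself with assertions that run on every build, so a flipped condition fails loudly instead of shipping. A minimal sketch, with the rule and data model invented for illustration:

```python
# Hypothetical access rule: deny by default, owners only.
from dataclasses import dataclass

@dataclass
class Note:
    owner_id: int
    body: str

def can_read(user_id, note):
    # Unauthenticated callers (user_id is None) get nothing;
    # authenticated callers see only their own notes.
    return user_id is not None and user_id == note.owner_id

# These act as non-skippable regression tests: if generated code ever
# inverts the rule, the build fails.
note = Note(owner_id=1, body="secret")
assert can_read(1, note)         # owner can read
assert not can_read(2, note)     # other users cannot
assert not can_read(None, note)  # anonymous callers cannot
```

    The deny-by-default shape matters: the test for the anonymous caller is the one that would have caught the bug described in the article.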

    • s_ting765 1 hour ago
      Ask the LLM to create a POC for the vulnerability you have in mind. Last time I did this, I had to repeatedly promise the LLM that it was for educational purposes, as it assumed this information was "dangerous".
    • ctoth 1 hour ago
      Same way you handle preserving any other property you want to preserve while "vibecoding" -- ensure tests capture it, ensure the tests can't be skipped. It really is this simple.
  • julianlam 1 hour ago
    > One example of this was a malformed authentication function. The AI that vibe-coded the Supabase backend, which uses remote procedure calls, implemented it with flawed access control logic, essentially blocking authenticated users and allowing access to unauthenticated users.

    Actually, this sounds like a typical mistake a human developer would make. Forget a `!`, or get confused for a second about whether you want true or false returned, and the logic flips.

    The difference is a human is more likely to actually test the output of the change.
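    A toy reconstruction of that failure mode (the names are invented, not taken from the actual codebase):

```python
# One inverted condition grants access to anonymous callers and blocks
# everyone who is actually logged in -- exactly the flip described above.
def has_access_buggy(session_user):
    return session_user is None  # should be: session_user is not None

def has_access_fixed(session_user):
    return session_user is not None

assert has_access_buggy(None) is True      # unauthenticated allowed (the bug)
assert has_access_buggy("alice") is False  # authenticated blocked (the bug)
assert has_access_fixed(None) is False
assert has_access_fixed("alice") is True
```

    The buggy version passes any test that only checks the happy path for an anonymous user, which is part of why a test asserting both directions of the rule is worth writing.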