4 comments

  • emregucerr 40 minutes ago
    I would love to see someone build it as some kind of SDK. App builders could use it as a local-LLM plugin when handling sensitive data.

    It's usually too much when an app asks someone to set up a local LLM, but I believe this could solve that problem?

  • montroser 35 minutes ago
    Not sure if I actually want this (pretty sure I don't) -- but very cool that such a thing is now possible...
  • avaer 2 hours ago
    There's also the Prompt API, currently in Origin Trial, which supports this API surface for sites:

    https://developer.chrome.com/docs/ai/prompt-api

    I just checked the stats:

      Model Name: v3Nano
      Version: 2025.06.30.1229
      Backend Type: GPU (highest quality)
      Folder size: 4,072.13 MiB
    
    Different use case but a similar approach.

    I expect that at some point this will become a native web feature, but not anytime soon, since the model download is many multiples of the size of the browser itself. Maybe at some point these APIs could use LLMs built into the OS, the way we already rely on the OS for graphics drivers.
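
    For anyone curious what the Prompt API looks like in practice, here's a rough sketch of feature detection plus a prompt call. This follows the shape shown in the Chrome docs at the time of writing (a global `LanguageModel` object); the API is in Origin Trial and the names may well change, so treat this as illustrative rather than definitive:

```javascript
// Sketch only: the Prompt API is an Origin Trial feature; names may change.
// In Chrome with the trial enabled, a global LanguageModel object is exposed.
// Everywhere else, the global is simply absent.
function promptApiStatus() {
  return typeof LanguageModel === "undefined" ? "unavailable" : "available";
}

// Rough usage, assuming availability (per the docs: create a session, then
// prompt it; the model runs locally, so data never leaves the machine):
async function summarize(text) {
  if (promptApiStatus() !== "available") {
    throw new Error("Prompt API not available in this browser");
  }
  const session = await LanguageModel.create();
  return session.prompt(`Summarize in one sentence:\n${text}`);
}
```

    The feature-detection step matters because the API only exists in browsers that ship the trial; any site using it needs a fallback path.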
