Chrome looks set to ship an LLM Prompt API to the web platform.

Uncategorized · 47 Posts · 25 Posters
This topic has been deleted. Only users with topic management privileges can see it.
  • toldtheworld@mastodon.social wrote:

    @Aedius @firefoxwebdevs but doesn't Firefox already allow us to disable AI features?

    aedius@lavraievie.social replied (#26):

    @toldtheworld @firefoxwebdevs

    Yes, but it means they're still burning money on crap.
  • xela@troet.cafe wrote:

    @firefoxwebdevs Honestly, I currently can't think of any "magical twist" that makes the problems (model neutrality, legal pitfalls) go away.

    Our perspectives seem to differ a bit. To me, yours reads like "is it technically feasible, is it fun to implement?", while mine is more "do I want that in my browser, and which problem does it solve, anyway?" 😁 But that's only my interpretation, of course. 😹

    firefoxwebdevs@mastodon.social replied (#27):

    @xela Eh, I'd say my view is "is this good for the web?", and I don't think this API is. If the technical issues were sorted out, then maybe it would be worth another look, but as I said in the standards position, I think developer desire for this API is being massively overstated by Google.
  • firefoxwebdevs@mastodon.social wrote:

        Chrome looks set to ship an LLM Prompt API to the web platform. At Mozilla, we oppose this API.

        We feel it has a large interoperability risk, and Google imposing T&Cs on a web API sets a dangerous precedent.

        Full details: https://github.com/mozilla/standards-positions/issues/1213#issuecomment-4347988313
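For context on what is being proposed: based on the public Prompt API explainer, page script would create a session against a browser-provided model and send it prompts. The sketch below is hedged — the `LanguageModel` global and its method shapes are taken from the explainer and may change before shipping — and the fallback branch illustrates the interoperability concern, since browsers without a built-in model expose nothing.

```typescript
// Hedged sketch of the proposed Prompt API shape (per the public explainer);
// the LanguageModel global and its methods may change before shipping.

interface LanguageModelSession {
  prompt(input: string): Promise<string>;
}
interface LanguageModelStatic {
  create(): Promise<LanguageModelSession>;
}

// In Chrome the global would be provided by the browser; everywhere else it
// is simply undefined, so pages must feature-detect.
declare const LanguageModel: LanguageModelStatic | undefined;

async function summarize(text: string): Promise<string> {
  if (typeof LanguageModel === "undefined") {
    // Interop fallback: no built-in model available in this browser.
    return "unsupported";
  }
  const session = await LanguageModel.create();
  return session.prompt(`Summarize: ${text}`);
}

summarize("example").then((r) => console.log(r)); // prints "unsupported" outside Chrome
```

Outside of a browser that ships the API, every call takes the fallback path — which is exactly the interoperability gap the Mozilla position describes.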

    scudo@antani.cyou replied (#28):

    @firefoxwebdevs@mastodon.social The same guys that put AI in Firefox said, "we won't put AI in Firefox".
  • toldtheworld@mastodon.social replied (#29):

    @Aedius @firefoxwebdevs Ah, you mean during development? I suppose there's no good alternative right now; I can't imagine a browser developer not using LLMs for coding. The biggest problem I see arises from the lack of transparency about energy usage (and environmental impact) on the part of providers. Blaming them would be more productive. If we can get them to report the true impact of each inference request, I'm pretty sure people will freak out and slow down.
  • aedius@lavraievie.social replied (#30):

    @toldtheworld @firefoxwebdevs

    The alternative is to continue with human development and lower the barrier to entry for contributions.

    LLM development is producing technical debt faster than ever before.
  • stux@mstdn.social wrote:

    @marc_eu @firefoxwebdevs Yes, an AI API.

    placebo@mastodon.ie replied (#31):

    @stux @marc_eu @firefoxwebdevs If only you guys knew how many times I've typed "openai" instead of "openapi" and vice versa.
  • marc_eu@veganism.social wrote:

    @stux @firefoxwebdevs
    So Mozilla's reaction isn't so strange, then?

    yokhai@gaygeek.social replied (#32):

    @marc_eu If not for the fact that Mozilla ships Firefox with AI chatbots despite a majority of people telling them "don't do that"... this *would* have been a noble cause.
  • firefoxwebdevs@mastodon.social wrote:

    @valpackett Yeah, "strongly positive" seems so misrepresentative that it'd break Google's T&Cs if it were fed to the Prompt API.

    phl@mastodon.social replied (#33):

    @firefoxwebdevs @valpackett Incidentally, that links to a GitHub markdown file which itself says merely "positive" and links further to an issue with ONE comment, a blog that doesn't exist, and two other things. That's not exactly overwhelming support and excitement.
  • marc_eu@veganism.social replied (#34):

    @yokhai
    Yeah, but in all fairness, they're focusing only on local LLMs and, more importantly, they implemented an AI kill switch that turns every AI feature off and is enabled (= no AI) by default.
  • rejzor@mastodon.social replied to the original post (#35):

    @firefoxwebdevs Thing is, Google doesn't care what anyone thinks, least of all Mozilla, unfortunately, because they own the internet with Chrome. And they push shit that entirely benefits them, not the internet as an ecosystem.
  • yoasif@mastodon.social replied (#36):

    @marc_eu @yokhai None of their LLMs are local; what are you talking about?

    PS: Link Previews is enabled by default (and is disabled by the kill switch - weird, right?): https://www.quippd.com/writing/2026/01/06/architecting-consent-for-ai-deceptive-patterns-in-firefox-link-previews.html
  • wcbdata@vis.social wrote:

    @firefoxwebdevs I'm assuming @Vivaldi will disable the whole thing, yes?

    techienotnetie@social.vivaldi.net replied (#37):

    @wcbdata @firefoxwebdevs @Vivaldi We have been disabling Gemini (GLIC) at compile time for a while (and we needed to redo that recently after the Chromium team removed all the ifdefs). Several other features are disabled by overriding the "is this feature enabled?" logic, and in this case, AFAICT, this particular API depends on a component that is already disabled that way. (Actually disabling the code for most of those features when building would require hundreds of large and small patches, which would be a maintenance nightmare; I just tried that last week.)
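The approach described above — overriding one central "is this feature enabled?" check rather than patching every call site — can be illustrated with a small sketch. All names here are hypothetical; this is not Chromium or Vivaldi code, just the shape of the pattern: a downstream override table wins over upstream defaults, so a single map entry disables a feature at every call site.

```typescript
// Illustrative sketch of a central feature-enable check with a downstream
// override table. Names are hypothetical, not actual Chromium identifiers.

type FeatureState = "enabled" | "disabled";

// Upstream defaults, as a downstream fork might receive them.
const upstreamDefaults: Record<string, FeatureState> = {
  PromptAPI: "enabled",
  Glic: "enabled",
  Translate: "enabled",
};

// The fork's single override point: one entry here turns a feature off at
// every call site that goes through isFeatureEnabled(), with no ifdef patches.
const downstreamOverrides: Record<string, FeatureState> = {
  PromptAPI: "disabled",
  Glic: "disabled",
};

function isFeatureEnabled(name: string): boolean {
  const state = downstreamOverrides[name] ?? upstreamDefaults[name] ?? "disabled";
  return state === "enabled";
}

console.log(isFeatureEnabled("PromptAPI")); // false: overridden downstream
console.log(isFeatureEnabled("Translate")); // true: upstream default kept
```

The design trade-off the poster describes follows from this: the overridden feature's code still ships in the binary (removing it would mean hundreds of patches), but it is unreachable at runtime.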
  • sterpeto@social.vivaldi.net replied to the original post (#38):

    @firefoxwebdevs w3m and lynx, our only hopes.
  • rafaelmartins@mastodon.social replied to the original post (#39):

    @firefoxwebdevs It is too late to pretend that you care...
  • wcbdata@vis.social replied (#40):

    @TechieNotNetie @firefoxwebdevs @Vivaldi Excellent, thank you!
  • firefoxwebdevs@mastodon.social replied (#41):

    @rafaelmartins I raised these same concerns on a podcast two years ago, before I joined Mozilla: https://offthemainthread.tech/episode/chromes-llm-ai-api-omg/

    So if this is pretend, then wow, I'm really committing to the bit.
  • marc_eu@veganism.social replied (#42):

    @yoasif @yokhai
    "None" is incorrect.

    I just did a fresh install of Firefox. AI is off by default.

    Firefox has local LLMs, e.g. for translation. More will follow (I can't find the article about that now).

    But yes, *as an option* you can also add third-party (online) LLMs.

    And on link previews: "Optionally, you can also use AI to read the beginning of the page and generate a few bullet points. To prioritize your privacy, the AI works on your device. This means you'll need at least 3 GB of available RAM to use the optional AI."

    Keywords: "optional" and "local".

    Linked: "On-device AI models in Firefox" and "Preview webpages in Firefox with link preview" (support.mozilla.org)
  • yoasif@mastodon.social replied (#43):

    @marc_eu @yokhai None of the LLMs that could serve Google's "Prompt API" proposal to the web platform is local in Firefox.

    Context matters.
  • marc_eu@veganism.social replied (#44):

    @yoasif @yokhai
    Yeah, context.

    If you had read what I was responding to, you would have known the context of my response.

    You are the one jumping out of context.

    But let's end it here. Agree to disagree.

    #noAI #firefox
  • yoasif@mastodon.social replied (#45):

    @marc_eu @yokhai You said that Mozilla is focusing on local LLMs, which is obviously not true, and it doesn't even make contextual sense, since none of the local LLMs in Firefox can serve a prompt API; they don't respond to prompts.

    It isn't "agree to disagree": your response was to a comment about chatbots that can respond to prompts, unlike the local LLMs, which do not.