I'm going to say something that's been festering in my mind for a while now.

Uncategorized · 39 Posts · 21 Posters · 61 Views
da_667@infosec.exchange wrote:

I'm going to say something that's been festering in my mind for a while now. In my two decades of practice in information security, I have yet to see responsible disclosure result in measurably better security posture.

Code quality hasn't improved, patch management hasn't improved, minimum viable product hasn't improved, automated security updates, especially for IoT devices... Jesus Fucking Christ, haven't improved. The cost of failure for organizations losing your data due to gross negligence has in no way improved. Why should responsibility be the domain of the security researcher when nobody else is willing to share in that responsibility?

I'm half-tempted to say if you have 0-days you might as well get paid for them rather than be responsible. Because even with a tilted playing field, nothing has measurably improved since I've been here, and I would argue that with "vibe coding" and the tech industry's "let the AI handle it" attitude, software quality is the worst it has been since the 90s. I lived through Windows Millennium Edition. I've seen shit you wouldn't believe.

"Hardware's fucked because we can't buy any, software is fucked because the LLMs trained by Reddit and Stack Overflow are in charge now. You might as well fucking guess at this point."

kallisti@infosec.exchange (#30):

@da_667
Hard agree!

infosecdj@infosec.exchange (#31):

@da_667 Welcome to the club!

Yes, "responsible" disclosure was designed to push as much responsibility as possible onto whoever finds The Bug and absolve everyone else. It is an emotionally charged term, and I think purposefully so. You are supposed to feel bad about *not* doing it, or doing it in a way The Company disagrees with. I mean, think of the children^W^W^Wusers! And then when you, in your silliness, try to do the supposedly right thing and get a legal threat back -- well, folks, that ain't the kind of responsibility I remember ever taking upon myself. If I get threats and violence for doing supposed good, I ain't doing good no more, sorry. Not interested. Maybe someone else will, I don't care. So I say we treat vulnerability disclosure as proper journalism, per Orwell: "Journalism is printing what someone else does not want published; everything else is public relations."

Yes, a select few have made a fortune on bug bounties or whatever, but the vast majority gets breadcrumbs and the feeling of Doing The Right Thing. That feeling is where they got us. Taking responsibility for someone else's fuck-ups and feeling guilty for not being responsible enough, that's so weird, man. I didn't put the bugs in there, you did, dear company, by hiring the cheapest contractors to do the job and firing the one person who actually cared. We all know how it goes. After all, nothing a company does is in the interest of the end user or anybody else but the company itself and/or the shareholders.

So yeah, got a 0-day? Go full disclosure, or sell it off if that's your thing. At least remember you've got a choice here.

Sorry for a bunch of words, the topic hits rather close here too.

da_667@infosec.exchange wrote:

nobody is held liable when breaches occur and your PII gets stolen for the fifth time in a single year.

And then we read the inevitable report that it was a third-party managed system that was 6 months behind in patches that got popped. Or it was a risk assessment result that they said "they would get to that eventually" and never did.

You start throwing executives in cuffs for failing to do their duty and sure as shit things would start changing.

beng@mastodon.social (#32):

@da_667 when I was in consulting there were a few times we uncovered huge security problems in systems. We'd get very serious, tell the client, say something doom-laden like 'this is an existential risk to your business'. No-one really believed us and, honestly, they were right not to. There's genuinely no accountability for any of this.

da_667@infosec.exchange wrote:

It has always been the privilege of the corporations and the rich to define what responsibility is. I'm here to tell you don't give them what they aren't willing to give us.

huronbikes@cyberplace.social (#33):

@da_667 As a programmer, I've seen the result of the same degradation, if only from a different angle. It's super-frustrating, and even before LLM code generation things weren't going well.

Nobody wants to be careful because being careful cuts into margins.

I'm glad you are putting into words something I am feeling.

hal_pomeranz@infosec.exchange (#34):

@da_667 I started doing computer support professionally in 1985. By the end of the dot-com era in the early 2000s, I had long burned out on fighting the same battles endlessly in corporate IT. Things were never going to get better, for the reasons you cite -- basically coming down to a lack of real consequences for doing a bad job.

In addition, there are now entire industries that have grown up around offering "solutions" for how broken these practices and products are, and also industries around handling the blast effects from the latest successful intrusions. You can buy "cyber insurance" to give the appearance of managing your corporate risk. InfoSec has become "too big to fail".

After thinking about this long and hard, I ended up going into the incident response business. If security breaches are inevitable, IR services will always be in demand. I get paid better and get more respect from customers than I ever did trying to do things right the first time. I don't kid myself that our remediation strategies are likely to make a long-term difference in most organizations' security postures, but sometimes there's a win.


gary_alderson@infosec.exchange (#35):

@infosecdj @da_667 the government buys them so sell them for the most responsible price you can get #cut the shit in half


drewdaniels@mastodon.online (#36):

@da_667 I've been working at improvements inside companies for decades. Automated scans in pipelines and IDE tools make things a lot better for those who care. I've worked with many developers who take pride in their work and just need guidance. Reported vulnerabilities motivate many internally to improve not just the one problem, but the system involved.

It still takes time to change (though far less now). Sprints are measured in weeks, and work needs justification (like a report).


pa@hachyderm.io (#37):

@hal_pomeranz Dealing with people who've been burned and are willing to learn from their mistakes: priceless.
@da_667


gary_alderson@infosec.exchange (#38):

@hal_pomeranz @da_667 some people are against AI but most of their customers use it - are you supposed to make them do it the right way first all over again? #corp culture #drawn and quartered


dalias@hachyderm.io (#39):

@da_667 I don't particularly want them in cuffs for failing to patch because it just strengthens the paternalistic forced patching bs.

I want them in cuffs for possession of PII we never consented for them to collect or store in the first place.
