It's either very funny or very depressing to watch executives trip over themselves to prove who has the worst understanding of what software development actually entails.
-
@cloudhop
I remember managers at a firm I worked for suggesting that the typists should enter the code to speed things up
-
@julesbl @cloudhop
That was actually common practice in the 1960s and early-mid 1970s. The people who did the typing were called "keypunch operators". Programmers would hand-print their programs on coding forms.
It may have been the case that most programmers did not have typing skills, but that was not the primary force driving that method of computer usage, and it certainly did not make programming faster.
-
@cloudhop The number of times in 30+ years my development speed has been constrained by the speed of my fingers: 0.
-
@cloudhop A lot of people are absolutely horrified to discover I can't touch-type. Just never bothered to learn. Because it doesn't limit me.
@TomF @cloudhop
I'm glad I learned to touch-type. At the time (around 1977), my junior high school actively discouraged boys from taking typing class, because that was "women's work". I was already using computers by hunt-and-peck typing, but my motivation wasn't primarily about speeding up typing. I was already well aware that a much greater portion of the process, and time spent on it, was thinking.
-
@TomF @cloudhop
I'd hoped that touch typing would reduce my cognitive load (though I didn't know that term), making it easier to concentrate on the programming, and less on the typing. It did that somewhat, although I had already gotten so good at hunt-and-peck that it really wasn't as much change as I'd expected.
-
Software development is no longer constrained by typing speed, but by how clearly engineers articulate intent.
Writing code directly without AI articulates intent best. So, vibe coding is about articulating vague intent and hoping the magic 8-ball fills the gaps in such a way that it covers your use case.
-
@cloudhop "30% of all sewing is now done by our interns, this means our workers are no longer constraint by how fast they can change out the threads in their sewing machines anymore but by how clearly they can tell the interns to do it for them"
-
@cloudhop More classically for software engineering, per Fred Brooks (1975): "one woman can produce a baby in nine months but nine women cannot produce a baby in one month".
-
@cloudhop I knew it! Having taken a typing class over that expensive Comp Sci degree was the right choice! /s
-
@cloudhop seriously... I spent far longer planning and designing a complex embedded system than actually coding it. Typing in the code is the easy part.
@AbramKedge @cloudhop My personal experience is that coding often involves quite some time staring at code that is already there and thinking, maybe even 2-3 days like that, touching a few lines at a time. Then you start getting the new ideas, and there could be a few days of "code this and that, rinse and repeat". Only finally do you get "the moment" where maybe you can write 5000 lines of code in a few hours, of which 4995 will be correct and 5 will take 2 weeks to debug.
-
@gilesgoat @cloudhop my cube was outside the VP of Engineering's office. For weeks I saw him quietly fuming as he walked past. Often I'd be sketching ideas on a whiteboard, or sitting back staring at it with my feet up on a filing cabinet. Four o'clock each afternoon I disappeared off to the war room to chat with the other three system architects.
Sometimes he saw me actually typing into a code editor. "How's it going?"
"Pretty good - I've got the data structures locked down, most of the function headers in place, just working on the state machine now."
"So no code yet?"
"Not yet."
The code worked the first time it was flashed into the FPGA prototype, reading and writing data to a RAM disk. Within three months of the start of the project, we were booting Windows from that prototype.
For comparison, the previous ground-up firmware project took 18 months to get to the same point. Code-first only *feels* faster.
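As a minimal sketch of that order of work, assuming a C firmware codebase and using entirely hypothetical names: the data structures get pinned down first, the function headers go in next, and the state machine logic is the last thing filled in.

#include <stdint.h>

/* Data structures locked down first (all names here are hypothetical). */
typedef enum { ST_IDLE, ST_READ, ST_WRITE, ST_FAULT } fw_state_t;

typedef struct {
    fw_state_t state;
    uint32_t   lba;    /* logical block address of the current transfer */
    uint32_t   blocks; /* blocks remaining in the current transfer */
} fw_ctx_t;

/* Function headers in place before any of the logic exists. */
static fw_state_t handle_idle(fw_ctx_t *ctx);
static fw_state_t handle_read(fw_ctx_t *ctx);
static fw_state_t handle_write(fw_ctx_t *ctx);

/* The state machine itself: one step per call, driven from the main loop. */
void fw_step(fw_ctx_t *ctx)
{
    switch (ctx->state) {
    case ST_IDLE:  ctx->state = handle_idle(ctx);  break;
    case ST_READ:  ctx->state = handle_read(ctx);  break;
    case ST_WRITE: ctx->state = handle_write(ctx); break;
    case ST_FAULT: break; /* hold until a reset clears the fault */
    }
}

/* Stub bodies, filled in last per the design-first order above. */
static fw_state_t handle_idle(fw_ctx_t *ctx)  { (void)ctx; return ST_IDLE; }
static fw_state_t handle_read(fw_ctx_t *ctx)  { return ctx->blocks ? ST_READ : ST_IDLE; }
static fw_state_t handle_write(fw_ctx_t *ctx) { return ctx->blocks ? ST_WRITE : ST_IDLE; }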
-
@AbramKedge @cloudhop To me, coding (unless I start with some already-developed idea in mind) always involves quite a bit of thinking and re-reading code I have already written. I tend to split a big problem into a set of smaller problems and work on and test them one by one before attempting "the big merge". Sometimes I quickly type things into the editor as quick ideas I want to test, which after much rework can turn into real functional code. Erm, do I see a brony here?
-
@gilesgoat @cloudhop absolutely - especially when adapting or extending existing code. My process is very much the same as yours.
The scary part of that big project was that it was the frontend processor tightly bound to a hugely complex SAS interface hardware block - I tested what I could by simulation, but that was only about 10%!
-
@AbramKedge @cloudhop Have you ever found yourself "in the paradox" of having to also write "test programs" to test the code you are writing, but then those too would need testing, almost leading to "an infinite recursion of debug"?
I have almost ALWAYS found the 90/10 rule at work: 90% of what you wrote will be bug-free and do exactly what you wanted, how you wanted it. BUT it is the 10% that will consume 90% of the time while you figure out what is wrong with it, usually a few lines of code.
-
@gilesgoat @cloudhop I was frustrated by test-driven design purists who seemed to want to continually test whether the processor could add two numbers!
I tended not to write test programs - except where running the real program could corrupt real persistent data. Then I separated out all the "doing the work" code from the "writing the results" code, and made a parallel data-safe test version of the program.
Other than that, debug builds that added sanity checking on function parameters seemed to catch most errors.
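A rough sketch of that separation, in C with hypothetical names: the "doing the work" function is pure computation that a test harness can call safely, the "writing the results" function is the only part that touches persistent data, and the parameter sanity checks are ordinary asserts that compile away once NDEBUG is defined for release builds.

#include <assert.h>
#include <stddef.h>
#include <stdio.h>

/* "Doing the work": pure computation, safe to exercise from a data-safe
   test build. The asserts are the debug-build parameter sanity checks. */
static double average(const double *samples, size_t count)
{
    assert(samples != NULL);
    assert(count > 0);

    double sum = 0.0;
    for (size_t i = 0; i < count; i++)
        sum += samples[i];
    return sum / (double)count;
}

/* "Writing the results": the only function that touches persistent data,
   so a parallel test version of the program can stub it out. */
static int store_result(const char *path, double value)
{
    FILE *f = fopen(path, "w");
    if (f == NULL)
        return -1;
    fprintf(f, "%f\n", value);
    return fclose(f);
}

int main(void)
{
    double samples[] = { 1.0, 2.0, 3.0 };
    return store_result("result.txt", average(samples, 3)) == 0 ? 0 : 1;
}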
-
@AbramKedge @gilesgoat I find that relying on spec-driven or test-driven development too early is useless when dependencies lie about their capabilities or are just broken. I prefer prototyping a design before writing any tests, just so I can work with the libraries and get a better sense of what problems I might run into. I only write exhaustive tests after I have an architecture with a working, functional end-to-end minimal example.
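One way to picture that ordering, as a sketch in C with all names hypothetical and plain stdio standing in for the untrusted dependency: a throwaway end-to-end pass comes first, and only after the round trip actually works is it worth building an exhaustive test suite around it.

#include <stdio.h>
#include <string.h>

/* Throwaway prototype: push one record through the dependency (stdio here,
   standing in for whatever library is under suspicion) and read it back,
   before writing any tests around it. */
int main(void)
{
    const char *path = "smoke.txt"; /* hypothetical scratch file */
    char buf[32] = { 0 };

    FILE *f = fopen(path, "w");
    if (f == NULL) { perror("open for write"); return 1; }
    fputs("hello\n", f);
    fclose(f);

    f = fopen(path, "r");
    if (f == NULL) { perror("open for read"); return 1; }
    if (fgets(buf, sizeof buf, f) == NULL) { fclose(f); return 1; }
    fclose(f);

    /* The minimal end-to-end example: did the round trip really happen? */
    if (strcmp(buf, "hello\n") != 0) {
        fprintf(stderr, "dependency misbehaved: got \"%s\"\n", buf);
        return 1;
    }
    puts("end-to-end path works; now worth writing exhaustive tests");
    return 0;
}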
-
@cloudhop getting anything resembling a specification is the hardest part of programming; the second hardest is choosing variable names
also, it's a thought activity, not a words-per-minute one
