What's the difference between programming and coding - Leslie Lamport @ HLF 2019

Published 2019-10-14
Leslie Lamport is best known for his seminal work in distributed systems, as the initial developer of the document preparation system LaTeX, and as the author of its first manual. He won the 2013 Turing Award for imposing clear, well-defined coherence on the seemingly chaotic behavior of distributed computing systems, in which several autonomous computers communicate with each other by passing messages.

In this video Lamport speaks about the key differences between a programmer and a coder. The talk was recorded at the 7th Heidelberg Laureate Forum (HLF).

Credit: ZME Science, 2019.

All Comments (21)
  • @Qdogsman
    Some things change and some things stay the same. I learned to program in two stages. In stage 1, I was handed a thick listing of the program I inherited and on which I learned. The listing was about four inches thick printed on 11x17 inch green-bar paper. (This was in 1962 and the computer was the IBM7080). I successfully solved two "impossible" important problems which caused me to be picked to participate on a team charged with completely re-designing and re-writing an entire application to run on a different computer (the 7080 instead of a 1410). That began stage 2, during which I learned the lessons of Lamport's video. I believe those lessons are just as valuable today, but I am dismayed at how they seem to have been forgotten by too many people.
    When I joined the team of five, the "What" had already been determined: the existing and running 1410 system had to be functionally duplicated, and those who were involved in the old system knew that if and when we were successful they would all lose their jobs. It was they who had to certify the results of our tests and, believe me, they were tough critics. My job, along with those of the three other programmers, was to start at the "Algorithm Level" and figure out "How" to write the programs for the new system.
    Our boss Larry, the fifth guy, explained the rules at our first meeting. We were to learn the specs and requirements for our programs and then start drawing flow charts of our algorithms on flip-chart paper. As soon as we thought a portion of it was ready, we were to bring the flowchart sheet to Larry for him to check. We were to do no coding at all until all of our flowcharts were done and ready. Only then would "Coding" begin and keypunch authorization numbers would be assigned to allow our code to become machine readable. The coding would all start on a specific Saturday, some weeks later. We were to work through that weekend, day and night, until all the coding was done. On the next Monday, testing would start.
    Even though that was sixty years ago, I remember that first time I brought a flowchart to Larry. I felt cocky approaching his desk and handed him my first flip-chart page. I stood up straight and proud as Larry asked me what the program was supposed to do. I told him, and then he said, "Let's see. Suppose you have this input data", and he wrote some numbers on a scratch pad. Then he put his finger on the first block of my flowchart and said "You didn't initialize this variable for the first-time through". I felt like I suddenly got smaller and more humble. He handed me my flowchart and said, "Bring it back to me when you think you have fixed it."
    With my tail between my legs, I left his office and returned to my desk. I used his method of writing some hypothetical input data on a scratch pad and began tracing through the flowchart, seeing what my algorithm would do. I began uncovering bugs about as fast as Larry had. At that moment I made a vow to make sure I desk checked all of my flowcharts thoroughly before I brought them back to Larry. It became a method and a habit I used all through my career.
    Coding started as scheduled on that Saturday. Coding was intense, mindless, and tedious. We had to write machine-level instructions on coding pads and put them in an out basket. People would periodically gather the sheets and bring them to the large, temporary staff of keypunchers. We coded until our fingers were frozen to our pencils. We stopped and went home in the middle of Sunday night only after all of our coding was done.
    On Monday morning we picked up our programs (mine was about a box and a half of 5081 cards if I remember right) and prepared them for our first assembly runs. That caught typos and syntax errors. Then when you were ready you put test data together and began the testing runs. Debugging was probably a lot different in those days, but Larry's method obviated nearly all logic bugs. The hardest part was proving to the 1410 reviewers that the difference in results was not because of our bugs, but because of undiscovered bugs in the old system that had been there all along. So for 60 years running, there has always been a difference between programming and coding. (Larry's first-time-through bug is sketched in Python after the comments.)
  • 80% of my code is written from the logic on my whiteboard, 10% is magic that I managed to come up with after trial and error, and the remaining 10% is the idiotic need of my customer: "it must work like this".
  • @adamcarr9442
    My first week in grad school many, many years ago was spent learning to program in Fortran. It was a new requirement that every student had to know how to program. This was pre-CP/M, pre-Windows, pre-iOS. We moved from electromechanical programming to solid state logic to microcomputers, from a central data processing service using cards to a network of terminals serving the whole university, in just a couple of years.
  • @DrMaxPlank
    Coding is only one small part of programming; it is the process of allowing humans to speak to computers.
  • @davivify
    A coder, a programmer, and a software engineer walk into a bar. The bartender looks up and says: Hi Dave, what'll it be?
  • So in your terminology, programming includes analysis, software architecture definition, data source definition, eventual risks to outcome correctness, security considerations, user authority levels, etc., and coding; coding, then, is only the writing of the program (the sequence of commands, parameters, etc. in a specific language; whether it is Python, C#, C++ or another is not important). In big corporations, part of the programmer's work is done by an analyst who summarizes the company's needs, defines the data and information sources, the end users, etc., and communicates this with the programmers/coders, no matter whether they are in-house employees or outsourced.
  • @Slarti
    As a software developer I spend around 95% of my time reading code to understand it and 5% of my time writing code. The codebase I work on is enormous, and people who have worked with it for 10 years still don't fully understand it. We are asked not to add comments, as the problem with comments is that if you update the code you need to update the comments too. In some ways it's fun, because it is a challenge understanding how to fix bugs or make changes.
  • @othman31415
    Most programmers are bad at math, and the misconception among many people that "you don't need math for programming" stems from not fully grasping what math is: it is not only calculus or algebra. Graph theory, for instance, is a branch of mathematics that most programmers think of as just a bunch of definitions and code snippets (see the sketch after the comments).
  • @smanzoli
    Last year a friend (who knows nothing at all about programming or coding) said he bought a Python book to learn programming, and he was told Python was a good first language. I said it was true, but it should be more like the second step... before learning Python, he should first know how to program... how a program works, how a computer works, learn logic, what procedures are, what loops and functions are, how memory works, how the screen works, how files work... coding is really easy AFTER you know how to program.
  • @anirbanc88
    1. What should a program do? 2. Which algorithm should be used to solve the task? 3. Documentation?? 4:36 what is the word before _???_ a tale so that...
  • This may seem a bit hard, but I wish he had written it all down before making this video, as he mentioned in the video. He seemed to wander all over the place and it made me nervous just watching.
  • Very interesting, all of this; I would like to learn it all if I had the opportunity.
  • @drxyd
    Good naming conventions and clear architecture go a long way; if the code is readable then you typically don't need much in the way of additional documentation. There's really only a need to document novelty, and novelty should be avoided wherever possible.
  • @dcdales
    Can someone sum it up for me? Is there a clear distinction?
  • 3:40 "you should be thinking about how it's done" - I disagree. One should think about what one wants to do and try to write out a definition. If you write in functional style, this usually translates 1:1 to code. In a second step, you may want to refactor your recursions into loops for performance (if that is an issue) or to avoid stack overflows (if you couldn't write it in a tail recursive style or your language doesn't guarantee tail call elimination), as sketched below.
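
A minimal Python sketch of the "first-time-through" initialization bug from @Qdogsman's story, and of Larry's habit of desk-checking with hypothetical input data. The function names and data here are invented for illustration; the original programs were 7080 machine-level code, not Python.

    # Buggy version: the accumulator is never initialized before the first pass
    # through the loop, the same class of bug Larry spotted on the flowchart.
    def total_buggy(amounts):
        for amount in amounts:
            total = total + amount    # fails on the first iteration: 'total' was never set
        return total

    # Fixed version: initialize the variable before the loop begins.
    def total_fixed(amounts):
        total = 0                     # first-time-through initialization
        for amount in amounts:
            total = total + amount
        return total

    # Desk check in Larry's spirit: invent some input data and trace it by hand
    # before running anything. Expected trace for [3, 5, 7]: 0 -> 3 -> 8 -> 15.
    print(total_fixed([3, 5, 7]))     # prints 15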
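
On @othman31415's point that graph theory is more than definitions and code snippets, here is a hypothetical Python sketch: breadth-first search over an adjacency-list graph. That the loop below yields shortest path lengths in an unweighted graph is a theorem about the algorithm, not something the code itself makes obvious.

    from collections import deque

    def bfs_distances(graph, start):
        """Shortest path lengths (in edges) from start to every reachable node."""
        dist = {start: 0}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for neighbour in graph.get(node, []):
                if neighbour not in dist:            # the first visit is via a shortest route
                    dist[neighbour] = dist[node] + 1
                    queue.append(neighbour)
        return dist

    # Example: a small undirected graph given as adjacency lists.
    g = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a", "d"], "d": ["b", "c"]}
    print(bfs_distances(g, "a"))    # {'a': 0, 'b': 1, 'c': 1, 'd': 2}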
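
And a small hypothetical Python sketch of the refactoring described in the last comment: the definition is first written as a tail-recursive function, then rewritten as a loop, since Python does not perform tail call elimination and deep recursion would overflow the stack.

    # Step 1: write the definition directly, here the sum 1 + 2 + ... + n.
    def sum_to_recursive(n, acc=0):
        if n == 0:
            return acc
        return sum_to_recursive(n - 1, acc + n)   # a tail call, but Python will not eliminate it

    # Step 2: mechanical refactor of the tail recursion into a loop, needed once
    # n exceeds the default recursion limit (roughly 1000 frames).
    def sum_to_iterative(n):
        acc = 0
        while n > 0:
            acc = acc + n
            n = n - 1
        return acc

    print(sum_to_recursive(500))        # 125250, fine below the recursion limit
    print(sum_to_iterative(1_000_000))  # 500000500000, would blow the stack if done recursively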