> Reading code, mostly. Tracing data through five layers of someone else's design choices. Forming a hypothesis about why the bug is happening, then testing the hypothesis, then narrowing. Recognising that the function in front of you is too big and asking what part of it has its own reason to exist. Recognising that the schema in front of you encodes a decision someone made in 2019 and that the decision is now load-bearing for things they did not anticipate. Knowing which of the five tempting cleanups in the file is going to bite you in production and which is safe.
It always struck me as strange that universities never had a course that taught open source code. As in: grab the repo of a popular open source project, read part of it, and do your best to make a contribution to it.
The lectures would be about different open source projects and their design choices.
That would just result in projects being flooded with low-quality submissions from students who don't care but are forced to do it, and who get angry when you don't merge their work, since they need you to for their course.
We have such a class at our university. We used to have students issue PRs to real projects but then stopped for that very reason. Now we have our own big OSS project that they work on quarter-to-quarter. Seems like a decent compromise.
I see it mentioned often, but it's a completely foreign stance to me. I'll take contributing to an existing project over writing one from scratch any day, even if it's shitty enough to require general renovation. It's so much easier to jump into work when there's an existing skeleton than to do all the boring grunt work of setting things up and deciding on a layout. That was maybe exciting when I was only starting to learn how to program, but it hasn't been for decades.
I could see LLMs affecting that, though. Their ability to output shitty yet somewhat functional skeletons to keep working on manually is just spot on.
Having taught at a university, I'll say that the general reason is that there's already too much to teach, so you do your best. It's extra hard since there are a million people saying "why don't they teach X?" and you have to accommodate them.
There are problems like: do you teach Python or C? It sounds silly, but the difference is not about languages so much as how much you teach about systems. Teaching Python gets people going and they can produce faster, which does help students get less discouraged. But teaching C forces learning about the computer system and enables students to dive deeper and teach themselves the many subtopics that no 4-year program can cover.
What I think is generally missing, and would be good to implement, is code review and teaching how to understand a large existing codebase (all that grep, find, profilers, traces, tags, and all that jazz). This often gets taught in parallel (e.g. having students review each other's code), but it's hit or miss, a lot of extra work, and not everyone does it.
Here's the shitty part: I was often told by peers and people higher up, "don't look at students' code, just look at the output and run tests." I always ignored that, because that advice is why we're failing so many students. But I also understand it, because professors are overburdened. There's too much work to do, and teaching isn't even half the job. Every new administrator or "office assistant" they hire means more work for you (seriously, it'll take days to book a flight because you have to use some code, but it takes 2 days for someone to tell you the code and 5 more to tell you that it was the wrong code, and that it's clearly your fault because you clicked on "book flight" and not trips > booking > flights > schedules > trips > access code > flights > search available flights). Honestly, I think all this LLM agent stuff would sound silly if people actually knew how to design things...
Grew up with this guy. We both did community college courses on programming before we were out of high school. I'm impressed with the library he's built up, and CCs tend to be more pragmatic than UCs in California.
This isn't a plug/whatever - just good content from an old friend.
https://www.youtube.com/@ProfessorHankStalica
To be fair, to learn to think, you have to learn the language first.
Learning to program without knowing the language is useless and counter-productive.
Of course, this doesn't mean you have to learn 10+ languages first... but you have to learn a real programming language (not a toy one) before you can learn to program.
Edit: * a language
> To be fair, to learn to think, you have to learn the language first.
Which language is the language? A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately (with some exceptions that push outside the normal algorithmic notation of the Fortran, C, Java, JS, Common Lisp, Rust, Go, etc. family of languages; but those are minority languages, and a competent programmer shouldn't need more than a short period of time to become literate, if not expressive, in one of them).
> A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately
That's because the programmer already learned how to program.
But when they started, they definitely didn't write only pseudocode that wasn't runnable (to see the results) for months/years.
> But when they started, they definitely didn't write only pseudocode that wasn't runnable
I did. I took an enrichment computer science course in high school. I had already toyed around with some "programming" but that course was my first real introduction to computer science.
My teacher surprised us the first day: there would be no use of the personal computers in this class for the first semester. All work would be on paper and the blackboard. If this is not what you expected, drop the course now.
Pointers, linear vs polynomial time, recursive data structures, it all came alight in a few months during that course, on the blackboard. Maybe it was an artifact of how she had learned to program with punched cards in the USSR back when mere mortals were not allowed near the mainframe. I was fortunate to have a teacher like that.
> they definitely didn't write only pseudocode that wasn't runnable (to see the results) for months/years.
GT started students that way and it worked well for years. A full semester (the course number varied, but it was the CS 101 course, 1301/1311/1501 or something like that) taught with only pseudocode. They got rid of it because of appearances, trying to be like every other school out there. They eventually settled on Python, I think, after a brief stint with Scheme (which ended after a major cheating scandal).
You need to learn to leetcode in pseudocode first.
This. We learn a natural language like English first, before we can use it to express ideas and argue for or against those ideas with evidence. The problem is not teaching a programming language; the problem is stopping there and not teaching how to use it to solve real problems.
I never see anyone learning to program using pseudocode (which isn't runnable to get feedback). If they used pseudocode, did they just run the program in their heads?
This is itself a skill people need to learn, and I'm not sure that's possible with pseudocode and no prior experience. It's too easy to gloss over details without actually running the code to learn where your blind spots are.
I did a workshop a decade or so ago where I learned that my co-workers don't run code in their heads, and I never did learn how they understand code otherwise. One of them mentioned he didn't even realize this was a thing.
In The Art Of Computer Programming, one of the most influential and comprehensive series of books on the subject, Knuth uses a fictional assembly language called MIX in the examples. The reader does "just run the program in their head."
In Software Tools, Brian Kernighan and P.J. Plauger describe RATFOR (Rational Fortran), a preprocessor language that translates down to Fortran, and then throughout the book implement RATFOR in itself.
Getting feedback while learning to program has a lot of value, but so does learning to think through code in your head. People old enough to remember when you had to wait a day to run your program and get results back (very slow turnaround) know the value of that skill, we used to call it "desk checking" -- reading through your code and running it in your head and on paper.
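To make "desk checking" concrete, here's roughly what it looks like on paper; the function and the trace are made up for the example:

    # Desk checking, sketched: a toy function and the paper trace you'd
    # write next to it, one row per loop iteration, before ever running it.
    def digit_sum(n):
        total = 0
        while n > 0:
            total += n % 10   # peel off the last digit
            n //= 10          # drop the last digit
        return total

    # Trace for digit_sum(203), kept as a table on paper:
    #   n     n % 10   total   n // 10
    #   203   3        3       20
    #   20    0        3       2
    #   2     2        5       0      -> loop exits, return 5
    print(digit_sum(203))  # 5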
When I took an introductory programming class at Sacramento City College in fall 2004 during my senior year of high school, we spent the first half of the semester designing our programs using flowcharts and pseudocode. We were encouraged to check the logic of our flowcharts and pseudocode. In the second half of the semester, we implemented those programs in C++.
I haven’t seen this pedagogical practice in any other introductory course since. I believe it’s a holdover from the early days of computing, when programmers didn’t have access to personal computers or even interactive computing, which meant they needed to spend more up-front time on design. Think of the punchcard era, for example.
I teach introductory programming in C++ at Ohlone College in Fremont, and I have my students write C++ on Day 1, starting with “Hello World” and going from there without flowcharts.
Not at all. It's called learning computer science. Just like you can do calculus without simulation, you can understand the semantics of a computer program without running it. It might make it harder, but running it is only a didactic tool - as Knuth did, you should be able to prove it correct without ever running it.
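For a miniature flavor of that (the example is invented here, not from Knuth): the invariant comments are the proof sketch, and the argument holds without ever executing the code.

    # Correctness argued on paper, never run: a loop-invariant sketch.
    def maximum(xs):
        """Return the largest element of a non-empty list."""
        best = xs[0]
        # Invariant: at the top of each iteration, best == max(xs[:i]).
        # Base case: i == 1 and best == xs[0] == max(xs[:1]).
        for i in range(1, len(xs)):
            if xs[i] > best:
                best = xs[i]
            # Re-established: best == max(xs[:i + 1]).
        # At exit the invariant gives best == max(xs), so the function is correct.
        return best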
Sorry, I am pasting my old comment here, but the intention is the same.
Before learning programming, one should know what computing is in general. It sets up a good mental model; after that, you can easily pick it up and start writing programs yourself.
Data, data, data :))) Some basic notions to know:
Input → Computation → Output
Information is omnipresent (this is just an intuition, not a claim). It serves as both input and output.
Computation—also known as a procedure, function, set of instructions, transformation, method, algorithm, or calculation.
In my early days, I ignored the fundamental notion of data and procedures. But eventually, it clicked: Programs = Data + Instructions
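A tiny made-up sketch of that framing (the numbers are arbitrary):

    # Programs = Data + Instructions, in miniature:
    # input -> computation -> output.
    def average(xs):  # the instructions: transform input into output
        return sum(xs) / len(xs)

    measurements = [2.0, 4.0, 9.0]  # the data: input from a file, sensor, user...
    print(average(measurements))    # the output: 5.0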
Watch Feynman on computing—he even starts with the same concept of data, introducing computers as information processing systems. And processing requires algorithms (i.e., instructions or procedures).
Programming is simply writing instructions for the computer to perform computations.
A computer is just a machine for computing.
Computation is a general idea: a transformation of one form of information into another.
Richard Feynman Computer Science Lecture: https://www.youtube.com/watch?v=EKWGGDXe5MA
Old documentary on programming: https://www.youtube.com/watch?v=dFZecokdHLo
George Hotz video: what is programming? https://www.youtube.com/watch?v=N2bXEUSAiTI
https://denninginstitute.com/pjd/GP/gp_overview.html
https://htdp.org/2003-09-26/Book/curriculum-Z-H-5.html#node_...

You can get all these fundamentals for free and probably better from an LLM.
It started with all courses teaching algorithms, but on the day job you got your algorithms by searching the web or pulling in a library.
Now they teach languages, but you just ask agents to check the accuracy of the code and rarely read it.
Only a few devs wrote new algorithms, and only a few devs will now write the actual new code. These few devs don't need courses, but all the other devs need to pretend that they are part of these "few", so they need all the courses, just in case...
Yeah, but most* companies hire for just whatever X programming language they use, and do not care whether you know how to program or that you could pick up X in a couple of weeks. (* Anecdotally; I am sure there are exceptions.)
Aside: so this guy pops up 3 days ago with this vibecoded sort of site and is epically AI-blogging so now we have three posts on the front page at the same time-ish? Great. Take it easy OP.
It's funny: I learned a pile of languages in my undergrad, plus some UML nonsense. None of it properly covered how to write code that's meant to be read, which IMHO is one of the most basic skills.
Just "hey nobody can understand why that line is the way it is, what should we do about that" is probably one of the basic building-block skills of developing on a team, and you teach it wholly by abusing prima donna cowboys until they write something legible or quit.