Originally posted by kylew77 View Post
Richard Stallman Announces GNU C Language Reference Manual
Originally posted by baka0815 View Post
That's a good point and a good reason not to start with C(++) but with Python or similar, where you have a shell that directly interprets your code and gives you immediate results. However, one should still know about memory allocation and lifetime. You don't have to know the difference between heap and stack when you start programming, but that every allocation reserves memory which needs to be freed afterwards is something that should be known early on - my opinion (and I used Java for many years).
I think that people who worry a lot about memory management have a somewhat myopic view of programming. That's not an insult - it's an observation, and applies to many people whose programming experience is limited to general-purpose imperative languages such as Java and C. Try a pure functional programming language like Haskell - the concept of memory management or leaks doesn't even make sense there, as there are no such things as references, allocations, arrays, etc. Everything is a value. Or try high reliability small systems embedded programming. It is typically done in C, sometimes C++ or Ada. There's no need to think about memory management because dynamic memory is generally banned.
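To make baka0815's allocation-and-lifetime point concrete, here is a minimal C sketch (my illustration, not from the thread): every successful malloc() reserves memory that stays reserved until it is explicitly released with free().

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Reserve 32 bytes on the heap; the allocation stays alive
       until we explicitly release it. */
    char *greeting = malloc(32);
    if (greeting == NULL)
        return EXIT_FAILURE;    /* allocation can fail */

    strcpy(greeting, "hello, heap");
    printf("%s\n", greeting);

    /* Every allocation must be freed exactly once; otherwise the
       memory leaks for the lifetime of the process. */
    free(greeting);
    return EXIT_SUCCESS;
}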
Originally posted by coder View Post
I'd say don't confuse students with pointer arithmetic and array operator notation straight away. Just getting their heads around a variable that addresses another variable, and then how it enables heap allocation, should be enough for one or two lessons, at least (depending on how far you want to go with heap). Pointer arithmetic should be a separate lesson.
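As a rough illustration of that first lesson (my sketch, not coder's curriculum): a pointer is just a variable that holds the address of another variable, and the same mechanism is what makes heap allocation possible.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int x = 42;
    int *p = &x;            /* p holds the address of x */
    printf("x = %d, *p = %d\n", x, *p);

    *p = 7;                 /* writing through the pointer changes x */
    printf("x is now %d\n", x);

    /* Heap allocation hands back an address with no named
       variable behind it - only the pointer reaches it. */
    int *q = malloc(sizeof *q);
    if (q != NULL) {
        *q = 99;
        printf("*q = %d\n", *q);
        free(q);
    }
    return 0;
}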
Originally posted by kylew77 View Post
Pointers in general. They thought it was crazy that we could do array[n] and then do *(array + n).
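For anyone following along, the equivalence kylew77's students found so strange is easy to demonstrate (a minimal sketch):

#include <stdio.h>

int main(void)
{
    int array[] = { 10, 20, 30, 40 };

    /* array[n] is defined as *(array + n): the array name decays to a
       pointer to its first element, and + n advances it by n elements
       (not bytes). */
    for (int n = 0; n < 4; n++)
        printf("array[%d] = %d, *(array + %d) = %d\n",
               n, array[n], n, *(array + n));

    /* Because + is commutative, even 2[array] is legal. */
    printf("2[array] = %d\n", 2[array]);
    return 0;
}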
Originally posted by DavidBrown View Post
If you really want to be good at efficient programming, write some assembly code.
Originally posted by DavidBrown View Post
Write some small programs in assembly. Do some maths, some string processing, and some data structures in the code. And then don't bother programming in assembly ever again, outside extremely niche situations - but remember what it was like when you are coding at a higher level.
Originally posted by DavidBrown View Post
And keep https://godbolt.org bookmarked, so that you can see the assembly generated by compilers (of many languages, not just C).
Originally posted by DavidBrown View Post
IMHO, the most important aspect of learning your first language is that you should enjoy it. And that means having results quickly - it means being able to write a program that does something, knowing only a small part of the language.
Originally posted by coder View Post
Was it pointers or heap allocation that people really struggled with?
When I was doing my CS PhD/master's, I was doing research in CS education and what the best first language is. It was some fascinating stuff, though I left before I got anything published.
Originally posted by coder View Post
Eh, I'd venture as far as to say that C is a good starting point for understanding computers and how they work. If you really just want to learn about algorithms and data structures, then C is probably too low-level. I feel like there's been pretty broad consensus on this for a while now.
With C, you will learn more about how the computer actually works than with other languages, but you will not learn as much as you might think - and you can quickly "learn" things that are not true. People will often say that C is "close to the metal", and that with C you get exactly what you ask for. This is not true - C is specified in terms of an abstract machine, not the target processor, and the generated object code does not have to match the literal source code except at certain specific points (the "observable behaviour"). This leads people to misunderstandings such as thinking that signed integer overflow is two's complement wrapping because that's what the processor does, or that they can mess with pointer casts and get the results they think are "obvious". So yes, you learn a lot about the machine from programming in C, at least compared to most other languages, but don't take that concept too far.
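To illustrate the signed-overflow point (my sketch, not DavidBrown's): because signed overflow is undefined behaviour in the abstract machine, the compiler may assume it never happens, so a test that "works" on a two's-complement processor can be optimised away entirely.

#include <limits.h>
#include <stdio.h>

/* Looks like a harmless wrap-around check, but the compiler is
   allowed to assume x + 1 > x for signed x and fold this to 0. */
int will_overflow(int x)
{
    return x + 1 < x;       /* NOT a reliable overflow test */
}

/* The well-defined way: compare against the limit before adding. */
int will_overflow_safe(int x)
{
    return x == INT_MAX;
}

int main(void)
{
    printf("%d %d\n", will_overflow(INT_MAX), will_overflow_safe(INT_MAX));
    return 0;
}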
If you really want to be good at efficient programming, write some assembly code. I'd recommend a small processor - get an Arduino kit or something like that - rather than the monstrosity that is the x86 world. Write some small programs in assembly. Do some maths, some string processing, and some data structures in the code. And then don't bother programming in assembly ever again, outside extremely niche situations - but remember what it was like when you are coding at a higher level.
And keep https://godbolt.org bookmarked, so that you can see the assembly generated by compilers (of many languages, not just C).
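A quick way to try that: paste a tiny function like this (an illustrative sketch) into Compiler Explorer and compare -O0 with -O2; GCC and Clang typically replace the whole loop with a closed-form calculation, roughly n * (n - 1) / 2.

/* Compare the -O0 and -O2 output on https://godbolt.org */
unsigned sum_to(unsigned n)
{
    unsigned total = 0;
    for (unsigned i = 0; i < n; i++)
        total += i;
    return total;
}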
Originally posted by coder View Post
Really? I wrote a bunch of Postscript at one point, and I seem to recall that all the stack stuff was automatic. I'm not sure it had any heap memory, but it seemed like Forth was entirely stack-based. At least the version used in Postscript.
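For context, "entirely stack-based" means operands are pushed onto an operand stack and operators pop their arguments and push results; here is a toy sketch in C (my illustration, nothing from the thread) of how "3 4 add 2 mul" evaluates in a PostScript-style interpreter:

#include <stdio.h>

/* A toy operand stack, the core mechanism of Forth/PostScript-style
   languages. */
static double stack[64];
static int top;

static void   push(double v) { stack[top++] = v; }
static double pop_op(void)   { return stack[--top]; }

int main(void)
{
    /* Evaluate "3 4 add 2 mul", i.e. (3 + 4) * 2. */
    push(3); push(4);
    push(pop_op() + pop_op());   /* add */
    push(2);
    push(pop_op() * pop_op());   /* mul */
    printf("%g\n", pop_op());    /* prints 14 */
    return 0;
}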
Originally posted by amxfonseca View Post
Let me guess, Hurd is also the de facto kernel and emacs the de facto text editor (or operating system, depending on how you look at it)?