Assembler teaches you about machine architecture, including memory, caches, registers, pipelines, paging, micro-operations, program status words, privilege levels, virtual hardware, hardware management co-processors, "trusted" boot routines, GPU/CPU/FPGA co-processing.
You learn all the fundamental programming ideas: sequencing, conditionals, branching, subroutines, memory management, bit/byte/word ideas, interrupts, atomic operations, and fencing.
Data structures such as stacks, queues, heaps, trees, strings, character representations, floating point representations, byte packing, and bit-banging.
Issues such as cache timing, instruction scheduling, memory protection, memory-mapped I/O, instruction timing, I/O ports, virtual hardware support, power management, and cache coherence.
More things like garbage collection, disk I/O, file system structure, linkers, loaders, libraries.
Operating system ideas like task switching, I/O buffering, networking, time sharing.
Ideas like row-hammering to change protected memory, return-oriented programming to execute library code you otherwise couldn't, stack smashing for fun and profit, and micro-timing attacks that break crypto routines.
Best of all: self-modifying code! This idea is either fundamental (if you know what you are doing) or a disaster (if you don't).
Self-modifying code is the basis of learning. Systems learn when they change. The recent AI systems "sort of, kind of, maybe" "learn" by shuffling weights. That's really just glorified pattern recognition (an important idea) but REAL learning requires self-modification.
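A minimal sketch of the idea (not from the original text): a 16-bit DOS .COM program in NASM syntax, chosen because real-mode DOS has no write-protection on code, so the self-modification takes only one instruction.

    ; self.asm -- assemble with: nasm -f bin self.asm -o self.com
    ; Run under DOS or DOSBox. The first instruction rewrites the
    ; immediate operand of the second, so the program prints 'B'
    ; even though the source says 'A'.
            org 100h
    start:  mov byte [patch+1], 'B'  ; overwrite the 'A' byte below
    patch:  mov dl, 'A'              ; encoded B2 41h; patch+1 is the 41h
            mov ah, 02h              ; DOS int 21h function 2: print char in DL
            int 21h
            ret

On a modern protected-mode OS the same trick needs an mprotect/VirtualProtect call first, which is itself a lesson in memory protection.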
Everything you ever wanted to know can be learned from assembler.
It's all here. Other languages will teach you new ideas but knowing assembler is fundamental.
One word says it all.
FORTH is a (really) stack-oriented language. It lives and breathes stacks.
FORTH is a small language and is easy to learn. It is centered around "words". There is a set of standard words but you almost immediately learn how to create your own words.
This language is completely dynamic. Every word lives in the dictionary. Your words can overwrite standard words. The "self modifying" aspect of this language is that it enables and encourages you to create "words" that do exactly what you want. A FORTH program is truly unique to the task at hand.
The fundamental "must learn" idea is the "threaded-interpreted language model". Words invoke words which invoke words... it is "kittens all the way down".
"Threaded Interpretive Languages: Their Design and Implementation"
by R. G. Loeliger is a must-read book. These ideas don't show up elsewhere.
FORTH can be implemented in 4K (that's kilobytes).
This is a language that "isn't a language". By that I mean it is one of the "clay-like" languages, such as Lisp. Language shapes the way you think. Clay-like languages shape themselves to your thoughts.
Most languages suffer from an "impedance mismatch". They are like linking a firehose to a soda straw. Your "problem" has to move to the language. FORTH (and Lisp) simply shape themselves to the problem. They are "clay for the mind".
FORTH is a way of thinking. Indeed, "Thinking Forth" is another must-read book.
(http://thinking-forth.sourceforge.net/tf-kindle.pdf)
FORTH is a "must know" language.
Types have taken over the programming world. Maybe not your corner. Not yet. But it's coming for you.
Types are a "programming language for data". Traditionally, data was just something that procedural programs pushed around, shaping, sampling, and mangling at will. Since the compiler and the programming language had no information about the shape of the data, there was no way to detect data-related errors.
Haskell is one of many different attacks on the data-shape problem.
Haskell also prevents "drive-by" data damage by forcing program fragments to be "pure functions" that don't create side effects. Every time you call a function with the same input you get the same output.
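A minimal sketch of both ideas (the type and function names are invented for illustration):

    -- The type spells out the shape of the data, so the compiler
    -- rejects malformed values before the program ever runs.
    data Shape = Circle Double | Rect Double Double

    -- A pure function: same input, same output, no side effects.
    area :: Shape -> Double
    area (Circle r) = pi * r * r
    area (Rect w h) = w * h

    main :: IO ()
    main = print (map area [Circle 1.0, Rect 2.0 3.0])

Pass area a string or a malformed shape and the program never compiles; that is the type system detecting data-shape errors for you.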
Haskell programming changes the way you think about data, which is exactly the kind of "why" and "what" shift we are promoting.
In addition, Haskell is one of the "Harper Trinity" languages. Bob Harper (CMU) points out that there are three different ways to express things: proofs, category theory, and programming. Any idea in one has its counterpart in the other two.
https://existentialtype.wordpress.com/2011/03/27/the-holy-trinity/
Haskell features in the MIT "Programming with Categories" course.
https://www.youtube.com/playlist?list=PLhgq-BqyZ7i7MTGhUROZy3BOICnVixETS
Haskell really forces discipline in programming. It reshapes your thinking.
Haskell is a 'must know' language.
Language defines the way you think. Some languages completely change THE WAY you think.
Some languages change WHAT you can think.
Most languages are "more of the same". I don't care if a language is widely used or "a success". Python, for example, is both. But Python teaches you nothing.
Of the many languages I know, I will give highly opinionated recommendations on which ones have profoundly re-shaped the "WAY" and the "WHAT" of my thinking.
For each language I will try to hint at the "way of thinking" and the fundamental idea(s) that "bend the mind in a new way".
It is an absolute, unqualified joy to learn a language that changes the way you think and even the things you "can think".
"SNOBOL4 stands apart from most programming languages of its era by having patterns as a first-class data type (i.e. a data type whose values can be manipulated in all ways permitted to any other data type in the programming language) and by providing operators for pattern concatenation and alternation. SNOBOL4 patterns are a type of object and admit various manipulations, much like later object-oriented languages such as JavaScript whose patterns are known as regular expressions." (https://en.wikipedia.org/wiki/SNOBOL)
So the idea of a "pattern as a first-class object" is vital. Indeed, the whole idea of making something "a first class object" is fundamental. This idea will show up in other "must know" languages.
SNOBOL introduces dynamic statement creation and evaluation: the CODE function compiles a string into executable statements at run time.
SNOBOL introduces the idea of a pattern language. A pattern statement can succeed or fail, and SNOBOL allows you to branch in either case (or not):
"pattern statement" :S(win)F(lose)
One observation is that a SNOBOL program seems to be a thing of "constant length", at least in my experience. You can write patterns and use them in statements. Or you can write statements with patterns. Or you can mix-and-match. Oddly, though, every SNOBOL program I've written using any combination seems to be the same length.
SNOBOL makes patterns into a language for thinking. When you write your first program you'll struggle. But learning occurs when you figure it out.
Thinking in patterns shows up in compilers (BNF) and regular expressions. Non-deterministic programming shows up, as do state machines. Indeed, AI systems are pattern matchers at heart (although doing the same in SNOBOL makes my mind hurt).
SNOBOL is a "must know" language.
It's assembler for people who know what they are doing.
C has been criticized unfairly. In C it is possible to do almost everything you can do in assembler.
It introduces the idea of a "high-level language". There are tens of thousands of high-level languages, but C is the one you must know.
C introduces real syntax, which means your programs can now yell back at you. It's still your fault, but now the compiler can complain. You'll learn about "segfaults", pointers, pointers to pointers, memory layout, and, joy of joys... debugging with GDB!
The idea of non-local jumps (setjmp, longjmp) shows up.
The idea of variadic functions is also worth knowing.
C introduces stylized argument handling (argc, argv, envp) and the exquisite joy of buffer overruns.
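A minimal sketch (not from the original) tying three of these ideas together: variadic functions, non-local jumps, and argc/argv.

    #include <setjmp.h>
    #include <stdarg.h>
    #include <stdio.h>

    static jmp_buf on_error;

    /* Variadic: sums `count` ints, bailing out non-locally on a negative. */
    static int sum(int count, ...)
    {
        va_list ap;
        int total = 0;

        va_start(ap, count);
        for (int i = 0; i < count; i++) {
            int v = va_arg(ap, int);
            if (v < 0) {
                va_end(ap);
                longjmp(on_error, 1);   /* jump straight back to setjmp */
            }
            total += v;
        }
        va_end(ap);
        return total;
    }

    int main(int argc, char **argv)
    {
        printf("program: %s, extra arguments: %d\n", argv[0], argc - 1);

        if (setjmp(on_error) == 0)      /* returns 0 on the direct call */
            printf("sum = %d\n", sum(3, 1, 2, 3));
        else                            /* returns 1 when longjmp fires */
            puts("negative input, bailed out");
        return 0;
    }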
C introduces ideas (and it is not the first language to do so) such as 'types', in a kind-of, sort-of, hand-waving way. It gives you easier notation for conditionals, branching, and looping.
C is lightweight and (can be) portable. You can get "close to the machine", like assembler, or stay generic and machine-independent.
C is the "language between languages". When you need two languages to talk, C is the way; C is the "language that other languages speak". Many compilers for other languages compile to C.
The C library is a "must know" component. You need to know how to use the library for things like file I/O, networking, and memory management.
C enables and introduces tools like 'make'.
C lets you pretend that programming is easier than it is. Learning will occur.
Write a chess program in C. It only takes about 200 lines. Yet it involves I/O (for board display and user input), recursion, memory management, alpha-beta tree search, self-modification (changing itself to play better), etc.
The bottom line is that C is a language you must know.
Rule-based languages formed the basis of "expert systems". The idea is to interview an expert in some subject (e.g. car repair) and then write "rules" that will "match" the situation and perform the action. For example:
(rule bad-brakes
  when (step on brake pedal)
  and (pedal offers no resistance)
  and (car does not slow down)
  then
  (fix the brakes))
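In actual OPS5 syntax that rule might look roughly like the sketch below (the class and attribute names are invented; the condition elements before --> match against working memory, and the actions after it fire):

    (p bad-brakes
       (pedal ^pressed yes ^resistance none)
       (car ^slowing no)
      -->
       (write (crlf) fix the brakes))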
Given a set of "rule patterns" it is possible to compute a set of actions. Rules can be easily written to "capture knowledge". If a new situation occurs then a new rule can be added.
The system matches the current situation against every rule and chooses "the best one" to fire (the recognize-act cycle).
A whole industry of "Expert Systems" grew around this idea.
Of course, there is a question of efficiency. Given 10,000 rules you really don't want to match every rule every time. Charles Forgy figured out a highly efficient matching mechanism, the Rete network, to solve this problem.
Rule based programming is a different way to approach a problem. It provides a solution "near to the problem", is semi-declarative so it can be easily read, and can be easily extended.
It can also easily be "randomly buggy": a rule that "fires" performs an action that changes the situation... which causes another rule to fire... etc.
The "reasoning" can be hard to follow.
But, as we are interested in "why" and "what", rule-based programming will change your mindset about programming.
OPS5 is a "must know" language.
Forget "markup languages" like markdown. Forget "auto-doc" languages like Doxygen.
Forget MSWord.
You're not writing "documentation"; you're communicating with another person or, more likely, your future self.
In fact, what you WANT to write is a Literate Program. http://www.literateprogramming.com/
Literate programs have the actual source code embedded in the actual document. You extract the source code from the document and run it. It is always up to date.
Think about a physics textbook. All of the equations are "source code"; all of the paragraphs are explanation. Could you learn physics if you just had the equations without the words? Then why would you expect anyone to maintain and modify your program given "only the equations" and no explanation (note: explanation, NOT documentation)?
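A minimal sketch of the mechanics, using noweb-style chunks (the file name and chunk name are invented). A literate source file, hello.nw, interleaves prose and code:

    The greeting goes to standard output because it is meant for
    a human reader, not for another program.

    <<hello.c>>=
    #include <stdio.h>

    int main(void) {
        puts("Hello, literate world!");
        return 0;
    }
    @

Extract the compilable program with "notangle -Rhello.c hello.nw > hello.c" and typeset the document with "noweave hello.nw > hello.tex". The code you run and the code you read are, by construction, the same.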
Some excellent literate programs are:
Matt Pharr and Greg Humphreys, "Physically Based Rendering: From Theory to Implementation", Morgan Kaufmann, 2004.
Christian Queinnec, "Lisp in Small Pieces", Cambridge University Press, 1996, ISBN 978-0521545662.
LaTeX is the language of science communication. Learn it. Live it.
LaTeX is a "must know" language.
Lisp is a shapeless language. It enables you to shape your solution to your problem.
Lisp, unlike every other language I know, enables you to think about the problem and write the solution. Almost every other language has an "impedance mismatch", like connecting a soda straw to a firehose. Other languages require you to force-fit your solution to the language's needs.
Lisp is THE language to learn. Thinking in Lisp is thinking about your problem.
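A minimal sketch of what "shaping the language" means in practice (standard Common Lisp; the macro name is invented):

    ;; If your problem wants a control construct Lisp lacks, you add it.
    ;; Here we grow the language a repeat-until loop with one macro.
    (defmacro repeat-until (test &body body)
      `(loop ,@body
             (when ,test (return))))

    ;; Usage: prints 0 1 2 3 4, then stops.
    (let ((i 0))
      (repeat-until (>= i 5)
        (print i)
        (incf i)))

Few languages let you add a new control construct in five lines and have it look exactly like the built-ins.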
Of all the languages I know, Lisp is the ultimate language.
Unlike other languages, Lisp is an "epiphany language". You don't "get it" until you "get it"... and then you wonder why other people don't "get it". From the dictionary we find:
An epiphany (from the ancient Greek ἐπιφάνεια, epiphanea, "manifestation, striking appearance") is an experience of a sudden and striking realization. ... Epiphanies are relatively rare occurrences and generally follow a process of significant thought about a problem.
Clearly the "process of significant thought" is about the problem. Lisp is the language of that breakthrough experience.
There are no words I can say, you have to have your own epiphany. (I could, I suppose, make some parenthetical remarks but that would be a bit meta).
Lisp is a "must know" language. It is, in fact, the most important "must know" language.
Ken Iverson departed radically from mainstream programming-language design by introducing a symbol for each operation. Each operation is array-based. The symbols make examples awkward to typeset, but Wikipedia shows them in action (https://en.wikipedia.org/wiki/APL_(programming_language)), and a small taste appears below.
Languages that matter will change the way you think. APL really pushes the "need to think" by casting everything as array operations. Unlike FORTH and Lisp, you really need to "carry your problem" to the language.
Normally that would be a problem. But we're striving for the "Why" and "What" level of understanding. This is a language that will really challenge your thinking.
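That small taste, as promised (a hedged sketch in Dyalog-style APL; ⍝ begins a comment):

    avg ← +/ ÷ ≢       ⍝ a 'fork': sum divided by tally (count)
    avg 1 2 3 4        ⍝ 2.5, with no loop and no index variable in sight

One line of symbols replaces a loop, an accumulator, and a division: everything is an array operation.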
For a mind-expanding experience look for anything by Aaron Hsu. For example,
https://www.youtube.com/watch?v=v7Mt0GYHU9A
Aaron talks about "obesity in programming" https://www.youtube.com/watch?v=UDqx1afGtQc
He discusses the difference between "essentially hard problems" vs "accidentally hard problems".
If you wanted to do something like write a spreadsheet, this is the perfect language.
APL shows you that "thinking" in a language changes WHAT you can think.
APL is a "must know" language.
Prolog is the classic logic-programming language; miniKanren, by Dan Friedman and colleagues, is another (http://minikanren.org/).
The fundamental idea is declarative programming. You say what you want and the system figures out how to give you what you want.
This idea is so powerful that Japan announced "The Fifth Generation Project".
https://en.wikipedia.org/wiki/Fifth_generation_computer
Prolog never seemed to generate much interest once the Fifth Generation project died, but that's likely to change. Logic programming, especially in the area of program verification and proof assistants such as Lean (https://en.wikipedia.org/wiki/Lean_(proof_assistant)), is a growing area of interest. Computers are vital to all parts of life these days, and logic is becoming more mainstream.
But as a programming language Prolog completely changes the "Why" and "What" mindset. If you "say what you want" rather than "how to compute what you want" then the machine can "answer questions" in unexpected ways. Trivially, if you wanted to know C given A and B you could say:
C = A + B
but from a Prolog mindset this is also a computation of A given C and B, or of B given A and C. You stated a relation, not how to compute it, so Prolog can traverse the relation in any direction, given what you already know.
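A minimal sketch in SWI-Prolog, using the clpfd library so that arithmetic really is relational (the predicate name is invented; queries are shown as comments):

    :- use_module(library(clpfd)).

    % sum/3 is a relation among A, B, and C, not a one-way function.
    sum(A, B, C) :- C #= A + B.

    % ?- sum(3, 4, C).   gives C = 7   (forward)
    % ?- sum(A, 4, 7).   gives A = 3   (backward)
    % ?- sum(3, B, 7).   gives B = 4   (sideways)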
Declarative programming in Prolog (or miniKanren: https://www.youtube.com/watch?v=fHK-uS-Iedc) deeply changes the way you think and write programs. Once the computer is told what you want, rather than how, it can answer questions in both forward and backward directions.
Prolog is a "must know" language.