It's easy to think of notation like shell expansions, that all you're doing is replacing expressions with other expressions.
But it goes much deeper than that. My professor once explained that great discoveries are often paired with new notation. The new notation signals "here's a new way to think about this problem". And many unsolved problems today will give way to powerful notation.
FilosofumRex
Historically speaking, what killed off APL (besides the wonky keyboard) was Lotus 1-2-3, and shortly thereafter MS Excel. Engineers, academicians, accountants, and MBAs needed something better than their TI-59s and HP-12Cs. But the CS community was obsessing over Symbolics, AI, and Lisp, so the industry stepped in...
This was a very unfortunate coincidence, because APL could have had a much bigger impact and solved far more problems than spreadsheets ever will.
talkingtab
The base concept is related to other useful ones.
The Sapir-Whorf hypothesis is similar. I find it most interesting when you turn it upside down: in any less-than-perfect language there are things that you either cannot think about or find difficult to think about. Are there things that we cannot express, and so cannot think about, in our language?
And the terms "language" and "thought" can be broader than our usual usage. For example, do the rules of social interaction determine how we interact? Zeynep Tufekci, in "Twitter and Tear Gas", talks about how Twitter affords flash mobs, but not lasting social change.
Do social mechanisms like "following" someone, "commenting", or "liking" determine/afford the ways we interact with each other? Would other mechanisms afford better collective thinking? Comments below. And be sure to like and follow. :-)
And then there is music. Not the notation, but does music express something that cannot be well expressed in other ways?
jweir
After years of looking at APL as some sort of magic, I spent some time earlier this year learning it. It is amazing how much code you can fit into a tweet using APL. Fun, but hard for me to write.
colbyn
I really wish I had finished my old Freeform note-taking app that compiles down to self-contained webpages (via SVG).
IMO it was a super cool idea for the more technical content that's common in STEM fields.
Here's an example from my old chemistry notes:
https://colbyn.github.io/old-school-chem-notes/dev/chemistry...
> Nevertheless, mathematical notation has serious deficiencies. In particular, it lacks universality, and must be interpreted differently according to the topic, according to the author, and even according to the immediate context.
I personally disagree with the premise of this paper.
I think notation that is separated from the visualization and ergonomics of the problem has a high cost. Some academics prefer notation that hides away a lot of the complexity, which can potentially result in "Eureka" realizations, wild equivalences, and the like. In some cases, however, it can be obfuscating and prone to introducing errors. Still, it's an important tool for communicating a train of thought.
In my opinion, having one standard notation for any domain (or closely related domains) quite stifles the creative, artistic, or explorative side of reasoning and problem solving.
Also, here's an excellent exposition about notation by none other than Terry Tao: https://news.ycombinator.com/item?id=23911903
> Subordination of detail

The paper doesn't really explore this concept well, IMHO. However, after a lot of time reading and writing APL applications, I have found that it points at a way of managing complexity radically different from abstraction.
We're inundated with abstraction barriers: APIs, libraries, modules, packages, interfaces, you name it. Consequences of this approach are almost cliché at this point—dizzyingly high abstraction towers, developers as just API-gluers, disconnect from underlying hardware, challenging to reason about performance, _etc._
APL makes it really convenient to take a different tack. Instead of designing abstractions, we can carefully design our data to be easily operated on with simple expressions. Where you would normally see a library function or DSL term, this approach just uses primitives directly:
For example, we can create a hash map of vector values and interned keys with something like:
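A minimal setup sketch (assuming Dyalog APL; the names are illustrative: str holds the interned key strings, k the key index of each entry, and v the values):

```apl
str←'erase' 'buggy' 'crash'  ⍝ interned key names
k←⍬                          ⍝ key index per entry (points into str)
v←⍬                          ⍝ value per entry
```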
Standard operations are then immediately accessible:
k v⍪←↓⍉↑(2 0.33)(2 0.01)(3 0.92) ⍝ insert values
k{str[⍺] ⍵}⌸v ⍝ pretty print
k v⌿⍨←⊂k≠str⍳⊂'buggy' ⍝ deletion
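Lookup fits the same pattern; a sketch, reusing the assumed str/k/v layout from the setup above:

```apl
v⌿⍨k=str⍳⊂'crash' ⍝ all values stored under the key 'crash'
```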
What I find really nice about this approach is that each expression is no longer a black box, making it really natural to customize expressions for specific needs. For example, insertion in a hashmap would normally need to have code for potentially adding a new key, but above we're making use of a common invariant that we only need to append values to existing keys.
If this were a library API, there would either be an unused code path here, lots of variants on the insertion function, or some sophisticated type inference to do dead code elimination. Those approaches end up leaking non-domain concerns into our codebase. But, by subordinating detail instead of hiding it, we give ourselves access to as much domain-specific detail as necessary, while letting the non-relevant detail sit silently in the background until needed.
Of course, doing things like this in APL ends up demanding a lot of familiarity with the APL expressions, but honestly, I don't think that ends up being much more work than deeply learning the Python ecosystem or anything equivalent. In practice, the individual APL symbols really do fade into the background and you start seeing semantically meaningful phrases instead, similar to how we read English words and phrases atomically and not one letter at a time.
cess11
Last year The Array Cast republished an interview with Iverson from 1982: https://www.arraycast.com/episodes/episode92-iverson
It's quite interesting, and arguably more approachable than the Turing lecture.
In 1979 APL wasn't as weird and fringe as it is today, because programming languages weren't the global mass phenomena they are now; pretty much all of them were weird and fringe. C was rather fresh at the time, and if one squints a bit, APL can look like an abstraction not very far from dense C, one that lets you program a computer without implementing the pointer juggling over arrays yourself.
gitroom
man i always try squishing code into tiny spaces too and then wonder why i'm tired after, but i kinda love those moments when it all just clicks