I am a huge fan of the work towards putting this in kanren as λKanren:
https://www.proquest.com/openview/2a5f2e00e8df7ea3f1fd3e8619...
A few of my own experiments in this area with unification over the binders as variables themselves show there's almost always a post-HM inference sitting there, but likely not one that works in total generality.
To me, that spot, binding unification in higher-order logic constraint equations, is the most challenging and interesting problem, since it's almost always decidable (or decidably undecidable) in specific instances, but provably undecidable in general.
So what gives? Where is this boundary and does it give a clue to bigger gains in higher order unification? Is a more topological approach sitting just behind the veil for a much wider class of higher order inference?
And what of optimal sharing in the presence of backtracking? Lamping's algorithm, when the unification variable is in the binder, has to have purely binding-attached path contexts, like closures. How do those get shared?
Fun to poke at, maybe just enough modern interest in logic programming to get there too…
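For readers who haven't seen it, the flavor of "unification under a binder" comes out in the classic λProlog copy predicate over HOAS-encoded λ-terms (a minimal sketch; the names here are illustrative, not from the thread):

```prolog
% Object-level lambda terms, encoded using λProlog's own binders (HOAS).
kind tm type.
type app tm -> tm -> tm.
type lam (tm -> tm) -> tm.

% copy T S holds when S is a copy of T. Under a binder we descend with
% pi (a fresh constant) and => (a hypothetical clause for that constant),
% hereditary-Harrop style.
type copy tm -> tm -> o.
copy (app M N) (app P Q) :- copy M P, copy N Q.
copy (lam M)  (lam N)    :- pi x\ copy x x => copy (M x) (N x).
```

Queries like this stay inside Miller's pattern fragment, where unification under binders is decidable; step outside that fragment and you hit the undecidability mentioned above.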
abathologist
I did a few days of AoC in 2020 in λProlog (as a non-expert in the language), using the Elpi implementation. It provides a decent source of relatively digestible toy examples: https://github.com/shonfeder/aoc-2020
(Caveat that I don't claim to be a λProlog expert.)
All examples showcase the typing discipline that is novel relative to Prolog, and towards day 10, use of lambda binders, hereditary Harrop formulas, and other higher-order niceness shows up.
upghost
I'm surprised how hard I had to dig for an actual example of syntax[1], so here you go.
[1]: https://www.lix.polytechnique.fr/~dale/lProlog/proghol/extra...
There is also a great overview of λProlog from 1988: https://repository.upenn.edu/bitstreams/e91f803b-8e75-4f3c-9...
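For anyone who wants the flavor inline without chasing the link, here is a tiny illustrative example in Teyjus/Elpi-style syntax (the constant names are my own, not from any standard library): every constant is declared with a type in a signature, which is the big surface-level difference from standard Prolog.

```prolog
% file: append.sig -- the signature: kinds and typed constants.
sig append.
kind  lst     type -> type.
type  nl      lst A.
type  cns     A -> lst A -> lst A.
type  append  lst A -> lst A -> lst A -> o.

% file: append.mod -- the module: clauses look like ordinary Prolog.
module append.
append nl K K.
append (cns X L) K (cns X M) :- append L K M.
```

An ill-typed call such as `append nl nl (cns nl nl)` is rejected statically rather than just failing at runtime.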
Did some modest development on Lambda Prolog back in 1999. I still have a vivid memory of feeling my brain expanding :) it rewired how I approach programming and opened up new territory.
It might sound weird and crazy, but it quite literally blew my mind at the time!
polairscience
I think that might be my favorite department/lab website I've ever come across. Really fun. It doesn't at all align with the contemporary design status quo, and it shows just how good a rich website can be on a large screen. Big fan.
https://www.lix.polytechnique.fr/
Anyone know why there seems to be a Prolog renaissance?
I personally found it by asking for a specific language recommendation from ChatGPT, and one of the suggestions was Prolog.
poppingtonic
Learning how to implement Prolog in pg's On Lisp was a fun way to spend multiple weeks programming. Doing this again this year should be a lot of fun.
TheRoque
I remember learning it in university. It's a really weird language to reason with, IMO, but really fun. However, I've heard the performance isn't that good if you want to make e.g. game AIs with it.
big-chungus4
When I downloaded the example programs, they open in my music player but don't play anything.
big-chungus4
(1987)
yodsanklai
I'm curious to see how AI is going to reshape research in programming languages. Statically typed languages with expressive type systems should be even more relevant for instance.