vertnerd 4 hours ago

I was captivated by the August 1980 issue of Byte magazine, which had a cover dedicated to Forth. It was supposed to be easy to implement, and I imagined I might do that with my new KIM-1 6502 board. Alas, the KIM-1 was lost when I went to college, and life forced me down different pathways for the next 45 years.

About a year ago I finally began to work on my dream of a Forth implementation by building a Forth-based flight management computer into a spaceflight simulation game that I am working on. Now, instead of writing mostly C# or GDscript code in Godot, I am trying to figure out ways to create a useful device using this awkwardly elegant language. I'm having fun with it.

One of the interesting bits is that I have been able to make the Forth code an entirely separate project on Github (https://github.com/Eccentric-Anomalies/Sky-Dart-FMS), with a permissive open-source license. If anyone actually built a real spacecraft like the one in my game, they could use the FMS code in a real computer to run it.

There is one part of the linked article that really speaks to me: "Implement a Forth to understand how it works" and "But be aware of what this will not teach you". Figuring out the implementation just from reading books was a fascinating puzzle. Once I got it running, I realized I had zero experience actually writing Forth code. I am enjoying it, but it is a lot like writing in some weird, abstract assembly language.

  • PaulHoule 2 hours ago

    Circa 1980, BASIC was the dominant language for micros because you could fit BASIC in a machine with 4k of RAM. Although you got 64k to play with pretty quickly (1983 or so), it was still a pain in the ass to implement compilers on many chips, especially the 6502, which had so few registers and addressing modes that you were likely to use virtual machine techniques, like Wozniak's SWEET 16 or the atrocious p-code machine that turned a generation of programmers away from Pascal.

    FORTH was an alternative language for small systems. From the viewpoint of a BASIC programmer in 1981, the obvious difference between BASIC and all the other languages was that you could write your own functions to add "words" to the language. FORTH, like Lisp, lets you not only write functions but create new control structures, based on "words" having both a compile-time and a run-time meaning.
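
    That compile-time/run-time duality can be sketched in a few lines of Python (a toy model of my own, not real Forth internals): ordinary words compile into a definition, while "immediate" words like IF and THEN run during compilation, emitting and patching branch instructions.

```python
# Toy model of Forth's dual-meaning words (illustrative, not real Forth):
# IF and THEN act at compile time, everything else at run time.

def compile_tokens(tokens):
    code, patches = [], []           # instructions, compile-time control stack
    for tok in tokens:
        if tok == "if":              # "immediate": runs while compiling
            patches.append(len(code))
            code.append(("0branch", None))   # forward branch, patched by THEN
        elif tok == "then":          # "immediate": resolve the branch target
            code[patches.pop()] = ("0branch", len(code))
        elif tok.lstrip("-").isdigit():
            code.append(("lit", int(tok)))
        else:
            code.append(("prim", tok))
    return code

PRIMS = {
    "dup": lambda s: s.append(s[-1]),
    "+":   lambda s: s.append(s.pop() + s.pop()),
}

def run(code, stack):
    pc = 0
    while pc < len(code):
        op, arg = code[pc]
        pc += 1
        if op == "lit":
            stack.append(arg)
        elif op == "prim":
            PRIMS[arg](stack)
        elif op == "0branch" and stack.pop() == 0:
            pc = arg                 # branch taken when the flag is zero
    return stack

print(run(compile_tokens("5 dup if 10 + then".split()), []))   # [15]
print(run(compile_tokens("0 if 10 + then".split()), []))       # []
```

    Note that IF and THEN never appear in the compiled code at all; they execute during compilation, which is the sense in which a word has both a compile-time and a run-time meaning.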

    FORTH's answer to line numbers in BASIC was that it provided direct access to blocks (usually 1024 bytes) on the disk with a screen editor (just about a screenful on a 40x25 display). You could type your code into blocks and later load them into the interpreter. Circa 1986 I wrote a FORTH for the TRS-80 Color Computer running the OS-9 operating system, and instead of using blocks it had POSIX-style I/O functions.

    FORTH was faster than BASIC and better for systems work, but BASIC was dominant. Probably the best way to use FORTH was to take advantage of its flexibility to create a DSL that you then wrote your applications in.

  • area51org 3 hours ago

    I had that issue, and I think I still might have it in my closet. (Weren't those Robert Tinney covers amazing?)

    I always wanted to try out Forth but had no real opportunity. Maybe I should now?

js8 11 hours ago

I used to be a fan of these languages like Lisp and Forth and Joy (and Factor and Haskell), but then I found that what I really long for is just (untyped) lambda calculus (as a universal language). (Combinatory logic is just a similar representation of lambda calculus, and the differences go away quickly once you start abstracting stuff.)

I think expressing semantics of all (common) programming languages in lambda calculus would give us a solid foundation for automated program translation. And we should aim for that, the babel tower of languages doesn't really help anyone.

The current issue I have is with type theory. So I am trying to embed the notion of types directly into the lambda terms, so that they would sort of "automatically typecheck" when composed. Crucial in this, in my opinion, are lambda terms that do not use lambda abstraction in their body, because you can think of these terms as mini-DSLs that have yet to be interpreted.

Anyway, once we can translate the calculus of constructions (and other common formal logics) into untyped lambda calculus, it will also help us do automated theorem proving. It must be theoretically possible, but to my knowledge nobody has really done this sort of hardcore formalization.

  • tromp 9 hours ago

    I implemented the Calculus of Constructions in untyped lambda calculus in order to shorten the 643 byte C program computing Loader's Number to 1850 bits (under 232 bytes) [1], as one of the milestones reached by my functional busy beaver function [2].

    [1] https://codegolf.stackexchange.com/questions/176966/golf-a-n...

    [2] https://oeis.org/A333479

    • js8 5 hours ago

      I think this is great. I think you should write a paper on it.

      I suspect it might need some kind of commutative-diagram proof, i.e. if you express things in CoC formalized within BLC you will get the same result as when you express them in BLC formalized within CoC; I am not sure off the top of my head.

      (Kind of similar to showing that self-interpreting quoted interpreter on quoted program is the same as quoting the result of running the interpreter on the program.)

      And of course, this proof of equivalence should have formalization both in CoC (Lean?) and BLC.

      My hope is that eventually someone writes a book on logic where the metalogic is just untyped lambda calculus. Proofs will be just beta-reduction and judgmental equality. And everything will be in the form: let's study the properties of these lambda terms that I came up with (the terms will of course represent some other logic, such as CoC, simply typed LC, or even LC itself, etc.).

  • lambdaone 3 hours ago

    Grounding programming languages in mathematics like this is essentially the goal of Strachey and Scott's denotational semantics, which has been very influential in programming language theory:

    https://en.wikipedia.org/wiki/Denotational_semantics

    • bananaflag 2 hours ago

      All approaches to semantics of programming languages are mathematical, the denotational one is not "more mathematical" than the rest.

  • entaloneralie 4 hours ago

    A lot of languages (including Forth) map really poorly to LC. Read some of the Forth writing on portability at all costs:

    http://www.ultratechnology.com/antiansi.htm

    • skybrian an hour ago

      It's difficult to understand what they were actually doing, but reading between the lines, it sounds like an advantage of 'machine Forth' is writing some Forth words in assembly? I can see why that would run much faster for a JPEG decoder.

    • tromp 4 hours ago

      Simple pure concatenative languages map quite well though [1].

      [1] https://github.com/tromp/AIT/blob/master/ait/mlatu.lam

      • entaloneralie 3 hours ago

        Joy, right? Not Forth.

        Conversion between spaghetti stacks and pure stack programming (in which the stack contains numbers and there is no GC) has a massive translation cost if you go from LC to Forth and back.

  • CaptainOfCoit 6 hours ago

    > And we should aim for that, the babel tower of languages doesn't really help anyone.

    What exactly do you mean by this? That the number of programming languages available isn't actually helpful, but detrimental?

    • js8 5 hours ago

      Well, I feel that a lot of code is written again and again, just in different languages. If we could automatically compare and translate different implementations, I think it would be beneficial for finding bugs.

      Every time somebody comes up with a new programming language, I am like: yeah, so you added these abstractions, just in a different syntax. I think people who come up with new languages should implement their primitives on top of lambda calculus (which really is the simplest logical system we know); then we could potentially have automated translators between languages, and that way we could cater to everyone's preferences and expand a standard library across different languages.

      So in short, yes, I think proliferation of programming languages without formal understanding of their differences is detrimental to interoperability of our computer systems.

      It would also allow a wider notion of metaprogramming - automated manipulation of programs. For example, let's say I need to upgrade my source code from one interpreter version to another. If both interpreters are represented as sets of terms in lambda calculus, I can see how to express one in the other, and formalize it as some kind of transformation. No more manual updates when something changes.

      It would also allow us to build a library of universal optimizations, etc. So I think programmers would benefit from having a single universal language.

      • CaptainOfCoit 5 hours ago

        > Every time somebody comes up with a new programming language, I am like: yeah, so you added these abstractions, just in a different syntax.

        I got this feeling too, until I started to explore languages outside of the C/Algol-like syntaxes. There is a wide range of languages out there, from array languages to lisps, and they don't give me the feeling of "just a different syntax"; they actually changed the way I think.

        So yeah, I love lisp now and spend most of my days writing it, but it also comes with the downside that now Java, C# and Golang look more similar than different to each other.

        > It would also allow us to build a library of universal optimizations, etc. So I think programmers would benefit from having a single universal language.

        I think assuming everyone would use the same hardware, same environment and same workflows to solve the same problems, this would make a lot of sense and would be hugely beneficial!

        But in reality lots of problems need different solutions, which have to be built in different ways, by different people who think differently. So because the world is plural, we need many programming languages too. Overall I feel that's a benefit.

    • Mountain_Skies 5 hours ago

      Everyone should be free to unite behind my choices. It's obviously what is best for everyone.

      • js8 5 hours ago

        That's a rather cheap retort. I am not saying everybody should use raw untyped lambda calculus for their programming, just that we would all benefit if we could translate languages we use to and from it, because then we could interoperate with any other code, refactor it, etc.

        • justin66 3 hours ago

          Has even a single programming language made the complete documentation and implementation effort you're describing? It'd be interesting to read about.

        • actionfromafar 5 hours ago

          Isn’t structure lost in the compilation process?

          I mean, we already have bits and pieces of what you want, like assembly-to-C decompilers, but the output isn't very nice without the types.

          And think of how many languages can run on the CIL in .NET. Say we create a CIL-to-lambda-calculus compiler. Where do we go from there?

    • lproven 5 hours ago

      Certainly that is how I read it.

      It divides effort, spreads it too thinly among too many disparate projects with essentially the same goals, and as a result, they all advance much more slowly.

      Examples: how many successors to C are there now? Hare, Odin, Zig, Nim, Crystal, Jai, Rust, D... And probably as many again that are lower-profile or one-person efforts.

      For a parallel example, consider desktop environments on FOSS xNix OSes.

      I have tried to count and I found about 20.

      A "desktop" here means that it provides a homogeneous environment, including things like a file manager and tools for switching between apps, plus accessories such as text editors, media viewers, and maybe even an email client, calendar, and/or address book. I am explicitly trying to exclude simple window managers here.

      The vast majority are simply re-implementations of the Windows 9x desktop. Taskbar along 1 edge of the screen, with buttons for open apps, start menu, system tray, hierarchical file explorer, a Control Panel app with icons for individual pages, etc.

      This includes:

      * KDE Plasma (and Trinity)

      * GNOME Flashback (AKA GNOME Classic, including the Consort fork)

      * Cinnamon

      * Xfce

      * Budgie

      * MATE

      * LXDE (including Raspberry Pi PIXEL)

      * LXQt

      * UKUI (from Ubuntu Kylin, openKylin, etc.)

      * DDE (from Deepin but also UOS, Ubuntu DDE and others)

      * Enlightenment (and Moksha etc.)

      * ChromeOS Aura

      And more that are now obsolete:

      * EDE

      * XPde

      * Lumina

      That's about 15, and more if you count variants and forks.

      The main differences are whether they use Gtk 2, 3 or 4, or Qt. That's it.

      It's easier to count the ones that aren't visibly inspired by Windows >= 95:

      * GNOME Shell, ElementaryOS's Pantheon, Ubuntu's Unity.

      Arguably: GNUstep (whose project lead angrily maintains it is not a desktop after all), and the long-dormant ROX Desktop...

      So, arguably, 3 you can run on a modern distro today.

      CDE is older than Linux or Free/NetBSD, so it doesn't count. I only know of one distro that offers it anyway: Sparky Linux.

      MAXX Interactive Desktop looks interesting but it's not (yet?) FOSS.

      All that effort that's gone into creating and maintaining 8-10 different Win9x desktops in C using Gtk. It's tragic.

      And yet there is still no modern FOSS classic-MacOS desktop, or Mac OS X desktop, or GEM desktop, or Amiga desktop, or OS/2 Workplace Shell... it's not like inspiration is lacking. There are at least 3 rewrites of AmigaOS (AROS, MorphOS, AmigaOS 4.x) but despite so much passion nobody bothered to bring the desktop to Linux?

      Defenders of each will vigorously argue that theirs is the best and there are good reasons why it's the best, I'm sure, but at the end of the day, a superset of all of the features of all of them would not be visibly different from any single one.

      That's rather sad, IMHO.

      • amiga386 4 hours ago

        > There are at least 3 rewrites of AmigaOS (AROS, MorphOS, AmigaOS 4.x) but despite so much passion nobody bothered to bring the desktop to Linux?

        The passion is there for the whole AmigaOS, of which the desktop metaphor, Workbench, is just a part. What fun is AmigaOS without Exec, Intuition and AmigaDOS? The passion is to see AmigaOS run, not to see Linux wearing its skin.

        GUIs for manipulating files a la Workbench are readily available; nobody seems to have built an Amiga-skinned one when a Win95 one will do. DOpus is already a clone of Midnight Commander, and there are clones of that aplenty; the most DOpus-like one I've seen is Worker (http://www.boomerangsworld.de/cms/worker/)

        The rest of the Workbench metaphor is available via AmiWM (https://www.lysator.liu.se/~marcus/amiwm.html), or requires apps to play along (e.g. Gadtools, MUI, Commodities, ARexx)

        • lproven 4 hours ago

          Well, you do you, and indeed, the entire community is free to do as it wishes.

          What I find surprising is that there are multiple entire Amiga-themed Linux distros – for example:

          https://www.commodoreos.net/CommodoreOS.aspx

          I reviewed it. I was not very impressed.

          https://www.theregister.com/2025/05/06/commodore_os_3/

          And ones which put an Amiga emulator front and centre:

          https://cubiclenate.com/pimiga/

          https://wilkiecat.wordpress.com/2025/05/31/pimiga4-by-chris-...

          (Which I looked at, but decided that there wasn't enough here to review.)

          And new hardware like the A1200NG:

          https://www.a1200.com/index.php/the-a1200-ng/

          Which is an Arm board running Linux running a full-screen Amiga emulator.

          And AROS Portable:

          https://arosnews.github.io/aros-portable/

          Which I also reviewed:

          https://www.theregister.com/2025/05/22/aros_live/

          Given this visible interest in running Amiga stuff on Linux and integrating AmigaOS (and AROS) I am very surprised that in ~30 years, nothing has progressed beyond a simple window manager.

          Intuition isn't that big or complicated. It's already been recreated several times over, in MorphOS and in AROS.

          I am so tired of seeing Linux desktops that are just another inferior recreation of Win95.

          I want to see something different and this seems such an obvious candidate to me.

          • amiga386 2 hours ago

            I think you can categorise Amiga enthusiasts in various ways; this is my taxonomy:

            1. Hardware enthusiasts who specifically love the Amiga's original hardware, its peripherals, and the early post-Commodore direction (PowerPC accelerators), and/or modding all of the above. These sort of people used WarpOS back in the day and probably use MorphOS or AmigaOS 4 today. The question is whether, for these people, modern single-board computers "count" as Amigas or not.

            2. Nostalgic enthusiasts of the system that the Amiga was, who are happy with a real Amiga, or with an emulated one, or an emulated one running on some board in a box shaped like an Amiga. Possibly with a non-Amiga UI to boot some classic games. These enthusiasts may enjoy fake floppy drive sounds that remind them of booting disks in their youth.

            3. Software enthusiasts of the Amiga's OS, and the directions it took that were different from its contemporaries, and the software ecosystem that came from it. These people have a longer user-startup than startup-sequence. They probably have most of Aminet downloaded. These people might be interested in other alternative OSes, e.g. QNX or BeOS. If they're still using Amiga hardware, or emulators, they'd be interested in AmigaOS 3.5/3.9 and 3.1.4/3.2. This can also include AROS and the work to get it running on native hardware, not just m68k but also x86 and arm... but it's unlikely that it will ever support as broad a range of hardware as Linux does, which limits how many people would want to use it, because it's unlikely to be able to drive a random modern laptop.

            4. The reverse of 3, Amiga users that were big UNIX fans, e.g. Fred Fish, the sort of people who ran GeekGadgets and replaced their AmigaShell with pdksh. They probably just moved wholesale to Linux and didn't look back.

            There are probably other categories, but I think the one you're looking for is 5: enthusiasts of the Amiga's look and feel, but not its broader OS or its software. If they did care about that, they'd be in groups 2 and 3, and emulators or alternative AmigaOSes would satisfy them most.

            I can't say why there aren't many alternative desktops for Linux. Probably because it takes a lot of resources to build a full desktop environment for Linux - a window manager, or even just a theme for an existing window manager, is not enough. A file browser is not enough. Ultimately it takes the applications themselves playing along, which only works when you have the clout to make people write software in your style (e.g. KDE, GNOME, Windows, macOS, Android, etc.).

            The only alternative UI taken from retro machines to Linux, that I can think of, is ROX Desktop (https://en.wikipedia.org/wiki/ROX_Desktop) with its ROX-Filer... and even that doesn't look entirely like RISC OS, which you could be running instead of Linux.

      • js8 5 hours ago

        I feel the same: too many people doing similar stuff in slightly different syntax, too few people looking at how things are similar and could be unified.

        I think it's time to look beyond syntax in programming and untyped lambda calculus is the simplest choice (that is universal and can easily express new abstractions).

        Mathematics suffers to some extent from a similar problem, but recent formalization efforts are really tackling it.

  • procaryote 11 hours ago

    Why specifically untyped?

    • js8 10 hours ago

      Because it is the simplest thing we have, and has a pretty straightforward self-interpreter.

      It feels like you need a lot more metamathematics to deal with typed lambda calculus than with the untyped one, and types are something that comes without justification.

      Anyway, the idea is, if you have a language, you can think of source code written in the language as a giant lambda term, where you have all lambdas upfront and only composition in the body. A tree of symbols to be composed, essentially. And then to interpret this source code in a language, you supply definitions of the language's primitives as arguments to the source code term.

      Now if your language is typed, the primitives need to be chosen in such a way that the interpreted term (the source code applied to the language primitives) fails to normalize if the program is not typed correctly.

      You can then have a correspondence between the primitives of the typed language that typecheck and a simpler set of primitives of the same language used purely for computation, under the assumption that the program typechecks. This correspondence "defines" the typing mechanism of your language (in untyped lambda terms).
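
      As a rough illustration of the scheme in Python (names like add and lit are my own, nothing standard): a "program" is a term with its primitives abstracted up front, and supplying different primitive bindings interprets the same term in different ways.

```python
# A "program" as a closed term: all lambdas up front, only application
# (composition of symbols) in the body. All names are illustrative.
program = lambda add, lit: add(lit(2))(lit(3))

# Interpretation 1: arithmetic primitives.
num_add = lambda x: lambda y: x + y
num_lit = lambda n: n
print(program(num_add, num_lit))    # 5

# Interpretation 2: pretty-printing primitives -- same term, new meaning.
str_add = lambda x: lambda y: "(" + x + " + " + y + ")"
str_lit = lambda n: str(n)
print(program(str_add, str_lit))    # (2 + 3)
```

      (This is essentially the "final tagless" style of embedding a language.)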

      • tromp 2 hours ago

        > have all lambdas upfront and only composition in the body

        That is only possible for a very limited subset of lambda terms. For example, it's not possible for the one-point basis

            A = λx λy λz. x z (y (λw. z))
        
        from which any closed lambda term can be constructed by composition.

    • jonathanstrange 9 hours ago

      The typed lambda calculus is not Turing complete.

      • js8 9 hours ago

        Not quite true; it depends on the flavor of the type system you use. Church's simply typed lambda calculus is not, but other type systems are.

        I think one way to make it Turing complete is to add a typing rule for the Y combinator as an axiom to your simple type system, but I am not sure.

        • jonathanstrange 7 hours ago

          Yes, I had Church's simply typed lambda calculus in mind. You need to add a fixed-point combinator or general recursion.

stsquad 5 hours ago

In my first proper job as a software engineer I wrote a bunch of Forth for "fruit machines". I don't know what the US equivalent would be, but they are low-stakes gambling machines which are quite common in UK pubs. The core processor was a 6809, and Forth was chosen because the interpreter was super small and easy to implement. I really appreciated the quick, interactive way you could update and tweak code as you tested it. I did get slightly weary of having to keep the state of the stack in my head as I DUP and SWAP stuff around, but that was probably due to my inexperience and not decomposing things enough.

They continued to use Forth as the basis for their 68000-based video gaming machines, although when it came to the hand classifier for video poker we ended up using C - mostly because we wanted to run a lot of simulations on one of these newfangled "Pentium" processors to make sure we got the prize distribution right to meet the target repayment rate of ~98%.

  • ikamm 5 hours ago

    We just refer to them as “slot machines” in the US

rpcope1 13 hours ago

If you like Forth, but find it challenging to build real stuff with, Factor (https://factorcode.org/) is most or all of the good stuff about Forth, designed in a way that's much easier to do things with. It was designed by Slava Pestov (who I think had a big hand in Swift), and honestly it's a lot of fun to build webapps and other programs with, and much less brutal to read than Forth can be.

  • a96 3 hours ago

    I had to have a peek to see if it's all just web. Apparently not.

    https://concatenative.org/wiki/view/Factor/UI

    > The Factor UI is a GUI toolkit together with a set of developer tools, written entirely in Factor, implemented on top of a combination of OpenGL and native platform APIs: X11, Win32 and Cocoa.

    > UI gadgets are rendered using the cross-platform OpenGL API, while native platform APIs are used to create windows and receive events. The platform bindings can be also used independently; X11 binding has also been used in a Factor window manager, Factory, which is no longer maintained. The Cocoa binding is used directly by the webkit-demo vocabulary in Factor.

    Fascinating. Probably dead and no mention of Wayland, but fascinating.

    • mrjbq7 an hour ago

      Factor is not dead; it continues to make development progress. If you're curious you can find more information on the main page:

      https://factorcode.org

      The latest release, 0.100, was in September 2024, and we are getting close to a new release, which we hope to ship around the end of the year.

      https://github.com/factor/factor

      The cross-platform UI that Factor has works on macOS, Windows, and Linux. On Linux, it unfortunately still uses the GTK2 GtkGLExt extension for the OpenGL widget that we render into, but modern GTK3/4 has Gtk.GLArea, which we need to switch to; that will improve compatibility on Wayland. However, it works fine with even the latest Ubuntu 25.10 release.

      And of course, you could use other libraries easily, such as Raylib:

      https://re.factorcode.org/2025/05/raylib.html

  • arethuza 9 hours ago

    I have very fond memories of programming in PostScript within NeWS/HyperNeWS - it did quite a few things that I've never seen in any other environment.

    Edit: To be fair relying on PostScript probably did limit the appeal, but I actually really liked it.

  • ulbu 4 hours ago

    note to those interested: no apple silicon support.

    • mrjbq7 an hour ago

      No, but it works fine under Rosetta emulation, and it can use native libraries installed via, for example, the Intel Homebrew.

      We do hope to get native aarch64 support in the near future. Let's see.

dcreager an hour ago

Stepping away from Forth in particular, one of the benefits of a stack-based / concatenative language is that it's easy to implement on constrained hardware. uxn [1] is a great example of that.

And, shameless self-promotion: if you're interested in how these kinds of languages compare with more traditional name-based languages, with more theoretical constructs like the lambda calculus and combinatory logic, and with gadgets like a PyBadge - well, you're in luck! I gave a talk about exactly that at the final Strange Loop [2].

[1] https://100r.co/site/uxn.html

[2] https://dcreager.net/talks/concatenative-languages/

schwartzworld 4 hours ago

I spent a few months playing with Forth after seeing a talk on it at Boston Code Camp. I struggled to find a practical application (I do web dev), but it had a lasting effect on my style of programming. Something about the way you factor a Forth program changed me. Now I mainly do functional-flavored TypeScript, and while Forth is NOT an FP language, there is a lot that carries over.

In Forth, the language rewards you for keeping your words focused and applying the single-responsibility principle. It's very easy to write a lot of small words that do one thing and then compose your program out of them. It's painful not to do this.

There is no state outside the stack. If you call a word it pulls values off the stack and deposits values back on the stack. Having no other mechanism for transferring data requires you to basically create data pipelines that start to look like spoken language.
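
One loose way to picture those pipelines (a Python sketch, not Forth): every word is a function from stack to stack, and a program is nothing but the composition of its words.

```python
# Each "word" maps a stack (a list) to a new stack; a program is just
# a composition of words, so all data flows through the stack.

dup = lambda s: s + [s[-1]]
swap = lambda s: s[:-2] + [s[-1], s[-2]]
add = lambda s: s[:-2] + [s[-2] + s[-1]]
mul = lambda s: s[:-2] + [s[-2] * s[-1]]

def phrase(*words):
    """Compose words left to right into a new word."""
    def composed(stack):
        for w in words:
            stack = w(stack)
        return stack
    return composed

square = phrase(dup, mul)            # like : SQUARE DUP * ;
print(square([3]))                   # [9]
print(phrase(square, square)([3]))   # [81]
```

Since composed phrases are themselves words, factoring a program into small words and pipelining them is the natural way to write anything.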

nikolay 14 hours ago

Many people glorify the simplicity of Lisp as an interpreter, but Forth is similar and underappreciated. Sadly, the only code I've written in Forth is... PostScript. Yeah, PostScript is a dialect of Forth. As a child, I really was amused by the demo of GraFORTH on Apple ][, which included 3D wireframe animations, which at the time were magical.

  • flyinghamster 4 hours ago

    I had a copy of that as well - I forget whether it was a Christmas gift or if I bought it. The demos were neat, but I was lacking in ideas when I had time to play with it, and the Apple didn't go to college with me.

    But if I were going to do some "from the ground up, using first principles, with nobody else's libraries" embedded work, Forth would certainly be something I'd consider.

  • lutusp 11 hours ago

    > As a child, I really was amused by the demo of GraFORTH on Apple ][, which included 3D wireframe animations, which at the time were magical.

    I originally wrote GraFORTH (https://archive.org/details/a2_GraFORTH_1981_Lutus_Paul) to escape the slow world of integer BASIC on my first computer (an Apple II). Because it relied on large blocks of assembly code to produce nice graphics, it perhaps misled people about what Forth could do on its own.

    Later I wrote a variation I called TransFORTH (https://mirrors.apple2.org.za/ftp.apple.asimov.net/documenta...) that supported floating-point. I intended to combine GraFORTH and TransFORTH, but my computer didn't have enough RAM.

    Innocent times, different world, before the personal computing tail began wagging the dog.

    • vidarh 10 hours ago

      Someone mentioning childhood tech and the creator showing up is peak HN, in the best possible way. I love little threads like this... I never used a Forth as a child, but I recall reading about it and marvelling over it at a time when getting hold of huge amounts of pirated games was easy, but finding anywhere to even buy more serious tools could be a challenge... I think it was probably 20+ years before I actually ended up trying a Forth.

    • ErroneousBosh 7 hours ago

      > I originally wrote GraFORTH

      Oh really?

      Given you were around at about the correct time period, could you hazard a guess at what dialect this very old Forth game from Byte magazine was written in?

      https://github.com/RickCarlino/Cosmic-Conquest-1982

      It has some graphics commands in it that I couldn't find in any other version of Forth on the Apple II. I'm a little outside the Apple II demographic, since they didn't really take off in the UK - although the very first home computer I ever used was an Apple II owned by the father of the guy that founded Rockstar Games :-)

  • tonyedgecombe 3 hours ago

    >Yeah, PostScript is a dialect of Forth.

    My understanding is they were developed independently.

  • James_K 8 hours ago

    The difference between Forth and Lisp could not be more pronounced. Forth source code has entirely implicit structure; you can't even tell which function is called on which arguments. Lisp has entirely explicit structure, which makes it much easier to read and edit. Lisp needs only a single primitive (lambda) to create the entire programming language, whereas Forth needs many primitives, which break the core idea of the language, in order to be usable. All of what is elegant about Lisp is ultimately lacking in Forth.
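
    For what it's worth, the "single primitive" point can be made concrete with Church encodings - here sketched in Python, where booleans and pairs are built from nothing but functions:

```python
# Church encodings: data from lambda alone. A boolean is a two-way
# selector; a pair is a function waiting for a selector.

true = lambda t: lambda f: t
false = lambda t: lambda f: f
pair = lambda a: lambda b: lambda sel: sel(a)(b)
fst = lambda p: p(true)
snd = lambda p: p(false)

p = pair("car")("cdr")
print(fst(p))   # car
print(snd(p))   # cdr
```

    Numerals, lists, and the rest can be built the same way, which is the sense in which lambda alone generates the whole language.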

    • 7thaccount 5 hours ago

      I think I'd agree from a mathematical perspective that lisp is more elegant, but implementation-wise, I really do like Forth's simplicity. They're both really cool.

sriku 12 hours ago

I've had a soft spot for Forth and am toying with a silly Forth-like interpreter for web programming ... if not for actual use, at least for some fun time. One concept it adds is the notion of a "current selection" which can define the available vocabulary to use and is used to select and work with DOM elements. Just experimenting.

https://github.com/srikumarks/pjs

Edit: As a kid, I disliked BASIC as a language though it let me do fun stuff. So I made an interpreter in BASIC for a language I'd like and it turned out Forth-like (and I didn't know about Forth at that time). I guess I'm still that kid some 35 years later.

whartung 2 hours ago

The Forth super power is that you have full control over how a symbol is evaluated, both at compile and runtime. I don't know of anything else that offers that. Lisp doesn't.

That gives the developer pretty much free rein to do whatever they want, which can be both good and bad.

I've always loved the elegance of Frank Sergeant's 3 Instruction Forth paper [1], it's very cool once you wrap your head around it.

Also, studying the F83 Metacompiler is valuable as well. F83 is a very capable 8/16-bit Forth system.

I honestly marvel at how much work must have gone into F83, given the tools of the time. I wish I knew more about its development journey. How it got bootstrapped.

[1] https://pygmy.utoh.org/3ins4th.html

behnamoh 15 hours ago

Why is it that languages like this don't scale? It's not the first time I see a powerful language that got forgotten. Other examples include SmallTalk and Common Lisp (tiny community).

Is it because some languages are "too powerful"? What does that say about our industry? That we're still not an advanced enough species to be able to handle the full power of such languages?

I say that because it seems languages that are "dumbed down" seem to absolutely dominate our world (Python, Ruby, JS, etc.)

  • ErroneousBosh 7 hours ago

    It's a different solution for a different time.

    Forth was an excellent way to write a powerful and expressive programming language that could self-host with a bare minimum of assembly language "bare metal" programming.

    The fridge-sized computer that Forth was originally developed on had double-digit kilobytes of memory (maybe 8192 words, at 16 bits per word) and clocked instructions through at a whopping 300kHz or so. The microcontroller that drives the Caps Lock LED on your keyboard is a hundred times faster with a hundred times the memory.

    These days we do not need to squeeze editor, compiler, and target binary into such a tiny machine. If you're developing for a microcontroller you just use C on your "big" computer, which is unimaginably more powerful.

    In the olden days of the 1990s I used a development system for embedded stuff that was written in and targeted Forth on a Z80 with a whopping 64kB of RAM and 5.25" floppies, but that was at least ten years old and five years out of date at the time.

    You're probably reading my words on a slice of glass the size of half a sandwich that contains more computing power than existed in the whole world when Forth was first written.

    It's a shame because writing something like Forth from the ground up (and I mean, assembly code to load the registers to start the ACIA to begin transmitting text to the terminal) perhaps in an emulated early 80s home computer is a great way to get a sense of what the chip behind it all is doing, and I feel that makes you a better programmer in "real" languages like Go or Python or C.

  • tarkin2 14 hours ago

    One simpler explanation: in Forth you are forced to keep the stack, and modifications to the stack, in your short-term memory (albeit only really three numbers in most cases), whereas with C et al you simply look down the page at the variables, which is far less taxing on your short-term memory.

    Well-written and well-designed high-level Forth words often transcend that, however, and tend to be quite literally readable, in a way that is incredibly rare to see in C et al. Of course the argument is that other programmers shouldn't be expected to see the problem in the way the original problem solver did.

    • rpcope1 13 hours ago

      This is probably why you see things like locals get used a lot as modern Forth programs grow. It doesn't have to be brutal early days Chuck Moore genius programs, but I guess you start getting away from the original ethos.

      • tarkin2 10 hours ago

        I think even with locals you're still mentally dealing with a few items on the stack in each word, usually. But, yes, locals do save you from passing items around from word to word: you see the usage of a local far more easily than you see the location of the stack elements.

  • zovirl 11 hours ago

    I was lucky, early in my career, to work at a place which used a lot of Perl and to read Damian Conway’s book, Object Oriented Perl. It was an amazing, mind-expanding book for me. It was filled with examples of different approaches to object-oriented programming, more than I ever dreamt existed, and it showed how to implement them all in Perl.

    So much power! And right in line with Perl’s mantra, “there’s more than one way to do it.”

    Unfortunately, our codebase contained more than one way of doing it. Different parts of the code used different, incompatible object systems. It was a lot of extra work to learn them all and make them work with each other.

    It was a relief to later move to a language which only supported a single flavor of object-oriented programming.

  • lukan 14 hours ago

    What I've heard is that with Forth, basically no two environments are alike; they're highly customized, meaning every Forth programmer ends up creating their own language for their own needs.

    So collaborating is a bit hard like this. The only serious Forth programmer I know lives alone in the woods, doing his own thing.

    So from an aesthetic point of view, I really like the language, but for getting things done, especially in a collaborative way?

    But who knows, maybe someone will write the right tools for that to change?

    • coliveira 12 hours ago

      This is not a real issue, because the same thing can be said about C. No two C projects are the same, each has its own set of libraries, macros, types, etc.

      I think the main problem is that Forth systems don't have a standard way of creating interfaces like C and other languages have. So the diversity of environments becomes a big issue because it's difficult to combine libraries from different sources.

      • jstimpfle 9 hours ago

        That's not right. C coders have no problem at all diving right into most other C codebases.

      • andoando 10 hours ago

        I think it goes beyond that, though, because you can do metaprogramming with Forth

    • veltas 9 hours ago

      Have you tried collaborating with Forth? There's a lot of documented history of people doing so in industry when it was actually used, and more recently I've usually found Forth codebases approachable and easy to follow.

      Personally I think this is the pay-off for writing the code in the first place: Forth is very difficult to write in a clear way, so if you actually manage to do it you've probably made it very clear to follow, because otherwise it's hard to finish your project and make it work at all.

      • lukan 8 hours ago

        "Have you tried collaborating with Forth?"

        No, I never did more than very simple experiments with Forth (which is why I started my comment with "what I heard").

        "because Forth is very difficult to write in a clear way"

        But that pretty much means that the average programmer will have problems collaborating.

    • anta40 9 hours ago

      I think, like C, there's standard/ANS Forth.

      But most likely lots of those Forth coders implement their own which don't necessarily conform to the standard.

  • saghm 14 hours ago

    I don't think "power" is really that helpful a metric in determining how useful a programming language is. If you think of programming from the standpoint of trying to specify the program you want out of all of the possible programs you could write, one of the most helpful things a programming language can do is eliminate programs that you don't want by making them impossible to write. From that standpoint, constraints are a feature, not a drawback.

    • zovirl 12 hours ago

      And at the extremes, too much power makes a tool less useful. I don’t drive an F1 car to work, I don’t plant tulips with an excavator, I don’t use a sledgehammer when hanging a picture. Those tools are all too powerful for the job.

      • tsegratis 7 hours ago

        planting flowers? trowel

        planting foundations? excavator

        once you specify "the job", the best tool is "the solution" to that job only. anything else is excess complexity

        however if "the job" is unspecified, power is inverse to the length of "the solution"

        so is constraint of power bad?

        --

        a fascinating question

        just like music can be created by both additive and subtractive synthesis; every line of code creates both a feature and a constraint on the final program

        in which case power can be thought of as the ability to constrain...

        • tsegratis 6 hours ago

          that is quite wild...

          it implies expressivity is the ability to constrain

          it implies drawing on a page, or more broadly, every choice we make, is in equal parts a creative and destructive act

          so maybe life, or human flourishing is choosing the restrictions that increase freedom of choice? it's so meta it's almost oxymoronic; concretely: we imprison people to maximize freedom; or, we punish children with the aim of setting them free from punishment

          this is the same as the walk from law into grace found in Christian ethics

          maybe the ultimate programming language then, provides the maximal step down that path, and this is also the most useful definition of "power"

          i.e. place on people those restrictions that increase their ability to choose

  • rpcope1 13 hours ago

    I worked at a place that had a big Forth codebase that was doing something mission critical. It was really neat and cool once you finally got it, and probably hundreds or maybe thousands of people had touched it, worked on it and learned it, but the ramp was pretty brutal for your average developer and thus someone decided it would be better to build the same thing over with a shitty almost-C-but-not-quite interpreted language. It certainly made it easier for more people to understand and build, even if the solution was less elegant.

    • kragen 13 hours ago

      That sounds interesting! Do you have any tips for us on how to use Forth effectively? What was the codebase?

      • rpcope1 3 hours ago

        Honestly, when I write Forth now, which is usually for embedded targets, I've got a customized version of zforth that I've grafted some stuff like locals into. If it's a small program, it's better not to be afraid of things like globals, and to spend at least twice as much time factoring, writing comments, and thinking as writing. It's important to read other people's Forth code and try to understand it, as there's a zen and style that looks very different from how you'd write something like Java. It's freeing and enlightening once it clicks, but you have to fight a ton of the way you think about "normal" code.

        As far as the codebase, I probably shouldn't say too much (maybe it's been long enough now, but I don't know), but all I'll say is that it was an important part of things at a certain disk drive manufacturer.

        • kragen 2 hours ago

          That makes a lot of sense! Thanks!

          What were the most common mistakes you saw people new to Forth making? Being afraid of global variables is one of them, I infer.

  • procaryote 11 hours ago

    Powerful languages invite people to do needlessly complex things. Needlessly complex things are harder to understand. Harder to understand is worse.

    Code that matters is usually read and extended many more times than it is written, over time by different people, so being straightforward beats most other things in practice

  • 7thaccount 5 hours ago

    I think it's a simple abstraction situation, plus the move toward programming environments that include everything.

    Geordi La Forge doesn't code much on the Enterprise. He simply asks the computer to build him a model of the anomaly so he can test out ideas. In a way, modern languages like Python (even before LLMs) let you get a lot closer to that reality. Sure you had to know some language basics, but this was pretty minimal and you'd use those basic building blocks to glue together libraries to make an application. Python has a good library for practically anything I do and since this is standard, it's expected that a task doesn't take too long. I can't tell my boss I'll need 3 years to code my own solution with hand-rolled replacements for numpy and scipy. You're expected to glue libraries together. This is why MIT moved SICP from Scheme to Python. It's a different world.

    With Forth, every program is a work of art that encapsulates an entire solution to a problem from scratch. Its creator, Chuck Moore, takes this to such a level that he also fabs his own chips to work optimally with his Forth software. These languages had libraries, but they weren't easy to share and didn't have any kind of repository before Perl's CPAN. Perl really took off for a while, but Python won out by having a simpler language with built-in OO (Perl's approach was a really hacky built-in OO, or you download a library...).

    To be honest though, I spent a decade trying many languages (dozens, including Common Lisp, Prolog, APL, C, Ada, Smalltalk, Perl, C#, C++, Tcl, Lua, Rust...etc.) looking for the best, and although I never became an expert in those languages, I kept coming to the conclusion that for my particular set of needs, Python was the best I could find. I wasted a lot of time reading Common Lisp books and just found it much easier to get the same thing done in Python. Your mileage will vary if you're doing something like building a game engine. A lot of people are just doing process automation and stuff like that, and languages like Python are just better than Common Lisp due to the environment and tooling benefits. Also, although Python isn't as conceptually beautiful as Lisp, I found it much easier to learn. The syntax just really clicked for me and some people do prefer it.

  • lproven 5 hours ago

    > Why is it that languages like this don't scale?

    Stanislav Datskovskiy addressed this rather well:

    https://www.loper-os.org/?p=69

    • randallsquared 4 hours ago

      I've read that a number of times, but this is the first time since the rise of vibe engineering.

      > I predict that no tool of any kind which too greatly amplifies the productivity of an individual will ever be permitted to most developers.

      There's a new essay in here, somewhere, about why copilot and AI coding is succeeding at bridging this gap.

  • mcdonje 14 hours ago

    It kinda happened with markup languages. HTML, SVG, and some other domain specific markup languages are all XML, which is a subset of SGML.

    The thing there is those DSLs have their own specs.

    Coding is a social activity. Reading code is hard. When there are multiple ways of doing things, it's extra hard. People want to have relatively standardized ways of doing things so they can share code and reason about it easier.

    If there's a lisp or racket or a forth that's defined as a DSL, it might take off if it's standardized and it's the best solution for the domain.

    • gldrk 14 hours ago

      HTML uses a ton of SGML features not part of XML (sometimes erroneously thought to be non-standard ‘tag soup’, not to mention self-closing tags). You need either a specialized parser or an SGML processor + DTD.

      • shawn_w 12 hours ago

        Wasn't HTML4 the last one defined as a SGML DTD? 5 and on is its own beast.

        (rip XHTML)

        • gldrk 10 hours ago

          You are right. There is a third-party DTD that should be mostly compatible (https://sgmljs.sgml.net/docs/html5.html).

          In reality, HTML4 was never implemented to the letter by user agents, because people do things like putting -- inside comments.

  • pjmlp 11 hours ago

    Sadly our industry cares mostly about bricklayers and usually gravitates toward technologies that make it easier to treat employees like replaceable servants at low wage prices.

    The large SV-style salaries aren't something that you will find all over the globe; in many countries the pay is similar across all office workers, regardless of whether they are working with Git or Office.

    • vlovich123 11 hours ago

      That argument implies that you would actually see these languages in communities with large SV style salaries which isn’t the case.

      It turns out that “brick layer” languages are also easier to understand not just for the next person taking over but yourself after a few months. That’s valuable even to yourself unless you value your time at 0.

      • pjmlp 11 hours ago

        Why? The less the VCs have to spend with employees the better.

        See the famous quote about Go's target audience, or 2000's Java being a blue colour job language.

        Not only do languages like Lisp, Forth, and Smalltalk require people to actually get them (a bit like the monads-are-burritos meme in Haskell), they suffered from bad decisions by the companies pushing them.

        Lisp suffered with Xerox PARC, Symbolics, and TI losing against UNIX workstations, followed by the first AI winter, which also took Japan's Fifth Generation project, with Prolog, along with it.

        Smalltalk was getting along alright outside Xerox PARC, with big-name backers like IBM, where it had a major role on OS/2 (similar to .NET on Windows), until Java came out and IBM decided to pivot all their Smalltalk efforts into Java; Eclipse has roots in VisualAge for Smalltalk.

        • vlovich123 4 hours ago

          Your entire post makes the claim that it’s because the vast majority of programmers get paid the same as other roles and that’s why there’s the language selection pressure there is.

          High salary jobs would be the exception yet they also make pragmatic choices about languages. It’s a two sided market problem - employers want popular languages to be used so they have a talent pool to hire from and don’t end up having a hard time finding talent (which then also implies something about the salary of course but it’s a secondary effect). Employees look to learn languages that are popular and are easy to find employment in.

          Not sure if you’ve spent any time with them but VCs and investors more broadly generally could give two fucks about the language a business is built in. There are exceptions but generally they just want to see the business opportunity and that you’re the team to go do it.

          There’s a reason it’s difficult to find employment with Haskell or Lisp or other niche languages, and it’s because they’re niche languages that “you have to get” - not easy to learn and generally not as easy to work with as “popular” languages that see significantly more man-hours dedicated to building out tooling and libraries. There are also secondary things like runtime performance, which is quite poor for Haskell or Lisp if you’re a beginner; even people familiar with the language can struggle to write equivalent programs that don’t use significantly more memory or CPU. And finally the languages can just be inherently more difficult and alien (Haskell), which attracts a niche and guarantees it remains a niche language that attracts a particular kind of person.

        • gjvc 6 hours ago

          psst blue "collar" :-)

  • socalgal2 11 hours ago

    I'm not entirely sure this is different from other languages, but I believe a common complaint about Lisp is that every solution ends up being a DSL for that solution, making it hard to understand for anyone else. So it's a superpower if you're a small team, and especially if you're a team of 1. But if you're a large team it doesn't scale.

  • kragen 13 hours ago

    I think those other languages have real advantages you aren't seeing.

    —·—

    The other day akkartik wrote an implementation of the program Knuth used to introduce literate programming to the CACM readers: https://basiclang.solarpunk.au/d/7-don-knuths-original-liter...

    It just tells you the top N words by frequency in its input (default N=100) with words of the same frequency ordered alphabetically and all words converted to lowercase. Knuth's version was about 7 pages of Pascal, maybe 3 pages without comments. It took akkartik 50 lines of idiomatic, simple Lua. I tried doing it in Perl; it was 6 lines, or 13 without relying on any of the questionable Perl shorthands. Idiomatic and readable Perl would be somewhere in between.

        #!/usr/bin/perl -w
        use strict;
    
        my $n = @ARGV > 1 ? pop @ARGV : 100;
        my %freq;
    
        while (my $line = <>) {
          for my $w ($line =~ /(\w+)/g) {
            $freq{(lc $w)}++;
          }
        }
    
        for my $w (sort { $freq{$b} <=> $freq{$a} || $a cmp $b } keys %freq) {
          print "$w\t$freq{$w}\n";
          last unless --$n;
        }
    
    I think Python, Ruby, or JS would be about the same.
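    To back up that guess, here is a rough Python sketch of the same program (my own quick version, not from Knuth or akkartik; `top_words` is a name I made up). It comes out about the same size as the Perl: top N words by frequency, ties broken alphabetically, everything lowercased.

```python
import re
from collections import Counter

def top_words(text, n=100):
    """Top n words by frequency, ties broken alphabetically, case-folded."""
    freq = Counter(w.lower() for w in re.findall(r"\w+", text))
    # Sort by descending count, then alphabetically within equal counts.
    return sorted(freq.items(), key=lambda kv: (-kv[1], kv[0]))[:n]

# e.g. top_words("the cat and the dog and the bird", 2)
# -> [('the', 3), ('and', 2)]
```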

    Then I tried writing a Common Lisp version. Opening a file, iterating over lines, hashing words and getting 0 as default, and sorting are all reasonably easy in CL, but splitting a line into words is a whole project on its own. And getting a command-line argument requires implementation-specific facilities that aren't standardized by CL! At least string-downcase exists. It was a lark, so I didn't finish.

    (In Forth you'd almost have to write something equivalent to Knuth's Pascal, because it doesn't come with even hash tables and case conversion.)

    My experience with Smalltalk is more limited but similar. You can do anything you want in it, it's super flexible, the tooling is great, but almost everything requires you to just write quite a bit more code than you would in Perl, Python, Ruby, JS, etc. And that means you have more bugs, so it takes you longer. And it doesn't really want to talk to the rest of the world—you can forget about calling a Squeak method from the Unix command line.

    Smalltalk and CL have native code compilers available, which ought to be a performance advantage over things like Perl. Often enough, though, it's not. Part of the problem is that their compilers don't produce highly performant code, but they certainly ought to beat a dumb bytecode interpreter, right? Well, maybe not if the program's hot loop is inside a regular expression match or Numpy array operation.

    And a decent native code compiler (GCC, HotSpot, LuaJIT, the Golang compilers, even ocamlopt) will beat any CL or Smalltalk compiler I have tried by a large margin. This is a shame because a lot of the extra hassle in Smalltalk and CL seems to be aimed at efficiency.

    (Scheme might actually deliver the hoped-for efficiency in the form of Chez, but not Chicken. But Chicken can build executables and easily call C. Still, you'd need more code to solve this problem in Scheme than in Lua, much less Ruby.)

    —·—

    One of the key design principles of the WWW was the "principle of least power", which says that you should do each job with the least expressive language that you can. So the URL is a very stupid language, just some literal character strings glued together with delimiters. HTML is slightly less stupid, but you still can't program in it; you can only mark up documents. HTTP messages are similarly unexpressive. As much as possible of the Web is built out of these very limited languages, with only small parts being written in programming languages, where these limited DSLs can't do the job.

    Lisp, Smalltalk, and Forth people tend to think this is a bad thing, because it makes some things—important things—unnecessarily hard to write. Alan Kay has frequently deplored the WWW being built this way. He would have made it out of mobile code, not dead text files with markup.

    But the limited expressivity of these formats makes them easier to read and to edit.

    I have two speech synthesis programs, eSpeak and Festival. Festival is written in Scheme, a wonderful, liberating, highly expressive language. eSpeak is in C++, which is a terrible language, so as much as possible of its functionality is in dumb data files that list pronunciations for particular letter sequences or entire words and whatnot. Festival does all of this configuration in Scheme files, and consequently I have no idea where to start. Fixing problems in eSpeak is easy, as long as they aren't in the C++ core; fixing problems in Festival is, so far, beyond my abilities.

    (I'm not an expert in Scheme, but I don't think that's the problem—I mean, my Scheme is good enough that I wrote a compiler in it that implements enough of Scheme to compile itself.)

    —·—

    SQL is, or until recently was, non-Turing-complete, but expressive enough that 6 lines of SQL can often replace a page or three of straightforward procedural code—much like Perl in the example above, but more readable rather than less.

    Similarly, HTML (or JSX) is often many times smaller than the code to produce the same layout with, say, GTK. And when it goes wrong, you can inspect the CSS rules applying to your DOM elements in a way that relies on them being sort of dumb, passive data. It makes them much more tractable in practice than Turing-complete layout systems like LaTeX and Qt3.

    —·—

    Perl and Forth both have some readability problems, but I think their main difficulty is that they are too error-prone. Forth, aside from being as typeless as conventional assembly, is one of the few languages where you can accidentally pass a parameter to the wrong call.

    This sort of rhymes with what I was saying in 02001 in https://paulgraham.com/redund.html, that often we intentionally include redundancy in our expressions of programs to make them less error-prone, or to make the errors easily detectable.

    • igouy 6 minutes ago

      > And it doesn't really want to talk to the rest of the world—you can forget about calling a Squeak method from the Unix command line.

      You seem absolutely certain!

      Here's an example of a Pharo Smalltalk program call on the Ubuntu command line, with the calculation result written to stdout --

          /opt/src/pharo-vm-Linux-x86_64-stable/pharo --headless nbody.pharo_run.image Include/pharo/main.st 50000000
      
          -0.169075164
          -0.169059907
      
      
      https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

          ~
      
      Here's a corresponding Perl program --

      https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

    • codys 13 hours ago

      The article in CACM that presents Knuth's solution [1] also includes some criticism of Knuth's approach, and provides an alternate that uses a shell pipeline:

          tr -cs A-Za-z $'\n' |
          tr A-Z a-z |
          sort |
          uniq -c |
          sort -rn |
          sed ${1}q
      
      (I converted a newline to `$'\n'` for readability, but the original pipeline from the article works fine on a current MacOS system)

      1: https://dl.acm.org/doi/pdf/10.1145/5948.315654

      • mek6800d2 11 hours ago

        With great respect to Doug McIlroy (in the CACM article), the shell pipeline has a serious problem that Knuth's Pascal program doesn't have. (I'm assuming Knuth's program is written in standard Pascal.) You could have compiled and run Knuth's program on an IBM PC XT running MS-DOS; indeed on any computer having a standard Pascal compiler. Not so the shell pipeline, where you must be running under an operating system with pipes and 4 additional programs: tr, sort, uniq, and sed.

        McIlroy also discusses how a program "built for the ages" should have "a large factor of safety". McIlroy was worried about how Knuth's program would scale up to larger bodies of text. Also, Bentley's/McIlroy's critique was published in 1986, which I think was well before there was a major look into Unix tools and their susceptibility to buffer overruns, etc. In 1986, could people have determined the limits of tr, sort, uniq, sed, and pipes--both individually and collectively--when handling large bodies of text? With a lot of effort, yes, but if there was a problem, Knuth at least only had one program to look at. With the shell pipeline, one would have to examine the 4 programs plus the shell's implementation of pipes.

        (I'm not defending Pascal, but Knuth, Bentley, and McIlroy are always worth reading on any topic -- thanks for posting the link!)

        Bringing this back to Forth, Bernd Paysan, who needs no introduction to the people in the Forth community, wrote "A Web-Server in Forth", https://bernd-paysan.de/httpd-en.html . It only took him a few hours, but in fairness to us mortals, it's an HTTP request processor that reads a single HTTP request from stdin, processes it, and writes its output to stdout. In other words, it's not really a full web server because it depends on an operating system with an inetd daemon for all the networking. As with McIlroy's shell pipeline, there is a lot of heavy lifting done by operating system tools. (Paysan's article is highly recommended for people learning Forth, like me when I read it back in the 2000s.)

  • cjfd 9 hours ago

    One thing I note is that all of the languages you name are very far from the machine. Forth is also not close to the modern machine. Note that it only has two integer types, and the larger one can be aligned either way unless you make sure it is not.

    • lycopodiopsida 9 hours ago

      > One thing I note is that all of the languages you name are very far from the machine

      Common Lisp is one step away from assembly: you can disassemble any function, and doing so is, in fact, a valid strategy if one wants to check the compiler's optimizations.

      • cjfd 7 hours ago

        I googled a bit on how common lisp is compiled. Apparently it is possible to add some sort of type hints and ensure that parameters/variables have a certain type. If one uses that for most code, it would potentially be enough to qualify as being close to the machine.

        • vkazanov 6 hours ago

          Yes, in a way common lisp code can be locally lowered to a well-typed language.

          What people do is just write code the way they usually write dynamic Lisp and then add types to functions where necessary for performance.

          SBCL generates good assembly, btw.

        • Zambyte 2 hours ago

          What does "close to the machine" mean to you?

          • cjfd 2 hours ago

            To me it means that one attempts to use the machine well, i.e., to avoid introducing overheads that have nothing to do with the problem one is trying to solve. As an example of something that is very far from the machine, imagine wanting to add some integers together. One can do this in untyped lambda calculus by employing Church numerals. If one looks at the memory representation, your numerals are now a linked list of a size equal, or proportional, to the number. However, the machine actually has machine-language instructions to add numbers in a much more efficient way. For this discussion maybe the most relevant example is that using dynamic typing for algorithms that don't need it is distant from the machine, because every value now has a runtime type label that is actually not needed: if your program could actually be statically typed, one would know in advance what the type labels are, so they are redundant.
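            To make the Church-numeral overhead concrete, here is an illustrative Python sketch (mine, just to show the shape of the representation): a numeral is a chain of function applications, so adding 3 and 4 costs seven closure calls where the machine would use a single ADD instruction.

```python
def church(n):
    """Build the Church numeral for n: a function applying f to x n times."""
    if n == 0:
        return lambda f: lambda x: x
    return lambda f: lambda x: f(church(n - 1)(f)(x))

def add(m, n):
    """Church addition: apply f m times on top of n applications."""
    return lambda f: lambda x: m(f)(n(f)(x))

def to_int(c):
    """Recover a machine integer by counting the applications."""
    return c(lambda k: k + 1)(0)

# to_int(add(church(3), church(4))) evaluates to 7, but via seven
# nested closure calls rather than one machine addition.
```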

    • fwip 3 hours ago

      There are many Forths, and an implementer can and should define words that map well to the target hardware.

  • JonChesterfield 14 hours ago

    They scale extremely effectively to large problems solved by a team size of one, maybe two.

    The story goes that changing the language to fit how you're thinking about the problem is obstructive to the rest of the people thinking about the same problem.

    I'm pretty sure this story is nonsense. Popular though.

  • username223 3 hours ago

    I don't think there's a unifying reason why programming languages languish in obscurity; it's certainly not because they're "too powerful." What does "powerful" even mean? I used to care more about comparing programming languages, but I mostly don't these days. Actually used/useful languages mostly just got lucky: C was how you wrote code for Unix; Python was Perl but less funny-looking; Ruby was Rails; JavaScript is your only choice in a web browser; Lisp had its heyday in the age of symbolic AI.

    Forth and (R4RS) Scheme are simple to implement, so they're fun toys. Some other languages like Haskell have interesting ideas but don't excel at solving any particular problems. Both toy and general-purpose programming languages are plentiful.

    • lycopodiopsida 2 hours ago

      As with big fortunes, no one wants to hear the truth that a lot of them exist due to simple luck. There is a significant amount of post-hoc rationalization explaining the success by some almost magical virtues. Or even explaining the success by the lack of such virtues: "worse is better" and so on.

  • elitepleb 13 hours ago

    frankly it's a miracle any of them scaled at all, such popularity mostly comes down to an arbitrary choice made decades ago by a lucky vendor instead of some grand overarching design

tombert 13 hours ago

Forth has been a peripheral fascination of mine for about a decade, just because it seems to do well at nearly every level of the software stack. Like a part of me wants to build a kernel with it, or make a web server, or anything in between.

I've never actually done any Forth, though, just because it's a bit arcane compared to the C-inspired stuff that took over.

jll29 8 hours ago

FORTH has some elegance and it's so simple that it is tempting to implement it.

However, no language should permit defining the value of 4 by 12, as there is no situation in which this can bring more good than harm in the long term.

Another issue that affects FORTH, but also Perl and other languages, is that they deal with a lot of things implicitly (e.g. the stack, or arguments to functions). Most people agree that explicit is easier to read than implicit.
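For concreteness, the redefinition in question is a one-liner. A hypothetical session (standard Forths search the dictionary before trying to parse a token as a number, so the new word shadows the numeric literal):

```forth
: 4 12 ;   \ defines a word named "4" that pushes 12
4 4 + .    \ prints 24, not 8: both tokens now find the word, not the number
```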

  • ErroneousBosh 7 hours ago

    > However, no language should permit defining the value of 4 by 12, as there is no situation in which this can bring more good than harm in the long term.

    A Skil saw will happily let you stick your fingers in the spinning blade, yet most people know that this is a stupid and dangerous thing to do.

    • schwartzworld 4 hours ago

      Lots of saws have safety features to keep fingers from being removed. It happens all the time.

snitty 3 hours ago

I remember programming in Forth on my Palm Pilot, as there was a Forth interpreter for it.

praptak 6 hours ago

"Working without names (also known as implicit or tacit or point-free programming) is sometimes a more natural and less irritating way to compute. Getting rid of names can also lead to much more concise code. And less code is good code."

Does Forth really reduce the burden of naming things? You don't name results, but don't you have to pay for it with the burden of naming words? (My impression is that there are more words in a Forth program than functions in an equivalent program in a language that has named variables.)

  • 7thaccount 5 hours ago

    The quote makes more sense IMO for array languages like J that support a tacit style. J's "trains" just make things flow without a lot of variables. Aaron Hsu's Co Dfns compiler (spoken about on here and YouTube) also uses this style with Dyalog APL.

    Forth is concatenative, so you can build the words on top of each other without worrying about a ton of variables. So I think it's partially true for Forth.

  • vdupras 5 hours ago

    Yeah, I wouldn't have phrased it like in the article either. What I'd say is that Forth is more about naming processes than variables.
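    To illustrate, the style trades named intermediate values for named steps; a sketch with hypothetical words:

    ```forth
    \ No intermediate value is named; each process is.
    : square       ( n -- n*n )       dup * ;
    : sum-squares  ( a b -- a*a+b*b ) square swap square + ;
    3 4 sum-squares .   \ prints 25
    ```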

peter303 7 hours ago

RPN interpreters require very little core memory. So they were popular with computers where core memory was under ten kilobytes.

But it's horrible for software engineering with multiple programmers and large codebases. It lacks the structures, interfaces, modules, and data abstraction that you expect in a modern language. We called it the "Chinese food" of coding: ten minutes later you had no idea what you just coded.

kragen 14 hours ago

This is, I think, the best overview of Forth, and computing as a whole, that I've ever seen.

shevy-java 12 hours ago

Back in 2004 or so - ancient days now - I remember an elderly programmer on #gobolinux (freenode IRC back in the days) who kept on praising Forth. I never understood why, but he liked Forth a lot.

Now - that in itself doesn't mean a whole lot, as it is just anecdotal, but people who are very passionate about programming languages are quite rare. I've not seen something like that happen with any other language (excluding also another guy on #gobolinux who liked Haskell). I did not see anyone praise, say, PHP, Perl, JavaScript, etc.

Some languages people don't like to talk about much. Forth, though, was different in that regard. I never got into it; I feel the modern era has passed it by, as with many other languages, but I still remember that guy who kept talking about it. His website was also built in Forth, and oddly enough it was kind of an "interactive" website (perhaps he also used JavaScript, I forget, but I seem to remember he said most or all of it was implemented in Forth: turtles all the way down).

mkovach 4 hours ago

I first encountered Forth on a TI-99/4A, complete with that magnificent expansion box that looked like industrial HVAC equipment. Hearing me complain about TI Extended BASIC's glacial pace, my parents saw in one of my magazines that Forth was faster and bought it hoping I would find it helpful.

It was mind-bending but fascinating. I managed a few text adventures, some vaguely Pac-Man-esque clones, and a lingering sense that I was speaking a language from another dimension.

I've since forgiven my parents. Forth resurfaces now and then, usually when I reread Leo Brodie's thought-provoking Forth books, and I feel like I'm decoding the sacred texts of a minimalist cult. I came away thinking better, even if I've never completely caught up with the language.

fennec-posix 13 hours ago

A long read but one that's quite incredible. Has definitely helped my understanding of computing get closer to the metal so to speak.

rtpg 11 hours ago

it mentions that sometimes not naming things is great, but... what does naming intermediate values in Forth look like? Is there even a naming scope that would allow me to give values names, in case I don't want to get entirely lost in the sauce?

nakamoto_damacy 12 hours ago

I wish "Simple Made Easy," by Rich Hickey, could be applied here. Forth is simple but not easy. If there is something as simple as Forth but also accessible to mere mortals (aka easy) then I'd like to know what it is (I don't consider Clojure itself as a language to be simple in this sense).

dlcarrier 14 hours ago

That is assuming that you, with German grammar, write.

  • DavidSJ 14 hours ago

    I believe, that you that sumes as mean.

alganet 4 hours ago

There's a certain mesmerizing effect that creeps in once you start digging into programming language fundamentals.

Any kind of notation, really, can do that to a person. It's kind of hypnotic.

I avoid it like the plague (getting too much into it). Not because I dislike it, but because I like it so much.

I believe the ideal programming language must be full of problems, and then obvious ways to get around those problems. It's better than a near-perfect language with one or two problems that are very hard to get around.

The "Stop Writing Dead Programs" video mentioned is quite nice. It's surprising how much the web serves as a platform for many of the languages the presenter offers as inspiration.