interroboink 3 days ago

Somewhat tangential: this reminded me of Neal Stephenson's "In the Beginning was the Command Line" [1] which speaks highly of BeOS.

I liked the bug report "R4: BeOS missing megalomaniacal figurehead to harness and focus developer rage" (:

[1] https://web.stanford.edu/class/cs81n/command.txt

  • troyvit 2 days ago

    Man I read that during my formative years as a computer guy and it helped me big-time make the switch from MacOS to Linux. That and "The Cathedral and the Bazaar."[1] I don't know how well that one stands up to the test of time though. I'm going to re-read your link now.

    http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral...

    • ameliaquining 2 days ago

      The most immediately obvious answer to that question, shared by Stephenson himself, is that the essay quickly failed the test of time because it did not anticipate Mac OS X, which breaks the dichotomy of user-friendly-but-anemic Classic Mac OS vs. powerful-but-forbidding Unix.

      (I personally still don't really like Apple systems and don't use them, but I think it's clear that by the standards of the time OS X squared this circle.)

      • troyvit a day ago

        I was speaking to ESR's essay holding up to the test of time, but you have a good point. I'd argue, however, that the lockdown Apple imposes on OS X, such as signed system volumes, keeps them firmly planted in a cathedral. That said, I don't know why I'd argue with Neal Stephenson about this stuff to begin with, so maybe I should stop typing :)

        • ameliaquining a day ago

          IIUC cathedral-style development of open source projects (i.e., intermediate development versions aren't publicly available, you can only get official releases) is now very rare; Android is the only project I can think of off the top of my head that works that way. So in that respect ESR's perspective clearly won out. Of course an essay of that length makes any number of claims and some will have aged better than others.

          • interroboink a day ago

            I think one could argue that FreeBSD is more cathedral-style than Linux, and there are other examples, especially for programming languages, where the inventor(s) want to avoid it being pulled in too many directions at once (extreme example: Jai).

            I don't know that it's really such a binary either/or decision, more of a spectrum of "more cathedral-y" and "more bazaar-y."

DeathArrow 3 days ago

The tutorial is from 2010. At the time people were much more enthusiastic about Haiku and other alternative operating systems.

Now we only have Windows and the Unix-likes.

I still wonder what would have happened had Jean-Louis Gassée been less greedy and Apple acquired BeOS instead of NeXT.

I discovered BeOS in 2000 and it seemed to me much more interesting at that moment than either Windows or Linux. Not only did it look and feel better, it introduced other ideas and concepts.

I had hopes its adoption would increase, but it soon withered and died.

I still wonder why we can't do better than Unix and Windows. Unix is 50 years old and Windows is old, too. There should be better concepts out there waiting to be discovered and implemented.

At some point there were many companies, universities, groups and individuals involved in researching and implementing operating systems.

At that point I was following the OSNews website daily, and each day there was news about some new and exciting development.

Not anymore.

I miss the days when I read about BeOS, Syllable, AROS, MorphOS, AtheOS, SkyOS, Plan 9, Inferno, Singularity. And there were a ton of interesting kernels, too.

  • WillAdams 2 days ago

    I'm faintly surprised that the Raspberry Pi hasn't made OS-experimentation more of a thing.

    There are:

    https://www.raspberrypi.com/news/risc-os-for-raspberry-pi/

    https://wiki.sugarlabs.org/go/Installation

    https://9p.io/wiki/plan9/Raspberry_Pi/index.html

    as well as a couple of RTOS options: https://all3dp.com/2/rtos-raspberry-pi-real-time-os/ and QNX: https://pidora.ca/qnx-on-raspberry-pi-the-professional-grade...

    and TRON: https://www.tron.org/blog/2023/09/post-1767/ (but that's only on the Pico? I'd love to see it as a desktop OS on an rPi5)

  • MisterTea 2 days ago

    > I still wonder why we can't do better than Unix and Windows. Unix is 50 years old and Windows is old, too. There should be better concepts out there waiting to be discovered and implemented.

    Plan 9. It's a Unix built on top of a network. Each process can be thought of as a container, as each gets a namespace consisting of a table of mounts and binds of resources. Those resources are served via an architecture-neutral RPC protocol, 9P, which presents a tree of objects we call files. The server of these resources is called a file server and is akin to a micro-service that runs bare metal on the CPU; the kernel is the container host. These resources are protected by Unix permissions verified by the kernel against a security service called factotum, itself a 9P server. Cloud-ready, bare-metal micro-service computing started in the late '80s and nobody noticed.
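
    To give a flavour of how small 9P is on the wire: every message is a length-prefixed, little-endian packet of the form size[4] type[1] tag[2] followed by type-specific fields. A minimal sketch (mine, in portable C++ rather than Plan 9 C, purely for illustration) that builds the Tversion message a client sends to open a session:

        #include <cstdint>
        #include <cstdio>
        #include <string>
        #include <vector>

        // Append little-endian integers, as 9P2000 requires.
        static void put8(std::vector<uint8_t>& b, uint8_t v)   { b.push_back(v); }
        static void put16(std::vector<uint8_t>& b, uint16_t v) { put8(b, v & 0xff); put8(b, v >> 8); }
        static void put32(std::vector<uint8_t>& b, uint32_t v) { put16(b, v & 0xffff); put16(b, v >> 16); }

        int main()
        {
            // Tversion: size[4] type[1]=100 tag[2]=NOTAG msize[4] version[s]
            const std::string version = "9P2000";
            std::vector<uint8_t> msg;
            put32(msg, 0);                         // placeholder for total size
            put8(msg, 100);                        // message type: Tversion
            put16(msg, 0xFFFF);                    // NOTAG, used during version negotiation
            put32(msg, 8192);                      // msize: largest message we will send
            put16(msg, (uint16_t)version.size());  // strings are length-prefixed
            for (char c : version) put8(msg, (uint8_t)c);

            // Patch in the real size; the size field counts itself.
            uint32_t size = (uint32_t)msg.size();
            for (int i = 0; i < 4; i++) msg[i] = (size >> (8 * i)) & 0xff;

            for (uint8_t byte : msg) printf("%02x ", byte);  // 19 bytes total
            printf("\n");
            return 0;
        }

    Every other operation (Twalk, Topen, Tread, ...) has the same framing, which is a big part of why 9P file servers stay small.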

    > I miss the days ... Plan 9, Inferno, ...

    Still alive and kicking. Head over to 9front.org for a maintained fork of each. Patches arrive nearly daily to 9front and the community is encouraged to contribute. 9front also runs my home network because setting up DHCP, DNS and TFTP for PXE is dumb simple.

  • terhechte 3 days ago

    Just wanted to say that I too read OSNews daily, from somewhere around 2002 to 2007 or 2008. So many interesting things going on.

    With the proliferation of web apps and containers, alternative operating systems are actually more feasible today. At the same time, I'm dependent on all the niceties that the macOS/iOS ecosystem offers (the integration, the sync). Something I wanted to look into is just running macOS in a very default way and then using a fast OS (such as Haiku, though their Arm64 support is not very good yet) in a fullscreen VM. With modern Apple Silicon, there's almost no performance penalty.

  • ForHackernews 2 days ago

    https://serenityos.org/

    https://collapseos.org/

    http://kolibrios.org/en/

    • lproven 2 days ago

      > https://serenityos.org/

      Trad Unix, redone in C++; not self-hosting; project lead has quit to work on a browser.

      > https://collapseos.org/

      Interesting but mainly for extreme smallness. Forth will alienate a lot of people. For me, Oberon would have been a more interesting basis.

      > http://kolibrios.org/en/

      Hostile fork of MenuetOS; based on 25-year-old code.

      I mean, yes, it is good there's interesting stuff, but these are not inspiring examples IMHO.

  • anthk 3 days ago

    RISC OS is still alive; it works great on a Raspberry Pi B+ and up. Just get an industrial-grade SD card.

    I suggest getting the RISC OS Direct 'distro' and then the update that adds wireless support. Read the documentation first, of course.

    RISC OS is very similar to Windows in usage (more like an Amiga); once you read the docs you will get it ready in no time. You have the old NetSurf browser on its native platform; it's fun.

    As for Plan 9, 9front superseded it.

  • flohofwoe 3 days ago

    > I still wonder what would have happened had Jean-Louis Gassée been less greedy and Apple acquired BeOS instead of NeXT.

    Nothing different would have happened. You would have used C++ instead of Objective-C to write macOS UI programs, but other than that macOS would be in the same shitty state as it is today (assuming a wildly popular iPhone had also happened in that alternative timeline and taken the focus away from desktop UI development).

    One thing would probably be different: macOS might be less popular amongst developers because BeOS wasn't a UNIX clone (but 'Linux on the desktop' might actually be in a better state).

    • os2warpman 2 days ago

      >I still wonder what would have happened had Jean-Louis Gassée been less greedy and Apple acquired BeOS instead of NeXT

      Internal efforts to get BeOS's security levels up to even Mac OS 9 levels would have bankrupted 1997 Apple.

    • z3phyr 2 days ago

      Most of the developers of the microcomputer-revolution era programmed in non-Unix environments.

      Even in the workstation space, the cool Lisp/Smalltalk/... developers hated Unix, but the small market was filled with Unixes.

      Even today, most native developers use Windows!

    • b800h 3 days ago

      Yes, this: OS X wouldn't have become a near-ubiquitous developer desktop. That would have been very bad for Apple.

      • anthk 3 days ago

        The rest of the world doesn't use OS X at all.

        • lproven 2 days ago

          I beg to differ.

          I am a techie, I travel a lot, and I've worked for both of the two biggest enterprise Linux vendors.

          Windows boxes are in use in both of them, and Windows laptops are everywhere -- but the second most popular platform after Windows, and the second most popular after Linux inside the Linux vendors, is macOS.

          It's everywhere.

          The partial exception was when I moved to Czechia. A decade ago it was still relatively poor. Few iDevices, few Android phones, lots of Windows Mobile then. Not many Macs except inside big companies.

          But that's changed. Now they're everywhere there, too.

          macOS is huge. I see Macs everywhere, far, far more than I ever see Chromebooks in the real world. I think most Chromebooks are probably in schools.

          • anthk 2 days ago

            I'm from Spain. Europe is Androidlandia, that's it. When a Xiaomi device is as good as most iPhones at a far lower price (for basic needs), Android curb-stomps Apple there.

            At work, Windows. Forget Apple, or Chrome. Companies will set up AD and Windows.

            On backends and high-tier servers, development machines, and so on: GNU/Linux, hands down. The IT companies virtualize Windows machines with KVM, set up an AD domain under a VM and call it a day. No Apple there.

            OTOH, I have to say that iPads have a much better touchscreen for handwritten stylus input. Hope Android solves that soon.

            • lproven a day ago

              > Europe is Androidlandia, that's it.

              I reject all your generalisations.

              I'm an Irish citizen who until 2023 lived in Brno then Prague. Since I moved to the Isle of Man I've travelled and worked in Latvia, the Netherlands, Spain, Germany, Belgium, Czechia, Austria, England, Ireland, and Scotland, from memory. While I lived in Czechia I routinely travelled in all the countries it borders.

              I have seen more iPhones in use than any single make of Android, and more iPads than all other forms of tablet put together. MS Surface tablets are #2 and outnumber all Android tablets put together. I am typing on a work-provided MacBook Air and my boss also uses MacBooks, and I think the majority of the company works on them, but we stopped having a central office years ago so my sampling is very ad hoc.

              At both the enterprise Linux vendors where I've been a paid full-time member of staff, most managers and marketing people use MacBooks. (IMHO this is a damning indictment of desktop Linux but that's incidental.)

              My own direct observations over the last decade refute your claims. I can't give you numbers, but let's put it this way: I had to use Apple Messages from my iMac to contact the majority of the guests I invited to my wedding in 2023, because they're not on any of the systems I use: WhatsApp, Signal, or Telegram.

              Your experience is NOT representative.

              • anthk 17 hours ago

                >managers and marketing

                You said it all.

              • anthk 17 hours ago

                >Managers and marketing

                Ah, well, the most irrelevant part of a company compared to the actual product development.

                Thus, no wonder everyone sees Macs as fancy toys to show off instead of to do actual work. The days of OS X on a G4 being a really good system for A/V and press/journalism production are long gone.

                It can still be good for that, but any Windows machine, or even a Linux machine with Krita and some mid-range A/V tools on an -rt kernel with PipeWire, can destroy OS X on performance. Ardour is no joke, and people have tools like DaVinci Resolve. There's no need to spend $4-6k on a Mac Pro any more. Pick any high-end Nvidia card with hardware encoding/decoding and A/V production becomes trivial.

                If OS X is just a tool for bullshit presentations, OS X is doomed on the desktop.

                The iPad is everything else. It's really good at handwriting, really good for students at uni doing tons of writing and notes, and of course for painting and photo manipulation.

      • graemep 3 days ago

        Really? Most users I have come across are not developers. A high proportion of developers use it, but the proportion of users who are developers is quite low.

        • bigstrat2003 2 days ago

          I don't even think it's true that a high proportion of developers use Macs. My guess would be that most developers use Windows, because most developers work for corporations and most corporations issue Windows machines to all employees. Companies which offer a choice are in the minority.

        • actionfromafar 3 days ago

          Developers drive adoption (not least by creating software) and high specs.

          • graemep 3 days ago

            Yes, but those developing for Apple OSes would have adopted whatever OS Apple used.

            The developers who pick it because it is a Unix are people who develop for Unix-like OSes, mostly for servers, so they mostly do not produce software that drives adoption by people who are not developers.

            • actionfromafar 2 days ago

              This is true. The only upside I can see is that the developer tools available for "unix-ish" systems were really strong (if clunky at times).

        • cbzbc 3 days ago

          I think you'd have got a much smaller ecosystem of third-party apps for macOS (think OmniGraffle etc.). Whether that would have made a substantive difference I can't say.

  • kitd 3 days ago

    OS/2 was created by the biggest computer company in the world at the time, yet even it couldn't get enough traction.

    In the world of OSes, "the same as what everyone else is using" is much more important than "new takes on old concepts".

    • pjmlp 3 days ago

      Of course; when its hardware requirements made it more expensive than an MS-DOS 5/Windows 3.1 combo, what would you expect?

      In today's money, going with OS/2 instead would have cost me 1000 euros more when I bought my 386 PC.

    • piokoch 3 days ago

      It came a few years too early. I remember my shock when I got [1] my OS/2 copy and it was on 10 (yes, ten!) 3.5-inch floppy disks, while Windows needed two (one for main OS, one for some utilities).

      [1] This was 1987, in still-communist Poland, so I bought a pirated copy at the famous Warsaw computer bazaar on Grzybowska Street.

      • sixothree 2 days ago

        Agreed. Their timing was not right. But once Windows 95 hit, it was over. Also, the RAM requirements made it pretty much impossible for people to recommend to their friends and family.

        I really had a special fondness for OS/2. But using it today, it really is a quirky thing. Maybe if it had won I wouldn't be thinking that way.

        • lproven 2 days ago

          > Their timing was not right.

          I agree.

          https://www.theregister.com/2025/01/05/microsoft_os2_flop_fu...

          I think OS/2 1.x should have targeted the 386, in the 1980s.

          > Also, the RAM requirements made it pretty much impossible for people to recommend to their friends and family.

          I bought OS/2 2.0 with my own cash, and ran it on several different 386SX machines in 4MB of RAM. It was usable on that spec.

          Any good spec of machine for Windows 3.0 could run OS/2 2.x usefully without being unpleasant.

      • lproven 2 days ago

        I don't think you are remembering correctly.

        > It came a few years too early. I remember my shock when I got [1] my OS/2 copy and it was on 10 (yes, ten!) 3.5-inch floppy disks

        Not so bad, really.

        > while Windows needed two (one for main OS, one for some utilities).

        No version of Windows came on 1 floppy.

        Windows 1.01 took 4 360k disks -- here's a picture:

        https://www.firstversions.com/2015/05/microsoft-windows.html

        Windows 2 took 8 360k disks:

        https://archive.org/details/microsoft-windows-v2.0

        Windows 3 took 8, even on DS/DD 720 kB disks:

        https://archive.org/details/windows_3.00_english_with_ms-dos...

        By Windows 95 it was up to 27 high density 1.4MB disks:

        https://www.reddit.com/r/interestingasfuck/comments/uopb1n/t...

        10 is not bad at all for a full preemptive multitasking x86-32 OS with a GUI, IMHO.

  • 7thaccount 2 days ago

    I too long for different ways of interacting with a computer.

    Unfortunately, I think the ship may have sailed, as it's getting too hard both to start from scratch and to provide support for everything from a web browser to drivers and so on. It was a lot easier when the to-do list was 1/100th the size. The workaround is to utilize what has already been done, but then that kind of defeats the entire purpose and you just get a slightly different flavor of the same thing.

    The only other real option would be a radical revision of what a computer is. Something really simple, maybe a bit closer to what Carl Sassenrath was doing with IOS (no, not Apple iOS, but the Internet OS via REBOL thing) iirc, or what the Forth folks do with hardware. I think Alan Kay may have talked about this as well from a Smalltalk perspective. The question is whether you can do anything interesting with it. I'm sure there are dozens of us that would give up YouTube and social media to have a fully understandable computing system :)

    • jodrellblank a day ago

      > "I too long for different ways of interacting with a computer."

      Like what?

      People built chording/pen/handwriting/gesture/Kinect/Wiimotes/multitouch/mouse gesture/swipe gesture/joysticks/Griffin Powermate/3Dconnextion Spacemouse/motion tracking/eye tracking/voice recognition/etc. on current operating systems, what do you want in the way of interaction that couldn't be built with a current OS?

      • 7thaccount a day ago

        I'm not talking about hardware devices. I'm talking about a very different combination of OS, hardware, and how applications are made and shared. Something closer to a Lisp/Smalltalk OS I guess, but that isn't really right either. Windows is a turd and Linux/Mac aren't perfect either. Sometimes I feel like it's turds all the way down. A lot of this is just my ignorance and hubris (I admit that), but I also feel like we somehow got off on the wrong foot and now we've built this terrifying and brittle tower of complexity.

    • DeathArrow 2 days ago

      >It was a lot easier when the to-do list was 1/100th the size

      Things in the past look simple only if we look at them through today's lens.

      The apparent complexity was the same.

      Today I work with microservices, on top of Kubernetes, on top of cloud services, and I have to know a gazillion things. But I don't have the feeling that I had an easier time when I was a kid playing with C/C++ under DOS, learning assembly, writing terminate-and-stay-resident programs, trying to write simple device drivers, or trying to program the graphics hardware using whatever limited info I had access to. When I started writing desktop applications using Win32 and Qt, it didn't seem simpler than now. Learning how to use Linux syscalls and how to program for X11 in 2000 didn't seem simpler either.

      Of course, the software was much simpler, but now we have better tools, far more easily accessible information, and practices, standards and methodologies to help. And since we have huge resources, we don't have to extract every last drop of performance from the hardware.

      So I don't think the life of the average programmer in the '70s, '80s, '90s or 2000s was easier than now.

      It only seems easy if we solve problems from 40 years ago using the knowledge, tools and hardware we have now.

      • 7thaccount a day ago

        We may be talking past each other.

        I don't think it is harder on programmers now (or was easier for programmers in the past). I can do things in Python that would have made me a 10x programmer 40 years ago, if the hardware could have supported it. They also didn't have Stack Overflow back then... just a dog-eared copy of some old C book. And they didn't have hardware like ours, next to which a supercomputer from back then looks like something you'd put in a toaster. The challenges they faced were numerous.

        My point is that the actual complexity layers are much worse now. Back in the Commodore 64 days, many users knew the machine inside and out. They could program in assembly, do graphics on the display, etc., all while understanding exactly what was going on. None of that was as easy or as efficient as what I can do in Excel today or in some 3D graphics program, but it was something you could wrap your head around. Today, we have huge monolithic amounts of code or hardware for everything. I don't understand anything about my hardware, I don't understand the millions of lines of Windows, or the millions of lines in Microsoft Office, or how my web browser works, or how Unreal Engine was built, etc. It's the product of millions of people working together to create something beyond the limits of a single human.

        If we wanted to truly start from scratch, there's no way (that I can see) we could reinvent all of that and actually get a large number of users. It's not impossible, but Herculean. You could do something if the to-do list was MUCH smaller, like what was done with TempleOS or Collapse OS.

  • lproven 2 days ago

    > I still wonder what would have happened had Jean-Louis Gassée been less greedy and Apple acquired BeOS instead of NeXT.

    I'm paraphrasing an old blog post of mine, but...

    In today's world, spoiled with excellent development tools, everyone has forgotten that late-1980s and early-to-mid 1990s dev tools were awful: 1970s text-mode tools for writing graphical apps.

    Apple acquired NeXT because it needed an OS, but what clinched the deal was the development tools (and the return of Jobs, of course). NeXT had industry-leading dev tools. Doom was written on NeXTs. The WWW was written on NeXTs.

    Apple had OS choices – modernise A/UX, or buy BeOS, or buy NeXT, or get bought and move to Solaris or something – but nobody except NeXT had Objective-C and Interface Builder, or the NeXT/Sun foundation classes, or anything like them.

    The meta-irony being that if Apple had adapted A/UX, or failing that, had acquired Be for BeOS, it would be long dead by now, just a fading memory for middle-aged graphic designers. Without the dev tools, they'd never have got all the existing Mac developers on board, and never got all the cool new apps – no matter how snazzy the OS.

  • Kon5ole 2 days ago

    >I still wonder what would have happened had Jean-Louis Gassée been less greedy and Apple acquired BeOS instead of NeXT.

    I'm pretty sure Apple would have gone bankrupt a few months later if they hadn't bought Steve back.

  • erremerre 3 days ago

    I think the main issue is... hardware.

    There is a reason why I have zero problems using Linux on a Raspberry Pi, yet every time I try to install it on a real computer I have lying around I get a myriad of nonsense problems which are particularly hard to solve.

    If we want a new OS, we need to make it for a specific platform that is always identical in terms of hardware. I would say a PS3, Steam Deck, or Nintendo Switch would be good candidates.

    They have plenty of identical hardware units in the market, and you could focus on the OS rather than supporting strange hardware issues.

    You missed my favourite from the list: ReactOS

    • desdenova 3 days ago

      Drivers are precisely the reason we only have Windows, macOS and Linux as viable options.

      Hardware manufacturers write Windows drivers, the Linux community writes drivers for basically all consumer hardware, and Apple develops both the hardware and the OS with their own drivers.

      • DeathArrow 3 days ago

        >Drivers are precisely the reason we only have Windows, macOS and Linux as viable options

        That is one big issue, and another one is software.

        Writing drivers and porting software costs both time and money.

        However, if a new OS brought lots of benefits to both users and companies, it might tip the scale and make the investment of time and money worthwhile.

        Of course, by a new OS I don't mean just another platform that enables us to run software and use hardware, as existing OSes do that just fine.

        By a new operating system, I mean one that enables us to use a new computing paradigm, enables new types of software-to-software and software-to-hardware interactions, and would make a big disruption to the market. Something with the same kind of impact as AI or the introduction of smartphones.

        • actionfromafar 3 days ago

          Everything is so fluid nowadays. Even if such a use case were found, someone would find a way to run Linux on it.

          • whywhywhywhy 2 days ago

            Yeah, but finding a way to run it means having a hobby of debugging software and workarounds on top of your job.

  • pjmlp 3 days ago

    iDevices and Android are not Unix-like; there is hardly any way to write a usable application with POSIX APIs.

    The same applies to cloud computing with language runtimes and serverless.

    While it looks grim, there is some hope for OS lovers.

tialaramex 3 days ago

Because Haiku has taken such an extraordinarily long time, as with a city centre constructed over multiple centuries, we can see the clear lines of new work with new techniques and materials on top of old work done in a way that was usual at the time.

Haiku has a lot of C++98 code, or even pre-standard C++, not least all the stuff re-used with permission from BeOS. As was usual for projects at that time, many fundamental building blocks are provided rather than taken from the language standard. For example there's BString and BList.

Haiku also has seams of BSD code where there'd be a project to do Whatever (WiFi, TLS, drivers, etc.) "properly" in a way unique to Haiku, but as a stop-gap here's some BSD code until our own proper solution is finished -- which of course never happens.

  • waddlesplash 2 days ago

    > with new techniques and materials on top of old work done in a way that was usual at the time.

    But is there any long-lived project for which this isn't true? Linux and the BSDs surely have many components that fall into this category.

    > For example there's BString and BList.

    BString is a much nicer string class to work with (IMO) than std::string. It lacks some modern conveniences, and it has some unfortunate footguns where some APIs return bytes and some return UTF-8 characters (the former should probably all be considered deprecated; indeed, that's a BeOS holdover), but I don't think there's any intent to drop it.
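
    A minimal sketch of that byte/character split (Length() counts bytes; CountChars() counts UTF-8 characters):

        #include <stdio.h>
        #include <String.h>  // Haiku's BString, links against libbe

        int main()
        {
            BString s("héllo");  // the 'é' takes two bytes in UTF-8
            printf("Length():     %d\n", (int)s.Length());      // 6 -- bytes
            printf("CountChars(): %d\n", (int)s.CountChars());  // 5 -- characters
            return 0;
        }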

    BList could be better as well, but it's still a nicer API in many ways than std::vector. Our other homegrown template classes are also nicer, or have particular semantics we want that the STL classes don't, so I don't think we'd ever drop them.

    > Haiku also has seams of BSD code where there'd be a project to do Whatever (WiFi, TLS, drivers, etc.) "properly" in a way unique to Haiku

    What would be the point of implementing WiFi drivers from scratch "uniquely" for Haiku? Even FreeBSD has started just copying drivers from Linux, so that may be in our future as well. I don't know that anyone ever really considered writing a whole 802.11 stack for Haiku; there was some work on a "native" driver or two at one point, but it was for hardware that we didn't have support for from the BSDs, and it still used the BSD 802.11 stack. Writing our own drivers there just seems like a waste of time; we might as well contribute to the BSD ones instead.

    • tialaramex 2 days ago

      > But is there any long-lived project for which this isn't true?

      I don't think any other project like this exists. You're coming up on your 25th anniversary without shipping release software!

      I see that BString itself also uses this weird phrase "UTF-8 character". That's not a thing, and rather than just being technically wrong it's so weird I can't tell what the people who made it thought they meant or what the practical consequences might be.

      I mean, it can't be worse than std::string in one sense because hey at least it picked... something. But if I can't figure out what that is maybe it's not better.

      UTF-8 has code units, but they're one byte, so distinguishing them from bytes means either you're being weird about what a "byte" is or more likely you don't mean code units.

      Unicode has characters, but, well, let's quote their glossary: "(1) The smallest component of written language that has semantic value; refers to the abstract meaning and/or shape, rather than a specific shape (see also glyph), though in code tables some form of visual representation is essential for the reader’s understanding. (2) Synonym for abstract character. (3) The basic unit of encoding for the Unicode character encoding. (4) The English name for the ideographic written elements of Chinese origin. [See ideograph (2).]"

      So given BString is software it's probably working in terms of something concrete. My best guesses (plural, like I said, I'm not sure and I'm not even sure the author realised they needed to decide):

      1. UTF-16 code units. This is the natural evolution of software intended for UCS-2 in a world where that's not a thing, our world.

      2. Unicode code points. If you were stubbornly determined to keep doing the same thing despite the fact UCS-2 didn't happen, you might get here, which is tragic.

      3. Unicode scalar values. Arguably useful, although in an intensely abstract way, the closest thing a bare metal language might attempt as a "character"

      4. Graphemes. Humans think these are a reasonable way to cut up written language, which is a shame because machines can't necessarily figure out what is or is not a grapheme. But maybe the software tries to do this? There have been better and worse attempts.

      I don't love std::vector, but I can't see anything to recommend BList at all: it's all type-erased pointers, it doesn't have the correct reservation API, and it provides its own weird sorting, which doesn't even say whether it's a stable sort.
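
      To make the type-erasure point concrete, here's a sketch of typical BList use (the classic BeOS signatures: AddItem(void*), ItemAt(int32)):

          #include <stdio.h>
          #include <List.h>  // Haiku's BList

          int main()
          {
              BList list;
              int a = 1, b = 2;
              list.AddItem(&a);  // everything goes in as an untyped void*
              list.AddItem(&b);

              for (int32 i = 0; i < list.CountItems(); i++) {
                  // ...and the caller must cast it back, with no type checking.
                  int* value = static_cast<int*>(list.ItemAt(i));
                  printf("%d\n", *value);
              }
              return 0;
          }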

      • waddlesplash 2 days ago

        It's Unicode code points. I don't know why you say this is "tragic", it's a logical unit to work in here.
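
        Counting code points over UTF-8 is cheap, since you only need to skip continuation bytes (the ones matching 10xxxxxx). A sketch of the idea (not necessarily our actual implementation):

            #include <cstdio>

            // Count Unicode code points in a UTF-8 string by counting
            // every byte that is not a continuation byte.
            static int CountCodePoints(const char* s)
            {
                int n = 0;
                for (; *s != '\0'; s++)
                    if ((*s & 0xC0) != 0x80)
                        n++;
                return n;
            }

            int main()
            {
                printf("%d\n", CountCodePoints("héllo"));  // 5
                return 0;
            }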

      • GoblinSlayer 2 days ago

        I suppose it means text encoding is known to be UTF-8.

        • GoblinSlayer 2 days ago

          Edit: oww, the *Chars* method family? Well, that one is bad. The STL is sort of lucky here, as it tried to figure out Unicode when it was already well understood.

Hj8Rd2Qw 2 days ago

Haiku's approach to teaching programming is refreshing: starting with real-world GUI apps right away rather than console exercises. Its simple API design makes this possible while still conveying fundamental concepts through tangible, visual results. A great example of how OS architecture can influence learning outcomes.
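
For anyone curious how small "right away" is, a complete windowed Haiku program fits in a couple dozen lines. A minimal sketch (the app signature is just an example name):

    #include <Application.h>
    #include <StringView.h>
    #include <Window.h>

    class HelloWindow : public BWindow {
    public:
        HelloWindow()
            : BWindow(BRect(100, 100, 400, 200), "Hello", B_TITLED_WINDOW,
                      B_QUIT_ON_WINDOW_CLOSE)
        {
            // A BStringView simply draws a line of text in the window.
            AddChild(new BStringView(Bounds(), "label", "Hello from Haiku!"));
        }
    };

    class HelloApp : public BApplication {
    public:
        // Every Haiku application identifies itself with a MIME signature.
        HelloApp() : BApplication("application/x-vnd.example-hello") {}
        virtual void ReadyToRun() { (new HelloWindow())->Show(); }
    };

    int main()
    {
        HelloApp app;
        app.Run();  // runs until the window closes
        return 0;
    }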

ktallett 3 days ago

This was how I relearned to program in C++, and I ported a few apps to the OS. It was a great experience, and the OS has come on in leaps and bounds. If anything, for many it could be a daily driver nowadays.

Aldipower 3 days ago

This is a great tutorial! Very easy to read and understand. If someone would like to dive into C programming, I would suggest this.

  • 10729287 3 days ago

    That's exactly what I wanted to read and was about to ask. I've always wanted both to contribute to FOSS beyond helping with translation and to learn coding. This seems like the perfect opportunity to check those two boxes, as I've got a lot of sympathy for Haiku OS.

  • sverhagen 3 days ago

    Kindly disagree. You are either an experienced programmer, in which case this starts off way too basic, or you are coming into programming for the first time, in which case: why would an average person start with C++ or whatever Haiku is?!

    • Aldipower 3 days ago

      Nope, if you read carefully, I wrote "dive into C programming". This does not target first-time programmers. But that doesn't mean you have to be an experienced programmer already. Fair enough, I meant C++, not C. :-) I think this tutorial is perfect for near-beginners, but not complete novices, of course.

codr7 2 days ago

I was really into BeOS for a few years, right around the Intel pivot.

Very nice OS, but I remember the programming API being tricky, since everything was multi-threaded.
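
Each BWindow runs its own message-loop thread, so touching a window's views from any other thread means locking it first. Roughly (a sketch from memory, using the locking BWindow inherits from BLooper):

    #include <StringView.h>
    #include <Window.h>

    // Called from a worker thread. The window must be locked before
    // any of its views are touched from outside its own thread.
    void UpdateStatus(BWindow* window, BStringView* status, const char* text)
    {
        if (window->Lock()) {       // blocks until the window thread is free
            status->SetText(text);  // safe: we hold the window's lock
            window->Unlock();
        }
    }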

revskill 3 days ago

But I already know how to program, so why do I need Haiku?

dddw 3 days ago

There is so much love and attention put into Haiku.