I own four Mega STEs. One of them I use almost daily; the rest are spare parts. I produce music with it. The Atari is my MIDI master clock and the centerpiece of my MIDI sequencing, together with Cubase 3.1 for the Atari. Seriously, the MIDI timing is unbeaten to this day! The MIDI ports are attached directly to the ACIA chip, which in turn is directly connected to the Motorola 68k CPU. It runs absolutely stable even 35 years later. No crashes whatsoever, and no distractions from updates or "phone home applications". It just works, distraction-free! Shame on the "present future".
Yes, back in the day, I/O was often really low latency: memory (and therefore buffers) was expensive and gate counts were limited, which meant more direct connections, which meant low latency.
The Atari 2600 for instance was known for "racing the beam", updating the image while it was being drawn on the CRT monitor. A latency measured in pixels rather than frames! It was necessary because the console didn't have enough memory for a framebuffer.
The Atari ST is special for its built-in MIDI ports, and it was made cheaply, which at the time meant direct connections, and that resulted in low latency.
You can have low latency and low jitter today, but you will need to use a microcontroller, not a general-purpose CPU. The old 16/32-bit retro machines are essentially microcontroller-architecture devices with general-purpose computer peripherals, for pretty much the reasons you mention. But there are many cheap microcontrollers available today, such as the Raspberry Pi Pico series.
And when you factor in FPGAs, you can get down to the microsecond or less. Low latency is possible, it is just that priorities are often elsewhere.
We like being able to plug everything anywhere. And I admit it is damn cool being able to connect a display, most kinds of storage devices, keyboard and mouse, all while charging my laptop on a single port, at any time. I may even be able to disconnect my laptop and put my phone instead and it will do something sensible. If you did that back in the day, there was a good chance for one of the devices to turn into a smoke machine.
> If you did that back in the day, there was a good chance for one of the devices to turn into a smoke machine.
Back in the day, you would not have been able to do any of this with one port. Each type of device had its own uniquely shaped connector/pin combo. You were not going to connect your SCSI devices to the VGA monitor port accidentally. The closest I ever saw was someone attempting to plug a Mac ADB cable into the S-Video port, but that just resulted in bent pins. It just so happened those pins were on an Avid Film Composer dongle instead of a replaceable cable.
I think modern general-purpose CPUs are perfectly capable of low latency and low jitter. The problem isn't the CPU; the problem is the stuff around the CPU (mostly the operating system). The less deterministic aspects of modern CPUs (branch prediction, speculative execution, caches, etc.) happen at timescales much smaller than what you usually care about (and possibly smaller than the jitter specs of microcontrollers).
An RP2350 with PSRAM and a microSD card could probably do a commendable job of pretending to be an entire Atari ST while providing a boatload of extra low-latency goodies at the same time.
I refer to the RP2xxx chips as "headless Amigas" because their PIO modules essentially function like Coppers: they are simple state machines that offload I/O functionality from the CPU.
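For a concrete flavour of that offloading, here is a minimal sketch (assuming MicroPython on an RP2040/RP2350 board with the standard rp2 module; the pin number is an arbitrary choice): a tiny PIO program keeps toggling a GPIO entirely on its own after setup, leaving the CPU free, loosely the way the Copper walked its display list without bothering the 68000.

    import rp2
    from machine import Pin

    @rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
    def toggle():
        # Each instruction takes one state-machine cycle plus the [n] delay cycles.
        set(pins, 1) [31]   # drive the pin high, then idle 31 cycles
        set(pins, 0) [31]   # drive the pin low, then idle 31 cycles

    # Run the program on state machine 0 at a 2 kHz state-machine clock, driving GPIO 25.
    sm = rp2.StateMachine(0, toggle, freq=2000, set_base=Pin(25))
    sm.active(1)            # from here on the PIO toggles the pin with zero CPU involvement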
I think there's a very strong future in emulation: achieving FPGA-like latency by using a Raspberry Pi Pico/Pico 2 to emulate each of the target machine's subsystems/chips.
Have you seen https://github.com/floooh/chips ? A bunch of PIO-linked chips using these interfaces would feel like a weird blend of software and hardware that stands apart from the FPGA world. I have wondered if it would actually work as a larger-scale paradigm. Imagine a single piece of silicon with a bunch of RP2xxx-level processor/RAM blocks with PIO links between them all. I'm not sure how it would come out compared to FPGAs on the balance of flexibility/cost/power consumption/etc., but I suspect it could find a niche.
antirez mentioned running some of these on RP2040s.
The 68000 is "essentially microcontroller architecture"? I don't think there are many people who understand architecture that would agree with that statement.
> The Atari 2600 for instance was known for "racing the beam", updating the image while it was being drawn on the CRT monitor. A latency measured in pixels rather than frames!
Oh wow! I remember hearing that Oculus was doing this on their devices and thinking it was new.
FWIW you can get excellent latency on a modern device but only if you run everything in real mode and forgo complicated buses like USB that are effectively network link layers.
This is true, but in my opinion also misleading. Speed and latency are fundamentally different. Speed would be a Performance Feature in the Kano model, meaning there is usually a linear relationship between speed and user satisfaction.
Latency would be a Basic Feature.
Once you get below 7 ms (or 5 ms, or even 3 ms if you absolutely insist) you're happy; above that, everything is absolutely unusable.
You are leaving out the jitter. This is often the worst part of modern implementations. If there is 4 ms of jitter, sometimes peaking at 20 ms, then a 5 ms latency is still bad. Such an implementation is basically unusable. Like many modern USB ones.
The Atari has absolutely stable and extremely low jitter. Some guy measured it at 1 µs. Can't find the link though, sorry.
So the Atari has low latency, around 2-4 ms, with extremely low jitter. This is exactly what you want from a MIDI clock and sequencer driving multiple MIDI devices.
How do you think any professional works nowadays with MIDI? A good, modern USB interface (from Focusrite or similar) has a jitter well below 1ms, usually in the range of 200µs. If that is too much, simply sync your DAW with an external, dedicated clock, which will usually give you a jitter in the single µs range.
I have a Focusrite and the MIDI timing is terrible. Sure, there is more to it than just the interface. With USB you just cannot guarantee stable MIDI timing, because there is no good MIDI buffering implementation for it. Technically it would be possible, but no one cares. Professionals use something like a MIDI-to-audio converter via a VSTi plugin: it takes the MIDI signals and modulates them onto an audio signal (which can easily be buffered), and some dedicated outboard gear converts that back to MIDI. If you are working with hardware synths etc., this is the only option you have nowadays with non-vintage hardware. A lot of producers don't work with MIDI anyway; they use plugins, which is why this is something of a niche problem and there isn't much talk about it.
First off, I'm assuming of course we are talking Mac here, because Windows is unusable for MIDI. If you have terrible MIDI timing with a Mac, then yes indeed, you'll need to sync via audio, but there are nice and inexpensive solutions for this, for instance the Midronome.
Look, I'm not trying to convince you to get rid of your Ataris, quite the contrary. I'm just disagreeing that it's impossible to have low jitter nowadays, but I fully agree that things used to be simpler before everything was done via USB.
Agreed. It is of course not impossible, but it is almost impossible out of the box (literally ;-)). I have a USAMO (Universal Sample-Accurate MIDI Output) device, but I don't use it because, as I said, the Atari is king here. :-) I'm not sure how the Midronome can solve the problem of MIDI notes coming out of a modern DAW inaccurately? But maybe I don't understand it completely. I need to have a deeper look. For some years now I have been using Linux with a Focusrite for mastering and audio tracking. MIDI has been bad on Linux and Windows ever since I got my first USB interface and moved away from PCI interfaces. But this shouldn't matter too much. :-)
Note that this is an old version; I just saw that there's now the "Nome II", and at least for Mac he has actually developed a USB protocol to provide a stable clock (which, as you've already written, is totally possible via USB, it's just that nobody cared enough):
Thanks a lot!
The Scotsman is cool, and so is his t-shirt. :-D The t-shirt is German for "little pig".
Regarding "midi notes" Sim'n Tonic himself is saying this to the Midronome:
"Note that only these MIDI messages are simply forwarded when they are received, their timing is not changed. So if your DAW sends them with a lot of latency and/or jitter, the Midronome will forward them with the same latency/jitter. Actually this is a problem I plan on tackling as well [...]"
So the Midronome does not solve the problem of inaccurate MIDI notes coming from a modern DAW. The USAMO does, by the way, but only with one MIDI channel at a time. And of course, coming back to the actual topic, the Atari has no problem at all with accurate MIDI notes; it is absolutely tight on all 16 channels. So it seems there is indeed nothing comparable to the Atari nowadays. Maybe there will be in the future.
Not sure if that is still accurate. This might only be available for Mac, but the FAQ for the Nome II says this:
Can Nome II send MIDI Notes?
Nome II is like a MIDI hub, you can ask it to forward any MIDI sent over USB to one of its MIDI outputs. It will not only forward these instantly but merge them smartly with the MIDI Clock, without affecting it.
The Windows MIDI/USB stack adds a considerable amount of jitter to the MIDI clock, compared to the much superior one in macOS. I will fully admit that "unusable" is a personal opinion based on my experience. Of course performers also use Windows, but I heavily doubt you are able to see which device in their rack acts as the master clock and how they sync their devices, apart from the fact that most performers nowadays don't use MIDI at all.
MIDI is used heavily for guitar patch and lighting automation, as well as for triggering backing tracks in a DAW running on stage. The use of MIDI (over USB) has only increased on stages.
This is getting ridiculous; we are talking about making music, i.e. triggering notes from different devices in sync. You know, what MIDI was originally designed for, not triggering some lights, guitar patches or a backing track. You are exactly proving my point: MIDI nowadays is pretty much reduced to SysEx for doing simple automation. None of that is seriously affected by jitter in the ms range. You sound like you have no idea how electronic music was made before VSTs were a thing.
At any given time only one message can be sent down the wire. [1]
So on the beat, an implementation can send either the clock pulse or note on or something else. [2]
If you send the clock everything else has to wait. If you send something else, the clock has to wait.
Now with modern computers you are also dealing with USB, which is a polled, shared bus running as a low-priority protocol that has to coordinate with everything else a modern kernel does.
Music is hard.
[1] Premium hardware sequencers sometimes have two or more MIDI Outs to reduce contention.
[2] MIDI Time Code solves this by encoding monotonic time into MIDI and is how serious sync is done over MIDI, e.g. in Hollywood.
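To put rough numbers on that contention, here is a back-of-the-envelope sketch (assuming standard DIN MIDI at 31250 baud, 8N1, i.e. 10 bit-times per byte):

    BAUD = 31250
    BITS_PER_BYTE = 10                           # start bit + 8 data bits + stop bit

    byte_us = 1_000_000 * BITS_PER_BYTE / BAUD   # 320 us per byte on the wire
    clock_us = 1 * byte_us                       # MIDI Clock (0xF8) is a single byte
    note_on_us = 3 * byte_us                     # Note On = status + note + velocity

    print(f"one byte : {byte_us:.0f} us")        # 320 us
    print(f"clock    : {clock_us:.0f} us")       # 320 us
    print(f"note on  : {note_on_us:.0f} us")     # 960 us

So if a clock pulse and a 3-byte Note On land on the same instant, whichever goes second is pushed back by the other's transmission time - roughly 0.3 to 1 ms of jitter from the wire alone, before the OS or USB scheduling adds anything on top.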
I really wish you could easily get something like the MiSTer ST clones... supply seems spotty, and the price seems pretty high. I'd love an original too, if they were less marked up...
I have the equivalent of a $500,000+ studio from my childhood, all in my laptop.
You are concerned about a 31250 baud protocol.
There is zero 'shame' on the 'present future' when it comes to music production tools. It is one of the biggest bright spots and greatest equalizers. The best thing I did was go ITB. No headaches. No maintenance on obscure hardware. No MIDI limitations, no even having to think about my MIDI chains. Just music making.
There's not one word in his post where he looks down on VSTs or anything. It's just how he likes to make music, and he is unhappy with the state of modern MIDI implementations. In fact, it's the exact opposite: you are shaming him for still using MIDI.
Like C++, this cannot be parsed with a context-free grammar. The "present future" refers to
> No crashes whatsoever, and no distractions from updates or "phone home applications"
which is something I would guess most people would indeed see as shameful regarding our present future in software, but OTOH, this is HN, so who knows.
My old synths crashed and required physical maintenance. In addition, I lost songs that failed to read back from the crude tape backup that my quantizer/sequencer used.
I am much happier with my current setup, which I could never have afforded outside the present future, than with my much lesser previous setup. This being HN, I'm sure there are people who can afford to spend much more than me on gear, so they might prioritize the 'minor differences' you list over having no access at all, but I much prefer having access, at price points I can actually afford.
Some musicians still like to play instruments -- for them and their listeners, ITB production is seen as a cheat and not real musicianship -- and for them the lack of a stable MIDI clock on today's hardware absolutely does matter. A trained musician can feel a time difference as small as 1 ms. Any latency or jitter greater than that and a perfect track could be ruined.
As an aside, all-digital workflows take the joy out of music being made in the moment, by ear and by feel. There is no replacement, for example, for a professional sound engineer adjusting a mix strictly by the sound in their headphones and the feel of the sliders under their fingers.
I have a Novation SL MkII as my controller keyboard. It is a much more tactile experience than say a DX7 or other menu diving synth. It has faders built in for mixing. As someone who has done both I have so much more joy being all digital. I have access to so much that I never did before.
You always want latency and jitter as low as possible. Latency adds up along the chain: instrument/sequencer (2 ms) -> some digital effect (3 ms) -> digital mixer (3 ms) -> in-ear monitor bridging an air gap with processing (6 ms).
Suddenly you have 14 ms of latency. Bad, but that's reality. So every ms less is better.
Jitter is the worst of it, because the brain cannot adapt to the changes, whereas constant latency can be compensated for by the brain to some degree.
3 button joypads from the start, using U+D and L+R combinations for 2 more buttons
Double-sided drive from the start.
Finally, they should have included the blitter socket and a full 2x32pin expansion instead of the cartridge port.
The blitter socket especially would have been handy to drive a T212 transputer in '87, when the blitter became available, instead of producing the ATW.
The tiny ST with the external disk drive had the joystick ports at the side - a far superior design.
I quite liked the STe. The mono monitor was great, RAM upgrades were easy, and they'd improved some of the display hardware's worst limitations. Even though TOS was never especially good, they'd fixed all the worst bits by that point.
Still could have benefited from some other extra hardware and OS tweaks though I think.
- 800 KB disk format supported directly by the OS
- blitter is not as useful as it could be, due to sharing bus time with the CPU. It should be able to use ACSI bandwidth if not in use/Shifter bandwidth during non-display periods, so it can run in parallel with the CPU
- 256 px 5 bitplane mode (so still 40 words per line), probably an EHB kind of affair if 32 palette entries would be too much
- something to improve endless scrolling? No carry out of bit 15 when computing Shifter address? You'd end up wrapping before the display period was finished if increasing the display stride, but you can work around that in software...
- put the 9 pin joystick ports on the side
- write signal for that stupid cartridge port that is almost (but not quite) useful for general purpose expansion
The Atari ST had a MIDI port, something notoriously lacking on the Commodore Amiga (I think an Amiga with a stock MIDI port would have been a home run).
I saw Atari ST in music studios well into the late 90s/early 2000s because back then quiet beige PCs weren't a thing yet: PCs virtually all came with super noisy fans, which was a big no-no for music studios.
A buddy would bring his Korg synth to my neighbour's house and hook it to their Atari ST. Another dude I remember would play the drums of Dire Straits songs from his Atari ST hooked to some MIDI gear, then pick up his guitar and play along.
These were the days.
I'm not surprised some musicians still use them. If I'm not mistaken, Kavinsky (who became famous after the movie Drive came out but recently saw renewed interest because he performed at the Olympic Games' ceremony) started making music at a late age, on an Atari ST a friend of his gave him.
As an anecdote, PCs were so noisy that I asked my neighbour (an electrical engineer) if it was possible to come up with a system where the fan would slow down when the CPU wasn't too hot: and sure enough, we were soon modding our PSUs with thermistors and calibrating our tiny hack, no shit, with boiling water in the kitchen (ah, clueless teenagers). Funnily enough, about 10 years later every single PSU had variable fan speed.
That's the thing: we were used to quiet 8-bit and then 16-bit computers, and when we had to move to these piece-of-shit PCs (but with fast CPUs/FPUs, and upgradeable), we had to endure these painful, ultra-noisy CPU/PSU fans (and HDDs).
So the Atari ST just made sense. You could have these super fast (compared to the Atari ST) PCs, but they were noisy, fugly, unbearable pieces of shit that the cool guys in music studios simply wouldn't tolerate back then.
Now of course at some point PCs became just too good and several brands started focusing on quietness and it was then possible to have a totally silent PC, both looking cool and being literally cool (big heatsink, quiet fans, etc.).
But yeah the Atari ST was and still is certainly for some a thing for creating music.
Lots of respect to the Atari ST for its MIDI port (and that comes from a Commodore Amiga owner and fan).
And, just to add a third point, the Atari runs stable! I just tried to sequence with a SoundBlaster AWE32 and the Voyetra MIDI Orchestra sequencer under Windows 95b. For fun. I had already recorded some MIDI tracks when suddenly, after 60 minutes, Windows presented me with the famous bluescreen. Everything I'd just recorded and hadn't autosaved was lost. Haha.
Absolutely, when I first tried Windows Cubase in the 90s, it wasn't long before it ate all my data. Even today's DAWs still haven't caught up with the original ST Cubase in terms of stability.
> The Atari ST had a MIDI port: that notoriously lacked on the Commodore Amiga
I never really understood why people thought this was a big deal. I had my Amiga hooked to a DX7 synth with a serial-to-MIDI cable that had a couple of active parts in it. MIDI is a serial protocol, and the Amigas had full RS232 ports with hardware interrupts, +12V, -12V, as well as audio in and out on unused pins. The serial-to-MIDI In/Out cable cost around $15 more than two MIDI cables. You can still buy them today: https://retroready.one/products/ka12-serial-port-midi-interf....
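A rough sketch of the arithmetic behind such cables (assuming the usual description of the Amiga UART: a ~3.55/3.58 MHz reference clock divided by an integer divisor; exact register usage is beyond the scope of this sketch):

    MIDI_BAUD = 31250
    CLOCKS = {"PAL": 3_546_895, "NTSC": 3_579_545}   # commonly quoted reference clocks in Hz

    for system, clk in CLOCKS.items():
        divisor = round(clk / MIDI_BAUD)             # nearest integer divisor
        actual = clk / divisor                       # baud rate actually produced
        error = 100 * (actual - MIDI_BAUD) / MIDI_BAUD
        print(f"{system}: divide by {divisor} -> {actual:.0f} baud ({error:+.2f}%)")

Both come out well under 1% off the nominal 31250 baud, which is why a simple level-shifting cable on the stock serial port was enough to talk MIDI.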
The big deal was latency and jitter: the Amiga could not match the Atari there via its serial interface. I remember such discussions back in the day. Also, the multitasking capabilities of AmigaOS were too good! Yes, that got in the way of precise timing. That is why Steinberg and other sequencer software producers bet on the built-in, rock-solid Atari MIDI implementation. Which then also meant a lack of good software for the Amiga, on top of everything.
While it's true the Amiga never got the breadth of serious music sequencing software that was on the ST, I never experienced any issues with MIDI jitter or timing.
While the Amiga could do multi-tasking, applications could also optionally take over the entire machine, which many games obviously did but so did real-time video applications like the Video Toaster, animation, audio and music apps. Lots of Amiga games and video apps "raced the beam" in real-time without ever dropping a field of 60 fps video. So, at least in principle, the Amiga hardware was as capable as the Atari ST in terms of ability to respond to interrupts at MIDI rates. The Amiga also had video editing software, which back then, involved controlling three professional VTRs in real-time over that single serial port to do A/B roll editing on precise timecodes 29.97 times a second.
So, yeah, I agree that the Atari totally won the MIDI production music market because it had much better software applications. If you were primarily a music producer or studio, there was certainly no reason to pay more for an Amiga's custom graphics capabilities - and if you were serious, the Amiga's more limited music app selection made the choice for you. My only quibble is that, IMHO, the claims of 'MIDI jitter' somehow being endemic to the Amiga were more Atari marketing FUD than reality. I don't doubt that some user at some point did something on an Amiga that caused MIDI timing issues, but it wasn't because of some fundamental hardware limit. It was due to configuration, some app incompatibility, bug or some other solvable issue - because similar timing issues could occasionally happen on the ST too - and then would be solved.
The other story with the ST in music studios, and for productivity software generally, was the excellent monochrome 640x400 monitor, which was far better than dealing with interlace on the Amiga. And it was cheaper than an Amiga colour monitor.
There were ways to do non-interlaced video on the Amiga, but just like having to buy external MIDI adapter ... more $$, more hassle.
That and floppy format compatibility with MS-DOS made it easier to move data around.
While that was an advantage for the Atari ST, the Amiga standard desktop was in a non-interlaced 640 x 200 4-color mode that could do 80 x 24 text for productivity apps (word processing, spreadsheets) as well as decent graphical interfaces for creative tools like video editing, 3D rendering and music sequencing. Of course, having 200 more vertical lines would enable some increased detail assuming the app GUI supported it but it wasn't a huge difference (and on the Amiga the colors could also be useful for GUIs).
All the Atari owners I knew opted for the color monitor (which had the same limit of 640 x 200 res and 4 colors (from a smaller palette than the Amiga)) but I didn't know anyone solely dedicated to music production. Given the vast majority of ST systems had color monitors, I assume the Atari's music software interfaces were designed to work with 640 x 200 as well as 640 x 400. Not having used those tools on the Atari I don't know if the publishers went to the trouble of creating separate GUI screens that significantly utilized the extra 200 vertical lines for 640 x 400 monochrome monitor owners to fit more information into fewer different screens or if the increased vertical res just made the text edges look nicer.
On the Amiga virtually no software ever ran GUIs in interlaced mode because high-contrast single pixel horizontal lines would flicker. The Amiga interlaced modes were used for displaying organic content like photos and video, which didn't have any elements that flickered (or, at least, not any more than organic imagery on a high-quality TV). Some software used the Amiga's hardware support for switching resolution modes mid-screen to show interlaced 640 x 400 content on the top with 640 x 200 GUI on the bottom. With the Amiga 3000 and 4000 Commodore added support for 31.5 Khz non-interlaced 400 line monitors. Commodore did release the A2024, a non-interlaced super high-res monochrome monitor for all Amigas in 1988 in 15-inch and 19-inch sizes. It displayed 1024 x 800 in 8 grayscale tones, however it wasn't very popular and little software other than the Amiga OS, desktop publishing and word processing ever redesigned GUIs specifically to exploit the A2024's increased resolution.
My two points being: 1. I agree having the option of the monochrome non-interlaced monitor was a definite advantage for the Atari ST vs the Amiga but it wasn't an advantage for most Atari ST users because most chose the Atari color monitor (except for some dedicated music or DTP installations). 2. Interlaced mode on the Amiga wasn't a disadvantage vs the Atari, it was a feature imaging, animation and video software could optionally use to display organic image content. I bring this up because I once saw an Atari ST fan put the Amiga into interlace mode and show the desktop windowing GUI (which flickered) as an "Aha, gotcha!" example, which was silly because interlace mode was never meant to display interfaces nor did Amiga users use it that way (in fact, the Amiga desktop GUI looked vertically squished in interlace modes until the OS was redesigned to support higher vertical resolutions with the release of the A2024 monitor, A3000 & A4000).
Anyway, my goal isn't to rehash an ancient 'platform war' no one cares about anymore because I don't even care about it anymore. I currently still own three different models of Amiga AND three different models of Atari ST and I love them all. There were a few scenarios where the ST had real advantages over the Amiga but they were limited to a smaller subset of users primarily in music and DTP. The overwhelming advantage of the ST over the Amiga was price. And dollar for dollar, the ST was a remarkably capable system at a tremendous value. However, the Amiga simply had better hardware (at a much higher price until the A500). The Atari was designed over a period of 5 months out of off-the-shelf components to be a low-cost leader, while the Amiga was designed over three years with highly sophisticated custom chips and the A1000 sold (initially) at almost twice the price. Of course the Amiga was better at more of the things most of the target users cared about. Arguably, the most favorable Atari ST + monochrome monitor comparison in 1985 is with the Mac for dedicated desktop publishing use cases. It was 1/3 the price ($800 vs $2500), had more memory, higher resolution and a large monitor. There was no DTP-relevant spec the Mac was better at although, arguably, the ST's keyboard was a notable weakness in all use cases.
That reminds me of using Twister, which optimized the position of sectors on the floppy disk to minimize seek times and speed up loading dramatically (and I think they squeezed a few more sectors onto the disk so that it could hold more -- maybe more sectors on the outer rings of the disk?).
Normal 720 KB floppies were very generous with the sector layout on a track. It was easy to format a floppy with 10 sectors per track even without reducing the gap between sectors. On the Atari it was almost standard practice, which is why floppies generally had 800 KB capacity. It was even possible to squeeze in 11 sectors per track by reducing the inter-sector gap to a minimum. Furthermore, most floppies allowed writing on tracks 81 and 82 (sometimes even 83), so it was possible to have floppies with up to 902 KB capacity (not a good idea in the long run: I recently tested such a floppy I had made 30 years ago and it had a lot of read errors, something the 720 KB and 800 KB ones do not).
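A quick sketch of where those capacities come from, assuming 512-byte sectors and double-sided disks:

    SECTOR_BYTES = 512
    SIDES = 2

    def capacity_kb(tracks, sectors_per_track):
        return tracks * SIDES * sectors_per_track * SECTOR_BYTES // 1024

    print(capacity_kb(80, 9))    # 720 KB - the standard format
    print(capacity_kb(80, 10))   # 800 KB - 10 sectors per track
    print(capacity_kb(82, 11))   # 902 KB - 11 sectors on 82 tracks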
Yeah, thanks for the good write-up, maybe you're right and there's also some Atari marketing FUD. The Amiga was/is definitely an impressive machine too. That is given. :-)
To be fair, everything about PCs back then sucked.
DOS was crap when you had GEM and AmigaOS.
Windows 1 and 2 were beyond terrible.
They were shit for games
They were bulky
They were slow
They crashed all the time.
They were ugly
They were noisy.
They were hard to manage (autoexec.bat, no OS in ROM, stupidly tricky partitioning tools, incompatible drivers for the same hardware but in different applications, etc)
But IBM lost control of the hardware market so they became cheap, ubiquitous crap.
And that’s literally the only reason we got stuck with PCs.
I hope someone will write an entertaining satire about the sometimes almost PTSD-like bitterness and bizarre selective perceptions of the "anti-PC" crowd, especially in the Amiga space. :D
Please don't take it too harshly, but your list of grievances is almost radically different from my experience of personal computing in the late Eighties to mid-Nineties... to me it's somewhat of a curiosity all on its own. In my little corner of the world the Amiga was virtually nonexistent [1], largely undesirable, and prophesied to be a corpse as early as 1991.
I'll give you one thing, though: a mostly non-plastic, compact keyboard-computer case (Amiga 600-like) for a suitably powerful IBM compatible would've been awfully nice. Still would be, for a vintage bird that is. We only got "Schneiderplastik" ("Schneider plastic"), the Euro-PC to be more precise [2], and that one wasn't updated in a satisfying fashion.
1. The only people I knew that had Commodores were two of my best friends, one with a Commodore 64, the other with a 128. The demosceners I grew up with were Atari ST guys, all of them (becoming) musicians.
Sure, there is clearly some "rose-colored memory" effect going on - how could there not be? As someone who used Amigas, Atari STs, Macs and PCs back in the day - and who still owns over a hundred unique models of vintage 80s and 90s computers, they ALL suck in many annoying ways AND most had some unique strengths. We all learned to live with what we had and how to make it do what we needed done.
People got accustomed to whatever personal computer they used every day, and many grew fond of it. After all, the power and capability of a desktop computer in the 80s was unprecedented and, for many, revelatory. That said, in the mid-to-late 80s, the PC platform was generally under-powered dollar for dollar compared to most of its leading competitors, most of which were based on Motorola 680x0 CPUs. The strength of the PC during this time was its rapidly expanding library of business software and its hardware expansion options in the form of add-in cards (something Apple had with the Apple II but abandoned for a while with the Mac, which the Atari ST never really had, and which only the "professional" series Amigas had (A2000, A3000, A4000)).
Being underpowered per dollar doesn't mean the PC couldn't be extremely useful or the best platform for a given scenario and it certainly doesn't mean there weren't hobbyists who used and loved their late 80s PCs as dearly as any other 80s personal computer owner. Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac.
I don't think we are negotiating the same phenomenon. You seem to describe the harmless, almost romantic indulgences of nostalgics. I'm talking about the bizarre, often toxic distortions of a certain breed of user who still fights, after all these years, their Platform War. The sort of fan who blames "the PC" for the "ills of the industry".
Anyway, off to some specifics:
> "The strength of the PC during this time was it's rapidly expanding library of business software applications and hardware expansion options in the form of add-in cards [...]".
A standardized general-purpose computing platform "for the future". Exactly what spoke to me, as disseminated in the publications I read as a kid in 1991.
> "Of course, this power balance was largely addressed by the mid-90s - which is why the PC juggernaut then extinguished the Amiga, Atari and (very nearly) the Mac."
"Power balance"? I didn't think in such abstracts when I made my choice, and conducted the long and merciless attrition-lobbying campaign for financial support, to buy a PC. The Amigas and the Ataris were simply not a factor for a variety of different, but very tangible and practical reasons:
Atari ST? I was not (on my way to become) a musician with the need for a precise and affordable backpack-portable computer instrument.
Amigas? The big birds were seen, outside of some specialist-niches, as uneconomical compared to their IBM-compatible brethren.
The vanilla home computers were seen as affordable, but extremely limited, racking-up (hidden) secondary costs to make them more usable. Often enough they carried a certain cultural stigma as well, being perceived by our financiers as gaming toys and therefore time wasters. And most importantly? No one I personally knew had an Amiga. Who to swap software with, where to find a mentor? Yeah...
The Atari guys I befriended used their machines almost exclusively for dabbling in electronic music, later as part of the emerging East German EBM and hard techno scene.
Games? The titles I was interested in either didn't exist on M68k platforms (flight simulations à la Aces of the Pacific, wargames such as the Harpoon series, or adventures like The Lost Files of Sherlock Holmes), were practically unplayable (e. g. Red Baron), considered inferior (e. g. Wing Commander)... or just came out too late.
By listening to stories of some Britons of my age, it only recently came to my attention how privileged I have actually been. Some of these people told me stories of buying their first A600s or A1200s only in 1993! At that time it was hard to separate me from my trusty, second-hand PC... a machine with a CPU-type nearing its eighth (!) birthday (386DX-25).
You’re talking about the 90s though. That's actually several generations of PC later.
PCs in the 80s were so bad that most homes still ran 8-bit micros with a BASIC ROM.
Windows 3.0 wasn’t even released until 1990.
And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout. It was only really the uptake of DirectX that fixed that (and to an extent, Rage / OpenGL, but you could technically get DOS drivers for them too).
But that was a whole generation of computers away from the 3.x era of PCs, and another generation again from the 80s.
But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself. It was so bad that the best-practice advice was to reformat and reinstall Windows every 6 months (I did it much less regularly than that, though). And this was a common idiom throughout the entire life of the 9x era of Windows too. But to be fair, that was also a common idiom with pre-OSX Macs; Apple had become painfully shit at that point too.
If the ST and Amiga had still been evolving like PCs were, then by the late 90s I'm sure Amigas might have suffered from the same longevity problems too. But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) stood head and shoulders above what PCs could do at that point in almost every regard.
> You’re talking about the 90s though. That's actually several generations of PC later.
I'm East German; you people got a head start. My relevant hands-on experience is centered around a 386DX system, technology introduced in the mid-Eighties. 1987 brought VGA, AdLib, and the MT-32 to the table, with games support gearing up in '89, the year the Sound Blaster was released. Fall 1990 saw the release of Wing Commander. Of course that's just technology; economic realities tell a different story.
> Windows 3.0 wasn’t even released until 1990.
Windows was as relevant to me as an Amiga. GUIs didn't do much for me until much later. Still prefer CLIs, TUIs (and minimal GUIs that come as close to the latter as possible).
> And let’s be honest, the problems with autoexec.bat, config.sys and different incompatible DOS drivers for games existed throughout.
I never experienced serious troubles in DOS. The first two, and only, games I could not get to work were two infamously bug-ridden Windows titles of the late 90s: Falcon 4.0 and Privateer: The Darkening. By the time they fixed 'em with a litany of patches I was busy with other things.
> But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
News to me. How bizarre!
> But in the 80s, they (and STs, Macintoshes, and Acorn Archimedes) were stood head and shoulders above what PCs could do at that point in almost every regard.
Hardware? Until '87. Games? Until late '90, I'd say, at the earliest, accounting for a strong genre bias. [1] Then, outside of niches (Video Toaster, cheap DTP, music production) and certain "creature comforts", it was over; the ecosystem began to atrophy.
1. The first two DOS-platformers that wowed me visually were Prince of Persia 2 ('93) and Aladdin ('93/'94); all my other genre preferences were, to put it diplomatically, underserved on 16-bit home computers.
That doesn’t mean PCs were somehow more capable in the 80s though ;)
> Windows was as relevant to me as an Amiga.
Same point as above.
> I never experienced serious troubles in DOS.
I think that’s probably rose-tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable about your system to get stuff working.
To this day, I’ve never heard the music in Transport Tycoon because that game refused to work with whatever midi drivers I threw at it.
> > But by the mid-90s, we had a whole new set of problems with PCs: the OS silently corrupting itself.
> News to me. How bizarre!
I’d be amazed if you’ve never once heard about the old problem of computers getting slower or buggier over time and a reinstall fixing things.
> Hardware? Until '87.
You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
It wasn’t until the era of ray casting 2.5D 1st person shooter that PCs started looking better than their counterparts. And then things really accelerated (no pun intended) with 3D hardware graphics acceleration. Which, to be fair, was available for Amigas too, but the only software that targeted them were 3D rendering farms.
> That doesn’t mean PCs were somehow more capable in the 80s though ;)
It clarifies specifics relating to my personal experience of the matter under discussion, addressing the (perceived) realities of a local market. How people use computers is of the utmost relevance; a fact which you, given your lamentations here, certainly must have internalized.
> I think that’s probably rose tinted glasses on your part then. The pain was real. Games seldom “just worked” and you often had to be really knowledgeable in your system to get stuff working.
No rose-tinted glasses here. And I believe you that your and others' pain was real. Many people could not get their heads around a PC; many of them fell for cheap SX clunkers with other substandard components, ffs. That's obviously an inherent problem of such an open platform: PCs are highly individual in many subtle ways; a trade-off one had, and still has, to negotiate in one fashion or another.
> You’re comparing theoretical top of the line PC hardware (which nobody actually owned and no one had written software for yet) with commodity Amigas and STs.
I'm comparing hardware available on the market (with key system components coming together in 1987/88, and games supporting such top-of-the-line hardware showing up in numbers from '88 onwards). I also spoke to economic realities in nearly every post in this discussion; I am well aware that the 16-bit home birds had a technical lead for a short while, and were an even better value proposition for many people a while longer. For some, just as validly, this still holds true.
> And even those top of the line PCs still missed a few tricks that made some genres of games better on Amigas and Atari STs, like fast blitters.
Yes, already addressed by referring to Prince of Persia 2 and Aladdin (1993/94!).
> It wasn’t until the era of ray casting 2.5D 1st person shooter that PCs started looking better than their counterparts.
So your stylistic (genre) preference maps it to the period between 1991 (with Hovertank 3D in April and Catacomb 3-D in November) and Wolfenstein 3D (May 1992). Okay.
With mine it begins earlier, largely because of proper 3D-titles: Deathtrack (1989, PC-exclusive), LHX: Attack Chopper (1990, no Amiga/Atari port), and Red Baron (1990, got the Amiga slideshow in 1992), as well as the odd non-3D action title here and there, e. g. Silpheed (1989, no Amiga/Atari port).
One can probably go even back to 1988, for at least parity in certain markets and their segments, if one compares the technological edge in an intellectually honest fashion, i. e. what the platform, hardware and software, was really technically capable of.
And productivity software, part of the deal, is of course its very own world.
I’m not talking about personal preference. I’m talking about the wider industry.
As I said before, I had a PC back then. I used to write software for them. I know how the hardware and software compared with other systems out there at the same time.
If you were in East Germany at the time, then you wouldn’t have had an accurate view of what was happening in the industry. You would have had your own brands of things because it wasn’t as easy (or even possible) to import western products. And by the time the wall fell and the borders had opened up, PCs had reached parity with their contemporaries. So of course Atari STs and Amigas weren’t common items and PCs seemed like better devices from your perspective. But surely you have to also understand that your experiences aren’t a typical snapshot of the computer industry in the 80s. In fact they’re about as atypical as it gets.
You’d have been better off saying something like “things were a lot different in Soviet Germany”, and we could have had a more interesting and productive insight into what life was like for you. Instead, you've been talking about your own experiences as if they were facts about how those products compared to each other (which is where you're wrong) rather than about which devices made it to your borders (which you were correct on). You do understand how those are different arguments?
But in the late 1980s, oh my. An Amiga 500 in 1987 was really a lot better than a PC of the time for many things. It was also a lot cheaper. Maybe half the price. The Amiga and the Atari ST didn't improve enough by 1991. By then a PC was better.
But by 1988 the PC was so far outselling everything else that the writing was on the wall.
People who had Amigas and Atari STs couldn't quite understand how their machines, that they perceived as so much better, were being outclassed by PCs running MS-DOS. On an Amiga 500 in 1987 you had a decent GUI. Until Windows 3 PCs didn't.
For example, Pro-Write on the Amiga had real-time spell checking and was WYSIWYG in the late 1980s. It wasn't until Word 6 in 1993 that Word was really much better.
The big advantage we had on the Atari and Amiga was that the 68000 could address more than 640K without breaking a sweat. PCs had this annoying limit up until the 90s, and the complexity it introduced was mind-blowing (EMM, EMS, XMS, etc.).
In '87, when I was a student at university, I managed to write all my software on the Mega ST2 and print my papers with Signum! on my 9-pin dot-matrix printer, at a quality my PC colleagues were absolutely jealous of. As said, the advantage was quickly lost, even if I could still use the TT I acquired in 1991 up until the mid-90s. But by then the PC was indeed already in another category (CD-ROM, SVGA, sound cards, Win95 and/or NT or OS/2, the beginnings of Linux, etc.). Our poor niche computers couldn't keep up against the sheer mass of the market.
> PCs had this annoying limit up until the 90s, and the complexity it introduced was mind-blowing (EMM, EMS, XMS, etc.).
Competent enough people on both ends, end-users and programmers alike, simply worked around that. In the end, it still allowed for a platform with industry-leading applications and games, many of them not available on Amigas or Ataris.
If you've only used a PC in the 90s, then it's easy to see the Atari and Amiga crowd as rose-tinted fanboys. But they're comparing 90s IBM PCs with 80s competitors.
Really, that says more about how IBM PCs were 10 years behind the competition than about how great IBM-compatibles were.
There were others too. At least the Olivetti Prodest PC1 [1], which I had, and the Sinclair PC200 [2], which a close friend had. Other friends had the EuroPC, some had the Amstrad 1512 and other different PC compatible boxes.
I remember my PC1 fondly. Well, I still have it. I learned to code in GW-BASIC, Turbo Pascal and C (in that order) with it. I was using it for a long time, until 1997, for serious work (coding and university assignments), when I finally had the money to upgrade to a Pentium PC.
As much as my world was PC-centric, the first time I saw an Atari ST and what it could do, my jaw dropped. I knew of the Amiga from magazines, but the first time I actually saw one was several years later, after I acquired my Pentium PC and I admit it wasn't that impressive then. But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
Thank you; interesting machines. A modern, compact, industrial-grade metal keyboard-case PC with an external PSU and some quality-of-life goodies, built around a Pentium MMX or Pentium II CPU, is really something I would fork over money for. Essentially something along the lines of the C64EVO's planned MODLR case. [1]
> But still I couldn't help but think: "wow, you had that in 1989?". There was no comparison with the PCs of its time.
This speaks more to local market realities, e.g. "demo software" running on hardware in an actual computer shop or, as in your example, at a friend's home, especially in the form of 2D arcade action games, then at their peak of popularity on 8- and 16-bit home computers... and yet to shine on PCs (as opposed to the glossies and the like, where the first 486 machines were stepping onto the stage around the turn of the year 89/90).
But at that time I wasn't thinking about computers that much, still digesting the beginning of the end of the GDR.
OP's grievances are spot on for the period 1985 to 1990. After that, PCs did indeed gain enough power (386s were mainstream and the 486 had just come out), and VGA started to become common. This means that your perception, built especially after 1990, is right, but it doesn't contradict OP's list, as Ataris and Amigas were indeed much, much more advanced and useful than XTs and 286s under MS-DOS/CGA.
I was predominantly a PC user. In fact I had a side hustle repairing PCs in the 80s and 90s. It wasn’t fun but it was tax free pocket money.
Every time I got to play on non-PC home computers I’d be blown away by how much better those machines were.
These days I collect retro hardware and the Atari STs and Amigas are still easier to maintain.
So my opinions aren't those of an Amiga fanboy. PCs in the 80s were really that shit.
I do think a lot of the problem was Microsoft though. I never liked any of Microsoft’s software even back then. And that was long before they gained the reputation that so many older timers like myself still remind people about. I actually wrote my own windowing system in Pascal because I was fed up with early Windows. It wasn’t a patch on GEM but back then I didn’t know you could run GEM on a PC.
Fair enough. I never had serious problems with my PCs... or Microsoft's OS offerings. And all that maintenance song and dance around these machines, before Plug & Play became reliable, came almost naturally to us. No surprise here, we were insatiably curious, able to read the manuals, and not afraid to ask.
We had affordable windowing and GUIs on the Atari and the Amiga, with instant boot from ROM and tons of RAM. The Amiga had the beginnings of multitasking and hardware acceleration for graphics and sound.
Then suddenly the industry decided to go back to a cut-down version of late 70s S-100 computing, with insanely unaffordable prices, crippled specs, bizarre semi-manual memory management, ugly hardware, and a command line interface that was basically CP/M but not as good.
This was a mistake. It slowed upgrades. The several seconds' worth of speed increase was irrelevant when upgrading an Amiga meant shipping out 2x 40-pin DIPs and floppy disks, as opposed to just floppies.
And due to incompatibilities it was common to install a ROM switcher in your system, especially in Amigaland when Kickstart 2.0 came out and you wanted to keep 1.3 so your games would still run. So you had to buy and install a switcher like a MultiStart, AND buy and install new ROMs, AND manage two sets of floppies. This led to a schism in the market where normal people were stuck with an A500 running 1.2/1.3 and its 1980s feature set, while power users who wanted space-age luxuries like IDE hard drives were running 2.0.
Practically every word uttered or printed by Commodore about compatibility between the two was an outright lie.
Microsoft had an obsessive focus on backwards compatibility so MS-DOS 5.0 was adopted by practically everybody: just insert the disk, switch to a:, type "install". Done. There were compatibility issues, but they were on a scale that was irrelevant compared to the Kickstart disaster.
Could you imagine in 1992 having a PC and having to install a PCB in the BIOS sockets so you could have both versions 3.02.111 and 4.84.1932 of your BIOS and keep DOS 3.3 and 5.0 boot disks on hand so that you could run Commander Keen and use the newest version of WordPerfect?
I did all of that on the non-PC side of the house, and many others did too. I had (and still have) an A2000 with hard card, flicker fixer, accelerator, ram expansion, rom switcher, and other upgrades. I spent thousands of hours tinkering and having fun with the system.
My cousin had an Amiga in the late 80s/early 90s. As a kid, I was so incredibly jealous. Everything was better: sound, graphics, the OS, the hardware itself. Even well into the mid-nineties we were messing with tire fires like VESA Local Bus on the PC side.
> a command line interface that was basically CP/M but not as good.
I hate to go to bat for MS-DOS, but it had at least one real advantage over CP/M: a single disk format. As doomed to failure as the various non-PC DOS machines (e.g. Tandy 2000 and DEC Rainbow) were, they could at least share disks.
I just remembered that the Rainbow only supported single sided disks. While a PC could format and use those, people normally used double sided disks. So, kind of technically compatible but not completely.
You could have a GUI on a PC as well. I developed GEM applications on an Olivetti M24 before I got my first Atari ST.
The Olivetti had a B&W 640x400 monitor and a Logitech mouse that plugged into the back of the keyboard. You could replace the 8086 CPU with an NEC V30 for a bit more speed.
Cheap and ubiquitous is what people want, and if PCs became successful because people other than IBM started producing the same hardware, then perhaps Commodore should have done that too. Software seems to have proven itself the better moat, and so maybe AmigaOS could have been the thing that would tie this hypothetical Amiga-compatible market together, keeping Commodore alive.
They'd have to have been a bit more careful about it than IBM were.
I am confident it would still feel like everything is terrible.
The thing that really drove the PC era was that the commodity desktop spec was rapidly gaining capability, compilers for it were getting good enough to depend on high-level languages, and the most affordable way to meet the market where it was, was not to build custom (as had been the case in the 80s with things like video editing systems, CGI, digital audio, etc.) but to use Microsoft as a go-between. All the heralded architectures of the 80s were doing things more custom, but it amounted to a few years of advantage before that convergence machinery came along and stripped away both customers and developers.
Apple did survive that era, though not unassisted, and the differentiation they landed on (particularly after Jobs came back) was to market a premium experience as an entry point. I think that is probably going to be the exit from today's slop.
In this era, spec is not a barrier - you can make <$100 integrated boards that are competent small computers, albeit light on I/O - and that means there's a lot more leeway to return to the kinds of specialty, task-specific boxes that had converged into the PC. There's demand for them, at least at a hobbyist level.
For example, instead of an ST and outboard synths for music, you could now get an open-source device like the Shorepine Tulip - an ESP32 touchscreen board set up with Micropython and some polished DSP code for synths and effects. It's not powerful enough to compete with a DAW for recording, but as an instrument for live use, it smashes the PC and its nefarious complexities.
You can't go up and down simultaneously on a joystick anyway! The stick can't point both ways at once. So this gives you the option of reusing those signal combinations for other kinds of input.
Just that you can't go up and down at the same time, so it's a valid electrical signal for another button, and importantly standard so games would take advantage of it.
Sure, but if U+D is button 2, you can't register up or down together with button 2. You need another wire, a serial protocol, or whatever Sega did for the 6-button Genesis/Mega Drive controller (I think a toggle?).
This would be my question as well. If I go up while pressing this button 2, the output signal doesn't tell you whether I'm going up, down or neither.
Obviously it'd be better to have a protocol like the Mega Drive's, but given the setup in the ST, this is a hack that doesn't require changing the ST hardware.
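A minimal sketch of the decoding idea (the bit layout is hypothetical, purely for illustration): since a real stick can never close Up and Down at once, a pad can pull both lines together to signal an extra button, at the cost of losing that axis while the button is held - which is exactly the limitation discussed below.

    # Hypothetical bit positions for the joystick port lines.
    UP, DOWN, LEFT, RIGHT, FIRE = 0x01, 0x02, 0x04, 0x08, 0x80

    def decode(raw):
        state = {"up": False, "down": False, "left": False, "right": False,
                 "fire": bool(raw & FIRE), "button2": False, "button3": False}
        if (raw & UP) and (raw & DOWN):      # impossible for a real stick -> button 2
            state["button2"] = True
        else:
            state["up"], state["down"] = bool(raw & UP), bool(raw & DOWN)
        if (raw & LEFT) and (raw & RIGHT):   # likewise -> button 3
            state["button3"] = True
        else:
            state["left"], state["right"] = bool(raw & LEFT), bool(raw & RIGHT)
        return state

    print(decode(UP | DOWN))   # button2 pressed, but the up/down axis reads neutral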
That works for Metal Slug for the grenade because a) you don't hold down the grenade button, and b) you don't have a lot of grenades, so you don't use them that often, so you won't really notice any weirdness - like the fact that if you duck and throw a grenade simultaneously, the duck doesn't happen until you let go of the grenade button.
If the up+down button was used for firing the main weapon, this wouldn't work, especially when you've got the machine gun and you can just hold the button down. You wouldn't be able to duck or aim up while shooting, or it would be unreliable.
If you were trying to use it for a three button fighting game, I think you'd have problems too. Especially if you are doing 'negative edge' techniques where you would hold a button while inputting the directional sequence and release the button to trigger the special.
I mean, the whole point of the ST was "rock-bottom price", and a lot of the things you're talking about would have raised the BOM significantly, or delayed its introduction by a precious few weeks or months.
Beating the Amiga to market, and beating it on price were super important.
But I do think there was a serious problem with follow-through. The blitter, GDOS and then the STe all took too long to arrive. The blitter never became standard, and games and software suffered for it. And updates to the operating system were slow and thin until it was way too late.
I do agree about the cartridge port thing -- it being limited to 128 KB of expansion was needless. Even one more pin would at least have allowed a proper OS upgrade via the cartridge port! Definitely one of the stupidest design decisions on the machine.
You have a point, but the bottleneck during the ST's development appeared to be the software, so there was possibly room for hardware tweaks. The BOM would have gone up slightly, but remember they would have saved on developing and shipping a separate disk drive, which would cover a lot of these changes.
Realistically it's amazing the ST was as good as it was, given the 6 month development time and the kings of penny pinching at the helm :)
I think they’d be better on the back, unless you're supposed to plug them in and out all the time.
> Finally, they should have included the blitter socket
That would be hard without having a functioning one first. The blitter would also be handy for a number of things, from PCM sound to network and disk transfers.
On the right (i.e. mirroring the STE's 15-pin ports) would mean they could keep the electrical layout, i.e. connected to the keyboard, and hence a separate keyboard unit. For 2-player games you would be swapping it a lot - there were adapters just to help with this.
I agree it would be difficult to design a correct socket, but from interviews it was always the plan to have a blitter, and a socket as standard would have helped adoption.
The main thing is that the T212 is a great coprocessor, faster than the 68881 FPU and with a 2 KB cache. Introducing the transputer as a coprocessor would potentially have changed the computing landscape.
I was astonished to find about 22 distinct C compilers, including their own libraries, assemblers, linkers etc. for the Atari ST and its successors. That's not counting separate versions, just distinct products from different vendors.
From what I can see now looking at archive sites, there was a huge amount of activity in developer tools on the ST back in the day. Much more than I thought at the time. It might have been a serious contender for the dominant architecture (along with the m68k CPU), if IBM PC-compatibles and x86 hadn't won.
Recently I looked for Atari ST C compilers, out of curiosity to test portability of a C program I'm working on.
I've been testing C code for diverse Unix systems.
As I used to own an Atari 520ST (with 1MB RAM beautifully piggy-backed and hand-soldered on the existing RAM chips :-), it seemed like a good idea to peek at C on an ST emulator. I didn't use C when I had a real Atari ST (no C books in my local library), so I expected to find one or two C compilers, not 22!
I think that I used the Megamax C compiler back in 1987-8. I was just messing around and experimenting, not programming professionally, but it worked well for me.
I had that. Later renamed Laser C. Being a poor college student, I used a Radio Shack floppy disk drive as a second 3.5" drive, hand-soldering the DIN connector the ST used. Afterwards I could have my compiler tools on one and source/object code on the other - a huge time saver.
Try and get a compiler and linker to fit in 360k these days!
Atari vs Amiga was such an interesting time in computing history.
When I see the generations that grew up with game consoles talk about the current uptake in desktop gaming, they really have no idea what they missed out on in home computing and the first wave of indie game development from bedroom coders.
Tangent: the older I get, the more it annoys me that this expression kind of implies a failure of young people to study history, when I feel like it's more the responsibility of previous generations to preserve and pass down history for them to learn from. Especially because it's usually people in power in some form who are trying to keep the newer generations naive here so they can be fooled again.
Not saying that this interpretation was your intent (in fact I suspect it's the opposite), just directly expressing my annoyance at the expression itself.
> when I feel like it's more the responsibility of previous generations to preserve and pass down history for them to learn from
But everything has been preserved and passed down. The entire home computing phenomenon has been archived and is available on the internet, thanks to the rampant 'software piracy' which was common at the time, and to the detailed schematics and manuals that came with the computers (which have all been digitized and are available on the internet). Even the obscure KC85 games I wrote as a teenager and 'distributed' on cassette tapes by snail mail are available for download, because some kind person(s) digitized all that stuff during the early 90s and put it on some 'underground' game download portals.
The 80s and early 90s home computer era will be better preserved than anything that came after it.
> The 80s and early 90s home computer era will be better preserved than anything that came after it.
Indeed. Sadly, many more recent games will probably be lost to time forever due to DRM, online service components going offline or never being distributed on physical media in the first place. As someone into vintage computers and preservation, I worry that future generations may look back and see the late 2010s and certainly the 2020s as a 'dark age' when surveying the history and evolution of digital gaming. All we'll have are YouTube videos (assuming those survive future business model tectonic shifts) but no ability to actually experience the game play first-hand.
Recently I've been exploring the back catalog of more obscure PS3 and X360 games via emulation and have found some absolutely terrific titles I never even knew existed. Some of them were only ever sold through the console's online store and never available on physical media. With the Xbox 360 and Nintendo Wii stores now long offline, only the PS3 store remains available - and who knows for how much longer, since Sony already announced its closure once and then changed their mind. There's now a race to preserve many of these titles.
The good news is that not only was almost all of it preserved, teenagers today are really interested in retro gaming. My 15 year-old daughter, who's not into computers more than any other 15 year-old girl, just asked if she could go with me to the vintage computer festival this Summer. She tells me her friends at school are all interested in running emulators to play classic games from arcade to SNES to PS2 and N64.
I guess the 'dark lining' to that silver cloud is that this interest from teens in retro gaming is partly thanks to the increasing downsides of modern gaming (cost, DLC, ads, hour-long downloads/installs, etc). While game graphics continue to get more and more impressive, stuff like real-time path tracing doesn't seem to excite teens as much as it does me. Ultimately, it's about game play more than visuals. Lately I've been exploring the immense back catalog of N64, PS2, PS3 and X360 games via emulation and there are some incredible gems I never even heard about back in the day. It's especially great now thanks to the huge variety of mods, enhancements, texture packs, decompilations/recompilations and fan translations. And current emulators can upscale and anti-alias those games even on a potato desktop or laptop with a low-end discrete GPU.
Understandable, which is why many of my comments kind of look like mini-history lessons, and why I tend to be pedantic.
However curiosity also plays a big role.
If I know so much about computing history since the 1950s, it's because I do my research and take advantage of all the archives that have been placed online; I certainly wasn't around to live through all of it.
I never thought that statement was actually about having an “idea”, but more about not actually having lived through the experience. Quite the opposite of your belief: no amount of study would allow them to understand what it was like.
The Atari ST was foundational for me. I loved it so much, learned a great deal, discovered and played many great video games, designed page layouts, made my own software and games. In December 2023 one of my recent games was listed in Ars Technica's "Best Games of 2023" alongside Super Mario Wonder and I can draw the line right the way back to my time on the Atari ST. https://news.ycombinator.com/item?id=38372936
I had a 520ST back in the mid 80s. I would have killed for a Mega ST, but I couldn't afford one and realistically needed an IBM-compatible PC, which I eventually got.
Things I remember about the 520ST:
- Those horrible diagonal function keys. There was no reason for them to be diagonal, rather than normal keys as they were on the IBM. But I've always hated function keys.
- Games like Dungeon Master (really still quite a good game today).
- Not a bad C compiler, but I can't remember who by - LightSomething?
- The GEM GUI was not so bad, but using it with a floppy disk was.
But all-in-all I was quite happy to get my PC-compatible to do serious work with.
GEM was in ROM on the Atari ST and it was fantastic. It was light years ahead of where Windows was at the time. It was Amiga Workbench that was somewhat limited by being on a floppy disk.
Ironically, speaking as an Amiga guy, those diagonal function keys were an aspect of the ST I really liked!
I don't know if they were consistent with the other keys in terms of feel, but they were a striking, unique design feature that instantly identified the machine as being Atari without compromising practicality.
Yeah, I forgot about that. But I suppose you didn't need to replug them very often, and it wasn't much worse than plugging into an IBM PC before USB came along. And at least the Atari had lots of useful ports.
I did prefer the Amiga, but I still got an Atari ST for a very obvious reason: it had MIDI DIN ports and was way cheaper than most digital sequencers at the time.
It's funny how some young producers today wonder "how did people do it without a computer before the 2000s?"... well, guess what, we did use computers! I cannot however remember what software sequencer I was using; I know it had MIDI effects (like MIDI echo), and that's all I remember.
And by 1998, Logic was fairly advanced anyway and even had plenty of plugins.
The Mega ST (not the Mega STe) is the best-quality machine of the series. Mechanical Cherry keyswitches on the keyboard. Easy access inside the case to expand (though the Mega STe/TT had a standardized VMEbus and the Mega ST was its own custom thing).
The Mega STe had a funkier case, VMEbus, and upgraded specs, but mushy rubber dome keyboard, more brittle plastics.
I like to collect the Mega as the best of the bunch, personally.
The Mega ST keyboard is indeed the best keyboard of the whole Atari family. My TT keyboard's rubber domes are getting stiff with age. That said, the Mega ST keyboard has one big flaw: its plastic gets extremely brittle and fragile with age. I had one keyboard drop 1 m (3 ft) to the floor and it exploded like a glass vase. So if you have a Mega ST keyboard, be very careful to handle it gently.
The expansion slot of the Mega ST is just 2 rows of 32 pins that are connected 1:1 to the CPU pins. Any extension that was supposed to be soldered onto the CPU could be put in the slot with a simple adapter (see e.g. the Volksfarben ISA slot adapter for ET4000 VGA cards).
The main problem with the keyboard was the non-standard size of the keycaps. The standard distance between keycaps is 0.75 inches, and the standard top width is 0.5 inches. The Atari ST keycap distance is standard, but the top width is 0.625 inches. Because of this, if your finger isn't exactly centered over the top of the key, it hits the adjacent key too, leading to key jams.
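To spell out the arithmetic behind the key-jam claim, using the figures above (gap = pitch - top width):

    standard keyboard: 0.750 in - 0.500 in = 0.250 in between key tops
    Atari ST:          0.750 in - 0.625 in = 0.125 in between key tops

i.e. half the usual clearance for a finger that lands slightly off-center.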
The positioning of the cursor keys on the Atari STs is interesting [0]. It arguably makes sense for the cursor block to be located more in the vertical middle rather than at the bottom edge of the keyboard.
I once played Ultima 6 from a RAM disk on an ST with 4 MB of RAM. I did the game's 'hard disk' install onto the RAM disk - it didn't realise the difference. Then I used oversized floppy disks (940 KB, I think) and a fast copy utility to load those 3 disks and start the game, and when done, save the game and copy it all back off to 3 floppies. It was totally fast!
”I don’t recall seeing Atari specifically market the Mega ST to developers, but I suspect a lot of developers found the better keyboard and extra RAM to be worth the upgrade.”
There wasn’t such a thing as a general developer market.
When you didn't have the internet, cloud services, or free Unix, how could you develop for anything other than a specific platform and device?
If you bought a Mega ST to write programs, your target audience were still only the people who had a regular ST. You couldn’t reach anyone else. So the advantage was minimal.
The idea that there can be a developer market separate from the baseline end-user platform is quite new. It emerged around 2007-2010 when web apps became a realistic option and you didn’t have to be on Windows to target the 90% of people who are on Windows.
I bought a Mega ST2 because I studied CS and wanted to become a developer. I sold the Amiga 500 my father had bought me. The ST was cheaper for programming than the Amiga 500, as on the Amiga you would need to add at least a second floppy drive and a lot of memory. Furthermore, I hated Workbench, the GUI of the Amiga (for the same reason I also hated Windows 3.1: you had to use a special program to access the files on the drives, and windows only showed icons for files that had a special icon drawn for them; I preferred how on GEM and the Mac Finder, windows would directly show what's on the disk).
I have to say, I love the industrial design of these 80s machines from Atari, and also from contemporary Commodore offerings. As an example, the original Amiga 1000 is beautiful, and I'd be incredibly happy to have a machine with that form factor, but equipped with modern internals, today.
Honestly, I don't think even Apple could touch the best of Atari and Commodore industrial design in the back half of the 1980s. To be blunt, the early Macintoshes simply weren't practical in their design: for starters, a tiny monitor that was originally black and white (which in 1984 was already kind of a joke), very limited upgradeability, relatively poor multimedia capabilities (speech synthesis was no more than a gimmick that was also available on other platforms), and then the whole aesthetic just wasn't that pleasant.
And I say this as someone who, personally, has only owned Apple machines for the past 15ish years, so I'm not exactly coming at this from a "not a fanboi" perspective. I'd still take 1980s Atari or Commodore aesthetic over modern Apple, or modern anything else for that matter[0].
Also, as an aside, I really enjoyed seeing "Atari Means Business with the Mega ST" as the top headline on Hacker News in 2025. Even on a Sunday when content typically tends to be more varied and interesting this was still an entertaining surprise.
[0] I suspect the reality may be that I'm an "anything but Wintel" kind of person, although not at any cost, because I did run PCs exclusively for 11 or 12 years. They never really helped me enjoy computing in the way the other machines have though.
The industrial design of PCs may have been lacking in beauty, but it was almost always practical.
For example: I cannot think of any desktop models that lacked internal expansion. They may have used a riser card to stack two or three slots sideways, but the slots were there. The design may have been crude, but at least your desktop wasn't turned into a disaster every time the technological landscape shifted: when hard drives became affordable, when the world switched to 3.5" floppies, when you decided to use online services or send faxes directly from your computer, or when you added a CD-ROM drive or cable Internet.
My memory is that one of the Atari units was capable of being treated as a Jerq or Gnot; I recall people running Core Wars on it. The keyboard looks familiar, but I believe there was a packaging option which had the CPU inside that unit, not just in a pizza box.
Atari “means” business..? I can’t get over the present tense in the heading. I double-checked if they had actually released something new. It feels like clickbait, and I wish it weren’t.