FYI, some of those photos are much later than 1981. I added the Arabic/Hebrew support to the Star document editor, which was probably around 1984 and is shown in one of the photos. I was still in college in 1981, so that could not have been the year.


I was a teenager riding a bicycle through a neighborhood when I came across a yard sale. Heard that someone had died. Among their things was a tape of the Koyaanisqatsi soundtrack composed by Philip Glass. Bought it for a couple of bucks, and later that night was blown away by the music. I had heard nothing like it before – it made me shiver with awe and fear. It felt like the dead were reaching out through the sounds. A few years later I watched the film and had a vision of the monstrous beauty, this ancient swarming organism called humanity, of which I’m like a finger or toe, a leaf and flower. Hard to describe the influence this film has had on my view of the world, and what art can do to you – even now it gives me goose bumps remembering that aesthetic experience in my youth.


Hahaha, that was a busy weekend when this launched in like 2005-2006. I was one of the sysadmins of the website with the THEMIS images. Google promoted it directly below the search box, which put my web servers three clicks away from Google’s front page. I spent the weekend cannibalizing compute nodes from one of our clusters to work as caching proxies. It all worked out pretty well.


We played “split the kipper” (mumbledypeg by another name) in Scotland over 50 years ago with pocket knives.

Iona and Peter Opie’s Oxford dictionary of children’s games, written in the 60s, is fascinating.

https://www.opiearchive.org/


This is a blast to see. I wrote IELM at least 25 years ago, when I was learning eLisp and was frustrated there wasn’t a REPL to help me try commands and learn it. I remember it being quite a thrill when the FSF wrote to me and asked me to assign the copyright so it could be bundled with Emacs. I’m so pleased to see it’s still in use!

I do remember I had to use an awful hack to make it work. Comint expects data from a pipe, but eLisp output is internal to Emacs. The details are hazy now, but I think I pushed the eval result as stdin to a ‘cat’ process for Comint to ingest the output. I wonder if that ever got cleaned up…


> The VP, in turn, assigned the task to Beatrice (Luoma) Ojakangas, by coincidence, the older sister of the engineer who had developed the egg roll machine for Chun King.

Beatrice is my aunt! (And the inventor of said egg roll machine, Eugene Luoma, is my uncle. He also invented the zip-it drain cleaner and all sorts of other clever things.)

Beatrice kind of hates talking about Pizza Rolls. She’d rather be known for her many cookbooks and other culinary work, not that trash. Jeno merely paid her an hourly wage to come up with different egg roll fillings and pizza was one of the ones she tried. No royalties or anything.

I love pizza rolls, though.


Full disclosure: Principal Software Engineer here on the Scratch backend…

Scratch is not built to be a “teach your kid programming languages” system; it is based on the work and ideas of the Lifelong Kindergarten group at the MIT Media Lab (the director of the group is Professor Mitch Resnick, the LEGO Papert Professor of Learning Research). The Papert part is where the term Mindstorms comes from (https://www.amazon.com/Mindstorms-Children-Computers-Powerfu…); the LEGO Group used it when branding those products, and our philosophy is heavily influenced by that work.

I can say that the numbers at https://scratch.mit.edu/statistics/ are real, and we have a substantial footprint of backend services and custom software to support them. We handle on the order of 15-20 million comments/month.

The primary design philosophy is:

Passion: You have a strong interest in a subject/problem to solve/explore
Projects: Build something based on your passions and gain direct, interactive experience with it.
Peers: Share your work with folks who are interested and provide feedback to you
Play: It should be fun!

Note that there is nothing in there about STEM/STEAM or application development. We build and support Scratch to provide creative tools for anyone to explore computation in a form that is relatable and has a low floor for understanding/entry. Having said that, the complexity of what Scratch can do rises sharply the more you work with it, and the concepts behind “forking” and open source are built in via the remix ability on individual projects.

A lot of design thinking goes into the frontend of Scratch to build on a creativity feedback loop that is not focused on learning Python or any other specific language (or its syntax, i.e. avoiding “why isn’t my program working… oh, one too many tabs… or maybe this semi-colon, or maybe this .”)

Another part I think is worth raising: the Scratch frontend is a sophisticated virtual machine interpreter with its own machine code and execution model, running in a JavaScript environment in the browser, and it is still open source. Google’s Blockly project was based on the ideas of Scratch 1.4, and when we ported Scratch 2 away from being Flash-based, we partnered with the Blockly group to fork their code base and create Scratch Blocks.

Based on the TIOBE index, we’re usually somewhere in the top 20 most popular “programming languages”. _eat it Fortran!_


Once managed tech for an insurance actuarial department. We ran IBM DB2 for underwriting and claims apps. One day had lunch with the actuaries to make friends and make sure we were supporting them well. At one point in the conversation I foolishly asked whether they would also like to access DB2 to minimize data transfers. They laughed and said: “SQL is like drinking data through a straw. We use APL so we can drink it all at once.” I felt like a rookie at spring training.


Bob has been an active member of the Austin startup community for 10+ years and I’ve talked with him many times. As an EE, it was cool meeting him the first time, and once I’d chatted with him a few times, I finally asked the question I’d been dying to ask: How’d you come up with “Metcalfe’s Law”?

Metcalfe’s Law states that the value of a network is proportional to the square of the number of devices in the system.
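
(The intuition behind the square, for what it’s worth: n devices can form on the order of n^2 distinct pairwise connections,

    \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^2}{2},

so if each possible connection carries some value, the total grows roughly with the square of the device count.)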

When I finally asked him, he looked at me and said “I made it up.”

Me: .. what?

Him: I was selling network cards and I wanted people to buy more.

Me: .. what?

Him: If I could convince someone to buy 4 instead of 2, that was great. So I told them buying more made each of them more valuable.

It was mind blowing because so many other things were built on that “law” that began as a sales pitch. Lots of people have proven out “more nodes are more valuable” but that’s where it started.

He also tells a story about declining a job with Steve Jobs to start 3Com and Steve later coming to his wedding. He also shared a scan of his original pitch deck for 3Com, which was a set of transparencies because PowerPoint hadn’t been invented yet. I think I kept a copy of it…


My mom was a PLATO developer. She wrote computer-based learning courses for it.

What I remember about PLATO was the games. I think there was one where you could drop a flower pot on Mickey Mouse’s head. Does that sound familiar to anyone?


Ryan and David and team, I am so happy to see this posted!

I’m quoted a few places in it. Yes, the story about the origin of the phrase “fire an event” is true and correct. I could even tell you exactly where I was sitting and which way I was facing when that event fired.

Some things stick in your mind, doobies or not.

Also mentioned is the VBX. In some ways, this may have been the worst API I ever designed. It was so bad that Microsoft eventually replaced it with COM! But it was the most successful.

If anyone has any questions, fire a comment!


I watched all 157 of these:
https://www.youtube.com/playlist?list=PLVV0r6CmEsFzDA6mtmKQE…
Really interesting.

I spoke to him once at a book signing and asked him about Orion. In summary he said: would it have worked? Probably. Should it have been done? Probably not. Although he did make the point that pretty much every big engineering project kills people.


This is awesome! This kind of extreme low-power application was the original reason I wanted Mosh — I always thought it would be awesome to have a wireless (hand-cranked?) laptop that could run SSH and would only consume energy when (a) a key was pressed [to update the locally generated UI], or (b) it wanted to fire up the radio (on its chosen schedule, e.g. <1 Hz) to retrieve an update from the server of the "ground truth" UI and reconcile that with the local prediction.


I worked on the UIUC PLATO system in the 1970s: CDC 6600 and 7600 CPUs with 60-bit words. Back then everything used magnetic core memory and that memory was unbelievably expensive! Sewn together by women in Southeast Asia, maybe $1 per word!

Having 6-bit bytes on a CDC was a terrific PITA! The byte size was a tradeoff between saving MONEY (RAM) and the hassle of shift codes (070) used to get uppercase letters and rare symbols! Once semiconductor memory began to be available (2M words of ‘ECS’ – “extended core storage” – actually semiconductor memory – was added to our 1M byte memory in ~1978), computer architects could afford to burn the extra 2 bits in every word to make programming easier…
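
To make the shift-code hassle concrete, here’s a toy decoder in Python. The tables are made up for illustration, not the real CDC display code; the point is just that one escape code buys you a second character set at the cost of an extra byte and fiddlier decoding:

    SHIFT = 0o70                       # hypothetical escape code
    BASE    = {0o01: 'a', 0o02: 'b'}   # hypothetical primary table
    SHIFTED = {0o01: 'A', 0o02: '%'}   # hypothetical secondary table

    def decode(codes):
        out, shifted = [], False
        for c in codes:
            if c == SHIFT:             # next code comes from the second table
                shifted = True
                continue
            out.append((SHIFTED if shifted else BASE).get(c, '?'))
            shifted = False
        return ''.join(out)

    print(decode([0o01, 0o70, 0o01, 0o02]))   # -> 'aAb'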

At about the same time microprocessors like the 8008 were starting to take off (1975). If the basic instruction could not support a 0-100 value it would be virtually useless! There was only one microprocessor that DID NOT use the 8-bit byte, and that was the 12-bit Intersil 6100, which copied the PDP-8 instruction set!

Also, the invention of double-precision floating point made 32-bit floating point okay. From the 40s till the 70s the most critical decision in computer architecture was the size of the floating-point word: 36, 48, 52, 60 bits… 32 bits alone is clearly inadequate, but the idea that you could have a second, larger floating-point format, with an FPU that handled both 32- AND 64-bit words, made 32-bit floating point acceptable.
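
The inadequacy of 32 bits on its own is easy to demonstrate: a 32-bit float has a 24-bit significand, so it cannot even represent every integer past 2^24. A quick NumPy check (nothing here is specific to any particular machine):

    import numpy as np

    # float32 has a 24-bit significand: adding 1 to 2**24 is lost entirely.
    x32 = np.float32(16_777_216.0)            # 2**24
    print(x32 + np.float32(1.0) == x32)       # True: the increment vanishes

    # float64 has a 53-bit significand, so the same addition still registers.
    x64 = np.float64(16_777_216.0)
    print(x64 + np.float64(1.0) == x64)       # False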

Also in the early 1970s text processing took off, partly from the invention of ASCII (1963), partly from 8-bit microprocessors, and partly from a little-known OS whose fundamental idea was that characters should be the only unit of I/O (Unix, father of Linux).

So why do we have 8-bit bytes? Thank you, Gordon Moore!


Olga was a famous doorkeeper at Chalmers University of Technology, Sweden.

Students used to send her postcards from their journeys.

It became so popular that it was enough to write “Olga, Sweden” for the letters to reach her [0] (source in Swedish).

[0]: https://sv.wikipedia.org/wiki/Olga_Boberg


At age 15 I wrote a pacman clone for the Atari ST and was both impressed by and jealous of Minter’s Llamatron for the same platform. My game ran at 30 Hz only rarely, usually degrading to 15 Hz, and you could really feel it in the gameplay. Llamatron was always fast (always 60 Hz?), because that’s just how Jeff rolls. Respect.

On the plus side, my crappy pacman clone was good enough to convince Andy Gavin to (years later) bring me on as the first developer hire at Naughty Dog. The system works! (I guess?)


John and I were in graduate school together (computer science, U Wisconsin – Madison). He was indeed a remarkable person. He was blind and deaf. He carried around a little mechanical Braille typewriter. To talk with John, you would type, and he would extend his hand into the device and feel the Braille impressions of what you were typing. He was not qualified for a normal seeing eye dog program because of the extent of his disabilities. So, he got a dog on his own and trained her. Her name was Sugar, and I can still hear John talking with and giving instructions to her. He was a living demonstration of the stunning heights that people achieve from time to time. I believe his PhD advisor was Marvin Solomon who was (is) also a remarkable and admirable person.


I used to work for Jimmy Cauty and Bill Drummond’s record label. This book looks interesting … but I really wish the focus was on the art and the music, and not the “the guys who burnt 1 million quid” incident.

One thing that the media have a tough time recognizing is the fact that Bill and Jimmy are legit experimental artists, and still love making art of all kinds. And they also happened to have some amazing musical talent and experience (Jimmy: The Orb; Bill: Big In Japan and early manager of Echo and the Bunnymen).

So, they decided to take their talents into areas where few experimental artists had ever gone before, taking over the pop charts … and then proceeded to do what experimental artists are wont to do in such a situation.

They gave a huge middle finger to the industry, by barnstorming the big UK music industry award ceremony (the ’92 Brit Awards), playing a death metal version of one of their dance hits while Bill fired blanks from a machine gun over the heads of the crowd. Later in the evening they dumped a dead sheep outside one of the after-parties, and shortly afterwards deleted their entire back catalogue.

They proceeded to do lots of other experimental stuff, ranging from writing some excellent books (I recommend Bill’s “45”) to activities such as Jimmy’s model English village a few years back.

And the music, 30+ years later, is still fantastic! Not just the pop hits. Listen to The White Room. Dig up the club singles and experiments like the Abba and Whitney projects and It’s Grim Up North.

Yet after all that, what does the media remember them for? More often than not, it’s the one-off act of Burning a Million Quid in 1994. Their ground-breaking music, the books, the anti-establishment statements and art … it’s almost an afterthought.

It’s like releasing an otherwise interesting book about Ozzy Osbourne – a seminal figure in the history of heavy metal with an unusual groundbreaking role in reality TV – and positioning it around a single sensational incident from 1982: “Crazy Train: Ozzy Osbourne, the Man Who Bit The Head Off A Bat”.


If you’re curious about how this sort of brain stimulation works, I just published a fun little explainer in PLOS Biology.

https://journals.plos.org/plosbiology/article?id=10.1371/jou…


Hah! Author of that 20-year-old web page here.

At the time I was attempting to use standard open source image processing software like ImageMagick to manipulate scientific data. I was disappointed to find that it was not suitable, both due to approximations like this one, and because all the libraries I looked at only allowed 8-bit grayscale. I really wanted floating point data.

Here is what I was working on back in those days: https://www.ocf.berkeley.edu/~fricke/projects/israel/project…

I was a summer student at the Weizmann Institute of Science in Rehovot, Israel, processing electron micrographs of a particular protein structure made by a particular bacterium. It’s very interesting: this bacterium attacks plants and injects some genetic material into the plant, causing the plant to start manufacturing food for the bacterium. By replacing the “payload” of this protein structure, the mechanism can be used to insert other genetic structures into the plant, instead of the sequence that causes it to produce food for the bacterium. Or something like that.

Here’s a random chunk of my research journal from those days: https://www.ocf.berkeley.edu/~fricke/projects/israel/journal…

The work contributed to this paper: https://www.jbc.org/article/S0021-9258(20)66439-0/fulltext

Here’s the Wikipedia article about the author of that algorithm: https://en.wikipedia.org/wiki/Alan_W._Paeth

And his original web page that I linked to, now via archive.org: https://web.archive.org/web/20050228223159/http://people.ouc…

If you liked this trick, check out Alan Paeth’s “Graphics Gems” series of books.
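
For anyone who hasn’t read the page: the trick, as I recall it, is rotating an image with three shear passes, each of which only ever shifts whole rows or columns. A rough NumPy sketch of the idea on a float grayscale array (nearest-neighbour shifts and wrap-around edges, so purely illustrative):

    import numpy as np

    def shear_x(img, s):
        # Shift each row horizontally, proportional to its distance from the
        # vertical centre (nearest-neighbour; np.roll wraps, real code would pad).
        out = np.empty_like(img)
        h = img.shape[0]
        for y in range(h):
            out[y] = np.roll(img[y], int(round(s * (y - h / 2))))
        return out

    def shear_y(img, s):
        # A vertical shear is just a horizontal shear of the transposed image.
        return shear_x(img.T, s).T

    def rotate_three_shears(img, theta):
        # Paeth's decomposition: R(theta) = Sx(-tan(theta/2)) Sy(sin theta) Sx(-tan(theta/2)).
        # Sign conventions depend on your axis orientation.
        a, b = -np.tan(theta / 2), np.sin(theta)
        return shear_x(shear_y(shear_x(img, a), b), a)

    rotated = rotate_three_shears(np.eye(64, dtype=np.float64), np.radians(15))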

Kudos and thanks to the OCF at UC Berkeley, which has hosted my web page for more than a quarter century with just about zero maintenance on my part.

And thanks for the trip down memory lane!


Yay, my research field on Hackernews!

Great achievement. The title is of course misleading: the Orbitrap itself (the part that “fits in your hand”) was hardly miniaturized; it’s about the same size as a regular one [0]. The achievement is to miniaturize the “mass, volume and” (IMO especially!) “power requirements” of the box around it (which, even miniaturized, does not fit in your hand). This runs at 41 W and weighs 8 kg. A commercial instrument runs at a total of ~2 kW and weighs >100 kg.

(Though in space they have the convenient advantage that no extra vacuum system is needed, which accounts for a lot of the size and energy consumption of these instruments here on Earth. The atmosphere on Europa is conveniently just about the “natural” operating environment an Orbitrap requires for its high accuracy.)

[0] https://commons.wikimedia.org/wiki/File:Orbitrap_Mass_Analyz…


Hi there! I work on the TypeScript team and I respect your feedback. Of course I do think TypeScript is worth it, and I’ll try to address some of the points you’ve raised with my thoughts.

i. Dependency management is indeed frustrating. TypeScript doesn’t create a new major version for every more-advanced check. In cases where inference might improve or new analyses are added, we run the risk of affecting existing builds. My best advice on this front is to lock to a specific minor version of TS.

ii. My anecdotal experience is that library documentation could indeed be better; however, that’s been the case with JavaScript libraries regardless of types.

iii. Our error messages need to get better – I’m in full agreement with you. Often a concrete repro is a good way to get us thinking. Our error reporting system can often take shortcuts to provide a good error message when we recognize a pattern.

iv. Compilation can be a burden from tooling overhead. For the front-end, it is usually less of a pain since tools like esbuild and swc are making these so much faster and seamless (assuming you’re bundling anyway – which is likely if you use npm). For a platform like Node.js, it is admittedly still a bit annoying. You can still use those tools, or you can even use TypeScript for type-checking `.js` files with JSDoc. Long-term, we’ve been investigating ways to bring type annotations to JavaScript itself and checked by TypeScript – but that might be years away.

I know that these points might not give you back the time you spent working on these issues – but maybe they’ll help avoid the same frustrations in the future.

If you have any other thoughts or want to dig into specifics, feel free to reach out at Daniel MyLastName at Microsoft .


Any of her descendants here on HN? She had 11 and it’s been more than half a century now, so there should be at least one or two HN readers.


The basic problem, as I’ve written before[1][2], is that, after I put in Nagle’s algorithm, Berkeley put in delayed ACKs. Delayed ACKs delay sending an empty ACK packet for a short, fixed period based on human typing speed, maybe 100ms. This was a hack Berkeley put in to handle large numbers of dumb terminals going into time-sharing computers through terminal-to-Ethernet concentrators. Without delayed ACKs, each keystroke sent a datagram with one payload byte, and got back a datagram with no payload, just an ACK, followed shortly thereafter by a datagram with one echoed character. So they got a 30% load reduction for their TELNET application.

Both of those algorithms should never be on at the same time. But they usually are.

Linux has a socket option, TCP_QUICKACK, to turn off delayed ACKs. But it’s very strange. The documentation is kind of vague, but apparently you have to re-enable it regularly.[3]
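
For anyone who wants to poke at this, here’s a rough Python sketch of the Linux-side workaround. The host and port are placeholders, and whether QUICKACK is even what you want depends on your traffic pattern; the point is just that the option has to be re-asserted around reads, per [3]:

    import socket

    # TCP_QUICKACK is Linux-only; 12 is the Linux value, used as a fallback
    # if this Python build doesn't expose the constant.
    TCP_QUICKACK = getattr(socket, "TCP_QUICKACK", 12)

    sock = socket.create_connection(("example.invalid", 9999))   # placeholder endpoint
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)   # disable Nagle

    while True:
        # Not sticky: re-enable before each read so ACKs go out immediately.
        sock.setsockopt(socket.IPPROTO_TCP, TCP_QUICKACK, 1)
        data = sock.recv(4096)
        if not data:
            break
        # ... handle data ...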

Sigh.

[1] https://news.ycombinator.com/item?id=10608356

[2] https://developers.slashdot.org/comments.pl?cid=14515105&sid…

[3] https://stackoverflow.com/questions/46587168/when-during-the…


I used to work for Sherwin-Williams. The in-store computers run some custom *nix OS. The software the company runs on is a text-based UI that hasn’t changed since it was introduced in the 90s.

They released a major update in 2020 that allowed you to move windows around the screen. It was groundbreaking.

But let me tell you, this system was absolutely terrible. All the machines were full x86 desktops with no hard drives; they netbooted from the manager’s computer. Why not a thin client? A mystery.

The system stores a local cache of the database, which is only superficially useful. The cache is always several days, weeks, or months out of date, depending on what data you need. Most functions require querying the database hosted at corporate HQ in Cleveland. That link is up about 90% of the time, and when it’s down, every store in the country is crippled.

It crashes frequently and is fundamentally incapable of concurrent access: if an order is open on the mixing station, you cannot access that order to bill the customer, and you can’t access their account at all. Frequently, the system loses track of which records are open, requiring the manager to manually override the DB lock just to bill an order.

If a store has been operating for more than a couple of years, the DB gets bloated or fragmented or something, and the entire system slows to a crawl. It takes minutes to open an order.

Which is all to say it’s a bad system that cannot support their current scale of business.


As someone who went through a Western European modernist composition school (and who, on entering it, fully believed in the supremacy of modernist music), I found it a painful process to notice that most of the post-WWII (“classical”) music composed in Europe is disconnected not just from the wider audience but from what I’d call the physical aspect of music: pulse and resonance. Most of it just makes the audience feel anxious and confused, if they are even able to pay attention.

Discovering the American school of minimalism was then like waking up from a nightmare. It restored my faith in art music and in the idea that writing for classical instruments still makes sense in the 21st century. So it makes me wonder why the article doesn’t mention what I think is the greatest masterpiece of aleatoric music, and the starting point of minimalist music: Terry Riley’s In C [1]. It comes from a different continent than the original aleatoric music and its aesthetic is a “bit” different, but it has an aleatoric structure, and listening to it is a joyful experience, which I can’t say about most of the examples linked in the article.

1: https://youtu.be/DpYBhX0UH04


Amazingly brilliant work, especially given the CPU capabilities at the time. Carmack’s use of BSP trees inspired my own work on the Crash Bandicoot renderer. I was also really intrigued by Seth Teller’s Ph.D. thesis on Precomputed Visibility Sets, though I knew that would never run on home console hardware.

None of these techniques is relevant anymore given that all the hardware has Z buffers, obviating the need to explicitly order the polygons during the rendering process. But at the time (mid 90s) it was arguably the key problem 3D game developers needed to solve. (The other was camera control; for Crash Andy Gavin did that.)

A key insight is that sorting polygons correctly is inherently O(N^2), not O(N lg N) as most would initially assume. This is because polygon overlap is not a transitive property (A in front of B and B in front of C does NOT imply A in front of C, due to cyclic overlap.) This means you can’t use O(N lg N) sorting, which in turn means sorting 1000 polygons requires a million comparisons — infeasible for hardware at the time.
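
To make the non-transitivity concrete, here’s a toy Python sketch. The “in front of” pairs are made up rather than derived from real geometry; the point is that a comparison sort happily returns some order for cyclic input, and only the O(N^2) pairwise check reveals that no order can be right:

    from functools import cmp_to_key

    # Three polygons with cyclic overlap: A in front of B, B in front of C,
    # C in front of A. (Hypothetical pairs, not computed from geometry.)
    in_front = {("A", "B"), ("B", "C"), ("C", "A")}

    def cmp(p, q):
        # Back-to-front order: whatever is in front must be drawn later.
        if (p, q) in in_front: return 1
        if (q, p) in in_front: return -1
        return 0

    order = sorted(["A", "B", "C"], key=cmp_to_key(cmp))
    print(order)                      # some order; the sort never notices the cycle

    # O(N^2) verification: every pair must agree with the draw order.
    violations = [(p, q) for i, p in enumerate(order)
                  for q in order[i + 1:] if (p, q) in in_front]
    print(violations)                 # non-empty: no total order is correct here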

This is why many games from that era (3DO, PS1, etc) suffer from polygons that flicker back and forth, in front of and behind each other: most games used bucket sorting, which is O(N) but only approximate, and not stable frame to frame.

The handful of games that did something more clever to enable correct polygon sorting (Doom, Crash and I’m sure a few others) looked much better as a result.

Finally, just to screw with other developers, I generated a giant file of random data to fill up the Crash 1 CD and labeled it “bsptree.dat”. I feel a bit guilty about that given that everyone has to download it when installing the game from the internet, even though it is completely useless to the game.


In 2015 I was working at a “fintech” company and a leap second was announced. It was scheduled for a Wednesday, unlike all the others before it, which had happened on weekends, when markets were closed.

When the previous leap second was applied, a bunch of our Linux servers had kernel panics for some reason, so needless to say everyone was really concerned about a leap second happening during trading hours.

So I was assigned to make sure nothing bad would happen. I spent a month in the lab, simulating the leap second by fast-forwarding clocks for all our different applications, and testing different NTP implementations (I like chrony, for what it’s worth). I had heaps of meetings with our partners trying to figure out what their plans were (they had none), and to test what would happen if their clocks went backwards. I had to learn how to install the leap seconds file into a bunch of software I never even knew existed, write various recovery scripts, and at one point was knee-deep in ntpd and Solaris kernel code.
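
Part of what makes this so messy is that POSIX time simply has no slot for 23:59:60; the extra second has to be absorbed by stepping, repeating, or smearing an ordinary second. A quick Python illustration:

    import datetime

    utc = datetime.timezone.utc
    before = datetime.datetime(2015, 6, 30, 23, 59, 59, tzinfo=utc)
    after  = datetime.datetime(2015, 7, 1, 0, 0, 0, tzinfo=utc)

    print(int(before.timestamp()))   # 1435708799
    print(int(after.timestamp()))    # 1435708800: only 1 apart, yet 2 real
                                     # seconds elapsed between them that night

    # datetime.datetime(2015, 6, 30, 23, 59, 60, tzinfo=utc) raises ValueError:
    # the leap second itself is unrepresentable.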

After all that, the day before it was scheduled, the whole trading world agreed to halt the markets for 15 minutes before/after the leap second, so all my work was for nothing. I’m not sure what the moral is here, if there is one.


The article describes how Apple included support for the x86 parity flag which comes from the 8080. Parity is relatively expensive to compute, requiring XOR of all the bits, so it’s not an obvious thing to include in a processor. So why did early Intel processors have it? The reason is older than the 8080.

The Datapoint 2200 was a programmable computer terminal announced in 1970 with an 8-bit serial processor implemented in TTL chips. Because it was used as a terminal, they included parity for ASCII communication. Because it was a serial processor, it was little-endian, starting with the lowest bit. The makers talked to Intel and Texas Instruments to see if the board of TTL chips could be replaced with a single-chip processor. Both manufacturers cloned the existing Datapoint architecture. Texas Instruments produced the TMX 1795 microprocessor chip and slightly later, Intel produced the 8008 chip. Datapoint rejected both chips and stayed with TTL, which was considerably faster. (A good decision in the short term but very bad in the long term.) Texas Instruments couldn’t find another buyer for the TMX 1795 so it vanished into obscurity. Intel, however, decided to sell the 8008 as a general-purpose processor, changing computing forever.

Intel improved the 8008 to create the 8080. Intel planned to change the world with the 32-bit iAPX 432 processor which implemented object-oriented programming and garbage collection in hardware. However, the 432 was delayed, so they introduced the 8086 as a temporary stop-gap, a 16-bit chip that supported translated 8080 assembly code. Necessarily, the 8086 included the parity flag and little endian order for compatibility. Of course, the 8086 was hugely popular and the iAPX 432 was a failure. The 8086 led to the x86 architecture that is so popular today.

So that’s the history of why x86 has a parity bit and little-endian order, features that don’t make a lot of sense now but completely made sense for the Datapoint 2200. Essentially, Apple is putting features into their processor for compatibility with a terminal from 1971.
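
For concreteness, here’s roughly what that flag computes. On x86 the parity flag looks only at the low byte of a result, and it’s set when that byte contains an even number of 1 bits; a quick Python sketch (obviously not how the silicon does it):

    def parity_flag(result: int) -> bool:
        # x86's PF looks only at the low 8 bits of the result.
        byte = result & 0xFF
        ones = 0
        while byte:
            ones ^= byte & 1          # running XOR of the bits
            byte >>= 1
        return ones == 0              # PF set means an even number of 1 bits

    print(parity_flag(0b10110100))    # True: four 1 bits, even parity, PF set
    print(parity_flag(0b10110101))    # False: five 1 bits, odd parity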


I’ve been chasing infrasonic ranges in home audio for over two decades. You can’t “detect” these frequencies in the normal way. You experience them by way of your physical environment being excited by them. Feeling pressure waves move through whatever you are standing/sitting on can add an entirely new dimension to the experience.

I used to run experiments with friends and family using an 800 L ported subwoofer tuned to ~13 Hz with a 40 Hz cutoff. Not one person failed to tell whether it was on or off. Certain content makes these frequencies substantially more obvious. Classical music performed in large concert halls is one surprising candidate outside of Mission Impossible scenes. Being able to “feel” the original auditorium in your listening room is a very cool effect to me.
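
For anyone curious what “tuned to ~13 Hz” works out to physically, here’s a back-of-the-envelope Helmholtz calculation in Python. The port dimensions below are made up for illustration, not my actual build:

    import math

    def port_tuning_hz(box_litres, port_diameter_m, port_length_m, c=343.0):
        # Helmholtz resonance of a ported box: f = (c / 2*pi) * sqrt(A / (V * L_eff)),
        # where L_eff adds an end correction of roughly 0.85 * diameter in total.
        V = box_litres / 1000.0                     # box volume, m^3
        A = math.pi * (port_diameter_m / 2) ** 2    # port cross-section, m^2
        L_eff = port_length_m + 0.85 * port_diameter_m
        return (c / (2 * math.pi)) * math.sqrt(A / (V * L_eff))

    # Hypothetical port for an 800 L box: 25 cm diameter, 85 cm long.
    print(round(port_tuning_hz(800, 0.25, 0.85), 1))   # ~13.1 Hz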
