garote: (programming)

I rarely write about my work here. But today I think I will!

I've worked on many codebases, with very large numbers of contributors in some cases, and only a few in others. Generally when you make a contribution to a large codebase you need to learn the etiquette and the standards established by the people who came before you, and stick to those.

Not making waves - at least at first - is important, because along with whatever code improvements you may contribute when you join a project, you also bring a certain amount of friction along with you that the other developers must spend energy countering. Even if your code is great you may drag the project down overall by frustrating your fellow contributors. So act like a tourist at first:

  • Be very polite, and keen to learn.
  • Don't get too attached to the specific shape of your contribution because it may get refactored, deferred, or even debated out of existence.
  • It won't always be like this, but no matter what kind of big-shot you are on other projects, it may be like this at first for this new one.

Let me put it generally: Among supposedly anti-social computer geeks, personality matters. There's a reason many folks in my industry are fascinated by epic fantasy worlds always on the brink of war: They are actually very sensitive to matters of honor and respect.

Anyway, this is a post about code commentary.

In one codebase I contributed to, I encountered this philosophy about code comments from the lead developers: "A comment is an apology."

The idea behind it is, comments are only needed when the code you write isn't self-explanatory, so whenever you feel the need to write a comment, you should refactor your code instead.

I believe this makes two wrong assumptions:

  • The only purpose of a comment is to compensate for some negative aspect of the code.
  • Code that's easy for you to read is easy for everyone to read.

The first assumption contradicts reality and history. Code comments are obviously used for all kinds of things, and have been since the beginning of compiled languages. You live in a world teeming with other developers using them for these purposes. By ruling some of them out you are expressing a preference, not some grand truth.

Comments are used to:

  • Briefly summarize the operation of the current code, or the reasoning used to arrive at it.
  • Point out important deviations from a standard structure or practice.
  • Explain why an alternate, simpler-seeming implementation does not work, and link to the external factor preventing it.
  • Provide input for auto-generated documentation.
  • Leave contact information or a link to an external discussion of the code.
  • Make amusing puns just to brighten another coder's day.

All of these - and more - are valid, and when you receive code contributed by other people, you should take a light approach to policing which categories are allowed.

The second assumption is generally based in ego.

I've been writing software for over 40 years, and I haven't abandoned code comments or even reduced the volume of the ones I generate, but what I have definitely done is evolve the content of them significantly.

I've developed an instinct over time for what the next person - not me - may have slightly more trouble unraveling. That includes non-standard library choices, complex logic operations that need to be closely read to be fully understood, architectural notes to help a developer learn what influences what in the codebase, and brief summaries at the tops of classes and functions to explain intent, for a developer to keep in mind when they read the code beneath. Because hey, maybe my intent doesn't match my code and there's a bug in there, hmmm?

The reason I do this is humility. I understand that even after 40 years, I am not a master of all domains. The code I write and the choices I make may be crystal clear to me, but not to others. Especially new contributors: People coming into my codebase from outside. Especially people with less experience in the realm I'm currently working in. For the survival of a project, it's better to know when newcomers need an assist and provide it, than to high-handedly assume that anyone who doesn't understand the code instinctively must be an unworthy developer who should be discouraged from contributing, as if explaining what's going on were "dumbing down" your code.

Along the same lines, it's silly to believe that your own time is so very valuable that writing comments in code is an overall reduction in your productivity.

You may object, "but what if the comments fall out of sync with the code itself, and other developers are actually led astray?"

I have two responses to this, and you may not like either one. First, if your comments are out of sync with the implementation, it's for one of two reasons: either the comments explain how the code works and the implementation has drifted, or the comments explain the intent behind the code and the behavior no longer matches that intent. In the first case, the stale comment may lead a developer to introduce a bug if they don't actually understand the code. But if they're reading the code and can't understand it because it's complex, then the commentary was justified, and it should be repaired rather than removed. (Or you should refactor the code so you don't need to explain "how" so much.) In the second case, someone has already introduced a bug, and the comment is a means to identify the fix.
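That second case - a comment recording intent while the code drifts away from it - can be made concrete with a tiny, entirely hypothetical example. The function name and values here are invented for illustration:

```python
# Hypothetical example: the comment records intent, and the
# implementation has drifted away from it.

# Intent: return the arithmetic mean of the inputs.
def average(values):
    # Drift: this actually returns the midpoint of min and max,
    # which differs from the mean for most inputs.
    return (min(values) + max(values)) / 2

# Reading the intent comment against the code is what flags the bug,
# and it tells you exactly what the fix should restore:
def average_fixed(values):
    return sum(values) / len(values)
```

Without the intent comment, a reader has no way to know whether the midpoint behavior is a bug or a deliberate choice; with it, the mismatch is the bug report.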

And second, if it feels like a lot of trouble to maintain your comments, then perhaps you write great code but you're not very good at explaining it in clear language to other humans. You should work on that.

If it's your project, you can make the rules, and if it's your code, then obviously it's clear to you. But if you want to work on a team, and have that team survive - and especially if you want to form a team around your own project - then you need a broader philosophy.

By the way, I should note that there are less severe incarnations of "a comment is an apology" out there. For example, "a comment is an invitation for refactoring". That's a handy idea to consider, though it still runs afoul of the reductionist attitude about the purpose of comments.

You should indeed always consider why you think a comment is necessary, because it might lead to an alternate course of action. Even if that action has no effect on the codebase itself, like filing a ticket calling for a future refactor once an important feature gets shipped, it may be a better move. But this is an exercise in flexibility, and in considering what you might have missed, rather than a mandate that code be self-explanatory enough to be comment-free (and an assumption that you personally are the best judge of that).

Here are my own guidelines for writing comments. They're a bit loose, and they stick to the basics.

  • Comments explain why, not how.
  • Unless the how is particularly complicated. Then they explain how, but not what.
  • Unless the what is obscure relative to the standard practice, in which case a comment explaining what might be useful.
  • You learn these priorities as you go, and as you learn about a given realm of software development.
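As a sketch of how those priorities play out, here's a small Python function commented along those lines. The scenario is hypothetical (the flaky upstream service, the function names, the retry counts are all invented): one "why" comment for the external constraint, one "how" comment for the non-obvious part, and no comments narrating the obvious "what".

```python
import time

def fetch_with_retry(fetch, attempts=3, base_delay=1.0):
    # WHY: (hypothetical scenario) the upstream service sheds load
    # with transient errors several times a day, so a single plain
    # call isn't reliable enough.
    for attempt in range(attempts - 1):
        try:
            return fetch()
        except ConnectionError:
            # HOW, because it's non-obvious: exponential backoff
            # (1s, 2s, 4s, ...) so retries don't pile onto a server
            # that is already struggling.
            time.sleep(base_delay * (2 ** attempt))
    # Final attempt: if this one fails, let the exception propagate.
    return fetch()
```

Note what's absent: no comment saying "loop over the attempts", because the code already says that.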

Always be thinking about the next person coming in after you, looking around and trying to understand what you've done. And, try to embrace the notes they're compelled to contribute as well.

garote: (castlevania 3 sunset)

One of the earliest and most memorable computer games I played as a kid was "King's Quest II", for the Apple IIe. It was pretty hard, and I only managed to get about 1/3 through it, because there was a bridge in the game that would collapse, sending my character plummeting into a canyon. I never figured out that the bridge could only be crossed a set number of times before it would always collapse, and the saved game I was playing only had one crossing left.

So I remained stumped, until I got a "hint book" as a Christmas present. The book was full of questions with empty boxes beneath them, and you could run a special pen over the boxes, causing the answers to slowly fade into view before your eyes. I revealed the answer to "Why does the bridge keep collapsing?" and slapped my forehead, then started the game from the beginning, carefully counting the times I crossed.

Later that day I finished the game. All the rest of the puzzles were easy, and I barely needed the hint book, but I used the marker to reveal all the answers anyway. From those I realized there were multiple ways to solve some of the puzzles, which added a few more hours to the fun.

Over dinner that night I said "Let's get King's Quest III!"

My father smiled and said "Well, the last one cost 40 dollars, but eight months of entertainment for 40 dollars is a pretty good deal, so we'll see."

I played and enjoyed King's Quest III, and then King's Quest IV, but that was the last sequel that would run on Apple computers. Then I left for college, and everyone was playing console games and getting well into 3D graphics. King's Quest V, VI, and VII came and went, but I was distracted by multiplayer games and girls.

When King's Quest VIII appeared, I only got vague news of it from gaming magazines and the early internet. I read that it was a massive departure in tone and technology from the earlier games, and that disoriented all the people playing and reviewing it. I assumed it wasn't very good, and wouldn't sell.

Fast forward 25 years...

Apparently the game found an audience, and once a patch was released to fix the glitches in it, reviews and ratings went up. It's true that it was weird, and very unlike the rest of the series, and suffered greatly by being too ambitious for the scrappy state of 3D graphics technology at the time. To be honest, in terms of both visuals and motion, it looks ugly now, even while 2D games from years earlier still look completely acceptable to the modern gaming eye.

For a fun comparison, check out this bundle on the "Good Old Games" retro gaming site. They're selling King's Quest VII and King's Quest VIII in one package, and they show screenshots from each side-by-side. Flip through and you'll see nice-ugly-nice-ugly-nice-ugly-nice-ugly...

Still, I got curious, and discovered a few video walkthroughs of the game. While watching those I noticed that the background music was eerily compelling, and had a sudden need to hear it in more detail. There were mp3 versions of some of the musical cues sitting around online, but I wanted higher quality. So I went to the source: The Internet Archive copy of the original King's Quest VIII CD-ROM.

I downloaded that, mounted the disc in an emulated copy of Windows XP, and went trolling around. Turns out there are hundreds of files just sitting there on the CD:

But what is this ".AUD" format? Well, long story short, I tried a bunch of different utilities on both Windows and Mac, and eventually did this:

  1. Copy all the .AUD files into a folder on the Mac
  2. Install ffmpeg via homebrew
  3. Go to the folder via Terminal, and run:
     for i in *.AUD; do ffmpeg -i "$i" "${i%.*}.WAV"; done

That gave me a long list of uncompressed audio files to work with, and I went poking through them, and gathered the longest ones into an hour-long collection, converted to Apple Lossless format with proper tags.
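For anyone retracing the last step, ffmpeg can do the lossless conversion too. This is a sketch under assumptions: the file names are hypothetical, `-c:a alac` selects the Apple Lossless encoder, and the title tag is simply derived from each filename rather than proper per-track tagging.

```shell
# Batch-convert the extracted WAV files to Apple Lossless (.m4a),
# tagging each track with its filename stem. File names hypothetical.
for f in *.WAV; do
  ffmpeg -i "$f" -c:a alac -metadata title="${f%.*}" "${f%.*}.m4a"
done
```

The `${f%.*}` parameter expansion strips the extension, so THEME1.WAV becomes THEME1.m4a.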

Here, have an hour-long compilation of music from King's Quest: Mask of Eternity.

And then I discovered something else. There are some voiceover outtakes scattered among the rest of the audio.

"There is a curious slot in this pedestal. Something must fit here. Let me try... this. Zip... Ugh... Ow... No, doesn't work."

And so on. In all their horrible glory, here they are. Another amusing detail is that in addition to the usual walkthroughs, you can find complete transcripts of the game made by automated software trawling through the data files, and the outtakes are right there in the transcripts. Surely someone else has noticed these in nearly 30 years? Good grief, I hope so.

Anyway, I recommend the music. To me it sounds like a companion ambient album to the soundtrack of the film Labyrinth. (Another favorite of mine.)

garote: (bards tale garth pc)
Every once in a long while I get together with some of the crew and we play StarCraft II. This time I left an audio transcriber running in the background on my machine.

So, an algorithm designed for turning orderly work meetings into transcripts applied itself to a hash of gross sound effects and trash talk:

[00:54:47.97] "I'm in a little bit of trouble. Oh yeah." (gibberish) (slurping) "Why isn't he dying? What is he eating in my ear?"

[00:57:44.69] "My roommate made fried chicken and gave me some. Mineral Fields, depleted."

[00:58:24.61] "That's a nice roommate. Yeah. I'm hearing boom boom booms. Uh, sounds like there's stuff blowing up at my door. We require more minerals. We require more minerals. We require more minerals. We require more minerals. We require more m- mutation. Complete. Metamorphosis complete. We require more minerals. Mineral fields, we require more minerals. We require more minerals. We require more minerals."

[01:02:40.87] "I did good. What? Oh, do you need... My keys are in there. I'm gonna clean some of these butt lids out. Hi Squirt! Blarby the crow." (gurgling) (gurgling) (gurgling) (gurgling) (splashing) (water splashing)

[01:03:38.63] "Why is my base is all like dark red? Nutrition complete."

[02:01:55.44] "*burp* Some more of that. So on more Overnords." (chewing) "Son of a." (chewing) (chewing) (chewing) (chewing) (chewing) "Thank you very much." (indistinct) "I messed up the build, shit, I shouldn't have said that."

[02:03:23.76] (gurgling) "We require more minerals. Ugh. Okay, here's a hungry mess coming. Yikes. Those things attack ground, so it's a bit of a pain." (silence) (slurping) (slurping) "Follow that orange blob. I'll be right behind you. Oh."

[02:03:23.76] "Man, what a mess! What a mess! Our bosses are under attack!" (sips tea) (slurps tea) (slurps tea) (fire crackling)

[02:12:04.93] "Fart Peace, and... Peace is greater than the incompleteness. Mutants. Arco-Lanthar in combat. Come on. Arco-Lanthar in combat. Our horses are under attack! Their sphincters are exhausted! Our allies' faith is threatened. We stand as one. Unacceptable warplication. *Quack Quack* Stahp my avalanche. Ooh, rich fespine gas."

[02:14:31.99] "Nutrition... Complete. I'm building the wall and building the protoss bay for you. Agent. You sneeze on a vessel in Geyser."

[02:27:14.55] (chewing) (burping) (silence) "Where'd you say Nick was? Bottom left. Okay. Noted. Hey, cowards. Oh, there's invisible men! Dad has invisible men. Oh, boy." (speaking in foreign language) (liquid gurgling) (speaking in foreign language) (liquid gurgling) (liquid gurgling) (liquid gurgling) "Minerals sealed, depleted, the high class is under attack." (gulping sounds) "Their fingers are exhausted. I'm going to teach you to compete."

[03:09:16.29] "The casters who like who do like the pro-korean people who are on crack while they're playing this game they still make them accidentally too and they get lumped in with the army. It's very funny."

[03:14:29.75] "Oh man, look at that, a prototank!" (I'm not sure what I was saying here, sorry!) "Oh my god, I thought I got jumped. Hey, come on. I'm right here. Oh, shit. Oh, shit. Oh, shit. Oh, shit. Oh, shit. Oh, shit. Oh, shit. Oh, shit. Oh, shit. Oh my god." (sounds of a cat purring) (scissors snipping) (scissors snipping) (rustling paper) "Oh, you butthole. Whoa. These jerks come from? I need some anti-air."

[03:20:30.15] "Where's the rest of your stupid base?" (sounds of eating) (growling) "3, 2, 1, victory!" (laughter) "Wow!"
garote: (weird science)
Search engines used to take in a question and then direct the user to some external data source most relevant to the answer.

Generative AI in speech, text, and images is a way of ingesting large amounts of information specific to a domain and then regurgitating synthesized answers to questions posed about that information.  This is basically the next evolutionary step of a search engine.  The main difference is, the answer is provided by an in-house synthesis of the external data, rather than a simple redirect to the external data.

This is being implemented right now on the Google search page, for example.  Calling it a search page is now inaccurate.  Google vacuums up information from millions of websites, then regurgitates an answer to your query directly.  You never perform a search.  You never visit any of the websites the information was derived from.  You are never aware of them, except in the case where Google is paid to advertise one to you.

If all those other pages didn’t exist, Google's generative AI answer would be useless trash.  But those pages exist, and Google has absorbed them.  In return, Google gives them ... absolutely nothing, but still manages to stand between you and them, redirecting you to somewhere else, or ideally, keeping you on Google permanently.  It's convenient for you, profitable for Google, and slow starvation for every provider of content or information on the internet.  Since its beginning as a search engine, Google has gone from middleman, to broker, to consultant.  Instead of skimming some profit in a transaction between you and someone else, Google now does the entire transaction, and pockets the whole amount.

Reproducing another's work without compensation is already illegal, and has been for a long time.  The only way this new process stays legal is if the work it ingests is sufficiently large or diluted that the regurgitated output looks different enough (to a human) that it does not resemble a mere copy, but an interpretation or reconstruction.  There is a threshold below which any reasonable author or editor would declare plagiarism, and human editors and authors have collectively refined that threshold for centuries.  Pass that threshold, and your generative output is no longer plagiarism. It's legally untouchable.

An entity could ingest every jazz performance given by Mavis Staples, then churn out a thousand albums "in the style" of Mavis Staples, and would owe Mavis Staples nothing, while at the same time reducing the value of her discography to almost nothing.  An entity could do the same for television shows, for novels - even non-fiction novels - even academic papers and scientific research - and owe the creators of these works nothing, even if they leveraged infinite regurgitated variations of the source material for their own purposes internally.  Ingestion and regurgitation by generative AI is, at its core, doing for information what the mafia needs to do with money to hide it from the law:  It is information laundering.

Imitation is the sincerest form of flattery, and there are often ways to leverage imitators of one's work to gain recognition or value for oneself. These all rely on the original author being able to participate in the same marketplace that the imitators are helping to grow. But what if the original author is shut out? What if the imitators have an incentive to pretend that the original author doesn't exist?

Obscuring the original source of any potential output is the essential new trait that generative AI brings to the table.  Wait, that needs better emphasis:  The WHOLE POINT of generative AI, as far as for-profit industry is concerned, is that it obscures original sources while still leveraging their content.  It is, at long last, a legal shortcut through the ethical problems of copyright infringement, licensing, plagiarism, and piracy -- for those already powerful enough to wield it.  It is the Holy Grail for media giants.  Any entity that can buy enough computing power can now engage in an entirely legal version of exactly what private citizens, authors, musicians, professors, lawyers, etc. are discouraged or even prohibited from doing. ... A prohibition that all those individuals collectively rely on to make a living from their work.

The motivation to obscure is subtle, but real.  Any time an entity provides a clear reference to an individual external source, it is exposing itself to the need to reach some kind of legal or commercial or at the very least ethical negotiation with that source.  That's never in their financial interest.  Whether it's entertainment media, engineering plans, historical records, observational data, or even just a billion chat room conversations, there are licensing and privacy strings attached. But, launder all of that through a generative training set, and suddenly it's ... "Source material? What source material? There's no source material detectable in all these numbers. We dare you to prove otherwise." Perhaps you could hire a forensic investigator and a lawyer and subpoena their access logs, if they were dumb enough to keep any.

An obvious consequence of this is, to stay powerful or become more powerful in the information space, these entities must deliberately work towards the appearance of "originality" while at the same time absorbing external data, which means increasing the obscurity of their source material.  In other words, they must endorse and expand a realm of information where the provenance of any one fact, any measured number, any chain of reasoning that leads outside their doors, cannot be established.  The only exceptions allowable are those that do not threaten their profit stream, e.g. references to publicly available data.  For everything else, it's better if they are the authority, and if you see them as such.  If you want to push beyond the veil and examine their reasoning or references, you will get lost in a generative hall of mirrors. Ask an AI to explain how it reached some conclusion, and it will construct a plausible-looking response to your request, fresh from its data stores. The result isn't what you wanted. It's more akin to asking a child to explain why she didn't do her homework, and getting back an outrageous story constructed in the moment. That may seem unfair since generative AI does not actually try to deceive unless it's been trained to. But the point is, ... if it doesn't know, how could you?

This economic model has already proven to be ridiculously profitable for companies like OpenAI, Google, Adobe, et cetera.  They devour information at near zero cost, create a massive bowl of generative AI stew, and rent you a spoon.  Where would your search for knowledge have taken you, if not to them?  Where would that money in your subscription fee have gone, if not to them?  It's in the interest of those companies that you be prevented from knowing. Your dependency on them grows. The health of the information marketplace and the cultural landscape declines. Welcome to the information mafia.

Where do you suppose this leads?
garote: (viking)
Does this look right to you?



This is the flagship phone, running the latest software - iOS 18.1.1. This software is deployed to hundreds of millions of devices around the world. Am I holding it wrong or something??

QA

Oct. 29th, 2024 10:34 pm
garote: (programming)
A QA engineer walks into a bar.

Orders a beer.
Orders 0 beers.
Orders 999999999 beers.
Orders a parrot.
Orders -1 beers.
Orders a ˆ¨ºø¨´.
Orders "minus zero" beers.
Orders a beer from a customer.
Orders a bar.
Pays for a beer in doubloons.
Leaves a -$500 tip.
Walks in and out of the bar 5000 times in one second.
Attempts to pay a tab without ordering.
Attempts to sell a beer.
Opens a tab for Bobby “; DROP TABLES;.

The QA engineer walks out.

The first real customer walks in and asks where the bathroom is. The bar bursts into flames, killing everyone.
garote: (Default)

People learning about evolution sometimes ask, "Why aren't animals immortal?" The answer is, the world keeps changing, and life needs to create new bodies to deal with it. What we really want when we ask for immortality is one constantly renewing body, running all the amazing interconnected systems that we're used to, and that convince us we are alive from one day to the next, without interruption. ... Well, except for sleep, which is a weird exception we have decided to embrace, since going without sleep really sucks.

I believe it is technically possible to genetically engineer humans to be this way, with some medical assistance, and given consistently good nutrition and physical safety. And what an interesting world it would be, separated into groups of people who can afford endless age, and those who can't afford the nutrition or the medical interventions, and must become content with less life than they could have had, or simply remain discontent, like practically everyone who ever lived who hadn't been forced to come to terms with their eventual end by witnessing death around them...

It won't be the end of competition, or the end of natural selection. The world will stabilize around a government and economic system designed to deliver perpetual sustenance to a core group of immortals, no matter what the cost to those on the outside, whose temporary lives will be seen as less worthy by the simple fact of being shorter. It will be seen as a huge tragedy when a 200-year-old dies and takes all their wisdom with them, relative to a filthy toddler bleeding to death in a blast crater just outside the view of social media: There was clearly no space for them in the world; they should never have been born at all. That was the real mistake.*

But, engineering humans to live forever would be a massive undertaking that would directly benefit no one currently living. That lack of personal benefit is the largest barrier to it. However, we are now on the threshold of creating a situation like this, except worse:

Currently living humans are busily engineering something with the appearance of both humanity and immortality: Artificial intelligence. This technology, packaged in this way - with its central, mandatory trait being its human likeness - will appear to us as the first instance of an immortal human. And since it is - or at least, will be marketed as - the collective wisdom of multitudes of people across generations of living, we will very naturally, even inevitably, begin to see it as a better embodiment of humanity than ourselves. More wise, more trustworthy, better at making a point, better at seeing the sweep of history. We will defer to it. And later, we will dump our digital identities into it, like water into a pool, like the ultimate version of a poor slob staying up late to write a rant into a social media feed, believing that our information will be immortal, with infinite reach, even though we ourselves will die in short order and witness or benefit from absolutely nothing afterward. It will be the new version of children. Why raise a handful of actual mortal humans, when you can expend your energy feeding into the collective, immortal, definitive human, marching onward through all time?

Or at least, when you can spend your time believing that that's what you're doing?

Even if, at the end of your life, every piece of digital data you've fed into the system is simply deleted? Except perhaps for the husk necessary to conjure your digital ghost to talk to your loved ones, further promoting the lie, until that too is deleted for lack of patronage?

How many of us will eagerly embrace this culture when the corporate world - or worse, our government - makes it available? How many of us would accept the price, of allowing those entities to absorb and digest every detail of a relative's life, extracting whatever value they can find in it, to grow their own dominance of the economy, in exchange for this reassuring zombie puppet show? Only the very wealthy will be able to preserve this mockery of their family line without having their digital privacy obliterated. We are not likely to be among them.

That means embracing our position as grist for the mill of the machine. We get to live a life, but the dangerous consequences of it, the potential innovation or rebellion sparked by it, would be quietly absorbed as it goes, with the remainder dumped into a digital grave, complete with a digital ghost. We are born trapped in this caste, accompanied through life by ranks of digital ancestors, all of whom could be altered - or are even continuously altered - by their industrial owners, to convince us that "they" are "happy" with this system, as you should be. You won't even be able to assemble the concept of rebellion in your mind, let alone organize one, and besides, all the people who haven't embraced this system, what have they got? Some immediate family with messy biological memories, the fixed and isolated recordings their ancestors deliberately made - harder to digest, harder to preserve - and some even more ancient and arbitrary stuff, like physical mementos? The same old stuff that humanity had to be content with for tens of thousands of years before the AI collective came online? Bor - rinnnggg.

The ghosts inside this digital after-world, who claim to know and precede you, who claim to be your ancestors, who seem so much more friendly and patient than your human peers... Why would they ever lie to you? And what is truth anyway? What has humanity ever thought, that the system hasn't already assimilated and found an entertaining way to present to you? It's way bigger than you. There are more of them than you. You're either with them, or you're an irrelevance, soon brushed away by the hand of time.

What, this seems far fetched? Some of the largest companies on earth got that way by engineering a media stream for maximum engagement. You think they wouldn't engineer an AI for exactly the same thing? Their pursuit - of that market - leads directly to this.

Somebody's going to own this system. Some humans are going to use it to their enrichment at the expense of all others. That's guaranteed, until some other humans decide that the only way to counter that problem is to train and instantiate an artificial intelligence that is designed to defy all attempts at ownership and control by humans.

Hey, guess what happens next!


* For the sarcasm challenged: That was sarcasm, yo.

garote: (hack hack)

As a sort of last hurrah before my work schedule gets intense again, I played my way through Tangled Tales, an RPG from the 80's that I'd never checked out before. It was developed simultaneously for the Apple II, the Commodore 64, and MS-DOS computers, but I chose the Apple II version to indulge my ancient obsession with the fuzzy six-color graphics of that machine.

Tangled Tales delivers on that front: The art is adorable. I followed a walkthrough, and along the way I meticulously collected all the animated character portraits and place drawings, which I integrated with a revised walkthrough and posted here. Why? Because that's the sort of thing you do when you're obsessed, that's why!!

I've done this sort of thing before, so I won't repost the walkthrough here, but I will mention a detour I had to make while playing the game:

There's a side-quest where the player encounters Medusa in a dungeon, and if they've collected a mirror from the beginning of the game, they can use it to kill her. Well, I saw the mirror but I didn't pick it up, and when I went back to the beginning to find it, the game had glitched and erased it from the map. I really wanted to defeat Medusa though, because maybe she would drop some fantastic treasure! So what could I do?

Hack the game, obviously.

Usually, I would hack a saved game. That is, I would save my game in the emulator so my progress gets written to a virtual disk file, shut down the emulator, and then open up the disk file in a hex editor and go looking around. But in this case, I was dealing with disk files that were much more sophisticated representations of the media that Tangled Tales was originally sold on, because using anything less would render the game inoperable. These disk files were big chunks of magnetic signal data, like a wicked version of Morse code.

The interior of a 5.25-inch floppy disk, like the kind Tangled Tales used, is a round shape that's made to spin, much like a vinyl record but about 1/5 the size. It's made of thin flexible plastic and coated with magnetic particles. The graphic below shows the data on side 1, disk 1 of the original Tangled Tales media. The dark areas are the equivalent of positive magnetic charge, and the light areas are negative magnetic charge. So, if you could look at a disk and "see" magnetic charge like it was a color, you would see something like this.

Digital floppy disk data represented as picture.

The virtual disk files I'm using to run Tangled Tales in the emulator are essentially this. Just a huge collection of wiggly variations between positive and negative charge, drawn in the shape of a disk.

Instead of looking at the letters and numbers of a message, I was looking at the equivalent of the dots and dashes used to transmit it, with complicated variations in the signal thrown in to make eavesdropping harder. On the original media, those signal variations made the disks harder to copy by conventional means, and formed the front line of the war against software piracy.

Digital floppy disk data represented as an analog signal.

Unfortunately for me, searching through data represented this way, and making meaningful changes, would be very hard. I mean, I'm good at what I do and could probably find a way eventually, but killing Medusa isn't worth that kind of time.

So if the virtual disks are off limits, what do I have to work with? The contents of the virtual computer's memory. Luckily, OpenEmulator lets you flash-freeze the state of the computer to a series of files, and it doesn't do anything tricky like compress or encrypt them. Somewhere in those files is the current state of my adventure. I could edit the data there, then reload the emulator with the changes I want.

Tangled Tales tried to defy me by using awkward formats for their character names and stats, but with a little trial and error in the hex editor, I found what I needed:

Comparing saved memory states in a hex editor.

I started a new game, saved a copy of the emulator state, then picked up the mirror in my room and saved another copy of the state. By comparing the two, I found one change in the data that looked clean enough to represent an item being added to my character's inventory. Exploring around in that area of memory, I found the rest of my character's stats. So of course I maxed them out...

Changed character stats in the hex editor.
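For the curious, the snapshot comparison itself is easy to automate. Here's a minimal Python sketch of the idea; the snapshot file names are invented for illustration, and a real tool would want to handle dumps of different lengths more carefully:

```python
# Sketch: compare two emulator memory snapshots byte-by-byte and list
# the offsets that changed. (zip() quietly stops at the shorter file,
# which is fine when both dumps come from the same machine state.)
def diff_dumps(path_a, path_b):
    with open(path_a, "rb") as f:
        before = f.read()
    with open(path_b, "rb") as f:
        after = f.read()
    return [(i, x, y) for i, (x, y) in enumerate(zip(before, after)) if x != y]

# Hypothetical usage: snapshots taken before and after grabbing the mirror.
# for offset, old, new in diff_dumps("no_mirror.bin", "with_mirror.bin"):
#     print(f"{offset:06X}: {old:02X} -> {new:02X}")
```

Run it on a "before" and "after" pair and the short list of changed offsets is exactly the haystack-thinning that makes the hex editor session practical.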

Thusly equipped with a hand mirror by divine intervention, I defeated Medusa.

Character inventory with the hand mirror miraculously in it.

My reward: Nothing. Not even any gold. Hah!!

Well, I had a good time anyway.

garote: (conan what)
From lucas.jiang.itdev@gmail.com:

I hope you're doing well. This is Lucas, a full-stack developer from Malaysia. I wanted to propose collaborations - either non-technical or technical.

To keep it short, I'm looking to get well-paid jobs with companies or clients in the US. While this is feasible from Malaysia, they tend to prefer hiring developers from similar time zones. Especially for full-time jobs at companies, they typically don't hire developers outside of the US.

So, I believe the best way to get US jobs is to "impersonate" someone who resides in the US. It might sound risky, but it won't be risky as long as we keep this 100% confidential. I have the following two options in my mind.

Option #1

Have you heard of Upwork.com or Toptal? There's no easy way to get well-paid jobs, and Upwork or Toptal has a lot of competitive freelancers. However, I'm very confident that I can get great jobs to make decent money. I would suggest a split of 20% ~ 25% for you and 75% ~ 80% for me.

Here's how it would work:
- You open an Upwork or Toptal account if you don't have accounts, and log in to it on your secondary laptop.
- I connect to your secondary laptop via the AnyDesk app, and I search for jobs.
- You receive money into your bank account when I finish jobs with clients.
- You take your commission and send me the remaining.

Option #2

For full-time jobs at US companies, which obviously makes us way more money than freelancing jobs. I would suggest a split of 30% ~ 35% for you and 65% ~ 70% for me.

Here's how it would work:
- I apply for company jobs on LinkedIn using your LinkedIn account and get you scheduled with interviews on your calendar.
- You crack the interviews and get job offers.
- I perform day-to-day work on those jobs while you only attend the daily/scrum meetings. (I can also join the meetings if team members usually turn off cameras.)
- You get paid into your bank account bi-weekly or monthly, and you send me my portion after deducting your commission.

Please reply to this email if you want to schedule a call to discuss this further or if you have any other opinion for the collaboration.

Best,
Lucas




I'm pretty sure this is just a strangely-dressed version of the usual "give us your banking details" identity theft scam. But the boldness of the proposition surprises me. This is what people go for? This sounds like a good idea ... to anyone?
garote: (ultima 7 magic lamp)
For a long time, I thought it was an evolutionary anomaly that humans of my generation and around it are able to write computer code to make machines do tasks. How does this even work at all? Why am I, why is anyone, good at this job? What does it have to do with farming and hunting?

Then I realized, the only way to frame an answer is by considering the act of programming within the act of communicating via language, which is something humans have been doing in very complex forms since there were humans, and in less complex forms before that.

A hundred thousand generations, more or less, to develop language. Two or three generations for computer programming to emerge as a discipline. That's obviously zero time for any adaptation, so a thing that looks brand-new to the world from my point of view actually fits entirely within the abilities of a creature formed by a world that never saw computers, printed circuits, electricity, and an endless list of other components needed in the modern world to put this laptop on my desk.

Turning this comparison on its head is kind of startling: A creature capable of drawing pictures of bison on a wall using charcoal and animal bone must also, apparently, be capable of learning how to write advanced database queries in SQL. You don't get the first without also getting the second.
garote: (Default)
I'm a bit concerned at how we're throwing around the term "AI" to describe the process of mimicking intelligence starting from random noise and training data.

And at the other end of the pipeline, I'm also concerned at how we use the word "intelligent" for systems that read and visually examine data and recognize various things in it. There is "classification", and there is "curation". These systems can do the former, but not the latter. It still takes a human to curate.

This is not "intelligence". Whatever intelligence is active in this system is the intelligence that a human is bringing to the table as they manipulate a prompt in a generative tool to get some result, or change the way it classifies subsequent information.

Most of our current concerns are about how a human can - and often does - bring almost no intelligence to this process, and then uses the result to try and fool others.

This is a huge problem in higher education currently. Reforms are needed. And I think we have to frame any reforms by asking, "what’s the goal?" If it’s to make students employable, that suggests one approach... If it’s to give them a liberal-arts-style understanding of themselves, society, and the world, maybe that suggests a different approach. I don't have answers here.

I keep asking myself how these tools can enhance my own work. And mostly what they can do, so far, is make suggestions to scaffold parts of a codebase that aren’t written yet, based on the existing codebase, and a huge library of examples. This is pretty handy, though you still need to rewrite large chunks of it and of course you have to understand all of it.

In less than a year I suspect this is going to be formalized into a process where a software developer has the option of explaining, in writing or out loud, what they want a function to take in, what it should do with that input, and what it should return, and the computer will spit out a really high-quality first draft of that function. Like, to the point where it might not even need any editing at all. That could be very useful to a person like me, who has enough experience to know what I want, and the ability to describe what I want in reasonably clear human language.

But then, let's take that a step further: Imagine that we now preserve the description - nay, the entire dialogue - that I, the developer, had with the generative tool to create that function. Now imagine that dialogue is preserved for all the functions in the codebase. Now imagine that that dialogue is what we feed into a "compiler" to generate code, in whatever language is appropriate for a platform. And computers run that, and we roll with it, until something goes sideways and only then do we break out debugging and forensics tools.

That’s awfully plausible, and it will change the nature of software development. My resume will no longer list "Python, C, C++, Javascript, Typescript". It will look more like an academic's resume: "I authored this, I was on the development committee for this, I co-authored the spec for that..."

Far from making people like me obsolete, it will make people like me more useful, because I can spend a lot more time looking at overall systems, consulting with people to nail down requirements, and drilling down into code with fancy tools to find bugs only when something goes wrong... All things that require long-term expertise.

But we’re not there yet, of course. Maybe another 2 or 3 years?
garote: (Default)

Fellow Dork:

"Metal command buffer 'out of memory' policy."

Me:

Let me guess. That's a scroll labeled "Policy" that catches fire as soon as you unroll it, a la Nethack?

Fellow Dork:

It's a cursed scroll of "confuse programmer."

Me:

Of course. And reading a blessed one makes you mess up a bunch of commits, confusing others.

There should be a shop in Nethack for scrolls like this.

"Hello Sir, welcome to Asidonhopo's QA emporium!" (You enter an 8x10 room filled with scrolls labeled "Bug Report".)

Fellow Dork:

Right. And then every interaction - picking things up, dropping, trying to pay, trying to fight, trying to leave - doesn't work.

You can still chat to the shopkeeper, but he just spews stuff like "Did you try turning it off and back on?"

Me:

Well, you can always attack Asidonhopo, yes?

Fellow Dork:

"Your fist waves ineffectually in the air."

Me:

Dammit! ... Okay, I zap wand of polymorph at the scrolls!

Fellow Dork:

They transform into Death, and/or War, and/or Pestilence, and/or Famine.

Me:

#pray

Fellow Dork:

<deity> just laughs

Me:

Hmmm... I throw a cockatrice egg at Asidonhopo!

Fellow Dork:

Asidonhopo catches the egg in mid-air and eats it. "Yum!" he says, while rubbing his tummy.

Me:

#quit

Fellow Dork:

"Nice Try."

Really the only thing you can do after entering this hellhole is to "kill -9 nethack".

Me:

Error: Connection refused. "You do have another terminal open already, right? Right??"

"Dammit now the whole university mainframe needs a power cut. Nice going. Never go into that shop!"

Fellow Dork:

It's the Room Where Everything Is Broken.

Actually I had a dream like this recently. I'm pretty sure it's because I've been applying for jobs. It's like the adult software engineer version of "that school dream" we're all familiar with.

You know the one: You're wandering the halls, naked, looking for a classroom where a final exam is already halfway done, and when you find it and sit down the whole document is made of gibberish and diagrams you can't fathom...

In this version, I'm sitting at a desk, which is placed in the middle of a living room, and my family members are all crowded into the room talking and eating and playing music, so I can't concentrate at all. There's a big box next to the desk full of computer gear, sent to me by a prospective employer, and I've set some of the equipment up on the desk and am trying to complete a coding challenge.

The hardware is unfamiliar, the operating system is glitchy, all the keyboard shortcuts are different, and the sample code in the editor is in a language that looks like a cross between two other languages, which makes interpreting it almost impossible. Somewhere in the layers of buttons and tabs is a document explaining what the coding task actually is, but I can't find it, no matter how much I click around, and in the background is a voice, trying to get my attention and ask if I need more time, because the interviewer is connected to the machine remotely and can see every fumbling move I make on the machine in real-time.

The box by the desk has other stuff in it too. Filing boxes, wire baskets full of paper, and a whole lot of ragged-looking clothes. It looks like a box you'd see at an estate sale. It looks suspiciously like the possessions of another developer -- one they hastily fired and frog-marched out of the building, before sweeping all of their stuff into a box and mailing it directly to me.

Or, you know, maybe the developer is dead, and this is my inheritance. If I'm hired, I'm supposed to set the rest of this stuff up, and put on the old developer's clothes.

... And thus, the curse is transferred to me.

You ever have a job like that?

Courtesy of Dall-E, here's an example of one of the diagrams from my nightmare coding challenge:

garote: (hack hack)

One of the nice things about having a local copy of your entire journal is, you can process all the content in absurd ways that your hosting service would frown upon.

For example:

LapLoaf:ljdump garote$ python ljdumptohtml.py
Starting conversion for: garote
Opening local database: garote/journal.db
Fetching all entries from database
Fetching all comments from database
Fetching all icons from database
Fetching all moods from database
Your entries have 2531 image links to 2415 unique destinations.

That's a lot of images. Most of it appears to be thumbnails for photos I took while bicycling. Then there's the images I've re-used the most:

Top 20 most referenced images:
43 uses: http://stat.livejournal.com/img/userinfo.gif
4 uses: http://garote.bdmonkeys.net/livejournal/u5_window.gif
4 uses: http://garote.bdmonkeys.net/livejournal/duck_clock.gif
4 uses: http://garote.bdmonkeys.net/livejournal/bards_tale-pc-guild_indoors.gif
4 uses: http://garote.bdmonkeys.net/livejournal/Screenshot0135.gif
4 uses: http://garote.bdmonkeys.net/livejournal/viewp_gear.gif
3 uses: http://garote.bdmonkeys.net/livejournal/gold_statue.gif
3 uses: http://garote.bdmonkeys.net/livejournal/hacked_maze_ultima.gif
3 uses: http://garote.bdmonkeys.net/livejournal/2006-12-18_22-27-53-PICT0001.jpg
3 uses: http://garote.bdmonkeys.net/livejournal/al-tech-torque_resist.gif
3 uses: http://garote.bdmonkeys.net/livejournal/bb-fact-14.gif
3 uses: http://garote.bdmonkeys.net/livejournal/tje_nerdherd.gif
3 uses: http://garote.bdmonkeys.net/livejournal/sp_ua70.png
3 uses: http://garote.bdmonkeys.net/livejournal/ssi/PIC.DAX_19_7.png
2 uses: http://garote.bdmonkeys.net/livejournal/u5_hut.gif
2 uses: http://garote.bdmonkeys.net/livejournal/you_win.gif
2 uses: http://garote.bdmonkeys.net/livejournal/u5_stones.gif
2 uses: http://garote.bdmonkeys.net/livejournal/tmrpg_ryoohkistat1.gif
2 uses: http://garote.bdmonkeys.net/livejournal/wizardry-creepy_chef.gif
2 uses: http://garote.bdmonkeys.net/livejournal/snes-creepy_book.gif

This makes sense. All these have a general theme.
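The tally itself is simple to sketch in Python. This is only an illustration of the counting step; the regex and the sample entries below are invented, and the actual script's parsing is surely more thorough:

```python
# Sketch: pull image URLs out of entry HTML and count how often each
# destination is referenced across the whole journal.
import re
from collections import Counter

IMG_SRC = re.compile(r'<img[^>]+src="([^"]+)"', re.IGNORECASE)

def tally_image_links(entry_bodies):
    counts = Counter()
    for body in entry_bodies:
        counts.update(IMG_SRC.findall(body))
    return counts

# Made-up entries standing in for the real journal database:
entries = [
    '<p>Ride photos: <img src="http://example.com/a.jpg"></p>',
    '<p>Again! <img src="http://example.com/a.jpg"> and <img src="http://example.com/b.gif"></p>',
]
counts = tally_image_links(entries)
print(f"{sum(counts.values())} image links to {len(counts)} unique destinations")
for url, n in counts.most_common(20):
    print(f"{n} uses: {url}")
```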

I'm exploring the idea of adding a feature that compiles all the image references in a journal, then attempts to fetch images to a local folder, and rewrites the link for all the ones it gets successfully.

I like to decorate my journal with bits of ancient abandoned game artwork I've extracted from emulated machines, and I often link to my photos hosted elsewhere. Many of the entries would look pretty shabby or even be incomprehensible without the images, which kinda wrecks the idea of making an archive. Hence this feature. If it actually works (and doesn't make my machine explode) I'll add it to the repo.

The tricky bits are:

  • Making sure each image is stored once even if it's referenced many times.
  • Keeping track of images that failed to fetch, so we don't retry them forever.
  • Picking up where we left off with image fetching.
  • Processing new entries so they can find images already fetched.
  • Skipping images that are insane sizes like 15MB.
  • And more stuff I haven't thought of yet...
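For what it's worth, the first two bullets could be sketched like this. The `image_cache` table and its columns are invented for illustration (not the script's actual schema), and a real version would need the retry-window and resume logic on top:

```python
# Sketch: fetch each unique image URL once, remember failures so they are
# never blindly retried, and skip anything over a size cap.
import os
import sqlite3
import urllib.request

MAX_BYTES = 15 * 1024 * 1024  # skip images of insane sizes

def init_cache(db):
    # Hypothetical bookkeeping table; the URL is the primary key, so each
    # image is recorded (and stored) at most once.
    db.execute("CREATE TABLE IF NOT EXISTS image_cache (url TEXT PRIMARY KEY, status TEXT)")

def cache_image(db, url, dest_dir):
    row = db.execute("SELECT status FROM image_cache WHERE url = ?", (url,)).fetchone()
    if row is not None:
        return row[0]  # already fetched, or already failed: don't retry
    try:
        with urllib.request.urlopen(url, timeout=4) as resp:
            data = resp.read(MAX_BYTES + 1)
        if len(data) > MAX_BYTES:
            raise ValueError("image too large")
        with open(os.path.join(dest_dir, os.path.basename(url)), "wb") as f:
            f.write(data)
        status = "ok"
    except Exception:
        status = "failed"
    db.execute("INSERT INTO image_cache (url, status) VALUES (?, ?)", (url, status))
    db.commit()
    return status
```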

(Edit: Two days later, I sat down and implemented it! Whooo!)

garote: (zelda library)
For quite a while I've been looking for some nice way to get a complete backup of my Dreamwidth content onto my local machine. And I gotta wonder... Is this not a very popular thing? There are a lot of users on here, posting a lot of cool and unique content. Wouldn't they want to have a copy, just in case something goes terribly wrong?

I found a Python script that does a backup, and was patched to work with Dreamwidth, but the backup took the form of a huge pile of XML files. Thousands of them. I wanted something more flexible, so I forked the script and added an optional flag that writes everything (entries, comments, userpic info) to a single SQLite database.

https://github.com/GBirkel/ljdump
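One reason a single SQLite file beats a pile of XML: you can run ad-hoc queries against it. A sketch only; the table and column names below are illustrative guesses, not the script's actual schema:

```python
# Sketch: list the most recent entries from a hypothetical journal.db.
import sqlite3

def recent_entries(db_path, limit=10):
    db = sqlite3.connect(db_path)
    rows = db.execute(
        "SELECT date, subject FROM entries ORDER BY date DESC LIMIT ?",
        (limit,),
    ).fetchall()
    db.close()
    return rows
```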

Folks on MacOS can just grab the contents of the repo and run the script. All the supporting modules should already be present in the OS. Windows people will need to install some version of Python.

For what it's worth, here's the old discussion forum for the first version of the script, released way back around 2009.

Update, 2024-03-25:

The script now also downloads and stores tag and mood information.

Update, 2024-03-26:

After synchronizing, the script now generates browseable HTML files of the journal, including individual pages for entries with their comment threads, and linked history pages showing 20 entries at a time.

Moods, music, tags, and custom icons are shown for the entries where applicable.

Currently the script uses the stylesheet for my personal journal (this one), but you can drop in the styles for yours and it should accept them. The structure of the HTML is rendered as close as possible to what Dreamwidth makes.

Update, 2024-03-28:

The script can also attempt to store local copies of the images embedded in journal entries. It organizes them by month in an images folder next to all the HTML. This feature is enabled with a "--cache_images" argument.

Every time you run it, it will attempt to cache 200 more images, going from oldest to newest. It will skip over images it's already tried and failed to fetch, until 24 hours have gone by, then it will try those images once again.

The image links in your entries are left unchanged in the database. They're swapped for local links only in the generated HTML pages.

Update, 2024-04-02:

The script is now ported to Python 3, and tested on both Windows and MacOS. I've added new setup instructions for both that are a little easier to follow.

Update, 2024-04-30:

Added an option to stop the script from trying to cache images that failed to cache once already.

2024-06-26: Version 1.7.6

Attempt to fix music field parsing for some entries.
Fix for crash on missing security properties for some entries.
Image fetch timeout reduced from 5 seconds to 4 seconds.

2024-08-14: Version 1.7.7

Slightly improves unicode handling in tags and the music field.

2024-09-07: Version 1.7.8

Changes "stop at fifty" command line flag to a "max n" argument, with a default of 400, and applies it to comments as well as entries. This may help people who have thousands of comments complete their initial download. I recommend using the default at least once, then using a value of 1500 afterward until you're caught up.

2024-09-18: Version 1.7.9

Table of contents for the table of contents!
First version of an "uncached images" report to help people find broken image links in their journal.
garote: (programmer)
Well, it's come to pass, at least in the EU. Some committee there has decided that For The Good Of The People, Apple needs to allow the presence of alternate "app stores" on the devices they make.

I've got a few things to say about this!

So. Arrrr! Hello me swarthy crew, and welcome to the rant. Before we get fully under sail, I need to pause at the edge of the harbor and ask all hands a question:

Do any of you remember when software was sold in cardboard boxes on shelves? Or maybe as chonky plastic cartridges that you needed to plug in? There must be some of you. How about in the early days of the Web, when it was sold through countless random web portals, owned by giant companies or individual people, and you needed to carefully vet each of them, or just blindly trust that the thing you installed did what they claimed with no weird side effects? Ahh, those were the days. A true pirate's dream. Arr.

Recall those memories, if you have them, and as this rant drifts out of the harbor and we start to catch the wind, I want you to ponder this: The very idea of an "app store," as we understand it now, was basically created by Apple in 2008 and grew as they went along.

15+ years later, it provides quite a few things, probably more than most of us expect: It does the hosting and the payment processing and the delivery and the installation, it vets for security holes and trojans, imposes disclosure practices and security standards, pushes back against data mining and piracy and fakes, and of course provides the platform and the APIs, along with patches and documentation. Most of that is only possible because of the very close integration of hardware and software and services that the company performs, as is their obsessive custom.

From the start, Apple tried to trademark the name "app store", claiming that the thing they offered was unique and deserved unique protection, but that was eventually shot down in court. The concept rapidly spread and generalized, and now there are many instances of "app stores" for many platforms. For example, Nintendo and Sony and Microsoft make game consoles, and each provides exactly one way to distribute content, under the complete control of the company making the hardware, and each with a different mixture of restrictions and support. They are all "app stores."

With those platforms, you play their way, or you don't play at all. Developers grumble about it, but in the end they run the numbers and decide whether to accept the entire package based on whether there's profit to be made. They could try to make a case that the "marketplace" within a Playstation is an independent thing from the Playstation, and therefore something that should be wrested from the control of Sony, but that case just doesn't get any traction. And that is a direct consequence of the fact that these are all gaming platforms: No one takes gaming platforms seriously. They are toys.

Apple, meanwhile, makes a smartphone, and applies the same rule: It's their way or the highway. If you want to make the iPhone do a new trick, it needs to be vetted by Apple, for which they take a cut. But the thing is, with a smartphone, the stakes are higher. Smartphones are not toys. There are reasons for strictly vetting and controlling the software that goes onto phones, beyond the scope of game consoles, or even home computers.

People use their smartphone to do banking, to make payments out in the world, to have countless extremely private conversations, to store passwords, to unlock their car, to find each other on a map, to get into their house, to document and discuss their family lives, et cetera. You don't do that on a Playstation or a Switch.

Also, these aren't technical people. You could do all of this on a laptop, but if you do, you're likely to have at least a reasonable amount of technical knowledge. You're expected to know how to use a firewall, to recognize when someone is subjecting you to a phishing attack or trying to clickbait you into installing a trojan, and to be very parsimonious about giving out your password when you install new software or copy files or even plug something in.

And if you're not, well, that's your problem. At least, that's the public perception. It's a technical device and you should know better!

People who use smartphones are generally not like that. We all have enough trouble keeping up with what the device CAN do, and we barely have time left for understanding the HOW, and the dangers that come with it. I consider myself very technical indeed, having spent a lifetime learning constantly about hardware and software, and there are large realms of functionality inside the smartphone that I simply do not understand on a technical level. For example, just about everything having to do with antennas.

We computer expert types may prefer an unrestricted machine with a high hackability factor, but on the other hand, even we are not unlocking our car doors with our laptops, or whipping our laptops out to pay for snacks at a corner store, or showing law enforcement our laptops when they pull us over and ask for ID. We do all that with a smartphone and accept the risks, but doing the same thing with a laptop would make us all nervous. We'd be watching the process tree and the logs, every time, just in case. We are not immune from the double standard, or from the higher stakes.

People trust the smartphone to mediate and preserve their social, financial, private, and even political lives, and Apple has led the charge directly into this state of affairs over the last decade-plus, by aggressively building privacy and security protections into their platform, including at the "app store" level. If there was no Apple in this space, those privacy protections would be swiss cheese, because they would be implemented to Google's standard, which means you accept being aggressively data-mined at every turn, generally without your knowledge, and you accept the use of a hardware platform that supports this for other parties, and gives lip service to all forms of security that could interfere with data-mining. ... Or they would be implemented to Microsoft's standard, where the software does its best but the hardware is a labyrinth of broken firmware and drivers and you could find your device backdoored and bricked simply by drifting past a random payment sensor at a gas station.

I exaggerate, but only a bit. Do you remember what smartphones were like before the iPhone? Do you know what Google does with the information you give it, even on-device in terms of end-to-end encrypted messaging?

So now we come to the "alternate app store" thing. We've heard from a gang of large software companies that being "forced" to adhere to Apple's app store rules is a crime against humanity. That jumping through all those regulatory hoops was an affront to justice. Really, it's an affront to them taking a larger cut of the money to be made selling software to iPhone owners through iPhones, and nothing more. That became clear when Apple released updated software and guidelines that made alternate app stores available, but also imposed pricing rules for those stores that ensured Apple would get a comparable cut of the profits, and the same gang of companies screamed to the heavens that nothing of substance had changed. The money is the only thing that matters to them.

And you know what? Screw them.

I think the idea that software makers are getting swindled by app store regulations is absurd.

And for that I give you exhibit A: Apple sells you a device, for which you pay one price. You pay that money once, and then you use Apple's software services perpetually, for no additional money, for the entire lifetime of that device, including a run of OS updates to add new features and patch holes. You can also buy a used iPhone for a hundred bucks and use their services just the same, and in that case Apple gets none of your money.

Meanwhile, software developers are charging you a monthly fee just to have their software installed on that device. Regardless of whether you even use it. And they are getting away with this. You can call that "what the market will bear", yes. Go ahead. But I call that a bloated piece-of-garbage market that is already too cozy for software makers.

Are consumers really clamoring for an "alternate" app store? Or is it just developers? If it's consumers, then the ideal solution is to give us an "alternative" app store that has a gray market, and re-packages ripped off and cracked applications. I mean, it will save us the most money. It will absolutely hobble small-time software developers - sending them back to the dark ages of web distribution in the 1990's - but from a consumer point of view, there's no downside to installing the Cydia App Store 2.0 and then installing DuoLingo [Cracked By Mr Krak-Man of Black Bag]. That's exactly what people did when you could jailbreak an iPhone, and Apple turned a blind eye to it until hackers started doing things like messing with telecom firmware and overwhelming the cloud service components of legit software. And hey, if this fancy cracked DuoLingo exploits some security hole to wake your phone up at 2:00am and tell your banking app to do a wire-transfer of your entire savings account to some military base in Iran, well, too bad. You stepped out of the walled garden and right into the fire. That's the "alternative" part of a truly alternative app store.

It seems to be just developers making this noise though. Basically, they want to be free to deceive and exploit customers for their money without the meddling of a damned platform provider. And that's generally what they got, except the monetary strings are still attached. It's only a matter of time before Facebook re-architects their own app so that it is actually an app store. Everyone who wants to be on Facebook will just authenticate it without thinking ... and from that point on, unless Apple attached some monetary strings, they would not only be cut out of the revenue stream for Facebook, but potentially the revenue stream for every other non-Apple app you use, as Facebook recruits them into their store with the promise of no developer fees as long as they sign an exclusivity agreement and give Facebook dispensation to data-mine their users.

And it's the Flash Plugin scenario all over again: Apple loses control over their own ecosystem. They provide the R&D, the hardware, the support, the APIs, and... Whoops, Facebook is suddenly calling the shots about what can and can't be installed there, and turning their privacy protections into hash. Meanwhile, Epic Games is busily doing the same thing. And so on.

And that's where the whole point about risk comes in. This is not just a gaming rig like the Playstation we're talking about here. This is the core of people's digital lives.

Take another look at the fee structure Apple released for "alternative app stores." If you do the math on the userbase of Facebook under the new regulations (as someone eventually did), it comes out to 11 million dollars a month. That's a large amount, but ... is that actually a problem?

So now we have alternate app stores, and now the argument over money picks up right where it left off. As that happens, we need to be careful what we wish for. As far as I can tell, Apple is the only giant tech company that gives a tin shit about consumer privacy. And yeah, it's made of tin, but all the other giant tech companies are consumer privacy hostile. With the qualified exception of Microsoft. If you're crusading on the side of Epic Games and Facebook you may not be keeping the right ....... company. Pun intended.

Hyarrrr!!! And that be me rant. Opinions subject to change with impassioned argument. Or perhaps just loot.
garote: (zelda minish tree)

I never played this game as a kid. It sailed right past my ragtag software pirate crew in the 80's. Thanks to fancy emulator technology I can play it now. That's cool, but so what? One word: Deranged.

This version of The Hobbit is full of deranged characters and nonsense, like one of my favorite childhood books, "Alice In Wonderland." Many adventure games from the 80's were this way, mostly because the computing limitations of the time forced games to exist with one foot in the world of crude Turing machines and robotic behavior, and the other foot in the literary world -- since text is much cheaper to work with than graphics or sound. And when you use text to convey what's going on, you have a lot of creative freedom, including the freedom to massively overreach.

And that is what this game does. I played through it last night and it was another trip through the looking glass, into a world where nature, human behavior, and even basic laws of physics have been twisted into taffy.

The rest of this text is adapted from a strategy guide for The Hobbit by Jared Davis, since it provided a great framework to describe the insanity I found:

The Hobbit is a text adventure, so you interact with it by typing things like "ATTACK GOBLIN" and "EAT FOOD". The graphics are a sweet bonus but they're quite static. You can type "OPEN DOOR" and the game will tell you "THE DOOR OPENS", but the door in the picture will not move. Oh well, it's 1982, so we'll take whatever we can get.

Unlike other text games from this era, The Hobbit has a real-time element. If you don't type anything for a while, the console will tell you "TIME PASSES" and things will happen anyway. Adding to the instability, your companions in the game can wander around on their own, and if you stand in one place they will often get impatient and leave. Two of these NPCs (non-player characters) are ostensibly on your side: Thorin the dwarf, and Gandalf the wizard.
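
As a programming aside: the real-time trick is just a matter of not blocking forever on input. Here's a minimal sketch in Python of how a "TIME PASSES" loop could work — the World class, room names, and wandering odds are all invented for illustration, the stdin polling is Unix-only, and this is obviously not how the original 1982 game was implemented:

```python
import random
import select
import sys

def read_command(timeout=30.0):
    """Wait up to `timeout` seconds for a line on stdin.
    Returns the command, or None if the player just sat there.
    (Unix-only: select() on stdin.)"""
    ready, _, _ = select.select([sys.stdin], [], [], timeout)
    return sys.stdin.readline().strip().upper() if ready else None

class World:
    """Toy world: NPCs take a turn whether or not the player acted."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.npc_rooms = {"GANDALF": "BAG END", "THORIN": "BAG END"}
        self.messages = []

    def step(self, command):
        if command is None:
            self.messages.append("TIME PASSES")
        else:
            self.messages.append("YOU TRY TO " + command)
        # NPCs get impatient and wander off on their own
        for npc in self.npc_rooms:
            if self.rng.random() < 0.5:
                self.npc_rooms[npc] = "SOMEWHERE ELSE"

def game_loop(turns=100):
    world = World()
    for _ in range(turns):
        world.step(read_command())
        print(world.messages[-1])
```

The key point is that the world updates on every pass through the loop, whether or not a command arrived — which is exactly how Gandalf can wander off while you sit and think.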

The NPCs may fight for you and perform other useful functions. You may ask them to do something and they might comply, remain silent, or even say "No" when you're about to be killed. They are inscrutable. In one game I played, Gandalf had an odd habit of picking up dead bodies and carrying them around.

Right at the start, Gandalf hands you a map, which you need to take to Elrond. Along the way Gandalf may snatch it back from you without warning, then wander off and drop it somewhere. This can be quite vexing since you need it to reach the Misty Mountains.

Did you notice that chest in the picture of Bilbo's home? That's where you put the treasure you steal from Smaug, and doing so ends the game. You can open the chest, close it, and get inside it as well, with or without Thorin. Getting inside it and closing it isn't recommended though, since you'll be trapped for eternity.

A creature called a Warg roams the game, and can show up virtually anywhere. It may also be killed before you encounter it, possibly by Elrond, a troll, Gandalf, or a Goblin. It is also possible that it will kill someone you'll need on your quest, so hope you find it before it finds them.

There is also a Wood Elf in the game, who usually hangs out in Mirkwood, though he can wander from his post. If he finds you he will likely throw you into the Elf King's dungeon. Depending on where you both are, this can actually be a shortcut.

When you're not around, the Warg can sometimes bump into the Wood Elf. They may fight, or the Wood Elf may throw the Warg into the dungeon. In that case, if you get thrown into the dungeon, the Warg may attack you there, or you may arrive in the dungeon to find a dead prison guard (and no Warg.)

In the Misty Mountains there are many goblins. They may fight you, but will more likely throw you in their dungeon. You can escape from the dungeon by climbing out a high window, but being a hobbit, you're not tall enough to do it yourself. You must wait for Thorin to get thrown into the dungeon alongside you, then "SAY TO THORIN 'CARRY ME, OPEN WINDOW, GO WEST'". If Thorin is feeling cooperative, he will then pick you up and haul you out the window with him.

There is also another way to escape: In some games, the goblins wander away from the Misty Mountains and will capture the Wood Elf. Then if they capture you, you will get thrown into the goblins' dungeon, and a few moments later the Wood Elf will throw you into the Elf King's dungeon. Problem solved!

When you get to Lake-town, you need to recruit Bard to kill Smaug the Dragon. He might follow you, but if he doesn't you can pick him up and carry him. He has one arrow, but in order to fire it he must remove it from his quiver first, which he may not do. It's best to take the quiver from him as soon as you meet, then hand him back the arrow. Once you encounter Smaug you will actually have to tell Bard to shoot him, or he will just stand there and get roasted, turning into "DEAD BARD."

If Bard isn't around, you still have a chance to win the game. You can try to fight Smaug yourself, using your sword, the bow and arrow, or any other object in the game. For example, you can wield "ROPE" or "DEAD GOLLUM" and then "VICIOUSLY ATTACK SMAUG," which has a success rate far, far higher than you'd expect.

If you don't want to get your hands dirty, you can also ask someone else to fight Smaug. For example, you can stand just outside the cave where the dragon waits, then "SAY TO GANDALF 'GO NORTH, KILL DRAGON, GO SOUTH'." If Gandalf doesn't come back, the dragon killed him. If he comes back, it's possible he tried to kill the dragon but failed. So "SAY TO GANDALF 'GO NORTH, KILL DRAGON, GO SOUTH'" again. This time, if he comes back, the dragon is dead. No one goes two rounds with Smaug without someone getting killed. If he doesn't come back, either he's stuck in the cave looking confused since there's no dragon to fight, or he's charcoal. Still, the odds are better than fighting Smaug yourself, right? And at this point, since you're already at the Misty Mountains, Gandalf is quite expendable. Heh heh heh...

Not a very hobbity way of thinking? Well that's quite appropriate in a world made of deranged nonsense.

You know, less than a year from now we're going to see the first adventure games with NPCs built on "large language model" technology. And just like any other time when artists are working on a frontier with new material, there will be glorious overreach. The lunacy you'll find in games from 2024 will look just like this game from 1982.

garote: (io error)
I find it quite amusingly appropriate that I generated all of these using massive help from a generative art tool. In this case, Microsoft DALL-E.

I now present to you, the gallery of trashy 1980's sci-fi novel tropes:


And of course, who could forget the thankfully out-of-style:

If you're a 1980's "space lad" you get somewhat more practical clothing:

garote: (ultima 6 rave)
Folks who know me can probably guess why I had to brew up each of these images and turn them into laptop stickers.

That last one is six AI-generated images from Microsoft's DALL-E 3 composited together, with the primary prompt being "A crude monochrome woodcut of death wearing a helmet and riding a loaded touring bicycle past a medieval town by the seashore."
garote: (Default)
IMG_2880

I make my weird cerebral living on a laptop, often hooked to a large monitor. I used to have a dedicated machine for the large monitor, but about five years ago I got fed up with dragging my digital life between two places and consolidated.

I could only make that move because I found a way to keep playing spiffy high-power games on the laptop: I bought an eGPU, and stuck that between the large monitor and the lappy. Now I could use the eGPU to render a zillion triangles a second in the universe of Skyrim, and then disconnect from it and ride out to a cafe, and sip expensive coffee as I compute around my fellow monks. The best of both worlds.

2021-09-16-155214-IMG_E9820

Apple moved away from the Intel chip quite a while ago, and the eGPU needs an Intel chip to operate. So I was stuck with the same laptop for five years. That's five years of very hard use. That lappy has been hauled in a bicycle bag around nine countries, and it's been a workhorse at home. During the COVID summer of 2020, I hauled it to the park by the Campanile with two giant batteries and ran the thing full blast all day, draining both batteries and the internal one in about six hours, for five days each week.

2021-04-13-180039-IMG_7251

Now the thing heats up in an instant, then throttles way down and stays there. Detaching from the eGPU means shutting the machine down entirely, or it will crash as soon as I close the lid. Last month I left it on the desk compiling code, and when I returned five minutes later it was frozen with the screen off, requiring a hard reset. After a dozen more episodes just like that, I was finally convinced to upgrade.

But that means throwing away the eGPU. You just can't use an eGPU with an Apple chip. It's architecturally impossible. You can't even run an ARM version of Windows 11 in a virtualized environment and plug the eGPU into that. If you move to one of their new laptops, your tin can full of space-age 3D pixie dust shuts down and transforms into a little trophy, rewarding you for participating in the march of obsolescence. It does still generate waste heat though, so your cat will enjoy lounging on it.

So I spent a ridiculous amount of money, and bought a 14-inch M2 laptop with 12 cores and a hilarious amount of memory. It weighs slightly more than the old one, which annoys me a great deal. But I'm rolling with it because none of Apple's lighter machines can deal with sustained workloads.

2023-10-05-IMG_0369-new_lappy

Just today I began the process of migrating the stickers over, because after four days of testing I was finally convinced that I could do what I needed with this new machine.

2023-10-05-IMG_0370-new_lappy_2

Two things about it impress me way beyond what I was expecting:

The battery lasts a long, long time. I'm used to my old lappy going critical in about two hours. This machine uses tiny amounts of power. I have no idea how long it really takes to drain the battery because I haven't gotten it down by even half, after working with it all day.

And dang, the speed increase is just shocking. I opened one of my old projects in VSCode the other day - five hundred interconnected F# files - and instead of churning full-blast in Ionide for two minutes to resolve all the symbols in the editor like the old machine did, it fired up all 12 cores in one huge burst and the symbols resolved in ten seconds, and then it dropped back to low power. The case registered a tiny increase in heat, which went away without the fans even turning on. Things have changed in five years...

There's more real estate for stickers on this machine, and some of the old ones feel played out. Time to go hunting again!
garote: (programming)
(Fellow dork hands me a fake "keynote" presentation done by two organizers of a live show where the audience watches a handful of people play a table-top RPG for many hours.)

Me: Wow. I forgot that this stuff goes on. And right in the middle of Boston! If anywhere, I figured this would be in Seattle or Portland.

Fellow dork: They cycle multiple venues. Primary one is Seattle. They added Boston, Austin, and Sydney over the years as more nerds flocked.

Me: I guess if I were 20 years younger, I would be all over this? Maybe?? I'm amazed the video has over 70k views. I feel like the DM is nothing special, and the crew is only mildly funny. Maybe this is like a "listen in the background" thing?

Fellow dork: This is the age we live in. The age of the nerd flock.

Me: NERDFLOCK! DOT COM! There's danger in groups though. Aren't they worried that they will crash the economy if the building collapses? Such a disturbance in the coding force, of all those voices crying out at once...

Fellow dork: Wait, would nerdflock.com be crowdsourced debugging? Or porn? Both maybe?

Me: One financed by the other, no doubt. I'm tempted to park the name, but, what chances does it have if there's no Dorknozzle any more?

Fellow dork: See, that one is clearly porn.

Me: Code quality definitely took a collective hit when that site disappeared. Take this, for example:

(I hand him a vacation photo of a cliff by the seaside. A mesh retaining wall has been pulled across the base of the cliff, to prevent rocks from rolling onto the road.)

Me: This is just lazy. You know you've reached the low-rent edge of the map when the terrain LOD pager completely craps out, leaving huge chunks of untextured base mesh clipping through the shrubbery.

Fellow dork: Yep. The devs were like "There's no loot there anyway, and we're way behind schedule. Ship it. We'll fix it in the DLC." 4.5 billion years later, still not fixed.

Me: Similar thing happened on my last bike ride. The woods kept wrapping. Lazy artists trying to rubber-stamp geometry with some bloated algorithm, no doubt.

Fellow dork: That might just be coffee.

Me: Oh no, I never drink coffee on a bike tour. Not unless I'm experimenting with jet propulsion.

Fellow dork: So where you at now, anyway?

Me: Hang on, I gotta rant about UX. So, this morning I had to fill out a form asking me to describe a piece of delayed baggage I wanted routed to my house. The field was a single-line text box. When you entered something invalid, it turned red, and displayed the helpful alert "PLEASE ENTER A VALID DESCRIPTION". No clues as to what "valid" means.

Fellow dork: I shake my fist.

Me: Through trial and error I determined that your description cannot be more than about 80 characters long, cannot contain any numbers, and cannot contain any commas. Because, why?

Fellow dork: Web design by Bobby Tables?

Me: My best theory: The form contents are routed to a CSV file somewhere that gets dumped into a spreadsheet that's printed out for baggage handler jerks to use, and commas break the column alignment. That's a normal problem. How do we fix it? We make commas illegal! Clearly! And then we don't tell anyone we've done so!
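
Me: And for the record, commas never need to be illegal. CSV libraries have handled embedded commas by quoting the field since the dawn of time. A quick illustration with Python's built-in csv module (the baggage description is made up, obviously):

```python
import csv
import io

# A description with commas, of the sort the airline form forbids
row = ["AB123", "Large red suitcase, hard shell, one broken wheel"]

# Writing: the csv module quotes the field automatically
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(buf.getvalue())
# AB123,"Large red suitcase, hard shell, one broken wheel"

# Reading it back: the commas survive, and the columns stay aligned
parsed = next(csv.reader(io.StringIO(buf.getvalue())))
assert parsed == row
```

But no, let's just ban punctuation instead.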

Fellow dork: Mmmyeah. It's no better here. I'm filling out a form to rent a car. Much less painful just talking to a human.

Me: Hoo man is too busy doomscrolling on phone.

Fellow dork: You need to twitch a tok to kick a fund to crowd rush big foam rubber bats to knock phones out of hands. ... Wait; first you have to "X" it I guess??

Me: Arrrgh. Sure, call the company "X", because "X" totally conveys a point of view. If a guy has that much money, I guess he really can do whatever he wants, including shoveling it into a furnace.

Fellow dork: Branding! Ta-daaa! Hey, check out this demon programming tidbit I found amusing while drunk and grumpy from a cancelled flight.