Some Opinions on Making a Game
By: Face Rizzi
Tags: design, software_development, article, tutorial, basics
The content here is ravings and opinions about the linked content that precedes it. It should be taken as entertainment at best and madness at worst. It might be useful, but I sure wouldn't guarantee it. In particular, these statements are made in the moment and are not future-proof.
Reason for doing this
I am starting to make a tutorial about game making. That's based on a reality: after college, I lived and breathed wanting to make video games. I was held back more by the realities of attention deficit disorder than by any other thing, I think.
There are things that I do, that I do well, and that I can see other people struggle with. Then there are things like setting up configuration files. The preparation of this website was delayed by severe amounts of emotional stress over one such configuration. I can't account for about a week and a half's worth of work because of it.
If I display code that I've spent hours testing, code that someone can damn near copy and paste into a project and have it work, well... there's an insecurity that no one will take either element in the appropriate way. An opinion is an opinion, but the work is the work.
The overall reason for doing this is that society did work for me. Parts of it failed for various reasons, but I got lucky and got to be online with people from around the world who shared my kink.
Seeing other people act like me, and completely differently from me, but more or less struggling with similar things and being reasonable... well. Knock on wood. I still spend a lot of time to myself and romance has left me alone.
But more often than not, someone is walking the path I walked until I was thirty: they think they are normal, but they have an undiagnosed illness that makes them both feel and express emotions differently than everyone else does. I read a statistic somewhere that said about two people for every three in prison have undiagnosed ADHD.
So for what it's worth I want to give something back to the people who helped me get to this point, just by being themselves. Straight, gay, bi, lesbian, trans, nice or mean. But mainly just decent.
It's a more convenient language featuring automatic garbage collection, which is intended to keep programmers from getting into too much trouble. Take it for granted, get on with life.
The fact is that there are two qualifications I'm looking at specifically, and neither of them is good for the beginner. But there's a third, overriding one.
I kept going round and round, and what I found was an MIT lecture given by Grace Murray Hopper. It convinced me about what I had to say and why I had to say it, and that any amount of humor was justified.
The first qualification is that the language selected has gobs of fucking garbage code that has to function and is collectively too expensive to replace in the next five years. The next big promising thing in web software is WebAssembly. This compiles a file into a binary format that browsers have standardized on, which means it runs really damn fast, and the process of minification and optimization is standardized. The implications are profound; one of them is that, in theory, any language with an interface to compile to WebAssembly could feasibly be used to code web pages.
This language lived and died in three years. The main clue that it's a dead project is the label that it is no longer in active development on the home page. It's very pretty, it even sparkles.
You may have never heard of a language called Cobol. It's not necessarily a user-friendly language, because it is so flexible that you can write programs in a structure particular to the implementer. So when you go to train someone else, there's no documentation and no standard method of doing things. The world still needs Cobol because most of the banking systems run on that legacy code.
The second is a clear path to an executable. Python doesn't have that. You can build an open package that bundles an entire interpreter, the way Ren'Py does, or people will have to do what is necessary to install a Python .dll software library where your executable can see it. The point stands that there's an unexpected "gotcha" between a novice and a software package.
And that's a damn shame. Because if I were to choose how to run a freshman course, I would seriously consider building it around the tutorial for the Python programming language. Because for each problem you run into - like, say, having to open a pipe attached to a file and then close it - there's a built-in language facility that demonstrates "that's how it's done right": a special operation called with that tries to do the thing labeled by the statement as. For example: with open("yourfile.ext") as your_label:. When it works it gives you the result, and it cleans up the mess regardless of success or failure. Because there will be a mess, and if you don't clean it up right it's either a memory leak or a failure causing undefined behavior. People forget to do that because we're all human, even if we know a bit of C. So we write it once and have the machine remember for us; it rarely fails to do that.
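To make the with statement concrete, here is a minimal sketch using the article's placeholder names ("yourfile.ext" and your_label are just example names, not real files in any project):

```python
# A sketch of Python's "with" statement: open a file, use it,
# and let the language clean up the handle no matter what happens.
with open("yourfile.ext", "w") as your_label:
    your_label.write("hello")
# The handle is closed here, even if write() had raised an exception.

with open("yourfile.ext") as your_label:
    print(your_label.read())  # prints "hello"

# Proof the cleanup happened without us asking:
print(your_label.closed)  # True
```

The point is exactly the one above: success or failure, the mess gets cleaned up, because the machine remembers so the human doesn't have to.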
More importantly, if you type help() in Python or use an appropriate function, you immediately motivate a programmer to use documentation strings. In the batteries-included standard library, there are two modules dedicated to testing (and I have an unholy love for the unittest library); having that is as important as having a path for making asynchronous calls to a network. The language designers and the people making the application programming interfaces can fill the same space that you'd otherwise need to go to Stack Overflow to fill.
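As a small sketch of docstrings and unittest working together (the function double and its test are made-up examples, not from any real project):

```python
import unittest

def double(n):
    """Return n multiplied by two."""  # help(double) would display this line
    return n * 2

class DoubleTest(unittest.TestCase):
    def test_double(self):
        self.assertEqual(double(21), 42)

# Run the suite without exiting the interpreter, so this can sit in a script.
suite = unittest.TestLoader().loadTestsFromTestCase(DoubleTest)
result = unittest.TextTestRunner().run(suite)
```

Both pieces ship with the standard library; no trip to Stack Overflow required to document or test your first function.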
As commented, there are many versions of Lisp, and the features and workflows to an executable for a particular system vary. But one implementation I looked at had three different help functions you could invoke, depending on whether you were in the terminal or the interpreter and whether you needed the document strings or just information on what arguments a function took and what output it produced. This is really important to have when you're entering a system without prior knowledge and have no internet connection.
This brings me to the third and most important reason I'm doing this. Electron gives a future developer a safe development environment.
In the early days of the web, a plugin called Flash was used to deliver games. As things evolved, two things killed that plugin: the first was that it could not meet power requirements for mobile devices. The second was that bad actors learned how to compromise it and its publisher could not secure it.
But when it was safe, it led to the development of entire communities and websites for sharing games. People were able to learn how to develop and get out of poverty. It was a beautiful thing, and that kind of spirit ought to be kept alive wherever possible.
And when you go to implement and test out code, you will find that the expectation is that you: put all the code into a script file that does not directly affect the page, as a best practice, and attach to anything that can be interpreted as a script either a randomly generated number issued by the server, called a nonce, or a whitespace-sensitive hash of its contents. Using specifically the hashing implementation of...
Either way you're dependent on something that has to be regenerated every time you make a change or tweak, and you're being told by a major company that even though you're financially responsible for losing your users' data, you can't use this particular method to secure it near ads anyway!
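The mechanism being complained about here is the Content-Security-Policy header. A hedged sketch of the nonce variant (the nonce value and file name are made up; a real server must generate a fresh random nonce for every response, which is exactly the regeneration burden described above):

```
Content-Security-Policy: script-src 'nonce-R4nd0mV4lu3'

<script nonce="R4nd0mV4lu3" src="game.js"></script>
```

The browser only runs scripts whose nonce attribute matches the header, so anything injected by an attacker without the nonce is refused.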
While there are strategies someone could use to mitigate that... will they be used? Can they be used? It's one thing to say, for instance, just keep ads separate from user data. It's another thing to be a social media company with a slush fund, where there's a clear impetus to use a person's data to deliver them content they find desirable and monetize that with advertisements.
The web seems like a fundamentally insecure environment where a fourteen-year-old can lie about their age and make trouble. And a hacker will always have some capability to shoot down content for an arbitrary reason, including the simple joy derived from defacing a thing. It's hard to tell someone "your classroom and display space is the equivalent of Eve Online played with all the money in your bank account" and expect sane folks to learn.
On top of that, there's a financial investment beyond content creation and cost of living (which can be tremendous factors if you are low-income and need to do ten-hour shifts just to make ends meet). You have to put up money to host a server. And there's an expertise investment: that server has to be able to handle the traffic to it. You can make as much FOSS software as you want; hardware takes resources and will never be free. Even if I hand someone a computer, they have to pay for the electricity.
So say you make the next surprise hit, content-wise! What happens to the cluster of resources you have set up when it starts getting hit by millions of people? That's an intimidating question, and it's as easy to under-prepare as it is to over-think.
If someone is just distributing executables, on the other hand, they can make syntax and semantic errors and still have room to learn from those mistakes. It's simply more tolerant of learning.
Getting an Education
People start in different places. I'll stick by the idea that an absolute beginner, or someone who just wants a lay of what the work is, should read over the Python documents. They more or less demonstrate what a variable is and how to use one.
After that, if a person isn't scared, they should look up discrete mathematics. There's a notion of a proof there that was never introduced to me in high school. What motivated me to go to college was to learn calculus, because while I felt I could get everything else about programming, the notion of what a limit was just wasn't clicking. But discrete was the gem I never knew I was looking for.
It turns out that for programming, while it is important to use the ideas of calculus, the thing you need is an understanding of how collections of objects behave. While love of a thing will get you very far, mathematics also gets you an understanding of what a computer can't do for you. No amount of love will give you a set of universal rules for how to live. That isn't bullshit eastern philosophy; it's directly provable. The framework for proving that statement was also used to prove that we cannot tell, even given infinite resources, whether a program will terminate or not. This is one of many reasons why programmers should always sanitize user input and never allow it to be executed as code.
If you aren't scared off by discrete, then while you are studying it you might also want to look at data structures. There are links to textbooks on algorithms, which is the funnest topic in the world in my opinion. The study of algorithms is the study of the characteristics of how instructions are executed.
Probably the best discussion of algorithms, and of the great difficulty of benchmarking their performance, is found in a paid resource: The Art of Computer Programming, Volume 3, by Donald Knuth. To be honest, just buy all the volumes; someday you'll wonder how to do a linked list correctly.
If you don't have the cash, don't worry. Knuth is probably one of the most cited authors in computing. You'll end up reading him somehow. In fact, if you don't see him cited in a theoretical work that claims to be about a computing topic generally, it's probably a bad omen.
He's also funny, with an incredibly loud writing voice for a very gentle-seeming man.
As far as game programming goes, there's a lot of hard math. The complete first year of a college physics course is extremely hard, and every ounce of it was descriptions of motion and of how objects with weight move. A lot of it can be solved simply with vectors and matrix operations. That means you'll end up using mathematics that most college majors try to forget.
The second year was worse, but described light and how it works. If you can find efficient ways to get a computer to do that, you basically are writing yourself a huge check.
Meanwhile, keep in mind that discrete mathematics is the gateway to number theory, whose practical application is cryptography, and graph theory, whose practical application is computer intelligence.
While none of that was in this chapter, data structures will be discussed at some point whether anyone likes it or not.
A package manager is incredibly useful because it automates all the messy setup work of getting software. It also helps manage software versions, which is key if you want Electron to properly get along with WebDriver. And having something like scoop around is incredibly useful because you can just type the name of something like python and have it, at a time cost of "how long does it take to download?" without "how long does it take for me to read the configuration menu?"
As for the beta Windows package manager... my first impression is that it was made by the asshole who thinks an operating system is a thing you restyle like a car, so you can resell the engine on a yearly basis.
In theory, you can move those folders. In practice documentation for doing so is the sort of conflicting clusterfuck of advice that drives people to use Linux.
Keep an eye on it. That's a strong opinion on a teeny feature; if you've got everything on a 2 TB SSD you won't have to care so much. I don't.
In the short term, scoop has the lowest investment price, features that beat the official implementation, and a good body of software. Microsoft has proven that while it can't put together a UI to save its life, its programs are pretty good when it isn't listening to the guy in the advertising department who swears they can be Google.
I just realized they listened to that guy when they wrote Windows 11. Different article, there's mind control involved but not the kind I like.
Get Comfortable with a text editor.
These are text editors that can be configured to work the way Visual Studio Code does out of the box. They can function without a GUI as text editors. It isn't impossible to use VS Code in an equivalent way, where the GUI feeds and receives commands from a server. But I've ended up using Vim instead because, frankly, it's as easy once you get the hang of it.
I settled on Vim for the shallowest reasons. Basically, I have a hard time not getting distracted by how many things I can do with Emacs and by trying to set it up my way, because it is opinionated. Nonetheless, you should at least try it once.
I just need a text editor with syntax highlighting, lots of paper, and a well-paid adult. Still, a work of art it is. If someone mastered Emacs they could probably run their entire life from the terminal.
The other thing is that the learning curves of these programs act as a kind of rite of passage. It's a bit like how, if a guy goes on a ten-mile hike, or runs off and circumcises himself in the Sahara, you know he'll be able to hunt and live with the tribe. If someone can write code in Emacs or Vim, they have the sense to follow tutorials and read.
Compared to that, Atom is as easy as VS Code, but slow. And it suffers from the lack of focus that typifies open source projects that live shallow lives or die quick deaths. Its UI was, in my memory, more useless than anything Microsoft had ever built, but not as bad as what I used while getting my art minor.
I can't say that of Kate, the text editor that comes packed with KDE. It does appear to have basic syntax highlighting; I just haven't used it enough to know if it is effective or competitive. Of the things mentioned, it and the Windows offerings have a GUI-driven learning curve that I can't compare to self-mutilation.
On the other hand, you won't learn that it's possible to live without a window. Vim and Emacs are a bit more difficult to master because they live and operate well in environments that make few guarantees about resources. I would not characterize the windowed variants I've used as good. They stink of GNU's user experience, which is so bad that entire Linux forks exist to avoid it (Linux Mint from Ubuntu).
One of the things, like death and taxes, that you have to manage is licensing. One of the big missteps Donald Trump made with his new propaganda outlet was that he tried to erase all the licenses off the free and open source technology he was using. Regardless of how you feel about the man, when you're developing open source software you need to find a spark or reason to continue doing so.
Open source licenses frequently also include attribution credits to the original authors. So you can imagine someone erasing the name off your creative work so they can put it in a commercial product, along with the legal means by which people can recognize it was your idea without having to pay for it... Trump was given 30 days to comply or be sued. And he was not the first business person who thought he was smart. While the man is skilled at litigation, he entered an arena where delaying a court case would at best incur a more costly legal defeat.
So then, what happens in those thirty days to resolve the licensing issue? The engineers working on Truth Social have to stop what they are doing and redownload all the open source software, if they can. If they've made changes to the files, they have to figure out how to write the licenses and attributions back in, which are hopefully located in one place and not in several parts around the file. There's a clear risk of losing work in that scenario, beyond the man-hours spent fixing a needless mistake.
Not giving due credit is a stupid way to lose a lot. The sooner you get a license or statement of copyright that satisfies your use case, the better. It is true that declaring something FOSS means you can't sell the code you generate as a product, and neither can anyone else. The way people get around that is by packaging and licensing content separately. Probably the first good example of a game company doing this was Wizards of the Coast with Dungeons and Dragons 3.x's Open Game License.
What NPM is going to do is assume you are a developer like the ones developing software that will eventually be used as free and open source software. Inside the package there's a field that says license, and it's about as fun as pulling teeth to try to say something along the lines of "I'm making this for a small community of people and storing my content separate from the FOSS stuff."
I am not going to get into an argument about the merits of free and open source licenses. My primary use case (means of demonstration be damned) is to show how Electron can be used for generic game development. I'm living the reality of having four grand in the bank in an environment where, pre-conflict with Ukraine, living expenses were about two grand a month.
My use case trumps monetization strategies that would rely on a strict license. My reality compels me to explain how a person in my situation might avoid putting themselves into an untenable legal situation when developing closed source software.
When you run npm init, don't try to put in your custom license. When it is done, it makes a file called package.json. This file contains all the information about software versions and custom test script commands; you need to know where it is and what it is anyway. If a liberal FOSS license does not fit your project's use case, for instance if you have gobs of royalty-free art that you don't have the right to redistribute for free, change the license field to say "See license.txt".
Then write an appropriate license that says, essentially: the files in node_modules follow their own licenses; my files follow my license; these dependencies follow their own licenses. As for the exact text to use - research it. It's case by case.
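As a sketch, the relevant corner of package.json might look like this (the name, version, and test script are placeholders; note that npm's documented convention for pointing at a custom license file is the string "SEE LICENSE IN <filename>"):

```json
{
  "name": "my-game",
  "version": "0.1.0",
  "license": "SEE LICENSE IN license.txt",
  "scripts": {
    "test": "mocha"
  }
}
```

Everything under node_modules keeps its own license files; this field only describes your code.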
For some irrational reason I find the idea of not rendering HTML elements as the default behavior of Markdown offensive. I don't know why, because I've had to do as much configuration on Python's Markdown package, and that didn't piss me off.
In all honesty, you could substitute any Markdown compiler for this one. But it is mature-looking and appears to be actively developed, and there's only the one default that conditionally annoys me. I should sanitize all input, and you don't need to remind me that stupid people don't by doing something that breaks my operating expectations.
My opinion of templates is that I need one to demonstrate the cool stuff templates can do. I like Mako's execution significantly more than Jinja's, and Jinja is the ancestor of Nunjucks.
Things like Nunjucks and Jinja try to keep code from executing in templates because they think they can keep programmers like me safe from ourselves. The other extreme is a language like PHP, which began as Perl plus a template engine that evaluated Perl statements literally. A bunch of people didn't sanitize the statements being fed into those templates, and they got executed as literal information-stealing code.
This is why we can't have nice things.
Complaints about convenience be damned: always sanitize your input. That means aggressively remove things that might be interpreted as code from a random source, or that may contain bounds errors. Any kind of user input qualifies as a random source.
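A minimal sketch of the idea in Python, using the standard library's html.escape (the malicious string is a made-up example):

```python
from html import escape

# Pretend this arrived from a form field - a random source.
user_input = '<script>alert("stolen cookies")</script>'

# Escaping turns the characters a browser would interpret as code
# into inert text entities, so the payload can never execute.
safe = escape(user_input)
print(safe)
# &lt;script&gt;alert(&quot;stolen cookies&quot;)&lt;/script&gt;
```

Whatever language or template engine you use, the principle is the same: user data gets neutralized before it touches anything that interprets code.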
EJS is closer to the PHP way of doing things but doesn't include a facility for inheriting templates. Template inheritance allows a developer to write a bunch of "boilerplate code" - code which functions like the junk metal on boilers, whose sole purpose was to hold heat; heavy shit, but the thing won't work without it - then have a template that fills in a couple of arguments on that boilerplate, allowing a degree of customization that is best done statically and without variables.
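Template inheritance in Nunjucks looks roughly like this (the file names base.njk and page.njk are made-up examples; the extends/block syntax is the inheritance mechanism Nunjucks shares with Jinja):

```
<!-- base.njk: the boilerplate everything else leans on -->
<html><body>
  <h1>{% block title %}Default Title{% endblock %}</h1>
  {% block content %}{% endblock %}
</body></html>

<!-- page.njk: fills in just the parts that change -->
{% extends "base.njk" %}
{% block title %}My Page{% endblock %}
{% block content %}<p>Only this part is written per page.</p>{% endblock %}
```

The child template supplies a couple of blocks; everything else comes from the boilerplate, statically, with no variables involved.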
So that's why Nunjucks was selected for this project.
The emphasis on testing and version control
The big thing I wanted to emphasize was that I submitted input for testing and it yielded the expected output. It is a process to get to the place where the code is correct. Testing eats up the bulk of actual programming development time.
Mocha is a good test-driven development framework. Having a formal method of preparing and understanding tests is so important that there are religions devoted to particular ways of doing it within the software community. And these religions generate software for test-driven development, behavior-driven development, agile development, etc., which more or less proves their existence.
Attempting to use Selenium in combination with Electron immediately justified the use of git, and even of local repositories. Attempting to develop a process where the engine started and closed consistently completely broke the NodeJS configuration, to the point where it couldn't download dependencies. I stopped and used a test engine I had never worked with before, and I was back on track in about two hours.
I know each piece works individually. But damn did it cost me time. In the official version, I want someone to be able to read it, think "this simply works," and operate without the doubt caused by one quirky tool.
And because I promised a note on minification.
Minification is an optimization process. It has famously been said (the line is usually attributed to Donald Knuth) that premature optimization is the root of all evil.
When a person minifies a piece of code, they remove all statements unnecessary for its function, including documentation comments (except where facilities are made specifically to allow statements like licenses to be kept inline).
This allows code to be transmitted more easily. To see the necessity of it, note that at the time of writing Google specifically says users will avoid web content that takes over two seconds to load.
So the code has to load, and if it's zipped it has to be unzipped, and it has to give a user something, all in two seconds. And whitespace, the spaces and tabs a programmer puts in, can be about sixty percent of the size of a file.
The problem is that a minifier can change the structure of the code and break its functionality. It can observe, say, the word "Vue" being used, and go "Vue = V, that saves two characters, and golly that adds up, Vue appears two hundred times!" And as a machine, it doesn't have the sense to ask "What could the developer have been referring to?"
Since the goal here is explicitly to show but not to transmit data, using a minifier is in some sense counterproductive until I talk about specific distribution strategies. If there is any more analysis of the topic at all, expect it to occur there.
The rule is that if you don't have to optimize, you don't have to optimize. If you do have to optimize, check and be sure. In folklore, the act of optimizing your process beyond what common sense dictates is associated with black dogs and boats like the "Flying Dutchman". If you think you can go faster than that black ship off in the distance, you're probably about to die in a shipwreck. The spirit and meaning of the phrase "premature optimization is the root of all evil" predate the concept of computer science.
If I end up with a file about the size of, or smaller than, the executable I just downloaded for Wasteland, a dated PC game, which was around 350 megabytes, I'm not touching a minifier. With that being said, optimization on a well-defined problem with a well-defined result should be reflexive. Half the reason Grace Hopper traveled around with a bundle of wires was to show programmers what they were wasting when they were too conservative; the other half was to show a general that physics wouldn't allow her to be more ambitious.
I'll almost certainly discuss image optimization, which is the equivalent of minification for image resources. But that will be based on real performance requirements, not imagined ones. That discussion will show the process used to justify the act of optimization.