Thursday 25 September 2014

Choosing low-tech visual styles for games

A month ago, I participated in Ludum Dare, a 48-hour game development contest. This was the first time I finished a game-like project since about 2005.

The theme of the contest was "connected worlds". I made a game called Quantum Dash that experiments with parallel universes as a central game mechanic. The player operates in three universes at the same time and, by connecting "interdimensional cords", makes the differences between these universes explosively cancel each other out. The "Dash" part of the name refers to the Boulder Dash style grid physics I used. I found the creation process very refreshing, I am quite happy with the result considering the circumstances, and I will very likely continue making games (or at least rapid prototypes thereof).
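
For those who haven't played Boulder Dash: the "physics" is simply a cellular rule applied to a grid of tiles. The following sketch is not the actual Quantum Dash code -- the map, cell characters and update rule are made-up examples -- but it shows the whole idea:

    /* Minimal sketch of Boulder Dash style grid physics (not the actual
     * Quantum Dash source). The world is a grid of cells; one update pass
     * lets loose boulders ('O') fall into empty cells ('.') below them.
     * The map, cell types and grid size are made up for illustration. */
    #include <stdio.h>

    #define W 8
    #define H 5

    static char cell[H][W + 1] = {
        "........",
        "...O....",
        "........",
        "...#....",
        "########",
    };

    static void physics_step(void)
    {
        /* Scan bottom-up so each boulder falls at most one cell per tick. */
        for (int y = H - 2; y >= 0; y--)
            for (int x = 0; x < W; x++)
                if (cell[y][x] == 'O' && cell[y + 1][x] == '.') {
                    cell[y + 1][x] = 'O';
                    cell[y][x] = '.';
                }
    }

    int main(void)
    {
        physics_step();                      /* the boulder drops one row */
        for (int y = 0; y < H; y++)
            printf("%s\n", cell[y]);
        return 0;
    }

Because the entire simulation is a pass over a small grid, it is something you can write from scratch in minutes, which matters a lot in a 48-hour jam.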



My relationship with computer games became somewhat dissonant during the nineties. At that time, the commercial industry became radically more centralized and profit-oriented. Eccentric European coder-auteur-heroes disappeared from computer magazines, giving way to American industry giants and their campaigns. There was also the rise of the "gamer" subculture, which I considered rather repulsive from early on due to its glorification of hardware upgrades and its disinterest in real computer skills.

Profit maximization in the so-called serious game industry is largely driven by a specific, Hollywood-style "bigger is better" approach to audiovisual esthetics. That is, a striving for photorealism. This approach is, of course, very appealing to shareholders: It is easy to imagine the grail -- everyone knows what the real world looks like -- but no one will ever reach it despite getting closer all the time. Increases in processing power and development budgets quite predictably map to increases in photorealism. There is also inherent obsolescence: yesterday's near-photorealism looks bad compared to today's near-photorealism, so it is easy to make consumers desire revamped versions of earlier titles instead of anything new.

In the early noughties, the cult of photorealism was still so dominant that even non-commercial and small-scale game productions followed it. Thus, independent games often looked like inadequate, "poor man's" versions of AAA games. But the cult was starting to lose its grip: independent games were already looking for new paths. In his spring 2014 paper, game researcher Jesper Juul gives 2005 as an important year in this respect: since 2005, the Grand Prize winners of the Independent Games Festival have invariably followed styles that diverge from the industrial mainstream.

Juul defines "Independent Style" as follows: "Independent Style is a representation of a representation. It uses contemporary technology to emulate low-tech and usually “cheap” graphical materials and visual styles, signaling that a game with this style is more immediate, authentic and honest than are big-budget titles with high-end 3-dimensional graphics."

The most prominent genre within I.S. is what Juul calls "pixel style", reminiscent of older video game technology and also overlapping with the concept of "Computationally Minimal Art" I formulated a few years ago. My game, Quantum Dash, also fits into this substyle. I found the stylistic approach appealing because it is quick and easy to implement from scratch in a limited time. Part of this easiness stems from the fact that CMA is native to the basic fabric of digital electronic computers. Another attractive aspect is the long tradition of low-tech video games, which makes it easy to reflect on prior work and use the established esthetic language.
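
To give an idea of how little machinery the pixel style needs, here is a rough sketch (with made-up resolutions and an assumed 256-color palette, not code from any actual game): the renderer draws into a tiny indexed buffer and then blows each cell up into a block of screen pixels.

    /* Rough sketch of a pixel-style renderer: draw into a tiny indexed
     * buffer, then expand each cell into an SxS block of output pixels.
     * The resolutions, scale factor and palette are arbitrary examples. */
    #include <stdint.h>

    #define LW 80   /* low-res width   */
    #define LH 45   /* low-res height  */
    #define S   8   /* upscale factor  */

    void upscale(const uint8_t lo[LH * LW], const uint32_t palette[256],
                 uint32_t *out, int out_pitch)
    {
        for (int y = 0; y < LH * S; y++)
            for (int x = 0; x < LW * S; x++)
                out[y * out_pitch + x] = palette[lo[(y / S) * LW + (x / S)]];
    }

Everything else then operates on a buffer of a few thousand bytes, which is small enough for a solo developer to reason about completely.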

Another widely used approach simulates art made with physical materials such as cut-out paper (And Yet It Moves) or wax pastels on paper (Crayon Physics). Both this approach and the aforementioned pixel style apparently refer to older technologies, which makes it tempting to generalize the idea of past references to other genres of I.S. as well. However, I think Juul stumbles somewhat when extending this idea to styles that don't have a real historical predecessor: "The pixel style 3d games Minecraft and Fez also cannot refer to an earlier time when 3d games were commonly made out of large volumetric pixels (voxels), so like Crayon Physics Deluxe, the historical reference is somewhat counterfactual, but still suggests a simpler, if nonexistent, earlier technology."

I think it would be more fruitful to concentrate on complexity rather than history when analyzing Independent Style. The esthetic possibility space of modern computing is mind-bogglingly large. It is easy to get lost in all the available potential complexity. However, by introducing constraints and stylistic choices that dramatically reduce the complexity, it is easier even for a solo artist to explore and grasp the space. The constraints and choices don't need to refer to any kind of history -- real or counterfactual -- to be effective.

The voxel style in Minecraft can still be considered somewhat historical -- a 3D expansion of grid-based 2D games such as Boulder Dash. However, I suspect that the esthetic experimentation in independent games will eventually lead to a much wider variety of styles and constraints -- including a bunch that cannot be explained with historical references.

The demoscene has been experimenting with different visual styles for a long time. Even at times when technical innovation was the primary concern, the goal was to find new things that just look good -- and realism was just one possible way of looking good. In 1996, when realtime raytracing was a hot new photorealistic thing among democoders, there was a production called Paper by Psychic Link that dropped jaws with its paper-inspired visuals -- a decade before paper simulation became trendy in the independent games scene. Now that the new PC hardware no longer challenges the demo artist the way it used to, there is much more emphasis on stylistic experimentation in non-constrained PC demos.

Because of this longer history of active experimentation, I think it would be useful for many more independent game developers to look for stylistic inspiration in demoscene works. Of course, not all the tricks and effects adapt well to games, but the technological and social conditions in their production are quite similar to those in low-budget games. After all, demos are real-time-rendering computer programs produced by small groups without budgets, usually over relatively short time periods, so there's very little room for "big-budget practices" there.

Here's a short list of demos with unique esthetic elements that might inspire game esthetics as well. Two of them are for 8-bit computers and the rest for (semi-)modern PCs.

I'm expanding into game design and development primarily because I want to experiment with the power of interactivity, especially in relation to some of my greater-than-life goals. So, audiovisuals will be a secondary concern.

Still, due to my background, I want to put effort into choosing a set of simple and lightweight esthetic approaches. They will definitely be computationally minimal, but I want to choose some fresh techniques in order to contrast favorably against the square-pixel style that is already quite mainstream in independent games. But that'll be a topic for another post.

Sunday 7 September 2014

How I view our species and our world

My blog post "The resource leak bug of our civilization" has gathered some interest recently, especially after being noticed by Ran Prieur on his blog. I therefore decided to translate another essay to give it a wider context. Titled "A few words about humans and the world", it is intended as a kind of holistic summary of my worldview, especially for people who have had difficulties understanding the basis of some of my opinions.

---

This writeup is supposed to be concise rather than convincing. It therefore skips a lot of argumentation, linking and breakdowns that might be considered necessary by some. I'll get back to them in more specific texts.

1. Constructions

Humans are builders. We build not only houses, devices and production machinery, but also cultures, conceptual systems and worldviews. Various constructions can be useful as tools; however, we also have an unfortunate tendency to chain ourselves to them.

Right now, humankind has chained itself to the worship of abundance: it is imperative to produce and consume more and more of everything. Quantitative growth is imagined to be the same thing as progress. Especially during the last hundred years, the theology of abundance has penetrated such deep and profound levels that most people don't even realize its effect. It's not just about consumerism on a superficial level, but about the whole economic system and worldview.

Extreme examples of growth ideology are easy to find in the digital world, where it manifests in a kind of raised-to-a-power form. What happens if worshippers of abundance get their hands on a virtual world where the amount of available resources increases exponentially? Right, they will start bloating up the use of resources, sometimes even for its own sake. It is not at all uncommon to require a thousand times more memory and computational power than necessary for a given task. Mindless complexity and purposeless activities are equated with technological advancement. The tools and methods the virtual world is being built with have been designed from the point of view of idealized expansion, so it is difficult to even imagine alternatives.

I have some background in a branch of hacker culture, the demoscene, where the highest ideal is to use minimal resources in an optimal way. The nature of the most valued progress there is condensing rather than expanding: doing new things under ever stricter limitations. This has helped me perceive the distortions of the digital world and their counterparts in the material world.

In everyday life, the worship of growth shows up, above all, as the complexification of everything. It is becoming increasingly difficult to understand various socio-economic networks or even the functionality of ordinary technological devices. This alienates people from the basics of their lives. Many try to fight this alienation by creating pockets of understandability. Escapism, conservatism and extremism rise. On the other hand, there is also an increase in do-it-yourself culture and a longing for a more self-sufficient way of life. People should be encouraged toward these latter, positive ways of countering alienation rather than toward channels that increase conflict.

An ever greater portion of techno-economic structures consists of useless clutter, so-called economic tumors. They form when various decision-makers attempt to keep their acquired pieces of the cake as big as possible. Unnecessary complexity slows down progress and makes it one-sided instead of being a requirement for it. Expansion needs to be balanced with contraction -- you can't breathe in without breathing out.

The current phase of expansion is finally about to end, since the fossil fuels that made it possible are getting scarcer, and we don't yet know of an equally powerful replacement. As the phase took so long, the transition into contraction will be difficult for many. An ever larger portion of the economy will escape into the digital world, where it is possible to maintain the unrealistic swelling for longer than in the material world.

Dependencies of production can be depicted as a pyramid where the things on the higher levels are built from the things below. In today's world, people always try to build at the very top, so the result looks more like a shaky tower than a pyramid. Most new things could easily be built at lower levels. The lowest levels of the pyramid could also be strengthened by giving more room to various self-sufficient communities, local production and low-tech inventions. Technological and cultural evolution is not a one-dimensional road where "forward" and "backward" are the only alternatives. Rather, it is a network of possibilities branching out in every direction, and even its strange side-loops are worth knowing.

2. Diversity

It is often assumed that growth increases the number of available options. In principle, this is true -- there are more and more different products on store shelves -- but their differences are more and more superficial. The same is true of ways of life: it is increasingly difficult to choose a way of life that isn't attached to the same chains of production or models of thinking as every other way of life. The alternatives boil down to the same basic consumer-whoredom.

Proprietors overstandardize the world with their choices, but this probably isn't a very conscious activity. When there are enough decision-makers who play the same game by the same rules, the world will eventually shape itself around these rules (including all the ingrained bugs and glitches). Conspiracy theories or incarnations of evil are therefore not needed to explain what's going on.

The human-built machinery is getting ever more complex, so it is also increasingly difficult to talk about it in concrete terms. Many therefore seek help from conceptual tools such as economic theories, legal terminology or ideologies, and subsequently forget that they are just tools. Nowadays, money- and production-centered ways of conceptualizing the world have become so dominant that people often don't realize that there are alternatives.

Diversity helps nature adapt to changes and recover from disasters. For the same reason, human culture should be as diverse as possible, especially now that the future is very uncertain and we have already started to crash into the wall. It is necessary to make it considerably easier to choose radically different ways of life. Much more room should be given to experimental societies. Small and unique languages and cultures should be treasured.

There's no one-size-fits-all model that would be best for everyone. However, I believe that most people would be happiest in a society that actively maintains human rights and makes certain that no one is left behind. The dictatorship of the majority, on the other hand, is not such a crucial feature of a political system in a world where everyone can freely choose a suitable system. Regardless, dissidents should be given enough room in every society: not everyone necessarily has the chance to choose their society, and excessive unanimity tends to be quite harmful anyway.

3. Consciousness

Thousands of years ago, the passion for construction became so overwhelming that the quest for mental refinement didn't keep pace. I regard this as the main reason why human beings are so prone to becoming slaves of their constructs. Rational analysis is the only mental skill that has been nurtured somewhat sufficiently, and even rational analysis often becomes just a tool for various emotional outbursts and desires. Even very intelligent people may be completely lost with their own emotions and motivations, making them inclined to adopt ridiculously one-dimensional thought constructs.

Putting one's own herd before everyone else is an example of an attitude that may work among small hunter-gatherer groups, but which should no longer have a place in modern civilization. A population that has the intellectual faculties to build global networks of cause and effect should also have the ability to make decisions on the corresponding level of understanding instead of being driven by pre-intellectual instincts.

Assuming that humankind still wants to maintain complex societal and technological structures, it should fill its consciousness gap. Any school system should teach the understanding and control of one's own mind at least as seriously as reading and writing. New practical mental methods, suitable for an ever greater variety of people, should be developed at least as passionately as new material technology.

For many people, a worldview is still primarily a way of expressing their herd instincts. They argue and even fight about whose worldview is superior. Hopefully, the future will bring a more individual attitude towards worldviews: there is no single "truth", but different ways of conceptualizing reality. A way that is suitable for one mind may even be destructive to another. Science produces facts and theories that can be used as building blocks for different worldviews, but it is not possible to put these worldviews into an objective order of preference.

4. Life

The purposes of life for individual human beings stem from their individual worldviews, so it is futile to suggest rules-of-thumb that suit all of them. It is much easier to talk about the purpose of biological life, however.

The basic nature of life, based on how life is generally defined, is active self-preservation: life continuously maintains its form, spreads and adapts to different circumstances. The biological role of a living being is therefore to be part of an ecosystem, strengthening the ecosystem's potential for continued existence.

The longer there is life on Earth, the more likely it is to expand into outer space at some point. This expansion may already take place during the human era, but I don't think we should specifically strive for it before we have learned how to behave non-destructively. However, I'm all for the production of raw materials and energy in space if it helps us abstain from raping our home planet.

At their best, intelligent lifeforms could function as a kind of gardener: gardeners that strengthen and protect the life in their respective homeworlds and help spread it to other spheres. However, I don't dare to suggest that the current human species has the prerequisites for this kind of role. At this moment, we are so lost that we couldn't even become a galactic plague.

Some people regard the human species as a mistake of evolution and want us to abandon everything that differentiates us from other animals. However, I see no problem per se in the natural behavior of Homo sapiens: there's just an unfortunate imbalance of traits. We therefore shouldn't abandon reason, abstractions or constructivity, but rather rebalance them with more conscious self-improvement and mental refinement.

5. The end of the world

It is not possible to save the world, if it means saving the current societies and consumer-centric lifestyles. At most, we can soften the crash a little bit. It is therefore more relevant to concentrate on activities that make the postapocalyptic world more life-friendly.

As there is still an increasing amount of communications technology and automation in the world, and the privileged even have ever more free time, these facilities should be used right now for sowing the seeds of a better world. If we start building alternative constructs only when circumstances force us to, the transition will be extremely painful.

People increasingly dwell in bubbles of easiness facilitated by technology. It is therefore a good idea to bring suitable signals and facilities into these bubbles. Video game technology, for example, can be used to help people reclaim their minds, lives and material environments. Entertainment in general can be used to increase interest in such reclamation.

Many people imagine progress as a kind of unidirectional growth curve and therefore regard the postapocalyptic era as a "return to the past". However, the future world is more likely to become radically different from any previous historical era -- regardless of some possible "old-fashioned" aspects. It may therefore be more relevant to use fantasy rather than history to envision the future.

Tuesday 5 August 2014

The resource leak bug of our civilization


A couple of months ago, Trixter of Hornet released a demo called "8088 Domination", which shows off real-time video and audio playback on the original 1981 IBM PC. This demo, among many others, contrasts favorably against today's wasteful use of computing resources.

When people try to explain the wastefulness of today's computing, they commonly offer something I call the "tradeoff hypothesis". According to this hypothesis, the wastefulness of software is compensated for by flexibility, reliability, maintainability, and perhaps most importantly, cheap programming work. Even Trixter himself favors this explanation.

I used to believe in the tradeoff hypothesis as well. I saw demo art on extreme platforms as a careful craft that attains incredible feats while sacrificing generality and development speed. However, during recent years, I have become increasingly convinced that the portion of true tradeoff is quite marginal. An ever-increasing portion of the waste comes from abstraction clutter that serves no purpose in final runtime code. Most of this clutter could be eliminated with more thoughtful tools and methods without any sacrifices. What we have been witnessing in the computing world is nothing utilitarian, but a reflection of a more general, inherent wastefulness that stems from the internal issues of contemporary human civilization.

The bug


Our mainstream economic system is oriented towards maximal production and growth. This effectively means that participants are forced to maximize their portions of the cake in order to stay in the game. It is therefore necessary to insert useless and even harmful "tumor material" into one's own economic portion in order to avoid losing one's position. This produces an ever-growing global parasite fungus that manifests as things like black boxes, planned obsolescence and the artificial creation of needs.

Using a software development metaphor, it can be said that our economic system has a fatal bug: a bug that continuously spawns new processes that allocate more and more resources without ever releasing them, eventually stopping the whole system from functioning. Of course, "bug" is a somewhat normative term, and many bugs can actually be reappropriated as useful features. However, resource leak bugs are very seldom useful for anything other than attacking the system from the outside.
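
For readers who have never chased one down, here is what a resource leak looks like in its purest, deliberately simplified form -- an illustration of the metaphor rather than any real-world program:

    /* A deliberately simplified resource leak: every iteration claims more
     * memory and never releases it, so the program grows until the system
     * can no longer satisfy it. */
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        for (;;) {
            char *chunk = malloc(1 << 20);   /* grab another megabyte        */
            if (!chunk)
                return 1;                    /* ...until nothing is left     */
            memset(chunk, 1, 1 << 20);       /* touch it so it really counts */
            /* no free(chunk): the resource is never given back */
        }
    }

The analogy is of course loose, but the failure mode is the same: every iteration's local logic says "allocate more", and nothing in normal operation ever says "release".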

Bugs are often regarded as necessary features by end-users who are not familiar with alternatives that lack the bug. This also applies to our society. Even if we realize the existence of the bug, we may regard it as a necessary evil because we don't know of anything else. Serious politicians rarely talk about trying to fix the bug; on the contrary, it is actually becoming more common to embrace it. A group that calls itself "Libertarians" even builds its ethics on it. Another group, the "Extropians", takes the maximization idea to the extreme by advocating an explosive expansion of humankind into outer space. On the so-called Kardashev scale, the developmental stage of a civilization is straightforwardly equated with how much stellar energy it can harness for production-for-its-own-sake.

How the bug manifests in computing


What happens if you give this buggy civilization a virtual world where the abundance of resources grows exponentially, as in Moore's law? Exactly: it adopts the extropian attitude, aggressively harnessing as many resources as it can. Since the computing world is virtually limitless, it can serve as an interesting laboratory example where the growth-for-its-own-sake ideology takes a rather pure and extreme form. Nearly every methodology, language and tool used in the virtual world focuses on cumulative growth while neglecting many other aspects.

To concretize, consider web applications. There is a plethora of different browser versions and hardware configurations. It is difficult for developers to take all this diversity into account, so the problem has been solved by encapsulation: monolithic libraries (such as jQuery) that provide cross-browser-compatible utility blocks for client-side scripting. Also, many websites share similar basic functionality, so it would be a waste of labor time to implement everything specifically for each application. This problem has also been solved with encapsulation: huge frameworks and engines that can be customized for specific needs. These masses of code have usually been built upon previous masses of code (such as PHP) that were designed for exactly the same purpose. Frameworks encapsulate legacy frameworks, and eventually, most of the computing resources are wasted by the intermediate bloat. The accumulation of unnecessary code dependencies also makes software more bug-prone, and debugging becomes increasingly difficult because of the ever-growing pile of potentially buggy intermediate layers.

Software developers tend to use encapsulation as the default strategy for just about everything. It may feel like a simple, pragmatic and universal choice, but this feeling is mainly due to the tools and the philosophies they stem from. The tools make it simple to encapsulate and accumulate, and the industrial processes of software engineering emphasize these ideas. Alternatives remain underdeveloped. Mainstream tools make it far more cumbersome to do things like metacoding, static analysis and automatic code transformations, which would be far more relevant than static frameworks for problems such as cross-browser compatibility.

Tell a bunch of average software developers to design a sailship. They will do a web search for available modules. They will pick a wind power module and an electric engine module, which will be attached to some kind of floating module. When someone mentions aero- or hydrodynamics, the group will respond by saying that elementary physics is a far too specialized area, and that it is cheaper and more straightforward to just combine pre-existing modules and pray that the combination works sufficiently well.

Result: alienation


Building complex systems from more-or-less black boxes is also the way our industrial society is constructed; computing just takes it to a greater extreme. Modularity in computing therefore relates very well to the technology criticism of philosophers such as Albert Borgmann.

In his 1984 book, Borgmann uses the term "service interface", which even sounds like software development terminology. Service interfaces often involve money. People who have a paid job, for example, can be regarded as modules that try to fulfill a set of requirements in order to remain acceptable pieces of the system. When spending the money, they can be regarded as modules that consume services produced by other modules. What happens beyond the interface is considered irrelevant, and this irrelevance is a major source of alienation. Compare someone who grows and chops their own wood for heating with someone who works in the forest industry and buys firewood with the paycheck. In the former case, it is easier to become genuinely interested in all aspects of forests and wood because they directly affect one's life. In the latter case, fulfilling the unit requirements is enough.

The way of perceiving the world as modules or devices operated via service interfaces is called the "device paradigm" in Borgmann's work. It is contrasted with "focal things and practices", which tend to have a wider, non-encapsulated significance to one's life. Heating one's house with self-chopped wood is focal. Arts and crafts also offer many examples of focality. Borgmann urges a restoration of focal things and practices in order to counteract the alienating effects of the device paradigm.

It is increasingly difficult for computer users to avoid technological alienation. Systems become ever more complex, and developing a genuine interest in their inner workings can be discouraging. If you learn how a system works, the information probably won't stay current for very long. If you modify it, subsequent software updates will break your modifications. It is extremely difficult to develop a focal relationship with a modern technological system. Even hard-core technology enthusiasts tend to ignore most aspects of the systems they are interested in. As ever-complexifying computer systems grow ever more deeply ingrained in our society, they become increasingly difficult to grasp even for those who are dedicated to understanding them. Eventually even they will give up.

Chopping one's own wood may be a useful way to counteract the alienation of the classic industrial society, as oldschool factories and heating stoves still have some basics in common. In order to counteract the alienation caused by computer technology, however, we need to find new kinds of focal things and practices that are more computerish. If they cannot be found, they need to be created. Crafting with low-complexity computer and electronic systems, including the creation of art based on them, is my strongest candidate for such a focal practice among those that already exist in subcultural form.

The demoscene insight


I have been programming since my childhood, for nearly thirty years. I have been involved with the demoscene for nearly twenty years. During this time, I have accumulated a lot of angst towards various trends in computing.

Extreme categories of the demoscene -- namely, eight-bit democoding and extremely short programs -- have been helpful for me in managing this angst. These branches of the demoscene are a useful, countercultural mirror that contrasts with the trends of industrial software development and helps in grasping their inherent problems.

Other subcultures have been far less useful for me in this endeavour. The mainstream of open source / free software, for example, is a copycat culture, despite its strong ideological dimension. It does not actively question the philosophies and methodologies of the growth-obsessed industry but actually embraces them when creating duplicate implementations of growth-obsessed software ideas.

Perhaps the strongest countercultural trend within the demoscene is the move of focus towards ever tighter size limitations, or as they say, "4k is the new 64k". This trend is diametrically opposed to what the growth-oriented society is doing, and it forces one to rethink even the deepest "best practices" of industrial software development. Encapsulation, for example, is still quite prominent in the 4k category (4klang is a monolith), but in the 1k and smaller categories, finer methods are needed. When going downwards in size, paths considered dirty by the mainstream need to be embraced. Efficient exploration and taming of chaotic systems needs tools that are deeply different from those used before. Stephen Wolfram's ideas presented in "A New Kind of Science" can perhaps provide useful insight for this endeavour.

Another important countercultural aspect of the demoscene is its relationship with computing platforms. The mainstream regards platforms as neutral devices that can be used to reach a predefined result, while the demoscene regards them as a kind of raw material that has a specific essence of its own. Size categories may also split platforms into subplatforms, each of which has its own essence. The mainstream wants to hide platform-specific characteristics by encapsulating them into uniform straitjackets, while the demoscene is more keen to find suitable esthetic approaches for each category. In Borgmannian terms, demoscene practices are more focal.

Demoscene-inspired practices may not be the wisest choice for pragmatic software development. However, they can be recommended for the development of a deeper relationship with technology and for diminishing the alienating effects of our growth-obsessed civilization.

What to do?


I am convinced that our civilization is already falling and that this fall cannot be prevented. What we can do, however, is create seeds for something better. Now is the best time for doing this, as we still have plenty of spare time and resources, especially in rich countries. We especially need to propagate the seeds towards laypeople, who are already suffering from increasing alienation because of an ever more computerized technological culture. The masses must realize that alternatives are possible.

A lot of our current civilization is constructed around the resource leak bug. We must therefore deconstruct the civilization down to its elementary philosophies and develop new alternatives. Countercultural insights may be useful here. And since hacker subcultures have been forced to deal with the resource leak bug in its most extreme manifestation for some time already, their input can be particularly valuable.