Objects

An animation that visualizes what's going on when software is executed

Normally, you can't see how software works. As a user, you only see the graphical user interface and none of the processing behind it; even as a programmer, most of the time all you see is the static code or its results, while the execution itself stays invisible.

This visualization fills that gap: it makes visible what happens while a piece of software is executed. One way to do this would be to show the raw memory and how it changes, but that would be too low-level. This simulation instead shows things at the level of objects, the building blocks of object-oriented software. Each object is represented as a little ball, and you can watch how objects are dynamically created and deleted. Several objects can belong to the same class; such a group of objects is shown in the same bluish color.

Objects can know each other: they can hold references to one another. These references are shown as lines between the balls. Objects that reference each other attract each other, so clusters of interconnected objects form.

Finally, objects can be active: since code and data are tied together in object-oriented programming, every action the software performs is carried out by individual objects, one at a time. The activity therefore jumps between objects, which you can observe in the simulation as objects lighting up in red. Enjoy!
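To make the vocabulary concrete, here is a purely illustrative Python sketch (these classes are invented for illustration and do not appear in the recorded program): every instance would show up as a ball, the attribute references as lines between balls, and each method call as a ball lighting up in red.

    class Node:                      # all Node instances share one bluish color
        def __init__(self, name):
            self.name = name
            self.neighbours = []     # references: drawn as lines between balls

        def link(self, other):
            self.neighbours.append(other)   # a new line appears

        def visit(self):
            print("visiting", self.name)    # this object lights up
            for n in self.neighbours:
                n.visit()                   # the activity jumps to the next object

    a = Node("a")        # three balls appear...
    b = Node("b")
    c = Node("c")
    a.link(b)            # ...and get connected by lines
    b.link(c)
    a.visit()            # the red highlight travels a -> b -> c
    del a, b, c          # once the last references are gone, the balls disappear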

How this was made

The software observed is a program written in the Python programming language. It was essential that the language be interpreted rather than compiled, because otherwise object assignments could not be observed. The program was executed on a modified Python runtime, and object creations, deletions, method calls, and assignments were logged. The actual program executed is a converter from Markdown to HTML (https://pypi.python.org/pypi/Markdown). It translated a simple Markdown test file (tests/basic/amps-and-angle-encoding.txt).
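The modified runtime itself is not shown here, but the kind of events it logged can be approximated in plain Python. The following is only a hypothetical stand-in, assuming a tracer based on sys.settrace and weakref instead of the actual interpreter modifications; the event names and log format are invented.

    import sys
    import weakref

    log = []

    def track(obj):
        """Register an object and log its creation and, later, its deletion."""
        log.append(("create", type(obj).__name__, id(obj)))
        weakref.finalize(obj, log.append, ("delete", type(obj).__name__, id(obj)))
        return obj

    def tracer(frame, event, arg):
        if event == "call":
            # a 'self' argument in the frame marks a method call on some object
            self = frame.f_locals.get("self")
            if self is not None:
                log.append(("call", frame.f_code.co_name, id(self)))
        return tracer

    sys.settrace(tracer)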

The objects are laid out using a force-directed graph algorithm: referencing and referenced objects attract each other, all objects repel each other, and every object is attracted to the center. Newly created objects are placed randomly near the object that created them.
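A single step of such a layout could look roughly like the sketch below; the force constants, the 2D coordinates, and the function names are placeholders for illustration, not the values or code used for the film.

    import random

    def layout_step(objects, references, dt=0.05,
                    attract=0.01, repel=1.0, center_pull=0.002):
        """One step of a simple force-directed layout.
        objects: dict id -> [x, y]; references: list of (source_id, target_id)."""
        forces = {oid: [0.0, 0.0] for oid in objects}

        # all objects repel each other
        ids = list(objects)
        for i, a in enumerate(ids):
            for b in ids[i + 1:]:
                dx = objects[a][0] - objects[b][0]
                dy = objects[a][1] - objects[b][1]
                d2 = dx * dx + dy * dy + 1e-6
                f = repel / d2
                forces[a][0] += f * dx; forces[a][1] += f * dy
                forces[b][0] -= f * dx; forces[b][1] -= f * dy

        # referencing and referenced objects attract each other
        for a, b in references:
            dx = objects[b][0] - objects[a][0]
            dy = objects[b][1] - objects[a][1]
            forces[a][0] += attract * dx; forces[a][1] += attract * dy
            forces[b][0] -= attract * dx; forces[b][1] -= attract * dy

        # everything is pulled towards the center, then positions are updated
        for oid, (x, y) in objects.items():
            forces[oid][0] -= center_pull * x
            forces[oid][1] -= center_pull * y
            objects[oid][0] += dt * forces[oid][0]
            objects[oid][1] += dt * forces[oid][1]

    def spawn_near(objects, creator_id, new_id, spread=0.5):
        """Place a newly created object randomly near the object that created it."""
        x, y = objects[creator_id]
        objects[new_id] = [x + random.uniform(-spread, spread),
                           y + random.uniform(-spread, spread)]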

The object simulation itself was written in C#; it generated Python code for the animation and event data for the sound. The rendering was done with Blender, which executed the Python script to generate the animation keyframes. The video was rendered at 30 frames per second and 1080p resolution on two local machines (2 days) and on a cluster of 20 machines in the Amazon EC2 cloud (4 hours). The frames were post-processed by running a script in GIMP. The sound was rendered with Csound. Finally, the video was assembled with FFmpeg.
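A generated Blender script of this kind essentially creates a sphere per object and places keyframes for it on every animation frame. The fragment below is a minimal, hypothetical sketch using Blender's bpy API; the function names, sizes, and frame offsets are invented and not taken from the actual generated script.

    import bpy

    def add_ball(name, frame, position):
        """Create a small sphere for a new object and keyframe its appearance."""
        bpy.ops.mesh.primitive_uv_sphere_add(location=position)
        ball = bpy.context.object
        ball.name = name
        ball.scale = (0.0, 0.0, 0.0)                 # start invisible...
        ball.keyframe_insert(data_path="scale", frame=frame)
        ball.scale = (0.2, 0.2, 0.2)                 # ...and grow in over a few frames
        ball.keyframe_insert(data_path="scale", frame=frame + 5)
        return ball

    def move_ball(ball, frame, position):
        """Keyframe a ball's position for one animation frame."""
        ball.location = position
        ball.keyframe_insert(data_path="location", frame=frame)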

Markus Meyer