Thursday, April 23, 2015

Towards Realtime Deformable Worlds: Why Tetrahedra Rule, Voxels Drool

In this article I chronicle our turbulent terrain development, the lessons I've learned, and what's next. Also, why tetrahedra (probably) rule and voxels (mostly) drool.

It was nearly one and a half years ago that we showcased an early tech alpha of NOWHERE that had organically growing and sculptable terrain.

I had pinned high hopes on this approach, but found that, while triangle meshes allowed fine control over geometric features, they were difficult to simulate efficiently in a way that would keep the surface 2-manifold.

The last alpha version that we released took a few minutes of CPU time to grow a single coral-like structure. The program takes several snapshots of the same mesh at different stages and exposes them as separately rotating triangle meshes.

Knowing that I wanted to generate worlds far larger than that, running out of ideas for how initial geometry could be seeded, and hitting upon other shortcomings of triangle meshes, I embarked on a long and perilous journey to explore the rabbit hole of alternate geometric representations.

But first, let us visit the ultimate (and retconned) list of requirements that I made for NOWHERE's terrain:

  • 360° of freedom, modelling of minor planets in zero G
  • Ideally a single integrated solution for animated actors as well as terrain
  • Can be seeded from any implicit function, not necessarily just distance fields (see the sketch after this list)
  • Support for smooth surfaces but also sharp features (e.g. a cylinder would have both)
  • Can model rocky terrain with many cavities
  • Able to model anisotropic structures like sticks and membranes
  • Can model contiguous, "never-ending" surfaces, but also separated, distinct structures like floating rocks
  • Can host cellular automata for non-scripted procedural growth of plant-like organisms and crystals
  • Can selectively host rigid and soft body physics simulations, and run classic bone animation
  • Can host a simulator for flowing substances like water, lava, plasma or goo
  • Persistently diggable, fillable, sculptable by players
  • Support for different sediments: as you dig in, you hit upon different layers of material
  • Supports classical mesh operations like extrusion and subdivision, as well as boolean operations (better known as CSG modelling)
  • Supports seamless ornamental and player-authorable texturing, similar to Wang Tiles; Classical triplanar mapping is not enough.
  • Support for realtime ambient occlusion / radiosity
  • Scale independent, non-uniform distribution of detail
  • Support for seamless distance-dependent level of detail
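
To make the "seeded from any implicit function" requirement concrete: any f(x, y, z) whose sign separates solid from air will do; it doesn't have to be a proper distance field. A tiny, made-up example in Lua, the kind of function a seeding pass could sample:

    -- a bumpy minor planet: positive inside, negative outside; the gradient
    -- is not unit length, so this is an implicit function but not a distance field
    local function planet(x, y, z)
        local r = math.sqrt(x*x + y*y + z*z)
        local bumps = math.sin(x * 0.7) * math.sin(y * 0.7) * math.sin(z * 0.7)
        return 20 - r + 3 * bumps
    end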

Over the course of this project, which began several years ago (back then, what the gameplay would ultimately be was completely unclear), I tried more than a handful of solutions. The first one was the most obvious choice:
You can clearly see what influenced it. But I wanted more freedom of expression. The next approach glued prefabricated models together, similar to how Besiege works:
While that covers scaffolding nicely, it's neither particularly organic nor terrain-oriented; in fact it scaled rather badly and would only allow for rather puny scenes.

Then I tried mixing scaffolding and raymarched distance fields to produce a huge, if somewhat monotonous, terrain that was unfortunately completely unalterable and ate too much GPU time to be a comfortable solution for low-end computers (we want to make people with weaker hardware happy too):
I had some luck with Wang Tiles earlier, so I thought a 3D version of that might be interesting to try:
This ended up being more of a one-trick pony. At the time I entertained the thought that terrains simply could not be altered, but I could not make peace with it. On the way to figuring out how to make geometry user-editable, I experimented with procedural terrain growth by extrusions, the first triangle mesh based solution:
The pesky problem with extruding triangle meshes is that it is next to impossible to tell when surfaces begin to self-intersect, which turns the mesh into garbage; furthermore, joining intersecting meshes is anything but trivial. I started to look into scientific papers for solutions and was made aware of Stéphane Ginier's formidable SculptGL, which does ZBrush/Sculptris-like sculpting with support for punching holes into the topology, a rarely supported technique. I wanted the same thing for our game, and that's where my first full-on foray into triangle mesh editing started (see the first video).

So in the past year I looked into voxel based solutions again, particularly Dual Contouring, and I got some rather spectacular results out of it, in terms of being able to feed it distance field functions, do CSG on it, and run light simulation, which is why I thought this would be the ultimate solution we'd end up with. The first octree-based solution ran on the old Python engine, and despite large parts being written in C, seeding even small volumes took a few seconds too long, and editing wasn't realtime enough. Here are some screenshots from that time. Sigh.
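
For the curious, the core idea is compact enough to sketch. This is not the implementation I ran (that one solved a QEF per cell to preserve sharp features and lived on an octree); it's a simplified, surface-nets-style stand-in in Lua, where each cell's dual vertex is just the average of its edge crossings:

    -- sample any signed distance function, e.g. a sphere of radius 4
    local function sdf(x, y, z)
        return math.sqrt(x*x + y*y + z*z) - 4
    end

    -- the 8 corners of a unit cell and its 12 edges as corner index pairs
    local corners = {
        {0,0,0},{1,0,0},{0,1,0},{1,1,0},
        {0,0,1},{1,0,1},{0,1,1},{1,1,1},
    }
    local edges = {
        {1,2},{3,4},{5,6},{7,8},   -- x-aligned
        {1,3},{2,4},{5,7},{6,8},   -- y-aligned
        {1,5},{2,6},{3,7},{4,8},   -- z-aligned
    }

    -- place one dual vertex per cell the surface passes through
    local function cell_vertex(cx, cy, cz)
        local sx, sy, sz, n = 0, 0, 0, 0
        for _, e in ipairs(edges) do
            local a, b = corners[e[1]], corners[e[2]]
            local da = sdf(cx + a[1], cy + a[2], cz + a[3])
            local db = sdf(cx + b[1], cy + b[2], cz + b[3])
            if (da < 0) ~= (db < 0) then
                local t = da / (da - db)       -- where the sign flips along the edge
                sx = sx + cx + a[1] + t * (b[1] - a[1])
                sy = sy + cy + a[2] + t * (b[2] - a[2])
                sz = sz + cz + a[3] + t * (b[3] - a[3])
                n = n + 1
            end
        end
        if n == 0 then return nil end          -- surface doesn't cross this cell
        return sx / n, sy / n, sz / n
    end

Connecting the dual vertices of neighboring cells that share a sign-changing edge then yields the quads of the output mesh.
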
After a rewrite of our engine and experimenting with new editing paradigms in January this year, I had an insight for how to do, well, something with tetrahedral meshes, but it didn't quite click yet.

Instead, I got sidetracked into tetrahedra-based dual contouring and wrote a fast GPU-based implementation that used a grid instead of an octree, and experimented with alternate honeycombs for meshing. Things were great for a while. Descend your gaze upon this smooth animation of an implicit function:
I managed to integrate a realtime light propagation solution that ran on the same grid:
I made strides. I wrote a realtime meshing solution that meshes voxel data on the fly, completely alterable (although that's not visible in the video yet), realtime lit, at the cost of a heavily reduced draw range:

Alas, geometry representation on regular grids sucks for the same reason that shape representation in bitmaps (the 2D analogue) does, and I ultimately decided to completely give up on a voxel based solution.

Why? Behold, the long shitlist of voxel-based data structures. If you're considering writing something with voxels, beware of the perils:

  • You have no semantic structure: area, orientation, islands sharing attributes, proximity, and related features over areas larger than the immediate 1-cell neighborhood (which always spans the minimum possible distance) are hard to discern. Octrees can help cache some of this information, but they're not nearly as accurate as a graph-based representation.
  • Storage grows with the cube of the side length (with the square in 2D). Every time the side length doubles, storage increases by a factor of 8. That means 1 GB of VRAM stores only a 1024^3 voxel cube at 1 byte per voxel; typically, when dual contouring and different materials are involved, a voxel costs at least 16 bytes, so a ~406^3 voxel cube is more realistic (see the back-of-envelope sketch after this list).
  • Rendering must be done in triangles, which requires a "mixdown" of the data, as the two structures do not store data in the same way. You keep the same information twice in memory. Also, you cannot know the number of generated triangles in advance, nor can you update single triangles well (storage penalty), which makes compact allocations more difficult.
  • While detail is not taxed (in fact, it's prepaid ;-), it is capped at the Nyquist limit of the cell side length; you cannot have features smaller than that.
  • On the other hand, large areas without topological features still store values at the full sample rate, which, for the reasons explained above, is a tremendous waste of space.
  • Scaling / shearing / rotating grid data is lossy. Transformed voxels don't always fit back into individual cells due to the Nyquist cap. Even a dual contouring grid can't guarantee that edges and vertices are always preserved. That means you're forced to keep moving objects separate from the world.
  • Sparse octrees can help with culling redundant space, but their expanse is isotropic; the volume of a pencil still requires subdividing down to the maximum hierarchy depth, despite there being less surface detail along the length of the pencil.
  • Hierarchical access in an octree is limited to 10 levels with 32-bit addressing. That means if you want to address a space with a resolution higher than 1024^3, you will need 64-bit Morton codes, which incurs a storage and compute penalty.
  • All octrees are bounded. Exceeding the bounds means at best patching and at worst rebuilding the tree. Likewise, all grids are bounded. Resizing the grid can't be done with a simple memcpy. Also, all operations that could exceed the resolution must be checked. 
  • Neighborhood queries in octrees are a mess. Going up the hierarchy is manageable, going down requires extra data; explicit edges are not present. In short: due to hierarchical storage, octree cells are not really neighbors. 
  • Because of this, local transformations in an octree cause deep changes all over the hierarchy. 
  • Raycasting requires visiting each cell with a Bresenham-like algorithm; Octrees can help here, but can't help in the pencil case, when the ray is grazing the surface. 
  • Buckets can alleviate some Octree issues, but not ultimately fix them, and incur an added storage penalty. 
  • Mesh LOD techniques are inapplicable; LOD only helps in situations where voxels become smaller than a pixel. Dual Contouring seems like it could apply, but it has no good LOD scheme, and the Nyquist limit is also forced to jump by a whole factor of 2, regardless of topological detail.
  • There is no popular voxel data format or voxel data editor; Most work that artists do these days is stored in meshes and textures, and it's impossible to import them without damaging the model's representation, let alone export them for backwards compatibility. The semantic information is destroyed. 
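
To put some numbers behind the storage and addressing points above, here is the back-of-envelope arithmetic, as throwaway Lua:

    local GB = 1024^3                        -- bytes in 1 GiB

    -- at 1 byte per voxel, 1 GiB holds a cube of roughly this side length:
    print((GB / 1)^(1/3))                    --> ~1024
    -- at 16 bytes per voxel (hermite data + material), it's only:
    print((GB / 16)^(1/3))                   --> ~406

    -- doubling the side length multiplies storage by 8:
    print(256^3 * 16 / 2^20, 512^3 * 16 / 2^20)  --> 256 MiB vs. 2048 MiB

    -- Morton key bit budget for octree addressing: 3 coordinates at 10 bits
    -- each (depth 10, i.e. 1024^3) just fit into 32 bits; anything finer
    -- needs 64-bit keys.
    print(3 * 10, 3 * 11)                    --> 30 fits, 33 doesn't
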
So I went to look into Tetrahedral meshes, also called "Finite Element Meshes", the solution I'm currently implementing. The idea is pretty straightforward: they work exactly like triangle meshes, except that the triangles are expanded by one dimension to form a volumetric mesh of tetrahedra, which tessellates space completely into solids and air. Tetrahedral meshes are largely undiscovered by the game development community (with two notable exceptions), but old hat for companies doing structural analysis simulations (think hardcore Bridge Builder) and interactive tissue surgery simulators.

Here's why I think that NOWHERE will fare much better using a tetrahedral mesh for its terrain, in comparison with the bullet list above:

  • A mesh is a graph and therefore nothing but semantic structure. Area, orientation, neighboring islands sharing attributes, and proximity are easily discernible from the provided vertex-edge-face-cell structure. The immediate 1-cell neighborhood spans large volumes of space where topological detail is low, and small volumes where topological detail is high.
  • Storage is independent of the space spanned and depends only on topological complexity, so it's a little difficult to tell how much complexity this buys you. Assuming no vertices are shared and only float positions are stored, 1 GB of VRAM covers about 29 million triangles, or 22 million tets. If those tets were spaced evenly, they would compact to a ~281^3 voxel cube, but they could span a distance bounded only by the desired float precision.
  • Rendering must be done in triangles, which is easily achieved either by pruning the mesh for surface bounding faces only, or maintaining boundary faces in a separate array during mesh operations.
  • Topological detail is taxed, but independent of scale; Size of features is only bounded by floating point precision.
  • A large area without any variance in data can occupy as few tets as possible, providing an effective compression for regions with low entropy.
  • Scaling / shearing / rotating vertices is, apart from float precision issues, lossless. C1-continuous deformations do not alter topology. Transformations that do alter topology cause at least one tet to invert, which can be detected and fixed with local remeshing; this only alters face and edge relationships, not vertex positions.
  • All classical mesh animation techniques, like bone animation, still work here. Additionally, 3D cellular automata also operate on tetmeshes. Physics simulations of softbodies or fracturing volumes on tetmeshes are well documented.
  • Meshes can use highly anisotropic scales. The pencil example would perform quite well here, tessellating space with the lowest amount of elements required. Along the length of the pencil, no extra elements need to be added.
  • Access in a tetmesh is not explicitly hierarchical, but graph based. Each tet is a node with four neighbors (using a half-face structure), and most spatial queries are done by walking along tet faces (see the walk sketch after this list). The tetmesh acts as both volume data and acceleration structure. For everything else, non-exclusive ad-hoc BVHs can be constructed.
  • A mesh is only bounded by its cells. If you need more space, add as many cells as required. This can be done locally and in a directed fashion. It is also possible to maintain the mesh within the volume of a cube with near-infinite side length.
  • Local transformations in a tetmesh are indeed only local to the hull of their connected neighborhood.
  • Raycasting is a simple neighborhood walk algorithm. In the pencil case, when the ray grazes the surface along its length, a few steps suffice to cover an enormous distance.
  • Mesh LOD techniques apply. Techniques exist to adaptively decimate meshes by storing edge collapse operations in a hierarchy (a so-called Half-Edge Tree).
  • Triangle mesh editing is the de-facto standard for models in games. Many formats and assets exist. Any 2-manifold triangle mesh can be tessellated into a tetmesh, and pruned back into a triangle mesh without any loss of information, so triangle mesh imports and exports can be easily supported.
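
To make the neighborhood-walk point concrete, here is a minimal sketch of locating the tet that contains a point by stepping across shared faces. The data layout is hypothetical (a real implementation would use a proper half-face structure and robust orientation predicates); raycasting is the same kind of walk, driven by a ray instead of a point:

    -- tets[i] = {
    --   v   = { p1, p2, p3, p4 },   -- vertex positions, positively oriented
    --   nbr = { n1, n2, n3, n4 },   -- nbr[k]: tet across the face opposite v[k], or nil
    -- }

    local function sub(a, b) return { a[1]-b[1], a[2]-b[2], a[3]-b[3] } end

    -- signed volume of (a, b, c, d): its sign tells us which side of the
    -- plane (a, b, c) the point d lies on
    local function orient3d(a, b, c, d)
        local u, v, w = sub(b, a), sub(c, a), sub(d, a)
        return u[1]*(v[2]*w[3] - v[3]*w[2])
             - u[2]*(v[1]*w[3] - v[3]*w[1])
             + u[3]*(v[1]*w[2] - v[2]*w[1])
    end

    local function locate(tets, start, p)
        local t = start
        while true do
            local v, nbr = tets[t].v, tets[t].nbr
            local moved = false
            for k = 1, 4 do
                -- swap vertex k for p; a negative signed volume means p lies
                -- beyond the face opposite vertex k
                local q = { v[1], v[2], v[3], v[4] }
                q[k] = p
                if orient3d(q[1], q[2], q[3], q[4]) < 0 then
                    local n = nbr[k]
                    if not n then return nil end   -- walked out through the boundary
                    t, moved = n, true
                    break
                end
            end
            if not moved then return t end         -- inside on all four sides
        end
    end

(In degenerate configurations a naive walk like this can cycle; real implementations randomize the face order or fall back to exact predicates.)
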
This is the very last solution I'm trying, and the one we'll most likely stick with, which means that the gameplay of the next alpha will be close to Alpha 75 again, and then some.

The only regret I have is that I didn't know all this three years ago. It took a long time to look at approaches and figure out what the game needs. We could have settled for something way earlier had I been more willing to make compromises. But at least this way I can say we picked the best solution for our problem, and the game is truly becoming the kind of game we want to play.

Regarding tetrahedral meshes and related techniques, I'm keeping a lengthy list of relevant papers that may be useful to anyone else interested in the subject here.

Tuesday, January 6, 2015

Conspire: A Programming Environment for NOWHERE

It's time for another NOWHERE tech write-up. I've been tweeting about my work on Twitter up to the point where I was nudged to write a longer blog post about what the hell I'm actually doing, so this is an attempt at doing just that: a chronological description of my trials and tribulations, and where I finally ended up.

The importance of tooling cannot be overstated. There are no tools out there for the kind of deeply procedural game we're working on, and good tooling comprises 90% of what makes the game, as nearly all of our content is procedural in one way or another, not handmade. If there's currently a lack of procedural content out there, it's precisely because of the lack of tooling.

So I set out to construct an IDE in which assembling procedures in a maintainable way became easier. Inspired by UE4's Blueprints, I began with graph-based editing as a guide, as graphs make it relatively easy to describe procedural flow. As a warm-up, I wrote two tiny C libraries: a theming library based on Blender's UI style, and a low-level semi-immediate UI library that covers laying out and processing widget trees.

The IDE, dubbed Noodles, was written on top of the fabled LuaJIT (all of 500 KB in size). The result looked like this:

Demonstrating compaction of node selections into subnodes, ad absurdum ;-)
A back-end compiler would translate these graph nodes back to Lua code to keep execution reasonably efficient, and I added support for GLSL code generation, something I had been planning to do from the beginning. I found that the ability to cover different targets (dynamic scripting, static CPU, GPU pipelines) with a single interface paradigm became somewhat of a priority.
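
To give a flavor of what that back-end did, here's a hypothetical sketch (Noodles' real data model was considerably messier; the node names and fields are made up): walk the graph from an output node and emit Lua source, which LuaJIT then compiles like any hand-written chunk.

    -- a tiny made-up node graph: constants and binary operator nodes
    local graph = {
        a   = { op = "const", value = 2 },
        b   = { op = "const", value = 3 },
        sum = { op = "add", inputs = { "a", "b" } },
        out = { op = "mul", inputs = { "sum", "sum" } },
    }
    local ops = { add = "+", mul = "*" }

    -- recursively turn a node into a Lua expression string
    local function emit(g, name)
        local node = g[name]
        if node.op == "const" then
            return tostring(node.value)
        end
        local lhs, rhs = emit(g, node.inputs[1]), emit(g, node.inputs[2])
        return ("(%s %s %s)"):format(lhs, ops[node.op], rhs)
    end

    local src = "return " .. emit(graph, "out")
    print(src)                  --> return ((2 + 3) * (2 + 3))
    print(loadstring(src)())    --> 25

A real emitter would of course bind shared nodes (like "sum" above) to locals instead of duplicating them.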

A simple GLSL shader in nodes, with output visible in the background.
The workflow was pretty neat for high level processing, but working with the mouse wasn't fast enough to construct low-level code from scratch; refactoring, though, was way easier.

Noodles, shortly before I simplified the concept. Live-editing the OpenGL code for a cube rotating in the background.
I still didn't have much of an idea what the semantics of programming with nodes were going to be. I felt that the system should be able to analyze and modify itself, but a few design issues cropped up. The existing data model was already three times more complex than it needed to be. The file format was kept in text form to make diffing possible, and the clipboard also dealt with nodes in text form, but the structure was too bloated to make manual editing feasible. The foundation was too big, and it had to become lighter before I felt ready to work on more advanced features.

At this point, I didn't know much about building languages and compilers. I researched what kinds of existing programming languages were structurally compatible with Noodles, and with data flow programming in general. They should be completely data flow oriented, and therefore of a functional nature. The AST must be simple enough to make runtime analysis and generation of code possible. The system must work without a graphical representation, and be ubiquitous enough to retarget for many different domain-specific graphs.

It turned out the answer had been there all along. Since 1958, to be exact.

Or 1984, if we start with SICP. Apparently everyone but me has taken CS courses and knows this book and the fabled eval-apply duality. I never came into contact with Lisp or Scheme early on, something I consider the biggest mistake of my professional career. There are two XKCD comics that are relevant to my discovery here:


Did you know the first application of Lisp was AI programming? A language that consists almost exclusively of procedures instead of data structures. I had the intuitive feeling that I had found exactly the right level of abstraction for the kind of problems we are and will be dealing with in our game.

My first step was changing the computational model to a simple tree-based processing of nodes. Here's the flow graph for a fold/reduce function:

Disassemble a list, route out processing to another function, then reassemble the list
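
For comparison, the same fold written as plain Lua; the graph is essentially this, drawn as boxes and wires:

    -- take the list apart, route each element through a combining function,
    -- and thread an accumulator back through
    local function fold(fn, acc, list)
        for _, x in ipairs(list) do
            acc = fn(acc, x)
        end
        return acc
    end

    print(fold(function(a, b) return a + b end, 0, { 1, 2, 3, 4 }))  --> 10
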
I figured out a way to do first-order functions in a graph, and did a little demonstrative graphic about it.
While these representations are informative to look at, they're neither particularly dense nor easy to construct, even with an auto-complete context box at your disposal. You're also required to manually lay out the tree as you build it; while relaxing, this is not particularly productive.

It became clear that the graph could be compacted where relationships were trivial (that is: tree-like), in the way Scheme Bricks does it:

Not beautiful, but an interesting way to compact the tree
And then it hit me: what if the editor were built from the ground up on Lispy principles: the simplest graphically based visualization possible, extensible from within the editor, so that the editor would effectively become an editor-editor, an idea I've been pursuing in library projects like Datenwerk and Soil. Work on Noodles ended, and Noodles was salvaged for parts to put into the next editor, named Conspire.

A very early screenshot. Atoms are rendered as widgets, lists are turned into layout containers with varying orientation. 
At its heart, Conspire is a minimal single-document editor for a simplified S-expression tree that only knows four immutable data types: lists (implemented as Lua tables), symbols (mapped to Lua strings), strings (a Lua string with a prefix to distinguish it from symbols) and numbers (mapped to the native Lua datatype).
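
In Lua terms, that whole data model fits in a handful of lines. The prefix byte shown here is made up; only the mapping itself is the point:

    local STRING_PREFIX = "\1"              -- hypothetical marker byte

    local function sym(name) return name end                   -- symbols are plain Lua strings
    local function str(text) return STRING_PREFIX .. text end  -- strings carry a prefix

    -- a made-up expression (ui-slider (:step 100) "label" 42) as an immutable tree
    local expr = {
        sym("ui-slider"),
        { sym(":step"), 100 },
        str("label"),
        42,
    }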

By default, Conspire maps editing to a typical text editing workflow with an undo/redo stack and all the familiar shortcuts, but the data only exists as text when saved to disk. The model is an AST; the view is that of a text editor.

Rainbow parentheses make editing and reading nested structures easier.
Conspire can be extended to recognize certain expressions in the same way (define-syntax) works in Scheme, and style these expressions to display different controls or data:

A numerical expression is rendered as a draggable slider that alters the wrapped number.
In the example above, the expression (ui-slider (:step 100 :range (0 1000)) <value>) is rendered as a slider widget that, when dragged with the mouse, alters the <value> slot in the AST the view represents. The operations are committed to the undo buffer the same way any other editing operation would be.

Using this principle, the editor can be gradually expanded with more editing features. One of the first things that I added was the ability to nest editors within each other. The editor's root document then acts as the first and fundamental view, bootstrapping all other contexts into place:

The root document with unstyled markup. csp-edit declares a new nested editor.
Hitting the F2 key, which toggles styling, we immediately get the interpreted version of the markup document above. The referenced documents are loaded into the sub-editors, with their own undo stack:

Tab and Shift+Tab switch between documents.
New views and controllers such as a Noodles-like graph editor could be implemented as additional AST transformers, allowing the user to shape the editor into whatever it needs to be for the tasks at hand, which in our case will be game programming.

The idea here is that language and editor become a harmoniously inter-operating unit. I'm comparing Conspire to a headless web browser where HTML, CSS and JavaScript have all been replaced with their S-expression-based equivalents, so all syntax trees are compatible with each other.

I've recently integrated the Terra low-level extensions for Lua and am now working on a way to seamlessly mix interpreted code with LLVM-compiled instructions, so the graphics pipeline can run completely through Conspire, be scripted even while it is running, and yet keep a C-like performance level. Without the wonders of Scheme, all these ideas would have been unthinkable within the timeframe we're covering.
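
For readers who haven't met Terra: a Terra function is statically typed, compiled through LLVM, and still directly callable from the surrounding interpreted Lua. A minimal sketch of what that mix looks like (not Conspire's actual pipeline):

    -- Terra code embedded in a Lua file; the first call from Lua triggers
    -- LLVM compilation, after which it runs at native speed
    terra axpy(a : double, x : double, y : double) : double
        return a * x + y
    end

    print(axpy(2.0, 3.0, 4.0))   --> 10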

Oh, and there's of course one more advantage of writing your editor from scratch: it runs in a virtual reality environment out of the box.

Conspire running on a virtual hemispheric screen on the Oculus Rift DK2



Sunday, November 23, 2014

NOWHERE Progress Report November 2014



Leonard wrote in the Nowherian forum about our current progress with NOWHERE:

Right now is, hands down, the worst part of development. There are no fancy graphics to show off, no intricate gameplay, no surprising AI, no badass music, just unglamorous system design that interests no one so we can get all the aforementioned stuff in a manageable form that doesn't keep becoming a sluggish and unserviceable mess. 

Sunday, July 20, 2014

NOWHERE Progress Report: It's Going Great & Terrible At The Same Time


This post is partly an explanation of what's currently going on, partly an attempt to summarize the situation for ourselves.

First, the bad news.

With the final release of NOWHERE scheduled for the end of 2015, we're currently about 30% into development, and tech-wise, it's going great. As you know, our goal is to be 99% asset free, that is: all assets are generated on the player's computer, for the player's world, and the player will also have a chance to guide this process in the game. I've finished prototyping and embedding the meshing & landscaping tech, written a new procedural audio engine, and brought the procedural model generator to productive speeds, on which I'll write more at a later date.

We solved some tough design issues concerning Nowherian world structure, body physique and society building, although it all still exists only in thought and on paper, and none of it is implemented yet. Alongside improving our procedural authoring tools, I'm currently in the process of laying the foundation for world persistence, which is a demanding challenge. You can track my progress online at our open source repositories. I can't wait to finally work more on the actual content.

I'm sorry, I messed up: that was the good news! So, now the bad news:

We have not only been greenlit on Steam (so could theoretically release NOWHERE as Early Access game any day), but have also been accepted to talk about NOWHERE at the GDC Europe Innovative Games Showcase in August (out of what I imagine to be hundreds of applicants).

Wait, that's actually good news again. Here's the bad news. For real now.

In light of the rising complaints about Early Access games being released too early, we wanted the next alpha release to make a good impression, and so we overran our deadlines numerous times to get to a point where the game would be presentable enough for an Early Access crowd. (And we do need that crowd. The founding campaign on our own website isn't attracting nearly as many supporters as we need to cover funding for the complete development time.)

The result is that we're broke, phenomenally so. The Humble Store revenues for this month won't sufficiently cover our expenses; Sylvia's dad lent us €1k to cover this month, but the situation keeps repeating. This month we're only getting $250 in revenues, but we need about $2k to cover rent, utilities, food, etc.

The original plan was that a Steam release this month would give us sufficient revenues on paper (both Humble and Steam pay out revenue with a one-month delay; e.g. this month we've made about $360, which is only due for payout at the end of August), so we could borrow a little more, knowing that we'd be able to pay it back soon enough and that the risk would be minimal. Alas, I can't seem to find an end for this alpha just yet, at least not one that would attract enough new interest. We gambled too hard.

I admit, we're really bad at advertising for ourselves. Talk is cheap and people want to see results, which is why I dug deep into what I love to do (writing sweet sweet game tech), and avoided doing anything that would not further development directly, such as video promotion, more interviews, and so on. This was probably not a good strategy. Now we need to find a way to fix this.

I would like to repeat that we are not ever going to give up. We're agile enough to deal with setbacks, and we own 100% of our project. This game is going to get made, whatever it takes. This is the work of a lifetime, and there is no other project we'd rather work on.

If you would like to help us out financially, and you feel you can afford to spare a small contribution on a monthly basis, please have a look at our Patreon page. Patreon contributions reach us sooner than any other revenue source.

Our fundraiser is of course also still running.

We would also like to thank all our founders and supporters again for your trust and contributions; you're making this project possible, and you deserve to see an excellent outcome.

Update: my amazing mom read our blog article and, since my birthday is coming up, spontaneously decided to send us a little "birthday present" (and offered emergency loans in case this happens again), so August is saved. You all have moved heaven and earth in the past days to get us back on track too, and it worked! September appears almost covered now as well. You all are incredible and we are lucky to have such strong support! I'm back to working on the next release, and we'll be able to do one or two more alphas for founders before the Steam Early Access release. We're aiming for early August. Let's hope the trip goes a bit smoother from here on.



Thursday, May 8, 2014

Voxel Mesh Hybrids: A Walkthrough

Six more weeks have passed since I first posted about Adaptive Volumetric Meshes, which have since turned into Voxel Mesh Hybrids after some refinements for the sake of simplicity and ease of use. It's the driving tech behind the procedural and player-authored meshing that our game in production, NOWHERE (alpha available for download), requires.

After the recent refinements, it's time for another short write-up, as the work has reached a stable state and is nearly ripe for release as part of the next Alpha later this month. Beware: from here on it gets technical.

Friday, April 4, 2014

NOWHERE - News From The Business End

Aside from all the hardcore video game making going on, a bunch of super exciting business related events happened during the last few weeks as well. Enough reason to post a summary. Hold on to your seats.


March 4th: Thanks to everyone involved (which most likely includes you), NOWHERE has been greenlit on Steam. That means our greatest hurdle to a Steam release has been overcome. All existing and future buyers of the game will be able to redeem their Steam keys after launch. NOWHERE is scheduled to launch April 15th 2014 (Update: Steam launch and new alpha delayed, new alpha scheduled for June 10th!) in the Early Access category, along with a new alpha build and an accompanying trailer. Until then, you can follow new updates on NOWHERE’s Steam Greenlight page. A huge thank you to everyone who voted for us and supported us, you are amazing and the reason we got this far.


March 17th: All founders registered for access to the Founders Lounge at the Nowherian forum can now download all previously released alphas from its Release Archive, thanks to the tireless work of forum administrator Frame. If you are a founder, you are eligible for the Founders Lounge! To add founder status to your forum account, just send us an email to support-at-duangle-dot-com and mention your forum username and purchase email address. If you bought the game via Humble Store please send us a screenshot of the purchase or forward the purchase confirmation mail. I'll process every request as quickly as possible. Thank you for your patience!


March 23rd: A few founders asked if it were possible to support the development of NOWHERE on a monthly basis, so we created a Patreon site. You can also pledge smaller amounts monthly to get a copy of Nowhere. Patrons pledging at least $21 will be listed in the Patreon section of the game's credits (ranked by contribution).


March 24th: Nowhere has been launched on the official Humble Store. Please note that due to technical reasons these sales go into another bucket; the full list of reward tiers can only be purchased on our site.


March 30th: 1,000 founders (currently 1,042) have bought and supported NOWHERE. THAT IS SERIOUSLY TOTALLY NUTS YOU GUYS.


March 31st: We visited Rezzed this weekend, listened to great dev talks and met a bunch of great people. Although videogamesing is awesome, it's good to leave the cave and do something different from time to time. Birmingham was lovely. What in the world beats vintage cheddar? Nothing. Not even our game.

April 1st: We’re thrilled to report that Duangle got acquired by Mountain Dew for a round $2 billion sum and is renamed to Dewangle. All Nowherian bodies will be branded with appropriate new corporate identity. An exciting synergy only for people who have no idea what’s been going on lately!

April 2nd: Duncan Harris, curator and creator of the lovely website Dead End Thrills published a long interview with Leonard and me in his Rock Paper Shotgun column. Check it out!


April 11th, 2014: The distant future. Nowhere will be playable at PAX, at the "Joyful Bewilderment" installation organized by BigSushi.fm, starting April 11th between 8 p.m. and 2 a.m. in Boston, MA, USA. We're not sure yet, but either Alpha 75 or Alpha 91 will be shown.

Monday, March 24, 2014

New Tech Unlocked: Adaptive Volumetric Meshes


This week I reached a major milestone in my implementation of the new general meshing tech for our game in production, NOWHERE (alpha available for download), which is going to replace the previous implementation we used for sculpting and will do a better job at covering the different procedural modelling approaches the game requires. To commemorate the occasion, I did a little write-up. Beware: from here on it gets quite technical. I interspersed the article with a few screenshots from the work on the prototype. These are not screenshots of the game.