Wednesday, January 19, 2011

My Recent Failures in Polygon Clipping

I have been prototyping some prerequisites for cookie-cutter temporary obstacles recently. And the short story is that it won't happen. The longer story is that I had a couple of different ideas about how I would implement the clipping.


First Attempt


I have always wanted to try out the dynamic navmesh subdivision algorithm from AGPW4. The same algorithm is also used in the FEAR SDK. The idea is that to subtract polygon A from polygon B, you split polygon B with each edge of polygon A. You know, like BSP.
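To make this concrete, here is a rough sketch of the kind of split I mean, assuming 2D float vertices and convex input. The names (Vec2, splitPoly) and the epsilon are mine, not lifted from AGPW4 or the FEAR SDK.

    #include <vector>

    struct Vec2 { float x, y; };

    // Signed area of triangle (a,b,c); > 0 means c lies to the left of edge a->b.
    static float triArea(const Vec2& a, const Vec2& b, const Vec2& c)
    {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    // Split convex polygon 'poly' with the infinite line through edge a->b.
    // Vertices on the left go to 'front', on the right to 'back'; the
    // intersection points are added to both. This is where the extra
    // vertices come from.
    static void splitPoly(const std::vector<Vec2>& poly, const Vec2& a, const Vec2& b,
                          std::vector<Vec2>& front, std::vector<Vec2>& back)
    {
        const float eps = 1e-6f; // the epsilon fest starts here
        const int n = (int)poly.size();
        for (int i = 0, j = n - 1; i < n; j = i++)
        {
            const Vec2& p = poly[j];
            const Vec2& q = poly[i];
            const float dp = triArea(a, b, p);
            const float dq = triArea(a, b, q);
            // The edge p->q crosses the line: emit the intersection to both sides.
            if ((dp > eps && dq < -eps) || (dp < -eps && dq > eps))
            {
                const float t = dp / (dp - dq);
                const Vec2 v = { p.x + (q.x - p.x) * t, p.y + (q.y - p.y) * t };
                front.push_back(v);
                back.push_back(v);
            }
            if (dq > eps) front.push_back(q);
            else if (dq < -eps) back.push_back(q);
            else { front.push_back(q); back.push_back(q); } // on the line
        }
    }

Subtracting a convex obstacle then means splitting the current fragment with each obstacle edge in turn: the piece outside the obstacle is kept as output, the piece inside carries on to the next edge, and whatever survives all the edges lies inside the obstacle and gets thrown away.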

The good thing is that the algorithm is really simple to implement and the resulting polygons are also convex. The bad thing is that it creates a lot of extra vertices and polygons. After a few overlapping intersections you are left with nothing but magic fairy dust. And not the good kind.

I did a couple of tests to remove the extra vertices. Some tests were successful, but overall the whole effort just turned into a giant epsilon fest. That is not the kind of code I would like to support and maintain.

So the lesson was that if you choose to use the BSP-like algorithm to cut your polygons, make sure your code can handle T-junctions. That way you can always track which edges to connect without any additional epsilon magic.


Second Attempt


I think I might have gotten the previous attempt to work to some degree with a lot of sweat. But the fact that it would create huge amounts of polygons really turned me away from it. I like how grid-based methods make it easy to handle obstacles. So my second attempt was to use that.

The idea was that I would create a grid which covers the whole polygon plus some padding and rasterize the obstacles into that grid. Then I would use monotone partitioning to create polygons from that raster data. Finally, I would clip those polygons against the edges of the original polygon.
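For illustration, a minimal sketch of the rasterization step, assuming convex obstacle polygons with CCW winding and marking a cell when its center falls inside an obstacle. The Grid struct and the helper names are made up for this example, and the monotone partitioning step is left out.

    #include <vector>

    struct Vec2 { float x, y; };

    struct Grid
    {
        float originX, originY, cellSize;
        int width, height;
        std::vector<unsigned char> blocked; // width*height cells, 1 = obstacle
    };

    // Point-in-convex-polygon test, assuming CCW winding: p is inside if it
    // lies to the left of (or on) every edge.
    static bool pointInConvexPoly(const Vec2& p, const std::vector<Vec2>& poly)
    {
        const int n = (int)poly.size();
        for (int i = 0, j = n - 1; i < n; j = i++)
        {
            const Vec2& a = poly[j];
            const Vec2& b = poly[i];
            if ((b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x) < 0.0f)
                return false;
        }
        return true;
    }

    // Mark every cell whose center lies inside the obstacle.
    static void rasterizeObstacle(Grid& grid, const std::vector<Vec2>& obstacle)
    {
        for (int y = 0; y < grid.height; ++y)
        {
            for (int x = 0; x < grid.width; ++x)
            {
                const Vec2 c = { grid.originX + (x + 0.5f) * grid.cellSize,
                                 grid.originY + (y + 0.5f) * grid.cellSize };
                if (pointInConvexPoly(c, obstacle))
                    grid.blocked[y * grid.width + x] = 1;
            }
        }
    }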

While this was a kinda silly idea, I think it was worth pursuing. It gave me a lot of ideas, which I will discuss later. Anyway, this idea suffers from the same extra-vertex mania.


Third Attempt

Despite a lot of failing, I was not done yet! As a last resort I tried to use a proper intersection and tessellation library. While I was able to solve the case of one polygon with multiple obstacles, the performance was not too tempting. The GLU tessellator is robust and scales pretty well, but things are different when you want to do things in under 1 ms.
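For reference, this is roughly how the GLU tessellator can be driven to do the subtraction via winding rules: the polygon goes in as a CCW contour, the obstacle as a CW contour, and GLU_TESS_WINDING_POSITIVE keeps only the area covered by the polygon but not by the obstacle. The callback bodies are stubbed out and memory handling is skipped, so treat this as a sketch rather than the code I actually profiled.

    #include <GL/glu.h>
    #include <vector>

    static void tessBegin(GLenum type) { /* start a new triangle fan/strip/list */ }
    static void tessVertex(void* data) { /* append (GLdouble*)data to the current primitive */ }
    static void tessEnd() { /* close the current primitive */ }
    static void tessCombine(GLdouble coords[3], void* vertexData[4], GLfloat weight[4], void** out)
    {
        // New vertices created where contours intersect; these are exactly the
        // extra vertices that later need welding back onto the polygon edges.
        GLdouble* v = new GLdouble[3]; // leaked for brevity in this sketch
        v[0] = coords[0]; v[1] = coords[1]; v[2] = coords[2];
        *out = v;
    }

    // 'poly' is wound CCW, 'obstacle' CW, both as flat x,y,z triples (z = 0).
    static void clipPolyAgainstObstacle(std::vector<GLdouble>& poly, std::vector<GLdouble>& obstacle)
    {
        GLUtesselator* tess = gluNewTess();
        gluTessProperty(tess, GLU_TESS_WINDING_RULE, GLU_TESS_WINDING_POSITIVE);
        gluTessCallback(tess, GLU_TESS_BEGIN, (void (*)())tessBegin);
        gluTessCallback(tess, GLU_TESS_VERTEX, (void (*)())tessVertex);
        gluTessCallback(tess, GLU_TESS_END, (void (*)())tessEnd);
        gluTessCallback(tess, GLU_TESS_COMBINE, (void (*)())tessCombine);

        gluTessBeginPolygon(tess, 0);
        gluTessBeginContour(tess);
        for (size_t i = 0; i < poly.size(); i += 3)
            gluTessVertex(tess, &poly[i], &poly[i]);
        gluTessEndContour(tess);
        gluTessBeginContour(tess);
        for (size_t i = 0; i < obstacle.size(); i += 3)
            gluTessVertex(tess, &obstacle[i], &obstacle[i]);
        gluTessEndContour(tess);
        gluTessEndPolygon(tess); // callbacks fire here with the clipped geometry

        gluDeleteTess(tess);
    }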

That would still have left me with all kinds of epsilon tests to handle the (necessary) new vertices added at the polygon edges. Vertex welding is not a particularly complicated thing to do, but it can and will create degenerate and flipped geometry in odd cases. I don't like odd cases.
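The welding itself is just the snap-together part; a naive sketch (made-up names, brute-force O(n^2) search) looks roughly like this. The trouble starts afterwards, when triangles whose corners got snapped together turn degenerate or flip.

    #include <vector>

    struct Vec2 { float x, y; };

    // Snap together all vertices closer than 'eps' and return, for each input
    // vertex, the index of the vertex it was welded to.
    static std::vector<int> weldVertices(std::vector<Vec2>& verts, const float eps)
    {
        std::vector<int> remap(verts.size());
        std::vector<Vec2> unique;
        for (size_t i = 0; i < verts.size(); ++i)
        {
            int found = -1;
            for (size_t j = 0; j < unique.size(); ++j)
            {
                const float dx = verts[i].x - unique[j].x;
                const float dy = verts[i].y - unique[j].y;
                if (dx*dx + dy*dy < eps*eps) { found = (int)j; break; }
            }
            if (found < 0) { found = (int)unique.size(); unique.push_back(verts[i]); }
            remap[i] = found;
        }
        verts.swap(unique);
        return remap;
    }

    // After remapping indices through 'remap', any triangle whose corners
    // collapsed together (or whose winding flipped) still has to be detected
    // and removed; that is where the odd cases live.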


While my attempts to create cookie-cutter obstacles have failed, the process has sprouted some great ideas to pursue. The one big lesson I have learned from this research so far is that it is really interesting to write code which has a known upper bound, be it memory usage or processing requirements. And I think that is the key to making something dynamic happen quickly: hard, known, bite-sized limits in computation and memory.

2 comments:

  1. Mikko,

    Did you pursue your second attempt in this post? We are starting to consider ways of dynamically updating our nav mesh.

    Shawn.

  2. I eventually went with storing compressed 2D heightfields and running a simplified version of Recast at runtime [1]. See DetourTileCache in the SVN.

    [1] http://digestingduck.blogspot.com/2011/04/temporary-obstacle-progress.html
