After seeing Forrest Higgs working on Voronoi isolates (a version of a Voronoi diagram that starts with lines instead of points), I was inspired to write a similar program, which turned out well. The basic algorithm is to expand pixels outward from each color-coded line until they meet another color. If a pixel neighbors more than one color, it gets painted black; rinse and repeat until there's nothing white left.
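The expansion step can be sketched roughly like this — a minimal version that works on a map from coordinates to pixel values rather than a real image buffer (all the names here are illustrative, not from the actual program):

```haskell
import qualified Data.Map as M
import Data.List (nub)

data Pixel = White | Black | Color Int deriving (Eq, Show)

type Grid = M.Map (Int, Int) Pixel

neighbors :: (Int, Int) -> [(Int, Int)]
neighbors (x, y) = [(x+1,y), (x-1,y), (x,y+1), (x,y-1)]

-- One round of expansion: each white pixel looks at its colored
-- neighbors. One distinct color claims it; two or more paint it black.
step :: Grid -> Grid
step g = M.mapWithKey claim g
  where
    claim p White =
      case nub [c | n <- neighbors p, Just (Color c) <- [M.lookup n g]] of
        []  -> White
        [c] -> Color c
        _   -> Black
    claim _ px = px

-- Rinse and repeat until nothing changes, i.e. until there's no
-- reachable white left.
expand :: Grid -> Grid
expand g = let g' = step g in if g' == g then g else expand g'
```

For example, a row `Color 1, White, White, White, Color 2` expands from both ends until the two color fronts collide, and the pixel where they meet turns black.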
It was a bit of a struggle to get PNG images edited in Haskell, but eventually I realized that pixbufGetPixels isn't really returning an array; it's returning an array-typed pointer into the Pixbuf's own memory. That makes sense from a practical memory-usage standpoint, but it isn't what I expected from the context.
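The practical upshot is that the buffer behind pixbufGetPixels is one flat run of bytes, so reaching pixel (x, y) means computing an offset from the image's rowstride and channel count yourself, roughly like this (a sketch under my reading of the gtk2hs API — you'd then hand the offset to readArray/writeArray on the returned PixbufData):

```haskell
-- Byte offset of the first channel of pixel (x, y) in the flat
-- buffer that pixbufGetPixels exposes. rowstride is the number of
-- bytes per image row (it may include padding, so it isn't always
-- width * nChannels).
pixelOffset :: Int -> Int -> Int -> Int -> Int
pixelOffset rowstride nChannels x y = y * rowstride + x * nChannels

-- e.g. with rowstride 96 and 3 channels (RGB), pixel (5, 2) starts at
-- pixelOffset 96 3 5 2
```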
After that I vectorized the edges, which took quite a bit of thought. First, line segments were created between the centers of adjacent black pixels; then segments were created from the center of each black pixel to any corner of that pixel where there was a color change between neighboring pixels; and lastly, segments between differing colors. That left a bunch of empty sets and repeated segments, which were filtered out, resulting in about 4400 segments for this test image. The next step was to lengthen vectors that are near each other and point in the same direction. That reduced the segment count for this test to about 3700; it could probably be much lower if I hadn't limited the angles (0, 45, 90, and 135 degrees), or if the underlying lines were horizontal, vertical, and perfectly diagonal.
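The lengthening step amounts to fusing segments that continue each other in the same direction. Here's a minimal sketch, assuming the segments have already been sorted into chain order (endpoint to endpoint) — the types and names are mine, not the program's:

```haskell
type Point   = (Int, Int)
type Segment = (Point, Point)

-- With angles limited to 0/45/90/135 degrees, a segment's direction
-- reduces to a pair of signs.
direction :: Segment -> (Int, Int)
direction ((x1,y1),(x2,y2)) = (signum (x2 - x1), signum (y2 - y1))

-- Walk the chain; whenever one segment ends where the next begins and
-- both point the same way, extend the first over the second.
mergeChain :: [Segment] -> [Segment]
mergeChain (a@(p1,p2) : b@(q1,q2) : rest)
  | p2 == q1 && direction a == direction b = mergeChain ((p1,q2) : rest)
  | otherwise                              = a : mergeChain (b : rest)
mergeChain xs = xs
```

So two unit diagonals `(0,0)-(1,1)` and `(1,1)-(2,2)` collapse into one segment `(0,0)-(2,2)`, while a following horizontal segment is left alone.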
For now I can't go much further without hardware to test on...