We calculate optical flow in the video to recover how the camera and objects move. By smoothing the projection matrix over time, we can stabilize a shaky video.

# Game Design Pitch: Fate Weaver

An AR game design that tries to make full use of depth sensors (Kinect for Windows, etc.), everyday digital devices (tablets, smartphones, etc.), and computer vision algorithms at a reasonable cost, to refine the way people interact with the computer in AR gaming, with limited tech resources and unlimited imagination.

# A Fast Minimum Vertex Cover Approximation

Although I think the discovery was partly a coincidence, I still love this algorithm, so I copied part of our final report and posted it here.

**Optimal Solution by Clique Method**

We came up with a method based on the observation that the minimum vertex cover of a clique with v vertices has size v-1, since the vertices of a clique are fully connected: leaving out any two vertices would leave the edge between them uncovered. We therefore call it the clique method.

We use a greedy algorithm to separate the graph into cliques:

First, sort the vertices by degree in descending order. Pick the first vertex in the sorted list as a one-vertex clique, then walk along the rest of the list: if adding a vertex to the clique still yields a clique (i.e., the vertex is adjacent to every vertex already in it), that vertex joins the clique. When we reach the end of the list, we have a clique of size at least 1. Then remove the clique's vertices and all edges incident to them from the graph G, obtaining a new graph G' and a new degree-sorted list of vertices. Iterate this procedure until G is represented by a set of cliques. Vertices of degree 0 left in G' are treated as single-vertex cliques.
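The greedy decomposition above can be sketched as follows. This is my own reconstruction from the description, not the report's code; the adjacency-matrix representation and the function name are assumptions.

```cpp
#include <algorithm>
#include <vector>

// Greedy clique decomposition (a sketch of the procedure described above).
// Repeatedly: sort the remaining vertices by current degree, grow a clique
// from the highest-degree vertex, then delete the clique from the graph.
// Degree-0 leftovers naturally become single-vertex cliques.
std::vector<std::vector<int>> cliqueDecompose(const std::vector<std::vector<bool>>& adj) {
    int n = (int)adj.size();
    std::vector<bool> removed(n, false);
    std::vector<std::vector<int>> cliques;
    int left = n;
    while (left > 0) {
        // Current degree, counting only vertices still in the graph.
        auto degree = [&](int v) {
            int d = 0;
            for (int u = 0; u < n; ++u)
                if (!removed[u] && adj[v][u]) ++d;
            return d;
        };
        // Sort the remaining vertices by degree, descending.
        std::vector<int> order;
        for (int v = 0; v < n; ++v)
            if (!removed[v]) order.push_back(v);
        std::sort(order.begin(), order.end(),
                  [&](int a, int b) { return degree(a) > degree(b); });
        // Start a clique with the first vertex, extend greedily along the list.
        std::vector<int> clique{order[0]};
        for (size_t i = 1; i < order.size(); ++i) {
            int v = order[i];
            bool joinsAll = true;
            for (int u : clique)
                if (!adj[v][u]) { joinsAll = false; break; }
            if (joinsAll) clique.push_back(v);
        }
        for (int v : clique) removed[v] = true;  // remove the clique from G
        left -= (int)clique.size();
        cliques.push_back(clique);
    }
    return cliques;
}
```

On a triangle plus an isolated vertex, for example, this yields the triangle as one clique and the isolated vertex as a singleton.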

After the graph is separated into cliques (single-vertex cliques may occur), observe that whenever we remove vertices from the graph and check whether the remainder still forms a vertex cover, we can delete at most one vertex from each clique. In this phase, we try all possible answers by brute force, assigning a status to every clique: either some vertex vi of the clique is deleted, or no vertex is deleted. So for every clique with n vertices there are n+1 statuses.

We optimized the processing of the cliques. Statuses are assigned from the first clique to the last (choosing which vertex to delete, or none). If a status makes two deleted vertices from distinct cliques share an edge, we reject it immediately, adjust the status, and try again. This avoids unnecessary iterations.
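The status enumeration with early rejection can be sketched as a small recursion. Again this is a reconstruction under assumptions (adjacency matrix, cliques as vertex lists, and it returns only the cover size); the names are mine.

```cpp
#include <algorithm>
#include <vector>

// Brute-force phase (sketch): for clique k, status s < size means "delete
// vertex cliques[k][s]", status s == size means "delete nothing". A set of
// deletions is valid only if no two deleted vertices share an edge, since
// every edge must keep at least one endpoint in the cover. Returns the
// smallest vertex cover size found; the cover is "all vertices minus the
// deleted ones".
int bestCoverRec(const std::vector<std::vector<bool>>& adj,
                 const std::vector<std::vector<int>>& cliques,
                 std::vector<int>& deleted, size_t k, int n) {
    if (k == cliques.size()) return n - (int)deleted.size();
    int best = n;  // trivial upper bound: keep every vertex
    const auto& c = cliques[k];
    for (size_t s = 0; s <= c.size(); ++s) {
        if (s < c.size()) {
            int v = c[s];
            bool clash = false;
            for (int u : deleted)
                if (adj[v][u]) { clash = true; break; }  // two deletions share an edge
            if (clash) continue;                          // reject this status early
            deleted.push_back(v);
            best = std::min(best, bestCoverRec(adj, cliques, deleted, k + 1, n));
            deleted.pop_back();
        } else {
            // "delete nothing" status is always valid
            best = std::min(best, bestCoverRec(adj, cliques, deleted, k + 1, n));
        }
    }
    return best;
}

int bestCoverSize(const std::vector<std::vector<bool>>& adj,
                  const std::vector<std::vector<int>>& cliques) {
    std::vector<int> deleted;
    return bestCoverRec(adj, cliques, deleted, 0, (int)adj.size());
}
```

For the path 0-1-2 decomposed into cliques {0,1} and {2}, deleting 0 and 2 leaves the valid cover {1}, so the result is 1.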

After the procedure finishes, all the candidate vertex covers are compared, and we can conclude with the minimum vertex cover. The complexity depends on how well we can separate the graph into cliques: roughly speaking, if a graph with n vertices is separated into m cliques, the complexity is O((n/m)^m).

Essentially, the procedure of translating the cliques into a minimum vertex cover enumerates every candidate vertex cover; no matter how we optimize it, it is time-consuming, though it guarantees an optimal solution.

The reason it is time-consuming is that the input graph is too large; for a small graph the running time is acceptable. Thus we decided to preprocess the original graph to make it smaller, which improves efficiency. That is where the following method comes from.

**Approximation Solution by Preprocessing before Clique Method**

Sort the vertices by their degree in descending order.

Pick a greedy 2-approximation from the sorted vertex list above, from the vertex with the most neighbours to the one with the fewest.
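One plausible reading of this step is sketched below; the report does not give code, so this is my reconstruction. It walks the vertices in descending-degree order and, whenever the current vertex still has an uncovered incident edge, adds both endpoints of that edge, since taking both endpoints of a maximal matching is what gives the factor-2 guarantee.

```cpp
#include <algorithm>
#include <set>
#include <vector>

// Greedy 2-approximation (sketch): scan vertices from highest to lowest
// degree; for each vertex with an edge not yet covered, put both endpoints
// of that edge into the cover. The chosen edges form a maximal matching,
// so the cover is at most twice the optimum.
std::set<int> twoApprox(const std::vector<std::vector<bool>>& adj) {
    int n = (int)adj.size();
    std::vector<int> order(n);
    for (int v = 0; v < n; ++v) order[v] = v;
    auto degree = [&](int v) {
        int d = 0;
        for (int u = 0; u < n; ++u)
            if (adj[v][u]) ++d;
        return d;
    };
    std::sort(order.begin(), order.end(),
              [&](int a, int b) { return degree(a) > degree(b); });
    std::set<int> cover;
    for (int v : order) {
        if (cover.count(v)) continue;
        for (int u = 0; u < n; ++u) {
            if (adj[v][u] && !cover.count(u)) {  // edge (v,u) still uncovered
                cover.insert(v);
                cover.insert(u);
                break;
            }
        }
    }
    return cover;
}
```

On a star with center 0 and three leaves, the scan picks the center first and covers everything with one edge's endpoints, giving a cover of size 2 against an optimum of 1.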

Pick the “boundary” vertices of the 2-approximation above. For each vertex in the 2-approximation, if some edge of the graph has that vertex as one endpoint and a vertex outside the 2-approximation as the other, we label it a “boundary” vertex.

Remove all the boundary vertices from the graph along with the edges they cover. Sort the vertices of the new graph, pick a greedy 2-approximation, and find the boundary vertices again; repeat until none of the vertices in the 2-approximation is a boundary vertex. We call the remaining graph the “core”. Usually the core is very small.
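The boundary test itself is simple; a minimal sketch, assuming the cover is held as a set over an adjacency-matrix graph (the function name is mine):

```cpp
#include <set>
#include <vector>

// Boundary test (sketch): a vertex of the cover is a "boundary" vertex when
// it has at least one neighbour outside the cover, i.e. some edge with
// exactly one endpoint in the cover. The peeling loop described above removes
// these vertices, recomputes the 2-approximation on what is left, and repeats
// until no boundary vertices remain; the remainder is the "core".
std::set<int> boundaryVertices(const std::vector<std::vector<bool>>& adj,
                               const std::set<int>& cover) {
    std::set<int> boundary;
    for (int v : cover) {
        for (int u = 0; u < (int)adj.size(); ++u) {
            if (adj[v][u] && !cover.count(u)) {
                boundary.insert(v);
                break;
            }
        }
    }
    return boundary;
}
```

For the path 0-1-2-3 with cover {1, 2}, both cover vertices have a neighbour outside the cover, so both are boundary vertices.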

Apply the clique method to the small core to find its minimum vertex cover, then take the union with the boundary vertices we deleted along the way; that is the minimum vertex cover this method can achieve. The complexity of the algorithm can be O(n^2).

The motivation behind this method is that the clique algorithm works better when there are more edges (in that case the graph translates into fewer cliques), while the greedy 2-approximation works better when the graph has fewer edges, which leads to a faster approximation. Combining the two achieves good performance on general graphs. It is intuitive that this method offers only an approximation of the optimal solution, due to the preprocessing. However, generally speaking, we confront a trade-off between optimality and efficiency: since this method gives an impressive approximation of the optimal solution, it gains more in efficiency than it loses in solution quality.

**The code and testing set:**

After compiling the cpp file, you can run it like this:

```
./greedy_appro < 450-420.g
```

Here 450 means the number of vertices in the graph, and 420 the size of the minimum vertex cover. Our algorithm gives an approximation of size 427. If you want to see the 427-vertex cover, you can use:

```
./greedy_appro -c < 450-420.g
```

**Link:**