I have a very sparse undirected graph. I need to decompose it into n sets of edges. Within each set, all edges are independent (no two edges share any vertices), and every edge in the graph is contained in exactly one of these sets. The goal is to minimize n.

Note that this problem is somewhat different from the classic largest independent edge set (maximum matching) problem in that EVERY edge must go into SOME set. We just want as few of these sets as possible.
I have a simple algorithm that provides decent results. In Python-like pseudocode:
```
start = ...  # all of the edges are initially in the graph "start"
sets = []    # the result of the algorithm: n independent sets, together containing all the edges

while size(start) > 0:
    new_set = {}
    for edge in start:
        if edge is independent of every edge in new_set:
            start.remove(edge)
            new_set.add(edge)
    sets.append(new_set)
```
I can easily demonstrate by counterexample that it is non-optimal. Consider the graph (note the edges' numbering):
```
#----#          #----#          #----#          #----#          #----#
| V1 |----E1----| V2 |----E2----| V3 |----E4----| V4 |----E3----| V5 |
#----#          #----#          #----#          #----#          #----#
```
The optimal solution is the set {{E1,E4},{E2,E3}}, but because the algorithm considers E3 before E4, it will output {{E1,E3},{E2},{E4}}. On (my) real data, it takes about 11% more sets than the optimum.
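To make the counterexample reproducible, here is a runnable Python sketch of the greedy above (the function name and the encoding of edges as vertex pairs are my own; edges are processed in their numbering order, as in the diagram):

```python
def greedy_matching_decomposition(edges):
    """Greedily partition edges into independent sets (matchings).

    edges: list of (u, v) pairs, scanned repeatedly in the given order.
    Returns a list of lists; within each inner list no two edges share a vertex.
    """
    remaining = list(edges)
    sets = []
    while remaining:
        new_set = []
        used = set()        # vertices covered by edges already in new_set
        still_left = []
        for u, v in remaining:
            if u not in used and v not in used:
                new_set.append((u, v))   # edge is independent of new_set
                used.update((u, v))
            else:
                still_left.append((u, v))
        remaining = still_left
        sets.append(new_set)
    return sets

# The path above, listed in the edges' numbering order:
# E1=(V1,V2), E2=(V2,V3), E3=(V4,V5), E4=(V3,V4)
edges = [(1, 2), (2, 3), (4, 5), (3, 4)]
print(greedy_matching_decomposition(edges))
# → [[(1, 2), (4, 5)], [(2, 3)], [(3, 4)]]  -- three sets instead of the optimal two
```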
Does a better algorithm exist to solve this problem?
Speed is of some concern, but since this runs as an amortized operation it's not terribly important. Right now, I'm mostly interested in whether an optimal algorithm actually exists/is known.