Consider the following algorithm for finding a minimum edge coloring of a graph G = (V, E):
    let k = 0
    while E has edges:
        k = k + 1
        let M be a maximum matching in G = (V, E)
        for every e in M, mark e with color k
        E = E - M
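To make the procedure concrete, here is a minimal sketch in Python. For simplicity it finds a maximum matching by brute force over edge subsets, which is only viable for tiny graphs; a real implementation would use a polynomial-time matching algorithm (e.g. Edmonds' blossom algorithm). The function names and the demo graph are my own, not part of the question.

```python
from itertools import combinations

def is_matching(edges):
    """True if no two edges in `edges` share an endpoint."""
    seen = set()
    for u, v in edges:
        if u in seen or v in seen:
            return False
        seen.update((u, v))
    return True

def maximum_matching(edges):
    """Brute-force maximum matching: try subsets from largest to smallest."""
    edges = list(edges)
    for k in range(len(edges), 0, -1):
        for subset in combinations(edges, k):
            if is_matching(subset):
                return set(subset)
    return set()

def greedy_edge_coloring(edges):
    """Repeatedly peel off a maximum matching, giving it a fresh color."""
    E = set(edges)
    k = 0
    coloring = {}
    while E:
        k += 1
        M = maximum_matching(sorted(E))
        for e in M:
            coloring[e] = k
        E -= M
    return k, coloring

# Demo on a triangle: its chromatic index is 3, and the greedy
# procedure also uses 3 colors here (each matching is a single edge).
k, coloring = greedy_edge_coloring([(0, 1), (1, 2), (0, 2)])
print(k)  # 3
```

On small examples like the triangle or K4 the greedy count happens to match the chromatic index, which is exactly why a counterexample is not obvious.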
Could you find a proof (ideally a counterexample) that for some graph G this algorithm does not find G's chromatic index? Such a graph has to exist (assuming P ≠ NP): computing the chromatic index is NP-hard, yet a maximum matching can be found in polynomial time, so if this greedy procedure were always optimal, edge coloring would be in P.