### Largest Sum of Unique Column & Row Nodes from a Matrix

Posted: **Thu Sep 08, 2011 8:03 pm**

I've had this problem for a while now. It's easy to solve programmatically with brute force, but solving an arbitrarily large instance (or an arbitrarily large number of smaller ones) takes a lot of resources. I have some alternative solutions and some theories, but I haven't yet figured out the optimal approach. This seemed like a good forum to throw the problem at some people who might have good ideas.

**Basic Problem:** For a matrix with n rows and n columns containing any arrangement of integer values, select the set of nodes with the largest possible sum, using exactly one node from each column and one node from each row, without using any single node twice.

**Further Clarification:** I've had trouble expressing this clearly, though I think that sentence is a reasonable summary. For example, suppose you have a matrix with four rows and four columns:

```
12 14 16 18
10 08 05 11
06 05 07 17
20 04 09 17
```

There are four columns and four rows; we can call them c1, c2, c3, c4 and r1, r2, r3, r4. There are sixteen nodes, in this case ranging from 5 to 20. You can identify a node by its column and row; for example, the 20 in this matrix is (c1, r4). You cannot use any individual node more than once, but duplication of a value doesn't matter (so both 17s above can be used). We want the selection with the largest sum, taking one node from each column and each row.
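To make the example concrete, here is a minimal brute-force sketch in Python (my own illustration, not from the post): it tries every assignment of rows to columns, which is exactly the n! approach that gets expensive as n grows.

```python
from itertools import permutations

def best_sum_brute_force(matrix):
    """Try every one-node-per-row-and-column selection (n! of them)
    and return the largest achievable sum."""
    n = len(matrix)
    best = float("-inf")
    for cols in permutations(range(n)):
        # cols[r] is the column chosen for row r
        total = sum(matrix[r][cols[r]] for r in range(n))
        best = max(best, total)
    return best

example = [
    [12, 14, 16, 18],
    [10,  8,  5, 11],
    [ 6,  5,  7, 17],
    [20,  4,  9, 17],
]
print(best_sum_brute_force(example))  # 61
```

For the example matrix the optimum is 61, from (c3, r1) + (c2, r2) + (c4, r3) + (c1, r4) = 16 + 8 + 17 + 20.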

**Additional Notes:** Here are a few little things that might be worth thinking about. They're behind a spoiler, so if you don't like hints of any kind you can skip them.

**Some Proposed Solutions:** I haven't yet checked these, but here are some possible solutions I've seen offered.
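For what it's worth, this looks like the classic assignment problem (maximum-weight perfect matching in a bipartite graph of rows and columns). One exact approach that beats raw brute force, sketched below as my own illustration rather than one of the proposed solutions above, is a dynamic program over bitmasks of used columns: O(n² · 2ⁿ) instead of O(n · n!).

```python
def best_sum_dp(matrix):
    """Maximum one-per-row-and-column sum via bitmask DP.

    dp[mask] = best sum after assigning the first popcount(mask)
    rows, where the set bits of mask mark the columns already taken.
    """
    n = len(matrix)
    NEG = float("-inf")
    dp = [NEG] * (1 << n)
    dp[0] = 0
    for mask in range(1 << n):
        if dp[mask] == NEG:
            continue
        row = bin(mask).count("1")  # next unassigned row
        if row == n:
            continue
        for col in range(n):
            if not mask & (1 << col):
                cand = dp[mask] + matrix[row][col]
                if cand > dp[mask | (1 << col)]:
                    dp[mask | (1 << col)] = cand
    return dp[(1 << n) - 1]

example = [
    [12, 14, 16, 18],
    [10,  8,  5, 11],
    [ 6,  5,  7, 17],
    [20,  4,  9, 17],
]
print(best_sum_dp(example))  # 61, same as brute force
```

This is still exponential; for truly large matrices the Hungarian algorithm solves the same problem in O(n³) time (SciPy exposes it as `scipy.optimize.linear_sum_assignment` with `maximize=True`), but the DP is easy to verify against brute force on small cases.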