Largest Sum of Unique Column & Row Nodes from a Matrix

Creamsteak
Posts: 1
Joined: Thu Sep 08, 2011 6:51 pm

Largest Sum of Unique Column & Row Nodes from a Matrix

Post by Creamsteak » Thu Sep 08, 2011 8:03 pm

I've had this problem for a while now. It's easy to solve programmatically with brute force, but solving an arbitrarily large instance (or an arbitrarily large number of smaller instances) takes a lot of resources. I have some alternative solutions and some theories, but I haven't yet figured out the optimal solution. This seemed like a pretty good forum to throw this at some people who might have good ideas.

Basic Problem: For a matrix with n rows and n columns containing any arrangement of integer values, select the set of nodes with the largest possible sum, using one node from each column and one node from each row, without using any single node twice.

Further Clarification: I've had trouble expressing this clearly, though I think the sentence above is a pretty good summary. For example, suppose you have a matrix with four rows and four columns:
12 14 16 18
10 08 05 11
06 05 07 17
20 04 09 17
There are four columns and four rows. We can call them c1,c2,c3,c4,r1,r2,r3,r4. There are sixteen nodes, in this case ranging from 5 to 20. You can identify a node by its column and row; for example, the 20 in this matrix is (c1,r4). You cannot use any individual node more than once, but duplication of a value doesn't matter (so both 17s above can be used). We want the selection with the largest sum, using one node from each column and one node from each row.
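To make that concrete, here's a minimal brute-force sketch in Python of how I read the rules (the function name is just a placeholder): pick one "column node" per column and one "row node" per row, with all 2n cells distinct, and take the best total. It checks every combination of choices, so it's only practical for small n.

from itertools import product

def brute_force(matrix):
    n = len(matrix)
    best = None
    # For each column, choose the row of its "column node" (n^n choices)...
    for col_rows in product(range(n), repeat=n):
        col_cells = {(col_rows[c], c) for c in range(n)}
        # ...and for each row, choose the column of its "row node" (n^n choices).
        for row_cols in product(range(n), repeat=n):
            row_cells = {(r, row_cols[r]) for r in range(n)}
            # No node may be used twice, so the 2n cells must all be distinct.
            if col_cells & row_cells:
                continue
            total = sum(matrix[r][c] for r, c in col_cells | row_cells)
            if best is None or total > best:
                best = total
    return best

example = [[12, 14, 16, 18],
           [10,  8,  5, 11],
           [ 6,  5,  7, 17],
           [20,  4,  9, 17]]
print(brute_force(example))

On the 4x4 example above this should report 125, which keeps everything except the eight smallest values.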

Additional Notes: Here are a few little things that might be worth thinking about. If you don't like spoilers of any kind, you can skip them.
There is no solution for n < 2 (for n = 1 you would need two distinct nodes, one for the column and one for the row, but only one node exists).

The solution for n=2 is equal to the sum of all four values.

The solution for n=3 is equal to the sum of the six largest values, unless the three lowest values are in a single row or column (you can't skip all three nodes of one row or column, since nothing would be left to pick there), in which case the solution is the sum of the five largest values and the seventh largest value. A quick cross-check of this rule is sketched at the end of these notes.

The brute force method involves testing (2n)! cases. For small matrices, this is fairly fast, but it grows quickly in complexity.

I originally came up with this problem while solving a 3x3 matrix of values from 3 to 18, based on the "grid" method of ability score generation for Dungeons and Dragons, which you can see at invisiblecastle.com. I thought it was interesting that my solution, while faster than brute force, was not viable for any larger matrix.
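To sanity-check the n=3 observation above, here's a rough cross-check against the brute_force sketch from earlier in this post, on random matrices with distinct values (ties would make "the three lowest values" ambiguous, so I haven't tried to handle them; the function name is just a placeholder).

import random

def n3_rule(matrix):
    # The n=3 observation: sum of the six largest values, unless the three
    # lowest all share a row or column, in which case skip the sixth largest
    # (together with the eighth and ninth) instead of the seventh.
    cells = sorted(((matrix[r][c], r, c) for r in range(3) for c in range(3)),
                   reverse=True)
    values = [v for v, _, _ in cells]
    lowest = cells[6:]
    if len({r for _, r, _ in lowest}) == 1 or len({c for _, _, c in lowest}) == 1:
        return sum(values[:5]) + values[6]
    return sum(values[:6])

# Compare against brute force on random 3x3 matrices with distinct values.
for _ in range(1000):
    v = random.sample(range(1, 100), 9)
    m = [v[0:3], v[3:6], v[6:9]]
    assert n3_rule(m) == brute_force(m)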
Some Proposed Solutions: I haven't checked these yet, but here are some possible solutions that have been suggested.
Use matrix rotation/manipulation to sort the rows and columns so as to eliminate most of the cases that must be checked. (I have no idea how this would work; someone simply thought it should be possible, and that you could reduce the total number of cases to ((2n)!)/2n, I believe.)

Select the highest value from each row and column. Where a collision occurs (a selected node overlaps), determine the highest combination of values you can pick from the remaining available nodes for the colliding column and row. Continue recursively. (I suspect this will give a faster solution the majority of the time, though I would like to determine the worst-case complexity when you have to recurse the maximum number of times. I'm not positive here.)
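Here's a partial sketch of just the first step of that second idea, assuming a "collision" means a cell that gets picked as both a column maximum and a row maximum (again, the function name is just a placeholder); the recursive resolution itself is the part I haven't worked out.

def greedy_start(matrix):
    n = len(matrix)
    # Pick the largest node in each column...
    col_picks = set()
    for c in range(n):
        r = max(range(n), key=lambda i: matrix[i][c])
        col_picks.add((r, c))
    # ...and the largest node in each row.
    row_picks = set()
    for r in range(n):
        c = max(range(n), key=lambda j: matrix[r][j])
        row_picks.add((r, c))
    # Cells selected as both a column pick and a row pick are the collisions
    # that the recursive step would still have to resolve.
    return col_picks, row_picks, col_picks & row_picks

On the 4x4 example above, the picks collide at the 18 (c4,r1) and the 20 (c1,r4), so those are the spots where the recursion would have to kick in.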

rayfil
Administrator
Posts: 1400
Joined: Sun Mar 26, 2006 4:30 am
Location: Ontario, Canada

Re: Largest Sum of Unique Column & Row Nodes from a Matrix

Post by rayfil » Fri Sep 09, 2011 2:25 am

This is the main subject of a recent PE problem and should not be discussed here. This thread has thus been locked.
When you assume something, you risk being wrong half the time.
