Talk:Knapsack problem/Bounded


Some technical details

Some solutions, including my Perl one, use the 0-1 problem solution, counting items with multiples as multiple distinct items. This is almost a good solution, but not quite: if some items have a huge count, the correct solution will hardly care, but the hacked 0-1 solution can effectively hang. Try setting the beer count in this example to, say, 1000, and see how well your code copes. --Ledrug 09:11, 22 June 2011 (UTC)

Hi, wp has the problem marked as taking wp:Pseudo-polynomial time. Does that mean that there is no better exact algo? --Paddy3118 16:34, 22 June 2011 (UTC)
Using the 0-1 solution is still pseudo-polynomial time, but not all pseudo-polynomial algorithms are equal. Suppose you have 1000 beers, each weighing 1, and your algorithm comes down to deciding on beers with 10 units of space to spare. The correct algorithm's decision tree branches 11 times, picking from 0 to 10 (capped by weight), and sees which gives the best value. The 0-1 algorithm in the same situation sees 1000 different items, each of which may be taken or not; because of the weight limit, it ends up trying to take any 10 of them, any 9 of them, and so on, that is, Binomial(1000, 10) + Binomial(1000, 9) + ... The decision tree branches wildly, even though beer is beer and most of the different choices are really the same.
The Go solution is correct, if you are interested. I haven't checked whether any other example uses the right method. --Ledrug 20:49, 22 June 2011 (UTC)
Thanks. --Paddy3118 03:20, 23 June 2011 (UTC)
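
For readers following along, here is a minimal sketch in Go of the branching described above (it is not the task's Go entry, and the item data is made up): one loop iteration per feasible count of the current item, so a huge count only widens a single loop instead of multiplying items.

<lang go>package main

import "fmt"

// item is a throwaway struct for this sketch; the task's solutions use their
// own representations.
type item struct {
	name   string
	weight int
	value  int
	count  int // maximum number of copies available
}

// best returns the highest value achievable from items[i:] within capacity.
// It branches once per feasible count of items[i], i.e. 0..min(count,
// capacity/weight). (Exhaustive and unmemoized; it only illustrates the
// branching, not an efficient solver.)
func best(items []item, i, capacity int) int {
	if i == len(items) {
		return 0
	}
	it := items[i]
	maxTake := it.count
	if byWeight := capacity / it.weight; byWeight < maxTake {
		maxTake = byWeight
	}
	bestVal := 0
	for n := 0; n <= maxTake; n++ {
		if v := n*it.value + best(items, i+1, capacity-n*it.weight); v > bestVal {
			bestVal = v
		}
	}
	return bestVal
}

func main() {
	// Made-up data: the huge beer count stays cheap here.
	items := []item{
		{"map", 9, 150, 1},
		{"sandwich", 50, 60, 2},
		{"beer", 52, 10, 1000},
	}
	fmt.Println(best(items, 0, 400))
}</lang>

With 1000 weight-1 beers and 10 units of space left, the inner loop runs 11 times, against the Binomial(1000, 10) + Binomial(1000, 9) + ... branches of the duplicated 0-1 tree.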

J approach (dynamic variant)

Here's how I implemented the dynamic programming variant, in case it helps someone else:

In the 0-1 version, inclusion is binary -- an item is included or it's not. For this version, however, an item may be included up to n times (where n varies from item to item). To represent this, I gave each item that may be included up to n times its own n item numbers. In other words: the map is item 1, the compass is item 2, and water is items 3 and 4 (1 water would be item 3 and 2 waters would be item 4).

When computing maximum values, I in essence performed a small inner loop where I would (in the case of water) compute the maximum value for both 1 water and 2 waters. This was a trivial extension of the Wikipedia algorithm (2 waters weigh twice what 1 water would weigh). And, basically, once the weight of a multiple-item choice exceeds the permissible weight, I just repeated the biggest previous value for fewer items.
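
A sketch in Go of one way to read that inner loop (a paraphrase, not the J code, and the numbers below are only illustrative): the row holds the best value for each capacity, and each item contributes from 0 up to its maximum count of copies, capped by weight; otherwise the best previous value is carried forward.

<lang go>package main

import "fmt"

// maxValues: after processing the first i items, dp[w] holds the best value
// achievable within capacity w.
func maxValues(weights, values, counts []int, capacity int) []int {
	prev := make([]int, capacity+1) // zero items: all values are 0
	for i := range weights {
		cur := make([]int, capacity+1)
		for w := 0; w <= capacity; w++ {
			best := prev[w] // take none of item i: repeat the previous best
			// small inner loop: try 1..counts[i] copies while they still fit
			for n := 1; n <= counts[i] && n*weights[i] <= w; n++ {
				if v := n*values[i] + prev[w-n*weights[i]]; v > best {
					best = v
				}
			}
			cur[w] = best
		}
		prev = cur
	}
	return prev
}

func main() {
	// map, compass, water, sandwich with illustrative numbers and the counts
	// used in this discussion: 1, 1, 2, 2.
	weights := []int{9, 13, 153, 50}
	values := []int{150, 35, 200, 60}
	counts := []int{1, 1, 2, 2}
	fmt.Println(maxValues(weights, values, counts, 400)[400])
}</lang>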

A related problem is keeping track of the "best choice so far". There might be some clever way of extracting it from the values cache, but I used a best choice cache. For item <math>i</math> which has maximum possible count <math>n_i</math> and current best count <math>c_i</math>, I represent the best combination so far as <math>\textstyle\sum_i c_i m_i</math> where <math>\textstyle m_i = \prod_{j<i} (n_j+1)</math> where <math>0 \le c_i \le n_i</math>. (This means that when I am considering various options I can ignore an option by multiplying it by 0 and then selecting the largest remaining value.)

In other words 0 means "no items", 1 means "1 map", 2 means "1 compass", 3 means "1 map and 1 compass", 4 means "1 water", 8 means "2 waters", 9 means "2 waters and 1 map", 12 means "1 sandwich", and 16 means "1 sandwich and 1 water". And, since I am taking tiny steps at each step of computation, I cache the multipliers <math>m_i</math> rather than recomputing them every time I need them. (You might think of this as a generalization of base n arithmetic where each digit position can be in an independent base (except that these "digits" are probably best thought of as being arranged from left-to-right instead of the usual right-to-left).) --Rdm 21:20-ish, 27 June 2011 (UTC)
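
A sketch in Go of one reading of that encoding (again a paraphrase, not the J implementation): each item's count is a digit whose base is its maximum count plus one, and its place value is the product of the earlier bases.

<lang go>package main

import "fmt"

// placeValues returns the multiplier for each item's digit: the product of
// (maximum count + 1) over all earlier items.
func placeValues(maxCounts []int) []int {
	m := make([]int, len(maxCounts))
	p := 1
	for i, n := range maxCounts {
		m[i] = p
		p *= n + 1
	}
	return m
}

// encode packs per-item counts into a single integer.
func encode(counts, maxCounts []int) int {
	code := 0
	for i, m := range placeValues(maxCounts) {
		code += counts[i] * m
	}
	return code
}

// decode unpacks the integer back into per-item counts.
func decode(code int, maxCounts []int) []int {
	counts := make([]int, len(maxCounts))
	for i, n := range maxCounts {
		counts[i] = code % (n + 1)
		code /= n + 1
	}
	return counts
}

func main() {
	// map, compass, water, sandwich with maximum counts 1, 1, 2, 2.
	maxCounts := []int{1, 1, 2, 2}
	fmt.Println(encode([]int{1, 0, 2, 0}, maxCounts)) // 9: "2 waters and 1 map"
	fmt.Println(decode(16, maxCounts))                // [0 0 1 1]: "1 sandwich and 1 water"
}</lang>

The two calls in main reproduce the worked examples above: the counts (1 map, 0 compasses, 2 waters, 0 sandwiches) encode to 9, and 16 decodes to 1 water and 1 sandwich.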

Some thoughts about caching the best choice: if you don't, and just calculate the best value, and afterwards you want the list of chosen items, you'd have to solve <math>\textstyle\sum_i c_i v_i = V</math> with the constraint <math>\textstyle\sum_i c_i w_i \le W,\ 0 \le c_i \le n_i</math> -- that's a reduced bounded knapsack problem in itself, a somewhat amusing situation. --Ledrug 21:22, 27 June 2011 (UTC)

OOCalc?

Currently the OOCalc implementation claims that it is trying to minimize the total value of items contained in the pack. Shouldn't it be trying to maximize that instead? --Rdm 15:54, 28 June 2011 (UTC)

Yep, the image was correct, I just wrote the wrong thing in the accompanying text. Fixed now, thanks. --Paddy3118 18:28, 28 June 2011 (UTC)

Possible optimisation

In the Phix solution, I noticed the order of goodies had an impact on search time, ranging from 12s (highest value/fewest items first) to 28s (lowest value first), which got me thinking. I also noticed that sometimes the optimal solution was found very early, and came up with a little theorem.

If we sort by weighted value and, on the last item (given that we have been greedy), we can take the full complement, I believe that means we must be on, or have already passed through, the optimal solution.

Since I am not entirely convinced that hypothesis is right (it does seem rather unlikely that no-one has ever thought of it before), I have left the Phix solution so that it still completes the full scan, and crashes if a dataset that disproves the hypothesis is used. One thing I have tried is increasing the available beer: at 8 and above it never displays "terminate" (beer is the last item and 8 of them will not fit) and must perform a full search, which I view as a good sign, even though it means there is no gain in such circumstances. It is also interesting to note that we can terminate even if the selection being inspected has not improved matters. Pete Lomax (talk) 13:58, 15 March 2017 (UTC)
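
For illustration only, a sketch in Go (not the Phix code) of one possible reading of that termination test, with made-up goodies and "weighted value" read as value per unit of weight; whether the early exit is actually safe is exactly the open question above.

<lang go>package main

import (
	"fmt"
	"sort"
)

// good is a throwaway struct for this sketch.
type good struct {
	name   string
	weight int
	value  int
	count  int
}

// search tries counts greedily (largest first) over goods[i:]. When it reaches
// the last item and that item's full complement fits, it evaluates the current
// selection and then sets done, after which the whole search unwinds.
func search(goods []good, i, capacity, value int, bestValue *int, done *bool) {
	if i == len(goods) {
		if value > *bestValue {
			*bestValue = value
		}
		return
	}
	g := goods[i]
	maxTake := g.count
	if byWeight := capacity / g.weight; byWeight < maxTake {
		maxTake = byWeight
	}
	last := i == len(goods)-1
	for n := maxTake; n >= 0 && !*done; n-- {
		search(goods, i+1, capacity-n*g.weight, value+n*g.value, bestValue, done)
		if last && maxTake == g.count {
			*done = true // the hypothesis: the optimum is this selection or an earlier one
		}
	}
}

func main() {
	// Made-up goodies, not the task's list.
	goods := []good{
		{"map", 9, 150, 1}, {"water", 153, 200, 2},
		{"sandwich", 50, 60, 2}, {"beer", 52, 10, 3},
	}
	// Sort by value per unit of weight, highest first (compared cross-multiplied).
	sort.Slice(goods, func(a, b int) bool {
		return goods[a].value*goods[b].weight > goods[b].value*goods[a].weight
	})
	bestValue, done := 0, false
	search(goods, 0, 400, 0, &bestValue, &done)
	fmt.Println(bestValue, done)
}</lang>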

Return to "Knapsack problem/Bounded" page.