Any way to speed up itertools.product?

Many improvements are possible.

For starters, the search space can be reduced using itertools.combinations_with_replacement() because summation is commutative.
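To make that concrete, here is a minimal sketch of the idea, using the numbers from the example further down in this answer (weights 10 to 50 in steps of 10, five assets, required total 100); the variable names are just illustrative:

from itertools import combinations_with_replacement

# Sketch only: numbers taken from the worked example later in this answer.
weights = range(10, 51, 10)

# Each multiset of weights is generated once, instead of once per ordering.
valid = [c for c in combinations_with_replacement(weights, 5) if sum(c) == 100]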

Also, the last addend should be computed rather than tested. For example, if t[:4] was (10, 20, 30, 35), you could compute t[4] as 100 - sum(t[:4]), giving a value of 5. This gives a 100-fold speed-up over trying one hundred values of x in (10, 20, 30, 35, x).
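Combining both ideas, one possible sketch (again using the numbers from the example below; head, last and results are illustrative names) enumerates only the first four addends and computes the fifth from the target. Requiring the computed addend to be at least as large as the previous one keeps the tuples non-decreasing, so each combination still appears only once:

from itertools import combinations_with_replacement

# Sketch only: numbers taken from the worked example later in this answer.
weights = range(10, 51, 10)
target = 100

results = []
for head in combinations_with_replacement(weights, 4):
    # The last addend is forced by the target, so compute it instead of searching.
    last = target - sum(head)
    if last in weights and last >= head[-1]:
        results.append(head + (last,))

This produces one tuple per combination, matching the six unordered results shown later in this answer.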


You can write a recursive algorithm for this that prunes all the impossible options early on:

def make_weight_combs(min_wt, max_wt, step, nb_assets, req_wt):
    weights = range(min_wt, max_wt + 1, step)
    current = []
    yield from _make_weight_combs_rec(weights, nb_assets, req_wt, current)

def _make_weight_combs_rec(weights, nb_assets, req_wt, current):
    if nb_assets <= 0:
        yield tuple(current)
    else:
        # Discard weights that cannot possibly be used:
        # the smallest weight is useless if even the largest remaining
        # weights cannot bring the total up to req_wt...
        while weights and weights[0] + weights[-1] * (nb_assets - 1) < req_wt:
            weights = weights[1:]
        # ...and the largest weight is useless if even the smallest
        # remaining weights already push the total past req_wt
        while weights and weights[-1] + weights[0] * (nb_assets - 1) > req_wt:
            weights = weights[:-1]
        # Add all possible weights
        for w in weights:
            current.append(w)
            yield from _make_weight_combs_rec(weights, nb_assets - 1, req_wt - w, current)
            current.pop()

min_wt = 10
max_wt = 50
step = 10
nb_assets = 5
req_wt = 100
for comb in make_weight_combs(min_wt, max_wt, step, nb_assets, req_wt):
    print(comb, sum(comb))

Output:

(10, 10, 10, 20, 50) 100
(10, 10, 10, 30, 40) 100
(10, 10, 10, 40, 30) 100
(10, 10, 10, 50, 20) 100
(10, 10, 20, 10, 50) 100
(10, 10, 20, 20, 40) 100
(10, 10, 20, 30, 30) 100
(10, 10, 20, 40, 20) 100
...

If the order of the weights does not matter (so that, for example, (10, 10, 10, 20, 50) and (50, 20, 10, 10, 10) count as the same combination), you can modify the for loop as follows:

for i, w in enumerate(weights):
    current.append(w)
    yield from _make_weight_combs_rec(weights[i:], nb_assets - 1, req_wt - w, current)
    current.pop()

Which gives the output:

(10, 10, 10, 20, 50) 100
(10, 10, 10, 30, 40) 100
(10, 10, 20, 20, 40) 100
(10, 10, 20, 30, 30) 100
(10, 20, 20, 20, 30) 100
(20, 20, 20, 20, 20) 100

Let's generalise this problem; you want to iterate over k-tuples whose sum is n, and whose elements are within range(min_w, max_w+1, w_step). This is a kind of integer partitioning problem, with some extra constraints on the size of the partition and the sizes of its components.

To do this, we can write a recursive generator function; for each w in the range, the remainder of the tuple is a (k - 1)-tuple whose sum is (n - w). The base case is a 0-tuple, which is possible only if the required sum is 0.

As Raymond Hettinger notes, you can also improve the efficiency when k = 1 by just testing whether the required sum is one of the allowed weights.

def constrained_partitions(n, k, min_w, max_w, w_step=1):
    if k < 0:
        raise ValueError('Number of parts must be at least 0')
    elif k == 0:
        if n == 0:
            yield ()
    elif k == 1:
        if n in range(min_w, max_w+1, w_step):
            yield (n,)
    elif min_w*k <= n <= max_w*k:
        for w in range(min_w, max_w+1, w_step):
            for p in constrained_partitions(n-w, k-1, min_w, max_w, w_step):
                yield (w,) + p

Usage:

>>> for p in constrained_partitions(5, 3, 1, 5, 1):
...     print(p)
...
(1, 1, 3)
(1, 2, 2)
(1, 3, 1)
(2, 1, 2)
(2, 2, 1)
(3, 1, 1)
>>> len(list(constrained_partitions(100, 5, 10, 50, 10)))
121

Whenever you're iterating over all solutions to some sort of combinatorial problem, it's generally best to generate actual solutions directly, rather than generate more than you need (e.g. with product or combinations_with_replacement) and reject the ones you don't want. For larger inputs, the vast majority of time would be spent generating solutions which will get rejected, due to combinatorial explosion.
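To put rough numbers on that with the example from this answer: the brute-force product enumerates 5**5 = 3125 candidate tuples, while the generator above produces only the 121 valid ones, and the gap grows quickly as the ranges get larger. A quick check (brute and direct are illustrative names, and constrained_partitions is the function defined above):

from itertools import product

# Count candidates tested by brute force vs. tuples generated directly.
brute = sum(1 for t in product(range(10, 51, 10), repeat=5) if sum(t) == 100)
direct = sum(1 for _ in constrained_partitions(100, 5, 10, 50, 10))
assert brute == direct == 121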

Note that if you don't want the same weights in different orders (e.g. both 1, 1, 3 and 1, 3, 1), you can change the recursive call to constrained_partitions(n-w, k-1, min_w, w, w_step), i.e. pass w as the new maximum, so that only partitions with the weights in non-increasing order are generated, as sketched below.
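For reference, here is a sketch of that variant under a hypothetical name, constrained_partitions_nonincreasing, so it doesn't clash with the ordered version above; the only change is the max_w argument passed to the recursive call:

def constrained_partitions_nonincreasing(n, k, min_w, max_w, w_step=1):
    # Hypothetical name for the variant described above; only the recursive call changes.
    if k < 0:
        raise ValueError('Number of parts must be at least 0')
    elif k == 0:
        if n == 0:
            yield ()
    elif k == 1:
        if n in range(min_w, max_w+1, w_step):
            yield (n,)
    elif min_w*k <= n <= max_w*k:
        for w in range(min_w, max_w+1, w_step):
            # Cap the maximum at w so later parts never exceed earlier ones.
            for p in constrained_partitions_nonincreasing(n-w, k-1, min_w, w, w_step):
                yield (w,) + p

Usage:

>>> for p in constrained_partitions_nonincreasing(5, 3, 1, 5, 1):
...     print(p)
...
(2, 2, 1)
(3, 1, 1)
>>> len(list(constrained_partitions_nonincreasing(100, 5, 10, 50, 10)))
6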