There is a set of bit sequences of fixed length (L == 1024). The number of sequences is about 43,000.

Among them, I need to find a combination of N sequences (N == 10) such that the number of set bits in their bitwise OR (logical disjunction) is maximized.

For example (L = 4, N = 2), given the set:

`0100`

`0010`

`0110`

`1000`

`1100`

The answer is:

`0110`

`1100`

---- OR them to check which bits are covered

`1110`

The task is complicated by the sheer volume of data: the number of possible combinations, C(43000, 10), is astronomical.

Is it possible to solve this problem in reasonable time, or should I leave it to chance and let a randomized algorithm run for a couple of days?

**upd**

Either I'm missing something, or I don't understand the proposed solution. Simply picking the sequences with the most set bits does not work. To illustrate:

L = 4, N = 2, for the set

`1110` — 3

`1100` — 2

`0110` — 2

`0001` — 1

if you count the set bits per sequence, sort, and take the top N, you get

`1110`

`1100`

but what is actually needed is

`1110`

`0001`
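This counterexample is easy to check in code (a sketch with my own helper names):

```python
seqs = [0b1110, 0b1100, 0b0110, 0b0001]

def popcount(x):
    """Number of set bits in x."""
    return bin(x).count("1")

def coverage(combo):
    """Set bits in the bitwise OR of the chosen sequences."""
    acc = 0
    for s in combo:
        acc |= s
    return popcount(acc)

# Take the N = 2 sequences with the most set bits individually...
top = sorted(seqs, key=popcount, reverse=True)[:2]  # 1110, 1100
print(coverage(top))               # 3 -- only three bits covered

# ...while the optimal pair covers all four bits.
print(coverage([0b1110, 0b0001]))  # 4
```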

**upd2**

I decided to try a greedy search for the "best" sequences:

L = 5, N = 3, for the set

`11100` — 3

`11000` — 2

`01110` — 2

`00010` — 1

`00001` — 1

(here m is the mask of bits not yet covered)

0) m = `11111`

1) take the best sequence for m (`11111`):

result = [`11100`]

now m = `00011`

2) take the best sequence for m (`00011`):

result = [`11100`, `01110`]

now m = `00001`

3) take the best sequence for m (`00001`):

result = [`11100`, `01110`, `00001`]

now m = `00000`, finished

the answer is [`11100`, `01110`, `00001`]
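The steps above can be sketched as a greedy pass (my own sketch, assuming `m` is the mask of still-uncovered bits; this is the classic greedy heuristic for maximum coverage):

```python
def greedy_cover(seqs, n, length):
    """At each step, pick the sequence covering the most still-uncovered bits."""
    m = (1 << length) - 1  # mask of bits not yet covered
    result = []
    pool = list(seqs)
    for _ in range(n):
        if m == 0 or not pool:
            break          # everything covered, or nothing left to pick
        best = max(pool, key=lambda s: bin(s & m).count("1"))
        result.append(best)
        pool.remove(best)
        m &= ~best         # drop the newly covered bits from the mask
    return result

picked = greedy_cover([0b11100, 0b11000, 0b01110, 0b00010, 0b00001], 3, 5)
print([format(s, "05b") for s in picked])  # ['11100', '01110', '00001']
```

At the real scale this is only N passes over the data — 10 × 43,000 masked popcounts of 1024-bit integers — which is entirely feasible.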

but this is not the only solution (at step 2, `01110`, `00010`, and `00001` all fit equally well)

If the amount of source data is large, there will be many such "best" candidates at each step, and of course there is no guarantee that any combination exists that covers all the bits (which is what I need). In practice I get many different results (choosing at random among the "best" options at each step); from these "best" runs I then pick the one that suits me most (sorting by the number of bits covered).
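That randomized variant (break ties among equally good candidates at random, repeat the greedy pass several times, keep the run that covers the most bits) can be sketched like this — my own names, not the author's code:

```python
import random
from functools import reduce

def popcount(x):
    return bin(x).count("1")

def coverage(combo):
    """Set bits in the OR of the chosen sequences."""
    return popcount(reduce(lambda a, b: a | b, combo, 0))

def greedy_random(seqs, n, length, rng):
    """One greedy pass with random tie-breaking among the 'best' candidates."""
    m = (1 << length) - 1  # mask of bits not yet covered
    result, pool = [], list(seqs)
    for _ in range(n):
        if m == 0 or not pool:
            break
        gain = max(popcount(s & m) for s in pool)
        pick = rng.choice([s for s in pool if popcount(s & m) == gain])
        result.append(pick)
        pool.remove(pick)
        m &= ~pick
    return result

rng = random.Random(0)
seqs = [0b11100, 0b11000, 0b01110, 0b00010, 0b00001]
best = max((greedy_random(seqs, 3, 5, rng) for _ in range(50)), key=coverage)
print(coverage(best))  # 5 -- all bits covered on this toy set
```

Each restart is cheap (N scans of the data), so running a few hundred of them is still far faster than a couple of days of blind random search.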