Randomly picking $k$ members of $\{1,\ldots,n\}$

If $X$ is a random variable with non-negative integer values, the expectation of $X$ equals $$\mathbb{E}(X)=\sum_{m=1}^\infty {\rm Prob}\,(X\geqslant m).$$ In our situation the event $X\geqslant m$ means that after $m-1$ days some element has still not been taken. By inclusion-exclusion, its probability equals $$\sum_{i=1}^n (-1)^{i-1}\binom{n}i\left(\frac{\binom{n-i}k}{\binom{n}k}\right)^{m-1}$$ (the $i$-th summand corresponds to the choice of $i$ uncovered elements, times the probability that none of them is covered during $m-1$ days). Summing over $m$, each term is a geometric series, and we get $$ \sum_{i=1}^n (-1)^{i-1}\frac{\binom{n}i}{1-\frac{\binom{n-i}k}{\binom{n}k}}. $$ Is such an answer OK for you?
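The closed form is easy to check numerically; the sketch below (my addition, not part of the original answer) compares it with a Monte Carlo simulation of the process.

```python
# Sanity check of the inclusion-exclusion formula
#   E(S) = sum_{i=1}^n (-1)^{i-1} C(n,i) / (1 - C(n-i,k)/C(n,k))
# against a direct simulation of picking a random k-subset each day.
import random
from math import comb

def expected_days_exact(n, k):
    """Expected number of days until all of {1,...,n} is covered."""
    total = 0.0
    for i in range(1, n + 1):
        # Probability that a fixed set of i elements is missed on one day.
        q = comb(n - i, k) / comb(n, k)
        total += (-1) ** (i - 1) * comb(n, i) / (1 - q)
    return total

def expected_days_mc(n, k, trials=20000, seed=1):
    """Monte Carlo estimate of the same expectation."""
    rng = random.Random(seed)
    s = 0
    for _ in range(trials):
        covered, days = set(), 0
        while len(covered) < n:
            covered.update(rng.sample(range(n), k))
            days += 1
        s += days
    return s / trials

print(expected_days_exact(10, 3), expected_days_mc(10, 3))
```

For $k=1$ the exact value reduces to $n(1+1/2+\dots+1/n)$, the classical coupon-collector expectation, which gives an independent check.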

For small $k$ and large $n$ the following further transformation of the above sum may be helpful. Choose a polynomial $h(x)$ of degree at most $k-1$ for which the polynomial $1-h(x)\cdot (1-x)(1-x/2)\dots(1-x/n)$ is divisible by the polynomial $1-\binom{n-x}k/\binom{n}k$. The ratio, which we denote by $f(x)$, is a polynomial of degree at most $n-1$, so its $n$-th finite difference vanishes: $\sum_{i=0}^n (-1)^{i-1}\binom{n}i f(i)=0$. But for $i=1,2,\dots,n$ the product $(1-i)(1-i/2)\dots(1-i/n)$ vanishes, hence $f(i)=(1-\binom{n-i}k/\binom{n}k)^{-1}$. Thus our sum equals $f(0)$. For example, if $k=1$, we have $h(x)=1$ and $f(x)=nx^{-1}(1-(1-x)(1-x/2)\dots(1-x/n))$, so $f(0)=n(1+1/2+\dots+1/n)$ (this is the classical Coupon Collector's Problem). For $k=2$ we get $h(x)=1+ax$ with an exponentially small (for large $n$) value of $a$, and $f(0)=(1+\dots+\frac1n-a)/(\frac1n+\frac1{n-1})$.
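The $k=1$ reduction can be verified in exact rational arithmetic (my addition): the alternating sum really does collapse to $f(0)=n(1+1/2+\dots+1/n)$.

```python
# Verify with exact rational arithmetic that for k = 1 the alternating sum
#   sum_{i=1}^n (-1)^{i-1} C(n,i) / (1 - C(n-i,k)/C(n,k))
# equals f(0) = n(1 + 1/2 + ... + 1/n), the coupon-collector expectation.
from fractions import Fraction
from math import comb

def alternating_sum(n, k):
    total = Fraction(0)
    for i in range(1, n + 1):
        q = Fraction(comb(n - i, k), comb(n, k))
        total += (-1) ** (i - 1) * comb(n, i) / (1 - q)
    return total

def n_harmonic(n):
    return n * sum(Fraction(1, j) for j in range(1, n + 1))

for n in (5, 12, 30):
    assert alternating_sum(n, 1) == n_harmonic(n)
print("f(0) = n(1 + 1/2 + ... + 1/n) confirmed for n = 5, 12, 30")
```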


I think you can already get a good estimate just by comparing with the classical coupon collector's problem (CCP).

Let us consider the CCP and pick the numbers one by one. We define the following stopping times (with $T_0=0$):

$T_i$ = the first $t\geq T_{i-1}$ such that $k$ different numbers have been picked between $T_{i-1}$ and $t$.

Then the set of numbers picked by time $T_i$ has the same law as the set picked in your process after $i$ days. Moreover, the random variables $X_i=T_i-T_{i-1}$ are i.i.d. (with common law $X$), and, writing $i(t)=\max\{i: T_i\leq t\}$, we have $$ t=\sum_{i\colon T_i\leq t} X_i+(t-T_{i(t)}).$$
If we call $T$ the time at which the classical coupon collector has picked all its numbers, we have $$T=\sum_{i\colon T_i\leq T} X_i+(T-T_{i(T)}).$$ In your process, $S$ is the first $i$ such that $T_i\geq T$; thus $T_{S-1}<T\leq T_S$ and $$\mathbb{E}(T)=\mathbb{E}\Big(\sum_{i<S} X_i\Big)+\mathbb{E}(T-T_{S-1}).$$ Since the $X_i$ are i.i.d. and $S$ is a stopping time, a Wald-type identity gives $$\mathbb{E}(T)=\mathbb{E}(S-1)\,\mathbb{E}(X)+\mathbb{E}(T-T_{S-1}).$$ Remark that $0\leq\mathbb{E}(T-T_{S-1})\leq \mathbb{E}(X)$, and therefore we have the estimate $$\frac{\mathbb{E}(T)}{\mathbb{E}(X)}\leq\mathbb{E}(S)\leq \frac{\mathbb{E}(T)}{\mathbb{E}(X)} +1, $$ with $\mathbb{E}(T)= \sum_{j=1}^n\frac{n}{j}$ and $\mathbb{E}(X)=\sum_{j=n-k+1}^n \frac{n}{j}$.
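These bounds are easy to test against the exact inclusion-exclusion expectation from the first answer; the snippet below (my addition) does so for one small case.

```python
# Check the sandwich  E(T)/E(X) <= E(S) <= E(T)/E(X) + 1  numerically,
# using the exact inclusion-exclusion value of E(S) from the first answer.
from math import comb

def expected_S(n, k):
    """Exact expected number of days (inclusion-exclusion)."""
    return sum((-1) ** (i - 1) * comb(n, i) / (1 - comb(n - i, k) / comb(n, k))
               for i in range(1, n + 1))

def sandwich(n, k):
    ET = sum(n / j for j in range(1, n + 1))          # classical coupon collector
    EX = sum(n / j for j in range(n - k + 1, n + 1))  # time to see k distinct numbers
    return ET / EX, ET / EX + 1

n, k = 20, 3
lo, hi = sandwich(n, k)
print(lo, expected_S(n, k), hi)  # lower bound, exact value, upper bound
```

(For large $n$ the alternating sum suffers from floating-point cancellation, so small $n$ or rational arithmetic is advisable.)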


So, in the first toss $k$ bins are covered with probability one.

In the second toss, each of the $k$ chosen bins is previously uncovered with probability $\frac{n-k}{n}$, so on average $\frac{n-k}{n}\,k$ uncovered bins become covered. Let $\theta=k/n.$

So the "new" fractions covered are, on average, $$\theta,\ \theta(1-\theta),\ \theta(1-\theta)^2,\ldots,$$ since after $t$ tosses the expected uncovered fraction is $(1-\theta)^t$.

The partial sums equal $1-(1-\theta)^t$, which approaches but never reaches $1$, so a natural estimate of the expected time is the smallest $t$ at which the expected number of uncovered bins drops below one: the smallest integer $t$ such that $$n(1-\theta)^t<1, \qquad\text{i.e.}\qquad t>\frac{\log n}{-\log(1-\theta)}\approx\frac{n}{k}\log n \quad\text{for small }\theta.$$
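To see how good such a geometric-decay cutoff is, the snippet below (my addition) compares the smallest $t$ with $n(1-k/n)^t<1$ to the exact inclusion-exclusion expectation from the first answer.

```python
# Compare the heuristic threshold (smallest t with n(1-k/n)^t < 1) with the
# exact expected number of days from the inclusion-exclusion formula.
from math import ceil, comb, log

def expected_S(n, k):
    return sum((-1) ** (i - 1) * comb(n, i) / (1 - comb(n - i, k) / comb(n, k))
               for i in range(1, n + 1))

def heuristic_t(n, k):
    theta = k / n
    return ceil(log(n) / -log(1 - theta))

for n, k in [(20, 2), (30, 3)]:
    # The heuristic tends to undershoot: it ignores the extra days
    # the last few uncovered bins take.
    print(n, k, heuristic_t(n, k), expected_S(n, k))
```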