Is there an upper bound on the number of points in a point cloud for which we can compute persistent homology?

It depends very much on the type of simplicial complex you're using. If you have points in $\mathbb{R}^3$, then building a Čech or Delaunay (alpha) complex is feasible with millions of points. If you have high-dimensional data, the complexes will generally blow up in size, and millions of points will be too many. Finding ways to decrease the size of the complexes, possibly using approximations (e.g., witness complexes), is an active research area. It is hard to formulate a precise answer because it depends too much on how nice the data is and how much error is tolerated.
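
To make the low-dimensional case concrete, here is a minimal sketch using the GUDHI library's alpha complex; the point count and random point cloud are hypothetical choices for illustration, not anything from the question.

```python
# Minimal sketch: persistent homology of a 3D point cloud via an alpha
# complex (GUDHI). The data and sizes are illustrative.
import numpy as np
import gudhi

rng = np.random.default_rng(0)
points = rng.random((100_000, 3))  # 100k random points in the unit cube

# In 3D an alpha complex stays small (near-linear in practice), whereas a
# Vietoris-Rips complex on the same points would be astronomically larger.
alpha = gudhi.AlphaComplex(points=points)
st = alpha.create_simplex_tree()
print(f"{st.num_simplices():,} simplices for {len(points):,} points")

diagram = st.persistence()  # list of (dimension, (birth, death)) pairs
```

Swapping in a Vietoris-Rips construction (e.g., `gudhi.RipsComplex`) on high-dimensional data is exactly where the blow-up described above kicks in.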


Possibly of interest to you is this paper, A roadmap for the computation of persistent homology by Otter, Porter, Tillmann, Grindrod, and Harrington. They compare different pieces of software for computing persistent homology, and list off the maximum sizes of simplicial complexes that each software could handle under their constraints.

[Table: maximum size of simplicial complex supported by various pieces of software]

It looks like they were mostly working with either $2$- or $3$-dimensional data sets in the paper. Anyway, to answer your question: current software can compute the persistent homology of simplicial complexes containing on the order of $10^9$ simplices (note that the limits in the table are stated in simplices, not points). The above table is from this presentation by Nina Otter.
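
As a back-of-the-envelope check on why these limits are quoted in simplices rather than points, here is a quick hypothetical computation showing how fast a full Vietoris-Rips complex outgrows a $10^9$ simplex budget:

```python
# Why complex size, not point count, is the binding constraint: a
# Vietoris-Rips complex on n points, truncated so it can compute homology
# up to degree d, may contain up to sum_{k=1}^{d+2} C(n, k) simplices.
# The numbers below are illustrative.
from math import comb

n, d = 1_000, 3  # just 1,000 points, homology up to degree 3
total = sum(comb(n, k) for k in range(1, d + 3))
print(f"{total:.3e} potential simplices")  # ~8.3e12, far beyond 10^9
```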