Number of indexed pages with "site:" search less than reported in Google Webmaster Tools
You do not have an indexing problem. Google Webmaster Tools is the authoritative source of information about your website with Google; whatever it says there is true. Operators like site: and link: are known not to show all relevant results. This is on purpose: it prevents others from knowing exactly how Google is handling your website and inhibits any attempts by others to manipulate the search results.
You posted this question twice in slightly different forms. This answer was written for the question marked as a duplicate, and I am reposting it here in the hope that it helps. While the two questions may not be exactly the same, please keep that in mind.
You have three things going on here, and there is little or no connection between them. I will explain.
1] In Google Search Console (Webmaster Tools), under Google Index > Index Status, the number shown is the actual number of pages Google has indexed from a particular site. This is a factual number.
2] In Google Search Console (Webmaster Tools), under Crawl > Sitemaps, the number shown is the number of pages within the sitemap that were indexed as of the last sitemap audit. Please know this is not real-time or even close: it is based on audits of the sitemap, which happen periodically and in short fits and spurts. For this reason, it is a misleading metric that should be explained but is not. This number rarely agrees with the number described in item 1.
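For reference, the sitemap being audited here is just a plain XML file listing the site's URLs in the sitemaps.org format (example.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to crawl and index -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-11-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2015-10-20</lastmod>
  </url>
</urlset>
```

The Sitemaps report simply compares the URLs in this file against what made it into the index at audit time, which is why the count lags behind the Index Status number.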
3] The "About nnn results" count is the number of pages in the query result set from the index that pass all of the SERP "filters". When a search query is submitted, the result sets (there are actually several queries) are combined and then passed through filters, which do several things such as ranking the results; this is also where SERP penalties (the less severe of the two penalty types) are applied. This number rarely agrees with the number described in item 1.
Of the three, if you want to know the number of pages a site has indexed, option 1 is the best. If you want to know the number of indexed pages that currently pass the filters, then option 3 with a site: search is the best. Even so, option 3 should be almost totally ignored unless you are looking at this number often enough to get a solid feel for it. For a new site in particular, it can be a disappointing metric simply because not all of the rank metrics for the indexed pages have been calculated yet.
For any new site, and a site that is only 2 months old is extremely new, you should not worry about any of the metrics too much. The reason is simple: it takes nearly a year for a site to fully settle into the SERPs. Part of this process is assessing metrics such as click-through rate (CTR), bounce rate, time spent on page, time spent on site, and so on. Add to that all of the major updates Google has made since March, with Panda in particular still rolling out and continuing into next year. With all of this core algorithm disruption, plus the addition of RankBrain and Panda, any site will see fluctuations, some possibly severe at times, which makes it nearly impossible for anyone to assess where their site really stands for quite some time.
I advise working on your site, making it the best you can, and keeping your head down for a while. Do not get into the metric weeds just yet; it would not pay dividends.
It could also be duplicate content filters. Do a site: search, then go to the last page of the search results; this may be page 30 or more. At the bottom of that last page you should see something similar to this:
In order to show you the most relevant results, we have omitted some entries very similar to the 348 already displayed. If you like, you can repeat the search with the omitted results included.
By clicking the repeat the search with the omitted results included link, you should see more results listed throughout all the pages. Results can be hidden, even results on page 1; this would expose them.
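That link works by appending Google's filter=0 parameter to the search URL, which asks Google to include the results it would normally omit as near-duplicates. As a small sketch (filter=0 is a long-standing but undocumented parameter and could change at any time; site_search_url is just an illustrative helper name), such a URL can be built like this:

```python
from urllib.parse import urlencode

def site_search_url(domain: str, include_omitted: bool = False) -> str:
    """Build a Google site: search URL.

    When include_omitted is True, append filter=0, which asks Google
    to include near-duplicate results it would normally omit.
    (filter=0 is undocumented and may change.)
    """
    params = {"q": f"site:{domain}"}
    if include_omitted:
        params["filter"] = "0"
    return "https://www.google.com/search?" + urlencode(params)

print(site_search_url("example.com"))
# With omitted results included:
print(site_search_url("example.com", include_omitted=True))
```

Comparing the result counts with and without filter=0 gives a rough sense of how many of your pages are being suppressed by the duplicate-content filter.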