If your staging/testing environment is not publicly accessible, you will need to allow our crawler access to it. There are a number of ways to achieve this:
The easiest way is to check for "Sajaribot" at the start of the User-Agent header in incoming HTTP requests and allow those requests through.
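The user-agent check described above can be sketched as a small helper; the function name and the example user-agent strings are illustrative assumptions, not our crawler's exact headers:

```python
def is_sajari_crawler(user_agent: str) -> bool:
    """Return True when the User-Agent header identifies the Sajari crawler.

    Per the guidance above, we only check that the string
    starts with "Sajaribot"; the rest of the header may vary.
    """
    return user_agent.startswith("Sajaribot")


# Illustrative usage (the full header values here are placeholders):
print(is_sajari_crawler("Sajaribot/1.0"))  # True
print(is_sajari_crawler("Mozilla/5.0 (X11; Linux x86_64)"))  # False
```

In practice you would call a check like this from your web server or application middleware before applying any access restrictions.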
It's also possible to whitelist the IP address ranges used by our crawling infrastructure. We recommend checking and updating these regularly, as they are likely to change. Our primary crawling system runs within Google Cloud, so it has a very large and dynamic address range, but it's possible to get current data on this here.
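An IP whitelist check can be implemented with the standard library's ipaddress module. This is a minimal sketch; the CIDR ranges below are documentation placeholders (RFC 5737), not our crawler's actual addresses, which you should fetch from the current published data:

```python
import ipaddress

# Placeholder ranges for illustration only; substitute the
# current published crawler ranges here.
ALLOWED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]


def ip_allowed(client_ip: str) -> bool:
    """Return True when the client IP falls within a whitelisted range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_RANGES)


print(ip_allowed("203.0.113.42"))  # True
print(ip_allowed("192.0.2.1"))     # False
```

Because the ranges change, it's worth loading them from a regularly refreshed source rather than hard-coding them as shown here.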
If neither of these options is practical, you can always index your production site instead and test new search interface integrations on your staging site. This has no impact on your production site's performance and will not change the search functionality running there. It allows the UI to be developed without the need for us to index your staging site.