For decades, there was only one search engine that mattered, and thus only one search “bot” to worry about. Now, huge budgets and fierce competition among the AI labs have led to an explosion of search options for consumers—and the bots that power them.
According to Phocuswright’s July 2025 report “Chat, Plan, Book: GenAI Goes Mainstream,” one-third of U.S. travelers use generative AI—tools like ChatGPT, Google Gemini and AI-powered search—to plan and experience trips, from inspiration to in-destination support. What’s more, that number is growing rapidly.
To stay competitive and visible, it’s absolutely critical that official DMO and hospitality websites adapt to this new, AI-powered landscape. Key to this shift is the role that AI-powered bots play in search. In this post, we’ll take a look at the different types of AI-powered web crawlers and share how you can manage your website’s bot access for maximum success.
An Intro to Bots
Generally speaking, a bot is a program that follows some set of rules to accomplish a goal autonomously. The bots we’re talking about in this post are specifically designed to crawl the internet to collect information from public—and, controversially, sometimes not public!—websites. They access pages from your website and catalog the information they find there for different purposes, which we’ll discuss below.
AI Web Crawler Bots and Your Website: To Block or Not to Block?
Web crawlers are nothing new—these tools have been combing the web for years, quietly collecting information from websites for search engines like Bing and Google. However, the unique and evolving needs of AI have changed the nature of these tools dramatically.
As the owner of your website, you can decide which bots to allow and which to block. There’s no one-size-fits-all solution to bot management; you’ll need to weigh the pros and cons of each type of search bot before making a decision. Before the rise of AI, the question was pretty easy to answer: Do you want your website to show up in Google, Bing, DuckDuckGo and other search engines or not?
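In practice, this decision is usually expressed in your site’s robots.txt file: a plain-text file served at the root of your domain that tells crawlers, identified by their user-agent names, which parts of the site they may visit. As a minimal sketch, with made-up bot names purely for illustration, it might look like this:

  # robots.txt (served at https://www.example.com/robots.txt)
  # Block a hypothetical crawler from the entire site
  User-agent: ExampleBlockedBot
  Disallow: /

  # Let another hypothetical crawler in everywhere except one section
  User-agent: ExampleAllowedBot
  Disallow: /internal/

  # Any bot not named above may crawl freely
  User-agent: *
  Disallow:

Keep in mind that robots.txt is a request, not an enforcement mechanism: reputable crawlers honor it, but nothing technically stops a bot from ignoring it.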
The new types of AI bots have made these decisions a lot more complicated. Below we outline the three main types of AI-related crawlers and provide some information about what it means to allow or disallow them.
#1: Training bots
These bots collect data that is used to train the AI models themselves, like ChatGPT, Claude or Gemini. The information they gather becomes part of the model’s built-in knowledge, with no citation of where it came from.
Should you block them?
You might choose to block these bots on the premise that you don’t want to give an AI tool the ability to replicate content from your website. However, if you block them, you give up any influence over the information the model “knows” about your destination or hospitality business.
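If you do decide to disallow training crawlers, the rules go in that same robots.txt file, targeted at the training bots’ published user-agent names. As a hedged example, OpenAI documents GPTBot and Common Crawl documents CCBot as crawlers whose data can end up in model training; names like these change over time, so confirm them against each vendor’s current documentation before relying on this sketch:

  # Illustrative only: verify current training-crawler names with each vendor
  User-agent: GPTBot
  Disallow: /

  User-agent: CCBot
  Disallow: /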
#2: Search index bots
The second type of AI bot is essentially the same as a “traditional” Google or Bing indexer. The information they collect is processed into a search index that is updated every few days. The AI models can access these search indexes as external tools to get up-to-date information. When the models use information from a search index, they (generally) provide a citation to the source website. You’ll see these citations in ChatGPT, Claude or Gemini as in-line links or in a list of sources.
Should you block them?
While citations do link directly to websites, the longer, conversational responses from AI models mean those links drive far less organic traffic to your site than a “traditional” search engine would. Essentially, the AI’s answer gives the user everything they need, so they rarely click the citation linking back to your site. For this reason, you might choose to block these bots. However, disallowing them means the AI models won’t have access to updated information about your brand, so they won’t be able to cite your website in their responses or send any traffic to your site.
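Blocking this category works the same way, just aimed at different user agents. For example, OpenAI publishes OAI-SearchBot as its search-index crawler and Perplexity publishes PerplexityBot; as with the names above, treat these as examples to verify rather than a definitive list. The sketch below blocks one while explicitly allowing the other, to show that the choice can be made bot by bot:

  # Illustrative only: confirm current crawler names before deploying
  User-agent: OAI-SearchBot
  Disallow: /

  # An empty Disallow means this bot may crawl everything
  User-agent: PerplexityBot
  Disallow: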
#3: User-requested bots
The third type of bot handles direct user requests—like when a user asks an AI model about a specific website, or asks a question so narrow or recent that the model decides to grab “live” page content to create its response.
Should you block them?
While you have the power to block these bots, user-requested bots can be an asset to a destination or hospitality organization because they are responding to specific requests for information. Allowing these bots can ensure that the user gets accurate information in response to their query. It’s probably also worth considering these requests as legitimate traffic to your site; the user might not click through, but they still received information directly from your website.
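If you want to welcome user-requested fetchers while still limiting training crawlers, you can combine allow and block rules for different user agents in the same file. OpenAI, for instance, documents ChatGPT-User as its user-request agent and GPTBot as its training crawler; again, treat these names as current-as-of-this-writing examples and double-check them in each provider’s documentation:

  # Illustrative only: allow user-requested fetches, block training crawls
  User-agent: ChatGPT-User
  Disallow:

  User-agent: GPTBot
  Disallow: /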
It’s worth noting here that while most AI tools, including ChatGPT, Claude and Perplexity, label their bots by purpose as described above, Google uses the same bots for all three purposes. Within the industry, there have been calls for Google to change their methodology, but as it currently stands, blocking Googlebot will remove your site from Google entirely, not just from their AI tools.
AI-powered search is here to stay—and Miles is with you every step of the way. If you’re interested in learning more about this seismic change in how travelers plan and experience travel, reach out to us or check out our webinar “Harnessing the Power of AI: What it Means for Tourism & Hospitality,” co-presented with Phocuswright.