What is a bot?
Bots are web crawlers: the same kind of computer programs that crawl the web to fill Google's search index. They are scripts that visit URLs looking for content online. If a bot finds your feed, it may try to visit every URL contained within it.
This isn't malicious: the bots are doing exactly what they were programmed to do. All bots have a purpose, and they're usually fairly mundane:
Crawling content for search engines
Saving content for the Internet Archive
Building a graph of links around the web
Looking for marketing data or keywords
Some bots are easy to identify: they announce their purpose in the "User-Agent" header, the calling card of a web request. Other bots try to avoid special treatment by pretending to be another piece of software. A bot might identify itself as Google Chrome, for example, so that website owners can't single it out for better (or worse) handling.
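To make this concrete, here is a minimal sketch of User-Agent matching in Python. The signatures listed are real, publicly documented crawler identifiers, but this short list is purely illustrative; a production filter would need a much longer, regularly updated list, and it still can't catch a bot that spoofs a browser's User-Agent.

```python
# Illustrative only: flag requests whose User-Agent declares a known bot.
KNOWN_BOT_SIGNATURES = [
    "Googlebot",    # Google's search crawler
    "bingbot",      # Microsoft Bing's crawler
    "ia_archiver",  # the Internet Archive's crawler
    "AhrefsBot",    # a link-graph / SEO crawler
]

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a known bot signature.

    A bot that disguises itself as a browser (e.g. Chrome) will pass
    this check undetected, which is exactly why header matching alone
    can't filter out every bot.
    """
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in KNOWN_BOT_SIGNATURES)
```

A disguised bot sending a Chrome-like User-Agent would return `False` here, slipping straight through into the analytics tally.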
Because not all bots are easily identified, it's not possible for us to filter them all out when we tally up analytics data. We have some sophisticated systems to identify bots, but some still slip through. From time to time, we may update our analytics data to account for bots that were discovered long after they downloaded an episode.