Stop Googlebot from crawling a URL more than once?

I have a site that typically creates a few thousand pages a day, and these pages don't change after they have been created. Recently my dedicated server crashed because Googlebot was crawling the site too heavily. According to Search Console, on many days Googlebot crawls the site tens of thousands of times, which means it keeps re-crawling pages it has already crawled. I am aware I can limit Googlebot's crawl rate, but is it possible to force Googlebot to crawl a page ONCE and ONCE only?
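
For context, the pages are effectively static once generated. Here is a minimal sketch of what I could do on my end (assuming a Python/Flask stack purely for illustration; the route and `load_page` helper are hypothetical) to answer Googlebot's conditional requests with 304 Not Modified, so that at least a repeat crawl doesn't re-download the full page:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime, format_datetime
from flask import Flask, Response, request

app = Flask(__name__)

def load_page(page_id):
    # Hypothetical lookup: return the stored page body and its creation time.
    # Pages never change after creation, so the creation time is also the
    # last-modified time.
    created = datetime(2024, 1, 1, tzinfo=timezone.utc)
    return "<html>...page body...</html>", created

@app.route("/pages/<page_id>")
def page(page_id):
    body, created = load_page(page_id)

    # If the client (e.g. Googlebot) sends If-Modified-Since and the page has
    # not changed since that date, reply 304 with no body instead of
    # re-sending the whole page.
    ims = request.headers.get("If-Modified-Since")
    if ims:
        try:
            if created <= parsedate_to_datetime(ims):
                return Response(status=304)
        except (TypeError, ValueError):
            pass  # malformed header: fall through and send the full page

    resp = Response(body, mimetype="text/html")
    resp.headers["Last-Modified"] = format_datetime(created, usegmt=True)
    # A long max-age hints that the page is effectively static.
    resp.headers["Cache-Control"] = "public, max-age=31536000"
    return resp
```

I realise a 304 response still counts as a request hitting the server, so this doesn't stop the crawling itself, which is why I'm asking whether the re-crawling can be prevented entirely.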