How to calculate GCP Cloud Run pricing correctly

The estimator is quite dumb. After a few tests, I understood its configuration.

Here are some details:

  • 100 (peak) - 2 (min) = 98 -> the number of instances that can scale up and down. Arbitrarily, the calculator assumes each of them is UP 50% of the time and down 50% of the time. Therefore, it considers 49 instances up full time during the month, on average.
  • In addition to those 49, the 2 minimum instances are ALWAYS on. Therefore, the total number of instances considered always on during the month is 51.
  • 51 * 730 * 3600 -> ~134 million: the number of CPU-seconds used by the calculator.

Now, about your second way of calculating:

CPU cost = 24 * 30 * 0.00002160 * 3600 * 60 = 3359.23

Have a close look at the numbers used:

  • 24: number of hours per day
  • 30: number of days per month
  • 0.0000...: CPU cost per second
  • 3600: number of seconds per hour
  • 60: ???? What is that? The number of instances per month? The number of seconds per minute? The number of minutes per hour? (For the last two answers, that's already taken into account by the 3600.)
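Dropping that stray 60, the formula gives the cost of one instance running full time (a sketch, reusing the $0.00002160 per-second rate from the formula above; check the current Cloud Run pricing page for real rates):

```python
CPU_PRICE_PER_SECOND = 0.00002160  # rate taken from the formula above

seconds_per_month = 24 * 30 * 3600             # 2,592,000
cost_per_instance = seconds_per_month * CPU_PRICE_PER_SECOND
print(f"${cost_per_instance:.2f} per instance-month")  # ~$55.99

# Multiplying by the mystery 60 is what inflates it 60x:
print(f"${cost_per_instance * 60:.2f}")  # the 3359.23 figure
```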

Final word: when you talk about numbers, be careful with the numbers themselves. You dropped several zeros, and that makes it difficult to understand your issue.

I don't know if I answered your question. In any case, it's difficult to know exactly the cost of a pay-as-you-use product. You can know the max cost by setting a max number of instances: you know you will never go above that threshold. But if you don't have a clear view of your traffic and the number of requests (and you also forgot the egress cost), it's impossible to get a precise estimate.
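For that max-cost bound, a rough sketch (the rates below are illustrative assumptions, not official prices; compute is billed per vCPU-second and per GiB-second):

```python
# Worst-case bound: assume every allowed instance is billed full time.
MAX_INSTANCES = 100
HOURS_PER_MONTH = 730
CPU_PRICE_PER_SECOND = 0.00002160  # per vCPU-second (example rate)
MEM_PRICE_PER_SECOND = 0.00000240  # per GiB-second  (example rate)

billed_seconds = MAX_INSTANCES * HOURS_PER_MONTH * 3600  # 262,800,000
max_cost = billed_seconds * (CPU_PRICE_PER_SECOND + MEM_PRICE_PER_SECOND)
print(f"Worst-case compute cost: ${max_cost:,.2f}/month")
```

Request fees and egress would come on top of this, which is exactly why a precise estimate needs real traffic numbers.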