Facebook like - showing cached version of og:image, any way to refresh or reindex it?

Having an issue with Facebook like and a cached og:image.

Long story short: Facebook has cached an older version of our like image. The meta content URL can't be changed. Is there anything I can do to refresh it?

Long story: The site I'm working on has a meta tag for an og:image that Facebook uses when a page is liked. This meta tag uses the same image URL on all pages across the site. The image is simply a branding image for the site.

The issue is the site recently updated their branding, and we can't get the Facebook like image to update. When a user clicks the like link, the resulting post to Facebook still shows the old branding image.

The meta tag is similar to:

<meta property="og:image" content="http://[domain].com/images/bookmark/apple-touch-icon.png"/>

Whenever a like makes its way to Facebook, the URL to the image is changed to the cached Facebook URL, similar to this:

http://external.ak.fbcdn.net/safe_image.php?d=AQDajxm-qgVNdfEL&w=90&h=90&url=http%3A%2F%2F[domain].com%2Fimages%2Fbookmark%2Fapple-touch-icon.png

This URL displays the older version of the site's branding. It has been over a week, and it has not updated yet.

Is there any way to force Facebook to reindex the image or clear its cache? Or does Facebook periodically do this automatically? I couldn't find any relevant information on this.

I know that changing the URL in the meta tag could fix the issue, but the meta tag is generated by code shared across multiple sites and cannot be changed. I also tried the linter tool, as others suggested. No luck.


Insert your URL into their linter and it should reload its cache


You can use Facebook's Object Debugger, which allows you to enter the page URL and then, on the next page, re-submit it in a request to 'Fetch new scrape information'. This will clear Facebook's cache for the given URL. Note that it may take some time to propagate around all their cache nodes.

Facebook's Object Debugger can be found here: https://developers.facebook.com/tools/debug/
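
If you have a lot of URLs to refresh, the same re-scrape can also be triggered programmatically via the Graph API instead of pasting each URL into the debugger by hand. Below is a minimal PHP sketch assuming the curl extension and an app access token; the page URL and token are placeholders:

<?php
// Sketch: ask the Graph API to re-scrape a URL (same effect as the
// debugger's 'Fetch new scrape information' button). The URL and
// access token below are placeholders - substitute your own.
$pageUrl     = 'http://example.com/some-page';
$accessToken = 'APP_ID|APP_SECRET'; // app access token

$ch = curl_init('https://graph.facebook.com/');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'id'           => $pageUrl,
    'scrape'       => 'true',
    'access_token' => $accessToken,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

echo $response; // JSON describing the freshly scraped object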

We recently found that Facebook was caching URLs against the relative URL while ignoring the query string, which messed up a few dynamic images we were serving purely based on the query string.

It turns out that you can specify a last-modified timestamp (in Unix timestamp format) to help ensure that when Facebook crawls your site, it always gets the correct image.

This can be done by including the following OG meta tag:

<meta property="og:updated_time" content="123465789" />

For dynamic sites you'll want to generate the content value - using PHP the current Unix timestamp can be inserted as follows:

<meta property="og:updated_time" content="<?=time()?>" />

I think I have a possible solution... what if you add a random string at the end of the URL?

like www.server.com/something.php?v=<?php echo rand() ?> or www.server.com/something.jpg?v=<?php echo rand() ?>

I guess Facebook caches objects depending on the URL... changing it randomly could help.
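
A variation on this idea: rand() hands Facebook a brand-new URL on every request, so it may end up caching many copies of the same image. Keying the version parameter to the file's modification time changes the URL only when the image actually changes. A sketch with a placeholder domain and path:

<?php
// Sketch: deterministic cache-buster - the query string only changes
// when the image file changes, unlike rand(). Domain and path are
// placeholders for illustration.
$imagePath = $_SERVER['DOCUMENT_ROOT'] . '/images/bookmark/apple-touch-icon.png';
$version   = file_exists($imagePath) ? filemtime($imagePath) : 1;
?>
<meta property="og:image" content="http://example.com/images/bookmark/apple-touch-icon.png?v=<?= $version ?>" />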


Seven years after this post was made, this is still a problem, but it's not Facebook's cache: it is human error (allow me to elaborate).

OG:TYPE affects your image scrape:

  1. https://ogp.me/#type_article is not the same as https://ogp.me/#type_website

Be aware that og:type=website will cause any /sub-pages/ of that URL to become "canonical". This means you will have trouble getting your images to update using the scraper no matter what you do.

Consider this "assumption and common mistake"

- <meta property="og:type" content="website" /> => https://www.example.org (parent)
- <meta property="og:type" content="website" /> => https://www.example.org/sub-page/
- <meta property="og:type" content="website" /> => https://www.example.org/sub-page/child-2/
- Ergo: /sub-page/ and /child-2/ will inherit the og:image of the parent

Those are not all "websites": one is a website, the others are articles.

If you do that, Facebook will think all of those are canonical and will put the FIRST og:image into all of them (try it, you'll see). If you set the og:url to your root or parent domain, you've told Facebook they are all canonical. (There is a good reason for that, but it's off topic.)

Consider this solution (which is what most people "really want"):

- <meta property="og:type" content="article" /> => https://www.example.org/sub-page/
- <meta property="og:type" content="article" /> => https://www.example.org/sub-page/child-2/

If you do that, Facebook will give you far fewer problems with scraping your NEW images.
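
For instance, a sub-page's head might carry a set of tags like this (a sketch; the URLs and image are placeholders), so each page is its own object with its own image:

<!-- Sketch: each sub-page declares og:type=article plus its own
     canonical og:url, so the scraper keys the og:image to that page
     alone instead of inheriting the parent's image. -->
<meta property="og:type"  content="article" />
<meta property="og:url"   content="https://www.example.org/sub-page/" />
<meta property="og:image" content="https://www.example.org/images/sub-page-share.png" />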

In closing: YES, the cache busters, random vars, changing URLs, and other suggestions here can work, but they will seem like "intermittent voodoo" if the og:type is not specified correctly.

PS: remember that a CDN or server-side cache will serve stale content to Facebook's scraper even if you "think" you can see the most recent version. (I won't spend any time on this other than to point out that it will waste colossal amounts of your time if not double-checked.)