Which would be even more of a reason to standardise an input field that handles web addresses as humans enter them, not as machines prefer them. Yet, here we are.
Neat tool. A couple of suggestions:

I'd make it fetch the meta tags and image using the user agent string of the services you're showing previews for. For example, Twitter/X fetches meta tags with a user agent string of Twitterbot/1.0. Some sites may serve different content to different services in order to optimise the image for display on that service.

It also looks like your API may not be looking at Twitter-specific meta tags [0], as it just returns one set of metadata that's used by every preview. For example, on https://govukvue.org I use the 'summary' card format, which shows a small image with the name and description beside it. But your tool renders it as if it's a 'summary_large_image'.

[0] https://developer.x.com/en/docs/x-for-websites/cards/overvie...
Does not appear to handle Open Graph correctly. For example, it displayed pixelated favicons resized to fit their containers, rather than the `og:image` from the head tag.
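The fallback order this comment implies can be sketched as (a hypothetical helper, not the tool's actual code):

```python
def choose_preview_image(meta_tags, favicon_url):
    """Prefer og:image (then twitter:image); a favicon is a last resort only."""
    for key in ("og:image", "twitter:image"):
        if meta_tags.get(key):
            return meta_tags[key]
    # Upscaling a 16x16 or 32x32 favicon to card size is what causes pixelation.
    return favicon_url

# With og:image present, the favicon should never be used.
meta = {"og:image": "https://example.com/hero.png"}
print(choose_preview_image(meta, "https://example.com/favicon.ico"))
# https://example.com/hero.png
```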
That's pretty cool! Get ready to keep these up to date monthly or become obsolete quickly.

One of the downsides of tools like this is that your URL needs to be available online, so if there's an issue, your iteration loop is quite long. In Polypane [1] I've built social media previews that work with any local URL, but also let you overwrite that URL for the social media platforms that display it. I built (and frequently maintain) previews for X/Twitter, Facebook, Slack, LinkedIn, Discord, Mastodon, Google Search, Bluesky and Threads. For all of those I have designs for their light and dark modes, so you really can test everything. It also tells you what's missing and what's incompatible. Check it out: https://polypane.app/social-media-previews/

[1] https://polypane.app/social-media-previews/
I think a little bit of it is fine.

The person pointed out a specific limitation and then offered a solution, very clearly stating that they made it. Somebody might find this useful.
Missing LinkedIn and also missing Mastodon. Neat tool! If the page is missing something, it would be helpful to show some text on how to improve it, such as what should be done.
I wish some formal standard for this would catch on, like a `META` HTTP request type or something. We try to pull in link metadata sometimes and get a Cloudflare captcha instead.
This is really awesome, I've been looking for this exact tool. Putting the preview in the context of a real message / post makes it more useful.
Cool idea, though it seems like it still needs some polish. There are small issues; for example, the rendering of HN links on Discord doesn't look correct.
Neat little tool! It helped me figure out that my website's logo is getting cropped on some socials, making it easy to see how I need to modify it to fix the issue.
This will be very useful for the half-decade we might have left until links to anything except the top 5 sites are auto-filtered.
I thought that one was unable to handle domains without a protocol, which makes it pretty much useless for normal business cases. I've never met a single non-technical person who understood what that https:// was, why they should add it, or who didn't get immeasurably bored if you tried to explain it…