XF 2.1 Unfurling questions: File too large

Wildcat Media

Well-known member
I don't want to report this as a bug, since it's not really a defect per se, but when I tried to unfurl an Amazon link:


...I get this error when I test the URL unfurling:

Could not fetch metadata from URL with error: File is too large.

Is this a classic case of "information overload", where a typical Amazon product page just has too much information to properly unfurl a URL? I could see having some sort of limit on how much data is used to unfurl a URL (since fetching too much could slow down posting). It would be neat to have product URLs unfurled, but it's not a deal breaker. I'm just curious whether it's a limit set in XF.
 
Solution
Is this a classic case of "information overload", where a typical Amazon product page just has too much information to properly unfurl a URL?
Correct.

Amazon’s markup is ridiculous. But besides that, it seems they only provide any sort of useful metadata (beyond the title) to certain user agents.
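
If you want to see this for yourself, here is a minimal sketch that fetches a page with a chosen user agent and lists whatever Open Graph tags come back; the URL and user-agent string below are placeholders, so substitute a real product link and run it with a couple of different user agents to compare:

PHP:
<?php
// Minimal sketch: fetch a page with a specific user agent and list the
// Open Graph <meta> tags in the response. $url and the User-Agent
// string are placeholders, not real values.
$url = 'https://www.amazon.com/dp/XXXXXXXXXX'; // placeholder product URL

$context = stream_context_create([
    'http' => [
        'header' => "User-Agent: Mozilla/5.0 (placeholder)\r\n",
        'follow_location' => 1,
        'ignore_errors' => 1, // keep the body even on a 4xx/5xx (e.g. a robot check page)
    ],
]);

$html = file_get_contents($url, false, $context);
if ($html === false) {
    exit("Fetch failed\n");
}

preg_match_all('/<meta[^>]*property=["\']og:[^"\']+["\'][^>]*>/i', $html, $tags);
print_r($tags[0]); // an empty array means no useful metadata for this user agent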
amazon.in also fails most of the time here in India. Sometimes it is a robot check, sometimes nothing. I have basically given up on using the unfurl feature on their links. Another domain that seems to hate incoming unfurl requests is bloomberg.com.

 
Worth noting that some Amazon links work already.

We do not recommend modifying XenForo code, but the particular code that could affect this is as follows:

PHP:
protected $limits = [
    'time' => 5, // time limit for the fetch, in seconds
    'bytes' => 1.5 * 1024 * 1024 // max size of the document we'll try to download
];

It's actually probably not the time. It's most likely the bytes. Changing the 1.5 to something like 3 or 5 may make the difference.
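
For reference, 1.5 * 1024 * 1024 works out to 1,572,864 bytes, i.e. 1.5 MB. If you did make that edit anyway (again, not recommended), the change would look something like this, with 3 MB as an example value:

PHP:
protected $limits = [
    'time' => 5, // unchanged
    'bytes' => 3 * 1024 * 1024 // raised from the default 1.5 MB to 3 MB
];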
 
I am now having the same issue linking to CNN articles.

This started about a month ago; since then, more than half of the links to their news articles do not unfurl, and when I test them in admin > tools I get the same error:

"Could not fetch metadata from URL with error: File is too large."

I then have to delete the ones that won't unfurl individually, for uniformity, which is a bit of a pain. I really need those links to work, but I'm not going to hard-code an increase myself, because then I would get alerts every time the system performs a file health check.

Is the limit critical to the system working properly? If not, then I humbly request that the current hard-coded metadata limit be increased in the next update.
 
There should be an option to change this. Amazon links are very common, so unfurling them should just work, no matter whether the user-agent behaviour described above is the cause here or not.
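
In the meantime, a cleaner workaround than editing core files would be a class extension in a small add-on, which keeps the file health check clean. A hedged sketch follows: the add-on namespace is illustrative, and you would point the class extension entry in the Admin CP at whichever core class actually defines $limits in your XF version.

PHP:
<?php
// Sketch of the add-on route: override $limits via a XenForo class
// extension instead of editing core files, so the file health check
// keeps passing. "MyAddon" and the extended class are placeholders;
// the XFCP_ parent alias is generated by XenForo's extension system.

namespace MyAddon\XF\Http;

class Reader extends XFCP_Reader
{
    // Redeclaring the protected property replaces the default limits
    protected $limits = [
        'time' => 5,
        'bytes' => 3 * 1024 * 1024 // 3 MB instead of the default 1.5 MB
    ];
}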
 