Fixed: Importing Banned Emails Stops after Encountering Duplicates

Anomandaris

Well-known member
Affected version
2.2.13
When importing an XML file of banned emails, the import stops as soon as a duplicate is found. Any unique emails following the duplicate are skipped.

XF v2.2.13
/admin.php?banning/emails

"Banned emails must be unique. The specified banned email is already in use."


If a duplicate is found, it should just ignore it and continue importing the rest of the list.

How it works right now: if an email that is already on the list is found on, say, line 5, the import stops there and displays the error. The 30,000 lines following that duplicate are not imported. It should just ignore the duplicate and continue to the next line.
 
I totally agree this should be the expected functionality. I hope it's considered a bug rather than a feature request. One could argue it's intended functionality, but intentionally awkward behavior doesn't make sense. It doesn't even tell you which banned email triggered the error, does it? Not sure if that is logged elsewhere.
 
I was trying to solve this problem, but I ended up just getting more confused because the code looks correct. Here are some notes for anyone else trying to fix it:

These are the important files related to this:
Code:
/src/XF/Service/Banning/Emails/Import.php
/src/XF/Service/Banning/AbstractXmlImport.php
/src/XF/Admin/Controller/Banning.php
/src/XF/Entity/BanEmail.php

The error phrase being triggered is: banned_emails_must_be_unique

The import() method from Import.php (which extends AbstractXmlImport):
PHP:
    public function import(\SimpleXMLElement $xml)
    {
        $bannedEmailsCache = (array)$this->app->container('bannedEmails');
        $bannedEmailsCache = array_map('strtolower', $bannedEmailsCache);

        $entries = $xml->entry;
        foreach ($entries AS $entry)
        {
            if (in_array(strtolower((string)$entry['banned_email']), $bannedEmailsCache))
            {
                // already exists
                continue; // <---- ding ding ding
            }

            $this->repository('XF:Banning')->banEmail(
                (string)$entry['banned_email'],
                \XF\Util\Xml::processSimpleXmlCdata($entry->reason)
            );
        }
    }

You can see that if a duplicate is detected (the in_array check succeeds), that entry is skipped with the continue.

The code looks correct here to me, so I don't know why it still errors out after a duplicate is found.


In the xf_ban_email table, the banned_email string column is UNIQUE (it is the primary key).

The getStructure() method from BanEmail.php:

PHP:
    public static function getStructure(Structure $structure)
    {
        $structure->table = 'xf_ban_email';
        $structure->shortName = 'XF:BanEmail';
        $structure->primaryKey = 'banned_email';
        $structure->columns = [
            'banned_email' => ['type' => self::STR, 'required' => true, 'maxLength' => 120,
                'unique' => 'banned_emails_must_be_unique',
            ],
            'create_user_id' => ['type' => self::UINT, 'required' => true],
            'create_date' => ['type' => self::UINT, 'default' => \XF::$time],
            'reason' => ['type' => self::STR, 'maxLength' => 255, 'default' => ''],
            'last_triggered_date' => ['type' => self::UINT, 'default' => 0]
        ];
        $structure->getters = [];
        $structure->relations = [
            'User' => [
                'entity' => 'XF:User',
                'type' => self::TO_ONE,
                'conditions' => [
                    ['user_id', '=', '$create_user_id']
                ],
                'primary' => true
            ]
        ];

        return $structure;
    }
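That `'unique' => 'banned_emails_must_be_unique'` column option is where the error message comes from: when a save would duplicate an existing banned_email value, the entity raises that phrase. A minimal illustrative sketch of the pattern (hypothetical code, not XF's actual validator):

```php
<?php
// Hypothetical sketch of a "unique column" check (NOT XF's real code):
// before inserting, look for the value among existing keys and return
// the configured error phrase if it is already taken, null otherwise.
function validateUnique(array $existingKeys, string $value, string $phrase): ?string
{
    return in_array($value, $existingKeys, true) ? $phrase : null;
}
```

If the check returns a phrase, the save fails and the import aborts with that error, which is exactly the symptom reported above.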
 
As someone who has provided tier 1 and tier 2 customer support at a call center: a duplicate should simply be bypassed and processing should continue. Halting an entire process because of a duplicate entry is not best practice. If run interactively, it should warn you; otherwise it should simply make a note in a log so the duplicates can be addressed later. Anything else is not best practice.
 
I just came across this thread while prepping to do some importing of banned emails.
I tried and failed to reproduce the problem above (which suits me, since importing will be much easier if I don't need to weed out duplicates first), so I guessed it had since been fixed.
(NB: the code above looks absolutely identical in 2.2.15, so I presume any fixing that has happened must have been elsewhere.)

FWIW: after I initially failed to reproduce it with an email address that was already in the database, my instant assumption was that the root cause was that the copy of the cache is created once at the start, before the new banned addresses are added, and is not updated as the loop iterates over the entries in the file.

This would mean that I could trivially reproduce the problem by just having a duplicate within my own XML file, right?
...Wrong. That also failed to throw an error: it calmly ignored the database duplicate(s) and in-file duplicate(s) and added the non-duplicate entries exactly as one would wish 😂
 
Update:
I just managed to hit the problem, LOL!
After scratching my head a bit, I realised that my second reproduction attempt (with duplicates within my own XML file) failed because I simply re-used my previous test file: its entries had already been added, and thus were already in the cache created in the first lines of the code above. Doh... 🤦‍♂️
As soon as I deleted them and retried, bingo.

So in fact the assumption above was correct (yay :ROFLMAO:) and thus all you need to do to avoid hitting this limitation in the XF code is to ensure your own file is free of internal duplicates.
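The failure mode can be shown with a small simulation (hypothetical code, not XF's): the cache is snapshotted once before the loop, so an address added mid-import is invisible to the in_array() check, and a second copy of it later in the file collides with the unique key.

```php
<?php
// Simplified simulation (NOT XF code) of the stale-cache failure mode.
// $fileEntries: emails from the XML file; $dbEmails: emails already banned.
function simulateImport(array $fileEntries, array $dbEmails): string
{
    $cache = array_map('strtolower', $dbEmails); // snapshot, never refreshed

    foreach ($fileEntries as $email) {
        if (in_array(strtolower($email), $cache)) {
            continue; // only catches duplicates present in the DB snapshot
        }
        if (in_array(strtolower($email), array_map('strtolower', $dbEmails))) {
            return 'banned_emails_must_be_unique'; // unique-key violation
        }
        $dbEmails[] = $email; // inserted, but $cache is NOT updated
    }

    return 'ok';
}
```

An in-file duplicate triggers the error, while a duplicate that was already in the database is skipped cleanly, matching the behavior observed in the thread.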

A (brute-force?) fix, permitting duplicates within the file being imported, would I guess be to move these two lines inside the loop:
PHP:
        $bannedEmailsCache = (array)$this->app->container('bannedEmails');
        $bannedEmailsCache = array_map('strtolower', $bannedEmailsCache);
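A lighter alternative (a hypothetical sketch, not the official fix) would be to keep a local set of seen emails and add each newly imported address to it as you go, so in-file duplicates are caught without re-reading the cache on every iteration. The `importEntries()` helper and its parameters below are invented for illustration:

```php
<?php
// Hypothetical sketch: track seen emails locally so duplicates within the
// file are skipped, not just duplicates already in the database cache.
// $banEmail stands in for the repository's banEmail() call.
function importEntries(array $entries, array $bannedEmailsCache, callable $banEmail): int
{
    $seen = array_fill_keys(array_map('strtolower', $bannedEmailsCache), true);
    $imported = 0;

    foreach ($entries as $entry) {
        $key = strtolower($entry['banned_email']);
        if (isset($seen[$key])) {
            continue; // duplicate in the DB cache OR earlier in this file
        }
        $seen[$key] = true; // record it so later in-file duplicates are skipped
        $banEmail($entry['banned_email'], $entry['reason'] ?? '');
        $imported++;
    }

    return $imported;
}
```

Using an associative array as a set also makes each duplicate check O(1) instead of the O(n) in_array() scan.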
 
Thank you for reporting this issue; it has now been resolved. We are aiming to include any changes that have been made in a future XF release (2.3.4).

Change log:
Skip any file duplicates when importing banned emails
There may be a delay before changes are rolled out to the XenForo Community.
 