
Technical Help! Bulk files, URLs and more.

Discussion in 'Off Topic' started by Slavik, Oct 24, 2010.

  Slavik (XenForo Moderator, Staff Member)

    Righto, going to throw this out here; hopefully someone can help.

    I have a large set of old files I need to re-upload by tomorrow.

    The problem is that, because of how they were given to me, Internet Explorer parses the pages correctly, but every other browser just shows the source rather than rendering it.

    Each page is a static page, there is no dynamic linking or such.

    They are saved in the format generic.php@page=1, generic.php@page=2, generic.php@page=3, etc. Each page has links to other generic pages, and this goes up to 63,000 files.

    If I add .html onto the file name, so generic.php@page=1.html, the other browsers will render the page. However, adding the .html breaks the links the other pages contain.

    So, my options.

    a) A way to mass-rename all files to add .html onto the end, then mass-edit all the files internally so their links point to the new .html pages.

    I have a biterscript I used to edit some other stuff, but I am having trouble making it work with wildcards.

    b) A way to force the .html extension onto the page once it is requested. I tried this through .htaccess, but again I had problems: I could get it to rewrite down to generic.html, but it would need to rewrite to generic.php@page=1.html.
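    For option (b), a rewrite may not even be needed. The likely reason other browsers show source is that the extensionless names are served with the wrong Content-Type (IE content-sniffs; the others trust the header), so telling the server to send these files as HTML could be enough. A sketch, assuming Apache with .htaccess overrides (FileInfo) enabled; the regex matching the saved filenames is my assumption:

    ```apache
    # Serve the oddly named saved pages as HTML without renaming them.
    <FilesMatch "^generic\.php@page=\d+$">
        ForceType text/html
    </FilesMatch>
    ```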

    The biterscript is here.
    Any suggestions?

    # List the saved files; "*.php" would miss names ending in a page number.
    var str list ; lf -n "generic.php@page=*" "F:\Project_Files\test" > $list
    while ( $list <> "" )
        var str file ; lex "1" $list > $file
        var str content ; cat $file > $content
        # NOTE: the "*" in the replacement string is emitted literally; it
        # does not carry over the page number matched on the left-hand side.
        while ( { sen -c "^generic.php@page=*^" $content } > 0 )
            sal "^generic.php@page=*^" "generic.php@page=*.html" $content > null
        endwhile
        echo $content > { echo $file }
    endwhile
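    If biterscripting's wildcards stay stubborn, option (a) can be sketched in Python instead (directory path and flat layout assumed, one-shot run, not idempotent). A regex capture group carries the matched page number into the rewritten link, which is the wildcard behaviour the replacement above lacks:

    ```python
    import os
    import re

    # Matches internal links like generic.php@page=42; the capture group
    # keeps the whole link text so the replacement can re-emit it.
    LINK = re.compile(r"(generic\.php@page=\d+)")

    def add_html_extension(directory):
        """Rewrite the links inside each saved page to end in .html,
        then rename the file itself to match."""
        for name in os.listdir(directory):
            if not re.fullmatch(r"generic\.php@page=\d+", name):
                continue
            path = os.path.join(directory, name)
            with open(path, "r", encoding="utf-8", errors="replace") as f:
                content = f.read()
            # \1 re-inserts the matched link, so generic.php@page=7
            # becomes generic.php@page=7.html, for any page number.
            content = LINK.sub(r"\1.html", content)
            with open(path, "w", encoding="utf-8") as f:
                f.write(content)
            os.rename(path, path + ".html")
    ```

    Run it once against a copy of the files first; 63,000 files is a lot to un-rename if something is off.
    
    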
