Initially, the process took only a few seconds, but that was against just a subset of the data. Once I got the larger data set, the import took 18:49, which wasn't acceptable. So it was time to leave the "INSERT INTO ___ SET ___ = ___" style, which is so much more readable, and move to the "INSERT INTO ___ VALUES ( ___ , ___ )" style.
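To make the two statement styles concrete, here is a minimal Python sketch of how each query string might be built. The table name `import_data` and the columns `name`/`qty` are hypothetical, purely for illustration; the original work was done in PHP.

```python
# Hypothetical row for illustration; dicts preserve insertion order in Python 3.7+.
row = {"name": "widget", "qty": 5}

# Readable MySQL-only form: INSERT ... SET col = val, ...
set_style = "INSERT INTO import_data SET " + ", ".join(
    f"{col} = %s" for col in row
)

# Standard form: INSERT ... (cols) VALUES (...); this is the form that
# can later be extended to multiple rows per statement.
values_style = "INSERT INTO import_data ({}) VALUES ({})".format(
    ", ".join(row), ", ".join(["%s"] * len(row))
)

print(set_style)     # INSERT INTO import_data SET name = %s, qty = %s
print(values_style)  # INSERT INTO import_data (name, qty) VALUES (%s, %s)
```

Both forms do the same work for a single row; the VALUES form matters because it is the one MySQL allows to carry many rows at once.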
The first thing I did in testing was to disable indexes, insert, then enable indexes again. This brought the time down to 9:27, which is a big improvement, but still not where I wanted to be.
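The disable/insert/enable sequence can be sketched as the list of statements sent to the server. Note that MySQL's `ALTER TABLE ... DISABLE KEYS` only skips maintenance of non-unique indexes (and only on MyISAM tables; InnoDB ignores it), with the index rebuilt in one pass when keys are re-enabled. The table name `import_data` is again a stand-in.

```python
def bulk_load_statements(insert_sql: str) -> list[str]:
    """Wrap a bulk insert so non-unique indexes are rebuilt once at the end,
    instead of being updated row by row during the load."""
    return [
        "ALTER TABLE import_data DISABLE KEYS",
        insert_sql,
        "ALTER TABLE import_data ENABLE KEYS",
    ]

for stmt in bulk_load_statements("INSERT INTO import_data (name, qty) VALUES (%s, %s)"):
    print(stmt)
```

Each statement would be executed in order through whatever database driver the script uses.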
Changing from set _=_ to values ( _ , _ ) made no real difference.
Then, I changed to doing multiple records in a single pass, and what a difference that made.
| Records At Once | Time taken |
| --- | --- |
As you can see, the more records inserted per statement, the faster it went. This held up even to the point where the whole load took just three INSERT statements. At some point I'd expect diminishing returns as my PHP script runs out of memory, but on this limited data set I didn't hit that point. Even so, going to just 25 records per statement realized most of the performance increase, and 500 was closing in on the limit of the gains. For memory usage reasons, I'd consider using a smaller batch size. However, since the plan is to truncate and replace the data, inserting everything at once is probably preferred to minimize the time the table sits without data.
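The batching approach above can be sketched as follows: split the rows into chunks and build one multi-row VALUES statement per chunk. Table and column names are hypothetical, and the batch size would be tuned per the timings in the table.

```python
from itertools import islice

def batched(rows, size):
    """Yield successive chunks of at most `size` rows."""
    it = iter(rows)
    while chunk := list(islice(it, size)):
        yield chunk

def multi_row_insert(table, cols, chunk):
    """Build one INSERT with a (%s, ...) placeholder group per row,
    plus the flattened parameter list to bind against it."""
    group = "(" + ", ".join(["%s"] * len(cols)) + ")"
    sql = "INSERT INTO {} ({}) VALUES {}".format(
        table, ", ".join(cols), ", ".join([group] * len(chunk))
    )
    params = [value for row in chunk for value in row]
    return sql, params

rows = [("a", 1), ("b", 2), ("c", 3)]
for chunk in batched(rows, 2):
    sql, params = multi_row_insert("import_data", ("name", "qty"), chunk)
    print(sql, params)
```

One statement per chunk means far fewer round trips and much less per-statement overhead, which is where the speedup in the table comes from; the trade-off is that each chunk's parameters must fit in memory (and in MySQL's `max_allowed_packet`).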
Next test will be with a table with around 30 fields and millions of rows. That could provide more useful info.