The SqlBulkCopy class in .NET's System.Data.SqlClient namespace significantly improved the efficiency of inserting thousands of rows into a table. Inserts of about 100k rows with 24 columns of mixed data types went from minutes to seconds after switching from row-by-row updates through a stored procedure to a bulk insert. I set the batch size to 1000 rows, registered no progress callbacks, and left the default timeout of 30 seconds.
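A minimal sketch of that setup, assuming the source rows are already loaded into a DataTable and using a placeholder connection string and destination table name:

```csharp
using System.Data;
using System.Data.SqlClient;

static void BulkInsert(DataTable sourceTable, string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.TargetTable"; // placeholder name
            bulkCopy.BatchSize = 1000; // send 1000 rows per round trip
            // BulkCopyTimeout defaults to 30 seconds; left unchanged here.
            // NotifyAfter / SqlRowsCopied are not set, so no progress callbacks fire.
            bulkCopy.WriteToServer(sourceTable);
        }
    }
}
```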
The source data came from a query owned by the same code, so I put all of the column names in an array of strings and used a LINQ .Select to add them to the ColumnMappings. It didn't work at first: because Select is deferred, the mapping calls never actually executed until I materialized the sequence with .ToArray() before calling WriteToServer to perform the insert.
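Something like the following, with a shortened, made-up column list standing in for the real 24 columns; the key point is the .ToArray() call that forces the deferred Select to run:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Linq;

static void BulkInsertWithMappings(DataTable sourceTable, string connectionString)
{
    // Hypothetical subset of the real column list.
    string[] columnNames = { "Id", "Name", "Quantity", "CreatedOn" };

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.TargetTable"; // placeholder name

            // Select is lazily evaluated, so the Add calls below do not run
            // until the sequence is enumerated; .ToArray() forces that.
            // Without it, the mappings are silently never created.
            columnNames
                .Select(name => bulkCopy.ColumnMappings.Add(name, name))
                .ToArray();

            bulkCopy.WriteToServer(sourceTable);
        }
    }
}
```

A plain foreach over the names would sidestep the deferred-execution trap entirely; the snippet just mirrors the Select-based approach described above.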
I can see how this could be wrapped up and made dynamic by walking the ColumnName properties of a DataTable's columns and adding ColumnMappings when the source and destination names match 1:1. Another potential pattern would be to wrap the table handling in a class that knows about the mappings and abstracts them away, since this level of detail feels too low level to sit alongside business logic.
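As a rough sketch of the dynamic 1:1 case, assuming the destination table shares the source DataTable's column names:

```csharp
using System.Data;
using System.Data.SqlClient;

// Map every source column to a destination column of the same name.
static void MapColumnsOneToOne(SqlBulkCopy bulkCopy, DataTable sourceTable)
{
    foreach (DataColumn column in sourceTable.Columns)
    {
        bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
    }
}
```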