Comments on Plastic SCM blog: "The fastest way to insert 100K records"
Blog author: F3RD3F (http://www.blogger.com/profile/11524626976811746062). 14 comments; feed last updated 2024-03-20.
Igor (2011-02-06):
Hello! Very fast record insertion indeed. What processor and memory does the computer you ran the tests on have? What are the parameters of your hardware?

William Roberto (2008-04-13):
Hi Pablo, SQL Server 2008 (http://msdn2.microsoft.com/en-us/library/ms174335(SQL.100).aspx) supports inserting multiple rows of data in a single statement; it would be interesting to include it in the set of tests.

Steve Campbell (2008-04-11):
One of the new features of SQL Server 2008 is that it allows multiple sets of values in an insert. Not that useful really, but cute.

Steve Novoselac (2008-04-10 16:30):
Hi Pablo, the difference between 1 and 2 on my blog is that 2 does batches of 1000, which is why it is a little faster, whereas 1 does just one record at a time. Not sure on DataAdapter vs. DataTable, but I bet they are close.

Yes, bulk copy; there are two different things, though. From "within SQL", using BULK INSERT or BCP, which requires certain rights on the SQL Server to even execute; or within .NET code using SqlBulkCopy, which seems to do the same as BULK INSERT but through the .NET data layer between code and SQL, which is nice, especially on remote web/SQL servers.

Pablo Santos (2008-04-10 16:26):
Hi Steve, thanks for the link on BulkCopy.

One remark: I guess there shouldn't be any difference between your method 1 and your method 2 if you keep the connection open in case 1 (which is what method 2 is actually doing internally) and use parameters as I mentioned in my post.

I guess directly using IDbCommand must always be faster than IDbDataAdapter or any DataTable, isn't it?

Of course BulkCopy must be even faster, as Andrei pointed out as well.

I'd like to check performance using MySQL's bulk-copy-like functionality, but I'm afraid you have to use an intermediate file to load the data; is that correct?

Steve Novoselac (2008-04-10 16:05):
You might want to expand this a little. I did some tests where I got one million records inserted into SQL Server in 17 seconds, using SqlBulkCopy in C#: http://blog.stevienova.com/2008/01/16/net-fastest-way-to-load-text-file-to-sql-sqlbulkcopy/

Anonymous (2008-04-09):
Using external tables for bulk inserts, we achieve 3-4 seconds for 200K records in our POS system software.

vslee (2008-04-08 19:43):
Try using SQL Server bulk insert: http://msdn2.microsoft.com/en-us/library/ms188365.aspx

Anonymous (2008-04-08 16:59):
With this you're also testing the communication protocol and provider code. To test *just* the database, use a stored procedure or similar (a multi-value insert in MySQL is close).

Anonymous (2008-04-08 16:03):
Nah, ignore these lamers with XML voodoo and stuff like that! Try this pattern where you don't have multi-value inserts:

```sql
INSERT foo (a, b)
SELECT x1, y1
UNION SELECT x2, y2
UNION SELECT x3, y3
UNION ...
```

Don't be shy now; see how many unioned rows you can do in one go! You will be surprised how many, and at the result. Hint: how often does SQL Server consider doing per-statement tasks, such as trigger execution, in this case?

Anonymous (2008-04-08 15:15):
Care to retest with PostgreSQL? It would be interesting to see how it compares with MySQL, in particular. :-)

Pablo Santos (2008-04-08 14:36):
Hi Guillaume, I've written the MySQL results for all runs, haven't I?

I wonder if running IDbDataAdapters and ExecuteBatch will make things go faster. Also, the new 5.2 MySQL provider introduces the ability to run bulk loads, which I guess will make the test go even faster.

Anonymous (2008-04-08 14:26):
Interesting, I just wonder why you're only giving the MySQL result for the last run. Anyway, nice to know. :)

Anonymous (2008-04-08 14:10):
For SQL Server you might want to try passing the rows to OPENXML as a set of records. Some years ago I got good results by doing that. The following example from http://support.microsoft.com/kb/555266 should get you going. Hope it helps.

```sql
DECLARE @hDoc int

-- Prepare input values as an XML document
EXEC sp_xml_preparedocument @hDoc OUTPUT, @in_values

-- Select data from the table based on values in the XML
SELECT * FROM Orders WHERE CustomerID IN (
    SELECT CustomerID FROM OPENXML (@hDoc, '/NewDataSet/Customers', 1)
    WITH (CustomerID NCHAR(5)))

EXEC sp_xml_removedocument @hDoc
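[Editor's note] The multi-row insert that William Roberto and Steve Campbell mention above shipped in SQL Server 2008 as table value constructors. A minimal sketch, reusing the hypothetical `foo (a, b)` table from the union-select comment:

```sql
-- SQL Server 2008+ table value constructor: many rows, one statement,
-- one round trip, one per-statement trigger execution.
-- Table and column names are illustrative only.
INSERT INTO foo (a, b)
VALUES (1, 'x1'),
       (2, 'x2'),
       (3, 'x3');
```

Note that SQL Server caps a single VALUES list at 1000 row expressions, so loading 100K records this way still means batching the statement.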
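[Editor's note] The BULK INSERT route that vslee and Steve Novoselac bring up can be sketched as follows; the table name and file path are hypothetical, and the statement needs bulk-load rights on the server, as Steve notes:

```sql
-- Server-side bulk load from a flat file (path is an assumption;
-- the file must be readable by the SQL Server service account).
BULK INSERT foo
FROM 'C:\data\foo.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column separator in the file
    ROWTERMINATOR   = '\n',  -- row separator in the file
    TABLOCK,                 -- table lock, enables minimal logging
    BATCHSIZE       = 100000 -- commit every 100K rows
);
```

SqlBulkCopy in .NET drives the same bulk-load path over the client protocol, which is why it avoids the intermediate file that Pablo asks about for MySQL.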