mysql - Looking for Best Practice / Most Efficient Large SQL UPDATE / INSERT


I'm looking for the best means of achieving a large data update/insert in SQL. My particular case uses MySQL 5.6, although in theory the exact version shouldn't matter.

I download a large CSV file filled with data that needs to be dumped into a MySQL table. The application parses the CSV and prepares the rows to be inserted into the database.

I need the table to be an exact replica of the data (CSV) that comes in each time, not something that keeps appending new rows to the end every time. I'm looking for the best way of achieving this.

With my current SQL knowledge, I thought it might be best to truncate the table each time and repopulate it with the data as it comes through, but I'm unsure whether it would be better to index a column and use INSERT ... ON DUPLICATE KEY UPDATE.

My question(s) are as follows:

  • Is it best to truncate and insert the data into an empty table, or is it better to find the data differences and use INSERT ... ON DUPLICATE KEY UPDATE only on the rows where the application has found a discrepancy? (See the sketch after this list.)

  • Either way, after this, is it best to format individual SQL UPDATE/INSERT queries per row of data and send them to the server one at a time? Or is it better to format one large query with all of the data in it, or possibly to split that larger query into more manageable chunks so the server doesn't time out?
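For clarity, this is the kind of upsert I mean in the first bullet. It's only a sketch against a hypothetical `prices` table keyed by `id` (none of these names come from my real schema):

INSERT INTO prices (id, name, price)
VALUES
    (1, 'Widget A', 9.99),
    (2, 'Widget B', 4.50)
ON DUPLICATE KEY UPDATE
    name  = VALUES(name),   -- overwrite the existing row with the CSV value
    price = VALUES(price);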

There are approximately 100k rows in the table each time the data comes through. Currently, I truncate the table before running the inserts, split the rows into 10 different sets, and run 10 large INSERT queries against the database. My concern is that I don't know enough about table locking and am unsure whether splitting the statements like this achieves anything significant.
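To make the difference concrete, here is a sketch (again using the hypothetical `prices` table) contrasting per-row inserts with the extended multi-row INSERTs I currently send, each of which carries roughly 10k rows:

-- per-row: one round trip per row, ~100k statements in total
INSERT INTO prices (id, name, price) VALUES (1, 'Widget A', 9.99);
INSERT INTO prices (id, name, price) VALUES (2, 'Widget B', 4.50);

-- batched: one extended INSERT carrying many rows (I run 10 of these)
INSERT INTO prices (id, name, price)
VALUES
    (1, 'Widget A', 9.99),
    (2, 'Widget B', 4.50),
    (3, 'Widget C', 12.00);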

The reason I'm asking is that the database usage graphs in my monitoring tool are showing spikes, and I thought it might be better to have a steadier data flow, so I'm trying to figure out how to achieve that.

[Graph: DB disk writes]

I understand these are disk writes, but the other graphs look more or less the same: lots of spikes and no consistency.

You should use LOAD DATA LOCAL INFILE instead of INSERT/UPDATE operations when working with large CSV files. You didn't mention which parsing operations you perform before the insert, but some of them might be possible with LOAD DATA LOCAL INFILE alone. Truncate the table before running this statement.

Here's an example LOAD DATA LOCAL INFILE statement that ignores the first line (the header) and replaces the comma in the price field with a dot while loading the data:

LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE `table`
CHARACTER SET utf8
FIELDS TERMINATED BY '\t' ENCLOSED BY '"'
IGNORE 1 LINES
(col1, col2, col3, coln, @price)
SET price = REPLACE(@price, ',', '.');

As you've said you need all the fields from the CSV, pass the column names here

(col1, col2, col3, coln, @price)

in the same order as they appear in the CSV. This will be way faster than any INSERT statements you could write.

To enable LOAD DATA LOCAL INFILE, you need to set a connection flag when establishing the connection to the database (it has to go directly in the connection options; setting it afterwards doesn't work), e.g. using PHP PDO:

$dbh = new PDO(
    'mysql:host=' . env('DB_HOST') . ';dbname=' . $database,
    env('DB_USERNAME'),
    env('DB_PASSWORD'),
    [PDO::MYSQL_ATTR_LOCAL_INFILE => 1]
);
