Hello all,
I have some Python scripts that read large text files and, for each line, perform a read from a MySQL database table. The scripts run from crontab; there are several of them and the files are huge, so the database handles a very large number of reads per minute. I have noticed that the database's performance has degraded considerably lately.
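To illustrate, this is roughly what the scripts do now (a simplified sketch; the table, column, and file names are made up):

```python
# Current pattern, simplified: one DB round trip per input line.
# Table/column/file names here are hypothetical.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="user", password="password", database="mydb"
)
cursor = conn.cursor()

with open("huge_file.txt") as f:
    for line in f:
        key = line.split()[0]  # lookup key is the first field of the line
        cursor.execute(
            "SELECT value_col FROM lookup_table WHERE key_col = %s", (key,)
        )
        row = cursor.fetchone()
        ...  # process the line together with the fetched row

cursor.close()
conn.close()
```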
My question is: would it be more efficient to do only one DB read per file? That is, read a few thousand rows at once and then find the appropriate row for each line in Python? Would that be faster than one read per line of the file? Is this considered good practice or not?
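Concretely, I am thinking of something like this (again a sketch with the same made-up names): fetch the whole table once, index it in a dict, and do the per-line matching in memory:

```python
# Proposed pattern: one query per file, then in-memory lookups.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="user", password="password", database="mydb"
)
cursor = conn.cursor()

# Single query instead of one per line: fetch all rows up front
# and index them by key in a Python dict.
cursor.execute("SELECT key_col, value_col FROM lookup_table")
lookup = {key: value for key, value in cursor.fetchall()}

cursor.close()
conn.close()

with open("huge_file.txt") as f:
    for line in f:
        key = line.split()[0]
        row_value = lookup.get(key)  # O(1) lookup, no DB round trip
        if row_value is not None:
            ...  # process the line with its matching value
```

The dict lookup is constant-time, so each line would no longer cost a network round trip to MySQL.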
Kind regards,
Pawel