I have around 1,049,906 rows in my X table (the row count reported "1,049,906 total, Query took 0.0005 sec").

When I simply try to retrieve one particular field for all records, it takes nearly 6 minutes.

This is my query:

SELECT CUSTOMER_CODE FROM X_TBL;

CUSTOMER_CODE => UNIQUE

The above query took 6 minutes.

Can you give me optimization tips for this?

Also, in some situations, to search for a customer I use CUSTOMER_CODE in a LIKE:

SELECT CUSTOMER_CODE FROM X_TBL WHERE CUSTOMER_CODE LIKE "$KEY_WORD%"

Yes, I have an index.

Also, how do I check script execution time in PHP?

I have used the snippet below, but the value it shows keeps increasing:

list($msec, $sec) = explode(' ', microtime());
$microtime = (float)$msec + (float)$sec;
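
Worth noting: microtime() on its own returns the current wall-clock time, so printing it repeatedly will always show a growing number; elapsed time is the difference between two samples. A minimal sketch, assuming PHP 5+ where microtime(true) returns the time as a float:

$start = microtime(true); // timestamp before the code under test
// ... the code you want to measure ...
$end = microtime(true);   // timestamp after it
echo 'elapsed: ', $end - $start, " seconds\n";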

Regards Bharanikumar

+1  A: 

You can try using indexes; they can greatly improve query times.

Try this link:

http://www.databasejournal.com/features/mysql/article.php/1382791/Optimizing-MySQL-Queries-and-Indexes.htm

Dani
+2  A: 

If I understand this right, you have more than one million records in a table and are wondering why it takes 6 minutes to get CUSTOMER_CODE from all of them at once. There is nothing to "optimize"; most likely you need to build different queries, something like

SELECT CUSTOMER_CODE FROM X_TBL LIMIT 0, 100;

or

SELECT COUNT(CUSTOMER_CODE) FROM X_TBL;

or whatever, depending on the task you're trying to solve.

As for searching by CUSTOMER_CODE: in the case of LIKE "$KEY_WORD%", a simple index on CUSTOMER_CODE will help, but again, don't try to select all records at once, except in cases where you really need to.
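
For example, a prefix search with a bound parameter could look like the sketch below (connection details and the LIMIT of 100 are assumptions). Note that putting the wildcard only at the end, LIKE 'prefix%', lets MySQL use the index, while a leading wildcard, LIKE '%suffix', does not:

$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // credentials assumed
$KEY_WORD = 'ABC'; // example search prefix
$stmt = $pdo->prepare('SELECT CUSTOMER_CODE FROM X_TBL WHERE CUSTOMER_CODE LIKE :search LIMIT 100');
$stmt->execute(array(':search' => $KEY_WORD . '%')); // trailing % only: index-friendly prefix search
$codes = $stmt->fetchAll(PDO::FETCH_COLUMN); // just the CUSTOMER_CODE values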

Alexander Konstantinov
A: 

Maybe the actual data transfer is the bottleneck. One usually avoids transferring all the data in a relational database to another location (process).

But for argument's sake, and to test whether this really is the bottleneck: you might try an unbuffered query, i.e. use a function/method that doesn't transfer all the data before it returns; see PDO::MYSQL_ATTR_USE_BUFFERED_QUERY and mysql_unbuffered_query.
And while we're at it, you can also try MySQL's EXPLAIN and see if it sheds some light on the issue.

$pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly'); 
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$KEY_WORD = 'something%';

foreach( $pdo->query("EXPLAIN SELECT CUSTOMER_CODE FROM X_TBL WHERE CUSTOMER_CODE LIKE '$KEY_WORD'", PDO::FETCH_ASSOC) as $row ) {
  foreach($row as $k=>$v) {
    echo "$k=$v | ";
  }
  echo "<br />\n";
}


$start = microtime(true);
// test #1
if ( true ) {  // set this to false after the first test  
  $stmt = $pdo->prepare('SELECT CUSTOMER_CODE FROM X_TBL', array(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false)); // false = unbuffered: rows are streamed as you fetch them
  $stmt->execute();
}
// test #2
else {
  $stmt = $pdo->prepare('SELECT CUSTOMER_CODE FROM X_TBL WHERE CUSTOMER_CODE LIKE :search', array(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false)); // unbuffered here as well
  $stmt->execute( array(':search'=>$KEY_WORD) );
}
$end = microtime(true);

echo 'time: ', $end-$start;
VolkerK
A: 

If you have that many records in the table, the first thing to do is change the table engine to InnoDB if it isn't InnoDB already, because for large numbers of records InnoDB is much faster: it caches the table data as well as the indexes, while the MyISAM engine caches only the indexes, so it has to read the row data from disk whenever the required data cannot be fetched from an index.
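
A sketch of the conversion (note this rewrites the whole table, so take a backup first and expect it to run for a while on a million rows):

ALTER TABLE X_TBL ENGINE=InnoDB;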

Secondly, if you want to stay with MyISAM, then create an index on the CUSTOMER_CODE column. That would make the query fast, because then all MySQL has to do is read the index to answer the query:

SELECT CUSTOMER_CODE FROM X_TBL
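
For instance (the index name here is just an example; the unique index mentioned in the question serves the same purpose):

CREATE INDEX idx_customer_code ON X_TBL (CUSTOMER_CODE);
EXPLAIN SELECT CUSTOMER_CODE FROM X_TBL;

EXPLAIN should then show "Using index" in the Extra column, meaning MySQL answers the query from the index alone without touching the table data.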
ovais.tariq