oracle - Transpose rows into columns for a huge table (800+ million records)


How can I transpose rows (actually a subset of columns) into one record with many columns - up to 99 columns - for a huge table?

I mean, I have a table with the following sample structure/data:

table_ori

column1 column2 column3 code value
------- ------- ------- ---- ------------
c1      c2      c3      1    value1
c1      c2      c3      2    value2
c1      c2      c3      3    value3
c100    c39     c21     1    value40
c100    c39     c21     2    value41

I want to convert the data into:

table_new

column1 column2 column3 value1  value2  value3 value4 value5 ... value99
------- ------- ------- ------  ------  ------ ------ ------ --- -------
c1      c2      c3      value1  value2  value3
c100    c39     c21     value40 value41

Please consider that this is a big table and the result table can have 99 value columns. I tried a PL/SQL nested loop as well as a bulk collect cursor, but the process takes days and never ends. Thanks a lot!

This is probably the fastest way:

create table table_new as
select /*+ parallel */
    column1, column2, column3,
    max(case when code = 1 then value else null end) value1,
    max(case when code = 2 then value else null end) value2,
    max(case when code = 3 then value else null end) value3,
    max(case when code = 4 then value else null end) value4,
    max(case when code = 5 then value else null end) value5,
    --...
    max(case when code = 99 then value else null end) value99
from table_ori
group by column1, column2, column3;
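Typing 99 nearly identical CASE expressions by hand is error-prone. As a minimal sketch (not part of the original answer), a short script can generate the full statement, which you can then paste into your SQL client; the table and column names are taken from the question:

```python
# Generate the full 99-column conditional-aggregation pivot statement.
# One max(case ...) expression per code value, joined into a single CTAS.
cases = ",\n".join(
    f"    max(case when code = {n} then value else null end) value{n}"
    for n in range(1, 100)
)
sql = (
    "create table table_new as\n"
    "select /*+ parallel */ column1, column2, column3,\n"
    f"{cases}\n"
    "from table_ori\n"
    "group by column1, column2, column3;"
)
print(sql)
```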

That assumes you have Enterprise Edition, the database is configured to use parallelism properly, there is a large amount of temporary tablespace to sort the data all at one time, etc.
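To show that the conditional-aggregation pivot really collapses the rows as described, here is a small self-contained sketch of the same technique on SQLite (Oracle-specific pieces such as the /*+ parallel */ hint are omitted, and only 3 of the 99 value columns are shown; data is the sample from the question):

```python
# Demonstrate the max(case ...) pivot on the question's sample data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
create table table_ori (column1, column2, column3, code, value);
insert into table_ori values
    ('c1',   'c2',  'c3',  1, 'value1'),
    ('c1',   'c2',  'c3',  2, 'value2'),
    ('c1',   'c2',  'c3',  3, 'value3'),
    ('c100', 'c39', 'c21', 1, 'value40'),
    ('c100', 'c39', 'c21', 2, 'value41');

-- One output row per (column1, column2, column3) group; each CASE picks
-- out the value belonging to one code, and max() keeps the non-null one.
create table table_new as
select column1, column2, column3,
       max(case when code = 1 then value else null end) value1,
       max(case when code = 2 then value else null end) value2,
       max(case when code = 3 then value else null end) value3
from table_ori
group by column1, column2, column3;
""")

rows = conn.execute("select * from table_new order by column1").fetchall()
for row in rows:
    print(row)
```

Codes that are missing for a group (code 3 for the c100 row) simply come out as NULL, matching the blank cells in the desired table_new layout.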

It may also help performance to use the NOLOGGING option when creating the table. That avoids generating a lot of redo and undo, although at the expense of the table not being recoverable until it is backed up.

For large processes like this, real-time SQL monitoring is the perfect way to diagnose problems. If the above SQL is taking a long time, run this statement to monitor the SQL and see which operations and events are taking the most time:

select dbms_sqltune.report_sql_monitor('$the_real_sql_id') from dual;
