
My first contact with the log miner

Posted by Martin Bach on December 11, 2009

I have never really played with log miner, but I have used it many times implicitly through Streams. The need to get to grips with it arose when a developer ran unrecoverable operations (i.e. something non-logged) on a QA system. I didn't want them to run the same thing in production, so I had to find out who did it. First of all I needed a test case on my 10.2.0.4 32-bit Linux system. But first some theory.
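
For the record, a test case along these lines reproduces the kind of non-logged operation in question (the table name is made up, and the database must not be in FORCE LOGGING mode for the direct-path load to skip redo generation):

-- create a NOLOGGING table and populate it with a direct-path insert;
-- the APPEND hint makes this an unrecoverable (minimal redo) operation
create table t_nolog nologging as
  select * from all_objects where 1 = 0;

insert /*+ append */ into t_nolog
  select * from all_objects;

commit;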

Grey is all theory

When starting with log miner you can choose from a number of dictionary options. As I understand it, the dictionary is what log miner uses to translate object IDs into object names (and where it stores intermediate results). You can choose from:

  • online catalog
  • online redo logs
  • flat file

Using the online catalog is the simplest of these approaches; storing the dictionary in the online redo logs requires supplemental logging.
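
For reference, checking whether minimal supplemental logging is enabled, and switching it on should you want the redo log dictionary, looks roughly like this:

-- is minimal supplemental logging already enabled?
select supplemental_log_data_min from v$database;

-- enable it (needed for the dictionary in the online redo logs)
alter database add supplemental log data;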

Once that's built, you simply add the archived log files you are interested in (you can include the online logs if you like) and start the log miner session. The results of the analysis are exposed in v$logmnr_contents. When done with your investigation, you close the log miner session.

I opted for the flat file approach for now; here's the example.

1. Build the dictionary

That's probably not always necessary, but I created one anyway. I chose to put it into a file rather than into the online redo logs since I couldn't (and didn't want to) enable supplemental logging on the database. With that said, the Streams initialisation step of adding supplemental logging to all replicated tables makes a lot more sense now! I also started to understand the dictionary_begin and dictionary_end columns in v$archived_log better.
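
Had the dictionary gone into the redo stream instead, a query along these lines against v$archived_log would show which archived logs contain it:

select sequence#, name, dictionary_begin, dictionary_end
  from v$archived_log
 order by sequence#;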

To build the dictionary to a file, you need to have utl_file_dir set correctly, which sounds a bit anachronistic nowadays when we have directory objects, but I digress. In my case utl_file_dir was set to '/home/oracle/utl_file_dir'. The command I executed was:

begin
  DBMS_LOGMNR_D.BUILD(
    dictionary_filename => 'dictionary.ora',
    dictionary_location => '/home/oracle/utl_file_dir',
    options => DBMS_LOGMNR_D.STORE_IN_FLAT_FILE);
end;
/

2. Add log files

I knew from v$archived_log that my log files were for sequences 23 and 24, so I added them to the list of files to mine:

begin
  -- the first file starts a new log file list
  DBMS_LOGMNR.ADD_LOGFILE(
    LOGFILENAME => '/u01/oracle/ora10/arch/arch1_23_703580639.dbf',
    OPTIONS => DBMS_LOGMNR.NEW);
  -- subsequent files are appended to that list
  DBMS_LOGMNR.ADD_LOGFILE(
    LOGFILENAME => '/u01/oracle/ora10/arch/arch1_24_703580639.dbf',
    OPTIONS => DBMS_LOGMNR.ADDFILE);
end;
/
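
As a quick sanity check that both files made it onto the list, something like this against v$logmnr_logs shows what has been registered:

select log_id, filename, low_time, high_time
  from v$logmnr_logs;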

3. Start log miner

Now for the interesting part!

begin
 dbms_logmnr.start_logmnr(
    dictfilename => '/home/oracle/utl_file_dir/dictionary.ora');
end;
/

4. Analysing the results

Now query v$logmnr_contents to your heart's content!
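
In my case I was after the non-logged operations and who ran them, so the query looked something like this (the columns are from v$logmnr_contents, the predicates are purely illustrative):

select timestamp, username, os_username, session_info,
       seg_owner, seg_name, operation, sql_redo
  from v$logmnr_contents
 where operation in ('DDL', 'DIRECT INSERT')
 order by timestamp;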

5. End your log miner session

The final step is to end the session.

begin
 dbms_logmnr.end_logmnr;
end;
/  

Keep it simple!

A simpler approach, using the current database dictionary (the online catalog), requires the following:

  • Access to the source database (i.e. the database where the statements originated)
  • Column definitions must not have changed (i.e. no DDL against tables)
  • The database must be open

With these in place, all you need to do is add the log files and execute something similar to:

begin
  sys.dbms_logmnr.start_logmnr(
    starttime => sysdate - 10 * 1/24/60,   -- the last ten minutes
    endtime   => sysdate,
    options   => sys.dbms_logmnr.dict_from_online_catalog);
end;
/
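
Putting the simple variant together, a minimal end-to-end sketch (reusing one of the archived logs from my test case) would be:

begin
  dbms_logmnr.add_logfile(
    logfilename => '/u01/oracle/ora10/arch/arch1_24_703580639.dbf',
    options     => dbms_logmnr.new);
  dbms_logmnr.start_logmnr(
    options => dbms_logmnr.dict_from_online_catalog);
end;
/

select username, operation, sql_redo
  from v$logmnr_contents;

begin
  dbms_logmnr.end_logmnr;
end;
/
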
Reference:

Oracle Database Utilities