Oracle datapump export error "invalid argument value"

I have a Data Pump export script which I'd like to get running. The DB is Oracle 11g. To try it out, I've been executing it in SQL Developer. The script looks like this:

DECLARE  
     JOBHANDLE   NUMBER;
     STATUS      VARCHAR2(20);
     LOGFILE     UTL_FILE.FILE_TYPE; 
     LINE        VARCHAR2(200); 
     ROW         NUMBER := 1; 

     TYPE OUTPUT_TYPE IS TABLE OF VARCHAR2(128) INDEX BY PLS_INTEGER;
     OUTPUT      OUTPUT_TYPE;
 BEGIN
     EXECUTE IMMEDIATE 'CREATE OR REPLACE DIRECTORY "TESTDIR11" AS ''/home/achim/temp/0003759/T0987654321/'' ';

     JOBHANDLE := DBMS_DATAPUMP.OPEN(
         operation => 'EXPORT',
         job_mode  => 'TABLE',
         job_name  => 'TST TMF.ACHIMSTEST11');

     DBMS_DATAPUMP.ADD_FILE(
         HANDLE    => JOBHANDLE,
         filename  => 'Q01DED3D.dmp',
         directory => 'TESTDIR11',
         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE
         ,reusefile => 1
         );

     DBMS_DATAPUMP.ADD_FILE(
         HANDLE    => JOBHANDLE,
         filename  => 'Q01DED3D.log',
         directory => 'TESTDIR11',
         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE
         ,reusefile => 1
         );

     BEGIN
         DBMS_DATAPUMP.SET_PARAMETER(
                 handle => JOBHANDLE,
                 name   => 'INCLUDE_METADATA',
                 value  => 0);

         DBMS_DATAPUMP.METADATA_FILTER(
                 HANDLE      => JOBHANDLE, 
                 name        => 'NAME_LIST', 
                 value       => '''ACHIMSTEST11''',
                 object_path => 'TABLE');

         DBMS_DATAPUMP.METADATA_FILTER(
                 HANDLE  => JOBHANDLE, 
                 name    => 'SCHEMA_LIST', 
                 value   => '''TMF''');

         DBMS_DATAPUMP.START_JOB(JOBHANDLE);
         DBMS_DATAPUMP.WAIT_FOR_JOB(JOBHANDLE, STATUS);
     EXCEPTION
         WHEN OTHERS 
         THEN
             status := 'ERROR'; 
     END;

     DBMS_DATAPUMP.DETACH(JOBHANDLE); 

     LOGFILE := UTL_FILE.FOPEN('TESTDIR', 'Q01DED3D.log', 'R'); 
     Loop BEGIN
         UTL_FILE.GET_LINE(LOGFILE, LINE);
         OUTPUT(ROW) := LINE; 
         ROW := ROW + 1;
     EXCEPTION 
         WHEN No_Data_Found 
         THEN 
             EXIT; 
         END;
     END Loop; 

     EXECUTE IMMEDIATE 'DROP DIRECTORY "TESTDIR"';

 END;

The trouble is that I get an error which I cannot explain and I have no idea how to track it down. The error message is:

Error report:

ORA-39001: invalid argument value

ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79

ORA-06512: at "SYS.DBMS_DATAPUMP", line 3507

ORA-06512: at "SYS.DBMS_DATAPUMP", line 3756

ORA-06512: at line 18

39001. 00000 - "invalid argument value"
*Cause:  The user specified API parameters were of the wrong type or
         value range. Subsequent messages supplied by
         DBMS_DATAPUMP.GET_STATUS will further describe the error.
*Action: Correct the bad argument and retry the API.

The important part here is the "ORA-06512: at line 18". This points to the first ADD_FILE statement. After scouring the Internet and the Oracle documentation on Data Pump, I am at a loss as to what is supposed to be wrong. I also can't figure out how to use DBMS_DATAPUMP.GET_STATUS when the job doesn't even get defined, let alone run.

I’ve tried not using the reusefile parameter and I’ve tried replacing DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE with the actual integer value it represents, which is 1.

Running this in SQL Developer leaves an entry floating around in dba_datapump_jobs, which isn't all that simple to get rid of. The state of the job is "DEFINING". This prevents me from running the script a second time. After searching for around an hour I found out that it could be removed with DROP TABLE "TMF"."TST TMF.ACHIMSTEST11" PURGE, though why that works is a mystery.
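(For context on why that DROP TABLE works: a Data Pump job persists its state in a "master table" in the owning schema, named after the job itself, so dropping that table removes the orphaned entry. A sketch of how one might locate such leftovers, using the owner and job name from this question:

```sql
-- List Data Pump jobs and their states; a job orphaned by a failed
-- OPEN/ADD_FILE typically shows up as DEFINING or NOT RUNNING.
SELECT owner_name, job_name, state
FROM   dba_datapump_jobs;

-- Each job_name above is also the name of the master table holding
-- the job's state, so dropping it cleans up the orphaned job.
DROP TABLE "TMF"."TST TMF.ACHIMSTEST11" PURGE;
```

)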

I guess I should point out that I did not write this script; it was generated from a FreeMarker template in a Java program. In the Java program the generated script is run as an OracleCallableStatement, and for 11g databases that works. The only reason I'm fiddling with it in SQL Developer is to try to get it to work for a 12c database. Needless to say, the script doesn't work for 12c in SQL Developer either. I'd kind of hoped it would work for 11g.

Incidentally, the error I get when running the Java with a 12c database is the same one as shown above.

If anyone can see anything that is actually wrong or even just looks suspect, I’d be very grateful for the advice.

I also can’t figure out how to use DBMS_DATAPUMP.GET_STATUS when the job doesn’t even get defined, let alone run.

The job is defined; you have a handle by the time you call ADD_FILE. So you can add an exception handler that calls GET_STATUS and outputs the results (after doing set serveroutput on, of course):

...
EXCEPTION
  WHEN OTHERS THEN
    DECLARE
      job_state VARCHAR2(4000);
      status    ku$_Status;
    BEGIN
      -- Ask the job for the error messages it has accumulated.
      DBMS_DATAPUMP.GET_STATUS(
        handle    => jobhandle,
        mask      => DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR,
        timeout   => NULL,
        job_state => job_state,
        status    => status);

      -- Print each error line from the returned status object.
      FOR i IN 1..status.error.COUNT LOOP
        DBMS_OUTPUT.PUT_LINE(status.error(i).errornumber || ': '
          || status.error(i).logtext);
      END LOOP;
    END;

    RAISE;
END;
/

(Yes, I know when others is bad, even when re-raising, but this is a temporary thing…)

Given what you’ve shown and the call that’s erroring, I expect that will show something like:

ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/home/achim/temp/0003759/T0987654321/Q01DED3D.dmp"
ORA-27040: file create error, unable to create file
Linux-x86_64 Error: 2: No such file or directory
Additional information: 1
ORA-27037: unable to obtain file status
Linux-x86_64 Error: 2: No such file or directory
Additional information: 3

Presumably, when this is run as a callable statement, it uses a different path for the directory object, or doesn't include the create/drop at all, which you wouldn't normally do at run time. The 12c version either doesn't have the directory defined, or, if it's on a different server, the path being used isn't valid there. Or perhaps it's RAC and the path is only valid on one node, which might make the error intermittent.
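One quick way to compare the two databases is to look at what path the directory object actually resolves to on each (a sketch, using the directory name from the question):

```sql
-- Show where TESTDIR11 points on this database. The path must exist
-- and be writable by the oracle OS user on the database server itself,
-- not on the client machine running SQL Developer.
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'TESTDIR11';
```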

In what you've shown, though, the /home/achim part looks suspect. The Oracle process owner (usually oracle) has to be able to read and write files in the operating system directory, and unless you've opened your home directory up for the whole world to see, you're probably getting a failure because oracle cannot create the dump file where you asked it to.

If that is the case, then change the directory path to point to somewhere you are sure oracle does have the necessary permissions; if you have access to that account, try to create a file manually from the command line to verify. If not, you'll just need to check the permissions on the operating system directory and its hierarchy carefully.
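If you can't get to the server's command line at all, you can also probe writability from inside the database. A minimal sketch, assuming the TESTDIR11 directory object from the question (the file name here is arbitrary); if oracle lacks OS permissions on the path, or the path doesn't exist on the server, this should fail with something like ORA-29283:

```sql
-- Try to create a small file through the directory object; a failure
-- here points at OS permissions or a bad path on the database server.
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('TESTDIR11', 'write_test.txt', 'W');
  UTL_FILE.PUT_LINE(f, 'ok');
  UTL_FILE.FCLOSE(f);
END;
/
```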

You might see a different error of course, but it looks like it has to be something related to the directory or file, not the other arguments.

Also, for removing orphan jobs, see My Oracle Support document 336014.1.