
OraOop : Sqoop Direct Oracle import failed with error "Error: java.io.IOException: SQLException in nextKeyValue"

Hi Folks,
I am trying to import a partitioned Oracle table into Hive through "OraOop" (direct mode) and I am getting an error.
I have tried various permutations and combinations of Sqoop parameters; here is what I have tried.
Worked (chunk.method=PARTITION and only 1 mapper):
-Doraoop.import.partitions='OLD_DAYS,SYS_P41,SYS_P42,SYS,SYS_P68,SYS_P69,SYS_P70,SYS_P71'   \
-Doraoop.chunk.method=PARTITION  \
--m 1  \
--direct \

Worked (chunk.method=PARTITION removed and 100 mappers):
-Doraoop.import.partitions='OLD_DAYS,SYS_P41,SYS_P42,SYS,SYS_P68,SYS_P69,SYS_P70,SYS_P71'   \
--m 100  \
--direct \

Doesn't work (chunk.method=PARTITION and 100 mappers):
-Doraoop.import.partitions='OLD_DAYS,SYS_P41,SYS_P42,SYS,SYS_P68,SYS_P69,SYS_P70,SYS_P71'   \
-Doraoop.chunk.method=PARTITION  \
--m 100  \
--direct \
All other combinations are working. Can you please help me understand why chunk.method=PARTITION with multiple mappers is failing?
Is there something that needs to be done on the Hive side for partitions?
Any help in resolving this issue is highly appreciated.
The full failing Sqoop command and the error logs are below.
Sqoop import command (which is failing):
$SQOOP_HOME/bin/sqoop import  \
-Doraoop.disabled=false \
-Doraoop.import.partitions='OLD_DAYS,SYS_P41,SYS_P42,SYS,SYS_P68,SYS_P69,SYS_P70,SYS_P71'   \
-Doraoop.chunk.method=PARTITION  \
-Doraoop.import.consistent.read=true \
-Dmapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom" \
--connect jdbc:oracle:thin:@host:port/db \
--username ***** \
--password ***** \
--table DATE_DATA \
--direct \
--hive-import \
--hive-table tempDB.DATE_DATA \
--split-by D_DATE_SK \
--m 100  \
--delete-target-dir \
--target-dir /tmp/34/DATE_DATA
Error logs:
2015-09-24 16:23:57,068 [myid:] - INFO  [main:Job@1452] - Task Id : attempt_1442839036383_0051_m_000006_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
    at org.apache.sqoop.manager.oracle.OraOopDBRecordReader.nextKeyValue(OraOopDBRecordReader.java:351)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
    at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00933: SQL command not properly ended
    at oracle.jdbc.driver.SQLStateMapping.newSQLException(SQLStateMapping.java:91)
    at oracle.jdbc.driver.DatabaseError.newSQLException(DatabaseError.java:133)
    at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:206)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:455)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:413)
    at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:1034)
    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:194)
    at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:791)
    at oracle.jdbc.driver.T4CPreparedStatement.executeMaybeDescribe(T4CPreparedStatement.java:866)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1186)
    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3387)
    at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3431)
    at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1491)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
    at org.apache.sqoop.manager.oracle.OraOopDBRecordReader.executeQuery(OraOopDBRecordReader.java:417)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
    ... 13 more
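For context, ORA-00933 ("SQL command not properly ended") is raised when Oracle's parser finds extra tokens after a syntactically complete statement. Below is a hypothetical reproduction of the same error class only, not the SQL that OraOop actually generated; the table, column, and partition names are reused from this thread and the connection details are placeholders:

# Hypothetical illustration: PARTITION (SYS_P41) is valid Oracle syntax, but the trailing
# LIMIT clause is not Oracle SQL, so the parser stops after the valid statement and
# raises ORA-00933. Replace the placeholder credentials and host:port/db before running.
sqlplus -s "${DB_USER}/${DB_PASS}@host:port/db" <<'EOF'
SELECT d_date_sk FROM date_data PARTITION (SYS_P41) LIMIT 10;
EOF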
Regards
Sanjiv Singh
Mob :  +091 9990-447-339

asked Sep 24 2015 at 05:10

3 Replies for: OraOop : Sqoop Direct Oracle import failed with error "Error: java.io.IOException: SQLException in nextKeyValue"
Hi Sanjiv,
Could you please run the failing command again with "--verbose" added to generate debug logging, and post the full log file?
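For example, a sketch of the same command with --verbose added among the tool arguments (all other flags, connection details, and paths copied from the original command; credentials still masked):

$SQOOP_HOME/bin/sqoop import  \
-Doraoop.disabled=false \
-Doraoop.import.partitions='OLD_DAYS,SYS_P41,SYS_P42,SYS,SYS_P68,SYS_P69,SYS_P70,SYS_P71'   \
-Doraoop.chunk.method=PARTITION  \
-Doraoop.import.consistent.read=true \
-Dmapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom" \
--verbose \
--connect jdbc:oracle:thin:@host:port/db \
--username ***** \
--password ***** \
--table DATE_DATA \
--direct \
--hive-import \
--hive-table tempDB.DATE_DATA \
--split-by D_DATE_SK \
--m 100  \
--delete-target-dir \
--target-dir /tmp/34/DATE_DATA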
David

answered Sep 24 2015 at 17:34


Hi David,
Please find attached the log file with "--verbose" added to the Sqoop command.
Sqoop version: 1.4.5 (Hadoop 2.6.0)
Let me know if you need any other details.
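For reference, the exact Sqoop and Hadoop versions in use can be confirmed with the standard version commands (assuming the same $SQOOP_HOME as in the import command):

# Print the Sqoop and Hadoop versions on the node running the import
$SQOOP_HOME/bin/sqoop version
hadoop version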
Regards
Sanjiv Singh
Mob :  +091 9990-447-339

answered Sep 24 2015 at 21:23


Hi Sanjiv,
It may be that you do not have enough undo space on Oracle; I have seen that error in my case when loading data. Can you try with just one (the smallest) partition?
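For example, a sketch of such a narrowed-down run: only the partition list is changed relative to the failing command, with SYS_P41 picked arbitrarily here (substitute the smallest partition of DATE_DATA); all other flags, credentials, and paths are as in the original command:

# Same failing configuration (PARTITION chunking, 100 mappers), but restricted to a
# single partition to check whether undo-space pressure is what triggers the failure.
$SQOOP_HOME/bin/sqoop import  \
-Doraoop.disabled=false \
-Doraoop.import.partitions='SYS_P41' \
-Doraoop.chunk.method=PARTITION  \
-Doraoop.import.consistent.read=true \
-Dmapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom" \
--connect jdbc:oracle:thin:@host:port/db \
--username ***** \
--password ***** \
--table DATE_DATA \
--direct \
--hive-import \
--hive-table tempDB.DATE_DATA \
--split-by D_DATE_SK \
--m 100  \
--delete-target-dir \
--target-dir /tmp/34/DATE_DATA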
Kind Regards,
Mario Amatucci

answered Sep 24 2015 at 23:27

