Hi All,
I am exporting a table from Oracle using Sqoop. The Oracle table has a date column in DD-MON-YY format, and I get the same format when I dump the data from Oracle SQL Developer. But when I dump the data using Sqoop, I get the format YYYY-MM-DD HH:MM:SS.x. For example, the value "24-JAN-08" in the DB is dumped by Sqoop as "2008-01-24 00:00:00.0". Is this expected behavior? If yes, please let me know why Sqoop adds the unnecessary timestamp at the end and also modifies the original date format.
-- Thanks & Regards, Anil Gupta
asked Mar 12 2012 at 20:57 in Sqoop-User by anil gupta

5 Answers

Hi Anil,
Some Oracle JDBC drivers (versions before 9.2 and from 11.1 onward) automatically map the Oracle DATE type to java.sql.Timestamp: http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-faq-090281.html#08_01
Since Sqoop uses the JDBC driver to import data from the Oracle DB, the dates in the output files appear as timestamps.
Thanks, Cheolsoo
answered Mar 12 2012 at 21:08 by Cheolsoo Park
Hi Cheolsoo,
Thanks for the input. AFAIK, SQL Developer also uses JDBC, but it dumps the data in the same format as it is stored in the DB. So I am wondering why Sqoop is unable to dump the data the way SQL Developer does. I am using SQL Developer 3.1.07.
Thanks, Anil Gupta
answered Mar 12 2012 at 21:17 by anil gupta
From the link you sent me in the previous mail, here is something that might stop the date from being converted into a timestamp: "If for some reason your app is very sensitive to this change and you simply must have the 9i-10g behavior, there is a connection property you can set. Set mapDateToTimestamp to false and the driver will revert to the default 9i-10g behavior and map DATE to Date." Will this help in resolving this problem?
Thanks, Anil
answered Mar 12 2012 at 21:26 by anil gupta
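For anyone trying that property with Sqoop, here is one way it might be passed through to the driver. This is a sketch, not a tested recipe: the `--connection-param-file` option is from the Sqoop user guide, the `oracle.jdbc.mapDateToTimestamp` property name follows the Oracle JDBC FAQ linked above, and the host, credentials, table, and paths are placeholders.

```shell
# Write the JDBC connection property to a file (verify the property name
# against your driver version; see the Oracle JDBC FAQ).
cat > oracle.properties <<'EOF'
oracle.jdbc.mapDateToTimestamp=false
EOF

# Hand the properties file to the driver via Sqoop's --connection-param-file.
# Connect string, user, table, and target dir below are illustrative only.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table MY_TABLE \
  --connection-param-file oracle.properties \
  --target-dir /user/anil/my_table
```

Whether the driver honors the property will depend on the driver version, so check the output on a small table first.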
Hi Anil,
This issue is already being tracked by the following Apache JIRAs:
https://issues.apache.org/jira/browse/SQOOP-451 https://issues.apache.org/jira/browse/SQOOP-456
In fact, I am currently working on adding new options via which the user can specify format masks for date, time, and timestamp.
Thanks, Cheolsoo
answered Mar 12 2012 at 21:31 by Cheolsoo Park
Thanks Cheolsoo, SQOOP-456 will address my problem. In the meantime, I will move ahead by truncating the time part from the export and then processing the data, because in my use case the time is stored in another column.
Thanks, Anil
answered Mar 12 2012 at 21:54 by anil gupta
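That interim workaround can be a one-liner in the shell, since every value Sqoop emits has the shape "YYYY-MM-DD HH:MM:SS.x" with the date part before the first space (a minimal sketch; adapt it to however the field is extracted from the actual output records):

```shell
# Each DATE value arrives as "YYYY-MM-DD HH:MM:SS.x"; stripping everything
# from the first space onward recovers the original date part.
ts="2008-01-24 00:00:00.0"
date_only="${ts%% *}"   # %% * removes the longest " ..." suffix
echo "$date_only"       # prints 2008-01-24
```

The same suffix-strip can be applied per field with awk or cut when post-processing the dumped files.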
