

java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/hduser/sqoop/lib/hsqldb-1.8.0.10.jar

java,hadoop,sqoop
I am basically a MySQL guy, new to Hadoop, and am trying to import one MySQL table into my Hadoop system with Sqoop using the command below, but I am getting an error. I searched the net but have not found any possible solution. I will be very thankful for your support. [[email protected] ~]$...
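This class of error, where Sqoop's own lib jars get resolved against HDFS instead of the local filesystem, is usually a configuration mismatch. One commonly reported workaround is to mirror the local Sqoop lib directory into HDFS at the same path the job expects; the paths below come from the error message in this question and will differ on other installations:

```shell
# Workaround sketch: copy the local Sqoop lib jars to the same
# path on HDFS that the failing job is resolving them against.
hdfs dfs -mkdir -p /home/hduser/sqoop/lib
hdfs dfs -put /home/hduser/sqoop/lib/*.jar /home/hduser/sqoop/lib/
```

The cleaner fix is usually to correct the fs.defaultFS / mapred-site configuration so that local jar paths are not resolved against HDFS in the first place.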

Encountered IOException running import job: java.io.IOException: Error returned by javac

bash,hadoop,jdbc,sqoop
I am trying to run a simple sqoop import program using JAVA. My Program: String driver="com.vertica.Driver"; Configuration config = new Configuration(); config.addResource(new Path("/../../../mapred-site.xml")); config.addResource(new Path("/../../../core-site.xml")); config.addResource(new Path("/../../../hdfs-site.xml")); SqoopOptions options = new SqoopOptions(); options.setConnectString(connection_string); options.setDriverClassName(driver); options.setUsername(username); options.setPassword("xxxxxxxxxxxxx");...

How to insert and Update simultaneously to PostgreSQL with sqoop command

postgresql,hadoop,hive,sqoop
I am trying to insert into a PostgreSQL DB with a sqoop command. sqoop export --connect jdbc:postgresql://10.11.12.13:1234/db --table table1 --username user1 --password pass1 --export-dir /hivetables/table/ --fields-terminated-by '|' --lines-terminated-by '\n' -- --schema schema It works fine if there is no primary key constraint. I want to insert new records and update old records...
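Sqoop export supports an upsert-style mode via --update-key together with --update-mode allowinsert. Connector support varies (the generic JDBC connector may not implement allowinsert for every database, including PostgreSQL), so treat this as a sketch; the id column is a placeholder for the table's actual primary key:

```shell
sqoop export \
  --connect jdbc:postgresql://10.11.12.13:1234/db \
  --table table1 \
  --username user1 --password pass1 \
  --export-dir /hivetables/table/ \
  --fields-terminated-by '|' --lines-terminated-by '\n' \
  --update-key id \
  --update-mode allowinsert \
  -- --schema schema
```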

Accessing Vertica Database through Oozie sqoop

sqoop,oozie,vertica
I have written an Oozie workflow to access an HP Vertica database through Sqoop. This is on a Cloudera VM. I am getting the following error in the Yarn logs after running: ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: dbDriver java.lang.RuntimeException: Could not load db...
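The message suggests the literal string dbDriver was passed where a driver class name was expected, and that the Vertica JDBC jar is not on the action's classpath. A sketch of the usual remedies; the jar file name and HDFS path are placeholders:

```shell
# 1. Put the Vertica JDBC jar in the Oozie workflow's lib/ directory on HDFS,
#    next to workflow.xml, so Oozie ships it with the Sqoop action:
hdfs dfs -put vertica-jdbc.jar /user/cloudera/workflows/sqoop-vertica/lib/

# 2. In the Sqoop command inside the action, pass a real driver class name,
#    e.g. --driver com.vertica.jdbc.Driver (older driver builds used
#    com.vertica.Driver) -- not a placeholder like "dbDriver".
```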

How do I access an HBase table in Hive & vice-versa?

hive,hbase,sqoop,apache-sqoop,apache-hive
As a developer, I've created an HBase table for our project by importing data from an existing MySQL table using a Sqoop job. The problem is that our data analyst team is familiar with MySQL syntax, which implies they can query a Hive table easily. For them, I need to expose the HBase table in Hive. I...
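The usual approach is a Hive external table backed by the HBase storage handler, which lets analysts query the HBase data with HiveQL. A sketch with hypothetical table and column names (adjust the column mapping to your column family and qualifiers):

```shell
hive -e "
CREATE EXTERNAL TABLE hbase_mirror(rowkey STRING, col1 STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:col1')
TBLPROPERTIES ('hbase.table.name' = 'project_table');
"
```

Because the table is EXTERNAL, dropping it in Hive leaves the underlying HBase table intact.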

How does mapper output get written to HDFS in case of Sqoop?

java,hadoop,mapreduce,hdfs,sqoop
As I have learned about Hadoop Map-Reduce jobs, mapper output is written to local storage and not to HDFS, as it is ultimately throwaway data, so there is no point in storing it in HDFS. But in the case of Sqoop, I see the mapper output file part-m-00000 is written into...

Sqoop Job via Oozie HDP 2.1 not creating job.splitmetainfo

hadoop,mapreduce,sqoop,oozie,hortonworks-data-platform
When trying to execute a Sqoop job which has my Hadoop program passed as a jar file in the -jarFiles parameter, the execution blows up with the below error. No resolution seems to be available. Other jobs with the same Hadoop user are getting executed successfully. org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.FileNotFoundException: File does not exist:...

Incremental loads in Sqoop

hadoop,hive,teradata,sqoop
I have a table in Teradata which is loaded with new data on a daily basis. I need to import this data into Hive. I'm trying to use Sqoop, but how should I do an incremental load with Sqoop? I checked the incremental load options available in Sqoop: --check-column This option expects only...
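Sqoop's incremental import has two modes: append (for tables that only receive new rows, keyed on an ever-increasing column) and lastmodified (for tables whose existing rows are updated, keyed on a timestamp column). A sketch with a hypothetical Teradata connect string and column names:

```shell
sqoop import \
  --connect jdbc:teradata://td-host/DATABASE=mydb \
  --username user --password pass \
  --table ORDERS \
  --incremental append \
  --check-column order_id \
  --last-value 0 \
  --target-dir /user/hive/warehouse/orders
```

Defining this as a saved job (`sqoop job --create ...`) lets Sqoop record the new --last-value after each run, so subsequent executions pick up where the last one stopped.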

ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver

mysql,sqoop
I am using a shared-node cluster, Hadoop 2.5.0-cdh5.3.2. Please share the names of all compatible versions of the MySQL jar files to be loaded, and all the folder paths, for a successful import and export between HDFS and MySQL. I am currently getting the below error message: ERROR sqoop.Sqoop: Got exception running Sqoop:...
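This error means the MySQL JDBC driver jar is not on Sqoop's classpath. The usual fix is to drop the MySQL Connector/J jar into Sqoop's lib directory; the jar name and directories below are typical but version-specific, so verify them on your cluster:

```shell
# copy the Connector/J jar (name varies by version) into Sqoop's lib dir
cp mysql-connector-java-5.1.35-bin.jar /usr/lib/sqoop/lib/

# on CDH parcel installs the Sqoop lib dir may instead be under:
#   /opt/cloudera/parcels/CDH/lib/sqoop/lib/
```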

How to import a MySQL table into a targeted database in Hive?

hadoop,hive,sqoop
I am using Hadoop version 2.6.0 & Sqoop version 1.4.5. I have successfully imported a MySQL table, tblSystem, into Hive using the following Sqoop command: sqoop import --connect jdbc:mysql://ip_Address:port_no/MySQL_database_name --username user --password passwd --table tblSystem -m 1 --hive-import However, I noticed that this command imports the SQL table into the...
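By default --hive-import lands the table in Hive's default database. You can direct it to a specific database by qualifying --hive-table as db.table (newer Sqoop versions also accept a separate --hive-database flag). A sketch assuming a pre-created Hive database named target_db:

```shell
sqoop import \
  --connect jdbc:mysql://ip_Address:port_no/MySQL_database_name \
  --username user --password passwd \
  --table tblSystem -m 1 \
  --hive-import \
  --hive-table target_db.tblSystem
```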

Data in HDFS files not seen under hive table

hadoop,hive,sqoop,hadoop-partitioning
I have to create a Hive table from data present in Oracle tables. I'm doing a Sqoop import, thereby converting the Oracle data into HDFS files. Then I'm creating a Hive table on the HDFS files. The Sqoop completes successfully and the files also get generated in the HDFS target directory....

Import data from oracle into hive using sqoop - cannot use --hive-partition-key

oracle,hadoop,hive,sqoop
I have a simple table: create table osoba(id number, imie varchar2(100), nazwisko varchar2(100), wiek integer); insert into osoba values(1, 'pawel','kowalski',36); insert into osoba values(2, 'john','smith',55); insert into osoba values(3, 'paul','psmithski',44); insert into osoba values(4, 'jakub','kowalski',70); insert into osoba values(5, 'scott','tiger',70); commit; that I want to import into Hive using sqoop....
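--hive-partition-key only supports a single static partition per import: you must supply a constant --hive-partition-value, and the partition column itself should be excluded from the imported columns. A sketch against the osoba table above, loading just the wiek=70 rows into one partition (connect string and credentials are placeholders):

```shell
sqoop import \
  --connect jdbc:oracle:thin:@host:1521/orcl \
  --username scott --password tiger \
  --query "SELECT id, imie, nazwisko FROM osoba WHERE wiek = 70 AND \$CONDITIONS" \
  --target-dir /tmp/osoba_import -m 1 \
  --hive-import --hive-table osoba \
  --hive-partition-key wiek \
  --hive-partition-value 70
```

Loading several partition values this way means one sqoop invocation per value; dynamic partitioning is not supported by this flag.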

getting error while using sqoop 1.4.5 and hadoop 2.4.1

java,hadoop,import,sqoop
While importing data with Sqoop 1.4.5, I am getting the below error: 15/04/30 16:15:10 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/root/.staging/job_1430385162985_0014 Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:53) I am using Hadoop 2.4.1 with Java version "1.7.0_75"....
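An IncompatibleClassChangeError on org.apache.hadoop.mapreduce.JobContext almost always means a Sqoop binary built against Hadoop 1 running on a Hadoop 2 cluster (JobContext changed from a class to an interface between the two). The fix is to use the Hadoop-2 build of Sqoop; the artifact name below is the typical one for 1.4.5, but verify it against the Apache mirror:

```shell
wget https://archive.apache.org/dist/sqoop/1.4.5/sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz
tar xzf sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz
```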

sqoop import unable to locate sqoop-1.4.6.jar

hadoop,sqoop
I'm using Sqoop for importing data from a MySQL table to be used with Hadoop. While importing, it shows an error. Hadoop Version: 2.5.0 Sqoop Version: 1.4.6 Command used for import: sqoop import --connect jdbc:mysql://localhost/<dbname> --username root --password [email protected] --table <tablename> -m 1 Error shown: 15/05/27 23:13:59 ERROR tool.ImportTool: Encountered IOException...

How to export data from HBase to SQL Server

sql-server,hbase,sqoop
How can I export data from HBase to SQL Server? Can I do it directly using some tool? I use Sqoop to move data from SQL Server to HBase. But how can I use sqoop-export to export data from HBase to SQL Server? Thanks...
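Sqoop cannot read HBase directly for export, so the usual pattern is to stage the HBase data as plain HDFS files (for example via a Hive external table over HBase) and then sqoop-export those files. A sketch with hypothetical table, column, and path names:

```shell
# 1. In Hive: map the HBase table and dump it to plain HDFS files
#    CREATE EXTERNAL TABLE hb(key STRING, val STRING)
#      STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
#      WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:val')
#      TBLPROPERTIES ('hbase.table.name' = 'mytable');
#    INSERT OVERWRITE DIRECTORY '/tmp/hb_export' SELECT * FROM hb;

# 2. Export the staged files to SQL Server (Hive's default field
#    delimiter is \001, hence the terminator flag below)
sqoop export \
  --connect 'jdbc:sqlserver://host:1433;databaseName=mydb' \
  --username user --password pass \
  --table target_table \
  --export-dir /tmp/hb_export \
  --input-fields-terminated-by '\001'
```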

HDP 2.2 Sandbox Could not find SQOOP directory

hadoop,sandbox,sqoop,hortonworks-data-platform
I was following the tutorial http://hortonworks.com/hadoop-tutorial/import-microsoft-sql-server-hortonworks-sandbox-using-sqoop/ I am unable to find /usr/lib/sqoop/lib. I can see Sqoop running in the sandbox, just not able to find the folder to drop the drivers into. Can someone please help with where else I could place the JDBC driver? Also, where is the installation directory...
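On HDP 2.2 the packaging moved from /usr/lib to versioned directories under /usr/hdp, with /usr/hdp/current symlinks pointing at the active version, which is why the tutorial's /usr/lib/sqoop path is missing. A sketch for locating the Sqoop lib directory and dropping the SQL Server driver there (verify the exact path on your sandbox first):

```shell
ls /usr/hdp/current/sqoop-client/lib
cp sqljdbc4.jar /usr/hdp/current/sqoop-client/lib/
```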

schedule and automate sqoop import/export tasks

shell,hadoop,automation,hive,sqoop
I have a Sqoop job which needs to import data from Oracle to HDFS. The Sqoop query I'm using is: sqoop import --connect jdbc:oracle:thin:@hostname:port/service --username sqoop --password sqoop --query "SELECT * FROM ORDERS WHERE orderdate = To_date('10/08/2013', 'mm/dd/yyyy') AND partitionid = '1' AND rownum < 10001 AND \$CONDITIONS" --target-dir /test1...
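For simple scheduling, a cron-driven wrapper script is often enough (Oozie coordinators are the heavier-weight alternative). A sketch that computes the date at run time instead of hard-coding it; all paths and credentials are hypothetical:

```shell
#!/bin/bash
# daily_sqoop.sh -- run the parameterized import for today's date
set -e
DT=$(date +%m/%d/%Y)
sqoop import \
  --connect jdbc:oracle:thin:@hostname:port/service \
  --username sqoop --password sqoop \
  --query "SELECT * FROM ORDERS WHERE orderdate = To_date('${DT}', 'mm/dd/yyyy') AND partitionid = '1' AND \$CONDITIONS" \
  --target-dir /test1/$(date +%Y%m%d) -m 1

# crontab entry to run it at 02:00 every day:
# 0 2 * * * /home/hduser/bin/daily_sqoop.sh >> /var/log/daily_sqoop.log 2>&1
```

Note the date-stamped --target-dir: sqoop import fails if the target directory already exists, so each run writes to a fresh path.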

How can I customize Sqoop Import serialization from Mysql to HBase?

mysql,serialization,import,hbase,sqoop
Currently, I have a MySQL table "email_history" as below. email_address updated_date modification [email protected] 2014-10-20 NEW:confidence::75|NEW:sources::cif [email protected] 2014-10-20 NEW:confidence::75|NEW:sources::cif|NEW:user::r.wagland The fields "email_address" and "modification" are VARCHAR and "updated_date" is DATE. When importing into HBase, the row key needs to be email_address concatenated with the date as a byte array. And the value needs to...
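Out of the box, Sqoop's HBase import can build a composite row key from multiple columns via a comma-separated --hbase-row-key list, though the values are joined with an underscore rather than raw byte-concatenated; fully custom serialization, such as byte-array-encoded dates, needs a custom PutTransformer, one of Sqoop's extension points. The simpler composite-key variant as a sketch (connect string and column family are placeholders):

```shell
sqoop import \
  --connect jdbc:mysql://host/db \
  --username user --password pass \
  --table email_history \
  --hbase-table email_history \
  --column-family m \
  --hbase-row-key email_address,updated_date \
  -m 1
```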

sqoop-export is failing when I have \N as data

hive,sqoop
I am getting the below error when I run my sqoop export command. This is the content to be exported by the sqoop command: 00001|Content|1|Content-article|\N|2015-02-1815:16:04|2015-02-1815:16:04|1 |\N|\N|\N|\N|\N|\N|\N|\N|\N 00002|Content|1|Content-article|\N|2015-02-1815:16:04|2015-02-1815:16:04|1 |\N|\N|\N|\N|\N|\N|\N|\N|\N sqoop command: sqoop export --connect jdbc:postgresql://10.11.12.13:1234/db --table table1 --username user1 --password pass1 --export-dir /hivetables/table/ --fields-terminated-by '|' --lines-terminated-by '\n' -- --schema schema...
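\N is Hive's default encoding for NULL, so the export needs to be told to interpret it as NULL rather than as a literal string, via --input-null-string and --input-null-non-string. A sketch built on the command from the question (the backslash is doubled because Sqoop's option parsing reduces \\N to \N):

```shell
sqoop export \
  --connect jdbc:postgresql://10.11.12.13:1234/db \
  --table table1 \
  --username user1 --password pass1 \
  --export-dir /hivetables/table/ \
  --fields-terminated-by '|' --lines-terminated-by '\n' \
  --input-null-string '\\N' \
  --input-null-non-string '\\N' \
  -- --schema schema
```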

SQOOP connection-param-file format

hadoop,parameters,connection-string,sqoop
In Sqoop for Hadoop you can use a parameters file for connection string information: --connection-param-file filename Optional properties file that provides connection parameters What is the format of that file? Say, for example, I have: jdbc:oracle:thin:@//myhost:1521/mydb How should that appear in a parameters file?...
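The file uses standard Java properties syntax (key=value, one per line); the properties are handed to the JDBC driver when the connection is opened, so the valid keys are driver-specific. The connect string itself still goes on --connect, not in this file. A sketch with typical Oracle driver properties (property names are driver-dependent, so verify them against the Oracle JDBC documentation):

```properties
# conn-params.properties -- passed to the JDBC driver at connect time
user=scott
password=tiger
oracle.jdbc.ReadTimeout=60000
```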

Sqoop Export with Missing Data

sql,postgresql,shell,hadoop,sqoop
I am trying to use Sqoop to export data from HDFS into PostgreSQL. However, I receive an error partway through the export saying it can't parse the input. I manually went into the file I was exporting and saw that this row had two columns missing. I have tried a...

sqoop-merge: can this command be used on an HBase import?

hbase,sqoop
I use Sqoop to import data from SQL Server into HBase. Can I also use the sqoop-merge command to update data in HBase? Thanks...