
hadoop - Java program imports data using Sqoop

Problem description:

I have created a web application (JSP) to extract data from a MySQL database to HDFS. In my Java code, I used the Sqoop import command to achieve this. The program executes successfully, but the extracted data is written to the normal Unix file system instead of HDFS.

Can anyone let me know how to provide the HDFS file system path in the Sqoop import command?

package com.archival.da;

import java.sql.*;

import com.cloudera.sqoop.Sqoop; // org.apache.sqoop.Sqoop in later Sqoop releases

public class DataImportSetup {

    static int status = 0;

    public static int importsetup(String policy_id) {
        Connection con = GetCon.getCon();
        PreparedStatement ps;
        try {
            // Look up the connection details and the extraction query for this policy
            ps = con.prepareStatement(
                "SELECT CON.SERVER, CON.PORT, CON.DB, CON.USER, CON.PWD, POLICY.SQL_TEXT "
                + "FROM POLICY JOIN CONNECTION AS CON ON POLICY.C_ID = CON.C_ID "
                + "WHERE POLICY.ID = ?");
            ps.setString(1, policy_id);
            ResultSet rs = ps.executeQuery();
            rs.next();

            // Build the JDBC connect string from the stored server, port and database
            String ServerNm = "jdbc:mysql://" + rs.getString(1) + ":"
                    + rs.getString(2) + "/" + rs.getString(3);
            String ConUser = rs.getString(4);
            String ConPass = rs.getString(5);
            // Note: a free-form --query import requires WHERE $CONDITIONS in the SQL text
            String SqlText = rs.getString(6);

            // Sqoop import arguments (note: "--hadoop-mapred-home", no space)
            String[] str = {"import", "--connect", ServerNm,
                    "--hadoop-mapred-home", "/ms/hadoop-1.2.0",
                    "--query", SqlText,
                    "--target-dir", "/user/root/city",
                    "--username", ConUser, "--password", ConPass,
                    "--split-by", "id"};

            status = Sqoop.runTool(str);
            System.out.println(status);
        } catch (SQLException e) {
            e.printStackTrace();
        }
        return status;
    }
}

Answer:

It's writing to the local file system instead of HDFS because the default file system is local unless configured otherwise. You can configure this to be HDFS using SqoopOptions - see this question / answer for an example (a rough sketch of that route follows the link below):

  • How can I execute Sqoop in Java?
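
As a rough sketch of the SqoopOptions route, assuming the Sqoop 1.x Java API (com.cloudera.sqoop here; the package is org.apache.sqoop in later releases) - the NameNode URI and connect string below are placeholders, not values from the question:

import org.apache.hadoop.conf.Configuration;
import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.tool.ImportTool;

// Point the default file system at HDFS before running the import.
// "hdfs://namenode:9000" is a placeholder; use your cluster's NameNode URI
// (fs.default.name is the Hadoop 1.x property name).
Configuration config = new Configuration();
config.set("fs.default.name", "hdfs://namenode:9000");

SqoopOptions options = new SqoopOptions();
options.setConf(config);
options.setConnectString("jdbc:mysql://localhost:3306/mydb"); // placeholder connect string
options.setUsername("user");
options.setPassword("pass");
options.setTargetDir("/user/root/city"); // now resolved against HDFS

int ret = new ImportTool().run(options);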

Specifically, you need to locate your cluster's core-site.xml and hdfs-site.xml files and add them to the configuration:

Configuration config = new Configuration(); 
config.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
config.addResource(new Path("/usr/local/hadoop/conf/hdfs-site.xml"));