Ask Big Data Hadoop Related Questions

HDFS Tutorial Team | March 6, 2017 | Hadoop

Ask Any Hadoop and Its Ecosystem Related Questions

We are building our forum. Until it goes live, you can post all your questions and doubts here, and we will make sure they get answered correctly. All Hadoop and BI related questions are welcome. You can also post job-related queries.

Tags: Ask Your Doubts, Hadoop Questions, Hadoop Support

Related Posts:
- Different Modes of Hadoop
- How to Create Table in HBase
- 5 Best Hadoop Use Cases in the Education Sector

4 Comments

Sumit Kumar:
Thanks for this website. Could you please provide the code, or a way, to schedule a daily Sqoop job that loads an incremental import into a Hive table? A step-by-step solution would be a great help.
Thanks,
Sumit Kumar

Ashutosh Jha:
Hi Sumit,
Incremental load is straightforward; the following link explains how to do an incremental import in Sqoop:
http://www.hdfstutorial.com/sqoop-import-function/
And for job scheduling in Sqoop, you can check this tutorial:
http://www.hdfstutorial.com/sqoop-jobs/
Do try to implement these and let us know of any further doubts. Have any doubts? Please comment here.
Regards,
HDFS Tutorial Team

Jerry:

-- Load the two space-delimited input files
customers = LOAD './in2/customersTable.txt' USING PigStorage(' ') AS (nameCus:chararray, age:int);
purchases = LOAD './in2/purchasesTable.txt' USING PigStorage(' ') AS (namePur:chararray, flavor:chararray);
-- Join on the customer name (the original joined on a non-existent field "name")
A = JOIN customers BY nameCus, purchases BY namePur;
-- After a join, fields are disambiguated by relation name, not by the alias A
B = FOREACH A GENERATE customers::nameCus, purchases::namePur, customers::age, purchases::flavor;
C = GROUP B BY flavor;
-- Count the bag of joined rows per flavor; keep the group key so we can tell flavors apart
D = FOREACH C GENERATE group AS flavor, COUNT(B) AS purchaseCount;
E = ORDER D BY purchaseCount DESC;
F = LIMIT E 1;
STORE F INTO './flavorcount';

Spoorthy:
Hi, I installed Sqoop as per your guidelines. When I run the sqoop list-databases command, I get the error "Could not find or load main class org.apache.sqoop.Sqoop". How can I resolve this issue?
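The incremental-import and job-scheduling steps described in the reply above can be sketched as Sqoop commands. This is only an illustration: the connection string, database, table, user, check column, and password-file path are all hypothetical placeholders, not values from the tutorial.

```shell
# Create a saved Sqoop job that does an incremental append import into Hive.
# Every connection detail below (dbhost, sales, orders, etl_user, the
# password-file path) is a hypothetical placeholder -- substitute your own.
sqoop job --create daily_orders_import -- import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl_user \
  --password-file /user/etl/.db.pass \
  --table orders \
  --hive-import --hive-table orders \
  --incremental append \
  --check-column order_id \
  --last-value 0

# Run the saved job. Sqoop records the new --last-value in its metastore
# after each successful run, so the next execution imports only new rows.
sqoop job --exec daily_orders_import
```

To schedule it daily, the saved job can be invoked from cron (for example, `0 2 * * * sqoop job --exec daily_orders_import` to run at 2 a.m.); a `--password-file` is used rather than `-P` because cron runs are non-interactive.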