Hi, I have a very simple AVSC file, and I generated the Avro using GitHub code, an XML-to-Avro converter. But when I query the table, I get the error below:

Avro - java.io.IOException: java.io.IOException: Not a data file.

I can see the avro file inside the table folder. The table DDL embeds the schema directly (fragment as posted):

COMMENT "just drop the schema right into the HQL"
'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs://csaa-aap-qa/apps/hive/warehouse/reservemodel.db/embedded'

Kindly let me know what I am missing?

Reply: I am a novice at data loading and Avro, so I can't verify the Avro schema etc. provided in the post, but I did face the same issue (java.io.IOException: java.io.IOException: Not a data file). I am listing below the steps I carried out to load a SQL Server table into Avro and then into a Hive external table; maybe it will provide some pointers.

Create a directory for the sqoop import:

root]$ hadoop fs -mkdir -p

Grant permissions to sqoop:

root]$ hadoop fs -chown -R sqoop
root]$ hadoop fs -ls

Run the sqoop import:

sqoop import --connect 'jdbc:sqlserver://dbserver;database=dbname' --username someusername --password somepassword --as-avrodatafile --num-mappers 8 --table DimSampleDesc --warehouse-dir /dataload/tohdfs/reio/odpdw/may2016 --verbose

An exception was thrown, but the avro files were created:

Writing Avro schema file: /tmp/sqoop-sqoop/compile/e64596608ce0247bf2233353991b20fd/DimSampleDesc.avsc
16/05/09 13:09:00 DEBUG mapreduce.DataDrivenImportJob: Could not move Avro schema file to code output directory.
java.io.FileNotFoundException: Destination directory '.' does not exist
    at org.apache.commons.io.FileUtils.moveFileToDirectory(FileUtils.java:2865)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.writeAvroSchema(DataDrivenImportJob.java:146)
    at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:92)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:260)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:244)

There was an error in moving the schema file, so move it manually. Then view the warehouse-dir:

root]$ hadoop fs -ls /dataload/tohdfs/reio/odpdw/may2016/DimSampleDesc
rw-rw-rw- 3 sqoop hdfs 3659 16:18 /dataload/tohdfs/reio/odpdw/may2016/DimSampleDesc/DimSampleDesc.avsc

To allow Hive to write, give write permissions to all:

root]$ hadoop fs -chmod -R a+w

If this step is skipped, the table will still get created, but when you do a 'select' on the table, you get the 'Not a data file' error message.
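As background on the "Not a data file" message: an Avro object container file must begin with the 4-byte magic b"Obj\x01", and Avro's reader raises "Not a data file." for any file in the table location that lacks this header, for example a bare .avsc schema file copied into the data directory. The sketch below is my own illustration, not from the post; the file names (DimSampleDesc.avsc, part-m-00000.avro) are placeholders.

```python
import json
import os
import tempfile

# Avro object container files start with "Obj" followed by the version byte 1.
AVRO_MAGIC = b"Obj\x01"

def is_avro_container(path):
    """Return True if the file begins with the Avro container magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == AVRO_MAGIC

tmpdir = tempfile.mkdtemp()

# A .avsc schema file is plain JSON, not a container file; a query over a
# directory containing it fails with "Not a data file."
schema_path = os.path.join(tmpdir, "DimSampleDesc.avsc")
with open(schema_path, "w") as f:
    json.dump({"type": "record", "name": "DimSampleDesc", "fields": []}, f)

# A data file written by sqoop's --as-avrodatafile starts with the magic
# header; here we fake just the header for illustration.
fake_data_path = os.path.join(tmpdir, "part-m-00000.avro")
with open(fake_data_path, "wb") as f:
    f.write(AVRO_MAGIC + b"...")

print(is_avro_container(schema_path))     # False: schema, not data
print(is_avro_container(fake_data_path))  # True: has the container magic
```

This is why keeping the .avsc out of the directory that LOCATION points to (or pointing the table at a directory containing only container files) avoids the error.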