
Resource Manager error on an HDInsight cluster with a job deployed via Livy #30

Closed
shailendraksharma opened this issue Jan 7, 2017 · 4 comments


@shailendraksharma

Attachment: last job detailed Log on Spark cluster.txt

Any help will be much appreciated.

@CodingCat
Contributor

Hi @shailendraksharma, what you provided is the log from the ApplicationMaster; could you post the log from the container side as well?
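For reference, on a YARN-based cluster such as HDInsight the aggregated container logs can usually be pulled once the application finishes, e.g. with yarn logs -applicationId <application_id>, where <application_id> is a placeholder for the ID shown in the Resource Manager UI.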

@CodingCat
Contributor

If you're an HDInsight customer, would you please communicate through our work email? Your colleague should have our contact information.

@shailendraksharma
Author

shailendraksharma commented Jan 24, 2017

Thanks for the help. I think I've figured out the issue: saving small JSON data works fine, but saving lengthy JSON causes errors. One such error, raised while inserting data into the SQL DB, is the following:

17/01/22 19:48:01 INFO DStreamGraph: Clearing checkpoint data for time 1485114434000 ms
17/01/22 19:48:01 WARN AzureFileSystemThreadPoolExecutor: Disabling threads for Delete operation as thread count 0 is <= 1
17/01/22 19:48:01 INFO AzureFileSystemThreadPoolExecutor: Time taken for Delete operation is: 216 ms with threads: 0
17/01/22 19:48:01 WARN TaskSetManager: Lost task 3.0 in stage 99.0 (TID 584, 10.0.0.14): com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near 's'.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1515)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:792)
at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:689)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:180)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:155)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeUpdate(SQLServerStatement.java:642)
at com.microsoft.spark.streaming.examples.common.DataFrameExtensions$ExtendedDataFrame$$anonfun$insertToAzureSql$1$$anonfun$apply$1.apply(DataFrameExtensions.scala:48)
at com.microsoft.spark.streaming.examples.common.DataFrameExtensions$ExtendedDataFrame$$anonfun$insertToAzureSql$1$$anonfun$apply$1.apply(DataFrameExtensions.scala:39)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at com.microsoft.spark.streaming.examples.common.DataFrameExtensions$ExtendedDataFrame$$anonfun$insertToAzureSql$1.apply(DataFrameExtensions.scala:38)
at com.microsoft.spark.streaming.examples.common.DataFrameExtensions$ExtendedDataFrame$$anonfun$insertToAzureSql$1.apply(DataFrameExtensions.scala:33)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:902)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:902)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Still trying to figure this one out.
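The "Incorrect syntax near 's'" error is what SQL Server reports when a single quote inside the data (e.g. "it's") terminates a string literal early, which typically happens when the INSERT statement is built by string concatenation; longer free-text JSON is simply more likely to contain a quote, which would explain why only the lengthy payloads fail. A minimal Scala sketch of the usual fix, using a parameterized JDBC statement; the connection URL, table name (Events), and column name (payload) below are hypothetical placeholders:

import java.sql.DriverManager

object SafeInsertSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical Azure SQL connection string; substitute real values.
    val url = "jdbc:sqlserver://<server>.database.windows.net:1433;" +
      "database=<db>;user=<user>;password=<password>"
    val conn = DriverManager.getConnection(url)
    try {
      // A payload with an apostrophe would break a concatenated INSERT
      // string with exactly "Incorrect syntax near 's'".
      val json = """{"note": "it's a fairly long payload"}"""
      // Parameterized statement: the driver escapes the bound value,
      // so quotes inside the JSON cannot alter the SQL text.
      val stmt = conn.prepareStatement("INSERT INTO Events (payload) VALUES (?)")
      try {
        stmt.setString(1, json)
        stmt.executeUpdate()
      } finally stmt.close()
    } finally conn.close()
  }
}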

@sabeegrewal
Contributor

Hey, thanks for your interest in using Spark and EventHubs :) If this issue is still ongoing, please open a new issue! I'll be actively checking for issues and would be happy to help out. Thanks!
