How to kill a running Spark application?
I have a running Spark application that occupies all the cores, so my other applications can't be allocated any resources.

I did some quick research, and people suggested using YARN kill or /bin/spark-class to kill the application. However, I am using a CDH distribution where /bin/spark-class doesn't exist at all, and YARN's kill application command doesn't work either.


Can anyone help me with this?

Best answer

  • Copy and paste the application ID from the Spark scheduler, for example: application_1428487296152_25597
  • Connect to the server where the job was launched
  • yarn application -kill application_1428487296152_25597
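The steps above can be sketched as a short shell session. The application ID below is the example from the answer; substitute your own, which you can look up with `yarn application -list`:

```shell
# List running YARN applications to find the Spark app's ID
yarn application -list -appStates RUNNING

# Kill the application by its ID (example ID from the answer above)
yarn application -kill application_1428487296152_25597
```

These commands must be run on a host with the YARN client configured (e.g. a cluster gateway node), typically as a user with permission to kill the application.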
