Exception while deleting Spark temp dir on Windows 7 64-bit

2024-05-21

I am trying to run unit tests for a Spark job on Windows 7 64-bit. I have:

HADOOP_HOME=D:/winutils

winutils path = D:/winutils/bin/winutils.exe
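One detail worth checking before anything else: Hadoop's shell utilities locate winutils.exe through either the HADOOP_HOME environment variable or the hadoop.home.dir JVM system property, and an environment variable set in the OS is not always visible to the JVM a test runner spawns. A minimal sketch of setting the property programmatically at the start of a test, assuming the same D:/winutils path as above:

```java
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Assumption: winutils.exe lives at D:/winutils/bin/winutils.exe,
        // as described in the question. Set this before creating any
        // SparkContext/HiveContext so Hadoop's Shell class can find it.
        System.setProperty("hadoop.home.dir", "D:/winutils");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

If the property is missing, Hadoop typically logs a "Could not locate executable ... winutils.exe" error, which is a separate problem from the deletion failure below but often appears alongside it.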

I ran the following commands:

winutils ls \tmp\hive
winutils chmod -R 777  \tmp\hive

But when I run the tests, I get the following error.

Running com.dnb.trade.ui.ingest.spark.utils.ExperiencesUtilTest
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec
17/01/24 15:37:53 INFO Remoting: Remoting shut down
17/01/24 15:37:53 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\415387\AppData\Local\Temp\spark-b1672cf6-989f-4890-93a0-c945ff147554
java.io.IOException: Failed to delete: C:\Users\415387\AppData\Local\Temp\spark-b1672cf6-989f-4890-93a0-c945ff147554
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:929)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at .....

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=786m; support was removed in 8.0

Caused by: java.lang.RuntimeException: java.io.IOException: Access is denied
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:525)
        ... 28 more
Caused by: java.io.IOException: Access is denied
        at java.io.WinNTFileSystem.createFileExclusively(Native Method)

I have tried changing the permissions manually. I get the same error every time.

Please help!


The problem is in the ShutdownHook, which tries to delete the temp files on JVM exit but fails. While you cannot fix the deletion itself, you can hide the exception by adding the following two lines to the log4j.properties file in %SPARK_HOME%\conf. If that file does not exist, copy log4j.properties.template and rename it.

log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
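For context on why a single "Access is denied" fails the whole cleanup: as the stack trace shows, Spark's Utils.deleteRecursively walks the temp directory and raises an IOException for the first entry it cannot remove, and on Windows a file stays undeletable while another process or thread (such as an embedded Hive/Derby instance that has not fully shut down) still holds it open. A minimal JDK-only sketch of that deletion logic, not Spark's actual code:

```java
import java.io.File;
import java.io.IOException;

public class DeleteSketch {
    // Sketch of what Spark's Utils.deleteRecursively does: delete the
    // children first, then the directory itself, and fail fast with an
    // IOException on the first entry that cannot be removed.
    static void deleteRecursively(File file) throws IOException {
        if (file.isDirectory()) {
            File[] children = file.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
        }
        // On Windows, delete() returns false while another process still
        // holds the file open, even if permissions look correct.
        if (!file.delete() && file.exists()) {
            throw new IOException("Failed to delete: " + file.getAbsolutePath());
        }
    }

    public static void main(String[] args) throws IOException {
        File tmp = java.nio.file.Files.createTempDirectory("spark-sketch-").toFile();
        new File(tmp, "child.txt").createNewFile();
        deleteRecursively(tmp);
        System.out.println("deleted: " + !tmp.exists());
    }
}
```

This is why chmod-style permission changes do not help here: the delete fails because of an open file handle, not a permission bit, and the log4j lines above merely silence the resulting error report.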

Out of sight, out of mind.
