name: Question
about: Exception in thread "main" java.lang.NoClassDefFoundError when running DeepWalkExample and PageRankExample
title: java.lang.NoClassDefFoundError when running several of the local Examples in a CentOS 7.9 virtual machine
label: question
assignees: ''
Environment:
Problem description
I am running the DeepWalkExample and PageRankExample under example/local inside a CentOS 7.9 virtual machine created with VMware (8 GB of memory allocated), and both fail with the error below. I have not been able to track down the cause myself, so I am asking for help.
One additional note: I run and test from a local IDEA instance via remote deployment, but the whole environment lives on the virtual machine, so this should not make a difference.
I have already tried switching the Scala version; the same problem occurs under both 2.12.16 and 2.11.8. To rule out a broken Scala setup, I created a helloworld.scala file in the local directory, and it runs and prints its output successfully.
Error message
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
22/08/03 22:36:52 INFO SparkContext: Running Spark version 2.4.0
22/08/03 22:36:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/08/03 22:36:53 INFO SparkContext: Submitted application: DeepWalk
22/08/03 22:36:54 INFO SecurityManager: Changing view acls to: root
22/08/03 22:36:54 INFO SecurityManager: Changing modify acls to: root
22/08/03 22:36:54 INFO SecurityManager: Changing view acls groups to:
22/08/03 22:36:54 INFO SecurityManager: Changing modify acls groups to:
22/08/03 22:36:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
22/08/03 22:36:55 INFO Utils: Successfully started service 'sparkDriver' on port 35072.
22/08/03 22:36:55 INFO SparkEnv: Registering MapOutputTracker
22/08/03 22:36:55 INFO SparkEnv: Registering BlockManagerMaster
22/08/03 22:36:55 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/08/03 22:36:55 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/08/03 22:36:55 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-76e1fc3a-6ced-4cb1-b160-4b02ee6ff286
22/08/03 22:36:55 INFO MemoryStore: MemoryStore started with capacity 861.3 MB
22/08/03 22:36:55 INFO SparkEnv: Registering OutputCommitCoordinator
22/08/03 22:36:56 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/08/03 22:36:56 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://hadoop000:4040
22/08/03 22:36:56 INFO Executor: Starting executor ID driver on host localhost
22/08/03 22:36:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37745.
22/08/03 22:36:57 INFO NettyBlockTransferService: Server created on hadoop000:37745
22/08/03 22:36:57 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/08/03 22:36:57 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, hadoop000, 37745, None)
22/08/03 22:36:57 INFO BlockManagerMasterEndpoint: Registering block manager hadoop000:37745 with 861.3 MB RAM, BlockManagerId(driver, hadoop000, 37745, None)
22/08/03 22:36:57 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, hadoop000, 37745, None)
22/08/03 22:36:57 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, hadoop000, 37745, None)
Exception in thread "main" java.lang.NoClassDefFoundError: com/tencent/angel/graph/embedding/deepwalk/DeepWalk
at com.tencent.angel.spark.examples.local.DeepWalkExample$.main(DeepWalkExample.scala:31)
at com.tencent.angel.spark.examples.local.DeepWalkExample.main(DeepWalkExample.scala)
Caused by: java.lang.ClassNotFoundException: com.tencent.angel.graph.embedding.deepwalk.DeepWalk
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 2 more
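For context on this failure mode (my reading, not confirmed by the report): a NoClassDefFoundError whose cause is a ClassNotFoundException usually means the class was visible at compile time but its jar is missing from the runtime classpath — here, the angel-graph artifact containing com.tencent.angel.graph.embedding.deepwalk.DeepWalk is apparently not on the classpath IDEA uses to run the example. A minimal sketch of the same lookup the JVM performs (ClasspathCheck is a hypothetical helper, not part of Angel):

```java
// Reproduces the class-loader lookup that fails in the stack trace above:
// Class.forName resolves a class through the current class loader, exactly
// as URLClassLoader.findClass does when DeepWalkExample first touches DeepWalk.
public class ClasspathCheck {
    // Returns true when className can be resolved by the current class loader.
    public static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.lang.String always resolves; the Angel class resolves only
        // when the angel-graph jar is actually on the runtime classpath.
        System.out.println(isOnClasspath("java.lang.String"));
        System.out.println(isOnClasspath(
            "com.tencent.angel.graph.embedding.deepwalk.DeepWalk"));
    }
}
```

If the second check prints false inside the IDE, the run configuration's module classpath (rather than the Scala version) would be the place to look.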