

ClassNotFoundException in shared Hadoop when trying to create a directory

Hi, we are trying to connect a lab node running the Orion Context Broker to the Cosmos big data store. Unfortunately, we are experiencing some issues with Cosmos: when trying to create a directory, as explained in the documentation, we get the following error message:

[developer@cosmosmaster-gi ~]$ hadoop fs -mkdir /user/developer/testdirectory
Java HotSpot(TM) 64-Bit Server VM warning: Insufficient space for shared memory file:
   /tmp/hsperfdata_developer/28027
Try using the -Djava.io.tmpdir= option to select an alternate temp location.

Exception in thread "main" java.lang.NoClassDefFoundError: /tmp/hsperfdata_developer/28039
Caused by: java.lang.ClassNotFoundException: .tmp.hsperfdata_developer.28039
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: _/tmp/hsperfdata_developer/28039. Program will exit.
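The warning suggests the partition backing /tmp on the master node is full, so the JVM cannot write its hsperfdata files there and then misparses the path as a class name. As a sketch of the workaround the warning itself hints at (assuming the `hadoop` wrapper script honours the standard `HADOOP_OPTS` environment variable, and using `~/tmp` as a hypothetical alternate location):

```shell
# Check free space on the partition that backs /tmp, where the JVM
# writes its hsperfdata files by default.
df -h /tmp

# Hypothetical workaround: create an alternate temp dir with free space
# and point the JVM at it via HADOOP_OPTS before retrying the command.
mkdir -p "$HOME/tmp"
export HADOOP_OPTS="-Djava.io.tmpdir=$HOME/tmp"
if command -v hadoop >/dev/null 2>&1; then
    hadoop fs -mkdir /user/developer/testdirectory
fi
```

We have not yet confirmed whether this works around the problem or whether /tmp needs to be cleaned up on the server side.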

Any suggestions?