I have deployed the code as a WAR file on WebLogic Server (WLS) 12.1.3, where I send a message from a producer and the messages are consumed by the code below. The application is deployed as a WAR file on a WLS server on Windows and it is listening, but the same WAR file is deployed with the same version of
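The consumer code referenced in that excerpt is not shown here; purely as an illustration, the sketch below is a plain JMS 1.1 asynchronous consumer looked up via JNDI, as it might run inside a WAR on WLS 12.1.3. The JNDI names jms/MyConnectionFactory and jms/MyQueue are placeholders, not names from the original question.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.JMSException;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageListener;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class QueueListener {

        public void startListening() throws Exception {
            // Inside the container, the no-arg InitialContext resolves against WLS JNDI.
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/MyConnectionFactory"); // placeholder name
            Queue queue = (Queue) ctx.lookup("jms/MyQueue");                                  // placeholder name

            Connection connection = cf.createConnection();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(queue);

            // Asynchronous delivery: onMessage is invoked for each message arriving on the queue.
            consumer.setMessageListener(new MessageListener() {
                @Override
                public void onMessage(Message message) {
                    try {
                        if (message instanceof TextMessage) {
                            System.out.println("Received: " + ((TextMessage) message).getText());
                        }
                    } catch (JMSException e) {
                        e.printStackTrace();
                    }
                }
            });

            connection.start(); // delivery does not begin until the connection is started
        }
    }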
Tag: linux
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=350m;
When I'm trying to open an IntelliJ-based IDE from the command line on Linux, like this: ./phpstorm.sh (this happens with both Android Studio and PhpStorm), I always get this message: OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=350m; support was removed in 8.0. I was wondering if Google could find a solution, but I was kinda lost here since I'm a newbie on Ubuntu 14.04. My
Installing Oracle JDK on Windows Subsystem for Linux
When trying to use the Linux version of Oracle's JDK on the latest Windows 10 build with bash support, I run into a problem where the prompt hangs whenever I attempt to invoke the java binary. Even something as simple as java -version hangs, and I have to terminate the process to regain control. Anyone got this working
Spring Boot init.d service not running (process not found)
I was trying to follow the instructions from here to run the Spring Boot app as an init.d service, but could not do so successfully. I created the fully executable jar (myapp.jar) as mentioned and also created the symlink at /etc/init.d/myapp. When I run java -jar myapp.jar I can see the application start up successfully. But when I try to
Run custom TextSecure (Signal) server
I am trying to start my custom TextSecure (Signal) server. I want to use it for all functions that Signal has (both SMS and telephony). I believe that I also need the RedPhone server to run telephony. I've found a GitHub repo for the TextSecure server only, https://github.com/WhisperSystems/TextSecure-Server, but no repo for the RedPhone server. I think that I also need to run this
How to install a man page for Maven on Linux?
I am using Linux (Mint MATE) and installed Maven by downloading, unzipping, and configuring the environment, so I can use the mvn command. I want to have man mvn, not just mvn -help. Any tip? Update: To make the question clear, there is no man page for mvn because I installed Maven from a zip archive, so I want to install the man page
Spring Batch FileItemWriter not creating file at correct path
I have a Spring Batch service containing a FileItemReader, FileItemProcessor and FileItemWriter. When creating the FileItemWriter I have to set the Resource that will be my output file. I am running the batch service on WebSphere on a Linux machine. The problem is that if I set the resource as new FileSystemResource(new File("opt/temp1/myFile.txt")), the path of the file created is "/usr/IBM/WebSphere/AppServer/profiles/AppSrv01/opt/temp1/myFile.txt", which is not
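For reference, here is a minimal sketch of how an output resource is commonly set on Spring Batch's FlatFileItemWriter (assuming that is the writer behind the question's "FileItemWriter"). The absolute path is illustrative only; a relative path would be resolved against the application server's working directory, which is one way output files end up under the WebSphere profile directory.

    import org.springframework.batch.item.file.FlatFileItemWriter;
    import org.springframework.batch.item.file.transform.PassThroughLineAggregator;
    import org.springframework.core.io.FileSystemResource;

    public class WriterConfig {

        public FlatFileItemWriter<String> myFileWriter() {
            FlatFileItemWriter<String> writer = new FlatFileItemWriter<>();
            // Absolute Unix-style path; a relative path would be resolved against the
            // server's working directory (e.g. the WebSphere profile directory).
            writer.setResource(new FileSystemResource("/opt/temp1/myFile.txt")); // illustrative path
            writer.setLineAggregator(new PassThroughLineAggregator<String>());
            return writer;
        }
    }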
How to set JAVA_HOME in Linux for all users
I am new to Linux and there seem to be too many Java folders. java -version gives me: java version "1.7.0_55" OpenJDK Runtime Environment (rhel-2.4.7.1.el6_5-x86_64 u55-b13) OpenJDK 64-Bit Server VM (build 24.51-b03, mixed mode). When I try to build a Maven project, I get an error. Could you please tell me which files I need to modify
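Not a fix for JAVA_HOME itself, but when several Java folders exist it can help to confirm which installation the java on the PATH actually is. The tiny class below (a diagnostic sketch, not part of the original question) just prints the running JVM's home directory and version.

    public class WhichJava {
        public static void main(String[] args) {
            // Shows which installation ran this class, i.e. the java found first on the PATH.
            // JAVA_HOME is usually set to this directory (or its parent, if java.home
            // points at a JDK's bundled jre/ subdirectory).
            System.out.println("java.home    = " + System.getProperty("java.home"));
            System.out.println("java.version = " + System.getProperty("java.version"));
        }
    }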
_XReply() terminates app with _XIOError()
We're developing a complex application which consists of a Linux binary integrated, via JNI calls, with Java code from our custom-made .jar file (the JVM is created inside the Linux binary). All GUI work is implemented and done by the Java part. Each time some GUI property has to be changed or the GUI has to be repainted, it is done by a JNI call into the JVM.
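The excerpt does not establish the root cause, but one frequent source of _XIOError crashes in mixed native/Java GUIs is touching Swing/AWT from whatever thread the native side happens to call in on. Below is a sketch of the Java side marshalling such calls onto the Event Dispatch Thread; the class, method name and label field are hypothetical, not taken from the question.

    import javax.swing.JLabel;
    import javax.swing.SwingUtilities;

    public class GuiBridge {

        private final JLabel statusLabel;

        public GuiBridge(JLabel statusLabel) {
            this.statusLabel = statusLabel;
        }

        // Hypothetical entry point invoked from the native binary via JNI. The calling
        // thread may be one the JVM has merely attached, so the actual Swing work is
        // handed off to the Event Dispatch Thread rather than done directly.
        public void updateStatus(final String text) {
            SwingUtilities.invokeLater(new Runnable() {
                @Override
                public void run() {
                    statusLabel.setText(text); // runs on the EDT
                }
            });
        }
    }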
Hadoop “Unable to load native-hadoop library for your platform” warning
I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable. I'm running Hadoop 2.2.0. Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html However, the contents of the /native/ directory on Hadoop 2.x appear to be