
Thread: Input stream buffers up to 4K before available to read

  1. #1
    Junior Member

    Input stream buffers up to 4K before available to read

    Hi there,



    I have a scenario where I want to read the input stream of a process (/usr/bin/program, run using Java's ProcessBuilder) as soon as the program writes to stdout/console.
    Run on its own, /usr/bin/program writes data to stdout as soon as bytes are available, even as little as 100 bytes. But when I run the same program
    through ProcessBuilder and read its input stream, I do not get the data as soon as the program writes to stdout, but only after 4K of data has accumulated on the input stream.

    I need the data(every byte written) to be available in the input stream in real time.

    I have tried at least these two options. Any help would be greatly appreciated.

    Option 1:
    process = new ProcessBuilder()
            .redirectErrorStream(true)
            .command("/usr/bin/ntfsubscribe", "-s")
            .start();
    InputStream is = process.getInputStream();
    ReadableByteChannel channel = Channels.newChannel(is);
    ByteBuffer buf = ByteBuffer.allocate(2048);

    int bytesRead = channel.read(buf);
    while (bytesRead != -1) {
        System.out.println("Read " + bytesRead);
        buf.flip();
        while (buf.hasRemaining()) {
            System.out.print((char) buf.get());
        }
        buf.clear();
        bytesRead = channel.read(buf);
    }

    Option 2:
    BufferedReader input = new BufferedReader(new InputStreamReader(is));
    String line;
    while ((line = input.readLine()) != null) {
        System.out.println(line);
    }


    Thanks


  2. #2
    Administrator copeg

    Re: Input stream buffers up to 4K before available to read

    I need the data(every byte written) to be available in the input stream in real time.
    I'm not sure I follow what the problem is. I presume you obtain all of the output data; perhaps you might explain what benefit is gained by obtaining one byte before another on a nanosecond time scale. FWIW, the InputStream obtained from the process - which runs in parallel to your code with no synchronization - is buffered (e.g. a BufferedInputStream).

  3. #3
    Junior Member

    Re: Input stream buffers up to 4K before available to read

    In my case, when the process first generated 1450 bytes of output, it was not available on the input stream. It sat in the
    buffer, and the process waited until subsequent outputs added up to 4K before anything appeared on the input stream. The process (/usr/bin/program)
    generates output when there is an external trigger. Since there was no external trigger after the first output, that output simply sat in the buffer, not yet available for the input stream to read. Hence it is not real time, and the first event is effectively lost in time.

    Every output from the process is considered an event on which some other process is started, some algorithm is run, etc.

    The output is control data, not raw data. Hence I need every piece of data that is generated, even if it is less than 4K, to
    be immediately available on the input stream to read.

    Hope I was able to clarify. I would be glad to have any technical workaround if this is the expected behavior.

  4. #4
    Super Moderator Norm

    Re: Input stream buffers up to 4K before available to read

    Have you tried reading directly from the InputStream?
    If you don't understand my answer, don't ignore it, ask a question.

  5. #5
    Administrator copeg

    Re: Input stream buffers up to 4K before available to read

    ...there was no external trigger the first output was simply in the buffer but not yet available to the input stream to read. Hence, it is not real time and the first event is kind of lost on time.
    I don't think that's necessarily what's going on here. The stream should be available to read, unless you are observing some type of native system issue. For instance, to quote the API:
    Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
    If you have not yet, I recommend reading the often cited article:
    When Runtime.exec() won't | JavaWorld
    (although you are using ProcessBuilder rather than calling exec directly, the concepts within this article still apply).

    My suggestion would be to place any reading of the InputStream and ErrorStream, as well as any writing via the OutputStream into their own threads so they all can be read/written in parallel.
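    To sketch the idea (with echo standing in for your program, since I can't run /usr/bin/ntfsubscribe here): one "gobbler" thread per stream, so stdout and stderr are drained in parallel and the child never blocks on a full pipe.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Sketch: drain each process stream on its own thread so neither stream
// can back up and block the child. "echo" is a stand-in command.
public class StreamGobblerDemo {
    static Thread gobble(InputStream in, String tag) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(tag + ": " + line);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("echo", "hello").start();
        Thread out = gobble(p.getInputStream(), "stdout");
        Thread err = gobble(p.getErrorStream(), "stderr");
        out.join();
        err.join();
        System.out.println("exit=" + p.waitFor());
    }
}
```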

  6. #6
    Junior Member

    Re: Input stream buffers up to 4K before available to read

    Yes, I have tried reading the input stream directly in a separate thread, with the same result.

    Something like this in the thread's run method. I now have separate threads for input, output, and error, but no luck yet.
    try {
        int c;
        String line = "";
        while ((c = inputstream.read()) != -1) {
            if (((char) c == '\r') || ((char) c == '\n')) {
                if (!line.isEmpty())
                    System.out.println(line);
                line = "";
            } else {
                line += (char) c;
            }
        }
    } catch (Throwable t) {
        t.printStackTrace();
    }

  7. #7
    Administrator copeg

    Re: Input stream buffers up to 4K before available to read

    Is there a particular, common command-line tool you can reproduce the behavior with? I have never experienced anything like what you describe, so posting an SSCCE that runs a common tool would go a long way toward letting others reproduce what you are seeing and provide further feedback.
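    For example, something along these lines should reproduce block-buffering with a tool most people have (this assumes python3 is on the PATH; Python block-buffers its stdout when writing to a pipe, so the first line only shows up when the buffer is flushed, here at exit):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Sketch of an SSCCE: python3 block-buffers stdout when it goes to a pipe,
// so "first" should not arrive until the child flushes at exit (~2 s),
// even though the child printed it immediately. Timestamps show the delay.
public class BufferRepro {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder(
                "python3", "-c",
                "import time; print('first'); time.sleep(2); print('second')")
            .redirectErrorStream(true)
            .start();
        long start = System.currentTimeMillis();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println((System.currentTimeMillis() - start)
                        + " ms: " + line);
            }
        }
        p.waitFor();
    }
}
```

    If that is what your child program is doing, the fix has to happen on the child's side: have it flush after each write, or run it with line-buffered/unbuffered output (e.g. stdbuf -oL for C-stdio programs, or a pty wrapper such as unbuffer).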
