This is the mail archive of the cygwin@cygwin.com mailing list for the Cygwin project.



Re: 1.3.18: slow pipe performance when cpu busy


David,

I reported this exact same thing on Dec. 26, '02 (Subject: "Delays With Pipes In Cygwin 1.3.18").

For me, having a CPU soaker going is not optional ("You have completed more work units than 99.199% of our users.", if you get my drift), and I make extensive use of pipes, both explicitly and buried in scripts and shell procedures. That makes this an intolerable situation for me, so I've also backed off to 1.3.17 until some kind of resolution is reached.

I was going to use strace to see if it would disclose anything more interesting or detailed, but I'm fresh out of round tuits.

By the way, you said your cygcheck output was attached, but it appeared in-line. Check to make sure your mailer is not configured to put text-only attachments in-line with the message body.

Randall Schulz


At 18:43 2003-01-04, David Rothenberger wrote:
I've noticed slow pipe performance when my CPU is busy running low-priority programs (e.g., SETI@home). For example, if I run:

% grep keychain .profile

the command completes very fast. However, when I run

% cat .profile | grep keychain

the command takes 6-7 seconds to complete. But if I kill SETI@home, the command completes quickly.
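To quantify the difference rather than eyeball it, the two forms can be timed side by side. This is just a sketch of a reproduction (the sample file and pattern are stand-ins for your own .profile and search string):

```shell
# Create a small sample file to search (stand-in for .profile).
printf 'one\nkeychain line\nthree\n' > /tmp/profile_sample

# Direct form: grep opens the file itself, so no pipe is involved.
time grep keychain /tmp/profile_sample

# Piped form: the data flows through a pipe between cat and grep,
# which is where the reported delay appears under 1.3.18 when a
# low-priority CPU hog is running.
time cat /tmp/profile_sample | grep keychain
```

Comparing the "real" times from the two runs, with and without the CPU soaker active, should show whether the slowdown is specific to the pipe path.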

(BTW, SETI@home is running at Low priority as reported by Windows.)

I do not have this problem if I revert to 1.3.17.

cygcheck output is attached. Please let me know if there's more info I can provide.

--
Unsubscribe info:      http://cygwin.com/ml/#unsubscribe-simple
Bug reporting:         http://cygwin.com/bugs.html
Documentation:         http://cygwin.com/docs.html
FAQ:                   http://cygwin.com/faq/

