Sharing socket handles across processes or threads



  • 1. REx: ^ not matching after \n
    Yet another (undocumented?) quirk of the REx engine. The 5.8.8 docs say:

        You may, however, wish to treat a string as a multi-line buffer, such
        that the "^" will match after any newline within the string, and "$"
        will match before any newline. At the cost of a little more overhead,
        you can do this by using the /m modifier on the pattern match operator.

    It looks like "^" will never match at \z (= end of string), even if it is
    after a newline:

        perl -wle "print 11 if qq(aa\n) =~ /^(?<=a\n)/m"

    prints nothing. Using the synonym (?<!.) helps:

        perl -wle "print 11 if qq(aa\n) =~ /(?<!.)(?<=a\n)/"
        11

    but, obviously, this is not going to be as optimized as the version
    starting with ^... Should this be documented? [I do not think this may be
    changed; too many programs may depend on the undocumented semantics...]

    Puzzled, Ilya

    P.S. I have no idea what info is attempted to be transmitted by "At the
    cost of a little more overhead...". I read it approximately as: If you want
    to open a file, use the open() function. On the other hand, you can save a
    lot of overhead by skipping this call; your program will behave as if the
    file does not exist, and will run much more quickly.
  • 2. named pipe and "Text file busy"
    Hi There, I've got a Perl script which runs on Linux and spawns a thread
    which opens a FIFO pipe in read-only mode. I use a different program to
    write data to the FIFO (written in C, or a simple echo command). Works like
    a charm; however, once I exit the script (via CTRL-C) and restart it, the
    pipe becomes unusable for writing (i.e. no program is able to open it for
    writing - they get back an error message: errno 26, "Text file busy"). I
    need to create another FIFO in order for me to use the script again.
    Internally I catch the CTRL-C signal and close the FIFO explicitly. What's
    wrong? Is it because I have 2 threads running, i.e. one does some other
    processing and the second one is responsible for the pipe? What's the
    correct and clean way to exit from a multithreaded program in Perl? Or is
    it another issue? Any help would be greatly appreciated. Thanks, finpro
  • 3. List all Fields in a Notes view (in perl?)
    I'm trying to extract data from a Notes view. I actually use Perl to do
    so, so if anyone has done that before, it would be very helpful. Anyway,
    here's the issue: I can get the specific view that I want; however, I can
    only extract the values in fields by using (abbreviated):

        my $view = $Database->GetView('viewname');
        my $Document = $view->GetNthDocument(n);
        while ($Document = $view->GetNextDocument($Document)) {
            my $ri = $Document->GetItemValue('RequestID');
            my $cr = $Document->GetItemValue('CURRENTREVIEWERS');
            my $ca = $Document->GetItemValue('CurrentApprover');
            ...
        }

    This approach is rather painful, as I have to check the view properties to
    find out the field names. Is there a way to obtain a list of all the fields
    in a view? If so, how to translate the code to Perl? Please note: I tried
    something like

        my $fieldList = $Document->Items();

    but I cannot parse the resulting array reference. Any help will be greatly
    appreciated. Rafal
  • 4. Compiling to Perl
    Hi all, Are there any online resources about compiling to Perl? Any information about existing compilers or things to consider when writing such a compiler will be appreciated. Background: We're writing CGI scripts for customers who deploy them on their servers. I'm currently exploring technological alternatives: programming in Perl, programming in PHP, compiling to Perl, compiling to PHP. (N.B.: Parrot will become an option once it's stable and as ubiquitous as Perl is today, but it's not an option yet.) TIA, Jo
  • 5. Where are anonymous temp files created?
    I'm not sure if this is even a sensible question, but when you open an
    anonymous temp file via

        open(my $fh, '+>', undef) or die "$!";

    where does Perl create the file? I have a script that attempts to do this,
    but when run on a different PC it dies with the message "Bad File
    Descriptor." I suspect that restrictive permissions are preventing the
    file from being created, but without knowing where the file goes (assuming
    it even has a path) I can't confirm it. perldoc -f open makes no mention
    of how and where Perl creates the file, permissions gotchas, etc. Will
    Perl try to use $ENV{TMP}? Do anonymous files even have a path? If not,
    how are file creation permissions handled? -mjc
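On that last question (item 5): one way to take the guesswork out of anonymous temp files is to create them explicitly with the core File::Temp module, whose directory selection is documented: File::Spec->tmpdir consults $ENV{TMPDIR} (plus TEMP/TMP on Windows) before falling back to /tmp. A minimal sketch, not an explanation of what open(..., undef) itself does internally:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);
use File::Spec;

# Where temp files go by default: tmpdir() checks $ENV{TMPDIR}
# (and TEMP/TMP on Windows) and falls back to /tmp.
print "temp dir: ", File::Spec->tmpdir, "\n";

# Explicit replacement for open(my $fh, '+>', undef): you choose the
# directory yourself, so permission problems are easy to diagnose.
my ($fh, $name) = tempfile(DIR => File::Spec->tmpdir, UNLINK => 1);
print {$fh} "scratch data\n";
seek $fh, 0, 0;
print "read back: ", scalar <$fh>;
```

If the explicit version fails too, the error message will at least name a concrete directory to check permissions on.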

Sharing socket handles across processes or threads

Postby worik » Thu, 01 Jun 2006 07:03:10 GMT


I have been using Net::SMTP::Server.

I want to pass client handles (Net::SMTP::Server::Client) to
another process or thread.

I naively expected to use threads and global memory, but in Perl (as it
turns out) I can only share scalars, hashes and arrays.  The object
returned by Net::SMTP::Server::Client->new($conn) is a glob.  I am not
100% sure I know exactly what this means!

Is there any way that I can pass the connection returned by
SMTP::Server->accept across processes or threads, or do I have to spawn
a new process for each instance?
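Not as a shared glob, no -- but within one process the threads already share the kernel's descriptor table, so one workable pattern (a generic sketch, not specific to Net::SMTP::Server; the port number is arbitrary) is to dup() the numeric file descriptor, queue it, and re-open it in the worker thread with the '&=' form of open:

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;
use IO::Socket::INET;
use POSIX ();

my $queue = Thread::Queue->new;

my $worker = threads->create(sub {
    while (defined(my $fd = $queue->dequeue)) {
        # '+<&=' wraps an existing descriptor number in a new Perl handle
        open my $client, '+<&=', $fd or die "fdopen $fd: $!";
        print {$client} "220 hello from worker thread\r\n";
        close $client;
    }
});

my $server = IO::Socket::INET->new(Listen => 5, LocalPort => 2525,
                                   ReuseAddr => 1)
    or die "listen: $!";
while (my $conn = $server->accept) {
    # dup() so the worker owns its own descriptor and the accept loop
    # can let $conn go out of scope without closing the connection.
    $queue->enqueue(POSIX::dup(fileno $conn));
    close $conn;
}
$queue->enqueue(undef);   # tell the worker to finish
$worker->join;
```

Across separate *processes* the descriptor number is meaningless after an exec, so there you would need SCM_RIGHTS descriptor passing over a UNIX-domain socket (e.g. the Socket::MsgHdr CPAN module), or to fork after accept so the child inherits the open handle.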


Similar Threads:

1. to share socket handle across threads?


I've been rtfm'ing for days but can't find a solution to this problem:

I have a thread that receives multiple client requests thanks to an
endless loop around the accept() method. The new socket object
returned by accept() must be communicated to another master thread. I
am using the IO::Socket module.

When trying to use threads::shared or Thread::Queue, I keep getting
this failure:
Invalid value for shared scalar at ./ line N.

So is there a way to share a socket/file handle across threads?

Thank you


2. threads & threads::shared & Thread::Semaphore

Hi, All,

Hope everyone's new year is starting out very well.

I'm having to adjust a Perl program I wrote to manipulate some
genetics data.  Originally, the program had no memory problems, but now
that I've added a couple more hashes, I'm having memory issues.  It
now runs out of memory when it's about halfway through processing the
data.  Unfortunately, the data is very interconnected, and the
statistics I need to execute involve data from several of the
hashes.  I'm thinking of using threads & threads::shared in order to
be able to process, store and access the data among several of the 8
processors on a Sun SPARC system.  Of course, I'll need to install the
threads and threads::shared modules and possibly even re-compile perl
on this machine, but before I go and do all this fun stuff, I wanted to
ask your opinion about whether or not I'm going down the right rabbit
hole or if I'm just digging myself a shallow grave.  Would this be the
way you might do it?  I've also heard of semaphores.  Might this be a
better way to go about spreading the data in hash form among several
processors on one machine and still be able to access the data in each
hash from the main program?

Thanks in advance for the excellent input I always get from this  
group!  :-)

Aimee Cardenas
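For what it's worth, a shared hash of the kind being considered looks like the toy sketch below (the key names are made up). Note the caveat in the comments: Perl ithreads clone the interpreter, and every *non*-shared variable, into each thread, so on a memory-constrained job threads often increase total memory use rather than reduce it -- an on-disk store (DBM files, SQLite) is frequently the better rabbit hole for an out-of-memory problem.

```perl
use strict;
use warnings;
use threads;
use threads::shared;

# Toy sketch: one hash shared among worker threads.  Caveat: ithreads
# copy all non-shared data per thread, so sharing reduces duplication
# only for the variables explicitly marked ": shared".
my %counts : shared;

my @workers = map {
    threads->create(sub {
        my ($key) = @_;
        for (1 .. 1000) {
            lock %counts;          # serialize writes to the shared hash
            $counts{$key}++;
        }
    }, "marker$_");
} 1 .. 4;

$_->join for @workers;
print "$_ = $counts{$_}\n" for sort keys %counts;   # each should be 1000
```

The lock around each increment is what a Thread::Semaphore would otherwise buy you here; for simple mutual exclusion on a shared variable, lock() is usually enough.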

3. Win32::Process -- Inheriting IO::Socket Handles

George Kuetemeyer < XXXX@XXXXX.COM > wrote in message

> I've created a TCP server using the  IO::Socket documentation in the
> Perl IPC faq. It works like a charm, but is currently  single-threaded.
> I want to try using Win32::Process to kick off a separate process for
> each connection (kind of like the Unix fork).

Heya all, same problem here... Could you help me please? I wanna know
how to pass filehandles from perl to perl. I don't know if it is even
possible.

Here is an outline of what I am doing:

1. Wait in while loop for client connection.  Works fine. (same)
2. Accept a $client_handle connection handle. Works fine. (same)
3. I am waiting for tasks. The main perl creates Win32::Process
objects for each incoming task, specifying that the handles of calling
process are inherited (see below for a sample). The process runs, I/O
continues to happen in the main perl console (when using 'print'), but
I have an error when trying to send to socket. Could someone help me
please ?

[assuming that $remote is the socket (print $remote "Something\n";
works fine) and $tasknum the task number =)]

    # (the beginning of the Win32::Process::Create() call was lost in
    # the original post; only its last two arguments survived)
    ...
        "perl.exe path_to\\ $tasknum $remote",
        ".") || die "Couldn't create process!!\n";

    # child script -- note that $ARGV[1] is only the *stringified* glob
    # ("GLOB(0x...)"); a handle does not survive as text on a command
    # line, which is why printing to $remote fails in the new process.
    $tasknum = $ARGV[0];
    $remote  = $ARGV[1];

    print "Hello\n";
    print $remote "Hello\n";

Thanks in advance.

Sincerely yours


4.shared variables in threaded tk not being shared


Wonder if someone can help. I've been trying a few things with Perl/Tk
and threads with some success but can't understand a particular problem.
I've read various posts on the subject and have altered my script
accordingly, but still no go! Basically the script creates some shared
variables, a sleeping thread first, then the Tk window and widgets.
The thread essentially starts to update a shared variable every second
once the start button is pressed. I can see this by printing to STDOUT.
A label in the GUI points to this shared variable and is instructed to
update it.
However, if I put in some debug print statements I can see that in
effect this variable is not shared, as it retains the initial value
only; hence the label may be updated, but with the same value.
Here is the code:

use Tk;
use threads;
use threads::shared;
use strict;

my $go : shared = 0;
my $myValue : shared = 0;

my $anotherThread = threads->new(\&ticker);

sub ticker {
	while (1) {
		if ($go == 0) {
			sleep 1;
		}
		else {
			$myValue++;       # (assumed: the update was lost from the post)
			print "$myValue\n";
			sleep 1;
		}
	}
}

my $mw = MainWindow->new();
$mw->title('UDP Sniffer');

my $f1 = $mw->Frame(-borderwidth => 2, -relief => 'groove')
            ->pack(-side => 'top', -fill => 'x');
my $label2 = $f1->Label(-textvariable => \$myValue, -width => 15)
               ->pack(-side => 'left');
# (the -side values below were truncated in the post; 'left' is assumed)
$f1->Button(-text => 'Start', -command => \&start)->pack(-side => 'left');
$f1->Button(-text => 'Stop',  -command => \&stop)->pack(-side => 'left');
$f1->Button(-text => 'Reset', -command => \&resetme)->pack(-side => 'left');

MainLoop;

sub start {
	$go = 1;
	print "In the main window - $myValue\n";
	##$mw->repeat(1, sub {$mw->update;});
}

sub stop {
	$go = 0;
}

sub resetme {
	$myValue = 0;
}

Threads are quickly becoming annoying to use, as they seem to present
more problems than solutions with Tk, so could someone tell me what
other options I have for something similar to the above?
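One option that usually works with Perl/Tk: let only the main thread touch Tk, and poll the shared variable from a repeat() timer. Writes to a shared scalar from another thread do not fire the -textvariable trace in the GUI thread; assigning the value to an ordinary variable inside the timer callback does, so the label repaints. A minimal sketch (the 250 ms interval is arbitrary):

```perl
use strict;
use warnings;
use threads;
use threads::shared;
use Tk;

my $myValue : shared = 0;

# Worker only writes the shared scalar; it never touches Tk.
threads->create(sub { while (1) { sleep 1; $myValue++ } })->detach;

my $mw    = MainWindow->new;
my $shown = 0;                      # plain variable owned by the GUI thread
$mw->Label(-textvariable => \$shown)->pack;

# Poll the shared value from the GUI thread; the assignment fires the
# -textvariable trace, so the label updates.
$mw->repeat(250, sub { $shown = $myValue });
MainLoop;
```

The same polling idea also answers the "other options" question: event-loop timers replace most uses of a second thread when all the thread would do is refresh the display.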

Thanks in advance

5. to get filehandle-lock effective across multi-process

Jeff Pang wrote on Tuesday, 7 February 2006, 10:47:
> hello,lists,
Hello Jeff
> I open a file and obtain a filehandle. Then I fork a child and access this
> file in the child (via the duplicated filehandle from the parent). If both
> parent and child are writing to the file at the same time, things can become
> terrible, so I should use the 'flock' call to lock the filehandle. But I find
> that the flock is effective only within the same process. In other words,
> when I flock the filehandle in the child, the 'flock' means nothing to the
> parent, and the parent can still write to the file. The test code is shown
> below:
> use strict;
> use warnings;
> when this code runs, both parent and child can write to the same file, and
> maybe at the same time.
> How can I resolve this problem? Thanks.

It seems that flock is implemented as an *advisory* lock, as stated in
perldoc -f flock.

The solution lies in the appropriate usage of flock: use locking for every
access (attempt) to the resource. Since both accesses are within your own
code, you have 100% control over that :-)
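To make that concrete, here is a minimal sketch (the file path is hypothetical) where parent and child each take the lock around every write. Remove the flock() call from either side and that side is no longer excluded -- which is exactly the advisory behaviour described above:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $file = '/tmp/flock-demo.log';   # hypothetical path for the sketch

# Every writer -- parent and child alike -- goes through this routine.
sub locked_append {
    my ($msg) = @_;
    open my $fh, '>>', $file or die "open $file: $!";
    flock($fh, LOCK_EX) or die "flock: $!";
    print {$fh} $msg;
    close $fh;                       # closing the handle releases the lock
}

my $pid = fork() // die "fork: $!";
if ($pid == 0) {                     # child
    locked_append("child line $_\n") for 1 .. 100;
    exit 0;
}
locked_append("parent line $_\n") for 1 .. 100;   # parent
waitpid $pid, 0;
```

Note that the lock is taken on a fresh handle for each write; locking once in the parent before the fork would not serialize anything, because both sides would then hold the same lock.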


6. Threads and data sharing

7. sharing semaphore with/between threads

8. share semaphore across threads


