[Beowulf] Cannot use more than two nodes on cluster
Gus Correa
gus at ldeo.columbia.edu
Thu Sep 20 09:23:21 PDT 2012
Hi Antti
If you're resolving host names through /etc/hosts,
check if they're consistent [maybe the same] on
the "master" and "slave" nodes.
Maybe only the master can currently resolve the slave nodes' names;
could that be the problem?
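For example, a minimal /etc/hosts shared by all four nodes might look
like this [the names and addresses below are just placeholders, use
whatever your network actually has]:

    192.168.1.10    master
    192.168.1.11    node1
    192.168.1.12    node2
    192.168.1.13    node3

Every node should resolve every other node's name to the same address,
e.g. "getent hosts node2" should give the same answer on the master
as it does on node1.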
Also, how are the master and slave nodes connected to each other?
Do you have a switch that all the nodes connect to?
Or are the slaves connected to various NICs on the master?
Or are the connections daisy-chained?
Or something else?
Depending on the MPI you're using, on whether
you're using a job queuing system/resource manager [Torque, SGE, Slurm,
etc.], and on whether your MPI is integrated with that queuing
system, you may or may not need to provide a "machinefile" or
"hostfile" on the mpiexec/mpirun command line.
This file needs to be consistent with /etc/hosts as well.
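As a rough sketch [the exact option depends on which MPI you have:
Open MPI takes --hostfile, MPICH takes -f or -machinefile], a hostfile
and launch line could look like this:

    # hostfile: names must match what /etc/hosts [or DNS] knows
    master slots=4
    node1  slots=4
    node2  slots=4
    node3  slots=4

    mpirun -np 16 --hostfile hostfile ./my_mpi_program

The "slots=4" syntax above is Open MPI's; an MPICH machinefile would
use "master:4" and so on instead. The slot counts and program name are
of course placeholders.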
You didn't say anything about
A) which MPI you're using,
B) whether you're launching jobs through a job queuing system or
directly on the master, or
C) what your mpiexec command line is.
You may get more focused help if you disclose
this type of information.
I hope this helps,
Gus Correa
On 09/20/2012 10:37 AM, Antti Korhonen wrote:
> I tested ssh with all combinations and that part is working as designed.
>
> I can start job manually on any single node.
> I can start jobs on any two nodes, as long as one of them is the master.
> All other combinations hang and jobs do not start.
>
> I read through a few install guides and did not find any steps I missed.
> I am using Ubuntu 12.04, in case that makes any difference.
>
> Antti
>
> -----Original Message-----
> From: beowulf-bounces at beowulf.org [mailto:beowulf-bounces at beowulf.org] On Behalf Of Jörg Saßmannshausen
> Sent: Thursday, September 20, 2012 1:42 AM
> To: beowulf at beowulf.org
> Subject: Re: [Beowulf] Cannot use more than two nodes on cluster
>
> Hi all,
>
> have you tried the following: ssh master -> node1 -> node2, i.e. ssh from the master to node1 and from there to node2?
> You do not have a situation where the remote host key is not yet in the local known_hosts database and hence you get asked about adding that key, do you?
>
> If that is working with all permutations, another possibility is that your host list is somehow messed up when you are submitting parallel jobs. Can you start the jobs manually by providing a host list to the MPI program you are using? Does that work or do you have problems here as well?
>
> My two pennies
>
> Jörg
>
>
> On Thursday 20 September 2012 07:40:56 Antti Korhonen wrote:
>> Passwordless SSH works between all nodes.
>> Firewalls are disabled.
>>
>>
>> From: greg at r-hpc.com [mailto:greg at r-hpc.com] On Behalf Of Greg Keller
>> Sent: Wednesday, September 19, 2012 8:43 PM
>> To: beowulf at beowulf.org; Antti Korhonen
>> Subject: Re: [Beowulf] Cannot use more than two nodes on cluster
>>
>> I am going to bet $0.25 that SSH or TCP/IP is configured to allow the
>> master to get to the nodes without a password, but not from one
>> compute node to another.
>>
>> Test by sshing to Compute1, then from Compute1 to Compute2. Depending
>> on how you built the cluster, it's also possible there is iptables
>> running on the compute nodes, but my money is on the SSH keys needing reconfiguration.
>> Let us know what you find.
>>
>> Cheers!
>> Greg
>>
>> Date: Wed, 19 Sep 2012 16:11:21 +0000
>> From: Antti Korhonen <akorhonen at theranos.com>
>> Subject: [Beowulf] Cannot use more than two nodes on cluster
>> To: "beowulf at beowulf.org" <beowulf at beowulf.org>
>> Message-ID: <B9D51F953BEE5C42BC2B503D288542992DD935FE at SRW004PA.theranos.local>
>> Content-Type: text/plain; charset="us-ascii"
>>
>> Hello
>>
>> I have a small Beowulf cluster (master and 3 slaves).
>> I can run jobs on any single node.
>> Running on two nodes sort of works: jobs on the master and 1 slave
>> work (all combos: master+slave1, master+slave2, or master+slave3).
>> Running jobs on two slaves hangs.
>> Running jobs on master + any two slaves hangs.
>>
>> Would anybody have any troubleshooting tips?
>
> --
> *************************************************************
> Jörg Saßmannshausen
> University College London
> Department of Chemistry
> Gordon Street
> London
> WC1H 0AJ
>
> email: j.sassmannshausen at ucl.ac.uk
> web: http://sassy.formativ.net
>
> Please avoid sending me Word or PowerPoint attachments.
> See http://www.gnu.org/philosophy/no-word-attachments.html
>
> _______________________________________________
> Beowulf mailing list, Beowulf at beowulf.org sponsored by Penguin Computing
> To change your subscription (digest mode or unsubscribe) visit http://www.beowulf.org/mailman/listinfo/beowulf