behaviour when using bonded interfaces

moredruid
Expert
Posts: 791
Joined: Tue Jan 20, 2009 1:33 am
Location: Netherlands
Contact:

behaviour when using bonded interfaces

Postby moredruid » Tue Jul 14, 2009 11:03 am

Just a question to shoot off: what's the default behaviour when using a bonded interface like bond0 (i.e. two ethernet interfaces bonded into one)?
Might be relevant to users running high-availability setups.
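For reference, a classic Fedora-era bonding setup looks roughly like the sketch below. This is a hedged example, not an Amahi-specific recipe: the mode, address, and file contents are placeholders, and on Fedora 10 the module options typically went in /etc/modprobe.conf.

```
# /etc/modprobe.conf (load the bonding driver for bond0)
alias bond0 bonding
options bond0 mode=active-backup miimon=100

# /etc/sysconfig/network-scripts/ifcfg-bond0 (the bonded master)
DEVICE=bond0
IPADDR=192.168.1.10
NETMASK=255.255.255.0
BOOTPROTO=none
ONBOOT=yes

# /etc/sysconfig/network-scripts/ifcfg-eth0 (one of the two slaves;
# ifcfg-eth1 would look the same with DEVICE=eth1)
DEVICE=eth0
MASTER=bond0
SLAVE=yes
BOOTPROTO=none
ONBOOT=yes
```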
echo '16i[q]sa[ln0=aln100%Pln100/snlbx]sbA0D2173656C7572206968616D41snlbxq' | dc
Galileo - HP Proliant ML110 G6 quad core Xeon 2.4GHz, 4GB RAM, 2x750GB RAID1 + 2x1TB RAID1 HDD

cpg
Administrator
Posts: 2618
Joined: Wed Dec 03, 2008 7:40 am
Contact:

Re: behaviour when using bonded interfaces

Postby cpg » Tue Jul 14, 2009 11:09 am

Dunno. Try changing this line:

Code: Select all

my $device = "eth0";

to bond0 in /usr/bin/hdactl, then stop hdactl (monit will restart it).
My HDA: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz on MSI board, 8GB RAM, 1TBx2+3TBx1
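The suggested change can be sketched as follows. This is a minimal sketch under the assumption that hdactl is a Perl script containing exactly that line; it is demonstrated on a scratch copy here, since editing the live /usr/bin/hdactl on a running HDA is the actual (riskier) step.

```shell
# Work on a scratch copy; on a real HDA the target would be /usr/bin/hdactl.
HDACTL=/tmp/hdactl.demo
printf 'my $device = "eth0";\n' > "$HDACTL"   # stand-in for the real script line

# Point hdactl at the bonded interface instead of eth0.
sed -i 's/"eth0"/"bond0"/' "$HDACTL"

grep 'my $device' "$HDACTL"   # -> my $device = "bond0";

# On the real system you would then stop the daemon and let monit respawn it:
# killall hdactl
```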

moredruid
Expert
Posts: 791
Joined: Tue Jan 20, 2009 1:33 am
Location: Netherlands
Contact:

Re: behaviour when using bonded interfaces

Postby moredruid » Tue Jul 14, 2009 11:39 am

Will try as soon as I get some downtime for my HDA. It's currently in heavy use, so I can't just put in a new NIC... though I can try on one of my VMs... hmmm, need to set up a Fedora 10 VM then :)

I got a nice one from work: an Intel dual-Gb PCI NIC (32-bit & 64-bit PCI).
