Just a question to shoot off: what's the default behaviour when using a bonded interface like bond0 (i.e. two ethernet interfaces bonded into one)?
Might be relevant to users running high-availability setups.
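For context, a bonded interface on Fedora of that era was typically defined with network-scripts files along these lines (a sketch, not taken from this thread; the interface names, bonding mode, and miimon value are assumptions):

```
# /etc/sysconfig/network-scripts/ifcfg-bond0 (assumed layout)
DEVICE=bond0
BOOTPROTO=dhcp
ONBOOT=yes
BONDING_OPTS="mode=active-backup miimon=100"

# /etc/sysconfig/network-scripts/ifcfg-eth0 (slave of bond0)
DEVICE=eth0
MASTER=bond0
SLAVE=yes
ONBOOT=yes
```

With active-backup mode, only one NIC carries traffic and the other takes over on link failure, which is the usual choice for high availability.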
behaviour when using bonded interfaces
echo '16i[q]sa[ln0=aln100%Pln100/snlbx]sbA0D2173656C7572206968616D41snlbxq' | dc
Galileo - HP Proliant ML110 G6 quad core Xeon 2.4GHz, 4GB RAM, 2x750GB RAID1 + 2x1TB RAID1 HDD
Re: behaviour when using bonded interfaces
dunno. try changing this line in /usr/bin/hdactl:

Code: Select all
my $device = "eth0";

to bond0, then stopping hdactl (monit will restart it)
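The substitution described above can be sketched with sed. This just demonstrates the one-line swap on a sample string; whether hdactl actually contains that exact line is an assumption, so check (and back up) the real file first:

```shell
# demo of the device swap on a copy of the hdactl line
echo 'my $device = "eth0";' | sed 's/"eth0"/"bond0"/'
# prints: my $device = "bond0";
```

Against the real file that would be something like `sudo sed -i.bak 's/"eth0"/"bond0"/' /usr/bin/hdactl` (hypothetical path and match, verify first), then stop hdactl and let monit restart it.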
My HDA: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz on MSI board, 8GB RAM, 1TBx2+3TBx1
Re: behaviour when using bonded interfaces
Will try as soon as I get some downtime for my HDA; it's currently in heavy use, so I can't just put in a new NIC. Though I could try on one of my VMs... hmmm, I'd need to set up a Fedora 10 VM first then.
I got a nice one from work: Intel Dual Gb PCI NIC (32bit PCI & 64bit PCI)