06-04-2015 08:38 AM
It's been a while since I last looked at an issue I have. The issue: when both NICs on an ESXi host are set active/active, vMotion and other traffic is flaky or not working on that host. I drew up a quick diagram. I think I need a vPC from the 5k to the chassis, but I'm not sure. Is the issue that the dual active NICs on the blade send traffic up the two separate port-channels, and the return traffic doesn't come back the same way?
06-04-2015 12:36 PM
Best practice is indeed to build two vPCs, one from each fabric interconnect to the northbound N5k pair.
What load-balancing algorithm do you use on the vSwitch (default = route based on originating virtual port ID)?
It could be that the initiator sends traffic on the interface connected to fabric A while the receiver interface is on fabric B (same VLAN, so pure L2 traffic!); that traffic has to exit the UCS domain, be L2-switched northbound, and re-enter the UCS domain on the other fabric.
Without vPC, the link between the two N5ks could become a bottleneck.
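For reference, a minimal vPC sketch on the N5k side might look like the fragment below (the interface numbers, VLAN IDs, and port-channel ID are hypothetical, and the vPC peer-link and peer-keepalive are assumed to be configured already):

```
! Sketch only -- hypothetical interfaces/VLANs; adapt to your topology.
! Repeat on each N5k of the vPC pair:
interface port-channel10
  description vPC to BladeCenter ESM uplinks
  switchport mode trunk
  switchport trunk allowed vlan 100,248
  vpc 10

interface Ethernet1/10
  channel-group 10 mode active
```

The downstream side (the ESM in this case) would then have to bundle both of its uplinks into a single LACP port-channel so the two N5ks appear as one logical switch.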
06-04-2015 12:47 PM
Sorry, forgot to mention a few things... it's not UCS, it's an IBM BladeCenter with Cisco ESMs in the chassis. I tried changing the LB method, but vMotion stops at 14 percent every time.
06-04-2015 12:55 PM
I think LB is important! Originating port ID is good.
Did you try binding the vMotion vmkernel interface to the same fabric on both hosts?
Does vmkernel ping work OK?
Are the port groups identical?
06-04-2015 12:58 PM
What exactly are you referring to as a fabric in this scenario? The 5ks or the ESMs in the chassis?
VMkernel ping does not work for the vMotion kernel port.
06-04-2015 01:07 PM
ESM and/or N5k; the blades are dual-homed, one NIC to fabric A and one to fabric B?
Regarding VMkernel ping, see
http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1003728
If this doesn't work, you have an IP connectivity problem.
Are both ESXi hosts in the same VLAN and IP subnet?
Can you post the port groups?
Is vMotion enabled on the vmkernel interface?
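The checks above can be run from the ESXi shell. A sketch (ESXi 5.x commands; `vmk1` and the target IP are hypothetical, use whichever vmk carries vMotion on your hosts):

```
# List vmkernel interfaces and the port groups they sit on
esxcli network ip interface list
# Show IP address/netmask per vmk (the vMotion vmks must share a subnet)
esxcli network ip interface ipv4 get
# Show which vmk has vMotion enabled
vim-cmd hostsvc/vmotion/netconfig_get
# Test the vMotion path explicitly through the vMotion vmk
vmkping -I vmk1 192.168.248.225
```

Comparing the output of the first two commands on both hosts is usually the quickest way to spot a port-group or subnet mismatch.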
06-04-2015 01:11 PM
06-04-2015 01:23 PM
But you cannot ping the vMotion vmkernel interface between the two ESXi hosts?
Is the vMotion VLAN defined on all the uplink interfaces?
Do you have a VLAN trunk carrying all the VLANs between the two N5ks?
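To verify that on the N5k side, something like the sketch below would do (VLAN 248 is only a guess from the 192.168.248.x addressing, and the port-channel number is hypothetical):

```
! Check the vMotion VLAN exists and is forwarding on the trunks
show vlan id 248
show interface trunk
! If it is missing from an uplink, add it to that trunk:
interface port-channel10
  switchport trunk allowed vlan add 248
```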
06-04-2015 01:23 PM
Well, assuming this is the right method, I cannot ping any vMotion IP from one host to another.
This system has been migrated to ESXi 5.1.0.
~ # vmkping 192.168.248.226
PING 192.168.248.226 (192.168.248.226): 56 data bytes
64 bytes from 192.168.248.226: icmp_seq=0 ttl=64 time=0.058 ms
64 bytes from 192.168.248.226: icmp_seq=1 ttl=64 time=0.209 ms
64 bytes from 192.168.248.226: icmp_seq=2 ttl=64 time=0.051 ms
--- 192.168.248.226 ping statistics ---
3 packets transmitted, 3 packets received, 0% packet loss
round-trip min/avg/max = 0.051/0.106/0.209 ms
~ # vmkping 192.168.248.225
PING 192.168.248.225 (192.168.248.225): 56 data bytes
--- 192.168.248.225 ping statistics ---
3 packets transmitted, 0 packets received, 100% packet loss
~ # vmkping 192.168.248.223
PING 192.168.248.223 (192.168.248.223): 56 data bytes
--- 192.168.248.223 ping statistics ---
3 packets transmitted, 0 packets received, 100% packet loss
~ # vmkping -I vmk0 192.168.248.223
PING 192.168.248.223 (192.168.248.223): 56 data bytes
--- 192.168.248.223 ping statistics ---
3 packets transmitted, 0 packets received, 100% packet loss
~ # vmkping -I vmk0 192.168.248.225
PING 192.168.248.225 (192.168.248.225): 56 data bytes
--- 192.168.248.225 ping statistics ---
3 packets transmitted, 0 packets received, 100% packet loss
~ #
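Given that only the local address (192.168.248.226) answers, one way to narrow down where the path breaks is to check where the switches learn the vmk MAC addresses. A sketch for the N5ks (again assuming a hypothetical VLAN 248):

```
! On each N5k: see on which port each vMotion vmk MAC was learned
show mac address-table vlan 248
```

A vmk MAC that is missing entirely, or that flaps between the ESM uplink and the inter-switch link, would point at the dual-active-uplink/load-balancing mismatch discussed earlier in the thread.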
06-04-2015 01:27 PM
I assume you have several vmkernel interfaces ?
Therefore