Hello guys,
I am having this problem:
When I try to jump from one of my APICs to the switches located in Pod 2 of a multipod deployment, I can't connect to them via SSH.
APIC3# ssh SPINE-2201
ssh: connect to host spine-2201 port 22: No route to host
...
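In case it helps anyone reproduce this, the next check I had in mind was to list the registered fabric nodes with their TEP addresses and then try SSH to the spine's TEP IP directly instead of by name. This is only a sketch: the node ID, serial, and the 10.x address below are placeholders from my own lab, not confirmed values for this fabric, and it assumes acidiag fnvread is available on this APIC release.

APIC3# acidiag fnvread
      ID   Pod ID   Name         Serial Number   IP Address        Role    State
     2201       2   SPINE-2201   (serial)        10.1.112.95/32    spine   active

APIC3# ssh admin@10.1.112.95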
Hi guys,
When I did the initial setup, I misconfigured one of the APIC controllers (it was supposed to be an active member of a 3-node cluster) and set it up as a standby controller.
I tried to log in to the APIC with no success. I also tried a password res...
Where can I see this kind of restriction documented? Pod 1 is running version 3.2(1l) and Pod 2 is running 3.0(2n), and I see the same behavior described above.
Spine switches are N9364C
Leaf switches are N93120TX
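For reference, this is roughly how I compare the versions per node from the APIC CLI. The names and node IDs below are placeholders from my setup, and the exact column layout may differ between releases:

APIC1# show version
 Role        Pod   Node   Name         Version
 controller  1     1      APIC1        3.2(1l)
 spine       2     2201   SPINE-2201   n9000-13.0(2n)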
Hi Gabriel,
Thanks for your reply. I tried to log in using the password I typed on apic1, but it was impossible. So I reinstalled the node by connecting via the KVM console. After 30 minutes I could finally recover the node.
Best regards.
Hi Tomas,
I have the same problem here, but in my case I misconfigured the APIC: it was configured as a standby node, but it should have been configured as an active node in a 3-node cluster.
This error appears when I try to log in via the GUI:
"REST Endpoi...
Hi Gabriel, thanks for your help.
I tried to log in with the rescue-user credentials you provided, without luck. In this case the standby APIC has never joined an APIC cluster or fabric.
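For reference, this is roughly the attempt (the 10.0.0.3 address is just a placeholder for the standby APIC's out-of-band management IP, not the real one):

ssh rescue-user@10.0.0.3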
Here is a screenshot from when I tried to log in via the GUI on the mgmt i...