08-16-2017 04:31 AM - edited 06-04-2019 02:32 AM
I am working here:
I can SSH into the UBUNTU machine in the DCLOUD Sandbox:
SSIMLO-M-D00G:LM-4302 ssimlo$ ssh cisco@198.18.134.28
cisco@198.18.134.28's password:
Welcome to Ubuntu 14.04.5 LTS (GNU/Linux 4.4.0-42-generic x86_64)
* Documentation: https://help.ubuntu.com/
206 packages can be updated.
148 updates are security updates.
Your Hardware Enablement Stack (HWE) is supported until April 2019.
Last login: Mon Jun 12 17:26:58 2017 from 10.16.51.148
Code-Samples were successfully updated.
Happy Coding!!!
(dna) cisco@ubuntu:~/CiscoDevNet/code/dna$
When I try to run the check_dcloud.py I get a traceback:
(dna) cisco@ubuntu:~/CiscoDevNet/code/dna/LM-4302$ python3 check_dcloud.py
Traceback (most recent call last):
  File "check_dcloud.py", line 9, in <module>
    from hello_lab import APIC_EM_URL
  File "/home/cisco/CiscoDevNet/code/dna/LM-4302/hello_lab.py", line 81, in <module>
    room_id = spark_get_room_id(SPARK_TOKN, SPARK_ROOM)
  File "/home/cisco/CiscoDevNet/code/dna/LM-4302/myspark.py", line 42, in spark_get_room_id
    r = requests.get(uri, params=query, headers=_headers(token))
  File "/home/cisco/CiscoDevNet/code/dna/LM-4302/myspark.py", line 15, in _headers
    'Authorization': 'Bearer ' + token}
TypeError: Can't convert 'NoneType' object to str implicitly
(dna) cisco@ubuntu:~/CiscoDevNet/code/dna/LM-4302$
I posted in the
but got no reply.
I opened a case with dCloud, and they pointed me back here.
08-17-2017 03:21 AM
I just spun up another lab to see if the fault disappeared. It did not, so the issue is not specific to the instances I spun up.
08-17-2017 03:46 AM
Hi ssimlo,
We will get back to you with an answer ASAP.
Thanks,
Eddie
08-20-2017 11:30 PM
Hi Steve,
To successfully run the check_dcloud.py script, you will first need to populate the SPARK_TOKN and SPARK_ROOM variables in the hello_lab.py file.
A Spark token can be obtained from the https://developer.ciscospark.com/ page. After authenticating, click on your avatar picture and your access token will be displayed.
From Cisco Spark, pick a room where you would like to post a message and assign it to the SPARK_ROOM variable.
After that, run the script and you should see a message in your terminal confirming that the dCloud pod is ready for the labs.
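For reference, the edit in hello_lab.py would look something like this (the values shown are placeholders, not real credentials; only the variable names SPARK_TOKN and SPARK_ROOM come from the lab code):

```python
# In hello_lab.py -- replace the placeholder values with your own.
SPARK_TOKN = 'YOUR-SPARK-ACCESS-TOKEN'   # copied from developer.ciscospark.com
SPARK_ROOM = 'My Lab Room'               # name of an existing Spark room you belong to
```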
P.S. I will update the script so it notifies the user when these variables still need to be populated.
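A guard along these lines would fail early with a clear message instead of the confusing TypeError (this is only a sketch of the idea, not the actual fix Armen shipped; the function name is hypothetical):

```python
def check_spark_vars(token, room):
    """Return an error message if a lab variable looks unpopulated, else None."""
    if not token:
        return 'SPARK_TOKN is empty -- paste your Spark access token into hello_lab.py'
    if not room:
        return 'SPARK_ROOM is empty -- set it to the name of one of your Spark rooms'
    return None

# Example: both variables still unset, as in the original traceback.
print(check_spark_vars(None, None))
```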
Thank you!
Armen M.