02-11-2015 05:00 AM - edited 03-07-2019 10:36 PM
Hello!
I am studying for the CCNA and I have a doubt about this question:
You have 10 users plugged into a hub running 10 Mbps half-duplex. There is a server connected to the switch running 10 Mbps as well. How much bandwidth does each host have to the server?
I know that a switch creates several collision domains, one per port. But the users are connected to a hub running 10 Mbps half-duplex, and the hub is connected to the switch, so I think each user gets 1 Mbps because they all share the hub. But the answer in the Cisco CCNA book by Todd Lammle, sixth edition, is that each device gets 10 Mbps. Why?
Sorry for my bad English.
Thanks to all.
Below is the question from the book.
02-11-2015 06:55 AM
This is an "it depends" kind of answer. You could easily argue "none of the above" for that question and its possible answers.
A hub emulates the original shared Ethernet wire, which means all active nodes share access to the same 10 Mbps.
For starters, I believe there are 11 active hosts on the hub: the 10 noted client hosts, plus the link to the switch hosting the server. Assuming the switch is only supporting that one server, the server is effectively just another host on the hub (via the switch).
As all the connected hosts share 10 Mbps, each one, assuming all wanted to transmit at 10 Mbps (or more), would get a fair share (more or less) of the 10 Mbps. If all 10 clients sent to the server, each client would get roughly 1 Mbps. (Because of collisions, some of the 10 Mbps of bandwidth would be lost, though not as much as many suppose, so each client would effectively have a bit less than 1 Mbps.)
If just one host (including the server) was transmitting alone, it would get the full 10 Mbps. (Again, as soon as more than one host wants to transmit, they will share the bandwidth about equally, less the bandwidth lost to collisions.)
Also remember, communication is often two-way, so even a client sending to the server will share the 10 Mbps with something like the server's ACKs. The latter would likely use very little bandwidth, but it too takes bandwidth away from the transmitting client and causes collisions.
So for one extreme, if only one client host was transmitting, it could transmit at 10 Mbps.
However, for the other extreme, if all hosts (including the server) were transmitting at their best rate, each host would get about 1/11 of the 10 Mbps (roughly 0.9 Mbps), less bandwidth lost to collisions.
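To make the arithmetic above concrete, here is a minimal Python sketch of that fair-share model. The 10% collision-loss figure is purely an illustrative assumption (actual loss varies with traffic patterns), as is the function name; it just divides the usable bandwidth among the active transmitters.

def per_host_share(link_mbps=10.0, active_hosts=1, collision_loss=0.10):
    # Rough per-host throughput on a shared (hub) segment: all active
    # hosts split the usable bandwidth about equally, and a lone
    # transmitter gets the whole link to itself (no collisions).
    usable = link_mbps * (1.0 - (collision_loss if active_hosts > 1 else 0.0))
    return usable / active_hosts

print(per_host_share(active_hosts=1))    # one extreme: the full 10.0 Mbps
print(per_host_share(active_hosts=10))   # 10 clients sending: ~0.9 Mbps each
print(per_host_share(active_hosts=11))   # all 11 hosts: ~0.82 Mbps, about 1/11 of 10 Mbps

Note the model treats the switch uplink (and the server behind it) as just one more host contending for the wire, which is why the second extreme divides by 11, not 10.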
05-10-2015 05:56 AM
Hello Joseph,
Sorry for answering the post after some time. In fact, I had a lot of doubt about this question, even after reading your answer to this post, but after studying more I understand this question and your answer a little better.
Thanks for the help.