Need to model a degree-two node where packets can also be added and dropped by a local client.
Both the input and output ports have a capacity of 100 Mb/s. Need to plot the average switch throughput and the average packet delay as functions of the arrival rate from the other input (port A) and from the client.
Packets have a fixed length of 1500 bytes. Need to evaluate how performance changes as a function of the number of output queues used to model the switch (assume a round-robin scheduling policy is used when multiple queues are present). Assume also that the switch queues are infinite.
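Before simulating, it helps to work out the deterministic per-packet service time implied by the stated parameters (1500-byte packets on a 100 Mb/s port); a quick sketch:

```python
# Per-packet service time implied by the problem parameters
PKT_BITS = 1500 * 8          # fixed packet length, bits
LINE_RATE = 100e6            # port capacity, bits/s

service_time = PKT_BITS / LINE_RATE   # seconds per packet (0.12 ms)
saturation_rate = 1 / service_time    # packets/s one output port can drain

print(service_time)      # 0.00012 s
print(saturation_rate)   # ~8333.3 pkt/s
```

So the combined arrival rate from port A and the client must stay below roughly 8333 packets/s for the output queue to remain stable; above that, the infinite queue grows without bound and delay diverges.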
How can the above problem be set up and run in a network simulator?
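Pending a full simulator setup (e.g., ns-3 or OMNeT++), the model can be prototyped as a small discrete-event simulation in plain Python. This is a minimal sketch, not the definitive implementation: it assumes Poisson arrivals on both inputs (the source does not specify the arrival process), two infinite output queues (one per input), and work-conserving round-robin service at the 100 Mb/s output port.

```python
import random

LINE_RATE = 100e6               # output port capacity, bits/s
PKT_BITS = 1500 * 8             # fixed packet length, bits
SERVICE = PKT_BITS / LINE_RATE  # deterministic service time, 0.12 ms

def simulate(lam_a, lam_c, sim_time=50.0, seed=1):
    """Round-robin service over two infinite queues (port A and client).
    lam_a, lam_c: Poisson arrival rates in packets/s (must be > 0).
    Returns (throughput in bits/s, average packet delay in seconds)."""
    rng = random.Random(seed)

    def arrivals(lam):
        # Pre-generate one source's Poisson arrival times up to sim_time
        t, out = 0.0, []
        while True:
            t += rng.expovariate(lam)
            if t > sim_time:
                return out
            out.append(t)

    queues = [arrivals(lam_a), arrivals(lam_c)]  # sorted arrival times
    idx = [0, 0]                 # next un-served packet per queue
    nxt = 0                      # round-robin pointer
    now, served, total_delay = 0.0, 0, 0.0
    while True:
        # Head-of-line arrival time for each queue that still has packets
        heads = [(queues[q][idx[q]], q) for q in (0, 1) if idx[q] < len(queues[q])]
        if not heads:
            break
        ready = [q for (t, q) in heads if t <= now]
        if ready:
            # Both backlogged: alternate queues (round robin); else serve the one
            q = nxt if nxt in ready else ready[0]
        else:
            # Server idle: jump to the earliest future arrival
            now, q = min(heads)
        arr = queues[q][idx[q]]
        now = max(now, arr) + SERVICE     # departure time of this packet
        total_delay += now - arr
        idx[q] += 1
        served += 1
        nxt = 1 - q                       # point round robin at the other queue
    throughput = served * PKT_BITS / now if now > 0 else 0.0
    avg_delay = total_delay / served if served else 0.0
    return throughput, avg_delay
```

Sweeping `lam_a` and `lam_c` over a grid and recording the two returned metrics produces the requested throughput and delay curves; the single-FIFO variant is the same loop with both sources merged into one queue. In a real simulator the same model maps to two traffic generators, a queueing element per input, and a point-to-point link at 100 Mb/s.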